Authors: John Holguin, Jeffrey Chen, Alexa Ajello
Faculty Mentor: Dr. Wayne Giang
College: Herbert Wertheim College of Engineering
Low health insurance literacy and lack of knowledge are among the most common obstacles preventing informed health insurance enrollment decisions. Websites are common digital sources of information, but Virtual Benefits Counselors (VBCs), decision aids that attempt to guide users through the enrollment decision process, are now also used. However, little is known about how these different sources of information support users’ decision strategies. The goal of this poster is to describe the process used to understand and compare user interaction strategies with VBC decision aids and with information provided on Human Resources (HR) websites. Transcript data were taken from a think-aloud experiment in which participants verbalized their thoughts while interacting with one of the two systems to make an enrollment decision. These data were coded using a preliminary set of categories: decision-making, information search, and navigation. After the initial categorization, further sub-categories were derived through discussion among the research team to accommodate emerging patterns. The process revealed several important decision factors considered by participants, including coverage, financial considerations, and prior knowledge; different information search strategies were also identified. These results will help us understand and support how individuals navigate health insurance information to make informed decisions.
Hello!
I was wondering why your sample size is only 12. How does this affect your data?
Hey Joshua,
Our sample size is only 12 participants because, sadly, that was all of the trials we had run up to this point. However, we are continuing to add new data and new participants (we have a total of 16 currently). This sample size, though relatively small, is still large enough for us to draw these introductory conclusions. As more data come in, we will be able to either reinforce or refine our conclusions.
Thanks!
Hey! So I was wondering if you guys had any explanation for the lack of variation in the data between the two systems that were tested.
The lack of variation could be explained by two things: a systematic issue in how we tagged each instance (as an individual instance versus as a block), and the small sample size. Given a larger sample size and a more refined tagging procedure, more noticeable differences between the two systems might emerge in the data.
Great work on this poster, Alexa, Johnny, and Jeffrey! And very nice answers to the questions as well.
Hi guys, great poster!
What kind of information searching strategies were the participants engaging in? When you say they are similar, were they searching for similar information or engaging with the system in a similar way?
Hi Emma,
We classified the participants’ information search strategies as either passive searches, active related searches, or specific goal searches. When we say the strategies between the two systems are similar, we mean that the number of instances of each type of information search was roughly the same between the two systems, indicating that participants engaged with the different systems in a similar way.
Really nice study and presentation. Very clearly presented, and I especially appreciated the background section that set it all out. And what an interesting study. Nice!
Hi there! You guys really did an amazing job! Great poster! It’s so exciting to see that the three of you are so creative!
Technically, the VBC (Alex) is certainly designed to help people make better decisions in many ways, so why do you think the results in this study didn’t show a great difference between the VBC and the website? Also, are there any future plans based on your current findings?