Usable but not understood: a case study on chatbots and HIPAA

Daniel Tkacik

Jul 21, 2022


A new study by CyLab Ph.D. student Sarah Pearman looks at optimal designs for an online HIPAA consent process.

Meet Dani. Dani tripped and hurt her ankle while she was hiking and might need an X-ray. Worried about the cost, Dani logs in to the website of her health insurer, “HealthCo,” and clicks on a chatbot to find out how much the X-ray might cost her under her specific insurance plan.

But before the chatbot does anything else, it presents her with a Health Insurance Portability and Accountability Act (HIPAA) authorization document that she must read and agree to before she can ask the bot her questions. That’s because the bot needs Dani’s consent before sending any of her health data to the cloud.

This is a big challenge, says CyLab’s Sarah Pearman, as studies have shown people are unlikely to read these authorization documents because they are typically long, jargon-filled, and confusing.

“The healthcare context adds even more pressure,” says Pearman, a Ph.D. student in Societal Computing. “You’re even less likely to read something like this when you are in pain and trying to seek healthcare in a timely fashion.”


At last week’s Privacy Enhancing Technologies Symposium, Pearman presented a new study: a case study on the redesign of an online HIPAA authorization.

In the study, the researchers iterated on the original consent flow design based on user feedback, aiming to improve both its usability and how well users understood it, and incorporated known best practices for notice and consent as they redesigned the prototype.

In all versions of the flow, users first logged in to the “HealthCo” website, clicked on the chatbot icon, and then clicked “Let’s chat.” They were then presented with the HIPAA authorization step, which they had to complete before conversing with the chatbot. Pearman’s team designed three versions of this step, each aiming to be both usable and understandable, and then surveyed over 700 participants to compare the three versions of the consent flow.

The final version of their consent flow performed well in terms of usability—roughly 90 percent of users found it easy to access the chatbot—but gains in understanding were limited until people were prompted to read the authorization a second time. After re-reading it, participants were generally less willing to use the chatbot.

“When they re-read, they learned something new—that Google Cloud was getting their data—and they were not happy about it and no longer wanted to use the thing that they had previously agreed to use,” says Pearman. “That is important to know.”

What’s more, users tended to overestimate HIPAA’s power to prevent disclosures of their health data. For example, over 60 percent believed HIPAA prevents any person or company from sharing their health information—but it doesn’t.

“HIPAA does set rules about the sharing of healthcare information, but it doesn't apply to everyone and every entity,” says Pearman. “It applies to healthcare providers, health insurance companies, and other adjacent entities that interact with healthcare.”

Pearman hopes that the privacy technology community can help create technologies that make this kind of decision easier for users.

“If you are using a chatbot, or a voice interface like a smart speaker, perhaps that conversational interface should be able to just talk to you directly,” she says. “They could explain privacy details, answer privacy questions, and walk you through a consent process in a more natural way.”

Paper reference

User-friendly yet rarely read: A case study on the redesign of an online HIPAA authorization

  • Sarah Pearman, Carnegie Mellon University
  • Ellie Young, New College of Florida / Carnegie Mellon University
  • Lorrie Cranor, Carnegie Mellon University