Most of the data I work with is self-report, provided by a user to a database via a device like a computer or a mobile phone. No live counselor or coach processes that information before it’s crunched in the database and appropriate content selected for the user to read.
There are drawbacks to this method, to be sure. We don’t have the luxury of interpreting non-verbal cues like facial expression or tone of voice that could give nuance to a user’s words. We can’t be as sensitive about follow-up questions as we would be in a live conversation, since any follow-ups and their associated skip logic are pre-written. And we don’t give users an opportunity to add color commentary, which leads to occasional frustrated feedback from users who really want to explain the specific circumstances around their health.
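To make the "pre-written skip logic" limitation concrete, here is a minimal sketch of how such branching might be wired up. The question IDs, thresholds, and answers are all invented for illustration; a real intervention would define these in its own survey engine.

```python
# Hypothetical sketch of pre-written follow-up questions with skip logic.
# Every branch is fixed in advance, which is why the system cannot
# improvise a gentler or more specific follow-up the way a live
# counselor could.

QUESTIONS = {
    "drinks_per_week": {
        "text": "How many alcoholic beverages do you have in a typical week?",
        # Follow-up fires only when the answer crosses a pre-set threshold.
        "follow_ups": lambda answer: ["drinking_context"] if answer >= 8 else [],
    },
    "drinking_context": {
        "text": "In what situations do you usually drink?",
        "follow_ups": lambda answer: [],  # terminal question, no branches
    },
}

def next_questions(question_id, answer):
    """Return the pre-written follow-up IDs triggered by an answer."""
    return QUESTIONS[question_id]["follow_ups"](answer)
```

For example, `next_questions("drinks_per_week", 10)` would route the user to the context question, while an answer of 2 would end the branch; no human judgment enters the routing.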
One drawback this type of data suffers from far less than people might expect, though, is a lack of veracity. Surprisingly, when talking to a computer, users don’t lie.
In my work, we’ve often found that when we compare self-reported health data to verified sources like medical claims or health records, people are generally pretty honest with what they tell the computer. Now a new report confirms that people are not only fairly honest in self-report, but may actually be more comfortable telling embarrassing health information to a computer than to a person.
As a psychologist, I think this makes total sense. There are many factors that would make it more difficult to tell a human being embarrassing health information than a computer, such as:
- Social desirability. We know what the “right” behavior is and don’t want to admit to doing the wrong behavior. As a result, it’s difficult to confess to a pizza addiction or to exceeding the recommended number of alcoholic beverages several times per week.
- Feeling judged. This is a slight variation on the above. We want to be liked and admired, and when we have to tell a live human being something less than savory about ourselves, it can be hard. So sometimes we lie, or fudge the truth, hoping to look a little better in the eyes of the doctor.
- Not wanting “the talk.” We have seen a similar reaction doing user testing for online programs. People with health issues, especially ones that aren’t new, often know what they should be doing; they’re just not doing it for some reason. Hearing the speech about what they should change feels tiresome and unwanted, and so a person might fail to disclose information that would prompt it.
- Wanting private information to stay private. Doctors are human beings, and human beings are social. Patients may worry (and sometimes, unfortunately, with good reason) that their embarrassing confessions could become water cooler conversation. An online program ostensibly offers more privacy than a live human being since it’s far less likely to get a little tipsy at happy hour and share a funny story about a patient.
The last point, about privacy, also ties into why it is so important that the infrastructure around your web intervention be well organized and well communicated. Terms of service and privacy agreements are critical. These documents are notoriously long and impractical for the average user to sift through, and it may not be possible to radically remodel them in the short term while maintaining appropriate legal protection. However, especially after conducting many rounds of user testing, I think it’s important to create a condensed, user-friendly “cheat sheet” for the terms of service and especially the data-privacy practices of any online intervention. The questions it should answer include:
- Who will see my data? (Our users often worry their physicians or HR managers will have access to individual data. They do not. This needs to be made clear to users.)
- What are you going to do with my data? (We create summary reports for customers describing their population’s health behaviors. We also do data analysis for research. And it’s worth emphasizing here again that we never identify any individual user within the database.)
- Where else are you getting data about me? (We might link to another data source like an EMR, or we may receive data from a different online program or tool. Users shouldn’t ever be surprised that we know something about them, and they should have explicitly agreed to have those pieces of data linked.)
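The last point, that users should have explicitly agreed before outside data is linked to them, can be enforced as a hard rule in the system itself. The sketch below is hypothetical: the source names and record structure are invented, and the point is simply that an external feed (say, an EMR) is rejected outright unless the user consented to that specific link.

```python
# Hypothetical sketch of consent-gated data linkage. A user's record
# tracks which external sources they explicitly agreed to link; anything
# else is refused, so the user is never surprised by what we know.

def link_external_data(user_record, source_name, incoming_data):
    """Merge incoming_data into user_record only with explicit consent."""
    if source_name not in user_record.get("consented_sources", []):
        # No surprise data: unconsented sources fail loudly rather
        # than being silently merged.
        raise PermissionError(f"User has not consented to linking {source_name}")
    user_record.setdefault("linked_data", {})[source_name] = incoming_data
    return user_record
```

Making consent a precondition of the merge, rather than a checkbox stored elsewhere, means a forgotten agreement shows up as an immediate error instead of a quiet privacy violation.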
In the USC research, participants’ willingness to disclose information was directly related to their belief that their information would be private and protected. If we want to truly realize the value that online coaching interventions could bring to users, it’s critically important that we examine the often taken-for-granted infrastructure around them and revise it to give users a better level of education, understanding, and comfort.