How do you build technology that people trust?
The world gives us so many examples of why we shouldn’t trust technology. Many Americans recently had their personal financial data put at risk by Equifax. It’s looking increasingly likely that Facebook deliberately shaped people’s information exposure in ways that influenced a presidential election. And there are reports that hackers can hijack connected home devices with high-frequency voice commands inaudible to human ears. Yet we persist in creating digital solutions for health, finance, and other incredibly personal topics, and we ask people to trust them–to trust us.
As designers, there are ways we can make our products more trustworthy. I’m going to set aside what is probably the most important one–deserving people’s trust by treating their data fairly and being honest and forthcoming about how you plan to use it–because if you’re not already doing that, none of the rest of this matters.
Here are three factors you can tinker with to help people trust your product:
Give it a personality. In one study, researchers found that people were more trusting of autonomous vehicles the more the navigation system was perceived as human-like (Waytz, Heafner, & Epley, 2014). We’re psychologically built to trust people over things. They may not be the most fully fleshed-out characters, but people will chat with Siri and Alexa because they seem enough like a human that our minds can fill in the gaps. Can you make your product seem more human?
Ironically, the fact that a digital tool isn’t human may sometimes make people trust it over their real live doctor. One study found people are more likely to disclose embarrassing health information to a “virtual human” when they believed it was totally automated than when they believed it was piloted by a real person (Lucas, Gratch, King, & Morency, 2014). Sharing embarrassing information may be easier if there’s no fear of (human) judgment. So perhaps the trick is making your technology just human enough, but no more.
Show value quickly. “Demonstrate value” is not just the first step in the D.E.N.N.I.S. System; it’s also a good way to get people to trust your technology more. A study of TripAdvisor found that people trust the reviews that provide high-quality information (Filieri, Alguezaui, & McLeay, 2015). Giving people value quickly will earn their trust. Can you deliver your users an insight early in your onboarding?
Give people (perceived) control. A major reason people distrust tech tools is they don’t feel in control of their experience. Offer people opportunities to shape how they use technology (hello, Facebook newsfeed). With automated processes, allowing users to tinker with algorithms increases their trust in them (Dietvorst, Simmons, & Massey, 2016), even if the users’ customizations ultimately make the algorithm worse. At the very least, transparency around algorithms helps improve trust (Verberne, Ham, & Midden, 2012). How can you open your process to users and help them feel more in control of their experience?
True trust is the product of an ongoing relationship, not a design trick. These examples show how product designers can foster more trust from users through their design choices, but they’re not a substitute for treating users’ data and time respectfully over the lifespan of your product.
References:
Dietvorst, B. J., Simmons, J. P., & Massey, C. (2016). Overcoming algorithm aversion: People will use imperfect algorithms if they can (even slightly) modify them. Management Science. https://doi.org/10.1287/mnsc.2016.2643
Filieri, R., Alguezaui, S., & McLeay, F. (2015). Why do travelers trust TripAdvisor? Antecedents of trust towards consumer-generated media and its influence on recommendation adoption and word of mouth. Tourism Management, 51, 174-185.
Lucas, G. M., Gratch, J., King, A., & Morency, L. P. (2014). It’s only a computer: Virtual humans increase willingness to disclose. Computers in Human Behavior, 37, 94-100.
Verberne, F. M., Ham, J., & Midden, C. J. (2012). Trust in smart systems sharing driving goals and giving information to increase trustworthiness and acceptability of smart systems in cars. Human Factors: The Journal of the Human Factors and Ergonomics Society, 54(5), 799-810.
Waytz, A., Heafner, J., & Epley, N. (2014). The mind in the machine: Anthropomorphism increases trust in an autonomous vehicle. Journal of Experimental Social Psychology, 52, 113-117.