Case Study: The Meaningless Countdown Clock and Autonomy Support

A common best practice in user experience design is providing users with some kind of indicator of their progress on a task. Whether it’s the Domino’s pizza tracker alerting the hungry that their meal is on its way, a completion bar on a questionnaire letting the test-taker know how many sections remain, or a countdown clock alerting those waiting in line to their expected hold time, progress trackers help to support autonomy and sustain engagement in a task.

Recently I had a fairly upsetting experience trying to upgrade the software that I use for this site. I typically upgrade when prompted and had never had any issues until this one. My first attempt to upgrade brought the site down; when I tried to bring it back up through the hosting service I use, the site actually disappeared from the database. Well, shit.

Long story short, the hosting service was able to successfully restore my website in all its glory after a live web chat. I was very pleased with the service I received from the technician. However, there was a really irritating UX aspect to my live web chat experience.

When I first filled out the form requesting a chat, I was told my wait time was approximately 9 minutes. I spent the first few minutes puttering around my home and folding a bunch of laundry. I returned to my computer worried I had been away too long only to see there were 7 minutes remaining in the countdown. Weird. I decided to wait out the last 7 minutes at the computer. At some point, I again realized that time as I was experiencing it bore no relation to the “time” captured by the web chat software.

I decided to test it by turning on the timer on my phone during the last “minute” of my wait:

Online timer, meet smartphone timer. One of you represents actual time.

Eventually, the technician did enter the chat. When he did, my timer looked like this:

Over two minutes elapsed with just one minute left.

In this case, the company followed the best practice of having a progress meter of sorts telling customers how long their wait time would be. However, they made a few critical mistakes that meant their timer design caused me more frustration than delight:

The timing was not accurate. This is annoying to users because it tells them that the progress tracking is not actually meaningful: it doesn’t align with the metric it’s supposed to reflect. If the tracker bar isn’t meaningful, then it might as well not be part of the site. It certainly doesn’t support autonomy and user choice if it doesn’t provide accurate or meaningful information.

Yet the wording implies accuracy. Anyone who’s worked in customer service knows that it’s impossible to predict exact wait times. Many customers are understanding of this, especially if companies use words like “estimated” or “approximate” when describing wait times. But look at what this particular company said:

The unedited view

There’s nothing here about the wait time being approximate or an estimate. As a user, I’m not given the information I need to correctly interpret this tool as a loose guide rather than a promise of service in one minute.

The combination of implied accuracy and total lack thereof saps user autonomy. Providing an imprecise tool, with no guidance that it should be read as a very loose estimate of wait time, deprives users of the ability to use its information to modulate their behavior.
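For anyone building this kind of indicator, here is a rough sketch of how both mistakes could be avoided. It’s written in TypeScript purely as an illustration; the fetchQueueSnapshot function, the field names, and the polling interval are my own assumptions, not anything from this company’s software. The idea is simply to recompute the estimate from live queue data rather than running a countdown that bears no relation to reality, and to word the display as an estimate.

```typescript
// Hypothetical sketch: keep the displayed wait time tied to real queue data
// and word it as an estimate rather than a promise.

interface QueueSnapshot {
  positionInQueue: number;      // customers ahead of this one
  avgHandleTimeSeconds: number; // rolling average chat length, from real data
}

// Hypothetical call; a real implementation would hit the chat platform's
// queue-status API.
declare function fetchQueueSnapshot(): Promise<QueueSnapshot>;

function formatEstimate(seconds: number): string {
  const minutes = Math.max(1, Math.round(seconds / 60));
  // "Estimated" and "about" signal a loose guide, not service in exactly N minutes.
  return `Estimated wait: about ${minutes} minute${minutes === 1 ? "" : "s"}`;
}

async function updateWaitIndicator(element: HTMLElement): Promise<void> {
  const snapshot = await fetchQueueSnapshot();
  const estimatedSeconds = snapshot.positionInQueue * snapshot.avgHandleTimeSeconds;
  element.textContent = formatEstimate(estimatedSeconds);
}

// Refresh from real queue state every 30 seconds instead of decrementing
// a timer that has no connection to actual progress, e.g.:
// setInterval(() => updateWaitIndicator(document.getElementById("wait")!), 30_000);
```

Even a crude estimate like position in queue times average handle time, refreshed periodically and labeled as estimated, gives the user real information to act on: fold laundry, stay at the keyboard, or come back later.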

As I said, my story ended well and I was very happy with the actual service I received from the technicians at my hosting company. Yes, technicianS. Like the genius I am, I attempted to update my site again after the first try failed. The results were predictable. I went scurrying back to my hosting company for help.

My wait time was 13 minutes.