People lie on surveys and focus groups, often unwittingly

Raymond Chen

Philip Su’s discussion of early versions of Microsoft Money triggered a topic that had been sitting in the back of my mind for a while: That people lie on surveys and focus groups, often unwittingly. I can think of three types of lies offhand. (I’m not counting malicious lying; that is, intentional lying for the purpose of undermining the results of the survey or focus group.)

First, people lie about the reasons why they do things.

The majority of consumers who buy computers claim that personal finance management is one of the top three reasons they are purchasing a PC. They’ve been claiming this for more than a decade. But only somewhere around 2% of consumers end up using a personal finance manager.

This is one of those unconscious lies. People claim that they want a computer to do their personal finances, to organize their recipes, to mail-merge their Christmas card labels.

They are lying.

Those are the things people wish they would use their computer for. That’s before the reality of how much work it is hits them: tracking every expenditure, transcribing every recipe, typing in every address. In reality, they end up using the computer to play video games, surf the web, and email jokes to each other.

Just because people say they would do something doesn’t mean they will. That leads to the second class of focus group lying: The polite lie, also known as “say what the sponsor wants to hear”.

The following story is true, but the names have been changed.

A company conducted focus groups for their Product X, which had as its main competitor Product Q. They asked people who were using Product Q, “Why do you use Product Q instead of Product X?” The respondents gave their reasons: “Because Product Q has feature F,” “Because Product Q performs G faster,” “Because Product Q lets me do activity H.” They added, “If Product X did all that and was cheaper, we’d switch to it.”

Armed with this valuable insight, the company expended time, effort, and money in adding feature F to Product X, making Product X do G faster, and adding the ability to do activity H. They lowered the price and sat back and waited for the customers to beat a path to their door.

But the customers didn’t come.

Why not?

Because the customers were lying. In reality, they had no intention of switching from Product Q to Product X at all. They grew up with Product Q, they were used to the way Product Q worked, they simply liked Product Q. Product Q had what in the hot bubble days was called “mindshare”, but what in older days was called “brand loyalty” or just “inertia”.

When asked to justify why they preferred Product Q, the people in the focus group couldn’t say, “I don’t know; I just like it.” That would be perceived as an “unhelpful” answer, and besides, it would be subconsciously admitting that they were being manipulated by Product Q’s marketing! Instead, they made up reasons to justify their preference to themselves and consequently to the sponsor of the focus group.

Result: Company wastes tremendous effort on the wrong thing.

(Closely related to this is the phenomenon of saying—and even believing—“I’d pay ten bucks for that!” Yet when the opportunity arises to buy it for $10, you decline. I do this myself.)

The third example of lying that occurred to me is the one where you don’t even realize that you are contradicting yourself. My favorite example of this was a poll on the subject of congestion charging on highways in the United States. The idea behind congestion charging is to create a toll road and vary the cost of driving on the road depending on how heavy traffic is. Respondents were asked two questions:

  1. “If congestion charging were implemented in your area, do you think it would reduce traffic congestion?”
  2. “If congestion charging were implemented in your area, would you be less likely to drive during peak traffic hours?”

Surprisingly, most people answered “No” to the first question and “Yes” to the second. But if you stop and think about it, if people avoid driving during peak traffic hours, then congestion would be reduced because there would be fewer cars on the road. An answer of “Yes” to the second question logically implies an answer of “Yes” to the first question.

(One may be able to explain this by arguing, “Well, sure congestion charging would be effective for influencing my driving behavior, but I don’t see how it would affect enough other people to make it worthwhile. I’m special.” Sort of like how most people rate themselves as above-average drivers.)

What I believe happened was that people reacted by saying to themselves, “I am opposed to congestion charging,” and concluding, “Therefore, I must do what I can to prevent it from happening.” Proclaiming on surveys that it would never work is one way of accomplishing this.

When I shared my brilliant theories with some of my colleagues, one of them, a program manager on the Office team, added his own observation (which I have edited slightly):

A variation of two of the above observations that often shows up in the usability lab:

A user has spent an hour battling with the software. At some point the user’s expectation of how the software should behave (the “user model”) diverged from the actual behavior. Consequently, the user can no longer predict what will happen next and is therefore having a horrible time making any progress on the task. (Usually, this is the fault of the software design unintentionally misleading the user—which is why we test things.) After many painful attempts, the user finally succeeds, gets hints, or is flat-out told how the feature works. Often, the user stares mutely at the monitor for five seconds, then says: “I suppose that makes sense.”

It’s an odd combination of wanting to give a helpful answer and wanting to feel special. In this case, the user wants to say something nice about software that any outside observer could clearly tell was broken. Additionally, the user (subconsciously) doesn’t want to admit to being wrong and not understanding the software.

Usability participants also have a tendency to say “I’m being stupid” when those of us on the other side of the one-way glass are screaming “No you’re not, the software is broken!” That’s an interesting contrast—in some cases, pleading ignorance is a defense. In other cases, pleading mastery is. At the end of the day, you must ignore what the user said and base any conclusions on what they did.

I’m sure there are other ways people subconsciously lie on surveys and focus groups, but those are the ones that came to mind.
