FEATURE | July 2010

This month we... became a fake respondent

Robert Bain goes undercover and signs up to a raft of online panels. He vows to fill in all the surveys. He regrets it.

Are online respondents who they say they are? Do they fill in surveys properly? Are the surveys any good? The best way to find out, we decided, was to sign up to a bunch of panels ourselves. For a month, I would become a fake respondent.

To start with I created a fake name and email address, and decided to give my occupation as IT manager to make sure I didn’t get rejected for being a journalist in the MR industry. Apart from that I would be as honest as the surveys allowed me to be. Obviously there is nothing remotely scientific about this report – it’s just an account of one person’s experience – so to prevent anyone leaping to unfounded conclusions we have chosen not to name the companies whose surveys we did. Suffice to say that if you work in research you know them all. We should also state that we did not redeem any of the incentives, even though I earned enough cash to buy a pint and a half down the local.

Day 1
I began my adventure by spending an afternoon going to the sites of all the panels I could think of and signing up. I tried for eleven but only got into ten – the last one wanted US residents only. Once you’ve joined, the first thing you have to do is fill in lots of profiling questionnaires about your life and interests – like a cross between a Facebook profile and a tax return. For one company I had to do 15 of these, with grids of up to 90 tick boxes, while another made me answer 11 questions about a car I don’t own. It’s going to be a tough month.

Day 2
I completed four surveys today and got invites for many more. One of them made me promise that I’d “read each question thoroughly and respond to each question thoughtfully and honestly”. I felt slightly affronted that it was me who had to promise to behave, rather than the survey, but it didn’t matter as I soon got kicked out anyway. After lunch I started on another, which reached 50% on the completion bar before kicking me out on the grounds that it was “full”. Do I spot a pattern?

Day 3
The day begins with the oddest question yet: “Please select hamster from the list below: Cat, dog, bird, hamster, mouse.” No explanation is offered.

Day 4
Received an email this morning from one of the firms I signed up with on Day 1, asking me some rather probing screening questions. Am I the only person who uses the email account where the invites get sent? Why did I join the panel? How many online surveys do I do in a month? (I say 30, although with 27 days to go that could be an underestimate). We’ll see if they send me any more surveys now I’ve admitted being a panel junkie.

Day 9
I return from the weekend to find 18 invites in my inbox. Is that a lot? There’s a nice survey about an online drinks promotion, then another about YouTube that uses a slick interface to mimic the site. Then things take a turn for the worse: I share my feelings about what sort of car rental services I might use, for what purpose and in what areas, only to be told that the survey in question is full. The same thing happens in the next survey. When I finally find one that accepts me, it asks me to rate statements including “I look for a domestic appliance to be my ally” and “I want a funny and pleasant to use domestic appliance”. If these surveys were people, I wonder, what sort of people would they be?

Day 10
Three surveys kicked me out this morning after I had opened my heart to them about crisps and other personal matters. Have I been spotted as a fake? If so, why do they let me begin the questionnaires?

Day 11
A word of warning: not all the surveys billed as an interesting opportunity to express my point of view live up to the claim. In my book, comparing different combinations of different soft drinks at different prices does not count as an interesting opportunity to express my point of view.

Day 17
My stamina is failing and I’m checking my survey invites less often. Still, the one I clicked through to today was unusually pretty. Each row of the answer grid only appeared after I’d completed the row above, so I was always on the bottom row. A nice, effective design. Then it kicked me out.

Day 22
A revelation this morning: a survey that explained its screening procedure and warned me that it couldn’t guarantee I’d qualify. As a result I’m not nearly so annoyed when I get kicked out a few moments later. Later, a survey being run by a major MR agency breaks the record for the number of radio buttons: 1,302 in a 217×6 grid. Ouch.

Day 31
With just minutes to go before the month is done, I click through to one last survey and begin to fill it in. It turns out to be a huge forest of buttons, and after 20 minutes I give up without finishing. What sweet relief.

Out of the shadows
In 31 days my fake respondent persona received 150 email invitations, clicked through to 99 surveys, started 73, got kicked out of 39, completed 30, crashed out of three and gave up on one.

So what did I learn? Whether I was spotted as an imposter is unclear. I received and completed surveys from half of the ten firms I signed up with, but I was screened out of more than half of the surveys I began, usually after answering demographic questions, and often after questions about my behaviour and opinions too. Whether they realised I was a fake or a duplicate, or whether they were just rejecting me on my answers, I don’t know.

I was asked about my survey-taking habits four times. Two companies asked me when I signed up if I was a member of other panels. After I told them I was a member of quite a few, the first of them sent me no more invitations, but the second continued to invite me to surveys (including studies being run by the first).

Two surveys from another company screened me out after I told them how many studies I’d taken part in recently – but the same company had already let me do a number of surveys without checking.

Doing lots of surveys for a month isn’t really enough to judge companies on their panel management, but it is enough to judge the general quality of online surveys. The first thing that struck me was that their design is pretty poor, in terms of both usability and creativity. Some surveys are slick and pleasant to navigate, but most aren’t. In the worst cases a combination of clunky systems, sloppy presentation, haughty instructions and awkward layout can create a feeling of disrespect, even rudeness, towards the user.

The way surveys are managed is often clunky too. Links remained active on panel websites for surveys that were closed or for which, according to what the company already knew about me, I was not eligible.

The second thing I noticed was that surveys are not good at being upfront and honest with the user. The single most common outcome of starting a survey (and the single biggest annoyance) was that I would spend time working through questions only to be told I wasn’t eligible. Was I being rejected for being fake? Perhaps in the cases where I was blocked right at the start, but in many instances I was allowed to get well into the survey before being turfed out without explanation, and denied the incentive.

The third point is that surveys often provide little room for the truth, forcing respondents to feign opinions when the only sensible answer is Don’t know / Don’t care. Apart from being frustrating for the survey taker (I’d rather not be asked my opinion than be asked and be unable to answer), this will surely leave researchers with data that looks complete but isn’t. Everything we hear about marketing in 2010 suggests that indifference to brands and brand messages needs to be understood, not underestimated. Most online surveys aren’t helping.

Doing several surveys a day for a month would make anyone sick of them, so my feelings may be exaggerated compared to those of a typical panel member. But if the MR industry’s aim is for respondents to be wanting rather than just willing to do surveys, my experience over the past month tells me it’s got work to do.