
The impact of 'speeders' in online surveys

‘Speeders’ are respondents who complete a survey in what is perceived to be an impossibly quick time. How these speeders affect data quality is often debated.

It has always been my view that speeders should be removed from the data; how can a respondent who completes a survey too quickly be engaged with the topic and give valid, insightful responses? But does this only apply to an incentivised study? Or to a study using panellists who have been ‘professionalised’ by their tenure? Typically, there are measures in place to stop speeders, or at least to make them easy to spot, but do speeders really present a danger to data integrity?


As curious researchers do, I conducted a mini case study into this idea. I looked into several projects and analysed the answers provided by the speeders, with the hypothesis that their answers would be inconsistent. I was intrigued as to what I would find and whether it would change my view. Would the responses be well thought through? Or would I find the respondents had just skipped through to the ‘submit’ button?

A few caveats to bear in mind:

-       The surveys I used in my case study were not incentivised

-       Respondents were not forced to answer any question

-       Surveys were all CAWI (computer-assisted web interviewing), completed on a PC rather than a mobile device

-       Surveys all used databases supplied by the client

Firstly, I needed to understand what a speeder looked like – how quick is too quick? As the length of interview (LOI) differs across surveys, I took the average LOI for each survey and analysed respondents who had completed it in less than a third of that time – so for a survey with an average LOI of 15 minutes, I analysed all respondents who completed the survey in under 5 minutes.
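For anyone wanting to apply the same cut-off to their own data, here is a minimal sketch in Python using pandas. The column names and durations are hypothetical, and the one-third threshold is simply the rule of thumb described above:

```python
import pandas as pd

# Hypothetical response file: one row per respondent, with the time
# they took to complete the survey in minutes.
df = pd.DataFrame({
    "respondent_id": [101, 102, 103, 104, 105],
    "duration_minutes": [14.2, 16.5, 4.1, 12.8, 3.2],
})

# A speeder is anyone who finished in less than a third of the average LOI.
threshold = df["duration_minutes"].mean() / 3
df["is_speeder"] = df["duration_minutes"] < threshold

print(df[df["is_speeder"]])  # respondents flagged for closer examination
```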

This revealed some interesting findings:

1.)    Yes, one respondent hadn’t answered any questions and had skipped ahead to the submit button. I happily removed this person from my data.

2.)    The remaining respondents had answered most questions, and their answers were consistent and insightful; I even had relevant responses in open text boxes, which I wasn’t expecting.

3.)    Some questions had been skipped, particularly towards the end.

4.)    There was no straight-lining (when respondents click the same response each time, e.g. the first response option in a scale or all ‘strongly agree’ in a grid – see the sketch after this list).
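A straight-lining check of this kind takes only a few lines of pandas. This is a rough illustration; the grid column names and answer codes below are hypothetical:

```python
import pandas as pd

# Hypothetical agreement grid: one column per statement, coded 1–5.
grid_cols = ["q5_1", "q5_2", "q5_3", "q5_4"]
df = pd.DataFrame({
    "q5_1": [5, 3, 1],
    "q5_2": [5, 4, 1],
    "q5_3": [5, 2, 1],
    "q5_4": [5, 4, 1],
})

# A respondent straight-lines a grid when every statement gets the same
# answer, i.e. the row has exactly one unique value across the grid.
df["straight_lined"] = df[grid_cols].nunique(axis=1) == 1
print(df[df["straight_lined"]])
```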

Reasons for skipping through surveys are varied, but they all link to a lack of engagement with the survey itself – either the topic or the design. For example, the survey may be full of long grid questions or repetitive questions, or the question wording may be long and difficult to read. Respondents want to give feedback, but if your survey is too long, too boring or too confusing they will close out or race to the end.

From my case study I learnt that I am right to be wary of speeders, and as part of data cleaning, speeders should always be identified and flagged for examination. However, it is not necessary to remove all of them if, as I found, their answers are consistent and relevant. Respondent attentiveness can diminish throughout a survey, so it is important to assess speeders in a variety of ways. Checking interview duration, checking whether responses are consistent and relevant, checking open-ended responses (if opens are enforced, do the answers make sense?) and checking for straight-lining are the best ways of identifying a disingenuous respondent.
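Pulled together, those checks amount to a flag-for-review step rather than an automatic cull. Here is one way it might look in pandas – a sketch only, with hypothetical column names and an arbitrary minimum length for open ends:

```python
import pandas as pd

def flag_for_review(df: pd.DataFrame, grid_cols: list) -> pd.DataFrame:
    """Flag suspect respondents for a researcher to examine, not to drop."""
    out = df.copy()

    # Speed check: under a third of the average interview duration.
    out["too_fast"] = out["duration_minutes"] < out["duration_minutes"].mean() / 3

    # Straight-lining check: one unique answer across a whole grid.
    out["straight_lined"] = out[grid_cols].nunique(axis=1) == 1

    # Open-end check: very short answers rarely carry real meaning
    # (the 3-character minimum here is an arbitrary choice).
    out["empty_open_end"] = out["open_end"].fillna("").str.strip().str.len() < 3

    out["review"] = out[["too_fast", "straight_lined", "empty_open_end"]].any(axis=1)
    return out
```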

By Jenny Tipler, Senior Research Executive