The quality debate dominates the online market research industry. Much of the focus is on the behaviour of professional or untrustworthy survey respondents, but what about our responsibilities to provide respondents with a decent survey experience?
As a panel provider, we host and offer sample for hundreds of thousands of surveys every year, as well as collecting survey satisfaction data from our panel members for each of the surveys they complete. Through reviewing this data, we have been able to identify the survey design features that drive satisfaction and dissatisfaction, and to assess the long-term impact on panel retention.
Once the survey is in the field, how can we hold the respondent's interest? Our data show that survey length and reward alone do not determine satisfaction: the design of the survey itself can have a positive or negative impact on the panellist's experience and on their continued active involvement in the panel.
So what makes a survey experience negative? Panellists want to give honest answers and dislike being forced to choose an option that does not accurately reflect their views. It is therefore very important to provide options such as 'Other', 'Neither agree nor disagree' and 'Don't know', so that the respondent is not pushed into giving a false opinion.
Another frequent complaint is repetitive question sets, in which the same questions are asked again and again for different products or services, making it difficult for the panellist to stay interested in the topic.
Of course, these questions often need to be asked, so how do we ask them without losing the panellist's interest? We have had great success making such questions interactive using Flash and rich media. A progress bar also creates a positive experience (as long as the survey isn't too long): the respondent can track how far they have got and form a realistic expectation of how long the study will take to complete.
Clarity is another common complaint. Although this can be an issue where translations are required, more complaints relate to the wording of questions and to respondents' understanding of what is required of them. Online surveys lack an interviewer to explain questions to the participant, so clear instructions are crucial. To ensure that the questions and instructions are clear and relevant to the panellist, we recommend rigorous beta- or pre-testing of the survey.
This all sounds pretty simple, right? It is, but when timelines are under pressure these simple rules are often overlooked. Many of the surveys we see need to get back to basics in their questionnaire design, so, when writing a survey, remember the following: respect and trust respondents, who want to give their honest opinions; keep the questions clear and uncomplicated; use new techniques and innovation for continued engagement; and ask your online fieldwork agency to advise on proven solutions.
The rise of 'self-service' questionnaires has also stimulated plenty of debate in the research industry, and in my opinion this analogy sums it up: DIY is all well and good, but just because power tools are available at lower cost doesn't mean that anyone is qualified to use them.
The 'DIY' research phenomenon also poses a real threat to data quality, with unskilled questionnaire writers exposing respondents to badly designed and badly worded scripts. As they say, 'garbage in, garbage out'.
Panel companies often know best what works when surveying their panel, and they also have a responsibility to act as gatekeepers and look after the respondent experience, so don't be afraid to ask your panel provider for input. If the client, researcher and online panel owner work together, we can ensure the best possible survey experience.
Chris Dubreuil is vice-president of client development, UK, at Research Now. Contact him at email@example.com.