The best customer survey design tips, from Glasgow to the world!
Whether you’re conducting marketing research completely in-house, or you wish to do most of the research design yourself before outsourcing it, good customer survey design is essential.
The following Top 10 tips are by no means an exhaustive source of advice on the subject – plenty of books exist providing excellent instruction – but if you follow these rules you’ll definitely be on the right track. Good luck!
1. Build up rapport
How would you feel if a complete stranger suddenly stopped you and fired a salvo of questions at you about the goods and services you'd purchased, or your attitudes to sensitive topics? Would you feel embarrassed? Offended? Confused? Weary? All of the above?
It’s fair to say that if you’re an online sex-toy retailer conducting customer satisfaction research, your sample will consist of relatively open-minded and relaxed people, given the products they have already purchased. However, that does not mean it’s okay to immediately start probing respondents about the more private aspects of their lives.
Regardless of the data collection method, ensure there's an initial introduction, an explanation of the research and the general areas to be covered in the customer survey, and a thank you for their participation. Do this before asking any questions.
For extra reassurance, address confidentiality issues and respondents’ right to anonymity. Most respondents are happy to proceed on the explicit agreement that they will not receive a deluge of subsequent promo emails or direct mail based on their individual survey responses.
Always start with general questions before homing in on specific aspects; this allows the respondent to familiarise themselves with the survey process and increases their comfort level. Once the respondent has answered some of these, the survey can progress to more detailed issues. As a general rule, the more sensitive the questions, the further back in the survey they go. When you want to know about respondents' age, sex, region, orientation, socio-economic group etc. (known as classification data), this is nearly always asked close to the end. The idea is that by this time, with rapport already built up, respondents will be less likely to refuse such requests.
2. Laying down the law
There's a time and a place for rules and regs, and also for knowing when to back off. Although customer survey respondents very much have the upper hand (they can choose not to respond, or to exit halfway through a survey), that's not to say that some conditions can't be laid down – nicely.
Ensure respondents answer questions in the required way before moving to the next one. Without such controls, you may well find that, as the survey continues and questions become more detailed, the respondent base decreases. This obviously isn't good for data analysis.
If doubts exist about being seen to ‘force’ a response (especially regarding sensitive question areas), offer an alternative. Provide a ‘Prefer not to say’ option; anyone who doesn’t want to respond to that particular question but still wants to help with the rest of the survey should never be ejected. A missing response or two from an otherwise completed survey is far better than an incomplete survey – and premature exit – due to a respondent’s discomfort or dissatisfaction.
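In an online survey, these two rules boil down to one small validation check: reject blank or invalid answers (and re-prompt the respondent rather than eject them), but always accept the opt-out on sensitive questions. A minimal sketch in Python – the function and constant names here are illustrative, not taken from any real survey platform:

```python
# Hypothetical validation rule: an answer is required, but sensitive
# questions always accept a graceful opt-out instead of forcing a response.
PREFER_NOT_TO_SAY = "Prefer not to say"

def validate_response(answer, options, sensitive=False):
    """Return True if the answer is acceptable for this question."""
    allowed = set(options)
    if sensitive:
        # Never eject someone who declines a sensitive question.
        allowed.add(PREFER_NOT_TO_SAY)
    return answer in allowed

# A blank answer fails (re-prompt, don't eject); the opt-out passes
# on a sensitive question.
print(validate_response("", ["Yes", "No"], sensitive=True))                   # False
print(validate_response("Prefer not to say", ["Yes", "No"], sensitive=True))  # True
```

The point of the sketch: a missing answer blocks progress, but the opt-out counts as a complete answer, so the respondent carries on to the end of the survey.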
3. Watch your grammar!
This question appeared in a visitor survey on a UK sex toy retailer website:
Do you know if your partner owns a vibrator?
Those responding ‘Yes’ were immediately asked further questions about vibrators. So what’s wrong with it?
It’s misleading. Think about the subsequent findings, based on the original wording.
“X% of those questioned know if their partner owns a vibrator or not.”
It's hard to believe that the purpose of the question was really to ask respondents whether they know if their partner owns a vibrator.
The survey designers actually wanted to ascertain levels of ownership – something along the lines of "X% of respondents' partners own a vibrator" – far more commercially valuable data! But it hasn't come out that way at all. Whilst the questionnaire designer most likely wanted the survey to be friendly and engaging, in doing so they have potentially ruined the data through poor question phrasing.
Questions must be free from ambiguity, otherwise the survey – and the subsequent data – is open to response error, and being rendered useless for the purpose it was designed for.
The question should therefore be phrased simply as follows:
Does your partner own a vibrator?
4. Consider possible respondent groups and sub-groups when designing questions
Give serious consideration to how the research can be used to identify different respondent segments.
Let’s consider the previous question again. Rather than simply asking a ‘Yes/No’ question, why not offer several response options allowing respondents to answer the question, AND potentially identify some new user segments?
Change the question to:
How many vibrators does your partner own?
None
1
2
3
4
5 or more
By rewording the question and providing these options, the original question is still being answered, but now there is the opportunity to gain much greater insight. At the analysis stage the responses can be netted to provide (for instance) the following ownership segments: None, Light (1-2), Medium (3-4), and Heavy (5 or more). Can you see how invaluable this is, especially when analysing the rest of the data? You can now look at responses to all the other questions based on levels of ownership, and see whether any key findings are found along these lines.
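The netting step described above is trivial to express at the analysis stage. A sketch in Python – the bands follow the segments named above, and the function name is purely illustrative:

```python
# Net raw ownership counts into the analysis segments described above.
def ownership_segment(count):
    if count == 0:
        return "None"
    if count <= 2:
        return "Light"   # 1-2
    if count <= 4:
        return "Medium"  # 3-4
    return "Heavy"       # 5 or more

# Example: netting a handful of raw survey responses.
responses = [0, 1, 3, 5, 2]
print([ownership_segment(n) for n in responses])
# ['None', 'Light', 'Medium', 'Heavy', 'Light']
```

Once each respondent carries a segment label like this, cross-tabbing the rest of the survey data by ownership level is straightforward.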
This is a key reason for thinking carefully about the questionnaire design process and not rushing things. Questions must be phrased properly to both avoid response error and confusion AND to add as much value as possible to the analysis. Remember to get the most bang for your buck when it comes to your research!
One final point: you may now be sorely tempted to eradicate all Yes/No/Don’t know questions from your survey, replacing them with multiple response questions, accompanied by a raft of response options. And who can blame you, given their potential to add real weight to the analysis?
Employ common sense when considering the potential response options, and keep the number per question to a sensible limit. Furthermore, should you appoint a research agency to conduct the customer survey, be aware that many limit the number of response options allowed per question and may charge extra for additional ones.
Of course, there are questions where a Yes/No/Don’t know choice is either all that’s required, or is the only appropriate option. Generally, Yes/No/Don’t know questions can be asked about FACTS, but never about ATTITUDES.
5. Explain terminology concisely
Never assume respondents possess perfect knowledge. Some may not have heard of the topic, and others may have differing – and incorrect – perceptions of what the term describes. Either way, there's serious potential for dodgy data as a result.
To minimise the risk of response error, explain any terminology or concepts in a separate paragraph before asking subsequent questions related to it.
The next two questions concern the subject of ‘WIDGETS’. Widgets are defined as:
[insert statements or bullet points, etc.]
Based on the above definition, have you at any time used a widget?
Yes
No
Prefer not to say
A further point while we're on this subject: whilst providing definitions for participants is absolutely encouraged – to ensure respondents are all 'singing from the same song sheet' – there's a difference between explaining something and providing additional information about a topic on which an opinion is being sought. Most of the time, surveys seek respondents' existing views on a subject, regardless of the level of information they possess.
6. Eliminate ambiguity!
“Do you approve or disapprove of current UK legislation regarding widgets?”
The problem with the above wording is that differing reasons for selecting the same response option exist. Consider those disapproving: some respondents may feel the law doesn’t extend far enough, whilst others feel it’s too constraining. Same response, but two very different camps exist. Either re-word the question and response options, or follow up this question with another, asking respondents which side of the fence they belong to.
Consider also the next examples. If you’re running a U&A (usage and attitude) survey, for instance, it’s vital there’s no room for misinterpretation. Otherwise once more, the potential for dodgy data (and misguided decision-making arising from it) can be high.
“What do you watch movies on?”
This could refer to either a hardware device (DVD player, personal computer, mobile device) or the format/technology used (Blu-ray, DVD, VOD, videotape). Ensure that the phrasing of the question leaves no room for doubt.
“How many movies have you bought in the last year?”
Does this refer to a calendar year (January to December) or the last 12 months (April to April, for instance)? Be precise!
7. Don't force opinions
When looking at attitudinal questions in surveys – such as levels of agreement – response options vary. Consider the following examples.
"To what extent do you agree or disagree with the following statements…"
Strongly agree
Agree somewhat
Disagree somewhat
Disagree strongly
These response options force respondents to give either a positive or a negative response. What about those with neutral or indifferent attitudes? Herding respondents into a camp they don't feel they belong to will firstly result in incorrect data, and secondly cast doubt in respondents' minds as to whether their true attitudes and opinions are actually wanted. At this point, many respondents may exit the survey.
A better option is to provide a mid-point:
Strongly agree
Agree somewhat
Neither agree nor disagree
Disagree somewhat
Disagree strongly
However, there's a difference between having a neutral attitude towards something and not knowing what one's attitude is. Inserting a 'Don't know' option ensures that all conceivable bases are now covered:
Strongly agree
Agree somewhat
Neither agree nor disagree
Disagree somewhat
Disagree strongly
Don't know
8. Overall ratings come first
One of the most popular reasons for undertaking marketing research is to assess levels of customer satisfaction with the goods or services provided. Respondents can be asked to provide an overall satisfaction rating, ratings on individual aspects of the service or product, or a combination of both.
When seeking ratings on individual aspects as well as in overall terms, ask for the overall rating before moving to the more detailed areas. Two reasons exist for this. Firstly, in the vast majority of cases you're seeking people's initial assessment (respondents' first thoughts are nearly always the most truthful), rather than a view formed after they have had time to consider other aspects. Secondly, there's the risk of overall ratings being unduly influenced – perhaps negatively so – as a direct consequence of thinking about individual aspects in detail beforehand.
9. Keep ’em interested!
Survey respondents will be fairly happy to participate in research if they believe their responses will be taken on board and their relationship with you can potentially improve. But while the subject matter should keep respondents interested, there's more you can do to maximise the chances of survey completion.
Don't have an endless battery of grid-type questions (such as measuring agreement/disagreement) – this is one of the biggest causes of respondent non-completion, regardless of how interesting the subject matter is.
If you have to ask these questions, break the questions up to ease the monotony and maintain interest levels. One common method of alleviating this issue is to separate grid questions with a question or two using different question formats. Keep the number of statements to be rated in each grid to no more than ten, preferably less. If more are to be asked, then create another grid.
10. Keep scales consistent
You must be able to view a customer survey from a respondent's perspective as well as wearing your designer's hat. How taxing will your survey be to complete? Keep things simple and straightforward. For lengthy surveys featuring lots of grid questions, this means keeping scales consistent throughout the survey. Don't have one question offering response options of Strongly agree, Agree somewhat, Disagree somewhat, Disagree strongly, and others offering those options plus Don't know or Neither agree nor disagree. Each time this occurs, respondents have to mentally re-attune, requiring extra effort and time.
There you go: Red Pill’s Top Ten Tips for good customer survey design.
Would you like to know more about marketing research? Check out the Guide to the Marketing Research Process – a step-by-step guide to the key stages in a typical research project.
Are you a lingerie designer-seller or retailer? Then download the ‘Vital Statistics’ customer research guide – packed full of key question areas to be asking your customers, to really help your future marketing efforts. Download it now!
Remember, if you’re in Glasgow or further afield, don’t hesitate to get in touch to discuss how good customer surveys can help your business not only survive but thrive.