Why surveys should always be piloted
This morning I completed an almost incomprehensible marketing survey. Here’s an example of one of the questions.
Like all of the other questions in the survey, it had to be answered before I could proceed to the next one. There’s a fixed range of answers to select from, with nowhere to indicate that I didn’t understand the question. Most of the questions were like this, so my best guess is that YouGov’s client will end up with statistical noise and a sprinkling of confirmation bias.
My suspicion is that the survey wasn’t piloted with its target audience before release. If it had been, simple ambiguities (does 2030 mean half past eight tonight, or is it something due to happen in 14 years?) would have been picked up, questions would have been rephrased to make them comprehensible to the layperson, and the ability to answer ‘don’t know/don’t understand’ would have been provided.
But even if such changes had been made, it’s doubtful that anything insightful would result from the survey. The client would have been far better served by a qualitative research method to explore such hypothetical questions. A good first question would be to ask respondents to define a luxury brand, rather than assuming that the client, YouGov and the survey’s audience all share the same perspective. As it stands, they’re likely to get some nice charts with average scores to a couple of decimal places, but little insight into what consumers really think.