We’ve recently had a perfect example of what’s wrong with surveys. They’re so sensitive to data samples, question details, and other factors that they are most useful only when taken in a very narrow context.
Take employment numbers, for example. The ADP Employment Report has small businesses losing 170,000 jobs in January. But the SurePayroll scorecard reported that small businesses actually increased jobs by 0.3% in January.
Who’s right? That’s not the point. Probably both. The difference could be explained by different samples, different wording of the questions, or different definitions. Given the statistics issued by the federal government, I’d suspect that ADP is closer.
More important, however, is how we use survey data in business. Way too often we assume that if it’s the result of market research, then it’s data, and we can trust it. Way too often we go against our better judgment because we have data.
I’m not saying surveys aren’t useful. Ironically, I’m doing a survey of the small business credit crunch right now, as I write this, on the Huffington Post. What I am saying is that surveys should be handled with care because they have a dangerous way of giving us a shortcut for dealing with problems we’d otherwise address with judgment and common sense.
What a select group of people say about something can be useful. But it isn’t necessarily truth.
“Way too often we assume that if it’s the result of market research, then it’s data, and we can trust it.”
In terms of “bad” research, this rings so true. Initially, the word “bad” connotes a mistake. If that is the case, then common sense tells us to find the cause of the mistake so that it can be avoided the next time.
However, it is definitely foolhardy to move forward with bad data gathered from research into calculations, strategizing, and decision-making. In that instance, no research is superior to bad research.
Thanks for sharing this and your other posts. I am learning a lot!