A recent survey conducted by the Committee to Determine the Intelligence of Marketers (CDIM), an independent think tank in Princeton, NJ, found that:
-- 4 out of 5 respondents feel that marketing is a "dead" profession.
-- 60% reported having little, if any, respect for the quality of marketing programs today.
-- Fully 75% of those responding would rather be poked straight in the eye with a sharp stick than be forced to work in a marketing department.
-- In total, the survey panel reported a mean response of 27% when asked, "On a scale of 0% to 100%, how many marketers suck?"
This has been a test of the emergency BS system. Had this been a real, scientifically based survey, you would have been instructed where to find the nearest bridge to jump off.
Actually, it was a "real" "survey." I found five teenagers loitering around a casual restaurant chain at a local shopping mall and asked them a few questions. Sound valid?
Of course not. But this one was OBVIOUS. Every day we marketers are bamboozled by far more subtle "surveys" and "research projects" which purport to uncover significant insights into what CEOs, CFOs, CMOs, and consumers think, believe, and do. Their headlines are written to grab attention:
-- 34% of marketers see budgets cut.
-- 71% of consumers prefer leading brands when shopping for ___.
And my personal favorite:
-- 38% of marketers report significant progress in measuring marketing ROI, up 4% from last year.
Who are these "marketers"? Are they representative of any specific group? Do they have anything in common except the word "marketing" on their business cards?
Inevitably, such surveys blend convenience samples (i.e., whoever is willing to respond), mixing the very biggest billion-dollar-plus marketers with the smallest $100K-a-year budgeteers. They lump those with advanced degrees and 20 years of experience in with those who were transferred into a field marketing job last week because they weren't cutting it in sales. They commingle packaged-goods marketers with people selling industrial coatings and others providing mobile dog grooming.
If you look closely, the questions are often constructed in somewhat leading ways, and the inferences drawn from the results conveniently ignore the statistical error factors that frequently wash away any actual findings whatsoever. There is also a strong tendency to draw conclusions year-over-year when the only thing in common from one year to the next was the survey sponsor.
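Those error factors are easy to check yourself. A back-of-envelope sketch in Python (using made-up numbers: a 38% finding from a hypothetical sample of 150 respondents, treated as a simple random sample) shows how quickly the margin of error can swallow a 4-point year-over-year "trend":

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a sample proportion,
    assuming a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical: "38% of marketers report progress," n = 150.
moe = margin_of_error(0.38, 150)
print(f"+/- {moe:.1%}")  # roughly +/- 7.8 points
```

With a margin of error near eight points, this year's 38% and last year's 34% are statistically indistinguishable, and headlines about "progress" rest on noise.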
As marketers, we do ourselves a great disservice whenever we grab one of these survey nuggets and embed it into a PowerPoint presentation to "prove" something to management. If we're not trustworthy when it comes to vetting the quality of research we cite, how can we reasonably expect others to accept our judgment on subjective matters?
So the next time you're tempted to grab some headlines from a "survey" -- even one done by a reputable organization -- stop for a minute and read the fine print. Check to see if the conclusions being drawn are reasonable given the sample, the questions, and the margins of error. When in doubt, throw it out.
If we want marketing to be taken seriously as a discipline within the company, we can't afford to let the "marketers" play on our need for convenience and simplicity when reporting "research" findings. Our credibility is at stake.
And by the way, please feel free to comment and post your examples of recent "research" you've found curious or questionable.
by Pat LaPointe
Pat LaPointe is Managing Partner at MarketingNPV -- specialty consultants on marketing measurement and metrics, and publishers of MarketingNPV Journal.
Courtesy of http://www.mediapost.com