Whether it’s about an upcoming election, attitudes
towards race in the criminal justice system, or which celebrity will take home
the award for best performer – people love polls!
But how much do they really love them? Do they love them
enough to care about what goes into making them?
Otto von Bismarck once said that “Laws are like
sausages, it is better not to see them being made”. To paraphrase Bismarck: polls are
akin to sausages in that people like to consume them but don’t really care much
to see how they’re made.
It’s easy for someone (myself included) to discover a survey
or study in the news, immediately generate an opinion about the subject matter,
and move on without really asking how those findings were reached. However,
the information to which an individual is exposed has the potential to influence
their attitudes, opinions, and even future behaviour.
In a recent article
in Public Opinion Quarterly, Kevin K. Banda, a political
science professor at the University of Nevada, Reno, stated that “citizens want
to collect adequate information to make accurate decisions while minimizing the
costs they face. This accuracy motivation along with citizens’ disinterest in
politics encourages them to rely on easily accessible cues when forming
attitudes about candidates”[i].
The accessible cues Banda refers to are also known as heuristics, a kind of cognitive shortcut.
Simply put: polls are heuristics. They’re easy-to-absorb
snapshots of what the public thinks, and these snapshots generate
opinions.
Questionnaire design is an essential part of an
opinion poll. Researchers take great pains considering how the use or
removal of even a single word can influence someone’s response to a specific
question. It is important for consumers
of polling data to understand this, and journalists can help educate readers by
including question wording directly in their articles
rather than merely alongside them.
When Patrick Brown won the Progressive Conservative
Party’s leadership contest, the Toronto Star published the findings of a Forum Research poll
which asked respondents a series of questions about their attitudes towards
Brown and some of the views he has championed during his political career.
The article reported provincial voting intentions and attitudes towards creationism,
same-sex marriage, and the sex-education curriculum. While
it is commendable that the Toronto Star provided a link to Forum
Research’s press release, the columnist did not disclose in the article
what questions Forum asked. I use this article as an example to highlight the
importance of disclosing survey questions in media publications and not as a
means to criticize the Toronto Star or the respective columnist directly.
The Market Research and Intelligence
Association (MRIA), the regulatory body within the polling and market research
industry, states in its Code of Conduct that for all reports of survey
findings the Client (in this case the Toronto Star) has released to the public,
the Client must be prepared to release the following details on request:
- Sponsorship of the survey
- Dates of interviewing
- Methods of obtaining the interviews (telephone, Internet, mail or in-person)
- Population that was sampled
- Size, description and nature of the sample
- Size of the sample upon which the report is being released
- Exact wording of questions upon which the release is based
- An indication of what allowance should be made for sampling error
The
Toronto Star article in question met three of the eight requirements above:
method of contact (in this case IVR), the sample size (1,001 people), and the
allowance for sampling error (considered accurate to within three
percentage points, 19 times out of 20).
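For readers curious where that last figure comes from: it is presumably the conventional margin of error for a simple random sample of 1,001 respondents at the 95 per cent confidence level, roughly 1.96 × √(0.5 × 0.5 / 1,001) ≈ 0.031, or about three percentage points, 19 times out of 20. Strictly speaking, an IVR poll is not a simple random sample, so treat this as the standard convention firms cite rather than a precise guarantee.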
I understand that it’s
awkward for a journalist to include methodological minutiae such as the dates
the survey was in the field or the details of a firm’s sampling
frame. Being a journalist is not easy! They are subject to tight and frequent
deadlines, word counts, constant research and fact-checking, and of course
writing a story that people will want to read. Question wording, however, is
not a methodological triviality, and if a journalist is willing to report the
findings of a question, they should at least include the question that was
asked.
I applaud and encourage journalists
for incorporating polling data in their publications. However, my criticism
is that readers are not being told the whole story when question wording is withheld. A journalist may
believe that they are informing their readers by including a firm’s press
release alongside their article; however, Banda’s observations on heuristics
suggest that the probability of a reader following up on external information to
further their knowledge (in this case, learning what questions were asked) is
low.
By including what was asked
directly in the article, a journalist offers a more potent heuristic for
people to digest without demanding significantly more effort from
the reader. If the questions asked (in this case about voting intentions, same-sex
marriage, etc.) were included alongside the poll’s findings in the
article, then readers would acquire a more comprehensive and
three-dimensional view of public opinion in Ontario because they would have
a better idea of where these attitudes and opinions are coming from.
Market researchers and
pollsters spend a lot of time thinking about what questions to ask on a survey, how to frame these questions, when
and where to ask these questions
in the survey, and why those questions are asked at all. If journalists are in the business of selling sausages,
they should tell their readers a little more about what goes into
making them.
[i] Banda, Kevin K. "Negativity and Candidate Assessment." Public Opinion Quarterly 78.3 (Fall 2014): 707-20. Print.