Previously, I spoke with Don Mills of Corporate Research Associates (who should be very active in the upcoming Nova Scotia election), Frank Graves of EKOS Research, and Darrell Bricker of Ipsos-Reid.
Léger Marketing has been an active political polling firm in Quebec for a very long time, but has moved over to online in the last few years. This is what Christian Bourque had to say about the methodology:
308: Léger Marketing has used online
polling for some time now. Why was the decision made to move over to that
methodology?
CB:
We made the decision based on the fact that we control the sample. It is
our panel. We control it from recruitment to data collection to data
cleaning. We started the panel in 2004 and felt comfortable using it for
political polling after almost three years of comparative polling telephone to
Web. Really, we have focused on Web-based electoral polling since 2007.
308: What are the advantages of
conducting your polls online instead of over the telephone?
CB: Cost, timing (which is crucial), and no social desirability bias. We can get up and running faster if the
demand from the client requires quicker turnaround. Rushing telephone
projects can mean “burning” samples to quickly fill quotas. Given that media
clients do not have a lot to spend on polling we have been able to produce
larger samples at a fair cost compared to telephone. The “honesty”
factor also works in favour of the Web.
308: What are the disadvantages?
CB: Potential
differences between panel members and the general public are always something we
need to control for. Our panel is over 70% RDD (random digit dialling) recruitment coming
from our call center, so we are already very confident about the source. Panelists are profiled at length so we can compare them not
only on socio-demographics but also on technology variables and
health-related profiling questions; we could even weight on beer brand to
conform to market-share statistics if we wanted to.
308: How is your online panel recruited and what steps
do you take to ensure the sample is representative?
CB: Most of the panel is
recruited from our telephone studies and telephone-based recruitment. It
has a higher cost compared to online recruiting but generates more reliable and
loyal panelists. We profile on over 90 variables over time, so we get a good
grasp on who these panelists are and we stratify samples at the invitation
stage to ensure that the output will not require important corrections or
weighting. We also have a data cleaning and data quality protocol that
gets rid of speedsters, straightliners and potential fraudsters.
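Bourque doesn't detail the protocol, but the kind of cleaning he describes, dropping "speedsters" (implausibly fast completions) and "straightliners" (identical answers across a grid of items), can be sketched roughly as follows. The thresholds and field names are hypothetical illustrations, not Léger's actual rules:

```python
# Hypothetical panel-cleaning sketch: flag "speedsters" (implausibly fast
# completions) and "straightliners" (identical answers on every grid item).
# Thresholds and field names are invented, not Léger's actual protocol.

def clean_responses(responses, min_seconds=120):
    kept = []
    for r in responses:
        if r["duration_sec"] < min_seconds:   # speedster: too fast to have read the questions
            continue
        grid = r["grid_answers"]              # answers to a battery of scale items
        if len(set(grid)) == 1:               # straightliner: same answer throughout
            continue
        kept.append(r)
    return kept

sample = [
    {"duration_sec": 540, "grid_answers": [3, 4, 2, 5, 3]},   # kept
    {"duration_sec": 45,  "grid_answers": [3, 4, 2, 5, 3]},   # speedster, dropped
    {"duration_sec": 600, "grid_answers": [4, 4, 4, 4, 4]},   # straightliner, dropped
]
print(len(clean_responses(sample)))  # → 1
```

Real protocols also look for "potential fraudsters" via duplicate devices, inconsistent profile answers, and trap questions, which require more context than this sketch includes.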
308: What challenges do you face in building a
representative sample of the population, considering that not everyone has
access to the Internet and the potential for opt-in panels to attract a
different sort of respondent?
CB: 86% of Canadians went
online last week. That’s more than the share of households that have a landline
phone. In a market of rapidly decreasing response rates over the
telephone, no methodology should feel they can take the high road and look down
on the others.
308: There are debates in the industry about the
problems surrounding online polling not being probabilistic, despite some good
performances. Why is this, or isn't it, a problem?
CB: Compare the results of
probabilistic vs. non-probabilistic polls in BC, Alberta, Ontario, Quebec and the last
few federal elections and you will not find a clear pattern of who has done
best or worst. You can feel very strongly about one or the other, but it
comes down to faith and preferences more than any clear conclusions one can
reach from historic data.
308: Léger Marketing has a very long history of
polling in Quebec. How has political polling changed over that time?
CB: Like everywhere else,
declining participation in elections is making our work more challenging. How do we account for the 30% to almost 50% who simply do not show up on
election day? Should we move to “likely voter” models in Canada? If
you cross-tabulate participation by age and voting intent by age, you can
explain most of the differences between polls and election results in the
recent past (except Alberta). But age is not the only factor. Disengagement and cynicism need to be factored in too, outside of age.
When doing comparative
polling or comparing our historical results and those of the competition,
differences between phone and web tend to be rather small and not necessarily
consistent over time. We found that weighting only by age and sex will
tend to produce slightly more left-leaning results on the Web. We have
been using a more complex weighting scheme over the past six years to account for
that (education, income and household composition are now factored in).
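Bourque doesn't name the method, but weighting a sample to several population margins at once (age, sex, education, income, household composition) is commonly done by raking, also called iterative proportional fitting: weights are scaled to match each variable's margin in turn until all margins hold simultaneously. A toy sketch with invented respondents and targets, not Léger's actual scheme:

```python
# Toy raking (iterative proportional fitting) sketch. Respondents and
# population targets are invented for illustration.

respondents = [
    {"age": "18-34", "educ": "univ"},
    {"age": "18-34", "educ": "no_univ"},
    {"age": "35+",   "educ": "univ"},
    {"age": "35+",   "educ": "no_univ"},
    {"age": "35+",   "educ": "no_univ"},
]
targets = {
    "age":  {"18-34": 0.30, "35+": 0.70},
    "educ": {"univ": 0.25, "no_univ": 0.75},
}
weights = [1.0] * len(respondents)

for _ in range(50):  # iterate until all margins converge
    for var, margin in targets.items():
        total = sum(weights)
        for cat, share in margin.items():
            # scale this category's weights so its share matches the target
            current = sum(w for w, r in zip(weights, respondents) if r[var] == cat)
            factor = (share * total) / current
            weights = [w * factor if r[var] == cat else w
                       for w, r in zip(weights, respondents)]

total = sum(weights)
age_share = sum(w for w, r in zip(weights, respondents) if r["age"] == "18-34") / total
print(round(age_share, 3))  # → 0.3
```

Each pass distorts the previously matched margin slightly, which is why the loop repeats until the adjustments become negligible.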
I'd also like to add the
following: Given the recent critique and, some would say, controversy in the
market, I believe we, as an industry, should come to agree to greater
disclosure mechanisms to allow the industry to develop a better understanding of
the changing landscape out there. This would benefit us
all.
**************
Abacus Data is a relative newcomer to the political polling world, having first appeared in 2010. Abacus uses an online panel for its polling, but has used other methodologies in the past. I thought David Coletto could give us an interesting perspective as he has worked in the industry in an era where online polling was always an option, as opposed to the other major players who cut their teeth in the telephone-only age.
308: Why has Abacus settled on
using online panels for political polling, and what are the advantages and
disadvantages of the decision?
DC:
Abacus Data decided to exclusively use online panels for political polling as
we decided that the advantages outweighed the disadvantages.
Advantages
Online
panel research provides for a variety of advantages over live telephone or IVR
research. Online research allows a large
number of respondents to be contacted simultaneously, meaning the study can be
completed much faster than with other methods. Also, the nature of online design allows for great flexibility in the
visual appearance of the survey and in question design. Such design aspects allow for the creation of
scale questions, visual sliders, drag and drop, or even the presentation of
audio and video to respondents. Further,
online research allows for broad flexibility in sample design to target groups
of respondents along virtually any screening criteria. Finally, online research is considerably more
affordable than telephone or IVR, making it attractive for smaller firms and
repeat projects.
Disadvantages
As
online polling involves drawing sample from a panel of potential respondents,
it does not constitute the entire population and therefore cannot be considered
a true random sample – this is the primary disadvantage of online polling. Further, with online research it is currently
not possible to verify that the respondent is exactly who they claim to
be. While this is also true of IVR, it
is much easier to verify an identity with a live telephone interview. Online polling can also result in a certain
coverage bias, especially among lower-income and older groups.
308: What are the costs of online
polling compared to over the telephone?
DC:
The capital costs of setting up and maintaining a call center are high, just as
building a panel is expensive. However,
licensing costs of online are much more affordable than contracting out to a
call center. Online allows us to be a
full service firm in house and still be small, meaning Abacus is able to
control the research process from beginning to end.
Actually
carrying out the online research requires somewhat less effort than live
telephone, as there is no need for a paid bank of phone interviewers. More importantly, online has far more quality
control, as all responses can be easily monitored as they arrive. Further, there is no need to observe or
control for interviewer bias.
308: What are the challenges faced
in building a representative sample?
DC:
There are challenges, but not around the representativeness of demographic or
regional variables. Rather, psychographic traits like interest in
politics, political participation and engagement in public issues are likely
greater for anyone who answers a survey. However, large panel management firms have a vested interest in
maintaining quality panels and ensuring that samples are as representative as
possible.
308: There are
debates in the industry about the problems surrounding online polling not being
probabilistic, despite some good performances. Why is this, or isn't it, a
problem?
DC:
It is a problem, but it is something that all survey firms must face. Probability issues become more significant
when respondents are over-surveyed, meaning they change their behaviour or
attitudes because they are surveyed often. We have an in-house policy to screen out frequent survey
participants. Abacus tries to solve the
problem by making the sample as representative as possible, using minimal
weighting, weeding out frequent survey takers, and using high-quality, large
panels.
The
problem with probability sampling extends to telephone surveys, however, with
large portions of the population never answering surveys, whether because of
increased call screening or outright refusal to respond.
The
growing use of cell-phone and internet-based calling will continue to make
telephone surveys more difficult, more expensive, and therefore less
representative.
308: What role does weighting
play in good polling?
DC:
Although weighting plays an important role in helping us to make our samples
representative of the population, our data is not heavily weighted.
We
use balanced sampling and interlocking quotas, similar to a stratified sampling
strategy, to ensure that the respondents captured are as representative as
possible and heavy weighting is not required.
Weighting
is particularly challenging for IVR polling, because certain demographics are
more likely to answer the phone. As a result, IVR surveys skew towards women and older demographics.
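Coletto doesn't spell out the mechanics, but interlocking quotas of the kind he describes amount to setting a completion target for each crossed demographic cell in proportion to its population share, then closing a cell once its target is met. A minimal sketch with invented census shares, not Abacus's actual targets:

```python
# Sketch of interlocking quotas: a completion target for each crossed
# age x gender cell, proportional to its population share. Shares are
# invented for illustration.

census_shares = {
    ("18-34", "F"): 0.14, ("18-34", "M"): 0.14,
    ("35-54", "F"): 0.18, ("35-54", "M"): 0.18,
    ("55+",   "F"): 0.19, ("55+",   "M"): 0.17,
}

def quota_targets(n_completes, shares):
    """Completion target per cell, proportional to population share."""
    return {cell: round(n_completes * share) for cell, share in shares.items()}

def cell_open(cell, completes, targets):
    """A new respondent is accepted only while their cell is under target."""
    return completes.get(cell, 0) < targets[cell]

targets = quota_targets(1000, census_shares)
completes = {("35-54", "F"): 180}                       # this cell has hit its target
print(targets[("35-54", "F")])                          # → 180
print(cell_open(("35-54", "F"), completes, targets))    # → False
print(cell_open(("55+", "M"), completes, targets))      # → True
```

Because the quotas interlock (age crossed with gender rather than set independently), the completed sample matches the joint distribution, which is why little weighting is needed afterward.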
308: What are the challenges
involved in building a representative sample of voters, rather than just of the
entire population?
DC:
The number one challenge is trying to predict who will actually vote, as people
are less likely to admit that they do not vote, or that they don’t plan on
voting.
Further,
we know that those who answer surveys are likely to be more engaged than those
who don’t.
To
address these challenges, online research allows us to use varied question
types and measure likelihood to vote in different ways. By being transparent
with the models and trying to forecast what the electorate will look like versus
the general population, Abacus attempts to be as clear and accurate as
possible.
In
Canada, this problem is evolving. As a
result of the BC and Alberta elections, we are taking this issue seriously and will test a
number of models in the next election we participate in.
308: As a relatively new polling
firm, what challenges did you face in getting into the market?
DC:
We face a small-c conservative industry that is averse to change and
innovation and, quite frankly, a little threatened by a new crop of
researchers like us who are testing the established ways.
I
think early on we established our credibility by demonstrating that our online
research methodology could accurately forecast the 2011 federal election and
2011 Ontario provincial election.
We,
like many other pollsters, failed to really understand what was going on in
Alberta, using a methodology we no longer use (IVR). In BC, our only poll was conducted before the
leaders' debate, so our performance there is difficult to judge.
308: How is the business of
polling evolving?
DC:
I foresee more small players emerging.
The
business of polling will be completely online. Within the next 10 years, nobody
will answer their phone unless they know who is calling, if we are using
telephones at all.
So,
the industry needs to perfect and refine how we conduct internet surveys. As Google is showing us, there will be new
ways to generate sample that are only emerging as alternatives now, many of
which lean towards indirectly observing behavior rather than asking direct
questions.
308: What has to be done to
ensure that online polling can produce good results in the future?
DC:
If you mean being able to predict elections, the question is predicting who is
going to turn out to vote. I do think
online polling is producing good results now for our clients, whether it’s
testing new marketing concepts or public support for policy proposals, or the
potential for new product success.
Eric,
In yesterday's Globe and Mail you commented that much of the BQ vote is "parked" for the time being. I found your statement odd since the BQ vote has remained mostly unchanged from 2011, increasing from 23% to 26% in the latest polls, whereas the Liberal vote has risen from 14.2% to 32.4% and the NDP vote has collapsed from 43% to 22.6%.
Am I missing something, or was it merely a typo on the part of the G&M?
Thanks
I was referring to the fact that the latest polls have the BQ above their 2011 support (though, as you point out, modestly so). I may have been over-stating things, but in the context of the BQ going from around 15% in Quebec to 25% in the last few months it is hard to figure out why, as Paillé has little media presence. So why has the BQ made gains from their recent lows?
Thanks Eric.
Yes, that makes sense. With the recent inaccuracies in polling, perhaps the recent BQ lows were nothing more than statistical noise.
Two hypotheses, not necessarily mutually exclusive:
1. When former Bloc supporters who voted NDP last election begin to lose faith in or become disappointed with the NDP, they simply default back to the Bloc; although I can't immediately think of a reason they'd be disappointed with the NDP at this particular moment.
2. Bloc support is directly linked to PQ support. Around May-June when the PQ government's support was really dismal, Bloc support was correspondingly hitting major lows of 4-5% nationally (<20% in QC). No provincial polls have been done in QC since June but I suspect PQ support has probably increased in large part thanks to Marois' acclaimed handling of the Lac-Mégantic disaster.
Dom
P.S. Éric, in case you haven't come across it yet, just a heads-up that Nanos has belatedly released a mid-July Ontario provincial poll, and the Lib-PC numbers are practically the opposite of what Forum was reporting around the same time.
Thanks very much, I had not seen that. When did it go up?
I regularly check in on Nanos' website because I try to keep the lists of polls up to date on Wikipedia, and I didn't notice it until a few days ago. But curiously, the PDF release is stamped July 17.
Dom
I do that too, but usually Nanos puts new numbers in the news feed. I never check the Ontario tab. Added to the aggregation, thanks.
Dom,
While I generally agree with your hypotheses, it appears to me that something quite different has happened.
Bloc support, which had been around 35% from 1992-2011, halved in 2011 when the bulk went to the NDP. That support has not stayed with the Dippers but appears to have migrated to the Liberal party, which is up roughly 20% since 2011. In other words, from the numbers at least, it does not appear that former BQ support has migrated back to the BQ but rather to the Liberal party, since BQ support is roughly equal to its 2011 election result.
This of course seems somewhat odd to me.
I've been going through old polling data recently, and this actually isn't the first time the BQ has been so low. They were in the high-20s, low-30s in 2002-2003 (the LPC was at 50% at the time in Quebec) and 2007. I'm still going through it all, but it is interesting to see that the BQ has been in troughs before - just not when it counted on election night.
Great interviews, Eric.
You can see why the pollsters are playing darts in the dark as far as results are concerned.
They are taking a sample of the universe that is flawed because at least 16% of the population is excluded... they don't go on the internet.
Then once they have their panels, they are locked into the bias inherent in the people actually willing to be part of the panel. People who would be part of a panel would, by their acceptance of this role, be a special representation of the population as a whole.
Then they establish some sort of monetary value for actual participation in the poll, either direct payment or entry into a lottery. This is a real pittance (Angus Reid pays about $2/hr) to do extensive, intrusive surveys. What sort of person values their time at $2/hr?
Add to that a fill-in quota problem: need 1,000 responses, send out say 10,000 requests to your 100,000-member panel, and stop the survey after the first 1,000 people respond.
Out of that you are going to get 900 people with compulsive survey disorder who are just waiting to fill in another survey.
Then you have to take the results and adjust them to what you think the general population is.
The pollsters have no idea what the general population is or which criteria to use for the adjustments:
employment status, age, income, sex, political affiliation, religion, accommodation, number of kids, amount of government cheques currently being cashed, health, drug use... the pollster has to decide which of these characteristics are significant enough to adjust the weighting.
If they somehow establish that 20% of the general population is a 40-50 year old bisexual female raising 2 kids with 3 cats who earns > $100,000, they will give the one person meeting these criteria that they happen to catch in their survey 10 times the weighting of anyone else. AND that could be one of my 20 internet IDs!!!