Voter Opinion Research Using Facebook Has Great Potential
If you are a likely voter who happens to be 44 or younger, I implore you to participate in my telephone survey. Unfortunately for me, I know that only 1 in 10 of you are likely to say yes. If you would rather not have me contact you on your cell phone, maybe I can entice you through Facebook? You don’t have to talk to anyone; you can just click through the questions on your phone, tablet, or laptop. Nobody will bother you, and your opinion will be heard. Do we have a deal?
Interviewing Voters Using Online Sample is Problematic
We have written in this blog before about how online sample sucks for voter opinion surveys. It’s nothing personal against online panel providers. It’s just a fact that online sample does not meet our needs as a survey research firm. Our clients need voter opinion surveys to be fielded quickly, with a 2 or 3 day turnaround. Our clients need at least 300 interviews to trust the survey results. Our clients are running for public office in districts below the state level. Online sample struggles to meet these needs.
The Size of the Facebook Community is Why There is Hope
What makes Facebook so promising for voter opinion research is the massive size of the Facebook community, particularly among millennials and among all registered voters. For example, there are approximately 3.6 million registered voters in Colorado and the size of the Facebook community in the state is 3.3 million individuals. Obviously some of those 3.3 million are not eligible voters, but these statistics are very exciting for political pollsters like Magellan Strategies because the truth is most online panels are just not large enough or representative enough to conduct a 500n survey at the state level. And don’t even ask about using online sample for a Congressional or state legislative district voter opinion survey.
What We Learned From Surveying Florida, Pennsylvania, New Hampshire and Nevada
Our survey testing using Facebook showed a lot of promise but was also disappointing at the same time. In August we fielded a typical voter opinion survey in Florida, Pennsylvania, New Hampshire and Nevada. Not surprisingly, the unweighted results were all over the map. In Florida we only had 47 respondents, and the sample was too old, too Republican and too white compared to Florida’s likely voter demographics for a Presidential election. It was the same story in New Hampshire and Nevada – only 63 and 52 respondents respectively, and those samples did not reflect the likely voting populations of either state accurately. Our Pennsylvania survey was where things got a bit more interesting. We had 561 respondents – which at the very least is a large enough number that we can dig into the responses and learn some lessons.
Encouraging Takeaways From Our Pennsylvania Survey
So what did we really learn? For one thing, too few Democrats participated in the surveys. We project that 50% of the electorate in Pennsylvania this cycle will be Democrats, while only 18% of our survey respondents were Democrats. That gap is explained by the huge number of independent and “other party” respondents in the survey. Only 10% to 11% of the likely voting population in Pennsylvania is going to be independent. However, a whopping 44% of our survey respondents identified as independent.
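A gap like that is the kind of thing pollsters correct with post-stratification weighting: each respondent is weighted by the ratio of the group’s projected share of the electorate to its share of the sample. Here is a minimal sketch using the Pennsylvania numbers above; the “Republican/other” shares are simply the remainders (1 minus the two reported groups) and are an assumption for illustration only.

```python
# Party-ID shares from the Pennsylvania example.
# "Republican/other" is the remainder of each distribution (an assumption
# for illustration; the post only reports the Democrat and independent shares).
observed = {"Democrat": 0.18, "Independent": 0.44, "Republican/other": 0.38}
target = {"Democrat": 0.50, "Independent": 0.11, "Republican/other": 0.39}

# Each respondent's weight is (projected share) / (sample share), so
# overrepresented groups count for less and underrepresented ones for more.
weights = {party: target[party] / observed[party] for party in observed}

for party, w in weights.items():
    print(f"{party}: weight = {w:.2f}")
# Democrats end up weighted roughly 2.78x, independents roughly 0.25x.
```

Weighting can only stretch the data so far, though: with just 18% of 561 respondents being Democrats, the weighted Democratic results would rest on about 100 interviews, which is why getting a representative sample in the first place still matters.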
Beware of Politically Active Respondents Sharing the Survey
The reason our poll skewed independent so hard was because it was shared by Pennsylvania Students for Johnson, the Libertarian Party of Pennsylvania, Westmoreland County Libertarians, and Butler County Libertarians. Unsurprisingly, then, 62% of our respondents chose Gary Johnson in the Presidential ballot test. Say what you will about Gary Johnson’s appeal to voters who are completely dissatisfied with Donald Trump and Hillary Clinton – appeal which we’ve demonstrated here in Colorado – but this Pennsylvania ballot test was obviously not accurate.
Should Respondents be Able to Review the Survey Results?
We also learned that it matters whether the respondents are allowed to see the poll results. It sparks definite public interest and increases the response rate, but it also leads to a slew of comments trashing the results as inaccurate before devolving into partisan bickering. YOU try explaining to the Facebook masses that the poll was never meant to be scientific or predictive in any way, and that we simply wanted to conduct a test to expose the advantages and disadvantages of polling on Facebook…
Keep On Testing!
The bottom line is Facebook has definite appeal as a survey data collection tool. At best, it can reach populations (likely voters age 44 and younger, minorities, etc…) who are difficult to reach in traditional telephone surveys. We could then take those Facebook results and incorporate them into results from more traditional surveys, the end result being – hopefully – a more accurate portrait of the likely voting electorate. I say hopefully because there are still so many potential pitfalls when conducting this kind of research on Facebook.
The response rate will not always be great. When it is great, why was it great? Did someone share the poll far and wide, thus skewing the results beyond anything that could reasonably be expected? Will younger voters and minorities make up a significant portion of the results, or will their responses be drowned out by Facebook’s older population? No offense to the seniors, but we’d rather talk to you on the phone.
Suffice to say that we will keep doing our best to solve these issues in our quest to make Facebook a viable piece of our polling repertoire. It very well could be that there are certain kinds of political research where Facebook becomes an incredibly valuable tool. We’ll keep you posted!