Sunday, August 23, 2009

Online Surveys: Are They Reliable?

There remains a great deal of debate in the public opinion research industry about the reliability of so-called online research. In a typical online research arrangement, a company or group of companies recruits a panel of people who volunteer to participate regularly in opinion or market research studies. A sampling of the people who have opted in to the panel is then selected to participate in any given survey. I don't believe the industry should be paralyzed by fear of this kind of methodology, but at the same time I join many others who have significant concerns about the methodology and how it is promoted.

There are two main reasons to be skeptical about online panels. The first arises when the researcher promotes the results as a representative sample of the broader audience and includes a margin of error. The American Association for Public Opinion Research (AAPOR) has some very pointed thoughts on this subject. In fact, the first sentence of the organization's narrative on online surveys reads "The reporting of a margin of sampling error associated with an opt-in or self-identified sample (that is, in a survey or poll where respondents are self-selecting) is misleading" (http://www.aapor.org). Like AAPOR, I believe reporting the results of an online survey as if they rest on a normal random sample of a target population is misleading. These online surveys sample a database of people who have volunteered to participate in such surveys, not a random sampling of the population whose opinions you seek (i.e. voters). If the survey is not a random sample survey, then one cannot, and should not, apply probability theory and calculate a margin of error. Most of these online panel surveys are not based on a scientific random sample, yet they are promoted as if they are, which is troubling. At best, such surveys suffer from unknown bias and may skew toward what some have described as a younger, more politically active audience unrepresentative of the broader population. Particularly with respect to online panels as a stand-alone methodology, I tend to agree with noted pollster and regular contributor to www.pollster.com, Mark Blumenthal, who said "I'd just be, I'd be more cautious about online surveys." The second reason is that online surveys are notorious for producing wild results inconsistent with traditional survey methodologies. Again, I will quote AAPOR's website to conclude my concern with these kinds of surveys.

"Surveys based on self-selected volunteers do not have that sort of known relationship to the target population and are subject to unknown, non-measurable biases. Even if opt-in surveys are based on probability samples drawn from very large pools of volunteers, their results still suffer from unknown biases stemming from the fact that the pool has no knowable relationships with the full target population.
"AAPOR considers it harmful to include statements about the theoretical calculation of sampling error in descriptions of such studies, especially when those statements mislead the reader into thinking that the survey is based on a probability sample of the full target population. The harm comes from the inferences that the margin of sampling error estimates can be interpreted like those of probability sample surveys."

Now let me explain why I don't believe the industry should be incredibly fearful of this type of methodology, generally speaking. Technology is creating incredible accessibility to vast numbers of people, which can only improve the way opinion and issues research is conducted. In other words, the industry is evolving, and the eventual outcomes should be embraced once the reliability and efficacy of such advancements have been determined. There will come a time when Blackberries, iPhones and other personal communications devices have features and applications that allow for research of the user's opinions. We need to determine the best ways to use these technologies to serve our clients and the public without sacrificing reliability or misleading the end consumers (clients, news media, general public). At MBE we are working on some exciting technologies that can enhance how we conduct our research, and anyone in the industry worth their salt should be doing the same thing.

Online surveys are not completely "bad." As a census survey of a particular audience, for instance, they can be fairly efficient and cost effective. If a membership organization has a great many members, access to their email addresses, and the ability to design a questionnaire and analyze the results, then an online survey can be appropriate - especially when the results are not promoted as a random sampling of some broader audience complete with a margin of error. The organization could conclude that X% of its membership answered a certain question a certain way, and that would be appropriate. I also believe statistical analysis always calls for a degree of practical thought. The more completed interviews the membership organization receives, the more reliable the aggregate answers will be (i.e. 400 respondents is more reliable than 40). Making strategic membership decisions based on those 400 respondents would be appropriate, as the quick illustration below suggests.
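Here is a quick, simplified illustration of the 400-versus-40 point - a sketch with an assumed "true" answer of 60%, not data from any actual membership survey:

    import random

    def simulated_survey(n, true_share=0.6):
        # One hypothetical membership survey in which 60% truly agree.
        return sum(random.random() < true_share for _ in range(n)) / n

    random.seed(0)
    for n in (40, 400):
        results = [simulated_survey(n) for _ in range(1000)]
        print(n, "completes:", round(min(results), 2), "to", round(max(results), 2))
    # With 40 completed interviews the observed percentage swings widely
    # around 60%; with 400 it stays much closer, which is why more
    # completes yield more reliable aggregate answers.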

In the end, online surveys promoted as scientific random sample surveys are probably not a good idea.

Monday, June 29, 2009

Mountaintop Mining and Public Opinion in West Virginia

The last few weeks have seen a U.S. Senate Environment and Public Works subcommittee hold hearings to examine mountaintop removal, and a Hollywood actress and other environmental activists arrested while protesting a mine site in southern West Virginia. As a result, coal mining and mountaintop mining are receiving significant media attention.

During the past several years I have measured, examined and studied the attitudes and opinions of West Virginians related to energy, coal mining and mountaintop mining thoroughly, and likely more so than any other public opinion research company in America. Moreover, as a firm, we have lived through the debate - from Judge Haden's ruling to Daryl Hannah's arrest and all points in between - we have been there. Said simply, we understand this subject.

Public perceptions of energy and mountaintop mining issues are too complex to be fully analyzed or appreciated in a single survey, blog posting or news story. Such analysis requires a certain depth, an appreciation of current events and trends, and familiarity with public opinion on other issues related to this debate. This posting is not intended to answer every question on the subject but to serve as a starting point in helping readers understand how the public views these issues.

For example, MBE conducted a well-publicized survey of 601 registered voters (maximum sampling variation of +/-4% at a 95% confidence level). Respondents were asked to name their greatest concern as a voter (the question was unaided). Over half (51%) cited issues related to economic improvement or job creation. Less than 2% named mountaintop mining as their primary issue of concern. This is not to suggest mountaintop mining is an unimportant issue, but it clearly demonstrates how significant economic issues are. The same survey finds that three in five voters oppose "a total ban on mountaintop mining in West Virginia" and 59% agree a total ban would have a "negative effect on West Virginia's economy." West Virginians place a premium on economic development and job creation. Further, voters perceive mountaintop mining as a major contributor to the state's economy and job market. As a result, most oppose a ban on the practice. An entire posting could be written (and perhaps will be) on other energy/economic issues such as energy independence and national security. Suffice it to say these issues receive similar consideration and conclusions from voters as the basic economic issues described above.
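For readers curious where a figure like +/-4% comes from, the standard calculation for a simple random sample of this size runs roughly as follows. This is a sketch of the textbook formula; a firm's reported figure may also reflect rounding or other design decisions.

    import math

    n = 601      # completed interviews
    z = 1.96     # multiplier for a 95% confidence level
    p = 0.5      # most conservative assumption about the split
    moe = z * math.sqrt(p * (1 - p) / n)
    print(round(moe * 100, 1))   # roughly 4.0 percentage points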

Even though the issue may not rank among the top four or five issues of greatest concern, voters are concerned about aspects of mountaintop mining. More often than not, these voters indicate such concerns relate to waterway and stream (quality) issues rather than to issues such as aquatic or other wildlife or scenic beauty.

West Virginia voters also pay close attention to other areas of the coal industry. For instance, worker safety is usually the top concern expressed by voters when it comes to coal mining. On the environment, West Virginians believe the coal industry has improved its environmental protection during the past 20 years or so, but they expect the industry to continue to improve. Voters realize the importance of the industry to the economy but want West Virginia coal companies to lead the way in technological advancements that can sustain the economy for years to come - areas such as coal liquefaction and clean coal technology. Lastly, natives of the Mountain State want to be kept informed about the industry's plans and its future in the state.

In summary, most West Virginia voters perceive the coal industry - including mountaintop mining - from what seems to be a very moderate and reasonable perspective. They appreciate the contributions the energy industry makes to the economy; they remember the past sins of the coal industry but recognize the improvements it has made in terms of environmental protection and worker safety (and expect perpetual improvement); and they hope the industry can remain viable for many years through the use of technology. They expect the industry to be a good steward of the land while providing safe, well-paying jobs and paying its fair share of taxes.

For more information, visit http://www.markblankenship.com/ or email mark@markblankenship.com.

Tuesday, June 16, 2009

Judicial Reform and Public Opinion in West Virginia

In West Virginia, Governor Manchin announced that former United States Supreme Court Justice Sandra Day O'Connor will serve as honorary chairwoman of the governor's special commission on the state court system. The commission will examine issues ranging from the partisan election of judges to the creation of an intermediate appellate court. The commission is to provide Governor Manchin with its report in November of this year.

Mark Blankenship Enterprises (MBE) has conducted a number of public opinion surveys in West Virginia on this topic during the last year. Revisiting some of this data helps illuminate how voters perceive some of the very issues which the governor's commission is to examine.

In September of 2008, MBE conducted a random sample survey of 600 registered West Virginia voters (yielding a maximum sampling variation of +/-4% at a 95% confidence level). Sixty percent (60%) disagree that electing "supreme court justices by partisan election in which judges are nominated by their political parties such as republican or democrat ... creates a fair court system."

Survey respondents were also read (by live telephone interviewers) a list of three possible ways judges in West Virginia could be selected and asked which they prefer as the best method. The possibilities included "partisan election of judges in which judges are nominated by their political parties such as republican or democrat" which 19% of voters preferred; "nonpartisan election of judges in which the candidate’s political party is unknown" preferred by 36% of voters; and "merit selection of judges in which the governor appoints and legislature approves judges based on the judge’s experience and qualifications" preferred by 37%. Another 8% didn't know which method they prefer.

Nearly three-quarters (73%) of West Virginia voters prefer a judicial selection method other than the current partisan election method. This certainly suggests there is public support for examination and reform, such as the commission put forth by the governor. Because neither of the other two selection methods tested draws a majority or strong plurality, deeper research is needed to better understand voters' perceived benefits of, and concerns with, merit selection or nonpartisan election. (Methodology note: Interviewers used a random-digit dialing procedure to reach respondents. The numbers are generated by computer to achieve maximum representation in all West Virginia counties. This technique is designed to produce a sample of registered voters that is representative of the entire population in such areas as age, gender, race, and family income. Both listed and unlisted telephone households had an equal chance of being selected in the sample.)
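For those unfamiliar with random-digit dialing, a highly simplified sketch of the idea follows. The area code is West Virginia's, but the exchanges are placeholders chosen purely for illustration; real sampling frames are built from lists of working exchanges.

    import random

    # Simplified, hypothetical random-digit dialing. The exchange list is
    # a placeholder, not an actual sampling frame.
    area_codes = ["304"]
    exchanges  = ["340", "342", "525", "558"]

    def random_telephone_number():
        # Appending four random digits gives listed and unlisted
        # households the same chance of selection.
        return "{}-{}-{:04d}".format(random.choice(area_codes),
                                     random.choice(exchanges),
                                     random.randint(0, 9999))

    sample = [random_telephone_number() for _ in range(5)]
    print(sample)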

It is also important to note that survey after survey conducted by our firm in West Virginia indicates the issue that evokes the greatest concern among West Virginia voters is the economy and/or job creation. Moreover, voters regularly express a perceived connection, to some extent, between the judicial and economic systems or climates of the state. Because the economy is of such intense concern to voters, and because of this perceived correlation between the economy and the judicial system, voters are likely to watch the commission's work with great interest and high expectations. For more information, please feel free to visit http://www.markblankenship.com/ or email mark@markblankenship.com.

Thursday, June 11, 2009

Do Cell Phones Affect Telephone Surveys?

At MBE, our clients and colleagues frequently ask about the effect so-called "cell phone only" respondents have on traditional telephone survey research. In other words, with more people deciding to use only a cell phone and give up their land line telephone, does the data collected through traditional telephone survey research become less reliable? This is a great question and one that is receiving significant attention in the opinion research industry. The American Association for Public Opinion Research (http://www.aapor.org/), the primary professional association for opinion research, has devoted countless hours, conference panels, columns and other writings to this topic during the past few years. Without a doubt, it is an issue the industry will have to continually monitor and address.

Some government statistics indicate slightly more than one in ten (13%) households in the U.S. can no longer be reached via traditional land line telephones. Just six years ago approximately 3% of U.S. households were cell-only. In fact, some government researchers believe the number of households using only a cell phone will more than double during the next year. Looking at these statistics alone would suggest cell-only respondents currently pose a major problem for the opinion research industry. However, a closer analysis is required for appropriate planning and development of research methodologies.

Pew Research conducted an extensive study in 2007 (the entire survey can be found at http://people-press.org/report/276/) which provides insight into this challenge. First, the Pew survey found that cell-only respondents tend to be "younger, less affluent, less likely to be married or to own their home, and more liberal on many political questions." This raises the question: do traditional telephone surveys underrepresent these demographics, and if so, is the total data less reliable? The Pew survey seems to indicate, and we at MBE agree, that the answer is no. Pew asked the same set of survey questions of a cell-only audience and of a standard land line sample. When the results of the cell-only sample were combined with the results of the standard land line sample, "overall results of the poll (changed) by no more than one percentage point on any of nine key political questions included in the study." It is reasonable to assume that just because a respondent is cell-only does not mean he or she has wildly different attitudes or opinions on most subjects compared to respondents of similar demographics who happen to have a land line, or a land line and a cell phone.

One must also consider the subject matter. As the Pew survey demonstrates, cell-only respondents are younger, tend to have less education, and are less likely to be married or to own their own home. So if one is surveying the public on issues such as home mortgages or finance, Social Security benefits, the cost of higher education or marriage-related issues, special consideration should be given to sampling cell-only customers. However, if the subject matter is partisan or political, most studies indicate traditional land line sampling would be sufficient and appropriate. These issues should be easily managed through discussion and planning with your opinion research firm.
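Returning to the Pew finding quoted above, a rough sketch of the blending arithmetic helps show why adding a small cell-only frame moves the overall number so little. The percentages below are invented for illustration and are not Pew's actual data.

    # Hypothetical blending of two sampling frames (illustrative numbers only).
    landline_share, cell_only_share = 0.87, 0.13   # assumed shares of households
    landline_estimate = 0.52                       # % answering "yes" (landline sample)
    cell_only_estimate = 0.58                      # % answering "yes" (cell-only sample)

    landline_only_poll = landline_estimate
    blended_poll = (landline_share * landline_estimate
                    + cell_only_share * cell_only_estimate)

    print(round(landline_only_poll * 100, 1), round(blended_poll * 100, 1))
    # 52.0 vs 52.8 -- less than a one-point shift, consistent with the
    # "no more than one percentage point" finding quoted above.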

All of this of course applies to current demographic profiles. Moving forward, more and more people will likely become cell-only. As the current cell-only respondents mature, will they acquire a land line and not rely on a cell phone, will they continue to be cell-only, will they use a combination? These answers are hard to predict. Nonetheless, the cell-only population is going to continue to grow and opinion researchers must adapt accordingly.

Because national telemarketing regulations prevent the use of automated dialing for calls made to cell phones, cell phone respondents must be dialed "by hand," whereas a land line number can be dialed randomly by a computer (through a C.A.T.I. system) and then connected with a live interviewer. This greatly increases the cost per interview for cell-only respondents, in addition to the significantly higher sample costs for cellular telephone numbers compared to land lines. These costs can quickly become prohibitive.
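To see how quickly the economics diverge, consider a simplified, hypothetical comparison. Every dollar figure and productivity rate below is an assumption chosen for illustration, not MBE's actual cost structure.

    # Hypothetical cost-per-complete comparison (illustrative assumptions only).
    def cost_per_complete(interviewer_rate, completes_per_hour, sample_cost):
        return interviewer_rate / completes_per_hour + sample_cost

    landline = cost_per_complete(interviewer_rate=20.0,   # $ per interviewer hour
                                 completes_per_hour=2.0,  # computer-assisted dialing
                                 sample_cost=1.0)         # $ per completed record
    cell     = cost_per_complete(interviewer_rate=20.0,
                                 completes_per_hour=0.8,  # hand dialing is slower
                                 sample_cost=3.0)         # cell sample costs more

    print(round(landline, 2), round(cell, 2))
    # Roughly $11 vs $28 per completed interview under these assumptions.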

In closing, in the current environment cell-only customers do not have a major negative effect on the reliability of telephone surveys (excluding a few narrow topics or subject matters), but the issue is still worthy of discussion and strategy. Many are suggesting the creation and development of national cell-only samples or databases, and even Internet panels of cell-only respondents, which could be used to augment traditional land line telephone samples. The good news is that the industry is well aware of the short-term and long-term challenges posed by this issue and is moving in a manner that will ensure the integrity and reliability of our research products. For more information, feel free to visit http://www.markblankenship.com/.

Friday, June 5, 2009

Welcome

Thank you for visiting the Public Opinion Research blog published by Mark Blankenship Enterprises, LLC. This blog will cover the various research methodologies used to measure and analyze public opinion. We will discuss everything from traditional telephone surveys, or polling, to focus groups, online web surveys, jury and litigation research and much more. The primary objective of this blog is to help you, the reader, better understand public opinion research generally and which quantitative or qualitative research methodologies will best serve your communications needs. We will have a series of guest and expert writers who will contribute to this blog as well. From time to time we will also discuss current events and how they influence and shape public opinion, along with actual research conducted by Mark Blankenship Enterprises. Relying on our more than 20 years of experience and expertise researching public opinion, we hope to make this blog informative, educational and topical.

If you are an attorney seeking to refine and develop case strategy, we will provide insights into the most effective and appropriate litigation research methodologies. If you work for a business or corporation seeking to better understand your customers or the communities in which you operate, or to improve your image equity, we will discuss community attitude and issues research. If you are an issues advocacy organization or political candidate, we will discuss voter opinion research. If you are a marketing or communications professional, we will discuss effective advertising and consumer research. This blog will provide something for everyone, so sit back, enjoy, and check in regularly. Also, feel free to visit http://www.markblankenship.com/.