6.2 How is public opinion measured?

The ins and outs of polls

Ever wonder what happens behind the polls? To find out, we posed a few questions to Scott Keeter, Director of Survey Research at Pew Research Center.

Q: What are some of the most common misconceptions about polling?

A: A couple of them recur frequently. The first is that it is just impossible for one thousand or fifteen hundred people in a survey sample to adequately represent a population of 250 million adults. But of course it is possible. Random sampling, which has been well understood for the past several decades, makes it possible. If you don’t trust small random samples, then ask your doctor to take all of your blood the next time you need a diagnostic test.
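Keeter's claim about small random samples can be illustrated with a quick simulation (an editorial sketch, not part of the interview; the 52 percent "true support" figure is an invented example value). The key statistical fact is that the accuracy of a random sample depends on the sample size, not on the size of the population being sampled:

```python
import random

random.seed(42)

TRUE_SUPPORT = 0.52  # hypothetical population proportion (invented for illustration)
SAMPLE_SIZE = 1_000  # a typical national poll, as mentioned in the interview

# Draw many independent random samples from a (notionally huge) population
# and record how far each sample's estimate strays from the truth.
# Because the sampling fraction is tiny, drawing each respondent
# independently is a good approximation of sampling 250 million adults.
errors = []
for _ in range(200):
    hits = sum(random.random() < TRUE_SUPPORT for _ in range(SAMPLE_SIZE))
    errors.append(abs(hits / SAMPLE_SIZE - TRUE_SUPPORT))

# The standard 95% margin of error is about 1.96 * sqrt(p*(1-p)/n),
# which for n = 1,000 works out to roughly +/- 3 percentage points.
within_3_points = sum(e <= 0.031 for e in errors) / len(errors)
print(f"share of samples within ~3 points: {within_3_points:.0%}")
```

Run repeatedly, roughly 95 percent of such samples land within about three percentage points of the true value, which is why a poll of one thousand respondents can describe a population of 250 million.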

The second misconception is that it is possible to get any result we want from a poll if we are willing to manipulate the wording sufficiently. While it is true that question wording can influence responses, it is not true that a poll can get any result it sets out to get. People aren’t stupid. They can tell if a question is highly biased and they won’t react well to it. Perhaps more important, the public can read the questions and know whether they are being loaded with words and phrases intended to push a respondent in a particular direction. That’s why it’s important to always look at the wording and the sequencing of questions in any poll.

Q: How does your organization choose polling topics?

A: We choose our topics in several ways. Most importantly, we keep up with developments in politics and public policy, and try to make our polls reflect relevant issues. Much of our research is driven by the news cycle and topics that we see arising in the near future.

We also have a number of projects that we do regularly to provide a look at long-term trends in public opinion. For example, we’ve been asking a series of questions about political values since 1987, which has helped to document the rise of political polarization in the public. Another is a large (thirty-five thousand interviews) study of religious beliefs, behaviors, and affiliations among Americans. We released the first of these in 2007, and a second in 2015.

Finally, we try to seize opportunities to make larger contributions on weighty issues when they arise. When the United States was on the verge of a big debate on immigration reform in 2006, we undertook a major survey of Americans’ attitudes about immigration and immigrants. In 2007, we conducted the first-ever nationally representative survey of Muslim Americans.

Q: What is the average number of polls you oversee in a week?

A: It depends a lot on the news cycle and the needs of our research groups. We almost always have a survey in progress, but sometimes there are two or three going on at once. At other times, we are more focused on analyzing data already collected or planning for future surveys.

Q: Have you placed a poll in the field and had results that really surprised you?

A: It’s rare to be surprised because we’ve learned a lot over the years about how people respond to questions. But here are some findings that jumped out to some of us in the past:

1) In 2012, we conducted a survey of people who said their religion is “nothing in particular.” We asked them if they are “looking for a religion that would be right” for them, based on the expectation that many people without an affiliation—but who had not said they were atheists or agnostic—might be trying to find a religion that fit. Only 10 percent said that they were looking for the right religion.

2) We—and many others—were surprised that public opinion about Muslims became more favorable after the 9/11 terrorist attacks. It’s possible that President Bush’s strong appeal to people not to blame Muslims in general for the attack had an effect on opinions.

3) It’s also surprising that basic public attitudes about gun control (whether pro or anti) barely move after highly publicized mass shootings.

Were you surprised by the results Scott Keeter reported in response to the interviewer’s final question? Why or why not? Conduct some research online to discover what degree plans or work experience would help a student find a job in a polling organization.


Source: OpenStax, American Government. OpenStax CNX. Dec 5, 2016. Download for free at http://cnx.org/content/col11995/1.15