For the first time in my life, professional pollsters tracked me down and actually asked my opinion on what they considered vital questions on the state of the union. Naturally, I was flattered. But the conversation went downhill in a hurry.
It was a legitimate poll conducted by the Pew Research Center of Princeton, New Jersey, and I suspect the woman conducting it was a college student. Judging only by the monotone of her voice, I'm also guessing the pollster who called last Friday was bored to death and eager to go party.
I must be a bummer to interview. About two minutes in, she asked several questions on abortion. My responses: I consider myself pro-choice; I am somewhat religious but haven't been to church in years; and, no, I don't believe the government should pay for abortions. I felt set up to appear foolish. I tried to explain that the Hyde Amendment already prevents the government from paying for abortions and that I consequently found the question misleading. Her only response was to repeat the question. Apparently, you can't explain your answers to a pollster.
The other questions covered healthcare reform, climate change and rating Congress and the president on their work to solve these problems, as well as my political party, age, marital status, ethnic group and ZIP code. I assume she figured out I was male.
Frankly, she zipped through the questions so quickly and so robotically that I'm uncertain exactly what I told her or how I fit into Pew's model matrix.
As a journalist who now blogs as a political columnist, I have mixed emotions about the validity of polls. At best, they are a broad indicator of public sentiment on an issue at a specific point in time. At worst, they are a weapon manipulated to advance a political point of view.
Which is why I found so pointed an article by Phil Trounstine and Jerry Roberts in today's Los Angeles Times skewering a recent poll conducted by the liberal website Daily Kos. The poll showed San Francisco Mayor Gavin Newsom narrowing the gap to nine points behind California Atty. Gen. Jerry Brown in the Democratic primary race for governor. Write Trounstine and Roberts:
Within minutes, the San Francisco Chronicle posted a blog item saying the poll showed the race was "narrowing," comparing it to a June survey, conducted by a different company, that gave Brown a 20-point lead over Newsom. The item was quickly picked up and posted by Rough & Tumble, California's premier political news aggregator. Then it was reported and re-blasted by "The Fix" at the Washington Post, one of the top political sites in the country. Within 12 hours, this characterization of California's race for governor became received wisdom.
There was only one problem with this wisdom: It was wrong.
The incident illustrates how political misinformation and misinterpretation can be more viral than the truth in the Internet News Age, as reporting on polls pulses through the electronic highway, launched by news organizations with little time to evaluate and sift the quality of research. In recent weeks, a series of California political surveys has produced a cacophony of often conflicting analysis, opinion and reporting that served to confuse readers and distort political perceptions.
For example, comparing and measuring the Daily Kos poll, conducted by Research 2000, against the previous poll -- done with a completely different methodology by Moore Methods Research of Sacramento -- created a false equivalency. In fact, a recent follow-up poll by poll director James Moore, who has long experience in California, found that, far from tightening, Brown's lead over Newsom has grown to 29 percentage points.
A poll's methodology -- including the sample size, method of selection and phrasing of questions -- is crucial. The Daily Kos survey, for example, used random digit dialing to reach California adults. To identify them as "likely voters," pollsters asked respondents several questions, including whether they considered themselves Democrats or Republicans. But identifying 600 likely voters didn't provide the number of Democrats and Republicans statistically necessary to measure the primaries, so pollsters called more people until they had 400 self-identified Republicans and 400 self-identified Democrats. Then, as they put it, "quotas were assigned to reflect the voter registration distribution by county."
After this statistical slicing and dicing, the survey produced a final sample of alleged likely voters that included 18% under age 30 and 19% age 60 and older. But according to a real-world screen of likely voters -- based on actual voting histories -- the June 2010 primary electorate is expected to include about 6% people under 30 and 38% people over 60.
These issues alone would be enough to distort the state of the Brown-Newsom contest. But will any of them surface when the next reporter Googles the California governor's race, looking for standings? Not a chance. Why does it matter? Because misreporting of polls allows campaign spinners not only to boost or suppress candidate fundraising but to manipulate news coverage, frame campaign narratives and shape public perceptions.
The two reporters conclude that polling stories should be viewed by readers and voters with great skepticism and that news outlets should use greater care in analyzing and distributing survey data.
Amen, brothers.