Friday, October 26, 2012

Comment on House Effects

Simon Jackman publishes his estimates of pollster house effects.

I comment:


I think the post illustrates the fact (which you stress) that estimated house effects are, uh, estimates with standard errors.  This is made very clear by the fact that the estimated house effects are quite different for Pulse Opinion Research and Rasmussen.  IIRC they are two different names for the exact same pollster (Pulse when the poll is commissioned, Rasmussen when they poll on their own, with the business logic of getting publicity).

I also think a combined calculation for Pulse plus Rasmussen would be interesting (and fairer than fair to Rasmussen, since it would include 11 polls with a small house effect which they present under a separate name).  The "a Rasmussen under any other name would smell as sweet" estimate will be very close to the Rasmussen-called-Rasmussen estimate, since there are only 11 Rasmussen-called-Pulse polls.
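To see why the combined estimate barely moves, here is a minimal sketch of pooling two house-effect estimates by inverse-variance weighting.  All the numbers are made up for illustration; they are not Jackman's actual estimates, and the standard errors are hypothetical (the Pulse one is larger because there are only 11 Pulse-labeled polls).

```python
# Toy sketch: combine house-effect estimates published under two labels
# of the same pollster.  Inverse-variance weighting: the estimate with
# the smaller standard error (more polls) dominates the pooled result.
# All numbers below are hypothetical, chosen only for illustration.

def combine(estimates, std_errors):
    """Inverse-variance weighted mean and its standard error."""
    weights = [1 / se ** 2 for se in std_errors]
    total = sum(weights)
    mean = sum(w * e for w, e in zip(weights, estimates)) / total
    return mean, total ** -0.5

# Hypothetical: Rasmussen-labeled estimate -2.0 (se 0.3, many polls),
# Pulse-labeled estimate -0.5 (se 0.8, only 11 polls).
combined, se = combine([-2.0, -0.5], [0.3, 0.8])
print(round(combined, 2), round(se, 2))  # -1.82 0.28
```

The pooled estimate lands much nearer the Rasmussen-only value than the Pulse-only one, which is the point: with only 11 Pulse polls, folding them in shaves a little off the Rasmussen house effect but cannot move it far.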

[Now I can't help wondering about Rasmussen's business plan.  The poll-and-publish-the-results-for-free-to-get-publicity strategy might be a big mistake if the publicity is bad publicity because the published polls show they are biased (we have to wait for the election to estimate bias rather than house effect, but Rasmussen rep minus dem, minus the actual election rep minus dem, averaged 3.8% in 2010).  I think, however, that it is a brilliant strategy.  Rasmussen has become famous, and much loved by many, because of the huge Rasmussen Republican house effect. I googled Rasmussen and "most accurate pollster" and found a huge number of recent 2012 hits to definite assertions made *after 2010* that Rasmussen is the most accurate.  This in spite of Rasmussen having a huge mean squared error (and, as noted, a huge mean error) in 2010 (and 2000).  Clearly there are many people who equate "largest Republican house effect" with "most accurate".  A reputation for being reliably Republican is clearly extremely valuable to Rasmussen.  It is clear that there are many people who sincerely equate pro-Republican with accurate, unbiased, fair and balanced.

So it might hurt them to include the Pulse results and get the estimated Rasmussen house effect down. But it's still the right thing to do.

I think it is clear that the huge Gallup house effect is the result of an admirable determination not to fiddle the numbers.  Gallup is still using the likely voter filter they developed decades ago (at least the description seems the same to me -- it might be subtly changed in a way which is numerically very important but passed my "this seems familiar" vague-memory screen).  They developed an excellent record of getting it about right with that filter.  Then in 2010 they missed the actual Republican minus Democrat average over congressional races by 9%.  It is very clear that they were worried about the LV filter in 2010, because they began publishing alternative LV estimates using the same sort of LV filter as others (basically, if you say you are likely to vote, you are counted as a likely voter).

It seems that something about the population has changed.  I don't know what.  My guess is that the Gallup LV filter always gave a sample which was older than the population of people who actually voted.  This would not be a huge problem back when the LV filter gave good forecasts of results, because there wasn't a huge difference in partisan preferences by age.  In 2008 and 2010 there were huge differences (older = more Republican).

Then there is Gravis Marketing, the other pollster out beyond Rasmussen.  I am quite sure they are acutely embarrassed by their house effect.  This is because a Gravis Marketing employee spent an amazing amount of time and pixels in the comment thread to a post of mine about their house effect.  One explanation of the Rasmussen house effect is that they try to poll all in one day.  Others choose phone numbers and then call and call day after day (for 3 or 4 days) till they get an answer.  All-in-one-day pollsters over-sample people who are home a lot.  Note that the first Gravis polls were conducted all in one day, and then they switched to calling for 2 days.  This was clearly (first-person claim) an effort to avoid a bias which might be showing up as a Republican house effect.

Over on the other side, I just don't get Zogby.  The real phone-based Zogby poll is perfectly respectable.  The web-based poll, Zogby Interactive, now renamed JZ Analytics, is notoriously, absurdly unreliable.  This has to be terrible publicity for Zogby.  Using just initials and not a name is not enough to hide the association from, well, the people who might hire a pollster. It isn't even a strategy to get Democratic organizations to commission reliably Democratic-slanted polls to affect the narrative: Newsmax is a major consumer of JZ Analytics polls.  I just can't imagine the thinking of John Zogby or whoever runs Newsmax (I type "whoever" as I assume there is a human being involved, although I can't claim that Newsmax editorial policy has clearly passed the Turing test).  ]




I think some arithmetic assistance might be useful.  You show the house effects for Obama's two-party vote share and stress that all estimates are in two-party terms.  I think it would be helpful if you explained that the house effect for Obama minus Romney is twice as large (I am presenting myself as arithmetically sophisticated, but I originally typed "slightly less than" because I mis-remembered "two party vote share" as "percent support").  Yes, some readers will be insulted to have arithmetic explained to them, but others will be helped.
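The arithmetic in question: since the two-party shares sum to 100, a house effect that adds h points to Obama's share subtracts h from Romney's, so the Obama-minus-Romney margin moves by 2h.  A minimal sketch with made-up numbers:

```python
# House effect on a two-party share vs. on the margin.
# If Obama's two-party share shifts by h points, Romney's shifts by -h
# (the shares sum to 100), so the Obama-minus-Romney margin shifts by 2h.

def margin_effect(share_effect):
    """Margin shift implied by a given shift in one two-party share."""
    return 2 * share_effect

obama_share = 51.0    # hypothetical true two-party share
house_effect = -1.5   # hypothetical house effect on Obama's share

polled_obama = obama_share + house_effect           # 49.5
polled_romney = 100 - polled_obama                  # 50.5
true_margin = obama_share - (100 - obama_share)     # +2.0
polled_margin = polled_obama - polled_romney        # -1.0
print(polled_margin - true_margin)                  # -3.0, i.e. 2 * house_effect
```

A 1.5-point house effect on the share shows up as a 3-point house effect on the margin, which is why the margin numbers readers see quoted look twice as dramatic as the share numbers in the post.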

Note: Huffdweebs have a word limit, so this is longer than my comment there.  I let it all hang out here.  The stuff that isn't there is in [].
