But once in a great while, a poll comes along with methodology that is so implausible that it deserves some further comment. The Foster McCollum White Baydoun poll of Florida is one such survey.[skip]
For instance, we have our house effects adjustment, which corrects for most of these tendencies. Based on this poll, and a prior survey the firm conducted in Michigan, we calculate the firm’s house effect as leaning Republican by roughly 11 percentage points relative to the overall consensus. We do not subtract out the entire 11-point house effect from the polling firm’s results — the model allows polling firms to retain some of their house effect — but the model does adjust the poll substantially, treating it as about a 7-point lead for Mr. Romney rather than a 15-point one. That’s still a very good number for Mr. Romney — enough to make him a slight favorite in our forecast for the state — but at least a little bit more reasonable relative to common sense. Is there an argument for just throwing the poll out? In this case, perhaps. But as I said, I’d rather design a system where we have to make fewer of those judgment calls and err on the side of inclusivity. Our threshold for calling out a poll’s technique as being dubious, as we have here, is pretty high — but our threshold for actually throwing a poll out is higher.

Silver is by far the most sophisticated aggregator published by mass media. He notes that this outrageous nonsense, which is pro-Romney by 11 points compared to the average of other pollsters, only counts as if it showed a 7-point Romney lead rather than a 15-point one. I think the 11-point estimated house effect is a new record. I don't like to make predictions, but I am willing to predict that it will be surpassed. The other plainly biased pollsters are "We Ask America" (which belongs to a business lobby) and "Purple Strategies," whose CEO is the notorious Alex "hands" Castellanos, one of the vilest partisan operatives in the business.
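The adjustment Silver describes can be sketched in a few lines: estimate a firm's house effect relative to the consensus, then subtract most, but not all, of it from the raw margin. This is only an illustration of that kind of partial shrinkage; the function name and the retained fraction are my assumptions, not FiveThirtyEight's actual model.

```python
# Hypothetical sketch of a house-effect adjustment by partial shrinkage.
# The retain fraction (how much of the house effect a firm keeps) is an
# assumption chosen to roughly match the Florida numbers in the quote,
# not FiveThirtyEight's published methodology.

def adjusted_margin(raw_margin, house_effect, retain=0.3):
    """Remove most, but not all, of a firm's estimated house effect
    from its raw margin (positive values = pro-Romney, in points)."""
    return raw_margin - (1.0 - retain) * house_effect

# The Florida example: a raw Romney +15 margin and an estimated
# R+11 house effect; removing about 70 percent of the house effect
# leaves a margin of roughly Romney +7.
print(adjusted_margin(15.0, 11.0))
```

With these assumed numbers, 15 − 0.7 × 11 ≈ 7.3, which matches the quote's "about a 7-point lead rather than a 15-point one."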