One common complaint among political scientists, or those who like complaining about academia, is that so much research has so little to do with real politics. This is not a difficult argument to support; one could easily read a dozen contemporary academic journals and count on one hand the articles that are actually relevant outside of a university setting. As you can imagine, there are two camps on this subject. Some people feel that research is not intended to be popular reading and that it necessarily targets a very small audience. This argument is not without merit. Academic journals are written in a way that presupposes a lot of knowledge on the part of their readers.
Conversely, it is often argued that research, or at least its conclusions, should be somehow applicable to the real world. In other words, you may not care about academic argument, nor will you understand all of the statistical jargon and literature reviews in an article, but you will be interested to know that the research shows X and Y to be true. I fall firmly in this camp. I believe that the first question to ask of any research agenda is "Will I be able to explain this to an intelligent layperson, and if so, will he care?" Of course research cannot be written for the average Jerry Springer audience. But a normal person with an interest in politics and the ability to process arguments above the Sean Hannity level should be able to grasp the implications of your findings.
From time to time I would like to take the opportunity to share some relevant research with you, gentle readers. I don't care to turn this into an academic blog (believe it or not, there are plenty in every conceivable field of interest), but I think it's important for more people to realize that there is solid empirical support – quantitative and experimental – for many of the things we presume to know about the political landscape. In other words, "Right-wing talk radio badly misinforms people" is not an assumption but rather a well-supported argument.
Along those lines I would like to recommend one of my favorite pieces of public opinion research, one that goes a long way toward understanding why our national political discourse is one step above a throng of retards slap-fighting in a mud puddle. Jim Kuklinski and Paul Quirk's
"Reconsidering the Rational Public: Cognition, Heuristics, and Mass Opinion" from Elements of Reason (if I link the book they might not punch me for posting a chapter here) is one of the best, most cynical analyses of individual-level public opinion that you will find. While I doubt you're interested in reading 50 pages of it without the benefit of course credit, even a glance at their experimental results (pages 28 and beyond) will be interesting.
The authors perform a series of lab experiments to measure opinions, information, and how individuals react when their beliefs conflict with facts. They ask the participants to guess what portion of the budget is spent on welfare, offer what they consider an appropriate amount to spend on welfare (if it differs), and state how confident they are in their estimates. The findings tell us a lot of what we already know.
First, the Reagan years of "welfare queen" rhetoric have resulted in nearly every participant significantly overestimating the amount we spend on welfare payments (other forms of public aid were explicitly excluded from the discussion). Some guessed amounts as high as 25% of the annual budget. Second, and more importantly, people are wildly overconfident in their levels of information. Two out of three respondents were either "confident" or "very confident" that their guesses were accurate. The relationship between accuracy and confidence was inverse; that is, the less accurate the guess, the more confident the respondent was in its accuracy.
If you have a strong stomach you can proceed to the section entitled "Resistance to Correction" (p. 29). Presenting the participants with facts showing that their responses were incorrect had almost no effect on their opinions. Very few of them were willing to revise their positions or retract their previous statements even when the hard facts were put in front of them. In short, the beliefs and preferences of participants with no information were indistinguishable from those of participants who were given the facts. There was no relationship between what these people believed and reality. Whether the two coincided or not was irrelevant to the firmness with which they clung to their versions of the facts.
If you've ever wondered why debating "average people" about…well, about anything is so goddamn unfulfilling, I think this type of research does an excellent job of explaining it. People aren't "stupid" in the sense that they lack information or access to it (well, that may also be the case, but it's beside the point). The truth is much more depressing. It makes absolutely no difference whether or not they have information. Presenting the average American with cold, hard facts disproving his or her beliefs is likely to be of no consequence, especially on issues connected to a set of ideological beliefs or values.
Our society encourages people to create their own reality, and it is succeeding. Bertrand Russell said, "The trouble with the world is that the stupid are cocksure and the intelligent are full of doubt." In the current political landscape we find that the less people know, the more confident they are that they know everything.