In a well-publicized incident in March, a campus protest erupted into violence at Middlebury College in Vermont. Although author and political scientist Charles Murray had been invited to speak at a faculty-moderated student forum, other students, faculty, and alumni unsuccessfully tried to disinvite him. When he appeared, protestors shouted him down and kept him from speaking. He and the event’s moderator, Professor Allison Stanger, moved to a secure location and conducted the speech and Q&A via livestream. Fire alarms sounded. Protestors banged on windows. When Murray and Stanger attempted to leave the building, they were surrounded and physically attacked. Professor Stanger, shoved in one direction while someone pulled her by the hair in another, ended up in the ER with a neck injury. (If you are unfamiliar with the details, Professor Stanger wrote a first-hand account on Facebook, and many sources, including the Washington Post and Middlebury College’s website, collected accounts immediately after the incident.)
Putting aside the obvious debate this incident raises about free speech, the reaction to Charles Murray highlights a shortcoming we all have as decision-makers: we tend to shut out messages delivered by messengers we don’t like, and that costs us. We all know the expression, “Don’t shoot the messenger.” If you shoot the messenger because you don’t like the message, the only messages you receive will be good news, telling you what you want to hear. That is a disaster for any attempt to make rational decisions.
There should be a companion warning, “Don’t shoot the message either.” If you refuse to listen to messengers you don’t like, the only people giving you information will be those you already agree with. Both kinds of “shootings” are significant contributors to creating echo chambers, where all we hear are views of the world that align with what we already believe. A necessary step to making objective, informed decisions is to separate messages from messengers – which promotes objective evaluation of all messages, even those you think you may not agree with or those that come from a source you might not like.
The situation with Murray at Middlebury shows how we hurt ourselves (and, literally, others) when we close our minds. Charles Murray is, to be sure, a controversial figure. He co-authored The Bell Curve in 1994, concluding (amidst much criticism of his methods and of the conclusion itself) that there was a genetic link between race and IQ, a thesis many found offensive. Because some people thought that was a racist message, they didn’t want to hear anything he had to say, or to let him speak at Middlebury.
He was at Middlebury to speak about a different subject, his 2012 book, Coming Apart, about the political and social consequences of the growing isolation of the white working class. In light of the results of the most recent Presidential election and the polarization we are seeing in politics, Coming Apart covers an issue that’s current, relevant, and on which Murray was an early commentator.
Just because Murray’s past book offended doesn’t mean the message of Coming Apart couldn’t be educational or important to hear. And it doesn’t mean that message is even, necessarily, right-wing and extreme.
Cornell professors Williams and Ceci, in fact, decided to test how extreme Murray’s views were, since it was the perceived right-wing extremeness of his message that “warranted” the extreme reaction by the mob at Middlebury. In an opinion piece for the New York Times, they reported that 57 professors who rated the content of a transcript of the speech did not find it right-wing at all. In fact, they rated it as a fairly centrist message (on a scale of 1 to 9, where 1 was the most liberal and 9 the most conservative, the average rating was 5.05, right down the middle). The protestors were trying to shut down a talk that, even by their own standard, didn’t contain “dangerous” content. Because they didn’t like the messenger, they shot the message.
My point is not directed at one side of the political aisle or the other. According to a meta-analysis written earlier this month by a group of psychologists led by Peter Ditto, partisan bias is an issue liberals and conservatives have in common. Based on 41 studies involving 12,000 participants, they found “overall partisan bias was robust” and “liberals and conservatives showed nearly identical levels of bias across studies.”
Given that everybody has this partisan bias, we could all use a dose of hearing messages from messengers we don’t agree with. If we get exposed to a more comprehensive slate of opinions, we’ll all make better decisions. Even when the messenger (whom we wanted to shoot) fails to persuade us, our views become more valuable for having withstood a test from an opposing view.