“Amazon Prime Day: Cash or Crash?” — Annie’s Newsletter, August 6, 2018



Newsletter Volume 1, Number 34


AMAZON PRIME DAY: CASH OR CRASH?
Depends on which headline you read

On July 16, I went to look for something on Amazon.com and, instead of finding it, got an error message with an image of an adorable dog.

The adorable dog distracted me from the specifics of the error message, so I didn’t know what the problem was.

(Likewise, the error message distracted me from figuring out the point of the adorable dog: Did the dog belong to an Amazon.com employee? Did it chew through a computer wire, an Internet Age version of my-dog-ate-my-homework?)

It was Amazon Prime Day.

I realize that Prime Day has become a huge deal. Assuming my experience on the site wasn’t a coincidence, I wondered whether website issues had potentially become the story of the day.

It’s easy to imagine such a story being framed in two distinct ways, ending up as Bad News or Good News.

Bad News: Biggest web retailer invents a quasi-holiday, gets everyone to buy into it, and then blows it because it can’t keep its website operating.

Good News: An embarrassment of riches, Amazon is doing so much business it can barely keep up.

I scanned some headlines to see if this was turning into a good thing or a bad thing and, lo and behold, it depended on which headline I looked at:

This is a good example of the importance of framing. The same facts, spun differently, can influence the takeaway.

It’s easy for us to fall for this. We form our beliefs in a haphazard way, and – even though we don’t really vet the information very well – those beliefs drive the way we interpret subsequent information.

This pair of headlines about Amazon demonstrates that clearly. You wouldn’t guess that the two headlines could possibly be referring to the same information.

It’s all about the frame.


THE IDEOLOGICAL TURING TEST:
Steel man vs. straw man

Here’s a thoughtful piece from Charles Chu on Medium.com, “The Ideological Turing Test: How to Be Less Wrong.”

Chu argues that letting go of the belief that we’re always right (or mostly right, or even often right) gets us a long way toward becoming better decision-makers.

Learning and advancement REQUIRE that we start from a position of being wrong.

Embrace being wrong and we can learn. We can advance. We can be part of the solution rather than being part of the problem.

(The “problem” is not being wrong. It’s clinging to a belief that we must be right.)

If we’re not married to our ideas (or in a committed relationship to them), we can better expose our ideas to challenges that help to refine and calibrate those beliefs, ultimately steering us toward a more accurate model of the objective truth.

That’s where the Ideological Turing Test comes in.

The original Turing Test is, of course, about whether an AI can pass as a human being in a conversation with another human being. Or:

A robot walks into a bar…. If you can tell, it didn’t pass the test.

Economist Bryan Caplan came up with the Ideological Turing Test.

If you can convince someone who holds an opposing belief that you genuinely hold that belief, you’ve passed.

If you want to practically apply John Stuart Mill’s idea that “he who knows only his own side of the case knows little of that,” make sure you are defending your side of the argument against a steel man version of the opposing side rather than a straw man.

If you can’t pass the Ideological Turing Test, that’s a good sign you aren’t defending your ideas against the strongest version of the opposition.

Additional places for finding the people and ideas from this article:
Charles Chu: @MMeditations, The Polymath Project, The Polymath Podcast
Medium: @Medium, Medium.com
Bryan Caplan: @Bryan_Caplan, website, blogs


VIEWPOINT DIVERSITY (OR LACK THEREOF) IN SOCIOLOGY
Recent research, ideological imbalance, and potential consequences
h/t John Wright

I’ve written, in Thinking in Bets and in numerous newsletters, about the ongoing issue of the significant ideological imbalance in the social sciences.

John Wright, a criminal-justice professor at the University of Cincinnati, tweeted a thread about a recent study of that imbalance among sociologists. (You can read the study itself, “Sociology’s Sacred Victims and the Politics of Knowledge: Moral Foundations Theory and Disciplinary Controversies,” by Mark Horowitz, Anthony Haynor, and Kenneth Kickham.)

The ideological imbalance is as bad as we’ve thought – or worse. In the survey of 479 sociologists, 83% identified as liberal or radical, 13% as moderate, 2% as conservative, and 2% as libertarian.

In answers to questions about sociology’s moral mission, advocacy versus objectivity, and disciplinary controversies, the ratio of respondents agreeing to those disagreeing aligned closely with ideological orientation.

Wright gave this example from the study, involving questions on moral mission and advocacy-objectivity:

You can see from this survey that the line between activism and objective inquiry has blurred in the discipline.

As an example, 43% of respondents disagreed with the statement that “Advocacy and research should be separate for objectivity.”

The potential effect on the quality of findings in a profession so composed is frightening.

On an individual-study basis, it comes back to the familiar issue of framing: In a predominantly liberal-radical discipline, what topics are they choosing to study? (And what topics may be considered off-limits?)

How are studies conducted and data analyzed on potentially controversial or sensitive topics when everyone in the process shares the same political orientation?

The potential for losing sight of the scientific mission is significant. Legendary physicist Richard Feynman described the essential quality of scientific integrity as

“a kind of utter honesty – a kind of leaning over backwards. For example, if you’re doing an experiment, you should report everything that you think might make it invalid – not only what you think is right about it: other causes that could possibly explain your result…”

If you’re not asking why you’re wrong, you’re likely not advancing science. Because of the difficulty of doing that, viewpoint diversity is an important safeguard.

But how can you do that if everyone in your field has the same viewpoint?

Additional places for finding the people and ideas from this article:
John Wright: @CJProfman, faculty page
Mark Horowitz: faculty page
Anthony Haynor: faculty page
Kenneth Kickham: faculty page


IS INTOLERANCE A FEATURE OF THE TRIBAL NATURE OF HUMANS?
New meta-analysis reveals the “Intolerance of the Right” is also “Intolerance of the Left” if you ask the right questions

Right on cue, Jesse Singal came out with a thought-provoking piece in New York Magazine that shows how the political imbalance in the social sciences has the potential to deeply bias topics studied, experiment and survey design, and interpretation of results.

That, in turn, can lead to faulty conclusions, especially when those conclusions comport with what our political leanings would lead us to intuit.

When a conclusion agrees with our politics, we are much more likely to accept the result, blind to how bias has affected a study’s design and framing.

Singal’s piece covers a powerful example of how bias can drive results, reviewing the work that led to the popular belief in the “Rigidity of the Right.”

The big idea is that conservatives are especially prone to rigidity, intolerance, or authoritarianism. Singal defines the model as follows:

“a constellation of psychological attributes and evocable states — including dogmatism, closed-mindedness, intolerance of ambiguity, preference for order and structure, aversion to novelty and stimulation, valuing of conformity and obedience, and relatively strong concern with threat — leads to a preference for right-wing over left-wing political ideology.”

But here is the problem: The kinds of questions researchers come up with to probe these attributes (e.g., asking about opinions on the power of the government vs. the rights of individuals) are likely to be political in nature.

That opens the door for the political leanings of the researchers to influence the results, because the questions tend to involve topics on which the Left values the rights of the individual over the rights of the group.

If you ask if the government should interfere with the rights of individuals to marry, Left-leaning respondents will side with individual rights more than Right-leaning respondents.

But does that mean the Right is more authoritarian in general?

You wouldn’t know unless you also asked the same range of people about an issue on which the Left favors government intervention and the Right favors individual rights.

Tweaking a set of “standard” questions to mix whether they tap into liberal or conservative sentiments yields results of dogmatism across the spectrum:

“Liberals scored as more dogmatic than conservatives when it came to their agreement with sentiments like ‘When it comes to stopping global warming, it is better to be a dead hero than a live coward’ and ‘A person who thinks primarily of his/her own happiness, and in so doing disregards the health of the environment (for example, trees and other animals), is beneath contempt,’ while conservatives, by contrast, scored higher than liberals on items tuned in the opposite political direction.”

When researchers asked people to express agreement or disagreement with statements like “I think that _____________ should not be allowed to organize in order to influence public policy” and filled in the blank with the groups on this chart, they found that respondents’ level of tolerance depended on their similarity to or difference from the group involved:

Once you probe a variety of people on sensitive issues from both sides of the aisle, you see that both sides are equally willing to favor the rights of the group over the rights of individuals.

But you can’t tell that if you ask questions only about issues that are sensitive to one side.

And here’s the big problem: The researchers who’ve found “Rigidity on the Right” are unlikely to have intentionally biased their findings. It’s more likely they just didn’t notice that the probes were biased toward exposing authoritarian tendencies from Right-leaning individuals.

You can have the best intent, but you also need people with opposing views in the conversation challenging you.

Without a motivated, opposing viewpoint in your face, you’re going to inevitably drift toward your views and the views of the group with which you identify.

What surprises me somewhat is that the body of research affirming the “Authoritarian Right” overlooked the historical reality that authoritarian leaders have come from doctrines identified with the Left (see Mao, Stalin) and the Right (see Hitler).

Authoritarianism is not a Left-Right thing. It is better thought of as orthogonal to Left-Right.

The Political Compass offers a nice visualization of the relationship between the Left-Right axis and the authoritarian-libertarian axis.

You can actually take the quiz this chart is based on, to see where your views fit on the economic (Left-Right) and social (libertarian-authoritarian) spectrum.

Additional places for finding the people and ideas from this article:
Jesse Singal: @JesseSingal
New York Magazine: @NYMag
PoliticalCompass.org


ADDENDUM ABOUT ACTION BIAS
Steve Glaveski shared with me an interesting point about the soccer-goalie example

In the July 23 newsletter, I recommended an article in the Harvard Business Review about “action bias” – our preference for taking action, even when not acting (or pausing to think) may be the better course.

Steve Glaveski, CEO and cofounder of Collective Campus, read the item and emailed me a nice note, along with a point about the soccer-goalie example, which he gave me permission to share:

“One potential factor in the soccer goalie example that might have been discounted is the sample size of penalties.

A goalie who only ever stands still is unlikely to save 33% of the time once shooters catch on to his strategy!”
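Glaveski’s point is essentially game-theoretic: a fixed strategy is exploitable once opponents adapt. Here’s a minimal sketch of that logic, with made-up numbers (nothing below comes from the original study):

```python
# Toy model of penalty kicks: the goalie saves a shot only when positioned
# in the same zone the shooter aims at. All mixes below are assumptions
# for illustration, not measured data.

def expected_save_rate(goalie_mix, shooter_mix):
    """Probability of a save when both players mix over left/center/right."""
    zones = ("left", "center", "right")
    return sum(goalie_mix[z] * shooter_mix[z] for z in zones)

# If shooters spread shots evenly, a stand-still goalie saves 1 in 3.
naive_shooters = {"left": 1/3, "center": 1/3, "right": 1/3}
stand_still = {"left": 0.0, "center": 1.0, "right": 0.0}
print(expected_save_rate(stand_still, naive_shooters))  # 0.333...

# But once shooters catch on and aim only for the corners,
# the stand-still strategy saves nothing.
adapted_shooters = {"left": 0.5, "center": 0.0, "right": 0.5}
print(expected_save_rate(stand_still, adapted_shooters))  # 0.0
```

The ~33% save rate for standing still only holds against shooters who haven’t adjusted – which is exactly the sample-size caveat: the historical data comes from a world where goalies almost never stand still.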

Additional places for finding the people and ideas from this article:
Harvard Business Review: @HarvardBiz
Steve Glaveski: Collective Campus, @CollCampus


THIS WEEK’S ILLUSION: CONFETTI

This is a version of the Munker illusion, tweeted by David Novick. The dots are all the same color:
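If your eyes won’t believe it, you can always check the raw pixel values instead. A toy sketch of that check, using a stand-in “image” with invented coordinates and colors (not the actual tweeted image):

```python
# Represent a 1x6 strip of RGB pixels as a plain list of tuples.
# Two "dots" share one color but sit under different foreground stripes.
dot_color = (150, 120, 130)                # the single color both dots share
image = [
    (255, 0, 0), dot_color, (255, 0, 0),   # dot surrounded by red stripes
    (0, 255, 0), dot_color, (0, 255, 0),   # dot surrounded by green stripes
]

# The two dot pixels are numerically identical, even though in the full
# illusion the surrounding stripes make them look like different colors.
print(image[1] == image[4])  # True
```

With a real screenshot you’d do the same comparison with an image library’s pixel-access function or an eyedropper tool.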

Additional places for finding the people and ideas from this article:
David Novick: @NovickProf, faculty page
h/t @AkiyoshiKitaoka