Op-Ed Columnist
Learning How to Think
By NICHOLAS D. KRISTOF
March 26, 2009

Ever wonder how financial experts could lead the world over the economic cliff?
One explanation is that so-called experts turn out to be, in many situations, a stunningly poor source of expertise.
There’s evidence that what matters in making a sound forecast or decision isn’t so much knowledge or experience as good judgment — or, to be more precise, the way a person’s mind works.
More on that in a moment. First, let’s acknowledge that even very smart people allow themselves to be buffaloed by an apparent “expert” from time to time.
The best example of the awe that an “expert” inspires is the “Dr. Fox effect.” It’s named for a pioneering series of psychology experiments in which an actor was paid to give a meaningless presentation to professional educators.
The actor was introduced as “Dr. Myron L. Fox” (no such real person existed) and was described as an eminent authority on the application of mathematics to human behavior. He then delivered a lecture on “mathematical game theory as applied to physician education” — except that by design it had no point and was completely devoid of substance. However, it was warmly delivered and full of jokes and interesting neologisms.
Afterward, those in attendance were given questionnaires and asked to rate “Dr. Fox.” They were mostly impressed. “Excellent presentation, enjoyed listening,” wrote one. Another protested: “Too intellectual a presentation.”
A different study illustrated the genuflection to “experts” in another way. It found that a president who goes on television to make a case moves public opinion only negligibly, by less than a percentage point. But experts who are trotted out on television can move public opinion by more than 3 percentage points, because they seem to be reliable, impartial authorities.
But do experts actually get it right themselves?
The expert on experts is Philip Tetlock, a professor at the University of California, Berkeley. His 2005 book, “Expert Political Judgment,” is based on two decades of tracking some 82,000 predictions by 284 experts. The experts’ forecasts were tracked both on the subjects of their specialties and on subjects that they knew little about.
The result? The predictions of experts were, on average, only a tiny bit better than random guesses — the equivalent of a chimpanzee throwing darts at a board of possible outcomes.