
Average citizens can be much better than CIA at forecasting world events, study finds


A small group of psychologists working with the intelligence community set out to see if the average person could predict world events better than experts with inside information. It turns out, they often can — and they're now known as superforecasters.

Three years ago, 3,000 people were chosen to participate in the Good Judgment Project, which pitted average citizens from a wide variety of backgrounds against professional intelligence analysts. Participants were asked a series of questions of importance to the intelligence community, and the best forecasters among them (the top one percent) predicted geopolitical events with roughly 30 percent better accuracy than analysts with access to classified information, NPR reported.

"I'm just a pharmacist," Elaine Rich, a pharmacist, told NPR. "Nobody cares about me, nobody knows my name, I don't have a professional reputation at stake. And it's this anonymity which actually gives me freedom to make true forecasts."

Ms. Rich is one of the most accurate participants in the study, which has earned her the label of "superforecaster."

Ms. Rich told NPR that her method amounts to little more than a bit of Google research on the question at hand. She is asked questions such as "Will Russian armed forces enter Kharkiv, Ukraine, by May 10?" and then rates each forecast on a numerical scale.

"Everyone has been surprised by these outcomes," Philip Tetlock, one of the three psychologists who came up with the idea for the Good Judgment Project, told NPR.

Mr. Tetlock believes the study's findings support the idea that there is wisdom in crowds: although the educated guesses of individual people are often wrong, groups of people can, on average, home in on the truth.

"There's a lot of noise, a lot of statistical random variation," Tetlock told NPR. "But it's random variation around a signal, a true signal, and when you add all of the random variation on each side of the true signal together, you get closer to the true signal."

By tracking the results, the scientists can identify who is best at zeroing in on the truth, and then work on ways to make those forecasters even better.

Jason Matheny, a member of the intelligence community who helped get the Good Judgment Project off the ground, was surprised by the results.

"They've shown that you can significantly improve the accuracy of geopolitical forecasts, compared to methods that had been the state of the art before this project started," he said.

Asked by NPR whether he thought the study would put a dent in the need for professional intelligence analysts, he said no. "I think it's a complement to [existing] methods rather than a substitute," he said.

In the near future, the scientists behind the Good Judgment Project plan to recruit new participants to continue their work, NPR reported.

