Experts are often on television making short-term predictions on stocks and global markets or longer-term predictions on topics like society and technology. With impressive credentials, these experts usually have had some level of measurable success in their careers, but how do they actually fare when comparing their predictions to actual outcomes?

The Score Card
Baseball players have batting averages that quantify and rank them according to their hits. Shouldn’t experts have the same accountability for their assertions? I am not criticizing; in fact, I appreciate being called an expert – in my field. But I do think we should assess experts based on their accuracy.

When listening to “expert” advice, shouldn’t we know the difference between the accurate predictors and the inaccurate predictors?

Back in 1984, The Economist, the well-known and respected British publication that can be likened to Time magazine in the U.S., asked for ten-year predictions on economic growth, inflation, the U.K. pound to U.S. dollar exchange rate, and other similar data.

The magazine asked four groups to provide forecasts:

1) Ex-finance ministers of OECD economies
2) Chairmen of multinational firms
3) Oxford University economics students
4) Dustmen (the U.K. term for sanitation workers)

Most people, myself included, would presume that certain groups would make more accurate predictions than others. Here is how I thought they would perform:

Most accurate predictions: Finance ministers
Why: Macroeconomics is what they have studied and is what they do. The economy is their expertise.

Second most accurate predictions: Chairmen of multinational firms
Why: They must have a solid understanding of various world economies in order to plan for their companies, but most plan for 3-5 years, not 10 years.

Third most accurate predictions: Oxford University economics students
Why: They are smart, educated, and study economics, but have no real practical experience.

Least accurate predictions: Dustmen
Why: This group probably has less formal education than the other groups, and they have jobs that don’t relate to economics.

The Results:
By 1994, the ten years had passed and there was actual data to compare to the predictions.

So, who was right?

The best predictors, in order of accuracy, were:

First Place (TIED):
Tying for first place were the chairmen of the multinational firms and the dustmen.

Third Place:
Oxford University students

Last place:
Ministers of Finance

The Ministers of Finance coming in dead last raises the question…


One might immediately dismiss the results as sheer luck on the part of the dustmen, but not so fast…

The Economist’s article does not lay out the reasoning proposed by the various groups, but the phenomenon of laypeople better predicting the long-term future is actually quite common.


Experts are usually confident in their capabilities, which can make them less open-minded about the possibility that their key assumptions are wrong. The person who does not consider himself an expert in the subject is not burdened by preexisting notions and is less certain that his initial presumptions must be right.

The result: the group with less knowledge was more accurate, but less confident. The group with the most knowledge was less accurate, but very confident.


Experts can provide substantial value, especially over the short term, but they are notoriously poor at longer-term predictions. Use their knowledge, but think for yourself.

I am a decision-making expert, but as I remind audiences during speeches and presentations, my role is not to decide the answer for others. My work is about empowering others to make better decisions on their own. You should take the same approach when dealing with anyone who is an expert.