We humans have capabilities that we don't use and may not even know about.
In the long-ago days when merchant ships had sails, for example, a captain could put together a crew that, with training and experience, could scamper up bare masts, rig the sails, scamper back down, and get the ship underway in a matter of minutes.
Where would that captain go today to find such an able-bodied crew? The potential is out there, surely, but it's untapped.
That's an untapped skill we know about. What about skills we don't know about? Foreseeing the future would be nice. Can a forecast about something other than the weather be reliable?
An experiment organized by the national intelligence community and set to end this year is attempting to demonstrate that reliable forecasts can be made about economic and geopolitical events, and that some people, including Woodside resident and investment adviser Bob Sawyer, are good enough at it to earn the title of superforecaster.
Mr. Sawyer is one of some 12,000 volunteers throughout the world participating in the Good Judgment Project. For the past four years, small teams of about a dozen people each have been competing for the honor of accurately answering yes-or-no questions like these:
■ Will the U.N. Security Council, before March 1, 2015, admit either India or Brazil as a permanent member?
■ Will a no-fly zone officially be established over any part of Syria before March 1, 2015?
■ Will North Korean leader Kim Jong Un, before June 1, 2015, meet a head of state from one of several named countries?
These forecasting tournaments, with about 100 questions a year, are sponsored by the Intelligence Advanced Research Projects Activity (IARPA), which is overseen by the Office of the Director of National Intelligence. IARPA, with its numerous and varied programs, may be at the cutting edge of advances in intelligence gathering and analysis.
The intelligence community is "gaining knowledge about methods for making timely and accurate forecasts of world events," according to an IARPA statement. "The methods include selection and training of forecasters as well as elicitation and aggregation of their forecasts."
About three times a month, Good Judgment Project teams receive questions and collaborate on forecasts, with each member eventually weighing in with yes-or-no answers, Mr. Sawyer says. Questions accumulate, forecasts evolve, and since most questions have a time element, they drop off the list as deadlines pass. Participants receive a stipend for their work.
Mr. Sawyer's forecasts were in the top 4 percent in terms of accuracy in his first year, the top 3 percent in his second year, and the top 2 percent in the third, which earned him his title. "I really dove in and got addicted to it," he says.
"I guess I'm a little surprised," he said when asked about his standing. In his first year, he says he never looked at his score. At some point, he learned that he was number 8 out of 200. "It's like, 'Whoa.' I didn't know," he says. "I never would have guessed I was good. After four years, I know that I'm rather good at it. I'm sort of good. I'm not great."
He says he spends about 10 hours a week thinking about the questions and reading background material, a process that would be impossible without the web, he says. Sometimes he works in his office, sometimes at the dining room table, sometimes in the kitchen, sometimes in bed. "That's the beautiful thing about laptops," he says.
While exercising on a stair-step machine -- he says he climbs 50,000 flights a year -- he may watch the news and he may take notes. "I'll go find a piece of paper in the fitness club and I'll put some notes on it," he says. Notes tend to accumulate in his pocket before finding their way to his desk, which has even more pieces of paper on it. "I should clean that up," he says. "Maybe I should be more organized."
Mr. Sawyer, who is 57, is married to Nancy Sawyer and the couple has two children. He grew up in an educated household. His dad was an electronic and chemical engineer. His mother volunteered at the library and school and had a degree in math. In the 1940s, she was employed calculating numbers with a slide rule in a wind tunnel for the Massachusetts Institute of Technology. "Her job title was 'computer,'" Mr. Sawyer says. "My mother was a computer."
He has a bachelor's degree in geography and a master's degree in business, both from Northwestern University. Geography? He says he chose it because it was light on required courses, allowing him to take electives, including lots of economics. A complex question that a geographer might like: Why do cities evolve as they do? Why did France develop with just one major city while Germany has several?
"I'm inquisitive, insatiably curious," Mr. Sawyer says. "I can get fascinated by lots of different disciplines."
Good Judgment Project team members do their own research and discuss their forecasts electronically, bringing to the discussion their diverse assets. One of Mr. Sawyer's teammates, for example, subscribes to Bloomberg News, a wellspring of authoritative information on economics and geopolitics.
The team may have women on it, he says, but he doesn't know since some members use aliases. In 2012, women were about 20 percent of the group at a Good Judgment Project conference, he says. "We'd like to have more," he adds.
Thinking about Ebola
The project applies principles discussed in James Surowiecki's 2004 book, "The Wisdom of Crowds." In this thorough and highly readable analysis, Mr. Surowiecki offers a forecasting formula based on the judgment of a particular kind of crowd. The right components have to be in place: diversity of opinion, independence of thought, the ability to draw on specialized knowledge, and the ability to integrate individual judgments into a collective decision.
In early 2015, when the Ebola virus had broken out in two African countries, his team received a question asking for a forecast on whether Ebola would move to a third country by a certain date. Mr. Sawyer says he was confident that it would be contained. The World Health Organization and the Centers for Disease Control were engaged. "They've got this down," he recalls thinking. "You picture a (Boeing) 747 full of medical experts and they land and Ebola deaths get stopped at 50 deaths or 100 deaths."
He won the agreement of his teammates. Three weeks later, Ebola broke out in a third country. "I was wrong," he says. "You could still argue that my reasoning was right. (The public health establishment) doesn't blow it very often ... but they sure blew it on that one."
Mr. Sawyer says that when this project ends in 2015, he may start a consultancy with some of his colleagues to train organizations to do their own forecasting. Part of that may be having an organization hold internal forecasting tournaments. His services could include devising well-formed yes-or-no questions -- not an easy task, he says -- and composing algorithms that can aggregate people's answers.
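Aggregating a team's answers can be simpler than it sounds. As a hypothetical illustration only -- the Good Judgment Project's actual algorithms are more sophisticated and are not described here -- a minimal sketch would average each member's probability estimate into one team forecast, then grade that forecast against the eventual outcome with a Brier score, a standard accuracy measure for probabilistic forecasts:

```python
# Hypothetical sketch of forecast aggregation and scoring.
# The team values below are illustrative, not project data.

def aggregate(probabilities):
    """Combine individual yes-probabilities into one team forecast
    by taking the simple mean (the most basic wisdom-of-crowds rule)."""
    return sum(probabilities) / len(probabilities)

def brier_score(forecast, outcome):
    """Squared error between a forecast (0 to 1) and the outcome (0 or 1).
    0.0 is perfect; an uninformative 0.5 forecast always earns 0.25."""
    return (forecast - outcome) ** 2

# Five teammates independently estimate the chance an event occurs.
team = [0.70, 0.55, 0.80, 0.60, 0.75]
forecast = aggregate(team)        # about 0.68
score = brier_score(forecast, 1)  # event happened; lower score is better
```

In practice, researchers have found that weighting the historically accurate forecasters more heavily, or pushing the averaged estimate further toward 0 or 1 ("extremizing"), tends to beat the plain mean, but the simple version above captures the core idea of turning many individual judgments into a collective decision.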
In 2005, this reporter made a successful forecast about Kepler's Books & Magazines. Based on observations of shelves barer than usual and book markers slightly smaller than usual, this reporter announced to the Almanac newsroom, to laughter and incredulity, that the bookstore was "in trouble." Several months later, the store announced it was closing. (It soon reopened with the enthusiastic support of the community.)
How would his team have considered this forecast? "You'd get shredded," Mr. Sawyer said. "Not to say that you were wrong. You would find a lot of questions raised about your reasoning. ... People confuse luck with skill."
About the project
While the intelligence community is sponsoring this project, it was the inspiration of Philip Tetlock, a psychology and business school professor at the University of Pennsylvania.
In an essay about the project's genesis, Mr. Tetlock cites the surprising end to the Cold War, particularly the arrival in 1988 of Soviet Union Chairman Mikhail Gorbachev. Mr. Tetlock noted that experts who hadn't predicted Mr. Gorbachev's radical policy changes nevertheless came up with compelling explanations for what was happening in the Soviet Union.
Over 30 years of research, Mr. Tetlock says he's learned two things about political analysts: that it's "very hard" for them to do much better than chance, and that as a group they tend to be overconfident.
"When they made strong predictions that something was going to happen and it didn't," he says, "they were inclined to argue something along the lines of, 'Well, I predicted that the Soviet Union would continue and it would have if the coup plotters against Gorbachev had been more organized.'"
The Good Judgment Project expects accountability from its forecasters, Mr. Sawyer says.
The project also addresses the question of whether social sciences -- world politics and economics, for example -- are like hard sciences in that they are clock-like and predictable, or whether they are cloud-like and somewhat random. With that question in hand, the project addresses what goes into the making of a good forecaster: deep knowledge about a topic, or wide-ranging opinions on many topics?
"If world politics is more cloud-like -- little wisps of clouds blowing around in the air in quasi random ways -- no matter how theoretically prepared the observer is, the observer is not going to be able to predict very well," Mr. Tetlock says. "One of the things that we discovered ... was that forecasters who suspected that politics was more cloud-like were actually more accurate in predicting longer-term futures than forecasters who believed that it was more clock-like.
"Good judgment is a curious thing," he says. "Virtually all of us think we possess it but few of us can come up with a definition much more compelling than the old definition of pornography: 'I know it when I see it.'"