Human beings have always been great anticipators. It was our bread and butter from the start. Other species have superior physical strength, sharper senses, or greater speed. But we Homo sapiens are the great planners, a quality that has given us enormous survival advantages.
Consider: there’s little need to run down your prey if you can accurately anticipate where it will be and then ambush it. Similarly, you don’t need to spend your whole day foraging when you can predict when trees and bushes in specific locations will be bearing fruit. Or, better yet, when you can foresee that the seeds you plant in the spring will become full-fledged crops a few months later.
So, it’s no wonder that we have been striving to hone the art of prediction throughout the ages. Although we are still far from reaching the goal of flawless foresight, it’s amazing how far we have come. To gain the proper appreciation, let’s briefly examine seven key periods in history.
Interpreting Signs from Nature
Primary Era: Prehistory to Classical
Because forethought made it easier for us to hunt and forage, we tried to stretch our abilities by looking to other features of the natural world. Early humans turned to practices such as augury (that is, observing the flight of birds) to divine the future. At an elemental level, maybe there were practical benefits to this practice. When seagulls stop flying and take refuge somewhere, for example, it can mean that a storm is approaching.
But, being human, we pushed the perceived meanings of bird behaviors. By the time of the Romans, those practiced in the art of augury were helping to make key social and military decisions based on their interpretations of the world of birds. Of course, there were various other less savory divination practices as well, such as haruspicy, which was the inspection of the entrails of sacrificed animals for clues to the future.
Casting Lots
Primary Era: Classical to Medieval
The process called “sortilege” or “cleromancy” involves predicting the future from sticks, beans or other items drawn at random from a collection. Forms of this practice show up throughout history in a wide variety of cultures, from Judeo-Christian traditions (casting lots appears in various places in the Bible) to the Chinese tradition of the I Ching, which is descended from bone divination.
Such practices were one step removed from the interpretation of natural patterns and one step closer to symbol-dominated forecasting. Even assuming that there’s no underlying validity to cleromancy, it can sometimes benefit decision-makers. The I Ching, for example, is based on both the tossing of coins and the subsequent interpretation of the text to which the coin tosses point. As the text is interpreted, people can bring both their conscious and unconscious minds to a given problem. Under certain circumstances, this might loosen the constraints on conventional thought and lead to more innovative solutions and decisions.
Applying Math to Nature
Primary Era: Renaissance
Over the course of thousands of years of recorded history, human beings became careful observers of many kinds of natural patterns, especially the patterns of the heavenly bodies. Through sheer empiricism, some experts were able to predict the motions of the stars and planets with great exactitude.
When Nicolaus Copernicus (1473–1543) wrote On the Revolutions of the Heavenly Spheres, he helped change the way humanity saw its place in the universe. Through a potent combination of logic, mathematics and empirical observation, he was able to demonstrate that the earth revolved around the sun rather than vice versa. His ideas were later confirmed by Galileo Galilei and others.
Through such means, human beings became better able to explain what was happening in our solar system, and our predictive powers became considerably more impressive. Eventually, geniuses like Isaac Newton and Gottfried Leibniz developed calculus. This was a huge step forward in the discipline of forecasting. After all, calculus is ultimately the study of how things change, providing a “framework for modeling systems in which there is change, and a way to deduce the predictions of such models,” according to Prof. Daniel Kleitman of MIT.
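To make that idea concrete, here is a minimal sketch (a generic illustration, not anything from the historical record) of Euler’s method, the simplest calculus-based technique for stepping a model of change forward in time to produce a forecast. The starting value and growth rate are arbitrary:

```python
def euler_forecast(y0, rate, dt, steps):
    """Forecast y(t) under the simple model dy/dt = rate * y
    by repeatedly taking small steps of size dt (Euler's method)."""
    y = y0
    for _ in range(steps):
        y += rate * y * dt  # change over one small step
    return y

# A population of 1,000 growing 3% per year, stepped daily for 10 years
print(euler_forecast(1000.0, 0.03, 1 / 365, 3650))
```

With daily steps, the forecast lands very close to the exact answer from calculus, 1000·e^0.3 ≈ 1349.9; smaller time steps bring it closer still.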
Harnessing Statistics
Primary Era: 18th to 20th Centuries
Although some statistical methods are over two millennia old, probability theory emerged in the 17th century, and modern statistics as we know it arose in the late 19th and early 20th centuries. Francis Galton developed key concepts such as correlation and regression analysis, and Karl Pearson introduced today’s widely used Pearson product-moment correlation coefficient along with the term “standard deviation.” The brilliant Ronald Fisher developed null hypothesis testing and many other key concepts. Such ideas remain integral to modern data science, machine learning and predictive analytics.
What it boils down to is this: humankind finally had a systematic way of collecting, analyzing and presenting numerical data related to probabilities. Statistics allows researchers and analysts to measure, communicate and sometimes even control uncertainty.
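As a small taste of what those ideas buy us, here is a sketch of the Pearson product-moment correlation coefficient mentioned above, which quantifies how strongly two variables move together (the data points here are invented for illustration):

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient:
    covariance of x and y divided by the product of their spreads.
    Ranges from -1 (perfect inverse) to +1 (perfect direct)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# A perfectly linear relationship yields r = 1.0
print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))
```

An r near zero suggests no linear relationship, which is exactly the kind of measured, communicable statement about uncertainty that earlier eras of forecasting lacked.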
Developing Future Studies
Primary Era: Post World War II
The origins of future studies lie with people such as Samuel Madden and H.G. Wells, who used a combination of imagination and trendwatching to make guesses about the evolution of society and, in the case of Wells, its most important technologies. However, “future studies” only became a formal discipline after World War II, when technologies and social systems were changing at such a fast pace that the future appeared less certain than in previous eras.
The World Future Society was founded in 1966 and the World Futures Studies Federation was founded in 1973. During those years, the first generation of so-called futurists emerged. Alvin Toffler, author of the influential and popular Future Shock, helped turn the topic of the future into a mainstream subject. During this era, futurists developed a number of methodologies, including “serious games,” scenario planning, model and simulation creation, and visioning. However, futurists were (and are) often criticized for faulty forecasting or for side-stepping forecasting altogether in favor of future-creation or future anticipation.
Predictive Analytics and Data-based Forecasting
Primary Era: Late 20th Century to Early 21st Century
The origin of predictive analytics dates back at least as far as modern statistics itself, but it wasn’t until the 1950s that various organizations began using computer-based modeling to anticipate everything from weather patterns to credit risks. In the 1970s, the famous Black-Scholes model was developed to estimate the fair prices of stock options, and in the 1990s analytics became widely used for everything from web searches to baseball line-ups. The closely related field of machine learning began flourishing in the 1990s. In recent years, predictive models have become common in virtually every data-based discipline, from biology to marketing to criminal justice.
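For a flavor of that kind of modeling, here is a compact sketch of the Black-Scholes formula for pricing a European call option, using only the Python standard library. The input values are illustrative, not real market data:

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(s, k, t, r, sigma):
    """Black-Scholes price of a European call option.
    s: spot price, k: strike price, t: years to expiry,
    r: risk-free interest rate, sigma: annualized volatility."""
    d1 = (log(s / k) + (r + sigma ** 2 / 2) * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    return s * norm_cdf(d1) - k * exp(-r * t) * norm_cdf(d2)

# At-the-money option: spot 100, strike 100, 1 year out,
# 5% risk-free rate, 20% volatility -> roughly 10.45
print(round(black_scholes_call(100, 100, 1.0, 0.05, 0.2), 2))
```

The model’s point is the predictive one made above: given assumptions about how prices change over time, it deduces what an option should be worth today.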
More Structured and Quantified Forecasts
Primary Era: Early 21st Century
Although this method also has deep historical roots, only recently has research conducted by Philip Tetlock and others demonstrated—via forecasting tournaments held by the Intelligence Advanced Research Projects Activity (IARPA)—that people can become empirically better at forecasting. The research suggests good forecasters can even teach others how to forecast more effectively. One key to achieving this is the quantification of forecasting by assigning probabilities.
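A standard way to quantify skill in such tournaments is the Brier score: the mean squared gap between the probabilities a forecaster stated and what actually happened. A minimal sketch, with invented example forecasts:

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts (0.0-1.0)
    and binary outcomes (1 = event happened, 0 = it didn't).
    Lower is better; 0 is perfect, 0.25 is coin-flip hedging."""
    n = len(forecasts)
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / n

# Said 90% on an event that happened, 10% on one that didn't
print(brier_score([0.9, 0.1], [1, 0]))
```

A bold forecaster who turns out to be right scores near 0, someone who always hedges at 50% scores 0.25, and confident wrongness is punished most of all, which is what makes the metric useful for separating genuine skill from vague pronouncements.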
Will there be further improvements to the art and science of forecasting? It seems likely, given all the financial and strategic incentives for doing so. Consider, for example, that in the field of predictive analytics, research firms project the market will reach up to $6.5 billion by 2019. Yet, that’s probably just the tip of a very large iceberg. We’ll likely never know how much the US and other governments are pouring into efforts associated with forecasting and prediction, but we can assume the numbers are very large indeed, given governments’ needs to anticipate threats and formulate effective political and military strategies. If we count the myriad other uses of forecasting and prediction (economic outlooks, stock predictions, political polling, weather forecasting, etc.), the “foresight industry” would be enormous indeed, and almost certainly growing at a fast rate every year.