Can a longer memory improve predictions?

By John Hintze     
December 02, 2014

From: Magazine

Harold Hurst had a problem: reality refused to conform to a model. That’s no small concern when your job is to predict the flow of the Nile River, which runs through 11 countries. Hurst, a British hydrologist, found that existing statistical methods underestimated the ideal height of dams across the Nile, and in the 1950s, he published a mathematical approach to adjust for this discrepancy. Researchers who later sought to explain the anomaly Hurst studied developed the concept of “long memory,” a term for describing systems whose behaviors depend on events that occurred long in the past.

Statisticians create models for a vast range of activities, from building computer networks to sequencing DNA to trading stocks and bonds. Interest in long memory is growing, but many statistical models still rely solely on recent data, says Chicago Booth’s Robert B. Gramacy, who argues that statisticians should revise their models to account for it.

When looking at a time series—a sequence of uniformly measured data points—that seeks to explain the price of a commodity, the annual flow of the Nile, or other phenomena, statisticians should try to determine how much of the variation is due to inherent fluctuations and how much indicates a structural shift, Gramacy says. The danger of using only the present state of a system to predict its future is that the data may at first appear to show a meaningful trend, but that trend can disappear when researchers examine a longer time horizon.

Gramacy wants more researchers to understand how the concept of long memory evolved, and to apply it to their own work. In a paper with Timothy Graves of the University of Cambridge, Nicholas Watkins of the Max Planck Institute for the Physics of Complex Systems in Dresden, and Christian L.E. Franzke of the University of Hamburg, he traces the contributions of mathematician Benoit B. Mandelbrot. Best known for groundbreaking work in fractal geometry, Mandelbrot was the key developer of the concept of long memory, which arose from his research in finance. In studying cotton markets, he concluded in 1963 that the distributions of asset-price changes had tails much “heavier” than those of then-standard econometric models. He also realized that heavy-tailed distributions could take another form, as long waiting times between events. He built on this idea by introducing long-memory processes as a modeling tool in hydrology in the late 1960s. Since then, many researchers have incorporated long memory into modeling the Earth’s climate and other complex systems.

“Long memory behavior has been found in a wide range of fields, from climatology to finance, and from astrophysics to information theory,” Graves says.

Mandelbrot’s insight explains why Hurst had so much trouble predicting the Nile’s behavior. The long-memory effect means that a river will dwell at its highs and lows longer than the models of Hurst’s time predicted, so a dam needs to be built higher to accommodate the resulting stored water. Mandelbrot’s concepts remain relevant to current financial debates, Gramacy and his coauthors note. For example, regulators implementing the Basel III rules, which increase bank capital requirements, need to consider whether their estimates should include heavy-tailed distributions and long memory.
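Hurst’s own diagnostic, rescaled-range (R/S) analysis, can be sketched in a few lines of code. The slope of log(R/S) against log(window length) estimates the Hurst exponent H: roughly 0.5 for a memoryless series, above 0.5 for a long-memory series that dwells at its highs and lows. The sketch below is a minimal illustration of the idea, not Hurst’s exact procedure.

```python
import numpy as np

def rescaled_range(x):
    """R/S statistic: range of cumulative deviations from the mean,
    divided by the standard deviation."""
    y = np.cumsum(x - x.mean())
    return (y.max() - y.min()) / x.std()

def hurst_exponent(x, min_chunk=8):
    """Estimate H as the slope of log(mean R/S) vs. log(window size),
    halving the window size at each step."""
    n = len(x)
    sizes, rs_vals = [], []
    size = n
    while size >= min_chunk:
        chunks = [x[i:i + size] for i in range(0, n - size + 1, size)]
        rs_vals.append(np.mean([rescaled_range(c) for c in chunks if c.std() > 0]))
        sizes.append(size)
        size //= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return slope

rng = np.random.default_rng(0)
white_noise = rng.standard_normal(4096)
print(hurst_exponent(white_noise))  # close to 0.5 for memoryless data
```

A river with long memory would yield a noticeably larger slope, which is exactly the discrepancy Hurst saw in the Nile records.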

“Fat tails in finance were originally discovered in the 1960s, and clustered volatility or long memory in the early 1990s,” says Mark Buchanan, author of Forecast: What Physics, Meteorology, and the Natural Sciences Can Teach Us About Economics. “But this stuff has been mathematically harder to deal with and so has been ignored until recent years.”

Applying long-memory concepts also could provide insight into major stock-market movements, such as the 2007–10 financial crisis, Gramacy says. “Is it just that bubbles are a normal part of the cycle of life in the financial markets, or can they be avoided?” Gramacy asks. If the latter, another challenge awaits: trying to prevent the next disaster.

The 'Noah effect'

Mathematician Benoit Mandelbrot colloquially described his findings on heavy-tailed and long-memory processes as the “Noah effect” and “Joseph effect,” respectively.

The Noah effect, which refers to the Biblical flood, describes sudden, discontinuous change. Mandelbrot found that the “tails,” or outlying values, in the distribution of cotton prices are much fatter—or more likely—than a normal curve would indicate. Studies have shown this effect in US stock indices, major currencies, and interest rates.
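The gap between a normal curve and a heavy-tailed one is easy to see numerically. The sketch below is a hypothetical illustration, using a Student-t distribution with 3 degrees of freedom as a stand-in for heavy-tailed price changes; it counts how often each sample produces an extreme move.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

normal = rng.standard_normal(n)
# Student-t with 3 degrees of freedom: a common stand-in for
# heavy-tailed returns (an assumption for illustration only)
heavy = rng.standard_t(df=3, size=n)

threshold = 5.0
print("normal tail freq:", np.mean(np.abs(normal) > threshold))
print("heavy  tail freq:", np.mean(np.abs(heavy) > threshold))
# The heavy-tailed sample crosses the threshold orders of magnitude
# more often than the normal sample does.
```

Under a normal curve, a move beyond five standard deviations is a roughly one-in-a-million event; in the heavy-tailed sample it is routine, which is the Noah effect in miniature.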

The 'Joseph effect'

The Joseph effect refers to Joseph’s interpretation in the Old Testament of Pharaoh’s dream, which predicted that seven years of plenty would be followed by seven years of famine. It describes the tendency for successive values to remain close to each other, or to be dependent. “If markets are volatile this week, then they will also tend to be volatile next week,” says physicist Mark Buchanan.
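Buchanan’s point, that volatile weeks tend to follow volatile weeks, shows up as persistent autocorrelation in the size of price moves even when the moves themselves are uncorrelated. The sketch below assumes a toy stochastic-volatility model (a hypothetical construction, not a model from the paper) in which log-volatility follows a very persistent AR(1) process.

```python
import numpy as np

def autocorr(x, lag):
    """Sample autocorrelation of x at the given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

rng = np.random.default_rng(2)
n = 50_000

# Toy model: persistent log-volatility makes calm and turbulent
# stretches cluster together, mimicking the Joseph effect.
log_vol = np.zeros(n)
for t in range(1, n):
    log_vol[t] = 0.98 * log_vol[t - 1] + 0.15 * rng.standard_normal()
returns = np.exp(log_vol) * rng.standard_normal(n)

print("returns,   lag 20:", autocorr(returns, 20))          # near zero
print("|returns|, lag 20:", autocorr(np.abs(returns), 20))  # clearly positive
```

The raw returns look unpredictable lag to lag, but their magnitudes remain correlated twenty steps apart: successive values stay close to each other, just as in Pharaoh’s seven fat and seven lean years.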

Timothy Graves, Robert B. Gramacy, Nicholas Watkins, and Christian L.E. Franzke, “A Brief History of Long Memory,” Working paper, November 2014.

Benoit B. Mandelbrot and J.R. Wallis, “Noah, Joseph, and Operational Hydrology,” Water Resources Research, October 1968.