How "moral hazard" benefits investors

During the 2007-08 financial crisis, policymakers frequently cited "moral hazard" as a reason not to bail out big banks. The thinking was that if these institutions were saved from the consequences of their behavior—and with taxpayer funding, no less—they would continue to take excessive risks. But the decision to let Lehman Brothers fail in 2008 sharply escalated the crisis, and regulators ultimately intervened, just as they had in the rescue of Long-Term Capital Management in the 1990s, and in the savings and loan crisis of the 1980s. 

Exactly who benefits from moral hazard, and how? One obvious answer is the bondholders of financial institutions, who typically, in a rescue, are repaid in full. More surprisingly, though, shareholders also gain from the expectation that some banks are too big to fail, according to research by Bryan Kelly, assistant professor of finance at Chicago Booth. 

Kelly and other economists who have studied the moral-hazard phenomenon presented their findings May 15, as part of a Chicago Booth conference on the 30th anniversary of the rescue of Continental Illinois Bank—the first too-big-to-fail institution—and its lessons for the financial system. The conference was organized by Booth's Stigler Center for the Study of Economy and the State. 

To measure the impact of bailouts, Kelly constructs a model of what asset markets would look like in the absence of government intervention, then compares it with what actually occurred during the bailouts. He uses the prices of options on financial stocks, explaining that options are essentially a form of insurance against a financial-system crash.

During the crisis, the price in the options market of insuring an index of financial-company stocks was surprisingly cheap, "because the government was handing out the same insurance free," in the form of an expected bailout, Kelly said. "There are huge subsidies to equity holders." 
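The insurance analogy can be made concrete. A put option pays off when the index falls below its strike, so its price is the market's quote for downside protection; if investors expect a government backstop, implied volatility, and hence the price of that protection, stays low. A minimal Black-Scholes sketch (all inputs below are hypothetical, not Kelly's data):

```python
from math import log, sqrt, exp
from statistics import NormalDist

def bs_put(S, K, r, sigma, T):
    """Black-Scholes price of a European put: the cost of crash insurance."""
    N = NormalDist().cdf
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return K * exp(-r * T) * N(-d2) - S * N(-d1)

# A deep out-of-the-money put on a hypothetical financial-sector index.
# Lower implied volatility (e.g., from an expected bailout) means cheaper insurance.
cheap = bs_put(S=100, K=70, r=0.02, sigma=0.30, T=1.0)
dear = bs_put(S=100, K=70, r=0.02, sigma=0.50, T=1.0)
print(cheap < dear)  # crash protection costs less when expected volatility is lower
```

The gap between the put price the market charges and the price a no-bailout world would imply is, in spirit, the subsidy Kelly measures.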

From August 2007 to March 2009, the average subsidy to investors in bank stocks was about $50 billion, Kelly finds. The price of the insurance contracts that protected against a crash of the financial system was subsidized about 50%. 

The findings show an unintended consequence of government bailouts. "This has a lot of implications going forward for thinking about how we manage systemic risk in the economy," Kelly said. "When you give a bailout guarantee, it distorts market prices." 

Similarly, Deniz Anginer, assistant professor of finance at the Pamplin College of Business at Virginia Tech, finds in his research that when investors expect that the government will shield them from losses, those expectations change other aspects of credit markets. For one, implicit bailout guarantees reduce the cost of debt for large financial companies. Also, the cost of their debt becomes less sensitive to changes in risk. 

Anginer says the subsidy to corporate debt provided by bailout guarantees could be shifted onto the largest banks by imposing a tax or surcharge. "This would help level the playing field, align risk and return, and promote a more efficient financial system," he said. "It might be better than creating 3,000 pages of new regulations." 

The researchers and other conference participants noted that the 2010 Dodd-Frank financial overhaul law, which prohibits taxpayer bailouts, is reducing the incentive for banks to be very large and is likely to diminish the effects of moral hazard. But they cautioned against complacency. 

"My view is that the too-big-to-fail problem has receded dramatically," said Philip Strahan, professor and John I. Collins, SJ Chair in Finance at the Carroll School of Management at Boston College. "The danger is that the cycle repeats as memories start to fade and markets adapt."

Strahan, who earned a PhD in economics from the University of Chicago, presented research showing that bailouts give firms incentives to borrow more than they should, to become more interconnected with other financial companies, and to rely more heavily on short-term credit.

In a previous post, economists and policymakers offered their ideas for ending the subsidies to too-big-to-fail banks and for making the financial system less prone to systemic crises.

—Amy Merrick

Celebrating a Nobel Prize, the Chicago way

Chicago economists launch an inquiry into the work that earned the 2013 Nobel Prize in Economics

This article originally appeared on the Becker Friedman Institute website.

After the press conferences, the television interviews, the parties and champagne, the University of Chicago celebrates Nobel Prizes Chicago-style: we examine, illuminate, and maybe even debate the prize-winning ideas.

“The Work Behind the Prize” gathered the campus community Nov. 4 to applaud the latest laureates, Eugene Fama and Lars Peter Hansen, recipients of the 2013 Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel.

As President Robert J. Zimmer told the crowd of nearly 450, it is fitting to celebrate by examining their accomplishments and the decades of work that led up to the prize. Four of Fama and Hansen’s close colleagues did exactly that, with institute chair Gary S. Becker moderating.

Unifying Theory, Data, and Statistics
James J. Heckman, the 2000 Nobel laureate, kicked things off: “Lars Hansen is a model economic scientist. His work tackles a fundamental problem:  In a changing world, how can we predict the future, make plans, or devise effective policies?”

Heckman placed Hansen in a line of pioneering Chicago economists and statisticians who have struggled to understand economic dynamics and to incorporate uncertainty into economic models. Among them were Frank Knight, a strong influence on Hansen’s work; several economists with the Cowles Commission, including the recently deceased Lawrence Klein; and the late Leonard Jimmie Savage of the statistics faculty. “Their motto is Lars’s motto: science is measurement,” Heckman said. “His work embodies this vision of econometrics as the unification of theory, data, and statistics.”

Hansen’s key contribution was finding statistical methods to simplify complex economic models. The central implication of Hansen’s Generalized Method of Moments (GMM) is that the discrepancy between predicted and actual values must be uncorrelated with any data observed by the decision-making agent at the investment date.
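That orthogonality condition is enough to pin down a model parameter without specifying the full model. A toy sketch (not Hansen's original application): estimate a coefficient by requiring the model's errors to be uncorrelated with an instrument the agent observed at decision time.

```python
import numpy as np

# Synthetic data: y = theta * x + e, with an instrument z that the agent
# observes, correlated with x but uncorrelated with the error e.
rng = np.random.default_rng(0)
n = 100_000
z = rng.normal(size=n)              # information observed at decision time
x = 2.0 * z + rng.normal(size=n)    # regressor, correlated with z
e = rng.normal(size=n)              # error, orthogonal to z by construction
y = 1.5 * x + e                     # true theta = 1.5

# GMM moment condition: (1/n) * sum( z * (y - theta * x) ) = 0.
# Solving for theta gives the simple instrumental-variables estimator,
# a special case of GMM:
theta_hat = (z @ y) / (z @ x)
print(round(theta_hat, 2))  # close to the true value 1.5
```

The estimator recovers theta using only the requirement that forecast errors be unpredictable from the agent's information, which is the sense in which, as Heaton puts it below, "you can do something without having to do everything."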

“Lars took this … as a way to understand observed phenomena, and applied it with profound results,” Heckman noted.

With coauthor Thomas J. Sargent, Hansen worked out the new econometrics of rational expectations models pioneered in part by Robert Lucas Jr., who won the 1995 Nobel Prize for that work.

Heckman said Hansen relaxed the rational expectations model by recognizing that many people don’t fully understand the world around them when making economic choices, thereby incorporating uncertainty more effectively into economic models.

“He has integrated modern decision theory into what’s called robust controls, risk-sensitive controls,” said Heckman. “This is model science. He applies and adopts the scientific method to learn from data, to understand the world and make it a better place.”

Doing Something Without Doing Everything
John Heaton, PhD’89 (Econ), Deputy Dean for Faculty at the University of Chicago Booth School of Business, is Hansen’s coauthor and former student. He used Hansen’s own words to describe GMM: “You can do something without having to do everything.”

Heaton explained Hansen’s work with a few simple charts. In a series of papers with Kenneth Singleton, Hansen examined the relationship between aggregate consumption and risk in financial market movements.  They showed that when the stock market dips, as in the recent financial crisis, consumption wobbles.

“The relationship of these two things … is a measure of risk. Investors looking at these two data series are asking, ‘Should I be in the market, given that risk?’” said Heaton, the Joseph L. Gidwitz Professor of Finance.

Consumption, asset returns, and covariance are all factors that are easily measurable.  “What’s missing is the risk preference parameter—how risk-averse people are,” Heaton said. “What Hansen and Singleton showed us is that without having to understand the whole dynamics, when given that [data] series, we can identify that parameter, using historical moments.”
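The identification Heaton describes runs through the consumption Euler equation, E[beta * (c_{t+1}/c_t)^(-gamma) * R_{t+1}] = 1: given observed consumption growth and returns, gamma is the value that makes the sample version of this moment hold. A hypothetical sketch on synthetic data (the true gamma is set to 2.0; none of this is Hansen and Singleton's actual data):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
beta = 0.97
gamma_true = 2.0
g = np.exp(rng.normal(0.02, 0.02, size=n))   # gross consumption growth
u = rng.normal(0.0, 0.05, size=n)            # mean-zero pricing noise
R = (g ** gamma_true) / beta * (1.0 + u)     # returns built to satisfy the moment

def euler_moment(gamma):
    """Sample analogue of E[beta * g^(-gamma) * R] - 1."""
    return np.mean(beta * g ** (-gamma) * R) - 1.0

# Choose gamma that drives the sample moment to zero (coarse grid search).
grid = np.linspace(0.0, 5.0, 501)
gamma_hat = grid[np.argmin(np.abs([euler_moment(x) for x in grid]))]
print(round(gamma_hat, 1))  # close to the true value 2.0
```

Note what is *not* needed: no full model of the economy's dynamics, just the historical moments of the two observed series.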

The risk parameter is important in many practical applications, Heaton said. For example, when the Federal Reserve changes interest rates, we expect investors and consumers to respond. “That’s really what’s behind this rational expectations business—it’s investors trying to solve a dynamic estimation problem.”

Heaton said Hansen’s work has important implications for macroeconomists, particularly those at the Fed, who are trying to build complex economic models incorporating many dynamic factors. It helps simplify and isolate the key factors.

Marked by a measure of seriousness and modesty, Hansen’s work is rooted in his deep understanding of the implications that can be drawn from the data, Heaton concluded. “His analysis says that measurements of these linkages and risk are difficult, not just for individuals but for economists as well. Some of the recent things I’ve heard Lars say are just absolutely right and worth listening to, especially with regard to the implications for policymakers overseeing markets. Simple rules and regulation may be the best way to go.”

Simple Efficiency
Continuing the theme of simplicity, John Cochrane explained Fama’s most famous contribution in a nutshell: “In 1970, Gene defined markets to be informationally efficient – that prices at any given moment reflect all information about the future.”

“It’s not a complex theory. Think Darwin, not Einstein,” said Cochrane, the AQR Capital Management Distinguished Service Professor of Finance at Chicago Booth. “It simply says what prices in a competitive market should look like. They should not be predictable.”

But the efficient markets hypothesis has subtle and surprising implications. One is that trading rules, technical systems, and market newsletters—all the methods deployed to beat the market—have essentially no power, beyond luck, to forecast stock prices. “That’s not a theory, not an axiom, not a philosophy, or a religion,” said Cochrane. “It’s an empirical observation that easily could have come out the other way—and sometimes it does.”

Today, 43 years after Fama put forth the efficient markets hypothesis, it remains contentious—mostly because people confuse a few anecdotal examples of beating the market with solid long-term evidence, or they misunderstand what economists mean by efficiency.

“The main prediction of efficient markets is that prices should be unpredictable. But starting in the mid-1970s, Gene started looking at long-run returns,” said Cochrane. “Lo and behold, he found that you can predict prices at long horizons.”

In recent periods, investors should have responded rationally to low stock prices after a sharp market decline, but many didn’t buy because they felt the market was too risky. “Efficiency is still there, but the facts require a huge revision of our world view. The business-cycle variation in risk premiums—not variation in expected cash flows—accounts entirely for the volatility in stock valuation,” Cochrane explained.

There are vastly different theories to explain observed facts in financial markets. “We need models of market equilibrium that tie these price fluctuations to more facts. These facts set the agenda that my generation is working on,” Cochrane concluded.

And Cochrane said there are many more important questions to answer: “Is the finance industry too large or small? Why do we pay fund managers so much? What accounts for the monstrous amount of trading in markets? How prevalent are runs? Are banks regulated correctly?”

“Gene always has the bottom line for it: Look at the facts. Collect the data. Analyze them carefully. Every time we do, the world surprises us. And it will again.” 

Why Asset Prices Matter
The final speaker, Tobias Moskowitz, touched briefly on how Fama’s work improved upon the capital asset pricing model (CAPM). With the Fama-French model, Fama added two factors that produced expected-return estimates aligning much more closely with actual returns.
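In practice the Fama-French model is a regression of a portfolio's excess return on the market, size, and value factors. A sketch on synthetic data (the factor loadings below are made up for illustration, not estimates from real returns):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000
mkt = rng.normal(0.005, 0.04, size=n)   # market excess return
smb = rng.normal(0.002, 0.03, size=n)   # size factor (small minus big)
hml = rng.normal(0.003, 0.03, size=n)   # value factor (high minus low)

# Portfolio constructed with known loadings: 1.1 on market, 0.4 on size,
# -0.2 on value, plus idiosyncratic noise.
excess_ret = 1.1 * mkt + 0.4 * smb - 0.2 * hml + rng.normal(0, 0.01, size=n)

# OLS with an intercept: the intercept is the portfolio's "alpha".
X = np.column_stack([np.ones(n), mkt, smb, hml])
coef, *_ = np.linalg.lstsq(X, excess_ret, rcond=None)
alpha, b_mkt, b_smb, b_hml = coef
print(round(b_mkt, 2), round(b_smb, 2), round(b_hml, 2))
```

The same regression underlies the money-manager evaluation Moskowitz mentions: a manager whose alpha is indistinguishable from zero is only earning the factor premiums an index fund could deliver.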

“With all this work trying to explain asset price movements, why do we care?” asked Moskowitz, the Fama Family Professor of Finance at Chicago Booth.

“We need to know how prices are set because they determine resource allocation. They determine the cost of capital, because different risks face different prices when borrowing,” said Moskowitz. “We can also use these models to evaluate money managers.”

Moskowitz noted that the idea of market efficiency has spawned the whole index fund industry, leading to a huge savings on management fees for investors.  

The efficient markets hypothesis led to event studies—analysis of how asset prices behave around a major announcement, such as a merger, and how quickly new information is incorporated into the price. Academics and finance practitioners alike have found success in the area.

“Asset pricing research helps us understand what risks people care about and how they are priced. That has been a theme in Lars’s work and Gene’s as well,” he concluded.

Chicago Booth Dean Sunil Kumar wrapped up the event by thanking panelists “and the winners for winning the prize.” Kumar said that since he came to UChicago, he has wondered what happens when someone wins a Nobel Prize here. 

“Now I know,” he said. “We launch an inquiry.”
