Ed Dolan's Econ Blog

Simplicity vs. Complexity, Goodhart’s Law, and the Financial Regulator’s Dilemma

One of the most interesting papers to come out of the Jackson Hole conference this year, both in title and content, was “The Dog and the Frisbee.” The paper, presented by Andrew G. Haldane, Executive Director for Financial Stability at the Bank of England and co-authored by Vasileios Madouros, an economist at the same institution, argues that simple rules are best in financial regulation.

The Case for Simple Rules

The title refers to the simple rules that a dog supposedly uses to catch a Frisbee: Keep your eye on the Frisbee and run just fast enough to keep it in view at a constant angle. The regulatory analogs of such simple rules are things like limits on unweighted leverage ratios, as opposed to more complex limits on multiple risk-weighted tiers of regulatory capital.
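The contrast between the two kinds of rule can be made concrete with a toy balance sheet. In the sketch below, all amounts are invented and the risk weights are only loosely patterned on Basel I's buckets; it is meant to show the structure of the two calculations, not any actual regulatory requirement.

```python
# Hypothetical bank balance sheet: {asset class: (amount, risk weight)}.
# Amounts and weights are illustrative only.
assets = {
    "oecd_sovereign_bonds": (400, 0.0),
    "interbank_loans":      (200, 0.2),
    "mortgages":            (300, 0.5),
    "corporate_loans":      (100, 1.0),
}
capital = 40

total_assets = sum(amount for amount, _ in assets.values())
risk_weighted_assets = sum(amount * weight for amount, weight in assets.values())

# Simple rule: capital against total assets, no weighting.
leverage_ratio = capital / total_assets            # 40 / 1000 = 4.0%

# Complex rule: capital against risk-weighted assets.
rw_capital_ratio = capital / risk_weighted_assets  # 40 / 290 ≈ 13.8%
```

The simple rule depends only on total size; the complex rule depends entirely on how assets are classified, which is what makes it both more refined in theory and more gameable in practice.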

Haldane laments the fact that the Basel Accords, which set international standards for financial regulation, have grown from 30 pages (Basel I, 1988) to 347 pages (Basel II, 2004), to 616 pages (Basel III, 2010). When translated into national legislation, the Basel rules can total thousands or even tens of thousands of pages. Worst of all, in his view, are regulations that rely on banks’ own internal risk models—models that are so complex that the banks themselves do not adequately understand them and that have proved incapable of adequately managing risks during periods of financial stress.

The paper notes that the Glass-Steagall Act, with its simple prohibitions on mixing commercial banking with investment banking, lasted sixty years, compared with just six for the complex rules of Basel II. It cites statistical evidence that in the period leading up to and during the recent financial crisis, simple indicators of bank soundness were better predictors of failure than complex ones.

Yet, as Haldane recognizes, simple rules, as well as complex ones, can fall victim to Goodhart’s Law. The law, named after Charles Goodhart, a former adviser to the Bank of England, says that statistical regularities tend to break down when they are used as instruments of control.

Tracing the Mechanisms Behind Goodhart’s Law

Goodhart’s law rates only a passing mention in Haldane’s presentation, but it is worth taking a closer look at its underlying mechanisms, as they apply to monetary policy and financial regulation.

The first reason that Goodhart’s law holds is that statistical regularities do not always reflect causal relationships. Consider, for example, the “nickels paradox.”  Suppose the Fed observes a strong past correlation between the number of nickels issued by the U.S. mint and the rate of inflation. Does that mean restricting the issue of nickels would be a sufficient instrument to control inflation? Of course not—not if pennies, dimes, bank balances, and money in all other forms were issued in the same quantities as before. All that would happen is that the previously observed correlation of nickels with inflation would disappear.
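The nickels paradox is easy to reproduce in a small simulation. In the sketch below (all numbers invented), nickel issuance that passively tracks money growth correlates strongly with inflation; once the "instrument" is set independently of the money supply, the regularity vanishes.

```python
import random

random.seed(0)

def corr(xs, ys):
    """Pearson correlation, computed by hand to stay dependency-free."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Broad money growth is what actually drives inflation in this toy economy.
money_growth = [random.uniform(0.00, 0.10) for _ in range(200)]
inflation = [g + random.gauss(0, 0.01) for g in money_growth]

# Regime 1: the mint passively issues nickels in step with money growth.
nickels_passive = money_growth

# Regime 2: the Fed "controls inflation" by restricting nickel issuance,
# while money in all other forms grows as before.
nickels_targeted = [random.uniform(0.00, 0.02) for _ in range(200)]

corr(nickels_passive, inflation)   # strongly positive: the observed regularity
corr(nickels_targeted, inflation)  # near zero: the regularity has broken down
```

The correlation was real, but it reflected a common cause, not a causal channel, so using it as a control instrument destroys it.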

The second mechanism behind Goodhart’s law stems from the way policy rules affect expectations. This is the focus of the version of the law that economists call the Lucas critique. Robert Lucas emphasized that the optimal behavior of rational economic agents depends on their expectations, and that their expectations depend, among other things, on the rules the policymakers follow. For example, market expectations of future inflation rates will differ depending on whether the central bank follows a purely discretionary policy, with deference to political influences, or whether it adheres to inflation targeting.

The Lucas critique, as such, is not a critique of policy rules, but rather, of the use of econometric evidence in formulating such rules. It argues that evidence from a period when one rule is in force may not be relevant to a period governed by a different policy regime. It recognizes that a well-formulated rule, credibly followed, can change expectations in a way that improves macroeconomic performance.

The third mechanism behind Goodhart’s law arises from regulatory arbitrage. That awkward phrase refers to the fact that financial institutions tend to game the regulations to which they are subject in a way that complies with their letter while subverting their intention. The Basel I regulations for risk-weighted capital provide an example. Those regulations divided assets into four “risk buckets”—loans to consumers and nonfinancial corporations, mortgage loans, loans to financial institutions, and sovereign loans of OECD countries. Each category had its own required capital ratio. The idea was to force banks to reduce their risk by increasing their capital whenever they changed their mix of assets for the worse. Instead, banks selectively loaded their balance sheets with the most risky assets within each category, so that overall risk reduction was much less than had been hoped.
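A stylized sketch makes the arbitrage mechanical. In the code below, the bucket weights (100% for corporate loans, 50% for mortgages) follow Basel I, but the loan amounts and default probabilities are invented. Two portfolios with identical bucket composition face an identical capital charge, even when one holds the riskiest credits within each bucket.

```python
# Basel I-style capital: the charge depends only on the bucket an asset
# falls into, never on how risky the asset is *within* its bucket.
RISK_WEIGHT = {"corporate": 1.0, "mortgage": 0.5}
MIN_RATIO = 0.08  # 8% of risk-weighted assets

def required_capital(portfolio):
    """portfolio: list of (bucket, amount, true_default_prob)."""
    rwa = sum(amount * RISK_WEIGHT[bucket] for bucket, amount, _ in portfolio)
    return MIN_RATIO * rwa

def expected_loss(portfolio):
    """True economic risk, which the bucket rule never sees."""
    return sum(amount * pd for _, amount, pd in portfolio)

# Same buckets, very different true risk (default probabilities invented).
safe  = [("corporate", 100, 0.01), ("mortgage", 100, 0.005)]
risky = [("corporate", 100, 0.08), ("mortgage", 100, 0.04)]

required_capital(safe) == required_capital(risky)  # identical charge: 12.0
expected_loss(risky) > expected_loss(safe)         # far more real risk
```

Because the regulation prices the bucket rather than the asset, loading up on the worst credits in each bucket raises risk without raising the capital requirement, which is exactly the arbitrage the paragraph above describes.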

Changes in institutional incentives are a fourth mechanism that lies behind Goodhart’s law. The regulatory use of ratings issued by Nationally Recognized Statistical Rating Organizations (NRSROs) provides an example.

It has long been recognized that there is an inherent conflict of interest in the system under which the NRSROs, Standard & Poor’s, Moody’s, and Fitch, are paid by the issuers of the securities they rate. That creates an incentive to give high ratings in order to get repeat business from the issuers. However, that incentive was supposed to be offset, at least to some extent, by market discipline from the side of investors. If investors became convinced that one of the NRSROs was giving inflated ratings, they would supposedly tend to pay less for securities rated by that organization. That, in turn, would induce issuers to send future business to its more honest competitors.

Market discipline never operated perfectly, and it was weakened further when regulators, under Basel II, began to use NRSRO ratings to determine the amount of capital that banks were required to hold. Banks, which are among the most important buyers of many classes of securities, now had an incentive to hold securities that they knew to be overrated in order to minimize the amount of capital they were required to hold. The situation in which investors stood for honest ratings against the common interest of issuers and NRSROs fell apart. Now all three parties were on the side of ratings inflation. That contributed to the complete institutional breakdown of the ratings process, a major factor contributing to the financial crisis.

The Financial Regulator’s Dilemma

A fuller understanding of Goodhart’s law brings us to the financial regulator’s dilemma: Complex regulations tend to fail for reasons Haldane enumerated in his Jackson Hole presentation, but simple regulations also break down, no matter how soundly they are based on past statistical regularities. The Basel II agreements perfectly illustrate the dilemma. During the long debates leading up to those regulations, some argued in favor of relatively simple risk weighting based on ratings from NRSROs. Others argued in favor of using banks’ own internal risk models, which relied on more complex approaches like value-at-risk analysis. The debate was resolved by issuing two parallel regulatory frameworks, one based on the simpler approach and the other on the more complex one. Both failed spectacularly.

Haldane offers no real solution to the dilemma. Instead, he takes an eclectic approach. He recognizes that there is a need for some degree of complexity, but argues that simple rules, like unweighted leverage ratios, should be used as backstops for more complex rules. He also warns against overreliance on rules of any sort. Instead, he favors a greater use of regulatory discretion and market discipline.

My own view, as I have argued before, is that any form of financial regulation, simple or complex, is likely to fail if it does nothing more than require financial firms to avoid specific risky behaviors. Regardless of the regulations in force, if bankers want to take on greater risks than regulators want them to, they will find a way to do so. Regulators find themselves in the same situation as a parent who tries to get an obese child to take some weight off by saying, “No more chocolate ice cream, no more Big Macs.” The child just switches to mint chip ice cream and pizza.

In the end, there are only two regulatory approaches that can offer real promise of reducing the risk that the financial system poses to the wider economy.

One approach would focus directly on reducing financial managers’ appetite for risk. Short of putting some kind of drug in the Wall Street water system, the best hope of doing that would be through reform of corporate governance. Among the most important would be reforms that increase shareholder influence over compensation systems that give executives, traders, and others excessive incentives to take risks. Such reforms are easier to advocate than to implement. Furthermore, their potential for success depends on the premise that managers take on more risks than shareholders want them to, which may or may not be true. Despite their problems, though, they are worth pursuing.

The other approach is to try to reduce the risks to the economy at large from the failure of individual financial institutions. That would require a head-on attack on the too-big-to-fail doctrine. The Dodd-Frank Act is supposed to give regulators new tools, including the power to break up large financial firms, but the financial giants are resisting their effective implementation with all of their vast political power. Yet, if financial institutions are too big to fail, and if regulations, whether simple or complex, cannot prevent their failure, what else is left to try?

5 Responses to “Simplicity vs. Complexity, Goodhart’s Law, and the Financial Regulator’s Dilemma”

ThomasGrennes, September 6th, 2012 at 9:42 am

Optimum regulation is a difficult subject, but allowing unsuccessful firms to fail would simplify the process. Two comments on Dodd-Frank: (1) It is not simple! (2) More than two years after its passage, the public does not know what it means. Whether something called a Volcker rule will become law and what exactly it means remain to be announced. Nick Bloom and colleagues have quantified the increase in fiscal uncertainty in recent years and have argued that it has decreased business investment and aggregate demand.

Ed Dolan (EdDolan), September 6th, 2012 at 10:40 am

I agree on both points. Dodd-Frank is absurdly complex, as almost all recent legislation from Congress seems to be. (Just look at the health care law for a case of good intentions mired in complexity.)

Also, you are right to point out that TBTF can be attacked in two ways, not just by making the big firms less big, but by improving ways of winding them up so that their failure results in less contagion.

Andris_Strazds, September 6th, 2012 at 11:09 am

Ed, I fully agree with your first point on perverse incentives. On the second one – getting rid of too-big-to-fail institutions would not necessarily solve the problem. What if there are many small banks, but most of them invest in bonds issued by profligate governments or give out lots of low quality mortgage loans locally, thereby creating a real estate bubble? When the governments default (or restructure) or the real estate bubble bursts, each of them separately would be too-small-to-bother-bailing-them-out, but letting them fail en masse would not be an option either. There are also advantages to big institutions – for example, they might be more likely to internationalize and once they have done that, problems in one country might be compensated by strong performance elsewhere. A case in point is apparently Santander. I think requiring large institutions to hold more capital is a step in the right direction. In addition, the risk weights of assets should be reviewed. It might be even fair to say that there is no such thing as a risk-free asset.

areopagetica, September 7th, 2012 at 8:29 am

That's why it took Moses 40 days up on the mountain. He knew he had to boil it down to a few simple rules. Not easy.