
macroblog

January 31, 2014

A Brief Interview with Sergio Rebelo on the Euro-Area Economy

Last month, we at the Atlanta Fed had the great pleasure of hosting Sergio Rebelo for a couple of days. While he was here, we asked Sergio to share his thoughts on a wide range of current economic topics. Here is a snippet of a Q&A we had with him about the state of the euro-area economy:

Sergio, what would you say was the genesis of the problems the euro area has faced in recent years?

The contours of the euro area’s problems are fairly well known. The advent of the euro gave peripheral countries—Ireland, Spain, Portugal, and Greece—the ability to borrow at rates that were similar to Germany's. This convergence of borrowing costs was encouraged through regulation that allowed banks to treat all euro-area sovereign bonds as risk free.

The capital inflows into the peripheral countries were not, for the most part, directed to the tradable sector. Instead, they financed increases in private consumption, large housing booms in Ireland and Spain, and increases in government spending in Greece and Portugal. The credit-driven economic boom led to a rise in labor costs and a loss of competitiveness in the tradable sector.

Was there a connection between the financial crisis in the United States and the sovereign debt crisis in the euro area?

Simply put, after Lehman Brothers went bankrupt, we had a sudden stop of capital flows into the periphery, similar to that experienced in the past by many Latin American countries. The periphery boom quickly turned into a bust.

What do you see as the role for euro area monetary policy in that context?

It seems clear that more expansionary monetary policy would have been helpful. First, it would have reduced real labor costs in the peripheral countries. In those countries, the presence of high unemployment rates moderates nominal wage increases, so higher inflation would have reduced real wages. Second, inflation would have reduced the real value of the debts of governments, banks, households, and firms. There might have been some loss of credibility on the part of the ECB [European Central Bank], resulting in a small inflation premium on euro bonds for some time. But this potential cost would have been worth paying in return for the benefits.
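
Editor’s note: the real-wage channel Sergio describes can be written in one line. The notation here is ours, added for illustration, not his. With the nominal wage $W$ and the price level $P$, the real wage is

$$
w = \frac{W}{P} \quad\Longrightarrow\quad \hat{w} \approx \hat{W} - \pi
$$

so if high unemployment holds nominal wage growth $\hat{W}$ down, a higher inflation rate $\pi$ translates directly into a lower real wage $w$.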

And did this happen?

In my view, the ECB did not follow a sufficiently expansionary monetary policy. In fact, the euro-area inflation rate has been consistently below 2 percent, and the euro is relatively strong when compared to a purchasing-power-parity benchmark. The euro area turned to contractionary fiscal policy as a panacea. There are good theoretical reasons to believe that when the interest rate remains constant, so that the central bank does not cushion the fall in government spending, the multiplier effect of government spending cuts can be very large. See, for example, Gauti Eggertsson and Michael Woodford, “The Zero Bound on Interest Rates and Optimal Monetary Policy,” and Lawrence Christiano, Martin Eichenbaum, and Sergio Rebelo, “When Is the Government Spending Multiplier Large?”

Theory aside, the results of the austerity policies implemented in the euro area are clear. All of the countries that underwent this treatment are now much less solvent than they were at the beginning of the adjustment programs managed by the European Commission, the International Monetary Fund, and the ECB.
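
Editor’s note: a stylized back-of-the-envelope calculation helps fix ideas here. The numbers below are invented for illustration and do not come from the interview; the point is simply that when the multiplier is large, cutting spending can raise the debt-to-GDP ratio.

```python
# Stylized debt arithmetic: a spending cut lowers debt one-for-one but
# lowers GDP by (multiplier x cut), so the debt/GDP ratio can rise.
# All figures are illustrative.

def debt_ratio_after_cut(debt, gdp, cut, multiplier):
    """Debt-to-GDP ratio after a spending cut of `cut` (in units of GDP)."""
    new_debt = debt - cut               # deficit falls one-for-one with the cut
    new_gdp = gdp - multiplier * cut    # output falls by multiplier * cut
    return new_debt / new_gdp

for m in (0.5, 1.5, 3.0):
    ratio = debt_ratio_after_cut(debt=1.00, gdp=1.00, cut=0.03, multiplier=m)
    print(f"multiplier {m}: debt/GDP goes from 100.0% to {ratio:.1%}")
```

With a multiplier of 0.5 the ratio falls to about 98.5 percent, but with a multiplier of 3 it rises to about 106.6 percent: austerity that leaves the country less solvent.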

Bank stress tests have become a cornerstone of macroprudential financial oversight. Do you think they helped stabilize the situation in the euro area during the height of the crisis in 2010 and 2011?

No. Quite the opposite. I think the euro-area problems were compounded by the weak stress tests conducted by the European Banking Authority in 2011. Almost no banks failed, and almost no capital was raised. Banks largely increased their capital-to-asset ratios by reducing assets, which resulted in a credit crunch that added to the woes of the peripheral countries.
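
Editor’s note: the arithmetic behind that deleveraging channel is simple. A minimal sketch, with purely illustrative numbers of our own:

```python
# Two ways to raise a capital-to-asset ratio from 6% to 9%: issue new
# equity, or shrink the balance sheet. The second route is the credit
# crunch described above. All numbers are illustrative.

capital, assets = 6.0, 100.0   # starting ratio: 6%
target = 0.09                  # target ratio: 9%

# Option 1: raise new equity, assets unchanged
new_capital = target * assets
print(f"raise equity: +{new_capital - capital:.1f} in new capital")

# Option 2: shed assets, capital unchanged
new_assets = capital / target
print(f"shrink balance sheet: -{assets - new_assets:.1f} in assets (lending)")
```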

But we’re past the worst now, right? Is the outlook for the euro-area economy improving?

Now that the euro area has hit bottom, a very modest recovery is under way in Europe. But the risk that a Japanese-style malaise will afflict Europe is very real. One useful step on the horizon is the creation of a banking union. This measure could potentially alleviate the severe credit crunch in the peripheral countries.

Thanks, Sergio, for this pretty sobering assessment.

Photo of John Robertson By John Robertson, a vice president and senior economist in the Atlanta Fed’s research department

Editor’s note: Sergio Rebelo is the Tokai Bank Distinguished Professor of International Finance at Northwestern University’s Kellogg School of Management. He is a fellow of the Econometric Society, the National Bureau of Economic Research, and the Centre for Economic Policy Research.


January 31, 2014 in Banking, Capital and Investment, Economics, Europe, Interest Rates, Monetary Policy | Permalink


September 26, 2013

The New Normal? Slower R&D Spending

In case you need more to worry about, try this: the pace of research and development (R&D) spending has slowed. The National Science Foundation defines R&D as “creative work undertaken on a systematic basis in order to increase the stock of knowledge” and the use of that knowledge to devise new applications. (The Bureau of Economic Analysis (BEA) used to treat R&D as an intermediate input in current production. But the latest benchmark revision of the national accounts recorded R&D spending as business investment expenditure. See here for an interesting implication of this change.)

The following chart shows the BEA data on total real private R&D investment spending (purchased or performed on own-account) over the last 50 years, on a year-over-year percent change basis. (For a snapshot of R&D spending across states in 2007, see here.)

Real Spending on Research and Development


Notice the unusually slow pace of R&D spending in recent years. The 50-year average is 4.6 percent. The average over the last 5 years is 1.1 percent. This slower pace of spending has potentially important implications for overall productivity growth, which has also been below historic norms in recent years.
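
For readers who want to reproduce this kind of calculation, here is a minimal sketch of the year-over-year computation. The series below is hypothetical; the actual BEA data are not reproduced here.

```python
import pandas as pd

# Hypothetical annual levels of real private R&D investment (not BEA data)
rnd = pd.Series([280.0, 295.0, 301.0, 303.0, 307.0],
                index=range(2008, 2013))

yoy = rnd.pct_change() * 100   # year-over-year percent change
print(yoy.round(1))
print(f"average growth over the window: {yoy.mean():.1f}%")
```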

R&D spending is often cited as an important source of productivity growth within a firm, especially in terms of product innovation. But R&D is also an inherently risky endeavor, since the outcome is quite uncertain. So to the extent that economic and policy uncertainty has made businesses more cautious in recent years, a slow pace of R&D spending is not surprising. On top of that, federal funding of R&D activity remains under significant budget pressure. See, for example, here.

So you can add R&D spending to the list of things that seem to be moving more slowly than normal. Or should we think of it as normal?

Photo of John Robertson By John Robertson, vice president and senior economist in the Atlanta Fed’s research department


September 26, 2013 in Business Cycles, Capital and Investment, Productivity | Permalink


Comments

As someone who has spent many years in corporate R&D, I think I would advise some caution in interpreting these numbers. My experience is that an enormous amount of corporate R&D spending is simply wasted, essentially through poor management (alternatively just the fact that managing R&D from ideation through to product creation and monetization is really a very difficult task).

So it's possible that a gradual fall in overall R&D expenditure, especially relative to its natural variability, could actually reflect a healthy re-balancing of corporate spending, either through improved research productivity or through a shift towards more product-oriented expenditures. Without a lot more analysis it's difficult to really assess what's going on here.

Posted by: Mark Thomson (@markmthomson) | September 26, 2013 at 05:43 PM

Do these data include expenditures at universities? Maybe it's a low share. But as a public good every state has an incentive to let someone else provide elite higher education (as they're the most mobile geographically) and R&D.

Posted by: mike smitka | October 03, 2013 at 02:03 PM


April 22, 2013

Too Big to Fail: Not Easily Resolved

As Fed Chairman Ben Bernanke has indicated, too-big-to-fail (TBTF) remains a major issue that is not solved, but “there’s a lot of work in train.” In particular, he pointed to efforts to institute Basel III capital standards and the orderly liquidation authority in Dodd-Frank. The capital standards seek to lower the probability of insolvency in times of financial stress, while the liquidation authority attempts to create a credible mechanism to wind down large institutions if necessary. The Atlanta Fed’s flagship Financial Markets Conference (FMC) recently addressed various issues related to both of these regulatory efforts.

The Basel capital standards are a series of international agreements on capital requirements reached by the Basel Committee on Banking Supervision. These requirements are referred to as “risk-weighted” because they tie the required amount of bank capital to an estimate of the overall riskiness of each bank’s portfolio. Put simply, riskier banks need to hold more capital under this system.
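
As a rough illustration of the mechanics, required capital under this system is a minimum ratio applied to risk-weighted assets. The weights below are stylized Basel I-style weights; the actual standards are far more detailed.

```python
# Risk-weighted capital in miniature: required capital is a minimum
# ratio times the sum of (exposure x risk weight). Riskier books
# therefore need more capital. Positions are illustrative.

MIN_RATIO = 0.08  # Basel minimum total capital ratio

book = [
    (100.0, 0.0),   # government bonds: 0% weight
    (200.0, 0.5),   # residential mortgages: 50% weight
    (300.0, 1.0),   # corporate loans: 100% weight
]

rwa = sum(exposure * weight for exposure, weight in book)
print(f"risk-weighted assets: {rwa:.0f}")           # 400
print(f"required capital: {MIN_RATIO * rwa:.0f}")   # 32
```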

The first iteration of the Basel requirements, known as Basel I, required only 30 pages of regulation. But over time, banks adjusted their portfolios in response to the relatively simple risk measures in Basel I, and these measures became insufficient to characterize bank risk. The Basel Committee then shifted to a more complex system called Basel II, which allows the most sophisticated banks to estimate their own internal risk models subject to supervisory approval and use these models to calculate their required capital. After the financial crisis, supervisors concluded that Basel II did not require enough capital for certain types of transactions and agreed that a revised version called Basel III should be implemented.

At the FMC, Andrew Haldane from the Bank of England gave a fascinating recap of the Basel capital standards as a part of a broader discussion on the merits of complex regulation. His calculations show that the Basel accords have become vastly more complex, with the number of risk weights applied to bank positions increasing from only five in the Basel I standards to more than 200,000 in the current Basel III standards.

Haldane argued that this increase in complexity and reliance on banks’ internal risk models has unfortunately not resulted in a fair or credible system of capital regulation. He pointed to supervisory studies revealing wide disparities across banks in their estimated capital requirements for a hypothetical common portfolio. Further, Haldane pointed to a survey of investors by Barclays Capital in 2012 showing, not surprisingly, that investors do not put a great deal of trust in the Basel weightings.

So is the problem merely that the Basel accords have taken the wrong technical approach to risk measurement? The conclusion of an FMC panel on risk measurement is: not necessarily. The real problem is that estimating a bank’s losses in unlikely but not implausible circumstances is at least as much an art as it is a science.

Til Schuermann of Oliver Wyman gave several answers to the question “Why is risk management so hard?” including the fact that we (fortunately) don’t observe enough bad events to be able to make good estimates of how big the losses could become. As a result, he said, much of what we think we know from observations in good times is wrong when big problems hit: we estimate the wrong model parameters, use the wrong statistical distributions, and don’t take account of deteriorating relationships and negative feedback loops.

David Rowe of David M. Rowe Risk Advisory gave an example of why crisis times are different. He argued that the large financial firms can absorb some of the volatility in asset prices and trading volumes in normal times, making the financial system appear more stable. However, during crises, the large movements in asset prices can swamp even these large players. Without their shock absorption, all of the volatility passes through to the rest of the financial system.

The problems with risk measurement and management, however, go beyond the technical and statistical problems. The continued existence of TBTF means that the people and institutions that are best placed to measure risk—banks and their investors—have far less incentive to get it right than they should. Indeed, with TBTF, risk-based capital requirements can be little more than costly constraints to be avoided to the maximum extent possible, such as by “optimizing” model estimates and portfolios to reduce measured risk under Basel II and III. However, if a credible resolution mechanism existed and failure was a realistic threat, then following the intent of bank regulations would become more consistent with the banks’ self-interest, less costly, and sometimes even nonbinding.

Progress on creating such a mechanism under Dodd-Frank has been steady, if slow. Arthur Murton of the Federal Deposit Insurance Corporation (FDIC) presented, as a part of a TBTF panel, a comprehensive update on the FDIC’s planning process for making the agency’s new Orderly Liquidation Authority functional. The FDIC’s plan for resolving systemically important nonbank financial firms (including the parent holding companies of large banks) is to write off the parent company’s equity holders and then use its senior and subordinated debt to absorb any remaining losses and recapitalize the parent. The solvent operating subsidiaries of the failed firm would continue in normal operation.
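
A schematic sketch of that loss waterfall may help fix ideas. The balance sheet below is invented, and the actual statutory process is far more involved.

```python
# Losses hit the parent's capital structure from the bottom up: equity
# first, then subordinated debt, then senior debt. Surviving debt claims
# would be converted to new equity to recapitalize the parent, while the
# operating subsidiaries keep running. All figures are illustrative.

def resolve(loss, equity, sub_debt, senior_debt):
    """Apply a loss down the parent's capital structure, junior first."""
    for name, layer in (("equity", equity),
                        ("subordinated debt", sub_debt),
                        ("senior debt", senior_debt)):
        absorbed = min(loss, layer)
        loss -= absorbed
        print(f"{name}: absorbs {absorbed:.0f}, {layer - absorbed:.0f} survives")
    return loss  # any residual loss would fall on other creditors

resolve(loss=40, equity=25, sub_debt=30, senior_debt=60)
```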

Importantly, though, the FDIC may exercise its new power only if both the Treasury and Federal Reserve agree that putting a firm that is in default or in danger of default into judicial bankruptcy would have seriously adverse effects on U.S. financial stability. And this raises a key question: why isn’t bankruptcy a reasonable option for these firms?

Keynote speaker John Taylor and TBTF session panelist Kenneth Scott—both Stanford professors—argued that in fact bankruptcy is a reasonable option, or could be, with some changes. They maintain that creditors could better predict the outcome of judicial bankruptcy than FDIC-administered resolution. And predictability of outcomes is key for any mechanism that seeks to resolve financial firms with as little damage as possible to the broader financial system.

Unfortunately, some of the discussion during the TBTF panel also made it apparent that Chairman Bernanke is right: TBTF has not been solved. The TBTF panel discussed several major unresolved obstacles, including the complications of resolving globally active financial firms with substantial operations outside the United States (and hence outside both the FDIC and the U.S. bankruptcy court’s control) and the problem of dealing with many failing systemically important financial institutions at the same time, as is likely to occur in a crisis period. (A further commentary on these two obstacles is available in an earlier edition of the Atlanta Fed’s Notes from the Vault.)

Thus, the Atlanta Fed’s recent FMC highlighted both the importance of ending TBTF and the difficulty of doing so. The Federal Reserve continues to work with the FDIC to address the remaining problems. But until TBTF is a “solved” problem, what to do about these financial firms should and will remain a front-burner issue in policy circles.

Photo of Paula Tkac By Paula Tkac, vice president and senior economist, and

Photo of Larry Wall Larry Wall, director of the Center for Financial Innovation and Stability, both in the Atlanta Fed’s research department

April 22, 2013 in Capital and Investment, Capital Markets, Financial System | Permalink


Comments

Yes, solving the Too Big to Fail problem is really, really difficult if the people making policy don't want to solve it, as they evidently don't. Clearly, the political influence of such organizations is at work impeding the process. Sadly, the regulators/enablers (both international and domestic) are terrified of imposing the kind of "dumb but simple" workable solution that would drastically impact the business models of these very politically powerful organizations. No, let's keep putting increasingly sophisticated band aids on tumors--that's the spirit, fellows!

Posted by: William Meyer | April 23, 2013 at 09:37 AM

Nice summary of the problems here. In addition to the problems you mention, I worry about a couple of others. One was noted recently by Sarah Gordon in the FT. She writes:


There is a compelling body of evidence suggesting that the people most likely to go into the riskier areas of financial services are precisely those least suited to judging risk. Susan Cain’s recently published book Quiet cites a series of studies that suggest that extroverts tend to be attracted to the high-reward environments of investment banking, deals and trading. And, troublingly, these outgoing people also tend to be less effective at balancing opportunity and risk than some of their more introverted peers. (http://www.ft.com/intl/cms/s/0/a917de18-abef-11e2-9e7f-00144feabdc0.html#axzz2Rxxxb9nc)

A second thing that makes me nervous is the fact, as you point out, that the FDIC can exercise its new powers to resolve a systemically important institution only if the Fed and the Treasury agree to let it do so. Would the Treasury really be able to resist political pressures and make such a decision objectively? Even the Fed, independent though it is, might find it hard to do so.

Posted by: Ed Dolan | April 30, 2013 at 12:56 PM

There is waiting in the Platonic cave a critically important paper that derives Campbell's Law and Goodhart's Law from Goedel's incompleteness theorem. As long as we allow a financial system to exist in which ever more complex debt instruments are allowed to be created, then for every control regime devised by some Basel XXXIV or whatever, there will be a new class of derivatives that escapes those regulations and destabilizes the system.

There are three possible ways to escape this endless escalation of risk-creating games: (1) impose a formal language for creating financial instruments that has been proven to be sub-Turing in its expressive power, (2) implement institution-scale TBTF, preventing any single institution or implicit consortium of institutions from acquiring enough significance to endanger the system as a whole, or (3) implement sovereign TBTF, preventing the failure of any nation-state from endangering the financial systems of others. It's not too late to start considering mechanisms to allow the Eurozone to spin off disruptive members in the same way that a lizard sheds its tail when in mortal danger. Cyprus is almost there already.

Unfortunately the US cannot spin off insolvent states, much as some of them might wish to leave. If only we could let North Carolina go, and take the Bank of America with it!

Good luck!

Posted by: George McKee | April 30, 2013 at 10:04 PM
