The Atlanta Fed's macroblog provides commentary and analysis on economic topics including monetary policy, macroeconomic developments, inflation, labor economics, and financial issues.
January 31, 2014
A Brief Interview with Sergio Rebelo on the Euro-Area Economy
Last month, we at the Atlanta Fed had the great pleasure of hosting Sergio Rebelo for a couple of days. While he was here, we asked Sergio to share his thoughts on a wide range of current economic topics. Here is a snippet of a Q&A we had with him about the state of the euro-area economy:
Sergio, what would you say was the genesis of the problems the euro area has faced in recent years?
The contours of the euro area’s problems are fairly well known. The advent of the euro gave peripheral countries—Ireland, Spain, Portugal, and Greece—the ability to borrow at rates that were similar to Germany's. This convergence of borrowing costs was encouraged through regulation that allowed banks to treat all euro-area sovereign bonds as risk free.
The capital inflows into the peripheral countries were not, for the most part, directed to the tradable sector. Instead, they financed increases in private consumption, large housing booms in Ireland and Spain, and increases in government spending in Greece and Portugal. The credit-driven economic boom led to a rise in labor costs and a loss of competitiveness in the tradable sector.
Was there a connection between the financial crisis in the United States and the sovereign debt crisis in the euro area?
Simply put, after Lehman Brothers went bankrupt, we had a sudden stop of capital flows into the periphery, similar to that experienced in the past by many Latin American countries. The periphery boom quickly turned into a bust.
What do you see as the role for euro area monetary policy in that context?
It seems clear that more expansionary monetary policy would have been helpful. First, it would have reduced real labor costs in the peripheral countries. In those countries, the presence of high unemployment rates moderates nominal wage increases, so higher inflation would have reduced real wages. Second, inflation would have reduced the real value of the debts of governments, banks, households, and firms. There might have been some loss of credibility on the part of the ECB [European Central Bank], resulting in a small inflation premium on euro bonds for some time. But this potential cost would have been worth paying in return for the benefits.
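Rebelo's second point, that inflation erodes the real value of nominal debts, is simple compound arithmetic. A minimal sketch (the debt level and inflation rates below are hypothetical, chosen only for illustration):

```python
# Hypothetical illustration: how steady inflation erodes the real value
# of a fixed nominal debt over a decade.
def real_debt(nominal_debt, inflation_rate, years):
    """Real value of a fixed nominal debt after `years` of constant inflation."""
    return nominal_debt / (1 + inflation_rate) ** years

debt = 100.0  # nominal debt, in billions (made-up number)
for pi in (0.01, 0.02, 0.04):
    print(f"inflation {pi:.0%}: real value after 10 years = {real_debt(debt, pi, 10):.1f}")
```

At 2 percent inflation, a fixed nominal debt loses roughly a fifth of its real value over ten years, which is the channel Rebelo has in mind for governments, banks, households, and firms alike.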
And did this happen?
In my view, the ECB did not follow a sufficiently expansionary monetary policy. In fact, the euro-area inflation rate has been consistently below 2 percent, and the euro is relatively strong when compared to a purchasing-power-parity benchmark. Instead, the euro area turned to contractionary fiscal policy as a panacea. There are good theoretical reasons to believe that—when the interest rate remains constant, so that the central bank does not cushion the fall in government spending—the multiplier effect of government spending cuts can be very large. See, for example, Gauti Eggertsson and Michael Woodford, “The Zero Bound on Interest Rates and Optimal Monetary Policy,” and Lawrence Christiano, Martin Eichenbaum, and Sergio Rebelo, “When Is the Government Spending Multiplier Large?”
Theory aside, the results of the austerity policies implemented in the euro area are clear. All of the countries that underwent this treatment are now much less solvent than in the beginning of the adjustment programs managed by the European Commission, the International Monetary Fund, and the ECB.
Bank stress testing has become a cornerstone of macroprudential financial oversight. Do you think the stress tests helped stabilize the situation in the euro area during the height of the crisis in 2010 and 2011?
No. Quite the opposite. I think the euro-area problems were compounded by the weak stress tests conducted by the European Banking Authority in 2011. Almost no banks failed, and almost no capital was raised. Banks largely increased their capital-to-asset ratios by reducing assets, which resulted in a credit crunch that added to the woes of the peripheral countries.
But we’re past the worst now, right? Is the outlook for the euro-area economy improving?
After hitting the bottom, a very modest recovery is under way in Europe. But the risk that a Japanese-style malaise will afflict Europe is very real. One useful step on the horizon is the creation of a banking union. This measure could potentially alleviate the severe credit crunch afflicting the periphery countries.
Thanks, Sergio, for this pretty sobering assessment.
By John Robertson, a vice president and senior economist in the Atlanta Fed’s research department
Editor’s note: Sergio Rebelo is the Tokai Bank Distinguished Professor of International Finance at Northwestern University’s Kellogg School of Management. He is a fellow of the Econometric Society, the National Bureau of Economic Research, and the Centre for Economic Policy Research.
September 26, 2013
The New Normal? Slower R&D Spending
In case you need more to worry about, try this: the pace of research and development (R&D) spending has slowed. The National Science Foundation defines R&D as “creative work undertaken on a systematic basis in order to increase the stock of knowledge” and the use of this knowledge to devise new applications. (The Bureau of Economic Analysis (BEA) used to treat R&D as an intermediate input in current production, but the latest benchmark revision of the national accounts records R&D spending as business investment. See here for an interesting implication of this change.)
The following chart shows the BEA data on total real private R&D investment spending (purchased or performed on own-account) over the last 50 years, on a year-over-year percent change basis. (For a snapshot of R&D spending across states in 2007, see here.)
Notice the unusually slow pace of R&D spending growth in recent years. The 50-year average growth rate is 4.6 percent; the average over the last five years is 1.1 percent. This slower pace of spending has potentially important implications for overall productivity growth, which has also been below historic norms in recent years.
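For readers who want to reproduce the basic calculation behind the chart, the year-over-year growth rates are computed along these lines (the spending series below is made up for illustration, not the actual BEA data):

```python
# Year-over-year percent change from a list of annual levels, the same
# transformation applied to the BEA's real R&D investment series.
def yoy_percent_change(levels):
    """Percent change of each year's level relative to the prior year."""
    return [100.0 * (curr / prev - 1.0) for prev, curr in zip(levels, levels[1:])]

rnd = [200.0, 210.0, 218.4, 220.6]  # hypothetical real R&D spending levels
growth = yoy_percent_change(rnd)
average_growth = sum(growth) / len(growth)
```

The averages quoted in the text are simply this kind of mean taken over 50-year and 5-year windows of the growth-rate series.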
R&D spending is often cited as an important source of productivity growth within a firm, especially in terms of product innovation. But R&D is also an inherently risky endeavor, since the outcome is quite uncertain. So to the extent that economic and policy uncertainty has helped make businesses more cautious in recent years, a slow pace of R&D spending is not surprising. On top of that, the federal funding of R&D activity remains under significant budget pressure. See, for example, here.
So you can add R&D spending to the list of things that seem to be moving more slowly than normal. Or should we think of it as normal?
By John Robertson, vice president and senior economist in the Atlanta Fed’s research department
April 22, 2013
Too Big to Fail: Not Easily Resolved
As Fed Chairman Ben Bernanke has indicated, too-big-to-fail (TBTF) remains a major issue that is not solved, but “there’s a lot of work in train.” In particular, he pointed to efforts to institute Basel III capital standards and the orderly liquidation authority in Dodd-Frank. The capital standards seek to lower the probability of insolvency in times of financial stress, while the liquidation authority attempts to create a credible mechanism to wind down large institutions if necessary. The Atlanta Fed’s flagship Financial Markets Conference (FMC) recently addressed various issues related to both of these regulatory efforts.
The Basel capital standards are a series of international agreements on capital requirements reached by the Basel Committee on Banking Supervision. These requirements are referred to as “risk-weighted” because they tie the required amount of bank capital to an estimate of the overall riskiness of each bank’s portfolio. Put simply, riskier banks need to hold more capital under this system.
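In stylized form, the risk-weighting idea can be sketched with a few lines of code. The weights below echo the broad asset buckets of the original Basel I accord (0 percent for cash, 50 percent for mortgages, 100 percent for corporate loans, with an 8 percent minimum ratio); they are illustrative, not the far more granular Basel III weights discussed below:

```python
# Stylized risk-weighted capital requirement: required capital is a fixed
# fraction of risk-weighted assets, so riskier portfolios need more capital.
RISK_WEIGHTS = {"cash": 0.0, "mortgage": 0.5, "corporate_loan": 1.0}  # illustrative

def required_capital(portfolio, min_ratio=0.08):
    """min_ratio times the sum of each exposure scaled by its risk weight."""
    rwa = sum(amount * RISK_WEIGHTS[asset] for asset, amount in portfolio.items())
    return min_ratio * rwa

safe_bank = {"cash": 50.0, "mortgage": 50.0}       # mostly low-risk assets
risky_bank = {"corporate_loan": 100.0}             # all high-risk assets
```

Both hypothetical banks hold 100 in assets, but the riskier portfolio carries four times the capital requirement, which is the whole point of the risk-weighted approach.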
The first iteration of the Basel requirements, known as Basel I, required only 30 pages of regulation. But over time, banks adjusted their portfolios in response to the relatively simple risk measures in Basel I, and these measures became insufficient to characterize bank risk. The Basel Committee then shifted to a more complex system called Basel II, which allows the most sophisticated banks to estimate their own internal risk models subject to supervisory approval and use these models to calculate their required capital. After the financial crisis, supervisors concluded that Basel II did not require enough capital for certain types of transactions and agreed that a revised version called Basel III should be implemented.
At the FMC, Andrew Haldane from the Bank of England gave a fascinating recap of the Basel capital standards as a part of a broader discussion on the merits of complex regulation. His calculations show that the Basel accords have become vastly more complex, with the number of risk weights applied to bank positions increasing from only five in the Basel I standards to more than 200,000 in the current Basel III standards.
Haldane argued that this increase in complexity and reliance on banks’ internal risk models has unfortunately not resulted in a fair or credible system of capital regulation. He pointed to supervisory studies revealing wide disparities across banks in their estimated capital requirements for a hypothetical common portfolio. Further, Haldane pointed to a survey of investors by Barclays Capital in 2012 showing, not surprisingly, that investors do not put a great deal of trust in the Basel weightings.
So is the problem merely that the Basel accords have taken the wrong technical approach to risk measurement? The conclusion of an FMC panel on risk measurement is: not necessarily. The real problem is that estimating a bank’s losses in unlikely but not implausible circumstances is at least as much an art as it is a science.
Til Schuermann of Oliver Wyman gave several answers to the question “Why is risk management so hard?” including the fact that we (fortunately) don’t observe enough bad events to be able to make good estimates of how big the losses could become. As a result, he said, much of what we think we know from observations in good times is wrong when big problems hit: we estimate the wrong model parameters, use the wrong statistical distributions, and don’t take account of deteriorating relationships and negative feedback loops.
David Rowe of David M. Rowe Risk Advisory gave an example of why crisis times are different. He argued that the large financial firms can absorb some of the volatility in asset prices and trading volumes in normal times, making the financial system appear more stable. However, during crises, the large movements in asset prices can swamp even these large players. Without their shock absorption, all of the volatility passes through to the rest of the financial system.
The problems with risk measurement and management, however, go beyond the technical and statistical problems. The continued existence of TBTF means that the people and institutions that are best placed to measure risk—banks and their investors—have far less incentive to get it right than they should. Indeed, with TBTF, risk-based capital requirements can be little more than costly constraints to be avoided to the maximum extent possible, such as by “optimizing” model estimates and portfolios to reduce measured risk under Basel II and III. However, if a credible resolution mechanism existed and failure was a realistic threat, then following the intent of bank regulations would become more consistent with the banks’ self-interest, less costly, and sometimes even nonbinding.
Progress on creating such a mechanism under Dodd-Frank has been steady, if slow. Arthur Murton of the Federal Deposit Insurance Corporation (FDIC) presented, as a part of a TBTF panel, a comprehensive update on the FDIC’s planning process for making the agency’s new Orderly Liquidation Authority functional. The FDIC’s plan for resolving systemically important nonbank financial firms (including the parent holding companies of large banks) is to write off the parent company’s equity holders and then use its senior and subordinated debt to absorb any remaining losses and recapitalize the parent. The solvent operating subsidiaries of the failed firm would continue in normal operation.
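The loss-absorption sequence the FDIC describes, equity first and then subordinated and senior debt, amounts to a simple waterfall. A minimal sketch with purely illustrative balance-sheet numbers:

```python
# Hypothetical resolution waterfall: losses hit each claim layer in order
# of seniority (equity first), and senior layers absorb only what remains.
def absorb_losses(loss, layers):
    """Apply a loss to ordered (name, amount) layers; return remaining amounts."""
    remaining = []
    for name, amount in layers:
        hit = min(loss, amount)
        loss -= hit
        remaining.append((name, amount - hit))
    return remaining

# Made-up capital structure of a failed parent holding company.
layers = [("equity", 10.0), ("subordinated debt", 15.0), ("senior debt", 40.0)]
after = absorb_losses(20.0, layers)
# Equity is wiped out, subordinated debt is partially written down,
# and senior debt is untouched; the written-down debt can then be
# converted to equity to recapitalize the parent.
```

The predictability of this ordering is exactly what Taylor and Scott argue creditors need, whether the mechanism is FDIC-administered resolution or judicial bankruptcy.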
Importantly, though, the FDIC may exercise its new power only if both the Treasury and Federal Reserve agree that putting a firm that is in default or in danger of default into judicial bankruptcy would have seriously adverse effects on U.S. financial stability. And this raises a key question: why isn’t bankruptcy a reasonable option for these firms?
Keynote speaker John Taylor and TBTF session panelist Kenneth Scott—both Stanford professors—argued that in fact bankruptcy is a reasonable option, or could be, with some changes. They maintain that creditors could better predict the outcome of judicial bankruptcy than FDIC-administered resolution. And predictability of outcomes is key for any mechanism that seeks to resolve financial firms with as little damage as possible to the broader financial system.
Unfortunately, some of the discussion during the TBTF panel also made it apparent that Chairman Bernanke is right: TBTF has not been solved. The TBTF panel discussed several major unresolved obstacles, including the complications of resolving globally active financial firms with substantial operations outside the United States (and hence outside both the FDIC and the U.S. bankruptcy court’s control) and the problem of dealing with many failing systemically important financial institutions at the same time, as is likely to occur in a crisis period. (A further commentary on these two obstacles is available in an earlier edition of the Atlanta Fed’s Notes from the Vault.)
Thus, the Atlanta Fed’s recent FMC highlighted both the importance of ending TBTF and the difficulty of doing so. The Federal Reserve continues to work with the FDIC to address the remaining problems. But until TBTF is a “solved” problem, what to do about these financial firms should and will remain a front-burner issue in policy circles.
By Paula Tkac, vice president and senior economist, and
Larry Wall, director of the Center for Financial Innovation and Stability, both in the Atlanta Fed’s research department