The Atlanta Fed's macroblog provides commentary and analysis on economic topics including monetary policy, macroeconomic developments, inflation, labor economics, and financial issues.
September 27, 2016
Back to the '80s, Courtesy of the Wage Growth Tracker
Things have been a wee bit quiet in macroblog land the last few weeks, chiefly because our time has been devoted to two exciting new projects. The first is a refresh of our labor force dynamics website, which will feature a nifty tool for looking at the main reasons behind changes in labor force participation for different age groups. More on that later.
The other project has been adding more history to our Wage Growth Tracker. The tracker's current time series starts in 1997. The chart below shows an extended version of the tracker that starts in 1983.
Recall that the Wage Growth Tracker depicts the median of the distribution of 12-month changes of matched nominal hourly earnings. In the extended time series, you'll notice two gaps, which resulted from the U.S. Census Bureau scrambling the identifiers in its Current Population Survey. For those two periods, you'll have to use your imagination and make some inferences.
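The statistic itself is simple to state in code. Here is a minimal sketch of the median-of-matched-changes calculation; the wage pairs below are invented for illustration, whereas the actual tracker is built from matched Current Population Survey microdata.

```python
# Sketch of the Wage Growth Tracker statistic: the median 12-month
# percent change in hourly wages among matched individuals.
import statistics

def wage_growth_tracker(matched_wages):
    """matched_wages: list of (wage_12_months_ago, wage_now) pairs for
    the same individuals. Returns the median 12-month percent change."""
    changes = [100.0 * (now - before) / before for before, now in matched_wages]
    return statistics.median(changes)

# Hypothetical matched wage observations
sample = [(20.00, 20.80), (15.00, 15.30), (30.00, 31.50),
          (12.50, 12.50), (25.00, 26.25)]
print(round(wage_growth_tracker(sample), 1))  # -> 4.0
```

Because the median is taken over individual wage changes, a worker whose wage is unchanged contributes a zero to the distribution rather than dragging down an average, which is one reason the tracker behaves differently from aggregate hourly earnings growth.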
As we have emphasized previously, the Wage Growth Tracker is not a direct measure of the typical change in overall wage costs because it only looks at (more or less) continuously employed workers. But it should reflect the amount of excess slack in the labor market. This point is illustrated in the following chart, which compares the Wage Growth Tracker with the unemployment gap computed from the Congressional Budget Office's (CBO) estimate of the long-run natural rate of unemployment.
As the chart shows, our measure of nominal wage growth has historically tracked the cyclical movement in the unemployment rate gap estimate fairly well, at least since the mid-1980s. We think this feature is potentially important, because the true unemployment rate gap is very hard to know in real time and hence is subject to potentially large revision. For example, in real time, the unemployment rate was estimated to have fallen below the natural rate in the fourth quarter of 1994, but it is now thought to have not breached the natural rate until the first quarter of 1997—more than two years later. The Wage Growth Tracker is not subject to revision (although it is subject to a small amount of sampling uncertainty) and hence could be useful in evaluating the reliability of the unemployment rate gap estimate in real time.
This also is important from a monetary policy perspective if we are worried about the risk of the economy overheating. For example, President Rosengren of the Boston Fed described why he dissented at the most recent Federal Open Market Committee meeting in favor of a quarter-point increase in the target range for the federal funds rate. His dissent, he said, arose partly from his concern that the economy may overheat and drive unemployment below a level he believes is sustainable.
Currently, the CBO estimate of the unemployment rate gap looks like it is plateauing at close to zero. The modest slowing in the Wage Growth Tracker for the third quarter is consistent with that. But it's only one quarter of data, so we'll closely monitor the Wage Growth Tracker in the coming months to see what it suggests about the actual unemployment rate gap, and we'll share our observations here.
September 08, 2016
Introducing the Atlanta Fed's Taylor Rule Utility
Simplicity isn't always a virtue, but when it comes to complex decision-making processes—for example, a central bank setting a policy rate—having simple benchmarks is often helpful. As students and observers of monetary policy well know, the common currency in the central banking world is the so-called "Taylor rule."
The Taylor rule is an equation introduced by John Taylor in a seminal 1993 paper that prescribes a value for the federal funds rate—the interest rate targeted by the Federal Open Market Committee (FOMC)—based on readings of inflation and the output gap. The output gap measures the percentage point difference between real gross domestic product (GDP) and an estimate of its trend or potential.
Since 1993, academics and policymakers have introduced and used many alternative versions of the rule. The alternative forms of the rule can supply policy prescriptions that differ significantly from Taylor's original rule, as the following chart illustrates.
The green line shows the policy prescription from a rule identical to the one in Taylor's paper, apart from some minor changes in the inflation and output gap measures. The red line uses an alternative and commonly used rule that gives the output gap twice the weight used for the Taylor (1993) rule, derived from a 1999 paper by John Taylor. The red line also replaces the 2 percent value used in Taylor's 1993 paper with an estimate of the natural real interest rate, called r*, from a paper by Thomas Laubach, the Federal Reserve Board's director of monetary affairs, and John Williams, president of the San Francisco Fed. Federal Reserve Chair Janet Yellen also considered this alternative estimate of r* in a 2015 speech.
Both rules use real-time data. As early as 2012, the Taylor (1993) rule prescribed a federal funds rate materially above the FOMC's 0 to 0.25 percent target range, which was in place from December 2008 to December 2015. The alternative rule did not prescribe a positive fed funds rate at any point between the end of the 2007–09 recession and this quarter. The third-quarter prescriptions incorporate nowcasts constructed as described here. Neither the nowcasts nor the Taylor rule prescriptions themselves necessarily reflect the outlook or views of the Federal Reserve Bank of Atlanta or its president.
Additional variables that get plugged into this simple policy rule can influence the rate prescription. To help you sort through the most common variations, we at the Atlanta Fed have created a Taylor Rule Utility. Our Taylor Rule Utility gives you a number of choices for the inflation measure, inflation target, the natural real interest rate, and the resource gap. Besides the Congressional Budget Office–based output gap, alternative resource gap choices include those based on a U-6 labor underutilization gap and the ZPOP ratio. The latter ratio, which Atlanta Fed President Dennis Lockhart mentioned in a November 2015 speech while addressing the Taylor rule, gauges underemployment by measuring the share of the civilian population working their desired number of hours.
Many of the indicator choices use real-time data. The utility also allows you to establish your own weight for the resource gap and whether you want the prescription to put any weight on the previous quarter's federal funds rate. The default choices of the Taylor Rule Utility coincide with the Taylor (1993) rule shown in the above chart. Other organizations have their own versions of the Taylor Rule Utility (one of the nicer ones is available on the Cleveland Fed's Simple Monetary Policy Rules web page). You can find more information about the Cleveland Fed's web page on the Frequently Asked Questions page.
Although the Taylor rule and its alternative versions are only simple benchmarks, they can be useful tools for evaluating the importance of particular indicators. For example, we see that the difference in the prescriptions of the two rules plotted above has narrowed in recent years as slack has diminished. Even if the output gap were completely closed, however, the current prescriptions of the rules would differ by nearly 2 percentage points because of the use of different measures of r*. We hope you find the Taylor Rule Utility a useful tool to provide insight into issues like these. We plan on adding further enhancements to the utility in the near future and welcome any comments or suggestions for improvements.
August 15, 2016
Payroll Employment Growth: Strong Enough?
The U.S. Bureau of Labor Statistics' estimate of nonfarm payroll employment is the most closely watched indicator of overall employment growth in the U.S. economy. By this measure, employment increased by 255,000 in July, well above the three-month average of 190,000. Yet despite this outsized gain, the unemployment rate barely budged. What gives?
Well, for a start, there is no formal connection between the payroll employment data and the unemployment rate data. The employment data used to construct the unemployment rate come from the Current Population Survey (CPS) and the payroll employment data come from a different survey. However, it is possible to relate changes in the unemployment rate to the gap between the CPS and payroll measures of employment, as well as changes in the labor force participation (LFP) rate, and the growth of payroll employment relative to the population.
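One simple way to make this accounting concrete is a sequential attribution: write the unemployment rate as an identity in the three factors, then switch one factor at a time from one month's value to the next and record the change each switch produces. The numbers below are hypothetical, and the actual method used in the post (linked below) differs in detail.

```python
# Rough sketch: attributing a change in the unemployment rate to payroll
# employment growth relative to population, the CPS-minus-payroll
# employment gap, and the labor force participation (LFP) rate.
# All figures (in thousands) are hypothetical.

def unemployment_rate(payroll_emp, cps_gap, lfp, pop):
    """u = 1 - CPS employment / labor force, with CPS employment written
    as payroll employment plus a CPS-minus-payroll gap."""
    labor_force = lfp * pop
    return 100.0 * (1.0 - (payroll_emp + cps_gap) / labor_force)

m0 = dict(payroll_emp=144_000, cps_gap=6_000, lfp=0.628, pop=253_000)
m1 = dict(payroll_emp=144_255, cps_gap=5_900, lfp=0.629, pop=253_180)

total = unemployment_rate(**m1) - unemployment_rate(**m0)

# Switch one input at a time from m0 to m1; the pieces telescope,
# so the contributions sum exactly to the total change.
contrib = {}
state = dict(m0)
for key in ["payroll_emp", "cps_gap", "lfp", "pop"]:
    before = unemployment_rate(**state)
    state[key] = m1[key]
    contrib[key] = unemployment_rate(**state) - before

assert abs(sum(contrib.values()) - total) < 1e-9
```

In this made-up month, a solid payroll gain pushes the rate down, but a rise in the LFP rate pushes it back up, illustrating how the factors can offset one another. (Here payroll employment and population are separate inputs; together they form the payroll-relative-to-population factor described above.)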
The following chart shows the contribution of each of these three factors to the monthly change in the unemployment rate during the last year.
A note about the chart: The CPS employment and population measures have been smoothed to account for annual population control adjustments. The smoothed employment data are available here. The method used to compute the contributions is available here.
The black line is the monthly change in the unemployment rate (unrounded). Each green segment of a bar is the change in the unemployment rate coming from the gap between population growth and payroll employment growth. Because payroll employment has generally been growing faster than the population, it has helped make the unemployment rate lower than it otherwise would have been.
But as the chart makes clear, the other two factors can also exert a significant influence on the direction of the unemployment rate. The labor force participation rate contribution (the red segments of the bars) and the contribution from the gap between the CPS and payroll employment measures (blue segments) can vary a lot from month to month, and these factors can swamp the payroll employment growth contribution.
So any assumption that strong payroll employment gains in any particular month will automatically lead to a decline in the unemployment rate could, in fact, be wrong. But over longer periods, the mapping is a bit clearer because it is effectively smoothing the month-to-month variation in the three factors. For example, the following chart shows the contribution of the three factors to 12-month changes in the unemployment rate from July 2012 to July 2013, from July 2013 to July 2014, and so on.
Gains in payroll employment relative to the population have helped pull the unemployment rate lower. Moreover, prior to the most recent 12 months, declines in the LFP rate put further downward pressure on the unemployment rate. Offsetting this pressure to varying degrees has been the fact that the CPS measure of employment has tended to increase more slowly than the payroll measure, making the decline in the unemployment rate smaller than it otherwise would have been. During the last 12 months, the change in the LFP rate turned positive on balance, meaning that the unemployment rate declined considerably less than the relative strength of payroll employment growth alone would have implied.
Going forward, another strong payroll employment reading for August is certainly no guarantee of a corresponding decline in the unemployment rate. But as shown by my colleagues David Altig and Patrick Higgins in an earlier macroblog post, under a reasonable range of assumptions for the trend path of population growth, the LFP rate, and the gap between the CPS and payroll survey measures of employment, payroll growth averaging above 150,000 a month should be enough to cause the unemployment rate to continue declining.
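A back-of-the-envelope version of that breakeven calculation, holding the LFP rate and the CPS-payroll gap constant, can be sketched as follows. The inputs are round hypothetical numbers; the fuller analysis linked above also accounts for trends in participation and in the survey gap, which is why its threshold differs.

```python
# Rough breakeven payroll growth: the monthly job gain that holds the
# unemployment rate steady, assuming a constant LFP rate and a constant
# CPS-minus-payroll employment gap.

def breakeven_payroll_growth(monthly_pop_growth, lfp, unemployment_rate):
    # The labor force grows by lfp * population growth; employment must
    # keep pace, scaled by the share of the labor force employed (1 - u).
    return lfp * monthly_pop_growth * (1.0 - unemployment_rate)

# e.g., population +200k/month, 62.8 percent participation,
# 4.9 percent unemployment
print(round(breakeven_payroll_growth(200_000, 0.628, 0.049)))  # about 119k/month
```

Under these stylized assumptions, payroll gains persistently above roughly 120,000 a month would pull the unemployment rate down, broadly consistent with the 150,000 figure cited above once trend movements in the other factors are included.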
August 11, 2016
Forecasting Loan Losses for Stress Tests
Bank capital requirements are back in the news with the recent announcements of the results of U.S. stress tests by the Federal Reserve and the European Union (E.U.) stress tests by the European Banking Authority (EBA). The Federal Reserve found that all 33 of the bank holding companies participating in its test would have continued to meet the applicable capital requirements. The EBA found progress among the 51 banks in its test, but it did not define a pass/fail threshold. In summarizing the results, EBA Chairman Andrea Enria is widely quoted as saying, "Whilst we recognise the extensive capital raising done so far, this is not a clean bill of health," and that there remains work to do.
The results of the stress tests do not mean that banks could survive any possible future macroeconomic shock. That standard would be an extraordinarily high one and would require each bank to hold capital equal to its total assets (or maybe even more if the bank held derivatives). However, the U.S. approach to scenario design is intended to make sure that the "severely adverse" scenario is indeed a very bad recession.
The Federal Reserve's Policy Statement on the Scenario Design Framework for Stress Testing indicates that the severely adverse scenario will have an unemployment rate increase of between 3 and 5 percentage points or a level of 10 percent overall. That statement observes that during the last half century, the United States has seen four severe recessions with that large an increase in the unemployment rate, with the rate peaking at more than 10 percent in the last three of them.
To forecast the losses from such a severe recession, the banks need to estimate loss models for each of their portfolios. In these models, the bank estimates the expected loss associated with a portfolio of loans as a function of the variables in the scenario. In estimating these models, banks often have a very large number of loans with which to estimate losses in their various portfolios, especially the consumer and small business portfolios. However, they have very few opportunities to observe how the loans perform in a downturn. Indeed, in almost all cases, banks started keeping detailed loan loss data only in the late 1990s and, in many cases, later than that. Thus, for many types of loans, banks might at best have data covering only the relatively mild recession of 2001–02 and the severe recession of 2007–09.
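A stylized example of such a model uses the textbook decomposition of expected loss into probability of default, loss given default, and exposure at default (EL = PD x LGD x EAD), with the default probability responding to a scenario variable. The sensitivity of PD to the unemployment gap below is invented for illustration; real bank models are far richer.

```python
# Stylized scenario-driven loss model: expected loss as a function of a
# scenario variable, here the unemployment gap. The sensitivity is a
# made-up placeholder, not an estimated parameter.
import math

def expected_loss(balance, lgd, base_pd, unemployment_gap, sensitivity=0.6):
    """PD rises with the unemployment gap through a logistic link, so it
    stays between 0 and 1 even in extreme scenarios."""
    logit = math.log(base_pd / (1 - base_pd)) + sensitivity * unemployment_gap
    pd = 1 / (1 + math.exp(-logit))
    return balance * lgd * pd

# $1 billion portfolio, 40 percent loss given default, 2 percent baseline
# PD; severely adverse scenario: unemployment 5 points above baseline
print(round(expected_loss(1e9, 0.40, 0.02, 5.0)))
```

The scarcity of downturn observations discussed above is precisely the problem of pinning down a parameter like `sensitivity` from one or two recessions' worth of data.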
Perhaps the small number of recessions—especially severe recessions—would not be a big problem if recessions differed only in their depth and not their breadth. However, even comparably severe recessions are likely to hit different parts of the economy with varying degrees of severity. As a result, a given loan portfolio may suffer only small losses in one recession but take very large losses in the next recession.
With the potential for models to underestimate losses given there are so few downturns to calibrate to, the stress testing process allows humans to make judgmental changes (or overlays) to model estimates when the model estimates seem implausible. However, the Federal Reserve requires that bank holding companies should have a "transparent, repeatable, well-supported process" for the use of such overlays.
My colleague Mark Jensen recently made some suggestions about how stress test modelers could reduce the uncertainty around projected losses because of limited data from directly comparable scenarios. He recommends using estimation procedures based on a probability theorem attributed to Reverend Thomas Bayes. When applied to stress testing, Bayes' theorem describes how to incorporate additional empirical information into an initial understanding of how losses are distributed in order to update and refine loss predictions.
One of the benefits of using techniques based on this theorem is that it allows the incorporation of any relevant data into the forecasted losses. He gives the example of using foreign data to help model the distribution of losses U.S. banks would incur if U.S. interest rates become negative. We have no experience with negative interest rates, but Sweden has recently been accumulating experience that could help in predicting such losses in the United States. Jensen argues that Bayesian techniques allow banks and bank supervisors to better account for the uncertainty around their loss forecasts in extreme scenarios.
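As a toy illustration of that updating step, consider a Beta prior over a portfolio's stress-period default rate, shaped by comparable evidence (say, foreign experience), combined with a small domestic sample via Bayes' theorem. All figures here are invented; the point is only the mechanics of pooling scarce direct data with related indirect data.

```python
# Minimal Bayesian update of a loan default rate: Beta prior,
# Binomial likelihood, conjugate posterior.

def beta_binomial_update(prior_a, prior_b, defaults, loans):
    """Return posterior Beta parameters after observing `defaults`
    defaults among `loans` loans."""
    return prior_a + defaults, prior_b + loans - defaults

# Prior informed by comparable stress episodes elsewhere: mean 8 percent
a, b = 8.0, 92.0
# A small domestic sample from one severe recession: 12 defaults in 100 loans
a, b = beta_binomial_update(a, b, defaults=12, loans=100)
posterior_mean = a / (a + b)
print(round(posterior_mean, 3))  # -> 0.1
```

A further benefit, in line with Jensen's argument, is that the posterior delivers a full distribution for the default rate rather than a point estimate, which makes the uncertainty around loss forecasts explicit.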
Additionally, I have previously argued that the existing capital standards provide a further way of mitigating the weaknesses in the stress tests. The large banks that participate in the stress tests are also in the process of becoming subject to a risk-based capital requirement commonly called Basel III, which was approved by an international committee of banking supervisors after the financial crisis. Basel III uses a different methodology to estimate losses in a severe event, one in which the historical losses in a loan portfolio provide the parameters of a loss distribution. While Basel III faces the same problem of limited loan loss data (and so almost surely underestimates some risks), those errors are likely to differ somewhat from those produced by the stress tests. Hence, the use of both measures is likely to reduce the chance that supervisors end up requiring too little capital for some types of loans.
Both the stress tests and risk-based models of the Basel III type face the unavoidable problem of inaccurately measuring risk because we have limited data from extreme events. The use of improved estimation techniques and multiple ways of measuring risk may help mitigate this problem. But the only way to solve the problem of limited data is to have a greater number of extreme stress events. Given that alternative, I am happy to live with imperfect measures of bank risk.
Author's note: I want to thank the Atlanta Fed's Dave Altig and Mark Jensen for helpful comments.