The Atlanta Fed's macroblog provides commentary and analysis on economic topics including monetary policy, macroeconomic developments, inflation, labor economics, and financial issues.

Authors for macroblog are Dave Altig, John Robertson, and other Atlanta Fed economists and researchers.

February 13, 2018

GDPNow's Forecast: Why Did It Spike Recently?

If you felt whipsawed by GDPNow recently, it's understandable. On February 1, the Atlanta Fed's GDPNow model estimate of first-quarter real gross domestic product (GDP) growth surged from 4.2 percent to 5.4 percent (annualized rates) after a manufacturing report from the Institute for Supply Management. GDPNow's estimate then fell to 4.0 percent on February 2 after the employment report from the U.S. Bureau of Labor Statistics. GDPNow displayed a similar undulating pattern early in the forecast cycle for fourth-quarter GDP growth.

What accounted for these sawtooth patterns? The answer lies in the treatment of the ISM manufacturing release. To forecast the yet-to-be-released monthly GDP source data apart from inventories, GDPNow uses an indicator of growth in economic activity from a statistical model called a dynamic factor model. The factor is estimated from 127 monthly macroeconomic indicators, many of which are used to estimate the Chicago Fed National Activity Index (CFNAI). Indices like these can be helpful for forecasting macroeconomic data, as demonstrated here and here.

Perhaps not surprisingly, the CFNAI and the GDPNow factor are highly correlated, as the red and blue lines in the chart below indicate. Both indices, which are normalized to have an average of 0 and a standard deviation of 1, are usually lower in recessions than expansions.

A major difference in the indices is how yet-to-be-released values are handled for months in the recent past that have reported values for some, but not all, of the source data. For example, on February 2, January 2018 values had been released for data from the ISM manufacturing and employment reports but not from the industrial production or retail sales reports. The CFNAI is released around the end of each month when about two-thirds of the 85 indicators used to construct it have reported values for the previous month. For the remaining indicators, the Chicago Fed fills in statistical model forecasts for unreported values. In contrast, the GDPNow factor is updated continuously and extended a month after each ISM manufacturing release. On the dates of the ISM releases, around 17 of the 127 indicators GDPNow uses have reported values for the previous month, with six coming from the ISM manufacturing report.



For months with partially missing data, GDPNow updates its factor with an approach similar to the one used in a 2008 paper by economists Domenico Giannone, Lucrezia Reichlin, and David Small. That paper describes a dynamic factor model used to nowcast GDP growth similar to the one that generates the New York Fed's staff nowcast of GDP growth. In the Atlanta Fed's GDPNow factor model, the last month of ISM manufacturing data have large weights when calculating the terminal factor value right after the ISM report. These ISM weights decrease significantly after the employment report, when about 50 of the indicators have reported values for the last month of data.
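To make the mechanics concrete, here is a minimal sketch, not the production GDPNow code, of how a one-factor dynamic factor model can be estimated on a monthly panel with a "ragged edge" of missing observations, using the state-space tools in Python's statsmodels. The panel is simulated, and the series names and dimensions are placeholders.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.dynamic_factor import DynamicFactor

# Simulated stand-in for GDPNow's panel: the real model uses 127
# standardized monthly indicators; here a small panel is driven by one
# common AR(1) factor plus idiosyncratic noise.
rng = np.random.default_rng(0)
T, N = 240, 8
factor = np.zeros(T)
for t in range(1, T):
    factor[t] = 0.8 * factor[t - 1] + rng.normal()
loadings = rng.uniform(0.5, 1.5, size=N)
panel = factor[:, None] * loadings + rng.normal(size=(T, N))
data = pd.DataFrame(panel, columns=[f"x{i}" for i in range(N)])

# The "ragged edge": only the first two series (think ISM-type releases)
# have reported values for the final month; the rest are still missing.
data.iloc[-1, 2:] = np.nan

# One common factor with AR(1) dynamics. The Kalman filter skips missing
# observations, so the terminal factor estimate leans on whichever series
# have already reported.
model = DynamicFactor(data, k_factors=1, factor_order=1)
result = model.fit(disp=False)
print(result.factors.filtered[0][-3:])  # factor estimates, last 3 months
```

Because only a handful of series carry the final month, a surprise in one of them (such as the ISM report) can move the terminal factor estimate sharply, only to be partly unwound when the employment report adds roughly 50 more observations.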

In the above figure, we see that the January 2018 GDPNow factor reading was 1.37 after the February 1 ISM release, the strongest reading since 1994 and well above either its forecasted value of 0.42 prior to the ISM release or its estimated value of 0.43 after the February 2 employment release. The aforementioned rise and decline in the GDPNow forecast of first-quarter growth is largely a function of the rise and decline in the January 2018 estimates of the dynamic factor.

Although the January 2018 reading of 59.2 for the composite ISM purchasing managers index (PMI) was higher than any reading from 2005 to 2016, it was little different from either a consensus forecast from professional economists (58.8) or the forecast from a simple model (58.9) that uses the strong reading in December 2017 (59.3). However, it was well above the reading the GDPNow dynamic factor model was expecting (54.5).

A possible shortcoming of the GDPNow factor model is that it does not account for the previous month's forecast errors when forecasting the 127 indicators. For example, the predicted composite ISM PMI reading of 54.4 in December 2017 was nearly 5 points lower than the actual value. For this discussion, let's adjust GDPNow's factor model to account for these forecast errors and consider a forecast evaluation period with revised current-vintage data after 1999. Then, the average absolute error of the 85- to 90-day-ahead adjusted model forecasts of GDP growth after ISM manufacturing releases (1.40 percentage points) is lower than the average absolute forecast error on those same dates for the standard version of GDPNow (1.49 percentage points). Moreover, the forecasts using the adjusted factor model are significantly more accurate than the GDPNow forecasts, according to a standard statistical test. If we decide to incorporate these adjustments into GDPNow's factor model, we will do so at the time of an initial forecast of quarterly GDP growth and note the change here.
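The accuracy comparison described above can be reproduced in outline. Below is a minimal sketch with simulated forecast errors; a real evaluation would use archived GDPNow forecasts against realized GDP growth, and the post does not name the exact test used, so the Diebold-Mariano-type test here is just one standard choice.

```python
import numpy as np
from scipy import stats

# Simulated forecast errors (percentage points) for the standard and
# adjusted models on matched ISM-release dates; a real evaluation would
# use archived GDPNow forecasts against realized GDP growth.
rng = np.random.default_rng(1)
common = rng.normal(0.0, 1.2, 72)        # surprise shared by both models
err_standard = common + rng.normal(0.0, 0.9, 72)
err_adjusted = common + rng.normal(0.0, 0.7, 72)

print(f"MAE, standard: {np.mean(np.abs(err_standard)):.2f}")
print(f"MAE, adjusted: {np.mean(np.abs(err_adjusted)):.2f}")

# Test of equal accuracy on the absolute-error loss differential. With
# non-overlapping one-quarter-ahead forecasts, a t-test on the mean loss
# differential approximates a Diebold-Mariano test.
d = np.abs(err_standard) - np.abs(err_adjusted)
t_stat = d.mean() / (d.std(ddof=1) / np.sqrt(len(d)))
p_value = 2 * stats.t.sf(abs(t_stat), df=len(d) - 1)
print(f"t-statistic: {t_stat:.2f}, p-value: {p_value:.3f}")
```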

Would the adjustment have made a big difference in the initial first-quarter GDP forecast? The February 1 GDP growth forecast of GDPNow with the adjusted factor model was "only" 4.7 percent. Its current (February 9) forecast of first-quarter GDP growth was the same as the standard version of GDPNow: 4.0 percent. These estimates are still much higher than both the recent trend in GDP growth and the median forecast of 3.0 percent from the Philadelphia Fed's Survey of Professional Forecasters (SPF).

Most of the difference between the GDPNow and SPF forecasts of GDP growth is the result of inventories. GDPNow anticipates inventories will contribute 1.2 percentage points to first-quarter growth, while the median SPF projection implies an inventory contribution of only 0.4 percentage points. It's not unusual to see some disagreement between these inventory forecasts, and it wouldn't be surprising if one—or both—of them turned out to be off the mark.

February 13, 2018 in Business Cycles, Forecasts, GDP, Productivity, This, That, and the Other | Permalink



January 18, 2018

How Low Is the Unemployment Rate, Really?

In 2017, the unemployment rate averaged 4.4 percent. That's quite low on a historical basis. In fact, it's the lowest level since 2000, when unemployment averaged 4.0 percent. But does that mean that the labor market is only 0.4 percentage points away from being as strong as it was in 2000? Probably not. Let's talk about why.

As observed by economist George Perry in 1970, although movement in the aggregate unemployment rate is mostly the result of changes in unemployment rates within demographic groups, demographic shifts can also change the overall unemployment rate even if unemployment within demographic groups has not changed. Adjusting for demographic changes makes for a better apples-to-apples comparison of unemployment today with past rates.

Three large demographic shifts underway since the early 2000s are the rise in the average age and educational attainment of the labor force, and the decline in the share who are white and non-Hispanic. These changes are potentially important because older and more educated workers have lower unemployment rates than their younger and less educated counterparts, and white non-Hispanic workers tend to have lower unemployment rates than other racial and ethnic groups.

The following chart shows the results of a demographic adjustment that jointly controls for year-to-year changes in the labor force shares of two sex, three education, four race/ethnicity, and six age groups (see here for more details). Relative to the year 2000, the unemployment rate in 2017 is about 0.6 percentage points lower than it otherwise would have been simply because the demographic composition of the labor force has changed (depicted by the blue line in the chart).
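Mechanically, the adjustment is fixed-weight arithmetic: hold each group's labor force share at its base-year value and reweight current group-level unemployment rates. Here is a minimal sketch with made-up numbers for just two groups; the actual adjustment jointly uses the full set of 2 sex x 3 education x 4 race/ethnicity x 6 age cells.

```python
# Fixed-weight (demographically adjusted) unemployment rate: reweight
# current group-level unemployment rates by base-year labor force shares.
groups = ["younger/less educated", "older/more educated"]
share_2000 = [0.45, 0.55]   # labor force shares in the base year (assumed)
share_2017 = [0.35, 0.65]   # shares after the demographic shift (assumed)
u_2017 = [0.062, 0.034]     # 2017 unemployment rates by group (assumed)

# The actual 2017 rate mixes group rates with the new shares; the adjusted
# rate asks what 2017 unemployment would be at the 2000 demographic mix.
u_actual = sum(s * u for s, u in zip(share_2017, u_2017))
u_adjusted = sum(s * u for s, u in zip(share_2000, u_2017))

print(f"actual 2017 rate:   {u_actual:.2%}")
print(f"with 2000 weights:  {u_adjusted:.2%}")
print(f"composition effect: {u_actual - u_adjusted:+.2%}")  # negative
```

With these illustrative numbers, the shift toward the lower-unemployment group pulls the measured rate down by about 0.3 percentage points, the same direction as the 0.6 point effect shown in the chart.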

In other words, even though the 2017 unemployment rate is only 0.4 percentage points higher than in 2000, the demographically adjusted unemployment rate (the green line in the chart) is 1.0 percentage points higher. In terms of unemployment, after adjusting for changes in the composition of the labor force, we are not as close to the 2000 level as you might have thought.

The demographic discrepancy is even larger for the broader U6 measure of unemployment, which includes marginally attached and involuntarily part-time workers. The 2017 demographically adjusted U6 rate is 2.5 percentage points higher than in 2000, whereas the unadjusted U6 rate is only 1.5 percentage points higher. That is, on a demographically adjusted basis, the economy had an even larger share of marginally attached and involuntarily part-time workers in 2017 than in 2000.

The point here is that when comparing unemployment rates over long periods, it's advisable to use a measure that is reasonably insulated from demographic changes. However, you should also keep in mind that demographics are only one of several factors that can shift measured unemployment over time. Changes in labor market and social policies, in the mix of industries, and in the technology of how people find work can also change how labor markets function. This is one reason why estimates of the so-called natural rate of unemployment are quite uncertain and subject to revision. For example, participants at the December 2012 Federal Open Market Committee meeting had estimates for the unemployment rate that would prevail over the longer run ranging from 5.2 to 6.0 percent. At the December 2017 meeting, the range of estimates was almost a whole percentage point lower, at 4.3 to 5.0 percent.

January 18, 2018 in Business Cycles, Economic conditions, Labor Markets, Unemployment | Permalink



January 17, 2018

What Businesses Said about Tax Reform

Many folks are wondering what impact the Tax Cuts and Jobs Act—which was introduced in the House on November 2, 2017, and signed into law a few days before Christmas—will have on the U.S. economy. Well, in a recent speech, Atlanta Fed president Raphael Bostic had this to say: "I'm marking in a positive, but modest, boost to my near-term GDP [gross domestic product] growth profile for the coming year."

Why the measured approach? That might be our fault. As part of President Bostic's research team, we've been curious about the potential impact of this legislation for a while now, especially on how firms were responding to expected policy changes. Back in November 2016 (the week of the election, actually), we started asking firms in our Sixth District Business Inflation Expectations (BIE) survey how optimistic they were (on a 0–100 scale) about the prospects for the U.S. economy and their own firm's financial prospects. We've repeated this special question in three subsequent surveys. For a cleaner, apples-to-apples approach, the charts below show only the results for firms that responded in each survey (though the overall picture is very similar).

As the charts show, firms have become more optimistic about the prospects for the U.S. economy since November 2016, but not since February 2017, and we didn't detect much of a difference in December 2017, after the details of the tax plan became clearer. But optimism is a vague concept and may not necessarily translate into actions that firms could take that would boost overall GDP—namely, increasing capital investment and hiring.

In November, we had two surveys in the field—our BIE survey (undertaken at the beginning of the month) and a national survey conducted jointly by the Atlanta Fed, Nick Bloom of Stanford University, and Steven Davis of the University of Chicago. (That survey was in the field November 13–24.) In both of these surveys, we asked firms how the pending legislation would affect their capital expenditure plans for 2018. In the BIE survey, we also asked how tax reform would affect hiring plans.

The upshot? The typical firm isn't planning on a whole lot of additional capital spending or hiring.

In our national survey, roughly two-thirds of respondents indicated that the tax reform hasn't enticed them into changing their investment plans for 2018, as the following chart shows.

The chart below also makes apparent that small firms (fewer than 100 employees) are more likely to significantly ramp up capital investment in 2018 than midsize and larger firms.

For our regional BIE survey, the capital investment results were similar (you can see them here). And as for hiring, the typical firm doesn't appear to be changing its plans. Interestingly, here too, smaller firms were more likely to say they'd ramp up hiring. Among larger firms (more than 100 employees), nearly 70 percent indicated that they'd leave their hiring plans unchanged.

One interpretation of these survey results is that the potential for a sharp acceleration in GDP growth is limited. And that's also how President Bostic described things in his January 8 speech: "For now, I am treating a more substantial breakout of tax-reform-related growth as an upside risk to my outlook."

January 17, 2018 in Business Cycles, Data Releases, Economic conditions, Economic Growth and Development, Economics, Taxes | Permalink



September 07, 2017

What Is the "Right" Policy Rate?

What is the right monetary policy rate? The Cleveland Fed, via Michael Derby in the Wall Street Journal, provides one answer—or rather, one set of answers:

The various flavors of monetary policy rules now out there offer formulas that suggest an ideal setting for policy based on economic variables. The best known of these is the Taylor Rule, named for Stanford University's John Taylor, its author. Economists have produced numerous variations on the Taylor Rule that don't always offer a similar story...

There is no agreement in the research literature on a single "best" rule, and different rules can sometimes generate very different values for the federal funds rate, both for the present and for the future, the Cleveland Fed said. Looking across multiple economic forecasts helps to capture some of the uncertainty surrounding the economic outlook and, by extension, monetary policy prospects.

Agreed, and this is the philosophy behind both the Cleveland Fed's calculations based on Seven Simple Monetary Policy Rules and our own Taylor Rule Utility. The two tools complement one another nicely: Cleveland's version emphasizes forecasts of the federal funds rate under different rules, while Atlanta's utility focuses on the current setting of the rate under a (different, but overlapping) set of rules and a variety of measures of the key variables that appear in the Taylor Rule (namely, the resource gap, the inflation gap, and the "neutral" policy rate). We update the Taylor Rule Utility twice a month, after the Consumer Price Index and Personal Income and Outlays reports, and use a variety of survey- and model-based nowcasts to fill in yet-to-be-released source data for the latest quarter.

We're introducing an enhancement to our Taylor Rule Utility page: a "heatmap" that allows the construction of a color-coded view of Taylor Rule prescriptions (relative to a selected benchmark) for five different measures of the resource gap and five different measures of the neutral policy rate. We find the heatmap a useful way to quickly compare the actual fed funds rate with current prescriptions for the rate from a relatively large number of rules.

In constructing the heatmap, users have options for measuring the inflation gap and setting the value of the "smoothing parameter" in the policy rule, as well as for establishing the weight placed on the resource gap and the benchmark against which the policy rule is compared. (The inflation gap is the difference between actual inflation and the Federal Open Market Committee's 2 percent longer-term objective. The smoothing parameter is the degree to which the rule is inertial, meaning that it puts weight on maintaining the fed funds rate at its previous value.)

For example, assume we (a) measure inflation using the four-quarter change in the core personal consumption expenditures price index; (b) put a weight of 1 on the resource gap (that is, specify the rule so that a percentage point change in the resource gap implies a 1 percentage point change in the rule's prescribed rate); and (c) specify that the policy rule is not inertial (that is, it places no weight on last period's policy rate). Below is the heatmap corresponding to this policy rule specification, comparing the rule's prescription to the current midpoint of the fed funds rate target range:

We should note that all of the terms in the heatmap are described in detail in the "Overview of Data" and "Detailed Description of Data" tabs on the Taylor Rule Utility page. In short, U-3 (the standard unemployment rate) and U-6 are measures of labor underutilization defined here. We introduced ZPOP, the utilization-to-population ratio, in this macroblog post. "Emp-Pop" is the employment-population ratio. The natural (real) interest rate is denoted by r*. The abbreviations for the last three row labels denote estimates of r* from Kathryn Holston, Thomas Laubach, and John C. Williams, Thomas Laubach and John C. Williams, and Thomas Lubik and Christian Matthes.

The color coding (described on the webpage) should be somewhat intuitive. Shades of red mean the midpoint of the current policy rate range is at least 25 basis points above the rule prescription, shades of green mean that the midpoint is more than 25 basis points below the prescription, and shades of white mean the midpoint is within 25 basis points of the rule.
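For concreteness, here is a minimal sketch of how such a heatmap grid could be computed. The rule below is the standard non-inertial Taylor-type specification from the example above; the inflation and resource-gap numbers are illustrative placeholders rather than the utility's live inputs, and of the r* values only the LW (−0.22) and LM (−0.06) estimates come from the post itself (the HLW value is assumed).

```python
def taylor_rule(pi, gap, r_star, pi_target=2.0, gap_weight=1.0,
                rho=0.0, prev_rate=0.0):
    """Taylor-type prescription (percent), optionally inertial."""
    core = r_star + pi + 0.5 * (pi - pi_target) + gap_weight * gap
    return rho * prev_rate + (1.0 - rho) * core

# Illustrative inputs, not the utility's live data: four-quarter core PCE
# inflation, a menu of resource-gap measures, and a menu of r* estimates.
pi = 1.4
gaps = {"U-3 gap": 0.4, "U-6 gap": -0.3, "ZPOP gap": -0.6,
        "Emp-Pop gap": -0.9, "GDP gap": 0.2}
r_stars = {"Taylor (2%)": 2.0, "HLW": 0.4, "LW": -0.22, "LM": -0.06}

midpoint = 1.125  # midpoint of the 1.00-1.25 percent target range

for r_name, r_star in r_stars.items():
    for g_name, gap in gaps.items():
        rx = taylor_rule(pi, gap, r_star)
        if midpoint - rx >= 0.25:
            color = "red"      # midpoint at least 25 bp above the rule
        elif rx - midpoint > 0.25:
            color = "green"    # midpoint more than 25 bp below the rule
        else:
            color = "white"    # within 25 bp of the rule
        print(f"{r_name:12s} {g_name:12s} rule = {rx:6.2f}  {color}")
```

Even with these made-up inputs, the grid reproduces the qualitative pattern discussed next: the 2 percent neutral-rate row comes out green across the board, while the low-r* rows flip between red and white depending on the gap measure.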

The heatmap above has "variations on the Taylor Rule that don't always offer a similar story" because the colors range from a shade of red to shades of green. But certain themes do emerge. If, for example, you believe that the neutral real rate of interest is quite low (the Laubach-Williams and Lubik-Matthes estimates in the bottom two rows are −0.22 and −0.06), your belief about the magnitude of the resource gap would be critical to determining whether this particular rule suggests that the policy rate is already too high, has a bit more room to increase, or is just about right. On the other hand, if you are an adherent of the original Taylor Rule and its assumption of a 2 percent long-run neutral rate (the top row of the chart), there isn't much ambiguity in the conclusion that the current rate is well below what the rule indicates.

"[D]ifferent rules can sometimes generate very different values for the federal funds rate, both for the present and for the future." Indeed.

September 7, 2017 in Business Cycles, Data Releases, Economics, Monetary Policy | Permalink



June 23, 2015

Approaching the Promised Land? Yes and No

Last Friday, we released our June installment of the Business Inflation Expectations (BIE) survey. Among the questions we put to our panel of businesses was a quarterly question on slack, asking firms to consider how their current sales levels compare to what they would consider normal.

The good news is, on average, the gap between firms' current unit sales levels and what they would consider normal sales levels continues to close (see the chart).


By our measure, firm sales, in the aggregate, are 1.9 percent below normal, a bit better than when we polled firms in March (when they were 2.1 percent below normal) and much improved from this time last year (3.7 percent below normal). For comparison, the Congressional Budget Office's (CBO) estimate of slack on a real gross domestic product (GDP) basis was 2.6 percent in the first quarter (though this estimate will almost certainly be revised to something closer to 2.4 percent when the revised GDP estimates are reported later today). And if GDP growth this quarter comes in around 2.5 percent, as economists generally expect, the CBO's GDP-based slack estimate will be 2.2 percent this quarter, just a shade larger than what our June survey data are saying.

Now, as we have emphasized frequently (for example, in macroblog posts in May 2015, February 2015, and June 2013), performance in the aggregate and performance within select firm groups can differ widely. For example, while small firms continue to have greater slack than larger firms, their pace of improvement has been much more rapid (see the table).


Likewise, some industries (such as transportation and finance) see current sales as better than normal. But others, like manufacturers, are currently reporting considerable slack—and findings from this group appear to show a marginal worsening in sales levels over the past 12 months.

Another item that caught our attention this month was the differing pace of narrowing in the sales gap among those firms with significant export exposure (greater than 20 percent of sales) relative to those with no direct export exposure. We connected these dots using responses to this month's special question, in which responding firms specified their share of customers by geographic area: local, regional (the Southeast, in our case), national, and international (see the table).


So things are still getting better for the economy overall, and the small firms in our panel have displayed particularly rapid improvement during the last year. But if you've got exposure to the "soft" export markets, as mentioned in the June 17 FOMC statement, you've likely experienced a slower pace of improvement.

By Mike Bryan, vice president and senior economist; Brent Meyer, economist; and Nicholas Parker, economic policy specialist, all in the Atlanta Fed's research department

June 23, 2015 in Business Cycles, Business Inflation Expectations, Economic conditions | Permalink



June 19, 2015

Will the Elevated Share of Part-Time Workers Last?

There seems to be mounting evidence that at least part of the elevated share of part-time employment in the economy is here to stay. We have some insights to offer based on a recent survey of our business contacts.

Why are we interested? A higher part-time share of employment isn't necessarily a bad thing if people are working part-time voluntarily. Unfortunately, the elevated share is concentrated among people who would prefer to be working full-time. At the average rate of decline seen over the past five years, the part-time for economic reasons (PTER) share of employment would not reach its prerecession average for about 10 years.

That pace is significantly slower than the decline in the unemployment rate, whose trajectory suggests it will reach its prerecession average much sooner—in around a year. The gap raises an important policy question: how much labor market slack exists beyond what the unemployment rate suggests, and to what extent can policy effectively reduce it?
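The "about 10 years" figure is a simple linear extrapolation, sketched below with assumed magnitudes; actual PTER shares come from the BLS's Current Population Survey.

```python
# Back-of-the-envelope extrapolation behind the "about 10 years" statement.
pter_now = 4.3            # PTER share of employment, percent (assumed)
pter_prerecession = 2.8   # prerecession average, percent (assumed)
annual_decline = 0.15     # average decline, percentage points/year (assumed)

years_to_normal = (pter_now - pter_prerecession) / annual_decline
print(f"years until PTER share reaches its prerecession average: "
      f"{years_to_normal:.0f}")  # about 10 at this pace
```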

What are the drivers? Data versus anecdotes
Researchers (here, here, and here) have pointed to factors such as industry shifts in the economy, changing workforce demographics, rising health care costs, and the Affordable Care Act as potentially important drivers of this shift. But data can tell us only so much. When a gap in the data develops, we generally turn to our business contacts, participating members of our Regional Economic Information Network (REIN), to fill in the missing information.

According to our contacts, the relative cost of full-time employees remains the most important reason for having a higher share of part-time employees than before the recession, which is the same response we received in last summer's survey on the same topic. Lack of strong enough sales growth to justify conversion of part-time to full-time workers came in as a close second.

The importance rating for each of the factors was notably similar to last year's survey, with one exception. Technology was rated as somewhat important, reflecting an uptick from the average response we received last year. We've certainly heard anecdotally that scheduling software has enabled firms to better manage their part-time staff, and it seems that this factor has gained in importance over the past year.

The chart below summarizes the reasons our business contacts gave in the July 2014 and the May 2015 surveys. The question was asked only of those who currently have a higher share of part-time workers than they did before the recession. The chart shows the results for all respondents, whether they responded to one or both surveys. When we limited our analysis to only those who responded to both surveys, the results were the same.

Will the elevated share persist?
The results suggest that a return to prerecession levels is unlikely to occur in the near term.

The chart below shows employers' predictions for part-time employment at their firms, relative to before the recession. About 27 percent of respondents believe that in two years, their firms will be more reliant on part-time work compared to before the recession. About 7 percent do not currently have an elevated share of part-time employees but believe they will in two years. About two-thirds believe their share of part-time will be roughly the same as before, while only 8 percent believe they will have less reliance on part-time workers compared to before the recession.

The majority of our contacts believe their share of part-time employment will normalize over the next two years, but some believe it will stay elevated. Even so, an elevated share in 2017 would not mean the shift is permanent. In fact, firms cited a balance of cyclical and structural factors for their higher reliance on part-time workers. Low sales growth and an ample supply of workers willing to take part-time jobs could both be viewed as cyclical factors that will dissipate as the economy further improves.

Meanwhile, higher compensation costs of full-time relative to part-time employees and the role of technology that enables companies to more easily manage their workforce can be considered structural factors influencing the behavior of firms. Firms that currently have a higher share of part-time employees gave about equal weight to these forces, suggesting that, as other research has found, both cyclical and structural factors are important explanations for the slow decline in the part-time share of employment.

June 19, 2015 in Business Cycles, Employment, Labor Markets, Unemployment | Permalink


Current technologies are a great enabler; this may not have been the case in the past. But one reason that needs further study is the fallout of M&A and its impact on payrolls, which leaves very little allowance for full-time additions thereafter. Full-time additions have come more in the retail and service areas, followed by technology, while the oil and natural gas sector has seen dwindling fortunes and a switch to part-time work.

Posted by: Procyon Mukherjee | June 21, 2015 at 11:26 PM


May 18, 2015

Sales Flexing Muscle at More Firms

The news in this month's Business Inflation Expectations (BIE) report is that, in the aggregate, firms' unit sales levels continue to strengthen. The survey question behind that finding measures firms' perceptions of current unit sales levels relative to "normal times."

This month, 70 percent of firms indicated their sales levels are at or above what they consider normal. Last November, that share was 61 percent, and one year ago, it was only 54 percent. We typically report the aggregate results in a diffusion index (see the chart), which also shows the overall progression toward "normal times" (a value of 0).
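For reference, a diffusion index of this kind is simple to compute. Here is a minimal sketch with hypothetical response shares and an assumed scoring convention; the published BIE index may score or weight the answer categories differently.

```python
# Diffusion index for "current unit sales vs. normal" survey responses.
# The five-point scale and its scores are an assumed convention.
scores = {"much less than normal": -1.0,
          "somewhat less than normal": -0.5,
          "about normal": 0.0,
          "somewhat greater than normal": 0.5,
          "much greater than normal": 1.0}

# Hypothetical shares of firms giving each answer (sum to 1).
shares = {"much less than normal": 0.08,
          "somewhat less than normal": 0.22,
          "about normal": 0.38,
          "somewhat greater than normal": 0.24,
          "much greater than normal": 0.08}

index = sum(scores[k] * shares[k] for k in scores)
at_or_above = sum(v for k, v in shares.items() if scores[k] >= 0.0)
print(f"diffusion index: {index:+.2f} (0 = 'normal times')")
print(f"share at or above normal: {at_or_above:.0%}")  # cf. the 70 percent
```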

But, typical of aggregate statistics, these results obscure the diversity of experience among sectors. Digging deeper, we found that most (but not all) of the sectors represented in our panel have shown further improvement in their sales performance relative to last November (see the chart).

Retailers and those in the real estate and rental leasing/construction sectors reported the most significant improvement since November, with retailers approaching what they consider normal sales levels. This news is likely to be most welcome to Dennis Lockhart, our boss here in Atlanta, who has put the performance of the consumer on his "must watch" list. Two industries—finance and insurance, and transportation and warehousing—reported above-normal sales levels in our recent survey.

Only the manufacturers in our panel indicated that their sales performance has deteriorated since November, and they are now reporting sales well below normal. Of course, this news shouldn't be terribly surprising given the recent softness in the manufacturing indexes from both the Institute for Supply Management and industrial production data. This information was also on the boss's watch list, as he made clear in his speech:

The stronger dollar was likely reflected in a drag on net exports...[and] looking ahead, I expect net exports to be a modest drag on economic activity over much of the year.... It should be noted, however, that in recent weeks the dollar has stabilized and oil prices have begun to move up a little. These developments, if they stick, could dilute somewhat what would otherwise be drags on the economy in the near term. We shall see.

Well, judging from our May BIE report, manufacturers aren't seeing improvement quite yet.

By Nicholas Parker, an economic policy analysis specialist in the research department of the Atlanta Fed

May 18, 2015 in Business Cycles, Business Inflation Expectations, Economic conditions | Permalink





June 26, 2014

Torturing CPI Data until They Confess: Observations on Alternative Measures of Inflation (Part 3)

On May 30, the Federal Reserve Bank of Cleveland generously allowed me some time to speak at their conference on Inflation, Monetary Policy, and the Public. The purpose of my remarks was to describe the motivations and methods behind some of the alternative measures of the inflation experience that my coauthors and I have produced in support of monetary policy.

This is the last of three posts on that talk. The first post reviewed alternative inflation measures; the second looked at ways to work with the Consumer Price Index to get a clear view of inflation. The full text of the speech is available on the Atlanta Fed's events web page.

The challenge of communicating price stability

Let me close this blog series with a few observations on the criticism that measures of core inflation, and specifically the CPI excluding food and energy, disconnect the Federal Reserve from households and businesses "who know price changes when they see them." After all, don't the members of the Federal Open Market Committee (FOMC) eat food and use gas in their cars? Of course they do, and if it is the cost of living the central bank intends to control, the prices of these goods should necessarily be part of the conversation, notwithstanding their observed volatility.

In fact, in the popularly reported all-items CPI, the Bureau of Labor Statistics has already removed about 40 percent of the monthly volatility in the cost-of-living measure through its seasonal adjustment procedures. I think communicating in terms of a seasonally adjusted price index makes a lot of sense, even if nobody actually buys things at seasonally adjusted prices.

Referencing alternative measures of inflation presents some communications challenges for the central bank to be sure. It certainly would be easier if progress toward either of the Federal Reserve's mandates could be described in terms of a single, easily understood statistic. But I don't think this is feasible for price stability, or for full employment.

And with regard to our price stability mandate, I suspect the problem of public communication runs deeper than the particular statistics we cite. In 1996, Robert Shiller polled people—real people, not economists—about their perceptions of inflation. What he found was a stark difference between how economists think about the word "inflation" and how folks outside a relatively small band of academics and policymakers define inflation. Consider this question:


And here is how people responded:


Seventy-seven percent of the households in Shiller's poll picked number 2—"Inflation hurts my real buying power"—as their biggest gripe about inflation. This is a cost-of-living description. It isn't the same concept that most economists are thinking about when they consider inflation. Only 12 percent of the economists Shiller polled indicated that inflation hurt real buying power.

I wonder if, in the minds of most people, the Federal Reserve's price-stability mandate is heard as a promise to prevent things from becoming more expensive, and especially the staples of life like, well, food and gasoline. This is not what the central bank is promising to do.

What is the Federal Reserve promising to do? To the best of my knowledge, the first "workable" definition of price stability by the Federal Reserve was Paul Volcker's 1983 description that it was a condition where "decision-making should be able to proceed on the basis that 'real' and 'nominal' values are substantially the same over the planning horizon—and that planning horizons should be suitably long."

Nearly 30 years later, the Fed gave price stability a more explicit definition when it laid down a numerical target. The FOMC describes that target thusly:

The inflation rate over the longer run is primarily determined by monetary policy, and hence the Committee has the ability to specify a longer-run goal for inflation. The Committee reaffirms its judgment that inflation at the rate of 2 percent, as measured by the annual change in the price index for personal consumption expenditures, is most consistent over the longer run with the Federal Reserve's statutory mandate.

Whether one goes back to the qualitative description of Volcker or the quantitative description in the FOMC's recent statement of principles, the thrust of the price-stability objective is broadly the same. The central bank is intent on managing the persistent, nominal trend in the price level that is determined by monetary policy. It is not intent on managing the short-run, real fluctuations that reflect changes in the cost of living.

Effectively achieving price stability in the sense of the FOMC's declaration requires that the central bank hear what it needs to from the public, and that the public in turn hear what it needs to know from the central bank. And this isn't likely unless the central bank and the public engage in a dialogue in a language that both can understand.

Prices are volatile, and the cost of living the public experiences ought to reflect that. But what the central bank can control over time—inflation—is obscured within these fluctuations. What my colleagues and I have attempted to do is to rearrange the price data at our disposal, and so reveal a richer perspective on the inflation experience.

We are trying to take the torture out of the inflation discussion by accurately measuring the things that the Fed needs to worry about and by seeking greater clarity in our communications about what those things mean and where we are headed. Hard conversations indeed, but necessary ones.

By Mike Bryan, vice president and senior economist in the Atlanta Fed's research department


June 26, 2014 in Business Cycles, Data Releases, Inflation | Permalink




It would seem the non-economists may also be saying that the economists' low inflation is their own stagnant wage.

Sure, they may see prices rising, but what they say they suffer is the reduction of purchasing power.

Perhaps they would be happy to see prices rising rapidly as long as their own wages outpaced them.

The 70s may not have been so bad for them.

Posted by: cfaman | June 27, 2014 at 10:01 AM

In addition to the issues discussed in the article, Fed policy makers typically ignore one-time price changes, particularly those originating on the supply side of the economy -- e.g., those caused by bad weather or a foreign conflict.

The public can't ignore those price changes, which comprise their daily reality.

Posted by: Thomas Wyrick | July 06, 2014 at 05:57 PM

Tried to contact you in Cleveland in late summer 2008. Had a simple question: what is happening? I saw your picture on the FRB website; the next day your picture disappeared. I called FRB Cleveland, and a woman, maybe an economist, said you didn't work there anymore; that was all the information she had. I thought you quit because Greenspan discussed you!!!! Hope all is well. Henry

Posted by: Henry Feldman | June 27, 2017 at 02:16 PM


June 24, 2014

Torturing CPI Data until They Confess: Observations on Alternative Measures of Inflation (Part 2)

On May 30, the Federal Reserve Bank of Cleveland generously allowed me some time to speak at their conference on Inflation, Monetary Policy, and the Public. The purpose of my remarks was to describe the motivations and methods behind some of the alternative measures of the inflation experience that my coauthors and I have produced in support of monetary policy.

This is the second of three posts based on that talk. Yesterday's post considered the median CPI and other trimmed-mean measures.

Is it more expensive, or does it just cost more money? Inflation versus the cost of living

Let me make two claims that I believe are, separately, uncontroversial among economists. Jointly, however, I think they create an incongruity for how we think about and measure inflation.

The first claim is that over time, inflation is a monetary phenomenon. It is caused by too much money chasing a limited number of things to buy with that money. As such, the control of inflation is rightfully the responsibility of the institution that has monopoly control over the supply of money—the central bank.

My second claim is that the cost of living is a real concept, and changes in the cost of living will occur even in a world without money. It is a description of how difficult it is to buy a particular level of well-being. Indeed, to a first approximation, changes in the cost of living are beyond the ability of a central bank to control.

For this reason, I think it is entirely appropriate to think about whether the cost of living in New York City is rising faster or slower than in Cleveland, just as it is appropriate to ask whether the cost of living of retirees is rising faster or slower than it is for working-aged people. The folks at the Bureau of Labor Statistics produce statistics that can help us answer these and many other questions related to how expensive it is to buy the happiness embodied in any particular bundle of goods.

But I think it is inappropriate for us to think about inflation, the object of central bank control, as being different in New York than it is in Cleveland, or to think that inflation is somehow different for older citizens than it is for younger citizens. Inflation is common to all things valued by money. Yet changes in the cost of living and inflation are commonly talked about as if they are the same thing. And this creates both a communication and a measurement problem for the Federal Reserve and other central banks around the world.

Here is the essence of the problem as I see it: money is not only our medium of exchange but also our numeraire—our yardstick for measuring value. Embedded in every price change, then, are two forces. The first is real in the sense that the good is changing its price in relation to all the other prices in the market basket. It is the cost adjustment that motivates you to buy more or less of that good. The second force is purely nominal. It is a change in the numeraire caused by an imbalance in the supply and demand of the money being provided by the central bank. I think the concept of "core inflation" is all about trying to measure changes in this numeraire. But to get there, we need to first let go of any "real" notion of our price statistics. Let me explain.

As a cost-of-living approximation, the weights the Bureau of Labor Statistics (BLS) uses to construct the Consumer Price Index (CPI) are based on some broadly representative consumer expenditures. It is easy to understand that since medical care costs are more important to the typical household budget than, say, haircuts, these costs should get a greater weight in the computation of an individual's cost of living. But does inflation somehow affect medical care prices differently than haircuts? I'm open to the possibility that the answer to this question is yes. It seems to me that if monetary policy has predictable, real effects on the economy, then there will be a policy-induced disturbance in relative prices that temporarily alters the cost of living in some way.

But if inflation is a nominal experience that is independent of the cost of living, then the inflation component of medical care is the same as that in haircuts. No good or service, geographic region, or individual experiences inflation any differently than any other. Inflation is a common signal that ultimately runs through all wages and prices.

And when we open up to the idea that inflation is a nominal, not a real, concept, we begin to think about the BLS's market basket in a fundamentally different way from the cost-of-living concept the BLS intends to measure.

This, I think, is the common theme that runs through all measures of "core" inflation. Can the prices the BLS collects be reorganized or reweighted in a way that makes the aggregate price statistic more informative about the inflation that the central bank hopes to control? I think the answer is yes. The CPI excluding food and energy is one very crude way. Food and energy prices are extremely volatile and certainly point to nonmonetary forces as their primary drivers.

In the early 1980s, Otto Eckstein defined core inflation as the trend growth rate of the cost of the factors of production—the cost of capital and wages. I would compare Eckstein's measure to the "inflation expectations" component that most economists (and presumably the FOMC) think "anchor" the inflation trend.

The sticky-price CPI

Brent Meyer and I have taken this idea to the CPI data. One way that prices appear to be different is in their observed "stickiness." That is, some prices tend to change frequently, while others do not. Prices that change only infrequently are likely to be more forward-looking than are those that change all the time. So we can take the CPI market basket and separate it into two groups of prices—prices that tend to be flexible and those that are "sticky" (a separation made possible by the work of Mark Bils and Peter J. Klenow).

Indeed, we find that the items in the CPI market basket that change prices frequently (about 30 percent of the CPI) are very responsive to changes in economic conditions but do not seem to have a very forward-looking character. But the 70 percent of the market basket items that do not change prices very often—these are accounted for in the sticky-price CPI—appear to be largely immune to fluctuations in business conditions and are better predictors of future price behavior. In other words, we think that some "inflation-expectation" component exists to varying degrees within each price. By reweighting the CPI market basket in a way that amplifies the behavior of the most forward-looking prices, the sticky-price CPI gives policymakers a perspective on the inflation experience that the headline CPI can't.
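A minimal sketch of the reweighting idea follows. The component names, weights, frequencies, and price changes are invented for illustration; the roughly 4.3-month frequency cutoff echoes the convention used in published sticky-price CPI work, and the real classification rests on Bils and Klenow's estimated frequencies of price change.

```python
import pandas as pd

# Illustrative CPI components with invented expenditure weights, average
# months between price changes, and 12-month price changes. A component
# is "sticky" if it changes price less often than the cutoff.
components = pd.DataFrame({
    "weight":                 [0.33, 0.06, 0.07, 0.04, 0.03],
    "months_between_changes": [11.0, 0.7,  1.5,  6.0,  8.0],
    "pct_change_12m":         [3.0,  -4.0, 1.2,  2.1,  2.6],
}, index=["shelter", "gasoline", "food", "medical services", "tuition"])

CUTOFF = 4.3  # months between price changes
sticky = components[components["months_between_changes"] > CUTOFF]
flexible = components[components["months_between_changes"] <= CUTOFF]

def weighted_inflation(df):
    # Renormalize so each subaggregate's weights sum to one.
    return (df["weight"] * df["pct_change_12m"]).sum() / df["weight"].sum()

print(f"sticky-price inflation:   {weighted_inflation(sticky):5.2f}%")
print(f"flexible-price inflation: {weighted_inflation(flexible):5.2f}%")
```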

Here is what monthly changes in the sticky-price CPI look like compared to the all-items CPI and the traditional "core" CPI.

Let me describe another, more radical example of how we might think about reweighting the CPI market basket to measure inflation—a way of thinking that is very different from the expenditure-basket approach the BLS uses to measure the cost of living.

If we assume that inflation is ultimately a monetary event and, moreover, that the signal of this monetary inflation can be found in all prices, then we might use statistical techniques to help us identify that signal from a large number of price data. The famous early-20th-century economist Irving Fisher described the problem as trying to track a swarm of bees by abstracting from the individual, seemingly chaotic behavior of any particular bee.

Stephen Cecchetti and I experimented along these lines to measure a common signal running through the CPI data. The basic idea of our approach was to take the component data that the BLS supplied, make a few simple identifying assumptions, and let the data itself determine the appropriate weighting structure of the inflation estimate. The signal-extraction method we chose was a dynamic-factor index approach, and while we didn't pursue that work much further, others did, using more sophisticated and less restrictive signal-extraction methods. Perhaps most notable is the work of Ricardo Reis and Mark Watson.

To give you a flavor of the approach, consider the "first principal component" of the CPI price-change data. The first principal component of a data series is a statistical combination of the data that accounts for the largest share of their joint movement (or variance). It's a simple, statistically shared component that runs through all the price data.
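To give a concrete feel for the computation, here is a minimal sketch that extracts the first principal component of a panel of price changes, with simulated data standing in for the BLS's CPI component detail; the loadings play the role of the alternative "weights" discussed below.

```python
import numpy as np

# Simulated monthly price changes for ten "components," each a mix of one
# common signal plus idiosyncratic noise of varying size.
rng = np.random.default_rng(42)
T, N = 300, 10
common = rng.normal(0.2, 0.3, T)          # the shared "inflation" signal
noise_scale = rng.uniform(0.2, 2.0, N)    # component-specific volatility
X = common[:, None] + rng.normal(size=(T, N)) * noise_scale

# First principal component via the eigendecomposition of the correlation
# matrix of the standardized price changes.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
w = eigvecs[:, -1]                        # eigenvector of largest eigenvalue
if w.sum() < 0:
    w = -w                                # fix the arbitrary sign
pc1 = Z @ w

# Quieter components carry the common signal more cleanly and earn larger
# weights -- the sense in which these differ from expenditure weights.
print("loadings:", np.round(w, 2))
print("corr(pc1, true signal):",
      round(float(np.corrcoef(pc1, common)[0, 1]), 2))
```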

This next chart shows the first principal component of the CPI price data, in relation to the headline CPI and the core CPI.

Again, this is a very different animal than what the folks at the BLS are trying to measure. In fact, the weights used to produce this particular common signal in the price data bear little similarity to the expenditure weights that make up the market baskets that most people buy. And why should they? The idea here doesn't depend on how important something is to the well-being of any individual, but rather on whether the movement in its price seems to be similar or dissimilar to the movements of all the other prices.

In the table below, I report the weights (or relative importance) of a select group of CPI components and the weights they would get on the basis of their contribution to the first principal component.


While some criticize the CPI because it overweights housing from a cost-of-living perspective, it may be these housing components that ought to be given the greatest consideration when we think about the inflation that the central bank controls. Likewise, according to this approach, restaurant costs, motor vehicle repairs, and even a few food components should be taken pretty seriously in the measurement of a common inflation signal running through the price data.

And what price movements does this approach say we ought to ignore? Well, gasoline prices for one. But movements in the prices of medical care commodities, communications equipment, and tobacco products also appear to move in ways that are largely disconnected from the common thread in prices that runs through the CPI market basket.

But this and other measures of "core" inflation are very much removed from the cost changes that people experience on a monthly basis. Does that cause a communications problem for the Federal Reserve? This will be the subject of my final post.

By Mike Bryan, vice president and senior economist in the Atlanta Fed's research department


June 24, 2014 in Business Cycles, Data Releases, Inflation | Permalink




Great thoughts, thanks for sharing. Taking the idea of core inflation as the movements in prices that contain information about future inflation, have you ever thought about applying partial least squares (PLS) rather than PCA for dimension reduction, making a future value of headline inflation the Y variable in the PLS decomposition of Y'X? Then you would get weightings that reflect the information content of each price series x for future Y, rather than PCA, which simply decomposes the variance within X'X.

Posted by: Michael Hugman | June 25, 2014 at 11:10 AM

This is very interesting. But I wonder, is it really possible to distinguish monetary inflation from cost-of-living inflation? As you say, monetary inflation reflects an imbalance between the supply and demand for money. Where does the demand for money come from? Presumably from the level of real activity. And how do we measure real activity independent of money, if not as a level of well-being?

In fact, the measurement of quantity in terms of well-being is the explicit basis of the hedonic price adjustments that go into a significant fraction of the CPI. So at the least, if you want a pure monetary measure of inflation, shouldn't you strip those adjustments back out?

Along the same lines, you say the inflation controlled by the central bank should be identical in New York and Cleveland. But what if monetary policy produces identical rates of money supply growth in both cities, while different real growth rates mean that money demand is growing faster in one place than the other?

Posted by: JW Mason | June 27, 2014 at 09:42 AM


April 10, 2014

Reasons for the Decline in Prime-Age Labor Force Participation

Editor's note: Since this post was written, we have developed new tools for examining labor market trends. For a more detailed examination of factors affecting labor force participation rates, please visit our Labor Force Participation Dynamics web page, where you can create your own charts and download data.

As a follow-up to this post on recent trends in labor force participation, we look specifically at the prime-age group of 25- to 54-year-olds. The participation decisions of this age cohort are less affected by the aging of the population and by the longer-term trend toward lower participation among youths resulting from rising school enrollment rates. In that sense, they give us a cleaner window on how participation responds to changing business cycle conditions.

The labor force participation rate of the prime-age group fell from 83 percent just before the Great Recession to 81 percent in 2013. The participation rate of prime-age males has been trending down since the 1960s. The participation rate of women, which had been rising for most of the post-World War II period, appears to have plateaued in the 1990s and has more recently shared the declining pattern of participation for prime-age men. But the decline in participation for both groups appears to have accelerated between 2007 and 2013 (see chart 1).


Using the monthly Current Population Survey, we look at the various reasons people cite for not participating in the labor force. These reasons give us some insight into the impact of changes in employment conditions since 2007 on labor force participation. The data on those not in the official labor force can be broken into two broad categories: those who say they don't currently want a job and those who say they do want a job but don't satisfy the active search criteria for being in the official labor force. Of the prime-age population not in the labor force, most say they don't currently want a job. At the end of 2007, about 15 percent of 25- to 54-year-olds said they didn't want a job, and slightly less than 2 percent said they did want a job. By the end of 2013, the don't-want-a-job share had reached nearly 17 percent, and the want-a-job share had risen to slightly above 2 percent (see chart 2).


Prime-Age Nonparticipation: Currently Want a Job
Most of the rise in the share of the prime-age population in the want-a-job category is due to so-called marginally attached individuals—they are available and want a job, have looked for a job in the past year, but haven't looked in the past four weeks—especially those who say they are not currently looking because they have become discouraged about job-finding prospects (see the blue and orange lines of chart 3). In 2013, there were about 1.1 million prime-age marginally attached individuals compared to 0.7 million in 2007, and the prime-age marginally attached accounted for about half of all marginally attached in the population.


The marginally attached are aptly named in the sense that they have a reasonably high propensity to reenter the labor force—more than 40 percent are in the labor force in the next month and more than 50 percent are in the labor force 12 months later (see chart 4). This macroblog post discusses what the relative stability in the flow rate from marginally attached to the labor force means for thinking about the amount of slack labor resources in the economy.


Prime-Age Nonparticipation: Currently Don't Want a Job
As chart 2 makes evident, the vast majority of the rise in prime-age nonparticipation since 2009 is due to the increase in those saying they do not currently want a job. The largest contributors to the increase are individuals who say they are too ill or disabled to work or who are in school or training (see the orange and blue lines in chart 5).


Those who say they don't want a job because they are disabled have a relatively low propensity to subsequently (re)enter the labor force. So if the trend of rising disability persists, it will put further downward pressure on prime-age participation. Those who say they don't currently want a job because they are in school or training have a much greater likelihood of (re)entering the labor force, although this tendency has declined slightly since 2007 (see chart 6).


Note that the number of people in the Current Population Survey citing disability as the reason for not currently wanting a job is not the same as either the number of people applying for or receiving social security disability insurance. However, a similar trend has been evident in overall disability insurance applications and enrollments (see here).

Some of the rise in the share of prime-age individuals who say they don't want a job could be linked to erosion of skills resulting from prolonged unemployment or permanent changes in the composition of demand (a different mix of skills and job descriptions). It is likely that the rise in share of prime-age individuals not currently wanting a job because they are in school or in training is partly a response to the perception of inadequate skills. The increase in recent years is evident across all ages until about age 50 but is especially strong among the youngest prime-age individuals (see chart 7).


But lack of required skills is not the only plausible explanation for the rise in the share of prime-age individuals who say they don't currently want a job. For instance, the increased incidence of disability is partly due to changes in the age distribution within the prime-age category. The share of the prime-age population between 50 and 54 years old—the tail of the baby boomer cohort—has increased significantly (see chart 8).


This increase is important because the incidence of reported disability within the prime-age population increases with age and has become more common in recent years, especially for those older than 45 (see chart 9).


The health of the labor market clearly affects the decision of prime-age individuals to enroll in school or training, apply for disability insurance, or stay home and take care of family. Discouragement over job prospects rose during the Great Recession, causing many unemployed people to drop out of the labor force. The rise in the number of prime-age marginally attached workers reflects this trend and can account for some of the decline in participation between 2007 and 2009.

But most of the postrecession rise in prime-age nonparticipation is from the people who say they don't currently want a job. How much does that increase reflect trends established well before the recession, and how much can be attributed to the recession and slow recovery? It's hard to say with much certainty. For example, participation by prime-age men has been on a secular decline for decades, but the pace accelerated after 2007—see here for more discussion.

Undoubtedly, some people will reenter the labor market as it strengthens further, especially those who left to undertake additional training. But for others, the prospect of not finding a satisfactory job will cause them to continue to stay out of the labor market. The increased incidence of disability reported among prime-age individuals suggests permanent detachment from the labor market and will put continued downward pressure on participation if the trend continues. The Bureau of Labor Statistics projects that the prime-age participation rate will stabilize around its 2013 level. Given all the contradictory factors in play, we think this projection should have a pretty wide confidence interval around it.

Note: All data shown are 12-month moving averages to emphasize persistent shifts in trends.

By Melinda Pitts, director, Center for Human Capital Studies;

John Robertson, a vice president and senior economist in the Atlanta Fed's research department; and

Ellyn Terry, a senior economic analyst in the Atlanta Fed's research department

April 10, 2014 in Business Cycles, Employment, Labor Markets | Permalink




Have you considered that a number of people will say they don't want a job because they have experienced repeated frustration in finding one? It's better for one's psyche to lie to yourself and to others about such things than to accept the fact that one has repeatedly been rejected...

Posted by: rjs | April 10, 2014 at 04:39 PM

Astonishing decline in male labor force participation since 1970s.

I would be interested to see a more detailed age breakdown than the single 25-54 category.

This is so we can see whether the decline over time is consistent across all ages, or whether it reflects particular cohorts whose lower participation persists through the remainder of their working lives.


Posted by: Jason | April 12, 2014 at 10:21 PM

Clearly there is nobody who is unemployed who does not want a job. This article is simply a deceptive representation of the facts. The problem is largely that employers will not hire qualified people unless they have done the exact same job before. They will not for example hire an Architect to work as a project manager at a company that manufactures windows, because the HR people use IT to scan the resumes in place of interviews and will only choose from the set of people who have been employed by manufacturers of windows in the past.

Do the survey and the research over again and ask the right questions. The problem more than likely is that the most qualified people are being overlooked, are frustrated because they can't cross over to a different industry, or are victims of age discrimination. You can be certain that most people want to have a job. So, dig deeper.

Posted by: Terry L. Walker, ARCHITECT | April 14, 2014 at 11:01 AM

I wonder if the trends would look the same if charted by age. Surely 54-year-olds are a bigger part of the group today.

Posted by: Floccina | January 13, 2017 at 03:33 PM
