The Atlanta Fed's macroblog provides commentary and analysis on economic topics including monetary policy, macroeconomic developments, inflation, labor economics, and financial issues.
June 24, 2019
Mapping the Financial Frontier at the Financial Markets Conference
The Atlanta Fed recently hosted its 24th annual Financial Markets Conference, whose theme was Mapping the Financial Frontier: What Does the Next Decade Hold? The conference addressed a variety of issues pertinent to the future of the financial system. Among the sessions touching on macroeconomics was a keynote speech on corporate debt by Federal Reserve Board chair Jerome Powell and another on revitalizing America by Massachusetts Institute of Technology (MIT) professor Simon Johnson. The conference also included a panel discussion of the Fed's plans for implementing monetary policy in the future. This macroblog post reviews these macroeconomic discussions. A companion Notes from the Vault post reviews conference sessions on blockchain technology, data privacy, and postcrisis developments in the markets for mortgage-backed securities.
Chair Powell's thoughts on corporate debt levels
Chair Powell's keynote speech focused on the risks posed by increases in corporate debt levels. In his speech, titled "Business Debt and Our Dynamic Financial System" (which you can watch or read), Powell began by observing that business debt levels have increased by a variety of measures, including the ratio of debt to gross domestic product and the ratio of debt to the book value of corporate assets. These higher debt ratios alone don't currently pose a problem because corporate profits are high and interest rates are low. Powell noted some reasons for concern, however, including the reduced average quality of investment-grade bonds, with more corporate debt concentrated at the lowest investment-grade rating—a phenomenon known as the "triple-B cliff."
Powell noted several differences between the recent increase in corporate debt and the increase in household debt prior to the 2007–09 crisis, differences that mitigate these risks. They include a more moderate rate of increase in corporate debt, the lack of a feedback loop from debt levels to asset prices, reduced leverage in the banking system, and less liquidity risk.
Powell concluded his remarks by saying that although business debt does pose a risk of amplifying a future downturn, it does not appear to pose "notable risks to financial stability." Finally, he noted that the Fed is working toward a more thorough understanding of the risks.
Simon Johnson on jumpstarting America
Simon Johnson started his keynote speech by discussing Amazon's search for a second headquarters city. The company received proposals from 238 cities across the country (and Canada). However, in the end, it selected two large metropolitan areas—New York and Washington, DC—that were already among the leaders in creating new tech jobs. Although many places around the country want growth in good jobs, he said the innovation economy is "drawn disproportionately to these few places."
Johnson's remedy for this disproportionate clustering is for the federal government to make a deliberate effort to encourage research and development in various technical areas at a number of research universities around the country. This proposal is based on his book with fellow MIT economist Jonathan Gruber. They argue that the proposal encourages "exactly what the U.S. did in the '40s, '50s, and '60s," which was to help the United States develop new technology to be used in World War II and the Cold War.
Johnson proposed that the funding for new technical projects be allocated through a nationwide competition that intentionally seeks to create new tech hubs. In making his case, Johnson observed that the view that "all the talent is just in six places is fundamentally wrong." Johnson said that he and his coauthor found 102 cities in 36 states that have a substantial proportion of college graduates and relatively low housing prices. Moreover, Johnson observed that existing tech centers' cost of living has become very high, and those cities have substantial political limits on their ability to sustain new population growth. If some of these 102 potential hubs received the funding to start research and provide capital to business, Johnson argued, overall growth in the United States could increase and be more evenly distributed.
Discussing the implementation of monetary policy
The backdrop for the session on monetary policy implementation was postcrisis developments in the Fed's approach to implementing monetary policy. As the Fed's emergency lending programs started to recede after the crisis, it started making large-scale investments in agency mortgage-backed securities and U.S. Treasuries. This program, widely (though somewhat misleadingly) called "quantitative easing," or QE, pumped additional liquidity into securities markets and played a role in lowering longer-term interest rates. As economic conditions improved, the Fed first started raising short-term rates and then adopted a plan to shrink its balance sheet starting in 2018. However, earlier this year, the Fed announced plans to stop shrinking the balance sheet in September if the economy performs as expected.
Julia Coronado, president of MacroPolicy Perspectives, led the discussion of the Fed's plans, and a large fraction of that discussion addressed its plans for the size of the balance sheet. Kevin Warsh, former Federal Reserve governor and currently a visiting fellow at Stanford University's Hoover Institution, provided some background information on the original rationale for QE, when many financial markets were still rather illiquid. However, he argued that those times were extraordinary and that "extraordinary tools are meant for extraordinary circumstances." He further expressed the concern that using QE at other times and for other reasons, such as in response to regulatory policy, would increase the risk of political involvement in monetary policy.
During the discussion, Chicago Fed president Charles Evans argued that QE is likely to remain a necessary part of the Fed's toolkit. He observed that slowing labor force growth, moderate productivity growth, and low inflation are likely to keep equilibrium short-term interest rates low. As a result, the Fed's ability to lower interest rates in a future recession is likely to remain constrained, meaning that balance sheet expansion will remain a necessary tool for economic stimulus.
Ethan Harris, head of global economics research at Bank of America Merrill Lynch, highlighted the potential stress the next downturn would place on the Fed. Harris observed that "other central banks have virtually no ammunition" to fight the next downturn, a reference to the negative policy rates and relatively larger balance sheets of some other major central banks. This dynamic prompted his question, "How is the Fed, on its own, going to fight the next crisis?"
The conference made clear the importance of the links between financial markets and the macroeconomy, and this blog post focused on just three of them. I encourage you to delve into the rest of the conference materials to see these and other important discussions.
January 16, 2019
I didn't coin the title of this blog post. It was the label on a chart of the Federal Reserve's balance sheet that appeared in an issue of The Wall Street Journal last week. I've led with this phrase because it does seem to capture some of the sentiment around what has become the elephant in the monetary policy room: Is the rundown in the size of the Fed's balance sheet causing an unanticipated, and unwarranted, tightening of monetary policy conditions? I think the answer to that question is "no." Let me explain why.
In June 2017, the Federal Open Market Committee (FOMC) determined that it was appropriate to begin the process of reducing the size of the Fed's balance sheet, which had more than quadrupled as a result of efforts to combat the financial crisis and support the subsequent recovery from a very deep recession.
As I noted in a speech last November, I see the Committee's strategy for shrinking the balance sheet as having two essential elements.
- First, the normalization process is designed to be gradual. It was phased in over the course of about a year and a half and is now subject to monthly caps so the run-down is not too rapid.
- Second, the normalization process is designed to be as predictable as possible. The schedule of security retirements was announced in advance so that uncertainty about the pace of normalization can be minimized. (In other words, "quantitative tightening" is decidedly not on the QT.) As a result, the normalization process also reduces complexity. Balance-sheet reduction has moved into the background so that ongoing policy adjustments can focus solely on the traditional interest-rate channel.
In his recent remarks at the annual meeting of the American Economic Association, Chairman Jerome Powell was very clear that the fairly mechanical balance-sheet strategy adopted by the FOMC thus far should not be interpreted as inflexibility in the conduct of monetary policy or an unwillingness to recognize that balance-sheet reduction is in fact monetary policy tightening.
I will speak for myself. Balance-sheet policy is an element of the monetary policy mix. The decision to adopt a relatively deterministic approach to balance-sheet reduction is not a decision to ignore the possibility that it has led or might lead to a somewhat more restrictive stance of monetary policy. It is a decision to make whatever adjustments are necessary through the Fed's primary interest-rate tools to the greatest extent possible.
I maintain that there is still wisdom in this approach. The effects of our interest-rate tools are much more familiar to both policymakers and markets than balance-sheet tools are. That, to my mind, makes them the superior instrument for reaching and maintaining our dual goals of stable inflation and maximum employment. It is my belief that reducing the number of moving pieces makes monetary policy more transparent and predictable, which enhances the Committee's capacity for a smooth transition toward those goals.
It should now be clear that nothing is written in stone. Whether the FOMC uses active interest-rate policy with passive balance-sheet policy or uses both instruments actively, policy decisions will ultimately be driven by the facts on the ground as best Committee members can judge, and by assessments of risks that surround those judgments.
In my own judgment, it is far from clear that the ongoing reduction in the balance sheet is having an outsized impact on the stance of monetary policy. I think it is widely accepted that one of the ways balance-sheet policies work is by affecting the term premia associated with holding longer-term securities. (There are many good discussions about balance-sheet mechanisms, including this one by Edison Yu, which can be found in the first quarter 2016 edition of the Philadelphia Fed's Economic Insights, or this article by Jane Ihrig, Elizabeth Klee, Canlin Li, Min Wei, and Joe Kachovec in the March 2018 issue of the International Journal of Central Banking.)
Lots of things can push term premia up and down. But one of the factors is the presumed willingness of the central bank to purchase long-term securities in scale—or not. The idea that running down the balance sheet tightens monetary policy is that, in so doing, the FOMC is removing a crucial measure of support to the bond market. This makes longer-term securities riskier by transferring more duration risk back to the market, which raises term premia and, all else equal, pushes rates higher.
Although estimating term premia is as much art as science, I don't think the evidence supports the argument that these premia have been materially rising as a result of our normalization process. The New York Fed publishes one well-known real-time estimate of the term premia associated with 10-year Treasury securities. But isolating and quantifying the effect of balance-sheet changes on term premia is challenging. It is possible that a number of factors, such as the continued high demand for U.S. Treasuries by financial institutions and a low inflation risk premium, might have dampened the independent effect of balance sheet run-off. But if the term premia channel is a critical piece of what makes balance-sheet policy work, I'm hard pressed to see much evidence of financial tightening via rising term premia in the data so far.
Lest anyone think I am overly influenced by one particular theory, I will emphasize that I am not taking anything for granted. In addition to my monitoring of developments on Main Street, I will be watching financial conditions and term premia as I assess the outlook for the economy. My view is that a patient approach to monetary policy adjustments in the coming year is fully warranted in light of the uncertainties about the state of the economy, about what level of policy rates is consistent with a neutral stance, and about the overall impact of balance-sheet normalization. This patience is one of the characteristics of what I mean by data dependence.
October 26, 2018
On Maximizing Employment, a Case for Caution
Over the past few months, I have been asked one question regularly: Why is the Fed removing monetary policy stimulus when there is little sign that inflation has run amok and threatens to undermine economic growth? This is a good question, and it speaks to a philosophy of how to maintain the stability of both economic performance and prices, which I view as important for the effective implementation of monetary policy.
In assessing the degree to which the Fed is achieving the congressional mandate of price stability, the Federal Open Market Committee (FOMC) identified 2 percent inflation in consumption prices as a benchmark—see here for more details. Based on currently available data, it seems that inflation is running close to this benchmark.
The Fed's other mandate from Congress is to foster maximum employment. A key metric for performance relative to that mandate is the official unemployment rate. So, when some people ask why the FOMC is reducing monetary policy stimulus in the absence of clear inflationary pressure, what they really might be thinking is, "Why doesn't the Fed just conduct monetary policy to help the unemployment rate go as low as physically possible? Isn't this by definition the representation of maximum employment?"
While this is indeed one definition of full employment, I think this is a somewhat short-sighted perspective that doesn't ultimately serve the economy and American workers well. One important reason for being skeptical of this view is our nation's past experience with "high-pressure" economic periods. High-pressure periods are typically defined as periods in which the unemployment rate falls below the so-called natural rate—using an estimate of the natural rate, such as the one produced by the Congressional Budget Office (CBO).
As the CBO defines it, the natural rate is "the unemployment rate that arises from all sources other than fluctuations in demand associated with business cycles." These "other sources" include frictions like the time it takes people to find a job or frictions that result from a mismatch between the set of skills workers currently possess and the set of skills employers want to find.
When the actual unemployment rate declines substantially below the natural rate—highlighted as the red areas in the following chart—the economy has moved into a "high-pressure period."
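The classification described above can be sketched in a few lines. The unemployment path and the natural-rate figure below are hypothetical, chosen only to illustrate the definition, not drawn from actual data:

```python
# Flag "high-pressure" periods: quarters in which the unemployment rate runs
# below the natural rate. All figures below are hypothetical, for illustration.

natural_rate = 4.5  # a CBO-style natural-rate estimate, in percent

unemployment = [5.0, 4.8, 4.4, 4.1, 3.9, 4.0, 4.6, 5.2]  # hypothetical quarterly path

# True marks a quarter inside a high-pressure period
high_pressure = [u < natural_rate for u in unemployment]
print(high_pressure)
# [False, False, True, True, True, True, False, False]
```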
For the purposes of this discussion, the important thing about high-pressure economies is that, virtually without exception, they are followed by a recession. Why? Well, as I described in a recent speech:
"One view is that it is because monetary policy tends to take on a much more 'muscular' stance—some might say too muscular—at the end of these high-pressure periods to combat rising nominal pressures.
"The other alternative is that the economy destabilizes when it pushes beyond its natural potential. These high-pressure periods lead to a buildup of competitive excesses, misdirected investment, and an inefficient allocation of societal resources. A recession naturally results and is needed to undo all the inefficiencies that have built up during the high-pressure period.
"Yet, some people suggest that deliberately running these high-pressure periods can improve outcomes for workers in communities who have been less attached to the labor market, such as minorities, those with lower incomes, and those living in rural communities. These workers have long had higher unemployment rates than other workers, and they are often the last to benefit from periods of extended economic growth.
"For example, the gap between the unemployment rates of minority and white workers narrows as recoveries endure. So, the argument goes, allowing the economy to run further and longer into these red areas on the chart provides a net benefit to these under-attached communities.
"But the key question isn't whether the high-pressure economy brings new people from disadvantaged groups into the labor market. Rather, the right question is whether these benefits are durable in the face of the recession that appears to inevitably follow.
"This question was explored in a research paper by Atlanta Fed economist Julie Hotchkiss and her research colleague Robert Moore. Unfortunately, they found that while workers in these aforementioned communities tend to experience greater benefits from these high-pressure periods, the pain and dislocation associated with the aftermath of the subsequent recession is just as significant, if not more so.
"Importantly, this research tells me we ought to guard against letting the economy slip too far into these high-pressure periods that ultimately impose heavy costs on many people across the economy. Facilitating a prolonged period of low—and sustainable—unemployment rates is a far more beneficial approach."
In short, I conclude that the pain inflicted from shifting from a high-pressure to a low-pressure economy is too great, and this tells me that it is important for the Fed to beware the potential for the economy overheating.
Formulating monetary policy would all be a lot easier, of course, if we were certain about the actual natural rate of unemployment. But we are not. The CBO has an estimate—currently 4.5 percent. The FOMC and other forecasters also produce projections of where they think the unemployment rate will settle over the longer run.
For my part, I estimate that the natural rate is closer to 4 percent, and given the current absence of accelerating inflationary pressures, we can't completely dismiss the possibility that the natural rate is even lower. Nonetheless, with the unemployment rate currently at 3.7 percent, it seems likely that we're at least at our full employment mandate.
So, what is this policymaker to do? Back to my speech:
"My thinking will be informed by the evolution of the incoming data and from what I'm able to glean from my business contacts. And while I wrestle with that choice, one thing seems clear: there is little reason to keep our foot on the gas pedal."
August 23, 2018
What Does the Current Slope of the Yield Curve Tell Us?
As I make the rounds throughout the Sixth District, one of the most common questions I get these days is how Federal Open Market Committee (FOMC) participants interpret the flattening of the yield curve. I, of course, do not speak for the FOMC, but as the minutes from recent meetings indicate, the Committee has indeed spent some time discussing various views on this topic. In this blog post, I'll share some of my thoughts on the framework I use for interpreting the yield curve and what I'll be watching. Of course, these are my views alone and do not reflect the views of any other Federal Reserve official.
Many observers see a downward-sloping, or "inverted," yield curve as a reliable predictor for a recession. Chart 1 shows that the yield curve's slope—specifically, the difference between the interest rates paid on 10-year and 2-year Treasury securities—is currently around 20 basis points. This is the lowest spread since the last recession.
The case for worrying about yield-curve flattening is apparent in the chart. The shaded bars represent recessionary periods. Both of the last two recessions were preceded by a flat (and, for a time, inverted) 10-year/2-year spread.
As we all know, however, correlation does not imply causality. This is a particularly important point to keep in mind when discussing the yield curve. As a set of market-determined interest rates, the yield curve not only reflects market participants' views about the evolution of the economy but also their views about the FOMC's likely reaction to that evolution and uncertainty around these and other relevant factors. In other words, the yield curve represents not one signal, but several. The big question is, can we pull these signals apart to help appropriately inform the calibration of policy?
We can begin to make sense of this question by noting that Treasury yields of any given maturity can be thought of as the sum of two fundamental components:
- An expected policy rate path over that maturity: the market's best guess about the FOMC's rate path over time and in response to the evolution of the economy.
- A term premium: an adjustment (relative to the path of the policy rate) that reflects additional compensation investors receive for bearing risk related to holding longer-term bonds.
Among other things, this premium may be related to two factors: (1) uncertainty about how the economy will evolve over that maturity and how the FOMC might respond to events as they unfold and (2) the influence of supply and demand factors for U.S. Treasuries in a global market.
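As a stylized illustration of this two-part decomposition (all numbers below are invented for the example, not market data):

```python
# Hypothetical decomposition of a 10-year Treasury yield into its two
# components: an expected policy rate path and a term premium.
# All figures are illustrative, not actual market data.

expected_policy_path = [2.40, 2.60, 2.75, 2.80, 2.80,
                        2.80, 2.75, 2.70, 2.70, 2.70]  # expected avg funds rate by year, %

# The market's best guess of the average short rate over the 10-year horizon
expected_rate_component = sum(expected_policy_path) / len(expected_policy_path)

term_premium = 0.10  # extra compensation for duration risk, in percentage points

ten_year_yield = expected_rate_component + term_premium
print(f"expected-rate component: {expected_rate_component:.2f}%")
print(f"term premium:            {term_premium:.2f}%")
print(f"implied 10-year yield:   {ten_year_yield:.2f}%")
```

With these invented numbers the yield works out to 2.80 percent; a lower term premium, all else equal, would flatten the curve exactly as the text describes.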
Let's apply this framework to the current yield curve. As several of my colleagues (including Fed governor Lael Brainard) have noted, the term premium is currently quite low. All else equal, this would result in lower long-term rates and a flatter yield curve. The term premium bears watching, but it is unclear that movements in the premium reflect particular concerns about the course of the economy.
I tend to focus on the other component: the expected path of policy. When we ask whether a flattening yield curve is a cause for concern, what we are really asking is: does the market expect an economic slowdown that will require the FOMC to reverse course and lower rates in the near future?
The eurodollar futures market shows us one measure of the market's expectation for the policy rate path. These derivative contracts are quoted in terms of a three-month rate that closely follows the FOMC's policy rate, which makes them well-suited for this kind of analysis. (Some technical details regarding this market can be found in a 2016 issue of the Atlanta Fed's "Notes from the Vault.")
Chart 2 illustrates the current estimate of the market's expected policy rate path. Read simply, the market appears to be forecasting continuing policy rate increases through 2020, and there is no evidence of a market forecast that the FOMC will need to reverse course in the medium term. However, the market's expected level of the policy rate is lower than the medians of the FOMC's June Summary of Economic Projections (SEP) for 2019 and 2020.
Once we get past 2020, the market's expected policy path flattens. I read this as evidence that market participants overall expect a very gradual pace of tightening as the most likely outcome over the next two years. Interestingly, the market appears to expect a slower pace of tightening than the pace that at least some members of the FOMC currently view as "appropriate" as represented in their SEP submissions.
For this measure, I find the short-term perspective most informative. As one looks further into the future, the range of possible outcomes widens, as many of the factors that influence the economy can evolve and interact in many ways. Thus, the precision of any signal the market is providing about policy expectations—if indeed there is any signal at all—is likely to be quite low.
With this information in mind, I do not interpret the yield curve as indicating that the market believes the evolution of the economy will cause the FOMC to lower rates in the foreseeable future. This interpretation is consistent with my own economic forecast, gleaned from macroeconomic data and a robust set of conversations with businesses both large and small. My modal outlook is for expansion to continue at an above-trend pace for the next several quarters, and I see the risks to that projection as balanced. Yes, there are downside risks, chief among them the effects of (and uncertainty about) trade policy. But those risks are countered by the potential for recent fiscal stimulus to have a much more transformative impact on the economy than I've marked into my baseline outlook.
I believe the yield curve gives us important and useful information about market participants' forecasts. But it is only one signal among many that we use for the complex task of forecasting growth in the U.S. economy. As the economy evolves, I will be assessing the response of the yield curve to incoming data and policy decisions along the lines I've laid out here, incorporating market signals along with a constellation of other information to achieve the FOMC's dual objectives of price stability and maximum employment.
April 02, 2018
Thoughts on a Long-Run Monetary Policy Framework, Part 4: Flexible Price-Level Targeting in the Big Picture
In the second post of this series, I enumerated several alternative monetary policy frameworks. Each is motivated by a recognition that the Federal Open Market Committee (FOMC) is likely to confront future scenarios where the effective lower bound on policy rates comes into play. Given such a possibility, it is important to consider the robustness of the framework.
My previous macroblog posts have focused on one of these frameworks: price-level targeting of a particular sort. As I hinted in the part 3 post, I view the specific framework I have in mind as a complement to, and not a substitute for, many of the other proposals that are likely to be considered. In this final post on the topic, I want to expand on that thought, considering in turn the options listed in part 2.
- Raising the FOMC's longer-run inflation target
The framework I described in part 3 was constructed to be consistent with the FOMC's current long-run objective of 2 percent inflation. But nothing in the structure of the plan I discussed would bind the Committee to the 2 percent objective. Obviously, a price-level target line can be constructed for any path that policymakers choose. The key is to have such a target and coherently manage monetary policy so that it achieves that target. The slope of the price-level path—that is, the underlying long-run inflation rate—is an entirely separate issue.
- Maintaining the 2 percent longer-run inflation target and policy framework more or less as is, relying on unconventional tools when needed
As noted, the flexible price-level targeting example I discussed in part 3 was constructed with a long-run 2 percent inflation rate as the key benchmark. In that regard, it is clearly consistent with the Fed's current inflation goal.
Further, a central question in the current framework is how to interpret a goal of 2 percent inflation in the longer run. One interpretation is that the central bank aims to deliver an inflation rate that averages 2 percent over some period of time. Another interpretation is that the central bank aims to deliver an inflation rate that tends toward 2 percent, letting bygones be bygones in the event that realized inflation rates deviate from 2 percent.
The bounded price-level targets I have presented do not force a particular answer to the question I raise, and both views can be supported within the framework. Hence, the framework is consistent with whichever view the FOMC might adopt. The only caveat is that deviations from 2 percent cannot be so large and persistent that they push the price level outside the target bounds.

As to the problem of the federal funds rate falling to a level that makes further cuts infeasible, nothing in the notion of a price-level target rules out (or demands) any particular policy tool. If anything, bounded price-level targets could expand the existing toolkit. They certainly do not constrain it.
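The difference between the two interpretations can be made concrete with a stylized sketch. The two-year averaging window and the 1 percent realized inflation rate below are illustrative assumptions, not anything the FOMC has specified:

```python
# Two readings of a 2 percent longer-run inflation goal, after a year in
# which inflation came in at 1 percent. All numbers are illustrative.

target = 0.02
realized_last_year = 0.01

# "Bygones" reading: past misses are ignored; the aim is simply 2 percent again.
bygones_target = target

# "Average" (makeup) reading over a two-year window: next year's inflation
# must bring the two-year average back to 2 percent.
makeup_target = 2 * target - realized_last_year

print(f"bygones target: {bygones_target:.1%}")  # 2.0%
print(f"makeup target:  {makeup_target:.1%}")   # 3.0%
```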
- Targeting nominal gross domestic product (GDP) growth
Targeting nominal GDP growth, which is the sum of real GDP growth and the inflation rate, represents a deviation from the price-level targeting I have described. In this framework, the longer-run rate of inflation depends on the longer-run rate of real GDP growth.
To see how this works, consider the period from 2003 to 2013. In 2003, the Congressional Budget Office projected an average annual potential GDP growth rate of 2.9 percent over the next 10 years. Had there been a nominal GDP growth target of 5 percent at this time, the implicit annualized inflation target would have been just over 2 percent. However, current CBO estimates indicate that actual potential GDP growth over this period averaged just 1.5 percent, which would suggest an inflation target of 3.5 percent. As data came in and policymakers saw this lower level of growth, they would have responded by shifting upward the implicit inflation target.
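The arithmetic behind this example is simple enough to sketch directly, using the CBO figures quoted above:

```python
# Implicit inflation target under a nominal GDP growth target:
#   inflation target = nominal GDP growth target - potential real GDP growth
# The figures below are the CBO numbers cited in the text.

nominal_gdp_target = 5.0  # percent per year (hypothetical target from the text)

projected_potential_growth_2003 = 2.9  # CBO's 2003 ten-year projection, percent
realized_potential_growth = 1.5        # CBO's current estimate for 2003-2013, percent

implicit_target_ex_ante = nominal_gdp_target - projected_potential_growth_2003
implicit_target_ex_post = nominal_gdp_target - realized_potential_growth

print(f"implicit inflation target (2003 projection): {implicit_target_ex_ante:.1f}%")  # 2.1%
print(f"implicit inflation target (realized growth): {implicit_target_ex_post:.1f}%")  # 3.5%
```

The gap between the two lines is exactly the upward shift in the implicit inflation target that the text describes.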
For advocates of using a nominal GDP target, shifting inflation targets is a key feature and not a bug, as it allows policy to adjust in real time to unforeseen cyclical and structural developments. What nominal GDP targeting doesn't satisfy is the principle of bounded nominal uncertainty. Eventually, price-level bounds that are set with an assumed potential real growth path will be violated if shifts in potential growth are sufficiently large. The appeal of nominal GDP targeting depends on how one weighs the benefits of inflation-target flexibility against the costs of price-level uncertainty inherent in that framework.
- Adopting flexible inflation targets that are adjusted based on economic conditions
Recently, my colleague Eric Rosengren, president of the Boston Fed, offered a proposal (here and here) that has some of the flavor of nominal GDP targeting but differs in important respects. Like nominal GDP targeting, President Rosengren's framework would adjust the target inflation rate given structural shifts in the economy. However, if I understand his idea correctly, the FOMC would deliberate specifically on the desired rate of inflation and adjust the target within a predetermined range.
The use of a predetermined range for the target opens the possibility of compatibility between President Rosengren's framework and the one I presented. Policymakers could use price-level targeting concepts in developing a range of policy options given the state of the economy. The breadth of the range of options would depend on the bounds the FOMC felt represented an acceptable degree of price-level uncertainty.
Summing all of this up, then—to me, the important characteristic of a sound monetary policy framework is that it provides a credible nominal anchor while maintaining flexibility to address changing circumstances. I think some form of flexible price-level targeting can be a part of such a framework. I look forward to a robust and constructive debate.
March 28, 2018
Thoughts on a Long-Run Monetary Policy Framework, Part 3: An Example of Flexible Price-Level Targeting
I want to start my discussion in this post with two points I made in the previous two macroblog posts (here and here). First, I think a commitment to delivering a relatively predictable price-level path is a desirable feature of a well-constructed monetary framework. Price stability is in my view achieved if people can have confidence that the purchasing power of the dollars they hold today will fall within a certain range at any date in the future.
My second point was that, as a matter of fact, the Federal Open Market Committee (FOMC) delivered on this definition of price stability during the years 1995–2012. (The FOMC formally adopted its 2 percent long-run inflation target in 2012.)
If you are reading this blog, you're almost certainly aware that since 2012, the actual personal consumption expenditures (PCE) inflation rate has persistently fallen short of the 2 percent goal. That, of course, means that the price level has fallen increasingly short of a reference 2 percent path, as shown in chart 1 below.
Is this deviation from the price-level path a problem? The practical answer to that question will depend on how my proposed definition of price stability is implemented.
By way of example, let's suppose that the FOMC commits to conducting monetary policy in such a way that the price level will always fall within plus-or-minus 5 percent of the long-run target path (which itself we define as the path implied by a constant 2 percent inflation rate). This policy—and how it relates to the actual path of PCE price inflation—is illustrated in chart 2.
So would inflation falling short of the 2 percent longer-run goal be a problem if the Fed was operating within the framework depicted in chart 2? In a sense, the answer is no. The current price level would be within the bounds of a hypothetical commitment made in 1995. If the central bank could perpetually deliver 2 percent annual inflation, that promise would remain intact, as shown in chart 3.
Of course, chart 3 depicts a forward path for prices whose margin for error is quite slim. Continued inflation below 2 percent would, in short order, push the price level below the lower bound, likely requiring a relatively accommodative monetary policy stance—that is, if policymakers sought to satisfy a commitment to this framework's definition of price stability.
Central bankers in risk management mode might opt for policies designed to deliberately move the price level toward the 2 percent average inflation midpoint in cases where the price level moves too close for the Committee's comfort to one of the bounds (as, perhaps, in chart 3). It bears noting that in such cases there are a wide range of options available to policymakers with respect to the timing and pace of that adjustment.
This scenario illustrates the flexibility of the price-level targeting framework I'm describing. I think it's important to think in terms of gradual adjustments that don't risk whipsawing the economy or force the central bank to be overly precise in its short-run influence on inflation and economic activity. A key feature of such a policy framework includes considerable short- and medium-run flexibility in inflation outcomes.
But the other key feature is that the framework limits that same flexibility—that is, it satisfies the principle of bounded nominal uncertainty. Suppose you and another person agree that you will receive a $1 payment in 10 years in exchange for a service provided today. If the inflation rate over this 10-year period is exactly 2 percent per year, then the real value of that dollar in goods and services would be 82 cents.
In my example (the one with a plus-or-minus 5 percent bound on the price level), monetary policymakers have essentially committed that the agreed-upon payment would not result in real purchasing power of less than 78 cents (and the payer could be confident that the real purchasing power relinquished would not be more than 86 cents).
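The arithmetic behind those figures can be checked in a few lines. This is a minimal sketch using the assumptions from the example above (a 2 percent target path, a 10-year horizon, and a plus-or-minus 5 percent band on the price level):

```python
# Real purchasing power of a $1 payment received in 10 years under a
# price-level target of 2 percent growth with a plus-or-minus 5 percent band.
years = 10
target_inflation = 0.02

# Price level after 10 years on the exact 2 percent path (base level = 1)
midpoint_price_level = (1 + target_inflation) ** years

# Real value of the $1 payment at the midpoint and at each bound
real_value_mid = 1 / midpoint_price_level               # exact 2 percent path
real_value_low = 1 / (midpoint_price_level * 1.05)      # prices at upper bound
real_value_high = 1 / (midpoint_price_level * 0.95)     # prices at lower bound

print(f"midpoint:   {real_value_mid:.2f}")   # about 0.82
print(f"worst case: {real_value_low:.2f}")   # about 0.78
print(f"best case:  {real_value_high:.2f}")  # about 0.86
```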
The crux of my argument is that a "good" monetary policy framework limits the degree of uncertainty associated with contracts involving transfers of dollars over time. In limiting uncertainty, monetary policy contributes to economic efficiency.
The 5 percent bound I chose for my illustration is obviously arbitrary. The magnitude of the acceptable deviations from the price-level path would be a policy decision. I'm not sure we know a whole lot about what range of deviations from an expected price path contributes most consistently to economic efficiency. A benefit of the framework I am describing is that it would focus research, discussion, and debate squarely on that question.
This series of posts is going on hiatus for a few days. Tomorrow, the Atlanta Fed is going to release its 2017 Annual Report, and I certainly don't want to steal its thunder. And Friday, of course, will begin the Easter weekend for many people.
But I want to conclude this post by emphasizing that the framework I am describing is more of a refinement of, and not a competitor to, many of the framework proposals I discussed in Monday's post. This is an important point and one that I will turn to in the final installment of this series, to be published next Monday.
March 27, 2018
Thoughts on a Long-Run Monetary Policy Framework, Part 2: The Principle of Bounded Nominal Uncertainty
In yesterday's macroblog post, I discussed one of the central monetary policy questions of the day: Is the possibility of hitting the lower bound on policy rates likely to be an issue for the Fed going forward, do we care, and—if we do—what can we do about it?
The answers to the first two questions are, in my opinion, yes and yes. That's the easy part. The last question—what can we do about it?—is the hard part. In the end, this is a question about the framework for conducting monetary policy. The menu of options includes:
- Raising the Federal Open Market Committee's (FOMC) longer-run inflation target;
- Maintaining the current policy framework, including the 2 percent longer-run inflation target, relying on unconventional tools when needed;
- Targeting the growth rate of nominal gross domestic product;
- Adopting an inflation range with flexible inflation targets that are adjusted based on the state of the economy (a relatively recent entry to the list suggested by Boston Fed president Eric Rosengren);
- Price-level targeting.
Chicago Fed president Charles Evans, San Francisco Fed president John Williams, and former Federal Reserve chairman Ben Bernanke, among others, have advocated for some version of the last item on this list of options. I am going to add myself to the list of people sympathetic to a policy framework that has a form of price-level targeting at its center.
I'll explain my sympathies by discussing principles that are central to my thinking.
First, I think the Fed's commitment to the long-run 2 percent inflation objective has served the country well. I recognize that the word “commitment” in that sentence might be more important than the specific 2 percent target value. But credibility and commitment imply objectives that, though not immutable, rarely change—and then only with a clear consensus on a better course. With respect to changing the 2 percent objective as a longer-run goal, my feet are not set in concrete, but they are in pretty thick mud.
Second, former Fed chairman Alan Greenspan offered a well-known definition of what it means for a central bank to succeed on a charge to deliver price stability. Paraphrasing, Chairman Greenspan suggested that the goal of price stability is met when households and business ignore inflation when making key economic decisions that affect their financial futures.
I agree with the Greenspan definition, and I believe that the 2 percent inflation objective has helped us meet that criterion. But I don't think we have met the Greenspan definition of price stability solely because 2 percent is a sufficiently low rate of inflation. I think it is also critical that deviations of prices away from a path implied by an average inflation rate of 2 percent have, in the United States, been relatively small.
Here's how I see it: until recently, the 2 percent inflation objective in the United States has essentially functioned as a price-level target centered on a 2 percent growth path. The orange line in the chart below shows what a price-level path of 2 percent growth would have been over the period from 1995 to 2012. I chose to begin with 1995 because that year arguably marked the start of the Fed's era of inflation targeting. Why does the chart end in 2012? I'll get to that tomorrow, when I lay out a specific hypothetical plan.
The green line in the chart is the actual path of the price level, as measured by the price index for personal consumption expenditures. The chart explains what I mean when I say the FOMC effectively delivered on a 2 percent price-level target. Over the period depicted in this chart, the price level did not deviate much from the 2 percent path.
I believe the inflation outcome apparent in the chart is highly desirable. Why? Because the resulting price-level path satisfies what I will call the “principle of bounded nominal uncertainty.” In essence, the principle of bounded nominal uncertainty means that if you save a dollar today you can be “reasonably confident” about what the real value of that saving will be in the future.
For example, suppose that in January 1995 you had socked away $1 in cash that you intended to spend exactly five years later. If you believed that the Fed was going to deliver an average annual inflation rate of 2 percent over this period, you'd expect that dollar to be worth about 90 cents in real purchasing power by January 2000. (Recall that cash depreciates at the rate of inflation—I didn't say this was the best way to save!)
In fact, because the price level's realized path over that time hewed very closely to the expected 2 percent growth path, the actual value of the dollar you saved would have been very close to the 90 cents you expected. And this, I think, epitomizes a reasonable definition of price stability. If you and I enter into a contract to exchange a dollar at some future date, we can confidently predict within some range that dollar's purchasing power. Good monetary policy, in my view, will satisfy the principle of bounded nominal uncertainty.
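The back-of-the-envelope calculation behind that 90-cent figure is simply discounting by five years of 2 percent inflation:

```python
# Expected real purchasing power of $1 in cash held from January 1995 to
# January 2000, assuming inflation runs at exactly 2 percent per year.
expected_value = 1 / (1.02 ** 5)
print(f"{expected_value:.2f}")  # roughly 0.91, i.e., about 90 cents
```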
This is the starting point of my thinking about a useful monetary policy framework—and how I think about price-level targeting generally. Tomorrow, I will expand on this thought and offer a specific example of how a price-level target might be put into operation in a way that is both flexible and respectful of the principle of bounded nominal uncertainty.
March 26, 2018
Thoughts on a Long-Run Monetary Policy Framework: Framing the Question
"Should the Fed stick with the 2 percent inflation target or rethink it?" This was the very good question posed at a special conference hosted by the Brookings Institution this past January. Over the course of roughly two decades prior to the global financial crisis, a consensus had formed among monetary-policy experts and practitioners the world over that something like 2 percent is an appropriate goal—maybe even the optimal goal—for central banks to pursue. So why reconsider that target now?
The answer to that question starts with another consensus that has emerged in the aftermath of the global financial crisis. In particular, there is now a widespread belief that, once monetary policy has fully normalized, the federal funds rate—the Federal Open Market Committee's (FOMC) reference policy rate—will settle significantly below historical norms.
Several of my colleagues have spoken cogently about this phenomenon, which is often cast in terms of concepts like r-star, the natural rate of interest, the equilibrium rate of interest, or (in the case of my colleague Jim Bullard), r-dagger. I like to think in terms of the "neutral" rate of interest; that is, the level of the policy rate consistent with the FOMC meeting its longer-run goals of price stability and maximum sustainable growth. In other words, the level of the federal funds rate should be consistent with 2 percent inflation, the unemployment rate at its sustainable level, and real gross domestic product at its potential.
Estimates of the neutral policy rate are subject to imprecision and debate. But a reasonable notion can be gleaned from the range of projections for the long-run federal funds rate reported in the Summary of Economic Projections (SEP) released just after last week's FOMC meeting. According to the latest SEP, neutral would be in a range of 2.3 to 3.0 percent.
For some historical context, in the latter half of the 1990s, as the 2 percent inflation consensus was solidifying, the neutral federal funds rate would have been pegged in a range of something like 4.0 to 5.0 percent, roughly 2 percentage points higher than the range considered to be neutral today.
The implication for monetary policy is clear. If interest rates settle at levels that are historically low, policymakers will have limited scope for cutting rates in the event of a significant economic downturn (or at least more limited scope than they had in the past). I think it's fair to say that even relatively modest downturns are likely to yield policy reactions that drive the federal funds rate to zero, as happened in the Great Recession.
My view is that the nontraditional tools deployed after December 2008, when the federal funds rate effectively fell to zero, were effective. But it is accurate to say that our experience with these tools is limited, and the effectiveness of those tools remains controversial. I join the opinion that, all else equal, it would be vastly preferable to conduct monetary policy through the time-tested approach of raising and lowering short-term policy rates, if such an approach is available.
This point is where the challenge to the 2 percent inflation target enters the picture. The neutral rate I have been describing is a nominal rate. It is roughly the sum of an inflation-adjusted real rate—determined by fundamental saving and investment decisions in the global economy—and the rate of inflation. The downward drift in the neutral rate I have been describing is attributable to a downward drift in the inflation-adjusted real rate. A great deal of research has documented this phenomenon, such as some influential research by San Francisco Fed president John Williams and Thomas Laubach, the head of the monetary division at the Fed's Board of Governors.
In the long run, a central bank cannot reliably control the real rate of interest. So if we accept the following premises...
- A neutral rate that is too low to give the central bank enough room to fight even run-of-the-mill downturns is problematic;
- Cutting rates is the optimal strategy for addressing downturns; and
- The real interest rate is beyond the control of the central bank in the long run
...then we must necessarily accept that raising the neutral rate, thus affording monetary policymakers the desired rate-cutting scope when needed, would require raising the long-run inflation rate. Hence the argument for rethinking the Fed's 2 percent inflation target.
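The arithmetic of that conclusion is the Fisher decomposition described above: the neutral nominal rate is roughly the long-run real rate plus the inflation rate, and the room to cut before hitting zero equals the neutral rate itself. A stylized sketch (the 0.5 percent real-rate value here is hypothetical, chosen only for illustration):

```python
# Stylized Fisher decomposition: neutral nominal rate = real rate + inflation.
# With a zero lower bound, the rate-cutting room available in a downturn
# equals the neutral rate, so a higher inflation target buys more room.
def rate_cutting_room(real_rate, inflation_target):
    neutral_rate = real_rate + inflation_target
    return neutral_rate  # percentage points of cuts available from neutral

# Hypothetical long-run real rate of 0.5 percent:
print(rate_cutting_room(0.5, 2.0))  # 2.5 points of room at a 2 percent target
print(rate_cutting_room(0.5, 3.0))  # 3.5 points if the target were raised to 3
```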
But is that the only option? And is it the best option?
The answer to the first question is clearly no. The purpose of the Brookings Institution sessions was to address the pros and cons of the different strategies for dealing with the low neutral rate problem, and I commend them to you. But in upcoming macroblog posts, I want to share some of my thoughts on the second question.
Tomorrow, I will review some of the proposed options and explain why I am attracted to one in particular: price-level targeting. On Wednesday, I will propose what I think is a potentially useful model for implementing a price-level targeting scheme in practice. I want to emphasize that these are preliminary thoughts, offered in the spirit of stimulating the conversation and debate. I welcome that conversation and debate and look forward to making my contribution to moving it forward.
September 08, 2017
When Health Insurance and Its Financial Cushion Disappear
Personal health care costs can skyrocket with a new diagnosis or accident, often leading to catastrophic financial costs for people. Health insurance plays an important role in protecting individuals from unexpected large financial shocks as a result of adverse health events. Just as homeowner's insurance helps protect you from financial devastation if your house burns down, health insurance helps protect you from burning through your savings because of a heart attack. This 2008 report from the Commonwealth Fund shows that the uninsured are far more likely to have to use their savings and reduce other types of spending to pay medical bills.
Much research has been done on the impact of health insurance on financial and health outcomes. (This paper, for example, summarizes the history and impact of Medicaid.) However, most of the studies look at the case of individuals who are gaining health insurance. In a recent Atlanta Fed working paper and the related podcast episode, we measure the impact of losing public health insurance on measures of financial well-being such as credit scores, delinquent debt eligible to be sent to debt collectors, and bankruptcies. We performed these measurements by studying the case of Tennessee's Medicaid program, known as TennCare, in the mid-2000s. At that time, a large statewide Medicaid expansion that began in the 1990s ran into financial difficulties and was scaled back. As the following chart shows, some 170,000 individuals were removed from TennCare rolls between 2005 and 2006.
Our analysis of this episode, using data from the New York Fed's Consumer Credit Panel/Equifax, revealed some striking findings. Individuals who lost health insurance experienced lower credit scores, more debt eligible to be sent to collections, and a higher incidence of bankruptcy. Those who were already financially vulnerable suffered the worst. In particular, individuals who already had poor credit, as measured by Fannie Mae's lowest creditworthiness categories, and then lost Medicaid saw their credit scores fall by close to 40 points on average and were almost 17 percent more likely to have their debt sent to collection agencies. Our analysis also finds that gaining or losing health insurance is not symmetric in its impact—losing insurance has larger negative financial effects than the positive financial impacts of gaining insurance.
Our results provide evidence that losing Medicaid coverage not only removes inexpensive access to health care but also eliminates an important layer of financial protection. A cost-benefit analysis of proposed cuts to Medicaid coverage (see here, here, and here for a discussion of recent legislative efforts in the U.S. Congress) would need to consider the negative financial consequences for individuals of the type that we have identified.
September 07, 2017
What Is the "Right" Policy Rate?
What is the right monetary policy rate? The Cleveland Fed, via Michael Derby in the Wall Street Journal, provides one answer—or rather, one set of answers:
The various flavors of monetary policy rules now out there offer formulas that suggest an ideal setting for policy based on economic variables. The best known of these is the Taylor Rule, named for Stanford University's John Taylor, its author. Economists have produced numerous variations on the Taylor Rule that don't always offer a similar story...
There is no agreement in the research literature on a single "best" rule, and different rules can sometimes generate very different values for the federal funds rate, both for the present and for the future, the Cleveland Fed said. Looking across multiple economic forecasts helps to capture some of the uncertainty surrounding the economic outlook and, by extension, monetary policy prospects.
Agreed, and this is the philosophy behind both the Cleveland Fed's calculations based on Seven Simple Monetary Policy Rules and our own Taylor Rule Utility. These two tools complement one another nicely: Cleveland's version emphasizes forecasts for the federal funds rate over different rules and Atlanta's utility focuses on the current setting of the rate over a (different, but overlapping) set of rules for a variety of the key variables that appear in the Taylor Rule (namely, the resource gap, the inflation gap, and the "neutral" policy rate). We update the Taylor Rule Utility twice a month after Consumer Price Index and Personal Income and Outlays reports and use a variety of survey- and model-based nowcasts to fill in yet-to-be released source data for the latest quarter.
We're introducing an enhancement to our Taylor Rule utility page, a "heatmap" that allows the construction of a color-coded view of Taylor Rule prescriptions (relative to a selected benchmark) for five different measures of the resource gap and five different measures of the neutral policy rate. We find the heatmap is a useful way to quickly compare the actual fed funds rate with current prescriptions for the rate from a relatively large number of rules.
In constructing the heatmap, users have options on measuring the inflation gap and setting the value of the "smoothing parameter" in the policy rule, as well as establishing the weight placed on the resource gap and the benchmark against which the policy rule is compared. (The inflation gap is the difference between actual inflation and the Federal Open Market Committee's 2 percent longer-term objective. The smoothing parameter is the degree to which the rule is inertial, meaning that it puts weight on maintaining the fed funds rate at its previous value.)
For example, assume we (a) measure inflation using the four-quarter change in the core personal consumption expenditures price index; (b) put a weight of 1 on the resource gap (that is, specify the rule so that a percentage point change in the resource gap implies a 1 percentage point change in the rule's prescribed rate); and (c) specify that the policy rule is not inertial (that is, it places no weight on last period's policy rate). Below is the heatmap corresponding to this policy rule specification, comparing the rule's prescription to the current midpoint of the fed funds rate target range:
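A rule with these settings can be sketched in a few lines. This is a simplified illustration, not the Taylor Rule Utility's actual code: it uses the original Taylor Rule's 0.5 coefficient on the inflation gap, and the input values in the usage example are hypothetical, chosen only to resemble the ranges discussed in this post.

```python
def taylor_rule_rate(inflation, resource_gap, r_star,
                     inflation_target=2.0, gap_weight=1.0,
                     smoothing=0.0, prev_rate=None):
    """Prescribed policy rate from a generalized Taylor rule (all in percent).

    With smoothing > 0 the rule is inertial, putting weight on last
    period's rate (prev_rate must then be supplied).
    """
    prescription = (r_star + inflation
                    + 0.5 * (inflation - inflation_target)  # inflation gap term
                    + gap_weight * resource_gap)             # resource gap term
    if smoothing > 0.0:
        prescription = smoothing * prev_rate + (1 - smoothing) * prescription
    return prescription

# Hypothetical inputs: core PCE inflation of 1.4 percent, a resource gap of
# -0.5, and a Laubach-Williams-style r* of -0.22; non-inertial, gap weight 1.
print(taylor_rule_rate(inflation=1.4, resource_gap=-0.5, r_star=-0.22))
```

Varying `r_star` and `resource_gap` across the rows and columns of the heatmap is what generates the spread of prescriptions the post describes.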
We should note that all of the terms in the heatmap are described in detail in the "Overview of Data" and "Detailed Description of Data" tabs on the Taylor Rule Utility page. In short, U-3 (the standard unemployment rate) and U-6 are measures of labor underutilization defined here. We introduced ZPOP, the utilization-to-population ratio, in this macroblog post. "Emp-Pop" is the employment-population ratio. The natural (real) interest rate is denoted by r*. The abbreviations for the last three row labels denote estimates of r* from Kathryn Holston, Thomas Laubach, and John C. Williams, Thomas Laubach and John C. Williams, and Thomas Lubik and Christian Matthes.
The color coding (described on the webpage) should be somewhat intuitive. Shades of red mean the midpoint of the current policy rate range is at least 25 basis points above the rule prescription, shades of green mean that the midpoint is more than 25 basis points below the prescription, and shades of white mean the midpoint is within 25 basis points of the rule.
The heatmap above has "variations on the Taylor Rule that don't always offer a similar story" because the colors range from a shade of red to shades of green. But certain themes do emerge. If, for example, you believe that the neutral real rate of interest is quite low (the Laubach-Williams and Lubik-Matthes estimates in the bottom two rows are −0.22 and −0.06), your belief about the magnitude of the resource gap would be critical to determining whether this particular rule suggests that the policy rate is already too high, has a bit more room to increase, or is just about right. On the other hand, if you are an adherent of the original Taylor Rule and its assumption of a long-run neutral rate of 2 percent (the top row of the chart), there isn't much ambiguity to the conclusion that the current rate is well below what the rule indicates.
"[D]ifferent rules can sometimes generate very different values for the federal funds rate, both for the present and for the future." Indeed.