macroblog

About


The Atlanta Fed's macroblog provides commentary and analysis on economic topics including monetary policy, macroeconomic developments, inflation, labor economics, and financial issues.

Authors for macroblog are Dave Altig, John Robertson, and other Atlanta Fed economists and researchers.


September 08, 2016


Introducing the Atlanta Fed's Taylor Rule Utility

Simplicity isn't always a virtue, but when it comes to complex decision-making processes—for example, a central bank setting a policy rate—having simple benchmarks is often helpful. As students and observers of monetary policy well know, the common currency in the central banking world is the so-called "Taylor rule."

The Taylor rule is an equation introduced by John Taylor in a seminal 1993 paper that prescribes a value for the federal funds rate—the interest rate targeted by the Federal Open Market Committee (FOMC)—based on readings of inflation and the output gap. The output gap measures the percentage point difference between real gross domestic product (GDP) and an estimate of its trend or potential.
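
The original rule can be stated compactly: the prescribed funds rate equals the natural real rate plus current inflation, plus half the inflation gap and half the output gap. Here is a minimal Python sketch (the function name and interface are ours; the 2 percent values and 0.5 weights are those in Taylor's 1993 paper, and the adjustable gap weight accommodates variants that weight the output gap differently):

```python
def taylor_rule(inflation, output_gap, r_star=2.0, inflation_target=2.0, gap_weight=0.5):
    """Prescribed federal funds rate (percent) from a Taylor-type rule.

    Defaults follow Taylor (1993): a 2 percent natural real rate (r*),
    a 2 percent inflation target, and a 0.5 weight on each gap.
    """
    return (r_star + inflation
            + 0.5 * (inflation - inflation_target)
            + gap_weight * output_gap)

# Taylor's own illustration: with inflation at 4 percent and output at
# potential, the rule prescribes a 7 percent funds rate.
prescription = taylor_rule(inflation=4.0, output_gap=0.0)  # 7.0
```

Setting gap_weight=1.0 gives the heavier output-gap response associated with Taylor's 1999 paper.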

Since 1993, academics and policymakers have introduced and used many alternative versions of the rule. The alternative forms of the rule can supply policy prescriptions that differ significantly from Taylor's original rule, as the following chart illustrates.

Effective federal funds rate and prescriptions from alternative versions of the Taylor rule

The green line shows the policy prescription from a rule identical to the one in Taylor's paper, apart from some minor changes in the inflation and output gap measures. The red line uses an alternative and commonly used rule that gives the output gap twice the weight used for the Taylor (1993) rule, derived from a 1999 paper by John Taylor. The red line also replaces the 2 percent value used in Taylor's 1993 paper with an estimate of the natural real interest rate, called r*, from a paper by Thomas Laubach, the Federal Reserve Board's director of monetary affairs, and John Williams, president of the San Francisco Fed. Federal Reserve Chair Janet Yellen also considered this alternative estimate of r* in a 2015 speech.

Both rules use real-time data. As early as 2012, the Taylor (1993) rule prescribed a federal funds rate materially above the 0 to 0.25 percent target range the FOMC maintained from December 2008 to December 2015. The alternative rule, by contrast, did not prescribe a positive fed funds rate at any point between the end of the 2007–09 recession and this quarter. The third-quarter prescriptions incorporate nowcasts constructed as described here. Neither the nowcasts nor the Taylor rule prescriptions themselves necessarily reflect the outlook or views of the Federal Reserve Bank of Atlanta or its president.

The choice of inputs plugged into this simple policy rule can materially change the rate prescription. To help you sort through the most common variations, we at the Atlanta Fed have created a Taylor Rule Utility, which gives you a number of choices for the inflation measure, inflation target, the natural real interest rate, and the resource gap. Besides the Congressional Budget Office–based output gap, alternative resource gap choices include those based on a U-6 labor underutilization gap and the ZPOP ratio. The latter ratio, which Atlanta Fed President Dennis Lockhart mentioned in a November 2015 speech while addressing the Taylor rule, gauges underemployment by measuring the share of the civilian population working their desired number of hours.

Many of the indicator choices use real-time data. The utility also allows you to set your own weight for the resource gap and to decide whether the prescription should put any weight on the previous quarter's federal funds rate. The default choices of the Taylor Rule Utility coincide with the Taylor (1993) rule shown in the above chart. Other organizations maintain similar tools (one of the nicer ones is available on the Cleveland Fed's Simple Monetary Policy Rules web page); you can find more information on the Frequently Asked Questions page.

Although the Taylor rule and its alternative versions are only simple benchmarks, they can be useful tools for evaluating the importance of particular indicators. For example, we see that the difference in the prescriptions of the two rules plotted above has narrowed in recent years as slack has diminished. Even if the output gap were completely closed, however, the current prescriptions of the rules would differ by nearly 2 percentage points because of the use of different measures of r*. We hope you find the Taylor Rule Utility a useful tool to provide insight into issues like these. We plan on adding further enhancements to the utility in the near future and welcome any comments or suggestions for improvements.

September 8, 2016 in Banking, Federal Reserve and Monetary Policy, Monetary Policy | Permalink


July 18, 2016


What’s Moving the Market’s Views on the Path of Short-Term Rates?

As an earlier macroblog post today highlighted, it seems that the United Kingdom's vote to leave the European Union—commonly known as the Brexit—got the attention of business decision makers and made their business outlook more uncertain.

How might this uncertainty be weighing on financial market assessments of the future path for Fed policy? Several recent articles have opined, often citing the CME Group's popular FedWatch tool, that the Brexit vote increased the probability that the Federal Open Market Committee (FOMC) might reverse course and lower its target for the fed funds rate. For instance, the Wall Street Journal reported on June 28 that fed funds futures contracts implied a 15 percent probability that rates would increase 25 basis points and an 8 percent probability of a 25 basis point decrease by December's meeting. Prior to the Brexit vote, the probabilities of a 25 basis point increase and decrease by December's meeting were roughly 50 percent and 0 percent, respectively.
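
How do such probabilities get backed out of futures prices? The standard back-of-the-envelope calculation treats the futures-implied rate (100 minus the futures price) as a probability-weighted average of the possible post-meeting average rates. A stylized Python sketch, with hypothetical numbers and only two outcomes allowed (production tools such as the FedWatch tool handle many meetings and additional details):

```python
def hike_probability(implied_rate, rate_if_unchanged, rate_if_hiked):
    """Probability of a 25 basis point hike implied by a futures rate,
    assuming exactly two possible post-meeting outcomes.

    implied_rate: 100 minus the futures price, in percent.
    """
    return (implied_rate - rate_if_unchanged) / (rate_if_hiked - rate_if_unchanged)

# Hypothetical numbers: the contract implies an average rate of 0.45
# percent; the rate would average 0.375 percent with no change and
# 0.625 percent after a 25 basis point hike.
p = hike_probability(0.45, 0.375, 0.625)  # roughly 0.30
```

The same linear-interpolation logic extends to a cut by adding a third outcome below the unchanged rate, which is where the limitation noted below begins to bind.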

One limitation of using fed funds futures to assess market participant views is that this method is restricted to calculating the probability of a rate change by a fixed number of basis points. But what if we want to consider a broader set of possibilities for FOMC rate decisions? We could look at options on fed funds futures contracts to infer these probabilities. However, since the financial crisis their availability has been quite limited. Instead, we use options on Eurodollar futures contracts.

Eurodollars are deposits denominated in U.S. dollars but held in foreign banks or in the foreign branches of U.S. banks. The rate on these deposits is the (U.S. dollar) London Interbank Offered Rate (LIBOR). Because Eurodollar deposits are regulated similarly to fed funds and can be used to meet reserve requirements, financial institutions often view Eurodollars as close substitutes for fed funds. Although a number of factors can drive a wedge between otherwise identical fed funds and Eurodollar transactions, arbitrage and competitive forces tend to keep these differences relatively small.

However, using options on Eurodollar futures is not without its own challenges. Three-month Eurodollar futures can be thought of as the sum of an average three-month expected overnight rate (the item of specific interest) plus a term premium. Each possible target range for fed funds is associated with its own average expected overnight rate, and there may be some slippage between LIBOR and the overnight rate associated with a given target range. Additionally, although we can use swaps market data to estimate the expected term premium, uncertainty around this expectation can blur the picture somewhat and make it difficult to identify specific target ranges, especially as we look farther out into the future.

Despite these challenges, we feel that options on Eurodollar futures can provide a complementary and more detailed view on market expectations than is provided by fed funds futures data alone.

Our approach is to use the Eurodollar futures option data to construct an entire probability distribution of the market's assessment of future LIBOR rates. The details of our approach can be found here. Importantly, our approach does not assume that the distribution will have a typical bell shape. Using a flexible approach allows multiple peaks with different heights that can change dynamically in response to market news.
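
As a stylized illustration of that flexibility (this is not the estimation method in the linked details, just a generic way to build a multi-peaked density), a mixture of normal distributions can place separate modes at "no change" and "hike" outcomes:

```python
import numpy as np

def mixture_pdf(x, weights, means, sds):
    """Density of a mixture of normals: a simple way to represent a
    rate distribution with more than one peak."""
    x = np.asarray(x, dtype=float)
    dens = np.zeros_like(x)
    for w, m, s in zip(weights, means, sds):
        dens += w * np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))
    return dens

# Hypothetical bimodal view: most weight on "no change" near 0.40
# percent, a smaller mode near 0.65 percent representing a hike.
grid = np.linspace(0.0, 1.0, 501)
pdf = mixture_pdf(grid, weights=[0.7, 0.3], means=[0.40, 0.65], sds=[0.05, 0.05])
```

The weight on each component is the probability assigned to that outcome, so the peaks can differ in height and shift dynamically as news arrives.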

The results of this approach are illustrated in the following two charts for contracts expiring in September (left-hand chart) and December (right-hand chart) of this year for the day before and the day after Brexit. With these distributions in hand, we can calculate the implied probabilities of a rate change consistent with what you would get if you simply used fed funds futures. However, we think that specific features of the distributions help provide a richer story about how the market is processing incoming information.

Prior to the Brexit vote (depicted by the green curve), market participants were largely split in their assessment on a rate increase through September's FOMC meeting, as indicated by the two similarly sized modes, or peaks, of the distribution. Post-Brexit (depicted by the blue curve), most weight was given to no change, but with a non-negligible probability of a rate cut (the mode on the left between 0 and 25 basis points). For December's FOMC meeting, market participants shifted their views away from the likelihood of one additional increase in the fed funds target toward the possibility that the FOMC leaves rates where they are currently.

The market turmoil immediately following the vote subsided somewhat over the subsequent days. The next two charts indicate that by July 7, market participants seem to have backed away from the assessment that a rate cut may occur this year, evidenced by the disappearance of the mode between 0 and 25 basis points (shown by the green curve). And following the release of the June jobs report from the U.S. Bureau of Labor Statistics on July 8, market participants increased their assessment of the likelihood of a rate hike by year end, though not by much (see the blue curve). However, the labor report was, by itself, not enough to shift the market view that the fed funds target is unlikely to change over the near future.

One other feature of our approach is that comparing the heights of the modes across contracts allows us to assess the market's relative certainty of particular outcomes. For instance, though the market continues to put the highest weight on "no move" for both September and December, we can see that the market is much less certain regarding what will happen by December relative to September.

The greater range of possible rates for December suggests that there is still considerable market uncertainty about the path of rates six months out and farther. And, as we saw with the labor report release, incoming data can move these distributions around as market participants assess the impact on future FOMC deliberations.



July 18, 2016 in Europe, Interest Rates, Monetary Policy | Permalink

Comments

Could you share more details about the methodology. The Simplex Regression paper linked under "The details of our approach can be found here" above is too general. Thank you.

Posted by: Hong Le | July 21, 2016 at 11:31 PM


June 16, 2016


Experts Debate Policy Options for China's Transition

After nearly three decades of rapid economic growth, China today faces the challenge of economic rebalancing against the backdrop of slow and uncertain global growth. Although investment and exports have been a motor for growth, China is increasingly experiencing structural issues: widening inequality, overcapacity as a consequence of policy distortions, unsustainable environmental costs, volatile financial markets, and rising systemic risk.

On April 28–29, I attended the First Research Workshop on China's Economy, organized jointly by the International Monetary Fund (IMF) and the Atlanta Fed. The workshop, held at the IMF's headquarters in Washington, DC, explored a series of questions that have emerged as China shifts toward a new growth model. Is this the end of the growth miracle? Will the Chinese renminbi one day be as important as the U.S. dollar? Should the rapidly increasing shadow banking activity in China be a source of concern? How worrisome is the rapid rise in China's housing prices?

Panelists shared their views on these and other issues facing the world's second-largest economy (or largest, if measured on a purchasing-power-parity basis). Plans are under way for a second workshop to be held in 2017.

The following is a nice summary of the research discussed at the workshop. It was originally published in the IMF Survey Magazine, and was written by Hui He, IMF Institute for Capacity Development, and Nan Li, IMF Research Department. Thanks to the IMF for allowing me to repost it here.

Is China's economic growth sustainable?
Understanding the source of China's tremendous growth was a recurring theme at the workshop. "China's economy combines enormous dynamism with huge distortions," observed Loren Brandt (University of Toronto). Brandt described his research based on China's firm-level data and emphasized that firm dynamics (entry and exit), especially firm entry, have been the main source of the productivity growth in the manufacturing sector.

Echoing Brandt's message, Kjetil Storesletten (University of Oslo) discussed regional growth disparities and showed that barriers preventing firms from entering an industry account for most of the disparities. Such barriers are more severe for privately owned firms in regions in which state-owned enterprises (SOE) dominate, he said.

In his keynote speech, Nicholas Lardy (Peterson Institute for International Economics) offered an upbeat view of China's transition to a new growth model, one in which the service sector plays a larger role than manufacturing. The bright side of the service sector, he noted, is its continued strong productivity growth. Financial deepening and a stronger social safety net are contributing to increased consumption, which helps to rebalance the economy.

However, he emphasized, SOE reforms remain critical as the service sector cannot provide a silver bullet for a successful transition.

Central bank's policy decisions
Several participants tried to discern how the People's Bank of China (PBC) conducts monetary policy. Tao Zha (of the Atlanta Fed's Center for Quantitative Economic Research and Emory University) found that the PBC reacts sharply when gross domestic product growth falls below its target, increasing money supply growth by 11.5 percentage points for every 1 percentage point shortfall.

Mark Spiegel (Center for Pacific Basin Studies) discussed the trade-offs involved in Chinese monetary policy—for example, controlling the exchange rate versus maintaining inflation stability. He also argued that the heavy use of reserve requirements on banks as a monetary policy tool might have the unintended consequence of reallocating capital from SOEs to more efficient privately owned firms, which could therefore offset the resource misallocation caused by the easy credit banks granted to SOEs in the high-growth years.

Renminbi versus the dollar
Eswar Prasad (Cornell University and Brookings Institution) argued that China's capital account will become more open and the renminbi will be used more widely to denominate and settle cross-border transactions. But he also noted that legal and institutional constraints in China were likely to prevent the renminbi from serving as a safe-haven currency as the U.S. dollar does today.

Moreover, he said, the current sequencing of liberalization initiatives—that is, removal of capital account restrictions before appropriate financial market supervision and regulation and exchange rate reform—poses financial stability risks.

Shadow banking and the housing market
Recently, volatile Chinese financial markets and continued housing price appreciation have raised serious financial stability concerns.

Michael Song (Chinese University of Hong Kong) argued that rapidly rising shadow banking activity is an unintended consequence of financial regulation. Restrictions on deposit rates and loan-to-deposit ratios have led banks to issue "wealth management products" to attract savers with higher returns. Because these restrictions had a greater impact on small banks, the big state banks had room to undercut the smaller banks by offering wealth management products with higher returns and then restricting their liquidity in interbank markets, ultimately making the banking system more prone to liquidity distress and runs.

Hanming Fang (University of Pennsylvania) found that, except in big cities such as Beijing and Shanghai, housing prices in China's urban areas between 2003 and 2013 more or less tracked rising household incomes. In his view, the Chinese housing boom is thus unlikely to trigger an imminent financial crisis. He warned, however, that housing prices may fall rapidly if economic growth slows dramatically, and that such a development could, in turn, amplify the economic downturn.

Rising wage inequality
China's rapid growth over the past two decades has been accompanied by rising wage inequality, an issue highlighted by two conference participants. Dennis Yang (University of Virginia) explored the distributional effects of trade openness in China and found a significant impact on wage inequality of China's accession to the World Trade Organization in 2001.

Chong-En Bai (Tsinghua University) argued that the decline after 2008 of the skill premium—that is, the ratio of the skilled labor wage to the unskilled labor wage—can be explained by the Chinese government's targeted credit extension to the unskilled-labor-intensive infrastructure sector (as part of the fiscal stimulus following the global financial crisis). Such distortionary policies might have short-run growth benefits but could lead to long-run welfare losses, he said, especially when rural-to-urban migration has run its course.

June 16, 2016 in Asia, Economic Growth and Development, Labor Markets, Monetary Policy, Real Estate | Permalink


June 06, 2016


After the Conference, Another Look at Liquidity

When it comes to assessing the impact of central bank asset purchase programs (often called quantitative easing or QE), economists tend to focus their attention on the potential effects on the real economy and inflation. After all, the Federal Reserve's dual mandate for monetary policy is price stability and full employment. But there is another aspect of QE that may also be quite important in assessing its usefulness as a policy tool: the potential effect of asset purchases on financial markets through the collateral channel.

Asset purchase programs involve central bank purchases of large quantities of high-quality, highly liquid assets. Postcrisis, the Fed has purchased more than $3 trillion of U.S. Treasury securities and agency mortgage-backed securities, the European Central Bank (ECB) has purchased roughly 727 billion euros' worth of public-sector bonds (issued by central governments and agencies), and the Bank of Japan is maintaining an annual purchase target of 80 trillion yen. These bonds are not merely assets held by investors to realize a return; they are also securities highly valued for their use as collateral in financial transactions. The Atlanta Fed's 21st annual Financial Markets Conference explored the potential consequences of these asset purchase programs in the context of financial market liquidity.

The collateral channel effect focuses on the role that these low-risk securities play in the plumbing of U.S. financial markets. Financial firms fund a large fraction of their securities holdings in the repurchase (or repo) markets. Repurchase agreements are legally structured as the sale of a security with a promise to repurchase the security at a fixed price at a given point in the future. The economics of this transaction are essentially similar to those of a collateralized loan.

The sold and repurchased securities are often termed "pledged collateral." In these transactions, which are typically overnight, the lender will ordinarily lend cash equal to only a fraction of the securities' value, with the remaining unfunded part called the "haircut." The size of the haircut is inversely related to the safety and liquidity of the security, with Treasury securities requiring the smallest haircuts. When the securities are repurchased the following day, the borrower will pay back the initial cash plus an additional amount of interest determined by the repo rate. The repo rate is essentially an overnight interest rate paid on a collateralized loan.
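
A stylized numerical example of the mechanics just described (all numbers are hypothetical; the actual/360 day count is the usual money-market convention):

```python
def repo_cash_leg(collateral_value, haircut):
    """Cash lent against collateral after applying the haircut."""
    return collateral_value * (1.0 - haircut)

def repo_repurchase_price(cash, repo_rate_annual, days=1):
    """Repurchase price on an overnight (or n-day) repo, using the
    actual/360 money-market day-count convention."""
    return cash * (1.0 + repo_rate_annual * days / 360.0)

# Hypothetical: $100 million of Treasuries, a 2 percent haircut, and a
# 0.50 percent annualized overnight repo rate.
cash = repo_cash_leg(100_000_000.0, 0.02)     # $98 million lent
payback = repo_repurchase_price(cash, 0.005)  # next-day repurchase price
```

With these numbers, the borrower repays the $98 million plus about $1,361 of overnight interest the next day.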

Central bank purchases of Treasury securities may have a multiplicative effect on the potential efficiency of the repo market because these securities are often used in a chain of transactions before reaching a final holder for the evening. Here's a great diagram presented by Phil Prince of Pine River Capital Management illustrating the role that bonds and U.S. Treasuries play in facilitating a variety of transactions. In this example, the UST (U.S. Treasury) securities are first used as collateral in an exchange between the UST securities lender and the globally systemically important financial institution (GSIFI bank/broker dealer), then between the GSIFI bank and the cash provider, a money market mutual fund (MMMF), corporation, or sovereign wealth fund (SWF). The reuse of the UST collateral reduces the funding cost of the GSIFI bank and, hence, the cost to the levered investor/hedge fund who is trying to exploit discrepancies in the pricing of a corporate bond and stock.
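
To see why reuse matters quantitatively, consider a stylized chain in which the same collateral backs several successive repos, with a haircut applied at each leg (numbers are hypothetical, and this is only an illustration of the multiplicative effect, not Singh's measure of collateral reuse):

```python
def chain_funding(collateral_value, haircut, reuses):
    """Total cash raised when one piece of collateral is re-pledged
    through a chain of repos, each leg applying the same haircut."""
    total, value = 0.0, float(collateral_value)
    for _ in range(reuses):
        value *= (1.0 - haircut)  # cash raised on this leg
        total += value            # that cash becomes the next leg's base
    return total

# Hypothetical: $100 of Treasuries with a 2 percent haircut, re-pledged
# through three legs, supports roughly $288 of funding in total.
funding = chain_funding(100.0, 0.02, 3)
```

The smaller the haircut, the longer the effective chain, which is one reason Treasury collateral is so valuable to the plumbing described above.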

Just how important or large is this pool of reusable collateral? Manmohan Singh of the International Monetary Fund presented the following charts, depicting the pledged collateral at major U.S. and European financial institutions that can be reused in other transactions.

So how do central bank purchases of high-quality, liquid assets affect the repo market—and why should macroeconomists care? In his presentation, Marvin Goodfriend of Carnegie Mellon University concluded that central bank asset purchases, which he terms "pure monetary policy," lower short-term interest rates (especially bank-to-bank lending) but increase the cost of funding illiquid assets through the repo market. And Singh noted that repo rates are an important part of the constellation of short-term interest rates and directly link overnight markets with the longer-term collateral being pledged. Thus, the interaction between a central bank's interest-rate policy and its balance sheet policy is an important aspect of the transmission of monetary policy to longer-term interest rates and real economic activity.

Ulrich Bindseil, director of general market operations at the ECB, discussed a variety of ways in which central bank actions may affect, or be affected by, bond market liquidity. One way that central banks may mitigate any adverse impact on market liquidity is through their securities lending programs, according to Bindseil. Central banks use such programs to lend particular bonds back out to the market to "provide a secondary and temporary source of securities to the financing market...to promote smooth clearing of Treasury and Agency securities."

On June 2, for example, the New York Fed lent $17.8 billion of UST securities from the Fed's portfolio. These operations are structured as collateral swaps—dealers pledge other U.S. Treasury bonds as collateral with the Fed. During the financial crisis, the Federal Reserve used an expanded version of its securities lending program called the Term Securities Lending Facility to allow firms to replace lower-quality collateral that was difficult to use in repo transactions with Treasury securities.

Finally, the Fed currently releases some bonds to the market each day in return for cash, through its overnight reverse repo operations, a supplementary facility used to support control of the federal funds rate as the Federal Open Market Committee proceeds with normalization. However, this release has an important limitation: these operations are conducted in the triparty repo market, and the bonds released through these operations can be reused only within that market. In contrast, if the Fed were to sell its U.S. Treasuries, the securities could not only be used in the triparty repo market but also as collateral in other transactions including ones in the bilateral repo market (you can read more on these markets here). As long as central bank portfolios remain large and continue to grow as in Europe and Japan, policymakers are integrally linked to the financial plumbing at its most basic level.

To see a video of the full discussion of these issues as well as other conference presentations on bond market liquidity, market infrastructure, and the management of liquidity within financial institutions, please visit Getting a Grip on Liquidity: Markets, Institutions, and Central Banks. My colleague Larry Wall's conference takeaways on the elusive definition of liquidity, along with the impact of innovation and regulation on liquidity, are here.

June 6, 2016 in Banking, Financial System, Interest Rates, Monetary Policy | Permalink


November 04, 2014


Data Dependence and Liftoff in the Federal Funds Rate

When asked "at which upcoming meeting do you think the FOMC [Federal Open Market Committee] will FIRST HIKE its target for the federal funds rate," 46 percent of the October Blue Chip Financial Forecasts panelists predicted that "liftoff" would occur at the June 2015 meeting, and 83 percent chose liftoff at one of the four scheduled meetings in the second and third quarters of next year.

Of course, this result does not imply that there is an 83 percent chance of liftoff occurring in the middle two quarters of next year. Respondents to the New York Fed's most recent Primary Dealer Survey put this liftoff probability for the middle two quarters of 2015 at only 51 percent. This considerable uncertainty around the timing of liftoff is consistent with the "data-dependence principle" that Chair Yellen mentioned at her September 17 press conference. The idea of data dependence is captured in this excerpt from the statement following the October 28–29 FOMC meeting:

[I]f incoming information indicates faster progress toward the Committee's employment and inflation objectives than the Committee now expects, then increases in the target range for the federal funds rate are likely to occur sooner than currently anticipated. Conversely, if progress proves slower than expected, then increases in the target range are likely to occur later than currently anticipated.

If the timing of liftoff is indeed data dependent, a natural extension is to gauge the likely "liftoff reaction function." In the current zero lower bound (ZLB) environment, researchers at the University of North Carolina and the St. Louis Fed have analyzed monetary policy using shadow fed funds rates, shown in figure 1 below, estimated by Wu and Xia (2014) and Leo Krippner.

Unlike the standard fed funds rate, a shadow rate can be negative at the ZLB. The researchers found that the shadow rates, particularly Krippner's, act as fairly good proxies for monetary policy in the post-2008 ZLB period. Krippner also produces an expected time to liftoff, estimated from his model, shown in figure 1 above. His model's liftoff of December 2015 is six months after the most likely liftoff month identified by the aforementioned Blue Chip survey.

I included Krippner's shadow rate (spliced with the standard fed funds rate prior to December 2008) in a monthly Bayesian vector autoregression alongside the six other variables shown in figure 2 below.

The model assumes that the Fed cannot see contemporaneous values of the variables when setting the spliced policy rate—that is, the fed funds/shadow rate. This assumption is plausible given the approximately one-month lag in economic release dates. The baseline path assumes (and mechanically generates) liftoff in June 2015 with outcomes for the other variables, shown by the black lines, that roughly coincide with professional forecasts.

The alternative scenarios span the range of eight possible outcomes for low inflation/baseline inflation/high inflation and low growth/baseline growth/high growth in the figures above. For example, in figure 2 above, the high growth/low inflation scenario coincides with the green lines in the top three charts and the red lines in the bottom three charts. Forecasts for the spliced policy rate are conditional on the various growth/inflation scenarios, and "liftoff" in each scenario occurs when the spliced policy rate rises above the midpoint of the current target range for the funds rate (12.5 basis points).

The outcomes are shown in figure 3 below. At one extreme—high growth/high inflation—liftoff occurs in March 2015. At the other—low growth/low inflation—liftoff occurs beyond December 2015.

One should not interpret these projections too literally; the model uses a much narrower set of variables than the FOMC considers. Nonetheless, these scenarios illustrate that the model's forecasted liftoffs in the spliced policy rate are indeed consistent with the data-dependence principle.

By Pat Higgins, senior economist in the Atlanta Fed's research department

November 4, 2014 in Economics, Employment, Federal Reserve and Monetary Policy, Forecasts, Inflation, Monetary Policy | Permalink



August 12, 2014


Are We There Yet?

Editor’s note: This macroblog post was published yesterday with some content inadvertently omitted. Below is the complete post. We apologize for the error.

Anyone who has undertaken a long road trip with children will be familiar with the frequent “are we there yet?” chorus from the back seat. So, too, it might seem on the long post-2007 monetary policy road trip. When will the economy finally look like it is satisfying the Federal Open Market Committee’s (FOMC) dual mandate of price stability and full employment? The answer varies somewhat across the FOMC participants. The difference in perspectives on the distance still to travel is implicit in the range of implied liftoff dates for the FOMC’s short-term interest-rate tool in the Summary of Economic Projections (SEP).

So how might we go about assessing how close the economy truly is to meeting the FOMC’s objectives of price stability and full employment? In a speech on July 17, President James Bullard of the St. Louis Fed laid out a straightforward approach, as outlined in a press release accompanying the speech:

To measure the distance of the economy from the FOMC’s goals, Bullard used a simple function that depends on the distance of inflation from the FOMC’s long-run target and on the distance of the unemployment rate from its long-run average. This version puts equal weight on inflation and unemployment and is sometimes used to evaluate various policy options, Bullard explained.

We think that President Bullard’s quadratic-loss-function approach is a reasonable one. Chart 1 shows what you get using this approach, assuming a goal of year-over-year personal consumption expenditure inflation at 2 percent, and the headline U-3 measure of the unemployment rate at 5.4 percent. (As the U.S. Bureau of Labor Statistics defines unemployment, U-3 measures the total unemployed as a percent of the labor force.) This rate is about the midpoint of the central tendency of the FOMC’s longer-run estimate for unemployment from the June SEP.
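The distance measure described above can be sketched in a few lines. This is a rough illustration only: the post describes a quadratic loss with equal weights on the two gaps, but the exact functional form (here, the root of the sum of squared gaps) and the sample inflation and unemployment readings are assumptions.

```python
from math import sqrt

# Longer-run goals assumed in the post: 2 percent year-over-year PCE
# inflation and a 5.4 percent U-3 unemployment rate (about the midpoint
# of the FOMC's longer-run central tendency from the June SEP).
PI_STAR = 2.0
U_STAR = 5.4

def objective_gap(inflation, unemployment, pi_star=PI_STAR, u_star=U_STAR):
    """Distance from the FOMC's goals: the root of equally weighted
    squared gaps. The functional form is an assumption; the post says
    only that the loss is quadratic with equal weights."""
    return sqrt((inflation - pi_star) ** 2 + (unemployment - u_star) ** 2)

# At the goals the gap is zero; away from them it grows with both gaps.
print(round(objective_gap(1.6, 6.2), 2))  # hypothetical readings -> 0.89
```

A reading of 1.6 percent inflation and 6.2 percent unemployment, for example, would put the economy about 0.89 "units" from home on this metric.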

Chart 1: Progress toward Objectives: U-3 Gap

Notice that the policy objective gap increased dramatically during the recession, but is currently at a low value that’s close to precrisis levels. On this basis, the economy has been on a long, uncomfortable trip but is getting pretty close to home. But other drivers of the monetary policy minivan may be assessing how far there is still to travel using a road map different from the one behind chart 1. For example, Atlanta Fed President Dennis Lockhart has highlighted the role of involuntary part-time work as a signal of slack that is not captured in the U-3 unemployment rate measure. Indeed, the last FOMC statement noted that

Labor market conditions improved, with the unemployment rate declining further. However, a range of labor market indicators suggests that there remains significant underutilization of labor resources.

So, although acknowledging the decline in U-3, the Committee is also suggesting that other labor market indicators may point to somewhat greater residual slack. For example, suppose we used the broader U-6 measure to compute the distance left to travel based on President Bullard’s formula. The U-6 unemployment measure counts individuals who are marginally attached to the labor force as unemployed and, importantly, also counts involuntarily part-time workers as unemployed. One simple way to incorporate the U-6 gap is to compute the average difference between U-6 and U-3 prior to 2007 (excluding the 2001 recession), which was 3.9 percentage points, and add that to the U-3 longer-run estimate of 5.4 percent, giving an estimated longer-run U-6 rate of 9.3 percent. Chart 2 shows what you get if you run the numbers through President Bullard’s formula using this U-6 adjustment (scaling the U-6 gap by the ratio of the U-3 and U-6 steady-state estimates to put it on a U-3 basis).
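The U-6 adjustment described above is straightforward arithmetic. A minimal sketch, where the 12.2 percent U-6 reading in the example is a hypothetical value, not a figure from the post:

```python
U3_STAR = 5.4   # longer-run U-3 estimate, percent (June SEP midpoint)
SPREAD = 3.9    # average pre-2007 U-6 minus U-3 gap, percentage points
U6_STAR = U3_STAR + SPREAD  # implied longer-run U-6 rate: 9.3 percent

def u6_gap_on_u3_basis(u6):
    """U-6 gap scaled by the ratio of the two steady-state estimates,
    so it is comparable to a U-3 gap (the scaling described in the post)."""
    return (u6 - U6_STAR) * (U3_STAR / U6_STAR)

# Hypothetical U-6 reading of 12.2 percent:
print(round(u6_gap_on_u3_basis(12.2), 2))
```

The scaled gap can then be dropped into the same quadratic loss in place of the U-3 gap to produce a chart 2-style series.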

Chart 2: Progress toward Objectives: U-3 Gap versus U-6 Gap

What the chart says is that, up until about four years ago, it didn’t really matter at all what your preferred measure of labor market slack was; they told a similar story because they tracked each other pretty closely. But currently, your view of how close monetary policy is to its goals depends quite a bit on whether you are a fan of U-3 or of U-6—or of something in between. I think you can place the Atlanta Fed in that “in-between” camp, or at least among those not yet willing to tell the kids that home is just around the corner.

In an interview last week with the Wall Street Journal, President Lockhart effectively put some distance between his own view and those who see the economy as being close to full employment. The Journal’s Real Time Economics blog quoted Lockhart:

“I’m not ruling out” the idea the Fed may need to raise short-term interest rates earlier than many now expect, Mr. Lockhart said in an interview with The Wall Street Journal. But, at the same time, “I’m a little bit cautious” about the policy outlook, and still expect that when the first interest rate hike comes, it will likely happen somewhere in the second half of next year.

“I remain one who is looking for further validation that we are on a track that is going to make the path to our mandate objectives pretty irreversible,” Mr. Lockhart said. “It’s premature, even with the good numbers that have come in ... to draw the conclusion that we are clearly on that positive path,” he said.

Mr. Lockhart said the current unemployment rate of 6.2% will likely continue to decline and tick under 6% by the end of the year. But, he said, there remains evidence of underlying softness in the job sector, and, he also said, while inflation shows signs of firming, it remains under the Fed’s official 2% target.

Our view is that the current monetary policy journey has made considerable progress toward its objectives. But the trip is not yet complete, and the road ahead remains potentially bumpy. In the meantime, I recommend these road-trip sing-along selections.

By John Robertson, a vice president and senior economist in the Atlanta Fed’s research department


August 12, 2014 in Economics, Employment, Federal Reserve and Monetary Policy, Inflation, Labor Markets, Monetary Policy, Pricing, Unemployment | Permalink

Comments

Major problems with U6 include the fact that someone working 34 hours but wants to work 35 or more is considered unemployed (not partially unemployed) -- a very loose definition of an unemployed person. Also, some policymakers conflate marginally attached with discouraged workers. Only one-third of the marginally attached are discouraged about job prospects (the other two-thirds didn't look for work because of illness, school, etc. -- i.e., for reasons monetary policy cannot address). So there are very good reasons for President Bullard's objective function to be based on U3 rather than U6. Additionally, what policymakers should consider, to follow through with your analogy, is when you arrive at your destination should you still have the accelerator pressed to the floor? Or does it not make sense to let off of the gas a bit as you approach your destination (to avoid driving the minivan right through your home).

Posted by: Conrad DeQuadros | August 14, 2014 at 12:57 PM


May 16, 2014


Which Flavor of QE?

Yesterday's report on consumer price inflation from the U.S. Bureau of Labor Statistics moved the needle a bit on inflation trends—but just a bit. Meanwhile, the European Central Bank appears to be locked and loaded to blast away at its own (low) inflation concerns. From the Wall Street Journal:

The European Central Bank is ready to loosen monetary policy further to prevent the euro zone from succumbing to an extended period of low inflation, its vice president said on Thursday.

"We are determined to act swiftly if required and don't rule out further monetary policy easing," ECB Vice President Vitor Constancio said in a speech in Berlin.

One of the favorite further measures is apparently charging financial institutions for funds deposited with the central bank:

On Wednesday, the ECB's top economist, Peter Praet, in an interview with German newspaper Die Zeit, said the central bank is preparing a number of measures to counter low inflation. He mentioned a negative rate on deposits as a possible option in combination with other measures.

I don't presume to know enough about financial institutions in Europe to weigh in on the likely effectiveness of such an approach. I do know that we have found reasons to believe that there are limits to such a tool in the U.S. context, as the New York Fed's Ken Garbade and Jamie McAndrews pointed out a couple of years back.

In part, the desire to think about an option such as negative interest rates on deposits appears to be driven by considerable skepticism about deploying more quantitative easing, or QE.

A drawback, in my view, of general discussions about the wisdom and effectiveness of large-scale asset purchase programs is that these policies come in many flavors. My belief, in fact, is that the Fed versions of QE1, QE2, and QE3 can be thought of as three quite different programs, useful to address three quite distinct challenges. You can flip through the slide deck of a presentation I gave last week at a conference sponsored by the Global Interdependence Center, but here is the essence of my argument:

  • QE1, as emphasized by former Fed Chair Ben Bernanke, was first and foremost credit policy. It was implemented when credit markets were still in a state of relative disarray and, arguably, segmented to some significant degree. Unlike credit policy, the focus of traditional or pure QE "is the quantity of bank reserves" (to use the Bernanke language). Although QE1 per se involved asset purchases in excess of $1.7 trillion, the Fed's balance sheet rose by less than $300 billion during the program's span. The reason, of course, is that the open-market purchases associated with QE1 largely just replaced expiring lending from the emergency-based facilities in place through most of 2008. In effect, with QE1 the Fed replaced one type of credit policy with another.
  • QE2, in contrast, looks to me like pure, traditional quantitative easing. It was a good old-fashioned Treasury-only asset purchase program, and the monetary base effectively increased in lockstep with the size of the program. Importantly, the salient concern of the moment was a clear deterioration of market-based inflation expectations and—particularly worrisome to us at the Atlanta Fed—rising beliefs that outright deflation might be in the cards. In retrospect, old-fashioned QE appears to have worked to address the old-fashioned problem of influencing inflation expectations. In fact, the turnaround in expectations can be clearly traced to the Bernanke comments at the August 2010 Kansas City Fed Economic Symposium, indicating that the Federal Open Market Committee (FOMC) was ready and willing to pull the QE tool out of the kit. That was an early lesson in the power of forward guidance, which brings us to...
  • ...QE3. I think it is a bit early to draw conclusions about the ultimate impact of QE3. I think you can contend that the Fed's latest large-scale asset purchase program has not had a large independent effect on interest rates or economic activity while still believing that QE3 has played an important role in supporting the economic recovery. These two, seemingly contradictory, opinions echo an argument suggested by Mike Woodford at the Kansas City Fed's Jackson Hole conference in 2012: QE3 was important as a signaling device in early stages of the deployment of the FOMC's primary tool, forward guidance regarding the period of exceptionally low interest rates. I would in fact argue that the winding down of QE3 makes all the more sense when seen through the lens of a forward guidance tool that has matured to the point of no longer requiring the credibility "booster shot" of words put to action via QE.

All of this is to argue that QE, as practiced, is not a single policy, effective in all variants in all circumstances, which means that the U.S. experience of the past might not apply to another time, let alone another place. But as I review the record of the past seven years, I see evidence that pure QE worked pretty well precisely when the central concern was managing inflation expectations (and, hence, I would say, inflation itself).

By Dave Altig, executive vice president and research director of the Atlanta Fed


May 16, 2014 in Federal Reserve and Monetary Policy, Monetary Policy | Permalink


May 13, 2014


Pondering QE

Today’s news brings another indication that low inflation rates in the euro area have the attention of the European Central Bank. From the Wall Street Journal (Update: via MarketWatch):

Germany's central bank is willing to back an array of stimulus measures from the European Central Bank next month, including a negative rate on bank deposits and purchases of packaged bank loans if needed to keep inflation from staying too low, a person familiar with the matter said...

This marks the clearest signal yet that the Bundesbank, which has for years been defined by its conservative opposition to the ECB's emergency measures to combat the euro zone's debt crisis, is fully engaged in the fight against super-low inflation in the euro zone using monetary policy tools...

Notably, these tools apparently do not include Fed-style quantitative easing:

But the Bundesbank's backing has limits. It remains resistant to large-scale purchases of public and private debt, known as quantitative easing, the person said. The Bundesbank has discussed this option internally but has concluded that with government and corporate bond yields already quite low in Europe, the purchases wouldn't do much good and could instead create financial stability risks.

Should we conclude that there is now a global conclusion about the value and wisdom of large-scale asset purchases, a.k.a. QE? We certainly have quite a bit of experience with large-scale purchases now. But I think it is also fair to say that that experience has yet to yield firm consensus.

You probably don’t need much convincing that QE consensus remains elusive. But just in case, I invite you to consider the panel discussion we titled “Greasing the Skids: Was Quantitative Easing Needed to Unstick Markets? Or Has it Merely Sped Us toward the Next Crisis?” The discussion was organized for last month’s 2014 edition of the annual Atlanta Fed Financial Markets Conference.

Opinions among the panelists were, shall we say, diverse. You can view the entire session via this link. But if you don’t have an hour and 40 minutes to spare, here is the (less than) ten-minute highlight reel, wherein Carnegie Mellon Professor Allan Meltzer opines that Fed QE has become “a foolish program,” Jefferies LLC Chief Market Strategist David Zervos declares himself an unabashed “lover of QE,” and Federal Reserve Governor Jeremy Stein weighs in on some of the financial stability questions associated with very accommodative policy:


You probably detected some differences of opinion there. If that, however, didn’t satisfy your craving for unfiltered debate, click on through to this link to hear Professor Meltzer and Mr. Zervos consider some of Governor Stein’s comments on monitoring debt markets, regulatory approaches to pursuing financial stability objectives, and the efficacy of capital requirements for banks.

By Dave Altig, executive vice president and research director of the Atlanta Fed


May 13, 2014 in Banking, Capital Markets, Economic conditions, Federal Reserve and Monetary Policy, Monetary Policy | Permalink


January 31, 2014


A Brief Interview with Sergio Rebelo on the Euro-Area Economy

Last month, we at the Atlanta Fed had the great pleasure of hosting Sergio Rebelo for a couple of days. While he was here, we asked Sergio to share his thoughts on a wide range of current economic topics. Here is a snippet of a Q&A we had with him about the state of the euro-area economy:

Sergio, what would you say was the genesis of the problems the euro area has faced in recent years?

The contours of the euro area’s problems are fairly well known. The advent of the euro gave peripheral countries—Ireland, Spain, Portugal, and Greece—the ability to borrow at rates that were similar to Germany's. This convergence of borrowing costs was encouraged through regulation that allowed banks to treat all euro-area sovereign bonds as risk free.

The capital inflows into the peripheral countries were not, for the most part, directed to the tradable sector. Instead, they financed increases in private consumption, large housing booms in Ireland and Spain, and increases in government spending in Greece and Portugal. The credit-driven economic boom led to a rise in labor costs and a loss of competitiveness in the tradable sector.

Was there a connection between the financial crisis in the United States and the sovereign debt crisis in the euro area?

Simply put, after Lehman Brothers went bankrupt, we had a sudden stop of capital flows into the periphery, similar to that experienced in the past by many Latin American countries. The periphery boom quickly turned into a bust.

What do you see as the role for euro area monetary policy in that context?

It seems clear that more expansionary monetary policy would have been helpful. First, it would have reduced real labor costs in the peripheral countries. In those countries, the presence of high unemployment rates moderates nominal wage increases, so higher inflation would have reduced real wages. Second, inflation would have reduced the real value of the debts of governments, banks, households, and firms. There might have been some loss of credibility on the part of the ECB [European Central Bank], resulting in a small inflation premium on euro bonds for some time. But this potential cost would have been worth paying in return for the benefits.

And did this happen?

In my view, the ECB did not follow a sufficiently expansionary monetary policy. In fact, the euro-area inflation rate has been consistently below 2 percent and the euro is relatively strong when compared to a purchasing-power-parity benchmark. The euro area turned to contractionary fiscal policy as a panacea. There are good theoretical reasons to believe that, when the interest rate remains constant, so that the central bank does not cushion the fall in government spending, the multiplier effect of government spending cuts can be very large. See, for example, Gauti Eggertsson and Michael Woodford, “The Zero Bound on Interest Rates and Optimal Monetary Policy,” and Lawrence Christiano, Martin Eichenbaum, and Sergio Rebelo, “When Is the Government Spending Multiplier Large?”

Theory aside, the results of the austerity policies implemented in the euro area are clear. All of the countries that underwent this treatment are now much less solvent than in the beginning of the adjustment programs managed by the European Commission, the International Monetary Fund, and the ECB.

Bank stress tests have become a cornerstone of macroprudential financial oversight. Do you think they helped stabilize the situation in the euro area during the height of the crisis in 2010 and 2011?

No. Quite the opposite. I think the euro-area problems were compounded by the weak stress tests conducted by the European Banking Association in 2011. Almost no banks failed, and almost no capital was raised. Banks largely increased their capital-to-asset ratios by reducing assets, which resulted in a credit crunch that added to the woes of the peripheral countries.

But we’re past the worst now, right? Is the outlook for the euro-area economy improving?

After hitting the bottom, a very modest recovery is under way in Europe. But the risk that a Japanese-style malaise will afflict Europe is very real. One useful step on the horizon is the creation of a banking union. This measure could potentially alleviate the severe credit crunch afflicting the periphery countries.

Thanks, Sergio, for this pretty sobering assessment.

By John Robertson, a vice president and senior economist in the Atlanta Fed’s research department

Editor’s note: Sergio Rebelo is the Tokai Bank Distinguished Professor of International Finance at Northwestern University’s Kellogg School of Management. He is a fellow of the Econometric Society, the National Bureau of Economic Research, and the Center for Economic Policy Research.


January 31, 2014 in Banking, Capital and Investment, Economics, Europe, Interest Rates, Monetary Policy | Permalink


January 14, 2014


A Football Field of Labor Market Progress

The December meeting of the Federal Open Market Committee (FOMC), as summarized in the minutes published last week, debated the context for tapering the quantitative easing (QE) program of asset purchases and adjusting the FOMC’s forward guidance on the federal funds rate. One of the issues debated was postrecession progress in the labor market. For example, participants struggled with the reasons for the large drop in labor force participation in recent years:

Some participants cited research that found that demographic and other structural factors, particularly rising retirements by older workers, accounted for much of the recent decline in participation. However, several others continued to see important elements of cyclical weakness in the low labor force participation rate and cited other indicators of considerable slack in the labor market, including the still-high levels of long-duration unemployment and of workers employed part time for economic reasons and the still-depressed ratio of employment to population for workers ages 25 to 54. In addition, although a couple of participants had heard reports of labor shortages, particularly for workers with specialized skills, most measures of wages had not accelerated. A few participants noted the risk that the persistent weakness in labor force participation and low rates of productivity growth might indicate lasting structural economic damage from the financial crisis and ensuing recession.

In a speech on Monday, Atlanta Fed President Dennis Lockhart emphasized similar concerns. He posed the question of whether the improvement in the unemployment rate since the end of the recession, now having recovered about 65 percent of its 2007–09 increase, is overstating the actual progress in the utilization of the nation’s labor resources. President Lockhart observes:

But the unemployment rate is influenced by labor force participation, and there has been a sizable decline in the share of the population in the labor force since 2009. This explains how you could get a big drop in the unemployment rate with anemic job gains, as occurred in December.

The labor force participation rate has fallen from 65.8 percent of the population at the end of 2008 to 62.8 percent in December 2013. On this, President Lockhart notes:

Some of the decline in labor force participation since 2009 is due to the baby boomers retiring, but even among prime-age workers—those aged 25 to 54—the participation rate is down significantly [2.1 percentage points]. This suggests that other factors, such as low prospects of finding a job, are playing a role.

To examine this possibility, we can look at the sum of marginally attached workers. These are people who say they are willing to work and have looked for work recently but are not currently looking.

The marginally attached are not counted in the official labor force statistic. During the recession, the number of marginally attached swelled (from around 1.4 million at the end of 2007 to 2.4 million at the end of 2009). Since the end of 2009, the marginally attached rate (as a share of the labor force including marginally attached) has retraced only 12 percent of the recessionary increase. From this, President Lockhart concludes:

It’s accurate to say the country has a large number of people in the so-called “shadow labor force.”

Because the sharp decline in labor force participation is not fully understood, and because the unemployment rate decline conflates declines in participation with employment gains, President Lockhart suggests it is useful to also look at the share of the prime-age population that is employed. Between the end of 2007 and the end of 2009, the employment-to-population rate for this group declined from 79.7 to 74.8 percent. Since 2009, employment gains for the core of the workforce have advanced only 27 percent toward the prerecession peak (for the entire population over age 16, the recovery is essentially zero). Variations on this theme can be seen here and here.
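The retracement percentages used throughout this post come from one simple calculation: the share of the 2007–09 decline that has since been recovered. A minimal sketch, where the 76.1 percent current value is inferred from the post’s 27 percent figure rather than stated directly:

```python
def percent_retraced(pre, trough, current):
    """Share of the recessionary decline recovered so far, in percent."""
    return 100 * (current - trough) / (pre - trough)

# Prime-age employment-to-population rate: 79.7 percent at the end of
# 2007 and 74.8 percent at the end of 2009. The 76.1 current reading is
# an inferred value consistent with the post, not a quoted statistic.
print(round(percent_retraced(79.7, 74.8, 76.1)))
```

The same function, fed U-3, U-6, or marginally attached series, generates the "yards gained" in the football-field graphic below.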

Usually, the employment-to-population rate and the unemployment rate move in lockstep (because labor force movements are very gradual). But that has not been the case during this recovery.

In addition to unemployment, President Lockhart highlights the issue of underemployment:

Many Americans are working fewer hours than they would prefer because their employers are offering them only part-time work. The share of workers who are involuntarily working part-time doubled during the recession and has moved only about 30 percent lower since the recovery began.

So, on the question of whether the unemployment rate decline has overstated actual progress in labor utilization, Lockhart says yes:

To sum up, these comparisons of employment data suggest that the labor market is not as healthy as the improved unemployment rate might suggest. The unemployment rate drop may overstate progress achieved.

The Atlanta Fed has been featuring the labor market spider chart tool on its website as a way to track relative progress in a number of labor market indicators since the end of the recession. For the purposes of President Lockhart’s speech, the relative improvement in various indicators of the rate of labor utilization was presented graphically in the form of yardage gains from the goal-line of a football field. The changes can be seen here (the data are from the U.S. Bureau of Labor Statistics and Atlanta Fed calculations). The idea is that the labor utilization “team” was driven back to its own goal line from the end of 2007 through the end of 2009, and the graphic shows how many yards (percent) the team has recovered as of the January 10 labor report. (The use of a football field image is perhaps appropriate, given that the recent BCS championship game featured two teams from the Sixth District.)

Labor Utilization Recovery: How Far Have We Come?

President Lockhart also suggests a link between labor market slack and the weak pricing trends we have experienced in recent years:

It’s worth noting that wage and salary income growth remains weak. I hear very little from business contacts about upward wage pressures except in a few specialized job categories. Wage pressures usually accompany growing demand and rising inflation but, although demand appears to be growing, inflation is very soft.

Inflation Y-O-Y Percent Change

In fact, looking at the recent disinflation apparent in virtually all consumer price statistics relative to the FOMC’s longer-run objective, President Lockhart acknowledges the risk of an inflation “safety”:

...I think inflation will stabilize and begin to move back in the direction of the FOMC’s 2 percent objective as the economy gathers momentum. So I’m interpreting the soft inflation numbers as a risk signal. Through the lens of prices, the economy could be weaker than we currently believe.

By John Robertson, a vice president and senior economist in the Atlanta Fed’s research department


January 14, 2014 in Labor Markets, Monetary Policy, Sports, Unemployment | Permalink

