About


The Atlanta Fed's macroblog provides commentary and analysis on economic topics including monetary policy, macroeconomic developments, inflation, labor economics, and financial issues.

Authors for macroblog are Dave Altig, John Robertson, and other Atlanta Fed economists and researchers.


March 26, 2018


Thoughts on a Long-Run Monetary Policy Framework: Framing the Question

"Should the Fed stick with the 2 percent inflation target or rethink it?" This was the very good question posed in a special conference hosted by the Brookings Institution this past January. Over the course of roughly two decades prior to the global financial crisis, a consensus had formed among monetary-policy experts and practitioners the world over that something like 2 percent is an appropriate goal—maybe even the optimal goal—for central banks to pursue. So why reconsider that target now?

The answer to that question starts with another consensus that has emerged in the aftermath of the global financial crisis. In particular, there is now a widespread belief that, once monetary policy has fully normalized, the federal funds rate—the Federal Open Market Committee's (FOMC) reference policy rate—will settle significantly below historical norms.

Several of my colleagues have spoken cogently about this phenomenon, which is often cast in terms of concepts like r-star, the natural rate of interest, the equilibrium rate of interest, or (in the case of my colleague Jim Bullard), r-dagger. I like to think in terms of the "neutral" rate of interest; that is, the level of the policy rate consistent with the FOMC meeting its longer-run goals of price stability and maximum sustainable growth. In other words, the level of the federal funds rate should be consistent with 2 percent inflation, the unemployment rate at its sustainable level, and real gross domestic product at its potential.

Estimates of the neutral policy rate are subject to imprecision and debate. But a reasonable notion can be gleaned from the range of projections for the long-run federal funds rate reported in the Summary of Economic Projections (SEP) released just after last week's FOMC meeting. According to the latest SEP, the neutral rate would be in a range of 2.3 to 3.0 percent.

For some historical context, in the latter half of the 1990s, as the 2 percent inflation consensus was solidifying, the neutral federal funds rate would have been pegged in a range of something like 4.0 to 5.0 percent, roughly 2 percentage points higher than the range considered to be neutral today.

The implication for monetary policy is clear. If interest rates settle at levels that are historically low, policymakers will have limited scope for cutting rates in the event of a significant economic downturn (or at least more limited scope than they had in the past). I think it's fair to say that even relatively modest downturns are likely to yield policy reactions that drive the federal funds rate to zero, as happened in the Great Recession.

My view is that the nontraditional tools deployed after December 2008, when the federal funds rate effectively fell to zero, were effective. But it is accurate to say that our experience with these tools is limited, and their effectiveness remains controversial. I share the opinion that, all else equal, it would be vastly preferable to conduct monetary policy through the time-tested approach of raising and lowering short-term policy rates, if such an approach is available.

This point is where the challenge to the 2 percent inflation target enters the picture. The neutral rate I have been describing is a nominal rate. It is roughly the sum of an inflation-adjusted real rate—determined by fundamental saving and investment decisions in the global economy—and the rate of inflation. The downward drift in the neutral rate is attributable to a downward drift in that inflation-adjusted real rate. A great deal of research has documented this phenomenon, including influential work by San Francisco Fed president John Williams and Thomas Laubach, the head of the monetary division at the Fed's Board of Governors.

In the long run, a central bank cannot reliably control the real rate of interest. So if we accept the following premises...

  • A neutral rate that is too low to give the central bank enough room to fight even run-of-the-mill downturns is problematic;
  • Cutting rates is the optimal strategy for addressing downturns; and
  • The real interest rate is beyond the control of the central bank in the long run

...then we must necessarily accept that raising the neutral rate, thus affording monetary policymakers the desired rate-cutting scope when needed, would require raising the long-run inflation rate. Hence the argument for rethinking the Fed's 2 percent inflation target.
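A rough back-of-the-envelope calculation makes that logic concrete. The numbers in the sketch below are stylized illustrations, not official estimates:

```python
# Stylized illustration (not official estimates): the neutral nominal rate is roughly
# the long-run real rate plus the inflation rate.
real_rate_1990s, real_rate_today = 2.5, 0.5   # hypothetical r-star values, in percent
inflation_target = 2.0

neutral_1990s = real_rate_1990s + inflation_target   # about 4.5 percent: ample room to cut
neutral_today = real_rate_today + inflation_target   # about 2.5 percent: far less room

# Because the real rate is beyond the central bank's long-run control, lifting the
# neutral rate requires a higher inflation target:
neutral_with_3_percent_target = real_rate_today + 3.0   # about 3.5 percent

print(neutral_1990s, neutral_today, neutral_with_3_percent_target)
```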

But is that the only option? And is it the best option?

The answer to the first question is clearly no. The purpose of the Brookings Institution sessions was to address the pros and cons of the different strategies for dealing with the low neutral rate problem, and I commend them to you. But in upcoming macroblog posts, I want to share some of my thoughts on the second question.

Tomorrow, I will review some of the proposed options and explain why I am attracted to one in particular: price-level targeting. On Wednesday, I will propose what I think is a potentially useful model for implementing a price-level targeting scheme in practice. I want to emphasize that these are preliminary thoughts, offered in the spirit of stimulating the conversation and debate. I welcome that conversation and debate and look forward to making my contribution to moving it forward.



March 26, 2018 in Inflation, Monetary Policy | Permalink | Comments (0)

March 23, 2018


What Are Businesses Saying about Tax Reform Now?

In a recent macroblog post, we shared some results of a joint national survey that is an ongoing collaboration among the Atlanta Fed, Nick Bloom and Jose Barrero of Stanford University, and Steven Davis of the University of Chicago. (By the way, we're planning on calling this work the "Survey of Business Executives," or SBE.)

In mid-November, we posed this question to our panel of firms:

If passed in its current form, how would the Tax Cuts and Jobs Act affect your capital expenditures in 2018?

At the time, we (and perhaps others) were a little surprised to find that roughly two-thirds of respondents indicated that tax reform hadn't enticed them into changing their investment plans for 2018. Our initial interpretation was that the lack of an investment response by firms made it unlikely that we'd see a sharp acceleration in output growth in 2018.

Another interpretation of those results might be that firms were unwilling to speculate on how they'd respond to legislation that was not yet set in stone. Now that the ink has been dry on the bill for a while, we decided to ask again.

In our February survey—which was in the field from February 12 through February 23—we asked firms, "How has the recently enacted Tax Cuts and Jobs Act (TCJA) led you to revise your plans for capital expenditures in 2018?" The results shown below—restricted to the 218 firms that responded in both November 2017 and February 2018—suggest that, if anything, these firms have revised down their expectations for this year:

You may be thinking that perhaps firms had already set their capital expenditure plans for 2018, so asking about changes in firms' 2018 plans isn't too revealing—which is why we asked them about their 2019 plans as well. The results (showing all 272 responses in February) are not statistically different from the 2018 responses. Roughly three-quarters of firms don't plan to change their capital expenditure plans in 2019 as a result of the TCJA:

These results contain some nuance. It seems that larger firms (those with more than 500 employees) responded more favorably to the tax reform. But it is still the case that the typical (or median) large firm has not revised its 2019 capex plans in response to tax changes.

Why the disparity between smaller and larger firms? We're not sure yet—but we have an inkling. In a separate survey we had in the field in February—the Business Inflation Expectations (BIE) survey—we asked Sixth District firms to identify their tax reporting structure and whether or not they expected to see a reduction in their tax bill as a result of the TCJA. Larger firms—which are more likely to be organized as C corporations—appear to be more sure of the TCJA's impact on their bottom lines. Conversely, smaller "pass-through" entities appear to be less certain of its impact, as shown here:

For now, we're sticking with our initial assessment that the potential for a sharp acceleration in near-term output growth is limited. However, there is some upside risk to that view if more pass-through entities start to see significantly smaller tax bills as a result of the TCJA.

March 23, 2018 in Business Inflation Expectations, Fiscal Policy | Permalink | Comments (0)

March 06, 2018


A First Look at Employment

One Friday morning each month, 8:30 a.m. is always an exciting time here at the Atlanta Fed. Why, you might ask? Because that's when the U.S. Bureau of Labor Statistics (BLS) issues the newest employment and labor force statistics in the Employment Situation Summary. Just after the release, Atlanta Fed analysts compile a "first look" report based on the latest numbers. We have found this initial view to be a very useful glimpse into the broad health of the national labor market.

Because we find this report useful, we thought you might also find it of interest. To that end, we have added the Labor Report First Look tool to our website, and we'll strive to post updated data soon after the release of the BLS's Employment Situation Report. Our Labor Report First Look includes key data for the month and changes over time from both the payroll and household surveys, presented as tables and charts. 

We will also use the bureau's data to create other indicators included in the Labor Report First Look. For example, one of these is a depiction of changes in payroll employment by industry, in which we rank industry employment changes by average hourly pay levels. This view allows us to see whether payrolls are gaining or losing higher- or lower-paying jobs, as the following chart shows.

But wait, there's more! We will also report information on the so-called job finding rate—an estimate of the share of people unemployed last month who are employed this month—and a broad measure of labor underutilization. Our underutilization concept is related to another statistic we created called Z-Pop, computed as the share of the population who are either unemployed or underemployed (working part-time hours but wanting full-time work) or who say they currently want a job but are not actively looking. We have found this to be a useful supplement to the BLS's employment-to-population ratio (see the chart).
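To make the Z-Pop definition concrete, here is a minimal sketch of the calculation. The counts below are rough, illustrative magnitudes rather than the published figures:

```python
# Illustrative Z-Pop-style calculation. The counts are rough orders of magnitude,
# not the official BLS or Atlanta Fed figures.
unemployed = 6.7e6                  # jobless and actively looking for work
part_time_want_full_time = 5.0e6    # working part-time hours but wanting full-time work
want_job_not_looking = 5.2e6        # say they want a job but are not actively looking
civilian_population = 257.0e6       # civilian noninstitutional population, 16 and older

z_pop = (unemployed + part_time_want_full_time + want_job_not_looking) / civilian_population
print(f"Z-Pop: {z_pop:.1%}")        # share of the population underutilized in this broad sense
```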

The Labor Report First Look tool also allows you to dig a bit deeper into Atlanta Fed labor market analysis via links to our Human Capital Data & Tools (which includes the Wage Growth Tracker and Labor Force Dynamics web pages) and links to some of our blog posts on labor market developments and related research. (In fact, it's easy to stay informed of all Labor Report First Look updates by subscribing to our RSS feed or following the Atlanta Fed on Twitter.)

We hope you'll look for the inaugural Labor Report First Look next Friday morning...we know you'll be as excited as we will!

March 6, 2018 in Economic conditions, Employment, Labor Markets | Permalink | Comments (0)

February 28, 2018


Weighting the Wage Growth Tracker

The Atlanta Fed's Wage Growth Tracker (WGT) has shown its usefulness as an indicator of labor market conditions, producing a better-fitting Phillips curve than other measures of wage growth. So we were understandably surprised to see the WGT decline from 3.5 percent in 2016 to 3.2 percent in 2017, even as the unemployment rate moved lower from 4.9 to 4.4 percent.

This unexpected disconnect between the WGT and the unemployment rate naturally led us to wonder if it was a consequence of the way the WGT is constructed. Essentially, the WGT is the median of an unweighted sample of individual wage growth observations. This sample is quite large, but it does not perfectly represent the population of wage and salary earners.

Importantly, the WGT sample has too few young workers, because young workers are much more likely to be in and out of employment and hence less likely to have a wage observation in both the current and prior years. To examine the effect of this underrepresentation, we recomputed median wage growth after weighting the WGT sample to be consistent with the distribution of demographic and job characteristics of the workforce in each year. It turns out that this adjustment is important when the labor market is tight.
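For readers curious about the mechanics, here is a minimal sketch of a weighted median calculation of the kind described above. The wage growth observations and weights are hypothetical, and the actual WGT weighting scheme is more detailed:

```python
import numpy as np

def weighted_median(values, weights):
    """Return the value at which the cumulative weight first reaches half the total."""
    order = np.argsort(values)
    values, weights = np.asarray(values)[order], np.asarray(weights)[order]
    cum = np.cumsum(weights)
    return values[np.searchsorted(cum, 0.5 * cum[-1])]

# Hypothetical 12-month wage growth observations (percent) and population weights that
# up-weight groups underrepresented in the sample (for example, young workers).
wage_growth = np.array([1.0, 2.5, 3.0, 3.5, 6.0, 8.0])
weights     = np.array([1.0, 1.0, 1.0, 1.0, 2.0, 2.0])

print(weighted_median(wage_growth, weights))   # weighted median
print(np.median(wage_growth))                  # unweighted median, as in the standard WGT
```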

During periods of low unemployment, young people who stay employed tend to experience larger proportionate wage bumps than older workers. In 2017, for example, the weighted median was 40 basis points higher than the unweighted version. However, both the unweighted version (the gray line in the chart below) and the weighted version of the WGT (the blue line) declined by a similar amount from 2016 to 2017. The decline in the weighted median is also statistically significant (the p-value for the test is 0.07, indicating that the observed difference is unlikely to be due to chance).

Another issue that could affect comparisons of wage growth over time is the changing demographic characteristics of the workforce. In particular, we know that workers' wage growth tends to slow as they approach retirement age, and the fraction of older workers has increased markedly in recent years. To examine this trend, we re-computed the weighted median, but fixed the demographic and job characteristics of the workforce so they would look as they did in 1997.

Our 1997-fixed version shows that median wage growth in recent years would be a bit higher if not for the aging of the workforce (the dashed orange line in the chart below). Moreover, this demographic shift appears to explain some of the slowing in median wage growth from 2016 to 2017. Although the 1997-fixed median also slows over the year, the decline is not statistically significant (a test of the null hypothesis of no change in the 1997-fixed weighted median between 2016 and 2017 yielded a p-value of 0.38).

Long story short, our analysis suggests that median wage growth of the population of wage and salary earners is currently higher than the WGT would indicate, reflecting the strong wage gains young workers experience in a tight labor market. Moreover, the increasing share of older workers is acting to restrain median wage growth. Although the decline in median wage growth from 2016 to 2017 appears to be partly the result of the aging workforce, there still may be more to it than just that, and so we will continue to monitor the WGT and related measures closely in 2018 for signs of a pickup. We also want to note that with the release of the February wage data in mid-March, we will make a monthly version of the weighted WGT available.

 

February 28, 2018 in Data Releases, Employment, Labor Markets, Wage Growth | Permalink | Comments (0)

February 13, 2018


GDPNow's Forecast: Why Did It Spike Recently?

If you felt whipsawed by GDPNow recently, it's understandable. On February 1, the Atlanta Fed's GDPNow model estimate of first-quarter real gross domestic product (GDP) growth surged from 4.2 percent to 5.4 percent (annualized rates) after the release of the Institute for Supply Management's (ISM) manufacturing report. GDPNow's estimate then fell to 4.0 percent on February 2 after the employment report from the U.S. Bureau of Labor Statistics. GDPNow displayed a similar undulating pattern early in the forecast cycle for fourth-quarter GDP growth.

What accounted for these sawtooth patterns? The answer lies in the treatment of the ISM manufacturing release. To forecast the yet-to-be-released monthly GDP source data apart from inventories, GDPNow uses an indicator of growth in economic activity from a statistical model called a dynamic factor model. The factor is estimated from 127 monthly macroeconomic indicators, many of which are used to estimate the Chicago Fed National Activity Index (CFNAI). Indices like these can be helpful for forecasting macroeconomic data, as demonstrated here and here.
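As a rough illustration of what a common factor is, the sketch below extracts a single activity factor from a simulated panel of monthly indicators using principal components. GDPNow's actual model is a dynamic factor model estimated by maximum likelihood, so this is only a simplified stand-in for the idea:

```python
import numpy as np

# Minimal sketch: extract a common "activity factor" from a panel of monthly indicators
# via principal components. This is illustrative only, not GDPNow's actual model.
rng = np.random.default_rng(0)
n_months, n_indicators = 240, 127
true_factor = rng.standard_normal(n_months).cumsum() * 0.1        # persistent activity factor
loadings = rng.uniform(0.5, 1.5, n_indicators)
panel = np.outer(true_factor, loadings) + rng.standard_normal((n_months, n_indicators))

# Standardize each indicator, then take the first principal component as the factor.
z = (panel - panel.mean(axis=0)) / panel.std(axis=0)
_, _, vt = np.linalg.svd(z, full_matrices=False)
factor = z @ vt[0]
factor = (factor - factor.mean()) / factor.std()                  # mean 0, std 1, like the CFNAI

print(np.corrcoef(factor, true_factor)[0, 1])                     # should be close to +/-1
```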

Perhaps not surprisingly, the CFNAI and the GDPNow factor are highly correlated, as the red and blue lines in the chart below indicate. Both indices, which are normalized to have an average of 0 and a standard deviation of 1, are usually lower in recessions than expansions.

A major difference in the indices is how yet-to-be-released values are handled for months in the recent past that have reported values for some, but not all, of the source data. For example, on February 2, January 2018 values had been released for data from the ISM manufacturing and employment reports but not from the industrial production or retail sales reports. The CFNAI is released around the end of each month when about two-thirds of the 85 indicators used to construct it have reported values for the previous month. For the remaining indicators, the Chicago Fed fills in statistical model forecasts for unreported values. In contrast, the GDPNow factor is updated continuously and extended a month after each ISM manufacturing release. On the dates of the ISM releases, around 17 of the 127 indicators GDPNow uses have reported values for the previous month, with six coming from the ISM manufacturing report.

[Chart: Factor model estimates of growth in U.S. economic activity]

For months with partially missing data, GDPNow updates its factor with an approach similar to the one used in a 2008 paper by economists Domenico Giannone, Lucrezia Reichlin and David Small. That paper describes a dynamic factor model used to nowcast GDP growth similar to the one that generates the New York Fed's staff nowcast of GDP growth. In the Atlanta Fed's GDPNow factor model, the last month of ISM manufacturing data have large weights when calculating the terminal factor value right after the ISM report. These ISM weights decrease significantly after the employment report, when about 50 of the indicators have reported values for the last month of data.

In the above figure, we see that the January 2018 GDPNow factor reading was 1.37 after the February 1 ISM release, the strongest reading since 1994 and well above either its forecasted value of 0.42 prior to the ISM release or its estimated value of 0.43 after the February 2 employment release. The aforementioned rise and decline in the GDPNow forecast of first-quarter growth is largely a function of the rise and decline in the January 2018 estimates of the dynamic factor.

Although the January 2018 reading of 59.2 for the composite ISM purchasing managers index (PMI) was higher than any reading from 2005 to 2016, it was little different from either a consensus forecast from professional economists (58.8) or the forecast from a simple model (58.9) that uses the strong reading in December 2017 (59.3). It was, however, well above the reading the GDPNow dynamic factor model was expecting (54.5).

A possible shortcoming of the GDPNow factor model is that it does not account for the previous month's forecast errors when forecasting the 127 indicators. For example, the predicted composite ISM PMI reading of 54.4 in December 2017 was nearly 5 points lower than the actual value. For this discussion, let's adjust GDPNow's factor model to account for these forecast errors and consider a forecast evaluation period with revised, current-vintage data after 1999. Then, the average absolute error of the 85–90 day-ahead adjusted model forecasts of GDP growth after ISM manufacturing releases (1.40 percentage points) is lower than the average absolute forecast error on those same dates for the standard version of GDPNow (1.49 percentage points). Moreover, the forecasts using the adjusted factor model are significantly more accurate than the GDPNow forecasts, according to a standard statistical test. If we decide to incorporate adjustments to GDPNow's factor model, we will do so at the initial forecast of a quarter's GDP growth and note the change here.
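For intuition, a simple error-correction adjustment of this kind might look like the sketch below. The 0.8 weight on last month's miss is purely hypothetical; GDPNow's actual adjustment is estimated within the factor model:

```python
# Illustrative error-correction adjustment (not GDPNow's actual specification):
# nudge the model's prediction in the direction of last month's forecast miss.
prev_actual, prev_predicted = 59.3, 54.4   # December 2017 composite ISM PMI: actual vs. model forecast
model_prediction_january = 54.5            # the factor model's January expectation before adjustment
error_weight = 0.8                         # hypothetical weight on the previous month's miss

adjusted_prediction = model_prediction_january + error_weight * (prev_actual - prev_predicted)
print(round(adjusted_prediction, 1))       # about 58.4, much closer to the 59.2 January reading
```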

Would the adjustment have made a big difference in the initial first-quarter GDP forecast? The February 1 GDP growth forecast of GDPNow with the adjusted factor model was "only" 4.7 percent. Its current (February 9) forecast of first-quarter GDP growth was the same as the standard version of GDPNow: 4.0 percent. These estimates are still much higher than both the recent trend in GDP growth and the median forecast of 3.0 percent from the Philadelphia Fed's Survey of Professional Forecasters (SPF).

Most of the difference between the GDPNow and SPF forecasts of GDP growth is the result of inventories. GDPNow anticipates inventories will contribute 1.2 percentage points to first-quarter growth, whereas the median SPF projection implies an inventory contribution of only 0.4 percentage points. It's not unusual to see some disagreement between these inventory forecasts, and it wouldn't be surprising if one—or both—of them turned out to be off the mark.



February 13, 2018 in Business Cycles, Forecasts, GDP, Productivity, This, That, and the Other | Permalink | Comments (0)

January 18, 2018


How Low Is the Unemployment Rate, Really?

In 2017, the unemployment rate averaged 4.4 percent. That's quite low on a historical basis. In fact, it's the lowest level since 2000, when unemployment averaged 4.0 percent. But does that mean that the labor market is only 0.4 percentage points away from being as strong as it was in 2000? Probably not. Let's talk about why.

As observed by economist George Perry in 1970, although movement in the aggregate unemployment rate is mostly the result of changes in unemployment rates within demographic groups, demographic shifts can also change the overall unemployment rate even if unemployment within demographic groups has not changed. Adjusting for demographic changes makes for a better apples-to-apples comparison of unemployment today with past rates.

Three large demographic shifts underway since the early 2000s are the rise in the average age and educational attainment of the labor force and the decline in the share of the labor force that is white and non-Hispanic. These changes are potentially important because older workers and workers with more education tend to have lower unemployment rates than their younger and less educated counterparts, and white non-Hispanics tend to have lower unemployment rates than other groups.

The following chart shows the results of a demographic adjustment that jointly controls for year-to-year changes in two sex, three education, four race/ethnicity, and six age labor force groups (see here for more details). Relative to the year 2000, the unemployment rate in 2017 is about 0.6 percentage points lower than it would otherwise have been simply because the demographic composition of the labor force has changed (depicted by the blue line in the chart).
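A stripped-down example of this kind of fixed-weight adjustment is sketched below. The group definitions, unemployment rates, and labor force shares are all hypothetical and chosen only to show how a composition shift can lower the aggregate rate even when no group's rate changes:

```python
# Illustrative fixed-weight (demographically adjusted) unemployment rate.
# Group rates and labor force shares are hypothetical, not actual CPS figures.
rates_2017  = {"young/no college": 0.080, "young/college": 0.040,
               "older/no college": 0.050, "older/college": 0.025}
shares_2000 = {"young/no college": 0.30,  "young/college": 0.15,
               "older/no college": 0.35,  "older/college": 0.20}
shares_2017 = {"young/no college": 0.22,  "young/college": 0.18,
               "older/no college": 0.30,  "older/college": 0.30}

actual_2017   = sum(rates_2017[g] * shares_2017[g] for g in rates_2017)
adjusted_2017 = sum(rates_2017[g] * shares_2000[g] for g in rates_2017)  # hold composition at 2000

print(f"actual 2017 rate:               {actual_2017:.1%}")
print(f"2017 rate at 2000 demographics: {adjusted_2017:.1%}")   # higher: the shift flatters the headline rate
```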

In other words, even though the 2017 unemployment rate is only 0.4 percentage points higher than in 2000, the demographically adjusted unemployment rate (the green line in the chart) is 1.0 percentage points higher. In terms of unemployment, after adjusting for changes in the composition of the labor force, we are not as close to the 2000 level as you might have thought.

The demographic discrepancy is even larger for the broader U6 measure of unemployment, which includes marginally attached and involuntarily part-time workers. The 2017 demographically adjusted U6 rate is 2.5 percentage points higher than in 2000, whereas the unadjusted U6 rate is only 1.5 percentage points higher. That is, on a demographically adjusted basis, the economy had an even larger share of marginally attached and involuntarily part-time workers in 2017 than in 2000.

The point here is that when comparing unemployment rates over long periods, it's advisable to use a measure that is reasonably insulated from demographic changes. However, you should also keep in mind that demographics are only one of several factors that can affect such comparisons. Changes in labor market and social policies, in the mix of industries, and in the technology of how people find work can also change how labor markets function. This is one reason why estimates of the so-called natural rate of unemployment are quite uncertain and subject to revision. For example, participants at the December 2012 Federal Open Market Committee meeting had estimates for the unemployment rate that would prevail over the longer run ranging from 5.2 to 6.0 percent. At the December 2017 meeting, the range of estimates was almost a whole percentage point lower, at 4.3 to 5.0 percent.

January 18, 2018 in Business Cycles, Economic conditions, Labor Markets, Unemployment | Permalink | Comments (0)

January 17, 2018


What Businesses Said about Tax Reform

Many folks are wondering what impact the Tax Cuts and Jobs Act—which was introduced in the House on November 2, 2017, and signed into law a few days before Christmas—will have on the U.S. economy. Well, in a recent speech, Atlanta Fed president Raphael Bostic had this to say: "I'm marking in a positive, but modest, boost to my near-term GDP [gross domestic product] growth profile for the coming year."

Why the measured approach? That might be our fault. As part of President Bostic's research team, we've been curious about the potential impact of this legislation for a while now, especially on how firms were responding to expected policy changes. Back in November 2016 (the week of the election, actually), we started asking firms in our Sixth District Business Inflation Expectations (BIE) survey how optimistic they were (on a 0–100 scale) about the prospects for the U.S. economy and their own firm's financial prospects. We've repeated this special question in three subsequent surveys. For a cleaner, apples-to-apples approach, the charts below show only the results for firms that responded in each survey (though the overall picture is very similar).

As the charts show, firms have become more optimistic about the prospects for the U.S. economy since November 2016, but not since February 2017, and we didn't detect much of a difference in December 2017, after the details of the tax plan became clearer. But optimism is a vague concept and may not necessarily translate into actions that firms could take that would boost overall GDP—namely, increasing capital investment and hiring.

In November, we had two surveys in the field—our BIE survey (undertaken at the beginning of the month) and a national survey conducted jointly by the Atlanta Fed, Nick Bloom of Stanford University, and Steven Davis of the University of Chicago. (That survey was in the field November 13–24.) In both of these surveys, we asked firms how the pending legislation would affect their capital expenditure plans for 2018. In the BIE survey, we also asked how tax reform would affect hiring plans.

The upshot? The typical firm isn't planning on a whole lot of additional capital spending or hiring.

In our national survey, roughly two-thirds of respondents indicated that the tax reform hasn't enticed them into changing their investment plans for 2018, as the following chart shows.

The chart below also makes apparent that small firms (fewer than 100 employees) are more likely to significantly ramp up capital investment in 2018 than midsize and larger firms.

For our regional BIE survey, the capital investment results were similar (you can see them here). And as for hiring, the typical firm doesn't appear to be changing its plans. Interestingly, here too, smaller firms were more likely to say they'd ramp up hiring. Among larger firms (more than 100 employees), nearly 70 percent indicated that they'd leave their hiring plans unchanged.

One interpretation of these survey results is that the potential for a sharp acceleration in GDP growth is limited. And that's also how President Bostic described things in his January 8 speech: "For now, I am treating a more substantial breakout of tax-reform-related growth as an upside risk to my outlook."



January 17, 2018 in Business Cycles, Data Releases, Economic conditions, Economic Growth and Development, Economics, Taxes | Permalink | Comments (0)

January 04, 2018


Financial Regulation: Fit for New Technologies?

In a recent interview, the computer scientist Andrew Ng said, "Just as electricity transformed almost everything 100 years ago, today I actually have a hard time thinking of an industry that I don't think AI [artificial intelligence] will transform in the next several years." Whether AI effects such widespread change so soon remains to be seen, but the financial services industry is clearly in the early stages of being transformed—with implications not only for market participants but also for financial supervision.

Some of the implications of this transformation were discussed in a panel at a recent workshop titled "Financial Regulation: Fit for the Future?" The event was hosted by the Atlanta Fed and cosponsored by the Center for the Economic Analysis of Risk at Georgia State University (you can see more on the workshop here and here). The presentations included an overview of some of AI's implications for financial supervision and regulation, a discussion of AI-related issues from a supervisory perspective, and a look at the application of AI to loan evaluation.

As a part of the panel titled "Financial Regulation: Fit for New Technologies?," I gave a presentation based on a paper I wrote that explains AI and discusses some of its implications for bank supervision and regulation. In the paper, I point out that AI is capable of very good pattern recognition—one of its major strengths. The ability to recognize patterns has a variety of applications, including credit risk measurement, fraud detection, investment decisions and order execution, and regulatory compliance.

At the same time, I observed that machine learning (ML), the more popular part of AI, has some important weaknesses. In particular, ML can be considered a form of statistics and thus suffers from the same limitations. For example, ML can provide information only about phenomena already present in the data. Another limitation is that although machine learning can identify correlations in the data, it cannot prove the existence of causality.

This combination of strengths and weaknesses implies that ML might provide supervisors with new insights into the workings of the financial system, which they can evaluate alongside other information. However, ML's inability to attribute causality suggests that it cannot be naively applied to the writing of binding regulations.

John O'Keefe from the Federal Deposit Insurance Corporation (FDIC) focused on some particular challenges and opportunities raised by AI for banking supervision. Among the challenges O'Keefe discussed is how supervisors should give guidance on and evaluate the application of ML models by banks, given the speed of developments in this area.

On the other hand, O'Keefe observed that ML could assist supervisors in performing certain tasks, such as off-site identification of insider abuse and bank fraud, a topic he explores in a paper with Chiwon Yom, also at the FDIC. The paper examines two ML techniques: neural networks and Benford's Digit Analysis. The premise underlying Benford's Digit Analysis is that the digits resulting from a nonrandom number selection may differ significantly from expected frequency distributions. Thus, if a bank is committing fraud, the accounting numbers it reports may deviate significantly from what would otherwise be expected. Their preliminary analysis found that Benford's Digit Analysis could help bank supervisors identify fraudulent banks.
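As a rough illustration of the idea behind a Benford-style digit test, the sketch below compares the first digits of a handful of hypothetical account values with Benford's expected frequencies. The figures and the simple chi-square comparison are purely illustrative, not the procedure used in the FDIC paper:

```python
import math
from collections import Counter

# Expected first-digit frequencies under Benford's law: P(d) = log10(1 + 1/d).
benford = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

# Hypothetical reported account values (a real test would use far more observations).
reported = [1203, 45, 9.7, 318, 2260, 14.2, 77, 150, 39, 4.1, 870, 23]
first_digits = [int(str(x).lstrip("0.")[0]) for x in reported]
observed = Counter(first_digits)

n = len(first_digits)
chi_sq = sum((observed.get(d, 0) - n * benford[d]) ** 2 / (n * benford[d]) for d in range(1, 10))
print(f"chi-square vs. Benford: {chi_sq:.2f}")   # unusually large values flag suspicious digit patterns
```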

Financial firms have been increasingly employing ML in their business areas, including consumer lending, according to the third participant in the panel, Julapa Jagtiani from the Philadelphia Fed. One consequence of this use of ML is that it has allowed both traditional banks and nonbank fintech firms to become important providers of loans to both consumers and small businesses in markets in which they do not have a physical presence.

ML also has the potential to measure a borrower's credit risk more effectively than a consumer credit rating (such as a FICO score) alone would allow. In a paper with Catharine Lemieux from the Chicago Fed, Jagtiani explores the credit ratings produced by the Lending Club, an online lender that has become the largest lender for personal unsecured installment loans in the United States. They find that the correlation between FICO scores and Lending Club rating grades has steadily declined from around 80 percent in 2007 to a little over 35 percent in 2015.

It appears that the Lending Club is increasingly taking advantage of alternative data sources and ML algorithms to evaluate credit risk. As a result, the Lending Club can price a loan's risk more accurately than a simple FICO score-based model would allow. Taken together, the presentations made clear that AI is likely to transform many aspects of the financial sector as well.

January 4, 2018 in Banking, Financial System, Regulation | Permalink | Comments (0)

January 03, 2018


Is Macroprudential Supervision Ready for the Future?

Virtually everyone agrees that systemic financial crises are bad not only for the financial system but, even more importantly, for the real economy. The disagreements arise over how best to reduce the risk and costliness of future crises. One important area of disagreement is whether macroprudential supervision alone is sufficient to maintain financial stability or whether monetary policy should also play an important role.

In an earlier Notes from the Vault post, I discussed some of the reasons why many monetary policymakers would rather not take on the added responsibility. For example, policymakers would have to determine the appropriate measure of the risk of financial instability and how a change in monetary policy would affect that risk. However, I also noted that many of the same problems plague the implementation of macroprudential policies.

Since that September 2014 post, additional work has been done on macroprudential supervision. Some of that work was the topic of a recent workshop, "Financial Regulation: Fit for the Future?," hosted by the Atlanta Fed and cosponsored by the Center for the Economic Analysis of Risk at Georgia State University. In particular, the workshop looked at three important issues related to macroprudential supervision: governance of macroprudential tools, measures of when to deploy macroprudential tools, and the effectiveness of macroprudential supervision. This macroblog post discusses some of the contributions of three presentations at the conference.

The question of how to determine when to deploy a macroprudential tool is the subject of a paper by economists Scott Brave (from the Chicago Fed) and José A. Lopez (from the San Francisco Fed). The tool they consider is countercyclical capital buffers, which are supplements to normal capital requirements that are put into place during boom periods to dampen excessive credit growth and provide banks with larger buffers to absorb losses during a downturn.

Brave and Lopez start with existing financial conditions indices and use them to estimate the probability that the economy will transition from growth to falling gross domestic product (GDP) and, conversely, from recession back to growth. Their model predicted a very high probability of transition to a path of falling GDP in the fourth quarter of 2007, a low probability of transitioning to a falling path in the fourth quarter of 2011, and a low but slightly higher probability in the fourth quarter of 2015.
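The general flavor of such a model can be illustrated with a toy logistic mapping from a financial conditions index to a transition probability. The index readings and coefficients below are hypothetical, and this is not the Brave-Lopez specification:

```python
import math

# Toy logistic mapping from a financial conditions index (FCI) to the probability of
# transitioning from growth to falling GDP. Coefficients and FCI readings are hypothetical.
def transition_probability(fci, intercept=-2.0, slope=1.5):
    return 1.0 / (1.0 + math.exp(-(intercept + slope * fci)))

for quarter, fci in [("2007Q4", 2.0), ("2011Q4", -0.3), ("2015Q4", 0.1)]:
    print(quarter, round(transition_probability(fci), 2))   # high in 2007Q4, low afterward
```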

Brave and Lopez then put these probabilities into a model of the costs and benefits associated with countercyclical capital buffers. Looking back at the fourth quarter of 2007, their results suggest that supervisors should have immediately adopted an increase in capital requirements of 25 basis points. In contrast, in the fourth quarters of both 2011 and 2015, their results indicated that no immediate change was needed but that an increase in capital requirements of 25 basis points might need to be adopted within the next six or seven quarters.

The related question—who should determine when to deploy countercyclical capital buffers—was the subject of a paper by Nellie Liang, an economist at the Brookings Institution and former head of the Federal Reserve Board's Division of Financial Stability, and Federal Reserve Board economist Rochelle M. Edge. They find that most countries have a financial stability committee, which has an average of four or more members and is primarily responsible for developing macroprudential policies. Moreover, these committees rarely have the ability to adopt countercyclical macroprudential policies on their own. Indeed, in most cases, all the financial stability committee can do is recommend policies. The committee cannot even compel the competent regulatory authority in its country to either take action or explain why it chose not to act.

Implicit in the two aforementioned papers is the belief that countercyclical macroprudential tools will effectively reduce risks. Federal Reserve Board economist Matteo Crosignani presented a paper he coauthored looking at the recent effectiveness of two such tools in Ireland.

In February 2015, the Irish government watched as housing prices climbed from their postcrisis lows at a potentially unsafe rate. In an attempt to limit the flow of funds into risky mortgage loans, the government imposed limits on the maximum permissible loan-to-value (LTV) ratio and loan-to-income ratio (LTI) for new mortgages. These regulations became effective immediately upon their announcement and prevented the Irish banks from making loans that violated either the LTV or LTI requirements.
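As a concrete (and simplified) illustration of what such limits mean for an individual loan, the check below applies placeholder caps of 80 percent LTV and 3.5 times income; the actual Irish thresholds varied by borrower type:

```python
# Illustrative check of a new mortgage against LTV and LTI caps.
# The 80 percent LTV and 3.5x LTI limits are placeholders, not the exact Irish thresholds.
def conforms(loan, property_value, gross_income, ltv_cap=0.80, lti_cap=3.5):
    """Return True if the loan satisfies both the loan-to-value and loan-to-income caps."""
    return loan / property_value <= ltv_cap and loan / gross_income <= lti_cap

print(conforms(loan=200_000, property_value=300_000, gross_income=60_000))  # True: LTV 0.67, LTI 3.3
print(conforms(loan=270_000, property_value=300_000, gross_income=60_000))  # False: LTV 0.90 breaches the cap
```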

Crosignani and his coauthors were able to measure a large decline in loans that did not conform to the new requirements. However, they also find that a sharp increase in mortgage loans that conformed to the requirements largely offset this drop. Additionally, Crosignani and his coauthors find that the banks most exposed to the LTV and LTI requirements sought to recoup the lost income by making riskier commercial loans and buying greater quantities of risky securities. Their findings suggest that the regulations may have stopped higher-risk mortgage lending but that other changes in the banks' portfolios at least partially undid the effect on their risk exposure.

January 3, 2018 in Banking, Financial System, Regulation | Permalink | Comments (0)

November 15, 2017


Labor Supply Constraints and Health Problems in Rural America

A recent research study by Alison Weingarden at the Federal Reserve's Board of Governors found that wages for relatively low-skilled workers in nonmetropolitan areas of the country have been growing more rapidly than those in metropolitan areas. In a talk yesterday in Montgomery, Alabama, Atlanta Fed President Raphael Bostic provided some evidence that differences in labor supply resulting from disability and illness may be behind this shrinking urban wage premium.

For prime-age workers (those between 25 and 54 years old), the dynamics of labor force participation (LFP) differ widely between metropolitan and nonmetropolitan areas. (The data are classified by whether a person lives in a metropolitan statistical area, or MSA.) The LFP rate in MSAs declined by about 1.1 percentage points between 2007 and 2017, versus a 3.3 percentage point decline in non-MSA areas.

The disparity is also evident within education groups. For those without a college degree, the MSA LFP rate is down 2.6 percentage points, versus 5.0 percentage points in non-MSAs. For those with a college degree, the MSA LFP rate is down 0.7 percentage points, versus a decline of 2.5 percentage points for college graduates in non-MSAs. Moreover, although LFP rates in MSAs have shown signs of recovery in the last couple of years, this is not happening in non-MSAs.

A recent macroblog post by my colleague Ellyn Terry and the Atlanta Fed's updated Labor Force Dynamics web page have shown that the decline in prime-age LFP is partly a story of nonparticipation resulting from a rise in health and disability problems that limit the ability to work. This rise is occurring even as the population is gradually becoming more educated. (Better health outcomes generally accompany increased educational attainment.)

The following chart explores the role of disability/illness in explaining the relatively larger decline in non-MSA LFP. It breaks the cumulative change in the LFP rates since 2007 into the part attributable to demographic trends and the part attributable to behavioral or cyclical changes within demographic groups.

The demographic changes—and especially the increased share of the population with a college degree—have put mild upward pressure on the prime-age LFP rate for both the MSA and non-MSA populations. Controlling for the contribution from these demographic trends, increased nonparticipation because of poor health and disability pulled down the LFP rate in MSAs by 0.8 percentage points and lowered the rate in non-MSAs by 2.0 percentage points over the past decade. For those without a college degree, disability/illness accounted for about 1.2 percentage points of the 2.6 percentage point decline in the MSA participation rate, and it accounted for 2.6 percentage points of the 5.0 percentage point decline in the non-MSA participation rate.

Taken together with evidence from business surveys and anecdotal reports about hiring difficulties, it appears that the non-MSA labor market is relatively tight. The greater inward shift of the rural labor supply is showing through to wage costs, especially for rural jobs that require less education.

Although the move to higher wages is welcome news for those with a job, it also raises troubling questions about why labor force nonparticipation because of disability and illness has increased so much in the first place—especially among those with less education living in nonmetropolitan areas of the country.

It is clear that the health problems for rural communities have been intensifying. Several interrelated factors have likely contributed to this worsening trend, including poverty, deeply rooted cultural and social norms, and the characteristics of rural jobs, as well as geographic barriers and shortages of healthcare providers that have limited access to care. This complex set of circumstances suggests that finding effective solutions could prove difficult.

November 15, 2017 in Health Care, Labor Markets, Unemployment | Permalink | Comments (0)
