macroblog

About


The Atlanta Fed's macroblog provides commentary and analysis on economic topics including monetary policy, macroeconomic developments, inflation, labor economics, and financial issues.

Authors for macroblog are Dave Altig, John Robertson, and other Atlanta Fed economists and researchers.


August 11, 2016


Forecasting Loan Losses for Stress Tests

Bank capital requirements are back in the news with the recent announcements of the results of U.S. stress tests by the Federal Reserve and the European Union (E.U.) stress tests by the European Banking Authority (EBA). The Federal Reserve found that all 33 of the bank holding companies participating in its test would have continued to meet the applicable capital requirements. The EBA found progress among the 51 banks in its test, but it did not define a pass/fail threshold. In summarizing the results, EBA Chairman Andrea Enria is widely quoted as saying, "Whilst we recognise the extensive capital raising done so far, this is not a clean bill of health," and that there remains work to do.

The results of the stress tests do not mean that banks could survive any possible future macroeconomic shock. That standard would be an extraordinarily high one and would require each bank to hold capital equal to its total assets (or maybe even more if the bank held derivatives). However, the U.S. approach to scenario design is intended to make sure that the "severely adverse" scenario is indeed a very bad recession.

The Federal Reserve's Policy Statement on the Scenario Design Framework for Stress Testing indicates that the severely adverse scenario will have an unemployment increase of between 3 and 5 percentage points or a level of 10 percent overall. That statement observes that during the last half century, the United States has seen four severe recessions with that large an increase in the unemployment rate, with the rate peaking at more than 10 percent in the last three of them.

To forecast the losses from such a severe recession, the banks need to estimate loss models for each of their portfolios. In these models, the bank estimates the expected loss associated with a portfolio of loans as a function of the variables in the scenario. In estimating these models, banks often have a very large number of loans with which to estimate losses in their various portfolios, especially the consumer and small business portfolios. However, they have very few opportunities to observe how the loans perform in a downturn. Indeed, in almost all cases, banks started keeping detailed loan loss data only in the late 1990s and, in many cases, later than that. Thus, for many types of loans, banks might have at best data for only the relatively mild recession of 2001–02 and the severe recession of 2007–09.
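To make the idea concrete, here is a minimal sketch of such a loss model—entirely hypothetical numbers, a single scenario variable, and ordinary least squares—that fits a portfolio's loss rate to changes in the unemployment rate and then projects losses along a severely adverse path. Real stress-test models use many more scenario variables and far richer loan-level data.

```python
import numpy as np

# Hypothetical history: quarterly changes in the unemployment rate (percentage
# points) and the portfolio's annualized loss rate (percent).
d_unemp   = np.array([-0.2, 0.0, 0.1, 0.6, 1.2, 0.9, 0.3, -0.1, -0.3, -0.2])
loss_rate = np.array([ 0.4, 0.5, 0.6, 1.1, 2.0, 1.8, 1.0,  0.7,  0.5,  0.4])

# Fit loss_rate = a + b * d_unemp by ordinary least squares.
X = np.column_stack([np.ones_like(d_unemp), d_unemp])
a, b = np.linalg.lstsq(X, loss_rate, rcond=None)[0]

# Severely adverse path: unemployment rises 5 percentage points over 8 quarters.
scenario = np.array([1.0, 1.0, 0.8, 0.7, 0.5, 0.4, 0.3, 0.3])
projected = a + b * scenario

print(f"fitted model: loss rate = {a:.2f} + {b:.2f} x change in unemployment")
print("projected quarterly loss rates (annualized, %):", np.round(projected, 2))
print(f"implied cumulative loss on a $10 billion book: "
      f"${projected.sum() / 4 / 100 * 10e9:,.0f}")
```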

Perhaps the small number of recessions—especially severe recessions—would not be a big problem if recessions differed only in their depth and not their breadth. However, even comparably severe recessions are likely to hit different parts of the economy with varying degrees of severity. As a result, a given loan portfolio may suffer only small losses in one recession but take very large losses in the next recession.

With the potential for models to underestimate losses given the small number of downturns to calibrate to, the stress testing process allows humans to make judgmental changes (or overlays) to model estimates when those estimates seem implausible. However, the Federal Reserve requires that bank holding companies have a "transparent, repeatable, well-supported process" for the use of such overlays.

My colleague Mark Jensen recently made some suggestions about how stress test modelers could reduce the uncertainty around projected losses that arises from the limited data on directly comparable scenarios. He recommends using estimation procedures based on a probability theorem attributed to the Reverend Thomas Bayes. When applied to stress testing, Bayes' theorem describes how to incorporate additional empirical information into an initial understanding of how losses are distributed in order to update and refine loss predictions.

One of the benefits of techniques based on this theorem is that they allow the incorporation of any relevant data into the forecasted losses. Jensen gives the example of using foreign data to help model the distribution of losses U.S. banks would incur if U.S. interest rates became negative. We have no experience with negative interest rates, but Sweden has recently been accumulating experience that could help in predicting such losses in the United States. Jensen argues that Bayesian techniques allow banks and bank supervisors to better account for the uncertainty around their loss forecasts in extreme scenarios.
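As a rough illustration of the mechanics—my own stylized example, not Jensen's model—the sketch below starts from a prior on a portfolio's stress-scenario default rate built from sparse domestic downturn data and updates it with a hypothetical batch of observations from a comparable foreign episode, using a conjugate beta-binomial calculation. The point is simply that the posterior interval is tighter than the prior one.

```python
from scipy import stats

# Prior: sparse U.S. downturn experience, summarized as a Beta distribution over
# the stress-scenario default rate (roughly 40 defaults in 1,000 loans observed).
prior_a, prior_b = 40, 960

# Hypothetical evidence from a comparable foreign stress episode:
# 120 defaults among 1,500 similar loans.
foreign_defaults, foreign_loans = 120, 1500

# Conjugate beta-binomial update: add the observed counts to the prior counts.
post_a = prior_a + foreign_defaults
post_b = prior_b + (foreign_loans - foreign_defaults)

prior, post = stats.beta(prior_a, prior_b), stats.beta(post_a, post_b)
for name, dist in [("prior", prior), ("posterior", post)]:
    lo, hi = dist.ppf(0.025), dist.ppf(0.975)
    print(f"{name:>9} mean default rate {dist.mean():.3f}, "
          f"95% interval ({lo:.3f}, {hi:.3f})")
```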

Additionally, I have previously argued that the existing capital standards provide a further way of mitigating the weaknesses in the stress tests. The large banks that participate in the stress tests are also in the process of becoming subject to a risk-based capital requirement commonly called Basel III, which was approved by an international committee of banking supervisors after the financial crisis. Basel III uses a different methodology to estimate losses in a severe event, one in which the historical losses in a loan portfolio provide the parameters of a loss distribution. While Basel III faces the same problem of limited loan loss data—so it almost surely underestimates some risks—those errors are likely to be somewhat different from those produced by the stress tests. Hence, the use of both measures is likely to somewhat reduce the possibility that supervisors end up requiring too little capital for some types of loans.

Both the stress tests and risk-based models of the Basel III type face the unavoidable problem of inaccurately measuring risk because we have limited data from extreme events. The use of improved estimation techniques and multiple ways of measuring risk may help mitigate this problem. But the only way to solve the problem of limited data is to have a greater number of extreme stress events. Given that alternative, I am happy to live with imperfect measures of bank risk.

Author's note: I want to thank the Atlanta Fed's Dave Altig and Mark Jensen for helpful comments.


August 11, 2016 in Banking, Financial System, Regulation | Permalink

Comments

When looking at these short duration data sets on losses, you have to go back to the fundamentals of the situation.

Housing had a near 50 year long series of only regional losses. The statistical analysts (which was everybody except the bond bankers) assumed therefore losses were highly unlikely.

Meanwhile, fundamentals were stagnant wages, stagnant lifetime incomes, increasing share going to housing, education, loss of retirement. The fundamental analyst could see the losses were already there with certainty. The fundamental analyst though could not say when they would show up.

And when they did, all the risks correlated.

You guys should promote separating risk business from deposit taking, full stop.

Posted by: john | August 12, 2016 at 09:05 AM


June 06, 2016


After the Conference, Another Look at Liquidity

When it comes to assessing the impact of central bank asset purchase programs (often called quantitative easing or QE), economists tend to focus their attention on the potential effects on the real economy and inflation. After all, the Federal Reserve's dual mandate for monetary policy is price stability and full employment. But there is another aspect of QE that may also be quite important in assessing its usefulness as a policy tool: the potential effect of asset purchases on financial markets through the collateral channel.

Asset purchase programs involve central bank purchases of large quantities of high-quality, highly liquid assets. Postcrisis, the Fed has purchased more than $3 trillion of U.S. Treasury securities and agency mortgage-backed securities, the European Central Bank (ECB) has purchased roughly 727 billion euros' worth of public-sector bonds (issued by central governments and agencies), and the Bank of Japan is maintaining an annual purchase target of 80 trillion yen. These bonds are not merely assets held by investors to realize a return; they are also securities highly valued for their use as collateral in financial transactions. The Atlanta Fed's 21st annual Financial Markets Conference explored the potential consequences of these asset purchase programs in the context of financial market liquidity.

The collateral channel effect focuses on the role that these low-risk securities play in the plumbing of U.S. financial markets. Financial firms fund a large fraction of their securities holdings in the repurchase (or repo) markets. Repurchase agreements are legally structured as the sale of a security with a promise to repurchase the security at a fixed price at a given point in the future. The economics of this transaction are essentially similar to those of a collateralized loan.

The sold and repurchased securities are often termed "pledged collateral." In these transactions, which are typically overnight, the lender will ordinarily lend cash equal to only a fraction of the security's value, with the remaining unfunded portion called the "haircut." The size of the haircut is inversely related to the safety and liquidity of the security, with Treasury securities requiring the smallest haircuts. When the securities are repurchased the following day, the borrower pays back the initial cash plus an additional amount of interest determined by the repo rate. The repo rate is essentially an overnight interest rate paid on a collateralized loan.
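For concreteness, here is the arithmetic of a single stylized overnight repo—the collateral value, haircut, and repo rate below are made-up numbers chosen only to show how the pieces fit together.

```python
# Stylized overnight repo against Treasury collateral (all figures made up).
collateral_value = 10_000_000   # market value of the pledged Treasuries
haircut = 0.02                  # lender funds only 98 percent of that value
repo_rate = 0.005               # annualized overnight repo rate
day_count = 360                 # money-market day-count convention

cash_lent = collateral_value * (1 - haircut)
overnight_interest = cash_lent * repo_rate / day_count
repurchase_price = cash_lent + overnight_interest

print(f"cash lent against the collateral:     ${cash_lent:,.2f}")
print(f"one day of interest at the repo rate: ${overnight_interest:,.2f}")
print(f"repurchase price the next morning:    ${repurchase_price:,.2f}")
```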

Central bank purchases of Treasury securities may have a multiplicative effect on the potential efficiency of the repo market because these securities are often used in a chain of transactions before reaching a final holder for the evening. Here's a great diagram presented by Phil Prince of Pine River Capital Management illustrating the role that bonds and U.S. Treasuries play in facilitating a variety of transactions. In this example, the UST (U.S. Treasury) securities are first used as collateral in an exchange between the UST securities lender and the globally systemically important financial institution (GSIFI bank/broker dealer), then between the GSIFI bank and the cash provider, a money market mutual fund (MMMF), corporation, or sovereign wealth fund (SWF). The reuse of the UST collateral reduces the funding cost of the GSIFI bank and, hence, the cost to the levered investor/hedge fund who is trying to exploit discrepancies in the pricing of a corporate bond and stock.

Just how important or large is this pool of reusable collateral? Manmohan Singh of the International Monetary Fund presented the following charts, depicting the pledged collateral at major U.S. and European financial institutions that can be reused in other transactions.

So how do central bank purchases of high-quality, liquid assets affect the repo market—and why should macroeconomists care? In his presentation, Marvin Goodfriend of Carnegie Mellon University concluded that central bank asset purchases, which he terms "pure monetary policy," lower short-term interest rates (especially bank-to-bank lending) but increase the cost of funding illiquid assets through the repo market. And Singh noted that repo rates are an important part of the constellation of short-term interest rates and directly link overnight markets with the longer-term collateral being pledged. Thus, the interaction between a central bank's interest-rate policy and its balance sheet policy is an important aspect of the transmission of monetary policy to longer-term interest rates and real economic activity.

Ulrich Bindseil, director of general market operations at the ECB, discussed a variety of ways in which central bank actions may affect, or be affected by, bond market liquidity. One way that central banks may mitigate any adverse impact on market liquidity is through their securities lending programs, according to Bindseil. Central banks use such programs to lend particular bonds back out to the market to "provide a secondary and temporary source of securities to the financing market...to promote smooth clearing of Treasury and Agency securities."

On June 2, for example, the New York Fed lent $17.8 billion of UST securities from the Fed's portfolio. These operations are structured as collateral swaps—dealers pledge other U.S. Treasury bonds as collateral with the Fed. During the financial crisis, the Federal Reserve used an expanded version of its securities lending program called the Term Securities Lending Facility to allow firms to replace lower-quality collateral that was difficult to use in repo transactions with Treasury securities.

Finally, the Fed currently releases some bonds to the market each day in return for cash through its overnight reverse repo operations, a supplementary facility used to support control of the federal funds rate as the Federal Open Market Committee proceeds with normalization. However, this release has an important limitation: these operations are conducted in the triparty repo market, and the bonds released through them can be reused only within that market. In contrast, if the Fed were to sell its U.S. Treasuries, the securities could be used not only in the triparty repo market but also as collateral in other transactions, including ones in the bilateral repo market (you can read more on these markets here). As long as central bank portfolios remain large and continue to grow, as in Europe and Japan, policymakers are integrally linked to the financial plumbing at its most basic level.

To see a video of the full discussion of these issues as well as other conference presentations on bond market liquidity, market infrastructure, and the management of liquidity within financial institutions, please visit Getting a Grip on Liquidity: Markets, Institutions, and Central Banks. My colleague Larry Wall's conference takeaways on the elusive definition of liquidity, along with the impact of innovation and regulation on liquidity, are here.

June 6, 2016 in Banking, Financial System, Interest Rates, Monetary Policy | Permalink


April 13, 2016


Putting the MetLife Decision into an Economic Context

In a recently released decision, a U.S. district court has ruled that the Financial Stability Oversight Council's (FSOC's) decision to designate MetLife as a potential threat to financial stability was "arbitrary and capricious" and rescinded that designation. This decision raises many questions, among them:

  • Why did MetLife sue to end its status as a too-big-to-fail (TBTF) firm?
  • How will this decision affect the Federal Reserve's regulation of nonbank financial firms?
  • What else can be done to reduce the risk of crisis arising from nonbank financial firms?

Why does MetLife want to end its TBTF status?
An often-expressed concern is that market participants will consider FSOC-designated firms too big to fail, and investors will accord these firms lower risk premiums (see, for example, Peter J. Wallison). The result is that FSOC-designated firms will gain a competitive advantage. If so, why did MetLife sue to have the designation rescinded? And why did the announcement of the court's determination result in an immediate 5 percent increase in MetLife's stock price?

One possible explanation is that the FSOC's designation guarantees the firm will be subject to higher regulatory costs, but it only marginally changes the likelihood it would receive a government bailout. The Dodd-Frank Act (DFA) requires that FSOC-designated firms be subject to consolidated prudential supervision by the Federal Reserve using standards that are more stringent than the requirements for other nonbank financial firms.

Moreover, the argument that such designation automatically conveys a competitive advantage has at least two weaknesses. First, although Title II of the DFA authorizes the Federal Deposit Insurance Corporation (FDIC) to resolve a failing nonbank firm in certain circumstances, DFA does not provide FDIC insurance for any of the nonbank firm's liabilities, nor does it provide the FDIC with funds to undertake a bailout. The FDIC is supposed to recover its costs from the failed firm's assets. Admittedly, DFA does allow for the possibility that the FDIC would need to assess other designated firms for part of the cost of a resolution. However, MetLife could as easily have been assessed to pay for another firm as it could have been the beneficiary of assessments on other systemically important firms.

A second potential weakness in the competitive advantage argument is that the U.S. Treasury Secretary can invoke FDIC resolution only after receiving a recommendation from the Federal Reserve Board and one other federal financial regulatory agency (depending upon the type of failing firm). Invocation of resolution is not automatic. Moreover, any decision authorizing FDIC resolution must include findings that, at the time of authorization:

  • the firm is in default or in danger of default,
  • resolution under other applicable law (bankruptcy statutes) would have "serious adverse consequences" on financial stability, and
  • those adverse effects could be avoided or mitigated by FDIC resolution.

Although it would seem logical that FSOC-designated firms are more likely to satisfy these criteria than other financial firms, the Title II criteria for FDIC resolution are the same for both types of firms.

How does this affect the Fed's regulation of nonbank firms?
Secretary of the Treasury Jack Lew has indicated his strong disagreement with the district court's decision, and the U.S. Treasury has said it will appeal. Suppose, however, that FSOC designation ultimately does become far more difficult. How significantly would that affect the Federal Reserve's regulatory power over nonbank financial firms?

Although the obvious answer would be that it would greatly reduce the Fed's regulatory power, recent experience casts some doubt on this view. Nonbank financial firms appear to regard FSOC designation as imposing costly burdens that substantially exceed any benefits they receive. Indeed, GE Capital viewed the costs as so significant that it had been selling large parts of its operations and recently petitioned the FSOC to rescind its designation. Unless systemically important activities are a core part of the firm's business model, nonbank financial firms may decide to avoid undertaking activities that would risk FSOC designation.

Thus, a plausible set of future scenarios is that the Federal Reserve would be supervising few, if any, nonbank financial firms regardless of the result of the MetLife case. Rather, ultimate resolution of the case may have more of an impact on whether large nonbank financial firms conduct systemically important activities (if designation becomes much harder) or the activities are conducted by some combination of smaller nonbank financial firms and by banks that are already subject to Fed regulation (if the ruling does not prevent future designations).

Lessons learned?
Regardless of how the courts and the FSOC respond to this recent judicial decision, the financial crisis should have taught us valuable lessons about the importance of the nonbank financial sector to financial stability. However, those lessons should go beyond merely the need to impose prudential supervision on any firms that are systemically important.

The cause of the financial crisis was not the failure of one or two large nonbank financial firms. Rather, the cause was that almost the entire financial system stood on the brink of collapse because almost all the major participants were heavily exposed to the weak credit standards that were pervasive in the residential real estate business. Yet if the real problem was the risk of multiple failures as a result of correlated exposures to a single large market, perhaps we ought to invest more effort in evaluating the riskiness of markets that could have systemic consequences.

In an article in Notes from the Vault and other forums, I have called for systematic end-to-end reviews of major financial markets starting with the origination of the risks and ending with the ultimate holder(s) of the risks. This analysis would involve both quantitative analysis of risk measures and qualitative analysis of the safeguards designed to reduce risk.

The primary goal would be to identify and try to correct weaknesses in the markets. A secondary goal would be to give the authorities a better sense of where problems are likely to arise if a market does encounter problems.


April 13, 2016 in Banking, Regulation | Permalink

Comments

Looking at market micro-structure is an excellent idea. One might even be able to look at the incentives of the different participants, ala Ashcraft and Schuermann

https://www.newyorkfed.org/medialibrary/media/research/staff_reports/sr318.pdf

Posted by: Brian Peters | April 13, 2016 at 04:23 PM


April 11, 2016


The Rise of Shadow Banking in China

China's banking system has suffered significant losses over the past two years, which has raised concerns about the health of China's financial industry. Such losses are perhaps not all that surprising. Commercial banks have been increasing their risk-taking activities in the form of shadow lending. See, for example, here, here, and here for some discussion of the evolution of China's shadow banking system.

The increase in risk taking by banks has occurred despite a rapid decline in money growth since 2009 and despite the People's Bank of China's efforts to limit credit expansion to real estate and other industries that appear to suffer from overcapacity.

One area of expanded activity has been investment in asset-backed "securities" by China's large non-state banks. This investment has created potentially significant risks to the balance sheets of these institutions (see the charts below). Using the micro-transaction-based data on shadow entrusted loans, Chen, Ren, and Zha (2016) have provided theoretical and empirical insights into this important issue (see also this Vox article that summarizes the paper).

Recent regulatory reforms in China have taken a positive step to try to limit such risk-taking behavior, although the success of these efforts remains to be seen. An even more challenging task lies ahead for designing a comprehensive and sustainable macroprudential framework to support the healthy functioning of China's traditional and shadow banking industries.

April 11, 2016 in Asia, Banking, Regulation | Permalink


March 15, 2016


Collateral Requirements and Nonbank Online Lenders: Evidence from the 2015 Small Business Credit Survey

Businesses can secure a bank loan by offering collateral—typically a business asset such as equipment or real estate. However, the recently released 2015 Small Business Credit Survey (SBCS) Report on Employer Firms, conducted by seven regional Reserve Banks, found that 63 percent of business owners who had borrowed also used personal assets or a personal guarantee to secure financing. Surprisingly, the use of personal collateral was not limited to startups: older and relatively larger small firms (see the following chart) also relied heavily on personal assets.

Source: 2015 Small Business Credit Survey
Note: "Unsure", "None", and "Other" were also options but are not shown on the chart.

Alternative lending options also exercised
Not every small business owner has sufficient hard assets, such as real estate or equipment, that can be used as collateral to secure a traditional bank loan or line of credit. For these circumstances, there are options such as credit cards and products offered by nonbank lenders (mostly operating online) that have less stringent underwriting requirements than banks. Many online nonbank lenders advertise unsecured loans or require only a general lien on business assets, without valuing those business assets.

In the 2015 SBCS, 20 percent of small firms seeking loans or lines of credit applied at nonbank online lenders. These lenders have a good reputation for quick application turnaround, and the collateral requirements can be looser than those applied by traditional lenders. But when borrowers were asked about their overall experience, businesses approved at nonbank online lenders gave them a net satisfaction score of only 15 percent (40.6 percent were satisfied and 25.3 percent were dissatisfied). In contrast, small banks received a relatively high net satisfaction score of 75 percent (see the chart).

Source: 2015 Small Business Credit Survey Report on Employer Firms
1 Satisfaction score is the share satisfied with the lender minus the share dissatisfied.
2 "Online lenders" are defined as alternative and marketplace lenders, including Lending Club, OnDeck, CAN Capital, and PayPal Working Capital.
3 "Other" includes government loan funds and community development financial institutions.

The survey also showed that high interest rates were the primary reason for dissatisfaction at nonbank online lenders (see the chart).

Source: 2015 Small Business Credit Survey Report on Employer Firms
Note: Respondents could select multiple options. Select responses shown due to low observation count.

Merchant cash advances make advances
Most applicants to nonbank online lenders were seeking loans and lines of credit, but some were seeking a product that tends to be particularly expensive relative to other financing options: the merchant cash advance (MCA). MCAs have been around for decades, but their popularity has risen in the wake of the financial crisis. An MCA is typically a lump-sum payment in exchange for a portion of future credit card sales, and its terms can be enticing because repayment seems easier than paying off a structured business loan that requires a fixed monthly payment. Instead, the lender is paid back as the business generates revenue, in theory making cash flow easier to manage.

One potential challenge for users of MCA products is interpreting the repayment terms. Instead of displaying an annual percentage rate (APR), MCAs are usually advertised with a "buy rate" (typically 1.2 to 1.4). For example, a buy rate of 1.3 on $100,000 would require the borrower to pay back $130,000. However, a percentage of the principal is not the same as an APR. The table below compares total interest payments made on a 1.3 MCA versus a 30 percent APR business loan repaid over 12 months and over six months. With a 12-month business loan, a 30 percent APR would equal total interest payments of roughly $17,000. With a six-month business loan, repayment would include about $9,000 in interest.
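The comparison is easy to reproduce with the standard amortization formula. The short sketch below—my own calculation, assuming equal monthly payments on a $100,000 balance—recovers the roughly $17,000 and $9,000 interest figures for the 12-month and 6-month loans, against a fixed $30,000 fee on the 1.3 buy-rate MCA.

```python
def loan_interest(principal, apr, months):
    """Total interest on a fully amortizing loan with equal monthly payments."""
    r = apr / 12
    payment = principal * r / (1 - (1 + r) ** -months)
    return payment * months - principal

principal = 100_000
print(f"1.3 buy-rate MCA fee:         ${principal * 0.3:,.0f}")
print(f"30% APR loan over 12 months:  ${loan_interest(principal, 0.30, 12):,.0f} interest")
print(f"30% APR loan over 6 months:   ${loan_interest(principal, 0.30, 6):,.0f} interest")
```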

Because an MCA is structured as a commercial transaction instead of a loan, it is regulated by the Uniform Commercial Code in each state instead of by banking laws such as the Truth in Lending Act. Consequently, the provider does not have to follow all of the regulations and documentation requirements (such as displaying an APR) associated with making loans.

Converting a buy rate into an APR is not straightforward for many potential users, as was made clear in a recent online lending focus group study with small business owners conducted by the Cleveland Fed. When asked what the APR was on a $40,000 MCA that required a repayment of $52,000 (the same as a 1.3 buy rate), their answers were the following: (Product A is the MCA type of product; see the study for exactly how it was presented to respondents.)

Source: Federal Reserve Bank of Cleveland

The correct answer is that "it depends on how long it takes to pay back." For example, if the debt is repaid over six months, the APR would be 110 percent (as this calculator shows).
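For readers who want to check the "it depends" answer, the sketch below backs out an implied annual rate for the $40,000 advance repaid as $52,000. The repayment schedule is an assumption on my part (equal daily remittances over roughly 130 business days, about six months); under that assumption the annualized internal rate of return lands in the neighborhood of the 110 percent figure cited above, and a slower payback implies a lower rate.

```python
def implied_annual_rate(advance, total_repaid, n_payments, periods_per_year):
    """Back out the periodic internal rate of return of an equal-payment
    schedule by bisection, then annualize it (simple annualization)."""
    payment = total_repaid / n_payments

    def pv_gap(r):
        # present value of the payments at rate r, minus the cash advanced
        return payment * (1 - (1 + r) ** -n_payments) / r - advance

    lo, hi = 1e-9, 1.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if pv_gap(mid) > 0:
            lo = mid   # payments still worth more than the advance: rate is higher
        else:
            hi = mid
    return mid * periods_per_year

# $40,000 advance repaid as $52,000 in equal daily remittances over roughly
# 130 business days (about six months) -- assumptions chosen for illustration.
print(f"six-month daily payback: about {implied_annual_rate(40_000, 52_000, 130, 260):.0%}")
# The same $52,000 repaid monthly over a full year implies a much lower rate.
print(f"twelve-month monthly payback: about {implied_annual_rate(40_000, 52_000, 12, 12):.0%}")
```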

Nonbank online lenders can fill gaps in the borrowing needs of small businesses. But there may also be a role for greater clarity to ensure borrowers understand the terms they are signing up for. In a September 2015 speech, Federal Reserve Governor Lael Brainard highlighted one self-policing movement already well under way:

Some have raised concerns about the high APRs associated with some online alternative lending products. Others have raised concerns about the risk that some small business borrowers may have difficulty fully understanding the terms of the various loan products or the risk of becoming trapped in layered debt that poses risks to the survival of their businesses. Some industry participants have recently proposed that online lenders follow a voluntary set of guidelines designed to standardize best practices and mitigate these risks. It is too soon to determine whether such efforts of industry participants to self-police will be sufficient. Even with these efforts, some have suggested a need for regulators to take a more active role in defining and enforcing standards that apply more broadly in this sector.

Many, but not all, nonbank online lenders have already signed the Small Business Borrower Bill of Rights. Results from the 2015 Small Business Credit Survey Report on Employer Firms can be found on our website.


March 15, 2016 in Banking, Small Business | Permalink


May 13, 2014


Pondering QE

Today’s news brings another indication that low inflation rates in the euro area have the attention of the European Central Bank. From the Wall Street Journal (Update: via MarketWatch):

Germany's central bank is willing to back an array of stimulus measures from the European Central Bank next month, including a negative rate on bank deposits and purchases of packaged bank loans if needed to keep inflation from staying too low, a person familiar with the matter said...

This marks the clearest signal yet that the Bundesbank, which has for years been defined by its conservative opposition to the ECB's emergency measures to combat the euro zone's debt crisis, is fully engaged in the fight against super-low inflation in the euro zone using monetary policy tools...

Notably, these tools apparently do not include Fed-style quantitative easing:

But the Bundesbank's backing has limits. It remains resistant to large-scale purchases of public and private debt, known as quantitative easing, the person said. The Bundesbank has discussed this option internally but has concluded that with government and corporate bond yields already quite low in Europe, the purchases wouldn't do much good and could instead create financial stability risks.

Should we conclude that a global consensus has emerged about the value and wisdom of large-scale asset purchases, a.k.a. QE? We certainly have quite a bit of experience with large-scale purchases now. But I think it is fair to say that this experience has yet to yield a firm consensus.

You probably don’t need much convincing that QE consensus remains elusive. But just in case, I invite you to consider the panel discussion we titled “Greasing the Skids: Was Quantitative Easing Needed to Unstick Markets? Or Has it Merely Sped Us toward the Next Crisis?” The discussion was organized for last month’s 2014 edition of the annual Atlanta Fed Financial Markets Conference.

Opinions among the panelists were, shall we say, diverse. You can view the entire session via this link. But if you don’t have an hour and 40 minutes to spare, here is the (less than) ten-minute highlight reel, wherein Carnegie Mellon Professor Allan Meltzer opines that Fed QE has become “a foolish program,” Jefferies LLC Chief Market Strategist David Zervos declares himself an unabashed “lover of QE,” and Federal Reserve Governor Jeremy Stein weighs in on some of the financial stability questions associated with very accommodative policy:


You probably detected some differences of opinion there. If that, however, didn’t satisfy your craving for unfiltered debate, click on through to this link to hear Professor Meltzer and Mr. Zervos consider some of Governor Stein’s comments on monitoring debt markets, regulatory approaches to pursuing financial stability objectives, and the efficacy of capital requirements for banks.

By Dave Altig, executive vice president and research director of the Atlanta Fed.


May 13, 2014 in Banking, Capital Markets, Economic conditions, Federal Reserve and Monetary Policy, Monetary Policy | Permalink



January 31, 2014


A Brief Interview with Sergio Rebelo on the Euro-Area Economy

Last month, we at the Atlanta Fed had the great pleasure of hosting Sergio Rebelo for a couple of days. While he was here, we asked Sergio to share his thoughts on a wide range of current economic topics. Here is a snippet of a Q&A we had with him about the state of the euro-area economy:

Sergio, what would you say was the genesis of the problems the euro area has faced in recent years?

The contours of the euro area’s problems are fairly well known. The advent of the euro gave peripheral countries—Ireland, Spain, Portugal, and Greece—the ability to borrow at rates that were similar to Germany's. This convergence of borrowing costs was encouraged through regulation that allowed banks to treat all euro-area sovereign bonds as risk free.

The capital inflows into the peripheral countries were not, for the most part, directed to the tradable sector. Instead, they financed increases in private consumption, large housing booms in Ireland and Spain, and increases in government spending in Greece and Portugal. The credit-driven economic boom led to a rise in labor costs and a loss of competitiveness in the tradable sector.

Was there a connection between the financial crisis in the United States and the sovereign debt crisis in the euro area?

Simply put, after Lehman Brothers went bankrupt, we had a sudden stop of capital flows into the periphery, similar to that experienced in the past by many Latin American countries. The periphery boom quickly turned into a bust.

What do you see as the role for euro area monetary policy in that context?

It seems clear that more expansionary monetary policy would have been helpful. First, it would have reduced real labor costs in the peripheral countries. In those countries, the presence of high unemployment rates moderates nominal wage increases, so higher inflation would have reduced real wages. Second, inflation would have reduced the real value of the debts of governments, banks, households, and firms. There might have been some loss of credibility on the part of the ECB [European Central Bank], resulting in a small inflation premium on euro bonds for some time. But this potential cost would have been worth paying in return for the benefits.

And did this happen?

In my view, the ECB did not follow a sufficiently expansionary monetary policy. In fact, the euro-area inflation rate has been consistently below 2 percent, and the euro is relatively strong when compared to a purchasing-power-parity benchmark. The euro area turned to contractionary fiscal policy as a panacea. There are good theoretical reasons to believe that—when the interest rate remains constant, so that the central bank does not cushion the fall in government spending—the multiplier effect of government spending cuts can be very large. See, for example, Gauti Eggertsson and Michael Woodford, "The Zero Bound on Interest Rates and Optimal Monetary Policy," and Lawrence Christiano, Martin Eichenbaum, and Sergio Rebelo, "When Is the Government Spending Multiplier Large?"

Theory aside, the results of the austerity policies implemented in the euro area are clear. All of the countries that underwent this treatment are now much less solvent than in the beginning of the adjustment programs managed by the European Commission, the International Monetary Fund, and the ECB.

Bank stress testing has become a cornerstone of macroprudential financial oversight. Do you think the stress tests helped stabilize the situation in the euro area during the height of the crisis in 2010 and 2011?

No. Quite the opposite. I think the euro-area problems were compounded by the weak stress tests conducted by the European Banking Authority in 2011. Almost no banks failed, and almost no capital was raised. Banks largely increased their capital-to-asset ratios by reducing assets, which resulted in a credit crunch that added to the woes of the peripheral countries.

But we’re past the worst now, right? Is the outlook for the euro-area economy improving?

After hitting the bottom, a very modest recovery is under way in Europe. But the risk that a Japanese-style malaise will afflict Europe is very real. One useful step on the horizon is the creation of a banking union. This measure could potentially alleviate the severe credit crunch afflicting the periphery countries.

Thanks, Sergio, for this pretty sobering assessment.

By John Robertson, a vice president and senior economist in the Atlanta Fed’s research department.

Editor’s note: Sergio Rebelo is the Tokai Bank Distinguished Professor of International Finance at Northwestern University’s Kellogg School of Management. He is a fellow of the Econometric Society, the National Bureau of Economic Research, and the Centre for Economic Policy Research.


January 31, 2014 in Banking, Capital and Investment, Economics, Europe, Interest Rates, Monetary Policy | Permalink



December 04, 2013


Is (Risk) Sharing Always a Virtue?

The financial system cannot be made completely safe because it exists to allocate funds to inherently risky projects in the real economy. Thus, an important question for policymakers is how best to structure the financial system to absorb these losses while minimizing the risk that financial sector failures will impair the real economy.

Standard theories would predict that one good way of reducing financial sector risk is diversification. For example, the financial system could be structured to facilitate the development of large banks, a point often made by advocates for big banks such as Steve Bartlett. Another, not mutually exclusive, way of enhancing diversification is to create a system that shares risks across banks. An example is the Dodd-Frank Act mandate requiring formerly over-the-counter derivatives transactions to be centrally cleared.
 
However, do these conclusions based on individual bank stability necessarily imply that risk sharing will make the financial system safer? Is it even relevant to the principal risks facing the financial system? Some of the papers presented at the recent Atlanta Fed conference, "Indices of Riskiness: Management and Regulatory Implications," broadly addressed these questions. Other papers discussed the impact of bank distress on local economies, methods of predicting bank failure, and various aspects of incentive compensation paid to bankers (which I discuss in a recent Notes from the Vault).

The stability implications of greater risk sharing across banks are explored in "Systemic Risk and Stability in Financial Networks" by Daron Acemoglu, Asuman Ozdaglar, and Alireza Tahbaz-Salehi. They develop a theoretical model of risk sharing in networks of banks. The most relevant comparison they draw is between what they call a "complete financial network" (maximum possible diversification) and a "weakly connected" network in which there is substantial risk sharing between pairs of banks but very little risk sharing outside the individual pairs. Consistent with the standard view of diversification, the complete networks experience few, if any, failures when individual banks are subject to small shocks, but some pairs of banks do fail in the weakly connected networks. However, at some point the losses become so large that the complete network undergoes a phase transition, spreading the losses in a way that causes the failure of more banks than would have occurred with less risk sharing.
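A toy calculation—mine, not the authors' model—captures the intuition. Suppose every bank has the same equity buffer and a shock to one bank's assets is shared equally among the banks connected to it: two banks in the pairwise network, all ten in the complete network. Moderate shocks then cause failures only in the pairwise network, while a sufficiently large shock takes down every bank in the complete network.

```python
def failures(shock, equity, n_sharing):
    """Banks that fail when a shock to one bank is split equally among the
    n_sharing banks connected to it, each holding the same equity buffer."""
    loss_per_bank = shock / n_sharing
    return n_sharing if loss_per_bank > equity else 0

N_BANKS, EQUITY = 10, 1.0
for shock in (1.5, 5.0, 15.0):
    pairwise = failures(shock, EQUITY, 2)        # risk shared within a pair only
    complete = failures(shock, EQUITY, N_BANKS)  # risk shared across all ten banks
    print(f"shock = {shock:>4}: pairwise network {pairwise} failures, "
          f"complete network {complete} failures")
```

The jump from no failures at all to every bank failing is the flavor of the phase transition the authors describe in a much richer setting.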

Extrapolating from this paper, one could imagine that risk sharing could induce a false sense of security that would ultimately make a financial system substantially less stable. At first a more interconnected system shrugs off smaller shocks with seemingly no adverse impact. This leads bankers and policymakers to believe that the system can handle even more risk because it has become more stable. However, at some point the increased risk taking leads to losses sufficiently large to trigger a phase transition, and the system proves to be even less stable than it was with weaker interconnections.

While interconnections between financial firms are a theoretically important determinant of contagion, how important are these connections in practice? "Financial Firm Bankruptcy and Contagion," by Jean Helwege and Gaiyan Zhang, analyzes the spillovers from distressed and failing financial firms from 1980 to 2010. Looking at the financial firms that failed, they find that counterparty risk exposures (the interconnections) tend to be small, with no single exposure above $2 billion and the average a mere $53.4 million. They note that these small exposures are consistent with regulations that limit banks' exposure to any single counterparty. They then look at information contagion, in which the disclosure of distress at one financial firm may signal adverse information about the quality of a rival's assets. They find that the effect of these signals is comparable to that found for direct credit exposure.

Helwege and Zhang's results suggest that we should be at least as concerned about separate banks' exposure to an adverse shock that hits all of their assets as we should be about losses that are shared through bank networks. One possible common shock is the likely increase in the level and slope of the term structure as the Federal Reserve begins tapering its asset purchases and starts a process ultimately leading to the normalization of short-term interest rate setting. Although historical data cannot directly address banks' current exposure to such shocks, such data can provide evidence on banks' past exposure. William B. English, Skander J. Van den Heuvel, and Egon Zakrajšek presented evidence on this exposure in the paper "Interest Rate Risk and Bank Equity Valuations." They find a significant decrease in bank stock prices in response to an unexpected increase in the level or slope of the term structure. The response to slope increases (likely the primary effect of tapering) is somewhat attenuated at banks with large maturity gaps. One explanation for this finding is that these banks may partially recover their current losses with gains they will accrue when booking new assets (funded by shorter-term liabilities).

Overall, the papers presented in this part of the conference suggest that more risk sharing among financial institutions is not necessarily always better. Even though greater interconnection may provide the appearance of increased stability in response to small shocks, it can leave the system less robust to larger shocks. The papers also suggest that shared exposures to a common risk are likely to present at least as important a threat to financial stability as interconnections among financial firms, especially as the term structure and the overall economy respond to the eventual return to normal monetary policy. Along these lines, I recently offered some thoughts on how to reduce the risk of large widespread losses due to exposures to a common (credit) risk factor.

By Larry Wall, director of the Atlanta Fed's Center for Financial Innovation and Stability.

 

Note: The conference "Indices of Riskiness: Management and Regulatory Implications" was organized by Glenn Harrison (Georgia State University's Center for the Economic Analysis of Risk), Jean-Charles Rochet (University of Zurich), Markus Sticker, Dirk Tasche (Bank of England, Prudential Regulation Authority), and Larry Wall (the Atlanta Fed's Center for Financial Innovation and Stability).


December 4, 2013 in Banking, Financial System | Permalink



March 08, 2013


Will the Next Exit from Monetary Stimulus Really Be Different from the Last?

Suppose you run a manufacturing business—let's say, for example, widgets. Your customers are loyal and steady, but you are never completely certain when they are going to show up asking you to satisfy their widget desires.

Given this uncertainty, you consider two different strategies to meet the needs of your customers. One option is to produce a large quantity of widgets at once, store the product in your warehouse, and when a customer calls, pull the widgets out of inventory as required.

A second option is to simply wait until buyers arrive at your door and produce widgets on demand, which you can do instantaneously and in as large a quantity as you like.

Thinking only about whether you can meet customer demand when it presents itself, these two options are basically identical. In the first case you have a large inventory to support your sales. In the second case you have a large—in fact, infinitely large—"shadow" inventory that you can bring into existence in lockstep with demand.

I invite you to think about this example as you contemplate this familiar graph of the Federal Reserve's balance sheet:

[Chart: the Federal Reserve's balance sheet]

I gather that a good measure of concern about the size of the Fed's (still growing) balance sheet comes from the notion that there is more inherent inflation risk with bank reserves that exceed $1.5 trillion than there would be with reserves somewhere in the neighborhood of $10 billion (which would be the ballpark value for the pre-crisis level of reserves).

I understand this concern, but I don't believe that it is entirely warranted. My argument is as follows: The policy strategy for tightening policy (or exiting stimulus) when the banking system is flush with reserves is equivalent to the strategy when the banking system has low (or even zero) reserves in the same way that the two strategies for meeting customer demand that I offered at the outset of this post are equivalent.

Here's why. Suppose, just for example, that bank reserves are literally zero and the Federal Open Market Committee (FOMC) has set a federal funds rate target of, say, 3 percent. Despite the fact that bank reserves are zero there is a real sense in which the potential size of the balance sheet—the shadow balance sheet, if you will—is very large.

The reason is that when the FOMC sets a target for the federal funds rate, it is sending very specific instructions to the folks from the Open Market Desk at the New York Fed, who run monetary policy operations on behalf of the FOMC. Those instructions are really pretty simple: If you have to inject more bank reserves (and hence expand the size of the Fed's balance sheet) to maintain the FOMC's funds rate target, do it.

To make sense of that statement, it is helpful to remember that the federal funds rate is an overnight interest rate that is determined by the supply and demand for bank reserves. Simplifying just a bit, the demand for reserves comes from the banking system, and the supply comes from the Fed. As in any supply and demand story, if demand goes up, so does the "price"—in this case, the federal funds rate.

In our hypothetical example, the Open Market Desk has been instructed not to let the federal funds rate deviate from 3 percent—at least not for very long. With such instructions, there is really only one thing to do in the case that demand from the banking system increases—create more reserves.

To put it in the terms of the business example I started out with, in setting a funds rate target the FOMC is giving the Open Market Desk the following marching orders: If customers show up, step up the production and meet the demand. The Fed's balance sheet in this case will automatically expand to meet bank reserve demand, just as the businessperson's inventory would expand to support the demand for widgets. As with the businessperson in my example, there is little difference between holding a large tangible inventory and standing ready to supply on demand from a shadow inventory.

Though the analogy is not completely perfect—in the case of the Fed's balance sheet, for example, it is the banks and not the business (i.e., the Fed) that hold the inventory—I think the story provides an intuitive way to process the following comments (courtesy of Bloomberg) from Fed Chairman Ben Bernanke, from last week's congressional testimony:

"Raising interest rate on reserves" when the balance sheet is large is the functional equivalent to raising the federal funds rate when the actual balance sheet is not so large, but the potential or shadow balance sheet is. In both cases, the strategy is to induce banks to resist deploying available reserves to expand deposit liabilities and credit. The only difference is that, in the former case, the available reserves are explicit, and in the latter case they are implicit.

The Monetary Policy Report that accompanied the Chairman's testimony contained a fairly thorough summary of costs that might be associated with continued monetary stimulus. Some of these in fact pertain to the size of the Fed's balance sheet. But, as the Chairman notes in the video clip above, when it comes to the mechanics of exiting from policy stimulus, the real challenge is the familiar one of knowing when it is time to alter course.

By Dave Altig, executive vice president and research director of the Atlanta Fed.

 

March 8, 2013 in Banking, Fed Funds Futures, Federal Reserve and Monetary Policy, Monetary Policy | Permalink


Comments

One potential risk this time is that the Fed has been buying lots of assets that aren't treasuries, and some of the riskier assets can no longer be sold for the same price at which they were bought. In theory that situation could leave the Fed unable to recall all the money it put into circulation.

That said, you are right that interest on reserves could still be raised to have the same effects.

Posted by: Matthew Martin | March 08, 2013 at 03:30 PM

"In both cases, the strategy is to induce banks to resist deploying available reserves to expand deposit liabilities and credit."

Banks cannot lend their reserves. In fact, there is no balance sheet transaction that will allow a central bank liability to be loaned to a "non-bank" entity. Banks make loans by issuing a demand deposit and not by issuing reserves. Bank lending is never constrained by a reserve position.

The IoER policy implemented in 2008 moved the Federal Reserve out of a "corridor system" and into a "floor system". Under a floor system the level of reserves and the overnight interest rate are divorced. The IoER or "floor level" also becomes the deposit level. This disconnect works as long as there are sufficient excess reserves within the system, which in the case of the US, there are adequate excess reserves.

It should also be noted that future increases in the overnight rate are simply announced with the lending and deposit rates changing in tandem. Traditional models of draining reserves via FOMO are no longer required. Reserves are not the dual of overnight interest rates. Thus, when the Fed would like to "tighten" policy it will not be required to reduce the size of its balance sheet as draining operations are no longer required to hit the overnight target.

Posted by: JJTV | March 08, 2013 at 05:58 PM

How about changing how monetary policy is conducted? Instead of using the blocked and saturated credit markets for monetary policy just bypass them and modify the fed so it deals directly with the public.

www.internationalmonetary.wordpress.com

Posted by: Daniel | March 09, 2013 at 12:55 AM

If the fed marks up its long position and passes the gain to the treasury wont it have to pass the loss when it hikes the fed rate? and what will be the impact to treasurys when it hikes the fed rate? wont it raise the cost to the government budget when rates go up and it has to finance the debt at 110% debt to gdp and a duration of less than 5 thanks to the fed? Aren't we underestimating the potential damage to hiking rates?

Posted by: Emilio Lamar | May 01, 2013 at 02:33 PM


October 04, 2012


Trends in Small Business Lending

The Atlanta Fed's latest semiannual Small Business Survey is active through October 22, 2012. If you own a small business and would like to participate, send an e-mail to SmallBusinessResearch@atl.frb.org.

In our previous survey conducted in April 2012, we found that firms applying for credit at large national banks had notably less success than firms that applied to small banks.

[Chart: credit application success rates at large versus small banks, April 2012 Small Business Survey]

We also found that the firms applying to large banks tended to be much younger than the firms that applied to small banks. We speculated that this "age factor" could be contributing to the lower overall success rates at large banks.

A difference between small businesses' success at large and small banks has also been documented by the online credit facilitator Biz2Credit. Biz2Credit works a bit like an online dating service—after answering a series of questions (and providing the typical financial documents required by lenders), small businesses are presented with five potential "matches." To determine the best five matches, Biz2Credit identifies what lenders are looking for—usually a certain credit score, a minimum number of years in business, an established banking relationship, and targeted industries.

The resulting credit applications are the basis for the Biz2Credit Small Business Lending Index. Biz2Credit also reports approval rates from the matching process for large banks, small banks, credit unions, and alternative lenders. These approval rates are plotted on the chart below.

[Chart: small business loan approval rates by lender type, Biz2Credit Small Business Lending Index]

Much like we saw in the Small Business Survey, Biz2Credit reports that small firms have had consistently less success in obtaining credit at large banks.

This confirmation is encouraging evidence that our April observation was a sound one. But confirmation isn't explanation—what accounts for the different experiences small businesses have in securing credit from small banks versus big banks? And so, we dig deeper.

Note: According to Biz2Credit, its index is based on 1,000 of the 10,000-plus applications submitted each month. To be included, the business has to have at least a 680 credit score, be at least two years old, and have an established relationship with the bank to which it is applying. Selection methods are also applied to provide for national representation.

By Ellyn Terry, senior economic research analyst at the Atlanta Fed.

October 4, 2012 in Banking, Small Business | Permalink


