macroblog

April 28, 2014

New Data Sources: A Conversation with Google's Hal Varian

In recent years, there has been an explosion of new data coming from places like Google, Facebook, and Twitter. Economists and central bankers have begun to realize that these data may provide valuable insights into the economy that inform and improve the decisions made by policy makers.

As chief economist at Google and emeritus professor at UC Berkeley, Hal Varian is uniquely qualified to discuss the issues surrounding these new data sources. Last week he was kind enough to take some time out of his schedule to answer a few questions about these data, the benefits of using them, and their limitations.

Mark Curtis: You've argued that new data sources from Google can improve our ability to "nowcast." Can you describe what this means and how the exorbitant amount of data that Google collects can be used to better understand the present?
Hal Varian: The simplest definition of "nowcasting" is "contemporaneous forecasting," though I do agree with David Hendry that this definition is probably too simple. Over the past decade or so, firms have spent billions of dollars to set up real-time data warehouses that track business metrics on a daily level. These metrics could include retail sales (like Wal-Mart and Target), package delivery (UPS and FedEx), credit card expenditure (MasterCard's SpendingPulse), employment (Intuit's small business employment index), and many other economically relevant measures. We have worked primarily with Google data, because it's what we have available, but there are lots of other sources.

Curtis: The ability to "nowcast" is also crucially important to the Fed. In his December press conference, former Fed Chairman Ben Bernanke stated that the Fed may have been slow to acknowledge the crisis in part due to deficient real-time information. Do you believe that new data sources such as Google search data might be able to improve the Fed's understanding of where the economy is and where it is going?
Varian: Yes, I think that this is definitely a possibility. The real-time data sources mentioned above are a good starting point. Google data seems to be helpful in getting real-time estimates of initial claims for unemployment benefits, housing sales, and loan modification, among other things.

Curtis: Janet Yellen stated in her first press conference as Fed Chair that the Fed should use other labor market indicators beyond the unemployment rate when measuring the health of labor markets. (The Atlanta Fed publishes a labor market spider chart incorporating a variety of indicators.) Are there particular indicators that Google produces that could be useful in this regard?
Varian: Absolutely. Queries related to job search seem to be indicative of labor market activity. Interestingly, queries having to do with killing time also seem to be correlated with unemployment measures!

Curtis: What are the downsides or potential pitfalls of using these types of new data sources?
Varian: First, the real measures—like credit card spending—are probably more indicative of actual outcomes than search data. Search is about intention, and spending is about transactions. Second, there can be feedback from news media and the like that may distort the intention measures. A headline story about a jump in unemployment can stimulate a lot of "unemployment rate" searches, so you have to be careful about how you interpret the data. Third, we've only had one recession since Google has been available, and it was pretty clearly a financially driven recession. But there are other kinds of recessions having to do with supply shocks, like energy prices, or monetary policy, as in the early 1980s. So we need to be careful about generalizing too broadly from this one example.

Curtis: Given the predominance of new data coming from Google, Twitter, and Facebook, do you think that this will limit, or even make obsolete, the role of traditional government statistical agencies such as the Census Bureau and the Bureau of Labor Statistics in the future? If not, do you believe there is the potential for collaboration between these agencies and companies such as Google?
Varian: The government statistical agencies are the gold standard for data collection. It is likely that real-time data can be helpful in providing leading indicators for the standard metrics, and supplementing them in various ways, but I think it is highly unlikely that they will replace them. I hope that the private and public sector can work together in fruitful ways to exploit new sources of real-time data in ways that are mutually beneficial.

Curtis: A few years ago, former Fed Chairman Bernanke challenged researchers when he said, "Do we need new measures of expectations or new surveys? Information on the price expectations of businesses—who are, after all, the price setters in the first instance—as well as information on nominal wage expectations is particularly scarce." Do data from Google have the potential to fill this need?
Varian: We have a new product called Google Consumer Surveys that can be used to survey a broad audience of consumers. We don't have ways to go after specific audiences such as business managers or workers looking for jobs. But I wouldn't rule that out in the future.

Curtis: MIT recently introduced a big-data measure of inflation called the Billion Prices Project. Can you see a big future in big data as a measure of inflation?
Varian: Yes, I think so. I know there are also projects looking at supermarket scanner data and the like. One difficulty with online data is that it leaves out gasoline, electricity, housing, large consumer durables, and other categories of consumption. On the other hand, it is quite good for discretionary consumer spending. So I think that online price surveys will enable inexpensive ways to gather certain sorts of price data, but it certainly won't replace existing methods.

By Mark Curtis, a visiting scholar in the Atlanta Fed's research department


April 28, 2014 in Economics, Forecasts, Technology, Web/Tech


January 31, 2014

A Brief Interview with Sergio Rebelo on the Euro-Area Economy

Last month, we at the Atlanta Fed had the great pleasure of hosting Sergio Rebelo for a couple of days. While he was here, we asked Sergio to share his thoughts on a wide range of current economic topics. Here is a snippet of a Q&A we had with him about the state of the euro-area economy:

Sergio, what would you say was the genesis of the problems the euro area has faced in recent years?

The contours of the euro area’s problems are fairly well known. The advent of the euro gave peripheral countries—Ireland, Spain, Portugal, and Greece—the ability to borrow at rates that were similar to Germany's. This convergence of borrowing costs was encouraged through regulation that allowed banks to treat all euro-area sovereign bonds as risk free.

The capital inflows into the peripheral countries were not, for the most part, directed to the tradable sector. Instead, they financed increases in private consumption, large housing booms in Ireland and Spain, and increases in government spending in Greece and Portugal. The credit-driven economic boom led to a rise in labor costs and a loss of competitiveness in the tradable sector.

Was there a connection between the financial crisis in the United States and the sovereign debt crisis in the euro area?

Simply put, after Lehman Brothers went bankrupt, we had a sudden stop of capital flows into the periphery, similar to that experienced in the past by many Latin American countries. The periphery boom quickly turned into a bust.

What do you see as the role for euro area monetary policy in that context?

It seems clear that more expansionary monetary policy would have been helpful. First, it would have reduced real labor costs in the peripheral countries. In those countries, the presence of high unemployment rates moderates nominal wage increases, so higher inflation would have reduced real wages. Second, inflation would have reduced the real value of the debts of governments, banks, households, and firms. There might have been some loss of credibility on the part of the ECB [European Central Bank], resulting in a small inflation premium on euro bonds for some time. But this potential cost would have been worth paying in return for the benefits.

And did this happen?

In my view, the ECB did not follow a sufficiently expansionary monetary policy. In fact, the euro-area inflation rate has been consistently below 2 percent, and the euro is relatively strong when compared to a purchasing-power-parity benchmark. The euro area turned instead to contractionary fiscal policy as a panacea. There are good theoretical reasons to believe that—when the interest rate remains constant, so that the central bank does not cushion the fall in government spending—the multiplier effect of government spending cuts can be very large. See, for example, Gauti Eggertsson and Michael Woodford, “The Zero Bound on Interest Rates and Optimal Monetary Policy,” and Lawrence Christiano, Martin Eichenbaum, and Sergio Rebelo, “When Is the Government Spending Multiplier Large?”

Theory aside, the results of the austerity policies implemented in the euro area are clear. All of the countries that underwent this treatment are now much less solvent than in the beginning of the adjustment programs managed by the European Commission, the International Monetary Fund, and the ECB.

Bank stress tests have become a cornerstone of macroprudential financial oversight. Do you think they helped stabilize the situation in the euro area during the height of the crisis in 2010 and 2011?

No. Quite the opposite. I think the euro-area problems were compounded by the weak stress tests conducted by the European Banking Authority in 2011. Almost no banks failed, and almost no capital was raised. Banks largely increased their capital-to-asset ratios by reducing assets, which resulted in a credit crunch that added to the woes of the peripheral countries.

But we’re past the worst now, right? Is the outlook for the euro-area economy improving?

After hitting the bottom, a very modest recovery is under way in Europe. But the risk that a Japanese-style malaise will afflict Europe is very real. One useful step on the horizon is the creation of a banking union. This measure could potentially alleviate the severe credit crunch afflicting the periphery countries.

Thanks, Sergio, for this pretty sobering assessment.

By John Robertson, a vice president and senior economist in the Atlanta Fed’s research department

Editor’s note: Sergio Rebelo is the Tokai Bank Distinguished Professor of International Finance at Northwestern University’s Kellogg School of Management. He is a fellow of the Econometric Society, the National Bureau of Economic Research, and the Center for Economic Policy Research.


January 31, 2014 in Banking, Capital and Investment, Economics, Europe, Interest Rates, Monetary Policy


December 23, 2013

Goodwill to Man

By pure coincidence, two interviews with Pennsylvania State University professor Neil Wallace have been published in recent weeks. One is in the December issue of the Federal Reserve Bank of Minneapolis’ excellent Region magazine. The other, conducted by Chicago Fed economist Ed Nosal and yours truly, is slated for the journal Macroeconomic Dynamics and is now available as a Federal Reserve Bank of Chicago working paper.

If you have any interest at all in the history of monetary theory over the past 40 years or so, I highly recommend to you these conversations. As Ed and I note of Professor Wallace in our introductory comments, very few people have such a coherent view of their own intellectual history, and fewer still have lived that history in such a remarkably consequential period for their chosen field.

Perhaps my favorite part of our interview was the following, where Professor Wallace reveals how he thinks about teaching economics, and macroeconomics specifically (link added):

If we were to construct an economics curriculum, independent of where we’ve come from, then what would it look like? The first physics I ever saw was in high school... I can vaguely remember something about frictionless inclined planes, and stuff like that. So that is what a first physics course is; it is Newtonian mechanics. So what do we have in economics that is the analogue of Newtonian mechanics? I would say it is the Arrow-Debreu general competitive model. So that might be a starting point. At the undergraduate level, do we ever actually teach that model?

[Interviewers] That means that you would not talk about money in your first course.

That is right. Suppose we taught the Arrow-Debreu model. Then at the end we’d have to say that this model has certain shortcomings. First of all, the equilibrium concept is a little hokey. It’s not a game, which is to say there are no outcomes associated with other than equilibrium choices. And second, where do the prices come from? You’d want to point out that the prices in the Arrow-Debreu model are not the prices you see in the supermarket because there’s no one in the model writing down the prices. That might take you to strategic models of trade. You would also want to point out that there are a lot of serious things in the world that we think we see that aren’t in the model: unemployment, money, and [an interesting notion of] firms aren’t in the Arrow-Debreu model. What else? Investing in innovation, which is critical to growth, isn’t in that model. Neither is asymmetric information. The curriculum, after this grounding in the analogue of Newtonian mechanics, which is the Arrow-Debreu model, would go into these other things. It would talk about departures from that theory to deal with such things; and it would describe unsolved problems.

So that’s a vision of a curriculum. Where would macro be? One way to think about macro is in terms of substantive issues. From that point of view, most of us would say macro is about business cycles and growth. Viewed in terms of the curriculum I outlined, business cycles and growth would be among the areas that are not in the Arrow-Debreu model. You can talk about attempts to shove them in the model, and why they fall short, and what else you can do.

Of the many things that I have learned from Professor Wallace, this one comes back to me again and again: Talk about how to get the things in the model that are essential to dealing with the unsolved problems, honestly assess why they fall short, and explore what else you can do. To me, this is not only a message of good science. It is one of intellectual generosity, the currency of good citizenship.

I was recently asked whether I align with “freshwater” or “saltwater” economics (roughly, I guess, whether I think of myself as an Arrow-Debreu type or a New Keynesian type). There are many similar questions that come up. Are you a policy “hawk” or a policy “dove”? Do you believe in old monetarism (willing to write papers with reduced-form models of money demand) or new monetarism (requiring, for example, some explicit statement about the frictions, or deviations from Arrow-Debreu, that give rise to money’s existence)?

What I appreciate about the Wallace formulation is that it asks us to avoid thinking in these terms. There are problems to solve. The models that we bring to those problems are not true or false. They are all false, and we—in the academic world and in the policy world—are on a common journey to figure out what we are missing and what else we can do.

It is deeply misguided to treat models as if they are immutable truths. All good economists appreciate this intellectually. And yet there is an awful lot of energy wasted, especially in the blogosphere, on casting aspersions at those who are perceived to be seeking answers within other theoretical tribes.

Some problems are well-suited to Newtonian mechanics, some are not. Some amendments to Arrow-Debreu are useful; some are not. And what is well-suited or useful in some circumstances may well be ill-suited or even harmful in others. Perhaps if we all acknowledge that none of us knows which is which 100 percent of the time, we can make just a little more progress on all those unsolved problems in the coming year. At a minimum, we would air our disagreements with a lot more civility.

Happy holidays.

By Dave Altig, executive vice president and research director at the Atlanta Fed


December 23, 2013 in Economics, Education, Monetary Policy


Comments

This is a surprisingly simplistic point of view, c'mon. That particular debate is not about which model is right (all are wrong in one way or another, yes), but about what economists should do when their model turns out not to reflect real developments nearly as well as other models do.

Posted by: Konstantin | December 25, 2013 at 08:39 AM


September 23, 2013

The Dynamics of Economic Dynamism

Earlier today, Atlanta Fed President Dennis Lockhart gave a speech at the Creative Leadership Summit of the Louise Blouin Foundation. He posed the questions: Is the economic dynamism of the United States declining? Is America losing its economic mojo? He observed:

“... we see a picture in which fewer firms are expanding, and each expanding firm is adding fewer new jobs on average than in the past. Fewer firms are shrinking, and each is downsizing by less on average. Fewer people are being laid off or are quitting their job, and firms are hiring fewer people. In other words, the employment dynamics of the U.S. economy are slower.”

The decline in job creation and destruction was also the theme of this recent macroblog post by Mark Curtis, which featured some pretty nifty dynamic charts of trends in job creation and destruction by industry and geography.

Identifying the policy implications of these slower dynamics requires careful diagnosis of the causal factors underlying the trends. The cutting edge of economic research looking at this issue was featured at the 2013 Comparative Analysis of Enterprise Data Conference hosted last week by the Atlanta Census Research Data Center (ACRDC), which is housed at the Atlanta Fed and directed by one of our senior research economists, Julie Hotchkiss. Through the ACRDC, qualified researchers in Atlanta and around the Southeast can perform statistical analyses on non-public Census microdata.

The agenda and papers presented at the conference are located here. Some of the papers, I think, were particularly relevant to what President Lockhart discussed. A few examples:

“Reallocation in the Great Recession: Cleansing or Not?” by Lucia Foster and Cheryl Grim of the Center for Economic Studies at the U.S. Census Bureau and John Haltiwanger at the University of Maryland looked at the so-called “cleansing hypothesis,” in which recessions are not only periods of outsized job creation and destruction, but they are also periods in which the reallocation is especially productivity enhancing. They find that while previous recessions fit this pattern reasonably well, they do not see this kind of activity in the most recent recession. In fact, they find that in the manufacturing sector, the intensity of reallocation fell rather than rose (because of the especially sharp decline in job creation), and the reallocation that did occur was less productivity enhancing than in prior recessions.

“How Firms Respond to Business Cycles: The Role of Firm Age and Firm Size,” by Javier Miranda, Teresa Fort, John Haltiwanger, and Ron Jarmin, looked at the varying impact of recessions on firms by size and age. They show that young businesses (which are typically small) exhibit very different cyclical dynamics than small/older businesses and are more sensitive to the cycle than larger/older businesses. The paper also explores explanations for the finding that young/small businesses were hit especially hard during the last recession. They identify the collapse in housing prices as a primary culprit, with the decline in job creation at young firms especially pronounced in states with a large drop in housing prices.

As a side note, although not presented at the conference, “The Secular Decline in Business Dynamism in the U.S.,” a new paper by Ryan Decker, John Haltiwanger, Ron Jarmin and Javier Miranda, analyzes the overall secular decline in job reallocation across industries. They find that changes in industry composition (the decline in manufacturing and rise of service industries) are not driving the decline. Instead, the primary driver seems to be the decline in the pace of entrepreneurship and the accompanying decline in the share of young firms in the economy.

Finally, Steve Davis, from the University of Chicago, talked about his joint research with John Haltiwanger, Kyle Handley, Ron Jarmin, Josh Lerner, and Javier Miranda on private equity and employment dynamics. Private equity critics claim that leveraged buyouts bring huge job losses. Davis shows that private-equity buyouts are followed by a decline in net employment at these firms relative to controls (similar firms that were not targets of a buyout). However, that net change pales compared with the amount of gross job creation and destruction that typically occurs within the target firm after the buyout. In particular, he finds that although the firm reduces employment at its existing establishments, including by selling some establishments to other firms, it also creates jobs at new establishments, both through acquisitions and through openings. Moreover, they show that this reallocation is generally productivity enhancing for the firm. Although the data used in the study go only through the mid-2000s, it seems reasonable to infer from the findings that the decline in private equity deals during and since the last recession has contributed to the overall lower level of employment dynamics in this recovery.

The Comparative Analysis of Enterprise Data Conference was an excellent representation of the type of high-quality research being conducted on questions that go to the heart of the cyclical-versus-structural debate about the future course of the U.S. economy. While this is an exciting and important time for researchers in this field, it is troubling to learn that the programs that collect the data used in these types of studies are being trimmed because of federal budget cuts.

By John Robertson, vice president and senior economist in the Atlanta Fed’s research department


September 23, 2013 in Economics, Employment, Unemployment


September 13, 2013

Job Reallocation over Time: Decomposing the Decline

One of the primary ways an economy expands is by quickly reallocating resources to the places where they are most productive. If new and productive firms are able to quickly grow and unproductive firms can quickly shrink, then the economy as a whole will experience faster growth and the many benefits (such as lower unemployment and higher wages) that are associated with that growth. Certain individuals may experience unemployment spells from this reallocation, but economists, starting with Joseph Schumpeter, have found that reallocation is associated with economic growth and wage growth, particularly for young workers.

Recently, a number of prominent economists such as John Haltiwanger have expressed concern that falling reallocation rates in the United States are a major contributor to the slow economic recovery. One simple way to quantify the speed of reallocation is to examine the job creation rate—defined as the number of new jobs in expanding firms divided by the total number of jobs in the economy—and the destruction rate, defined likewise but using the number of jobs lost by contracting firms. Chart 1 plots both the creation and the destruction rates of the U.S. economy starting in 1977. These measures track each other closely with creation rates exceeding destruction rates during periods of economic growth and vice versa during recessions. The most recent recession saw a particularly sharp decline in job creation (you can highlight the creation rate by clicking on the line), but it is clear this decline is part of a larger trend that far predates the current period. A decline in these rates could indicate less innovation or less labor market flexibility, both of which are likely to retard economic growth. Feel free to explore the measures for yourself using the figure’s interactivity.

To better understand these important trends we create a common variable called reallocation, which is defined as total jobs created plus total jobs destroyed, divided by total jobs in the economy. This formula creates one measure that describes how quickly jobs are moving from shrinking firms to expanding firms. Using data from the U.S. Census Bureau’s Business Dynamic Statistics, we examine differences in this variable across sectors and across states. Furthermore, using some basic data visualization tools, we can see how reallocation has evolved over time across these dimensions.
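As a rough illustration of these definitions, all three rates can be computed from firm-level employment counts. The firms and numbers below are made up for the sketch, and the denominator follows the simple definition given above (the Census Bureau's published statistics use a refinement in which the denominator averages current and prior employment):

```python
# Hypothetical firm-level data: (jobs last period, jobs this period).
# Firm D is an entrant; all names and figures are illustrative only.
firms = {"A": (100, 120), "B": (50, 30), "C": (200, 200), "D": (0, 40)}

total_jobs = sum(curr for _, curr in firms.values())
created = sum(max(curr - prev, 0) for prev, curr in firms.values())
destroyed = sum(max(prev - curr, 0) for prev, curr in firms.values())

creation_rate = created / total_jobs        # new jobs at expanding firms
destruction_rate = destroyed / total_jobs   # jobs lost at contracting firms
reallocation_rate = (created + destroyed) / total_jobs
```

Note that creation and destruction are summed over firms before dividing, so offsetting expansion and contraction across firms both count toward reallocation even when net employment barely changes.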

Chart 2 plots reallocation rates by industry from 1977 to 2011. The plot highlights the reallocation rate for all industries, but you can also select or deselect any industry to more clearly view how it has changed over time. Scrolling over the lines allows you to view the exact rates by industry in any time period. A few interesting patterns emerge. First, sectors have different levels of job reallocation in the cross section. Manufacturing stands out as having particularly low reallocation rates, probably the result of the large fixed capital requirements of production. Second, not all industries experienced sharp declines during this period. If you highlight the finance, insurance, and real estate sector, it is evident that reallocation rates actually increased for this sector until the most recent recession. Retail and construction, on the other hand, have experienced steady and significant declines during the past 35 years.

Chart 3 maps reallocation rates across states for the year 1977. This figure provides us with a cross-sectional view of geographical differences in reallocation rates. States with the highest reallocation rates are dark brown, and states with the lowest rates are light brown. You can click through the years to visually capture how these rates have changed over time for each state. Compare the color of the map in 1977 with the color in 2011. Scroll the mouse over any state to view that state’s reallocation rate in the particular year.

As with industries, states display clear cross-sectional differences in their reallocation rates. The highest rates are found in western states, Florida, and Texas, and the lowest are in the Midwest. Scrolling through the years shows that the decline in reallocation rates is common to the entire country.

Overall, these figures display a stark trend. The economy is reallocating jobs at much slower rates than 20 or even 10 years ago, and this decline is, with only a few exceptions, common across states and industries. Economists are just now starting to explore the causes of this trend, and a single, compelling explanation has yet to emerge. But some explanation is clearly in order and clearly important for economic policymakers, monetary and otherwise.

By Mark Curtis, a visiting scholar in the Atlanta Fed's research department

Please note that the charts and maps in this post were updated and improved on November 27, 2013.


September 13, 2013 in Economics, Employment, Labor Markets


August 30, 2013

Still Waiting for Takeoff...

On Thursday, we got a revised look at the economy’s growth rate in the second quarter. While the 2.5 percent annualized rate was a significant upward revision from the preliminary estimate, it comes off a mere 1.1 percent growth rate in the first quarter. That combines for a subpar first-half growth rate of 1.8 percent. OK, it’s growth, but not as strong as one would expect for a U.S. expansion and clearly a disappointment to the many forecasters who had once (again) expected this to be the year the U.S. economy shakes itself out of the doldrums.
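The 1.8 percent first-half figure follows from compounding the two quarters' annualized rates rather than simply averaging them, although with rates this small the two calculations nearly coincide. A quick check, using the rates quoted above:

```python
q1, q2 = 1.1, 2.5  # annualized quarterly growth rates, percent

# Convert each annualized rate to a one-quarter growth factor, compound
# the two quarters, then re-annualize the resulting half-year growth.
f1 = (1 + q1 / 100) ** 0.25
f2 = (1 + q2 / 100) ** 0.25
first_half = ((f1 * f2) ** 2 - 1) * 100  # roughly 1.8 percent
```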

Now, we’re not blind optimists when it comes to the record of economic forecasts. We know well that the evidence says you shouldn’t get overly confident in your favorite economists’ prediction. Most visions of the economy’s future have proven to be blurry at best.

Still, we at the Atlanta Fed want to know how to best interpret this upward revision to the second-quarter growth estimate and how it affects our president’s baseline forecast “for a pickup in real GDP growth over the balance of 2013, with a further step-up in economic activity as we move into 2014.”

What we can say about the report is that the revised second-quarter growth estimate is a decided improvement from the first quarter and a modest bump up from the recent four-quarter growth trend (1.6 percent). And there are some positive indicators within the GDP components. For example, real exports posted a strong turnaround last quarter, presumably benefiting from Europe’s emerging from its recession. And the negative influence of government spending cuts, while still evident in the data, was much smaller than during the previous two quarters. Oh, and business investment spending improved between the first and second quarters.

All good, but these data simply give us a better fix on where we were in the second quarter, not necessarily a good signal of where we are headed. For that, we turn to our “nowcast” estimate for the third quarter based on the incoming monthly data (the evolution of which is shown in the table below).

A "nowcasting" exercise generates quarterly GDP estimates in real time. The technical details of this exercise are described here, but the idea is fairly simple. We use incoming data on 100-plus economic series to forecast 12 components of GDP for the current quarter. We then aggregate those forecasts of GDP components to get a current-quarter estimate of overall GDP growth.

We caution that, unlike some others, our nowcast involves no judgmental interpretation of these data. It is purely a statistical exercise: we let the data speak for themselves.

Given the first data point of July—the July jobs report—the nowcast for the third quarter was pretty bleak (1.1 percent). Things improved a few days later with the release of strong international trade data for June, and stepped up further with the June wholesale trade report. But the remainder of the recent data point to a third-quarter growth rate that is very close to the lackluster performance of the first half.


In his speech a few weeks ago, President Dennis Lockhart indicated what he was looking for as drivers for stronger growth in the second half of this year.

“I expect consumer activity to strengthen.”

Today’s read on real personal consumption expenditures (PCE) probably isn’t bolstering confidence in that view. Real PCE was virtually flat in July, undermining private forecasters’ expectation of a moderate gain. Our nowcast for real GDP slipped 0.5 percentage points to 1.4 percent on the basis of these data and pegged consumer spending at 1.7 percent for Q3—in line with Q2’s 1.8 percent gain.

“I expect business investment to accelerate somewhat.”

The July data were pretty disappointing on this score. The durable-goods numbers released a few days ago were quite weak, causing our nowcast, and those of the others we follow, to revise down the third-quarter growth estimate.

“I expect the rebound we have seen in the housing sector to continue.”

Check. Our nowcast wasn’t affected much by the housing starts data, but the existing-home sales numbers gave the estimate a positive boost. Our nowcast’s estimate of residential investment growth in the third quarter is well under what we saw in the second quarter. But at 5.3 percent, the rebound looks to be continuing.

“I expect the recent improvement in exports to last.”

Unfortunately, the July trade numbers don’t get reported until next week. So we’re going to mark this one as missing in action. But as we said earlier, that June trade number was strong enough to cause our third-quarter nowcast to be revised up a bit.

“And I expect to see an easing of the public-sector spending drag at the federal, state, and local levels.”

Again, check. The July Treasury data indicated growth in government spending overall.

So the July data are a mixed bag: some positives, some disappointments, and some missing-in-actions. But if President Lockhart were to ask us (and something tells us he just might), we’re likely to say that on the basis of the July indicators, the “pickup in real GDP growth over the balance of 2013” isn’t yet very evident in the data.

This news isn’t likely to come as a big surprise to him. Again, here’s what he said publicly two weeks ago:

When I weigh the balance of risks around the medium-term outlook I laid out, I have some concerns about the potential for ambiguous or disappointing data. I also think that it is important to be realistic about the degree to which we are likely to have clarity in the near term about the direction of the economy. Both the quantity of information and the strength of the signal conveyed by the data will likely be limited. As of September, the FOMC will have in hand one more employment report, two reports on inflation, a revision to the second-quarter GDP data, and preliminary incoming signals about growth in the third quarter. I don't expect to have enough data to be sure of my outlook.

It’s still a little early to say with any confidence we won’t eventually see a pickup this quarter, and we can hope that the incoming August numbers show a more marked improvement. All we can say at this point is that after seeing most of the July data, it still feels like we’re stuck on the tarmac.

By Mike Bryan, vice president and senior economist,

Patrick Higgins, senior economist, and

Brent Meyer, economist, all in the Atlanta Fed's research department


August 30, 2013 in Data Releases, Economic Growth and Development, Economics, Forecasts, GDP | Permalink

TrackBack

TrackBack URL for this entry:
http://www.typepad.com/services/trackback/6a00d8341c834f53ef019aff17e3fc970c

Listed below are links to blogs that reference Still Waiting for Takeoff...:


August 19, 2013

Does Forward Guidance Reach Main Street?

The Federal Open Market Committee (FOMC) has been operating with two tools (well described in a recent speech by our boss here in Atlanta). The first is our large-scale asset purchase program, or QE to everyone outside of the Federal Reserve. The second is our forward guidance on the federal funds rate. Here’s what the fed funds guidance was following the July FOMC meeting:

[T]he Committee decided to keep the target range for the federal funds rate at 0 to 1/4 percent and currently anticipates that this exceptionally low range for the federal funds rate will be appropriate at least as long as the unemployment rate remains above 6-1/2 percent, inflation between one and two years ahead is projected to be no more than a half percentage point above the Committee's 2 percent longer-run goal, and longer-term inflation expectations continue to be well anchored. 

The quarterly projections of the June FOMC meeting participants give more specific guidance on the fed funds rate assuming “appropriate” monetary policy. All but one FOMC participant expects the funds rate to be lifted off the floor in 2015, with the median projection that the fed funds rate will be 1 percent by the end of 2015.



But forward guidance isn’t worth much if the public has a very different view of how long the fed funds rate will be held near zero. The Federal Reserve Bank of New York has a good read on Wall Street’s expectations for the federal funds rate. Its June survey of primary dealers (a set of institutions the Fed trades with when conducting open market operations) put a 52 percent chance on the fed funds rate rising from zero in 2015, and the median forecast of the group saw the fed funds rate at 0.75 percent at the end of 2015. In other words, the bond market is broadly in agreement with the fed funds rate projections made by FOMC meeting participants.

But what do we know about Main Street’s perspective on the fed funds rate? Do they even have an opinion on the subject?

Our perspective on Main Street comes from our panel of businesses who participate in the monthly Business Inflation Expectations (BIE) Survey. And we used our special question to the panel this month to see if we could gauge how, indeed whether, businesses have opinions about the future of the federal funds rate. Here’s the specific question we put to the group:

Currently the fed funds rate is near 0%. [In June, the Federal Reserve projected the federal funds rate to be 1% by the end of 2015.] Please assign a percentage likelihood to the following possible ranges for the federal funds rate at the end of 2015 (values should sum to 100%).

In the chart below, we plot the distribution of panelists’ median-probability forecast (the green bars) compared to the distribution of the FOMC’s June projection (we’ve simply smushed the FOMC’s dots into the appropriately categorized blue bars).

Seventy-five percent of our respondents had a median-probability forecast for the fed funds rate somewhere between 0.5 percent and 1.5 percent by the end of 2015. That compares very closely with the 73 percent of June FOMC meeting participants whose projections fell in that range.
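To make the "median-probability forecast" concrete: for each respondent, walk up the cumulative distribution implied by the answers and stop at the bin where it first reaches 50 percent. The rate ranges and the sample response below are invented for illustration; they are not the survey's actual bins or data:

```python
# Find a respondent's "median-probability forecast": the bin in which
# the cumulative probability of their answer first reaches 50%.
# Bin labels and probabilities below are illustrative only.

def median_bin(bins, probabilities):
    """Return the bin containing the median of a discrete distribution
    whose probabilities are expressed in percent (summing to 100)."""
    assert abs(sum(probabilities) - 100) < 1e-9, "probabilities should sum to 100%"
    cumulative = 0.0
    for label, p in zip(bins, probabilities):
        cumulative += p
        if cumulative >= 50:
            return label

bins = ["0-0.5%", "0.5-1%", "1-1.5%", "1.5-2%", "2%+"]
answer = [10, 40, 30, 15, 5]       # one panelist's percentage likelihoods
print(median_bin(bins, answer))    # cumulative probability hits 50% in the second bin
```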



You may have noticed in the above question a bracketed bit of information about the Federal Reserve’s forecast for the federal funds rate: “In June, the Federal Reserve projected the federal funds rate to be 1% by the end of 2015.” Actually, this bit of extra information was supplied only to half of our panel (selected at random). A comparison between these two panel subsets is shown in the chart below.


These two subsets are very similar. (If you squint, you might see that the green bars appear a little more diffuse, but this isn’t a statistically significant difference…we checked.) This result suggests that the extra bit of information we provided was largely extraneous. Our business panel seems to have already had enough information on which to make an informed prediction about the federal funds rate.

Finally, the data shown in the two figures above are for those panelists who opted to answer the question we posed. But not every firm chose to make a prediction for the federal funds rate. With this month’s special question, we instructed our panelists to “Please feel free to leave this question blank if you have no opinion.” A significant number of our panelists exercised this option.

The typical nonresponse rate from the BIE survey special question is about 2 percent. This month, it was 22 percent—which suggests that an unusually high share of our panel had no opinion on the future of the fed funds rate. What does this mean? Well, it could mean that a significant share of Main Street businesses are confused by the FOMC’s communications and are therefore unable to form an opinion. But a high nonresponse rate could also mean that some segment of Main Street businesses don’t believe that forward guidance on the fed funds rate affects their businesses much.
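For a rough sense of how far 22 percent sits from the 2 percent norm, a one-sample test for a proportion will do. The panel size used below is an assumed round number for illustration, not the BIE's actual sample size:

```python
# Back-of-the-envelope check that a 22% nonresponse rate is far outside
# a 2% norm: a one-sample z-test for a proportion. The panel size n is
# an illustrative assumption, not the survey's actual n.
import math

def proportion_z(observed, expected, n):
    """z-statistic for an observed proportion against an expected one."""
    standard_error = math.sqrt(expected * (1 - expected) / n)
    return (observed - expected) / standard_error

z = proportion_z(0.22, 0.02, n=200)
print(round(z, 1))  # many standard errors above the norm
```

Whatever n one assumes in that vicinity, the departure is overwhelming, which is why the jump from 2 percent to 22 percent demands an explanation like the two offered above.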

Unfortunately, the data we have don’t put us in a very good position to distinguish between confusion and apathy. Besides, we’re optimistic sorts. We’re going to emphasize that 78 percent of those businesses we surveyed responded to the question, and that typical response lined up pretty well with the opinions of FOMC meeting participants and the expectations of Wall Street. So, while not everyone is dialed in to our forward guidance, Main Street seems to get it.

By Mike Bryan, vice president and senior economist,

Brent Meyer, economist, and

Nicholas Parker, senior economic research analyst, all in the Atlanta Fed's research department


August 19, 2013 in Business Inflation Expectations, Economics, Fed Funds Futures, Federal Reserve and Monetary Policy | Permalink


Comments

Forward guidance can prove to be an effective tool for monetary policy, especially, when it is first implemented, as it is unexpected as well. Later on, however, its impact is diminished as it is only the change in expected guidance that might have an impact.

Posted by: Javier | September 21, 2013 at 11:50 AM


July 08, 2013

Let’s Talk about Oil

Given its role in touching nearly every aspect of life across the globe, and given high and volatile prices over the past half-decade, oil supply has been an incessant topic of conversation for much of our recent memory. Yet the tone of the conversation has pivoted dramatically of late, from arguments about whether peak oil or sky-high oil prices could spur a global economic meltdown (anyone remember 2008?) to the shifting energy balance resulting from rapidly growing oil production in North America.

Chip Cummins and Russell Gold recently published a piece in the Wall Street Journal discussing how new supply from U.S. shale oil and Canadian oil sands is helping to steady global oil prices.

Crude prices have remained remarkably stable over the past year in the face of a long list of supply disruptions, from Nigerian oil theft to Syrian civil war to an export standoff between Sudan and South Sudan. The reason in large part is a thick new blanket of North American oil cushioning the markets.

This chart helps demonstrate how quickly the oil landscape in the United States has indeed changed. The U.S. Energy Information Administration (EIA) expects national crude oil production to exceed net oil imports later this year, marking a rapid turnaround from the trend of ever-increasing reliance on imports.



However, despite the increase in U.S. oil production, global oil prices have stabilized at relatively high levels, as the chart below shows.



Yet the two seemingly opposing narratives—that of high oil prices and that of an emerging oil and gas abundance—are fundamentally linked. In fact, had it not been for such high oil prices, this new surge in North American oil production might not have happened. It is much more difficult to rationalize drilling activity in deep offshore areas, hard shale, or tar sands—from which, by nature, oil is expensive to produce—without high oil prices. (West Texas Intermediate, or WTI, oil averaged $31 per barrel in 2003, which, even in real terms, is only about two-fifths of today’s prices.) Analysts at Morgan Stanley estimate that the break-even point for Bakken (North Dakota) crude oil is about $70 per barrel and that even a price of $85 per barrel could squeeze out many of the unconventional producers.
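That real-terms comparison is easy to check. The CPI index levels and the 2013 WTI price below are approximate, for illustration only:

```python
# Rough check of the "about two-fifths in real terms" claim: deflate the
# 2003 WTI average by the CPI and compare with a 2013 price. CPI levels
# and the 2013 WTI price are approximate, for illustration only.
cpi_2003, cpi_2013 = 184.0, 233.0   # CPI-U annual averages (approx.)
wti_2003, wti_2013 = 31.0, 100.0    # $/barrel; 2013 figure is illustrative

real_2003_in_2013_dollars = wti_2003 * (cpi_2013 / cpi_2003)
ratio = real_2003_in_2013_dollars / wti_2013
print(round(real_2003_in_2013_dollars), round(ratio, 2))
```

Inflating $31 by roughly a quarter still leaves the 2003 price at well under half of prices in the $95 to $105 range that prevailed in mid-2013.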

What does all this mean for prices? Well, keep in mind that oil is a global commodity. So the roughly two million barrels of oil per day that have entered the market from the U.S. fracking boom represent a big shift domestically but only just over 2 percent of global oil consumption.

And while the United States is seeing growing oil supplies and moderating demand, a different trend is taking place globally, with rising demand from China and other emerging economies coupled with declining supply from older fields and OPEC efforts to keep prices higher through production limits.

However, not everyone believes that higher prices are here to stay. Some analysts have begun to warn that a price crash may be looming. Paul Stevens, an energy specialist with Chatham House, argues that we may be headed for a replay of the price crash in 1986 when high prices triggered demand destruction while bringing new, more expensive sources of supply to the market from the North Sea and Alaska.

Only time will tell where global oil prices will ultimately shake out, but for now, the larger supply cushion has certainly been a welcome development in the United States. Back to the Wall Street Journal article:

The new supply...is acting as a shock absorber in a global supply chain that pumps 88 million barrels of oil to consumers each day. That helps everyone from manufacturers to motorists, by steadying fuel prices and making budgeting easier.

By Laurel Graefe, Atlanta Fed REIN director, and

Rebekah Durham, economic policy analysis specialist at the New Orleans Branch of the Atlanta Fed

Authors’ note: We didn’t touch on the difference between WTI and Brent oil prices in this post, despite the fact that the changing global oil production landscape has undoubtedly contributed to that spread. For those interested, we recommend some recent analysis from the Energy Information Administration on the narrowing spread between WTI and Brent.


July 8, 2013 in Economics, Energy, Pricing | Permalink


Comments

A useful distinction is between the equilibrium price and the spot price which is notoriously volatile, in part, because of geopolitical risks in the Middle East. Increased production sourced in N. America reduces those risks and by adding 'spare capacity' also reduces overall costs by obviating the need for contingency arrangements.

Posted by: van schayk | July 09, 2013 at 12:42 PM

You only briefly mention moderating demand in the US, but the change in demand is about the same as the change in supply (US production). There has been lots of talk about increased production, but very little at the decreased domestic demand. Some of this is due to decreased miles driven, while some is due to higher CAFE standards prompting many new models to have significantly higher mileage than prior models (20% better for Altimas, Mazdas & others). Increased production is important, but reduced demand is equally important, and will be a better long-term solution as we will continue to see improvements as the nationwide fleet improves its mileage.

http://www.eia.gov/countries/country-data.cfm?fips=US#pet

Posted by: JimC | July 10, 2013 at 09:48 AM


July 05, 2013

A Quick Independence Day Weekend, Post-Employment Report Update

From what I gather, a lot of people took notice of this statement, from Chairman Bernanke’s June 19 press conference:

If the incoming data are broadly consistent with this forecast, the Committee currently anticipates that it would be appropriate to moderate the monthly pace of purchases later this year. And if the subsequent data remain broadly aligned with our current expectations for the economy, we would continue to reduce the pace of purchases in measured steps through the first half of next year, ending purchases around midyear. In this scenario, when asset purchases ultimately come to an end, the unemployment rate would likely be in the vicinity of 7 percent, with solid economic growth supporting further job gains, a substantial improvement from the 8.1 percent unemployment rate that prevailed when the Committee announced this program.

That 7 percent assessment to which the Chairman was referring comes, of course, from the outlook summarized in the Summary of Economic Projections, published following the June 18–19 meeting of the Federal Open Market Committee.

Here are the unemployment forecasts specifically:

Macroblog_2013-07-05A

The highlighted numbers represent the “central tendency” projections for the average fourth-quarter unemployment rate in 2013, 2014, and 2015 (in blue) and for the “longer run” (in green). Naturally enough, getting to a 6.5 percent to 6.8 percent unemployment rate in the fourth quarter of 2014 pretty much implies the unemployment rate crossing 7 percent sometime around the middle of next year.

So, how do things look after the June employment report? As is our wont, we turn to our Jobs Calculator to answer such questions, and come up with the following. If the U.S. economy creates 191,000 jobs per month (the average for the past 12 months), and the labor force participation rate stays at 63.5 percent (its June level), and all the other important assumptions (such as the ratio of establishment survey to household survey employment) remain the same, then the economy’s schedule looks like this:

Macroblog_2013-07-05B
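A stripped-down version of the Jobs Calculator's arithmetic can be sketched as follows. The population level and monthly population growth are illustrative round numbers, and unlike the actual calculator, this sketch ignores the establishment-to-household survey employment ratio:

```python
# Simplified sketch of Jobs Calculator arithmetic: hold the labor force
# participation rate fixed, grow payrolls by a constant monthly amount,
# and count months until the unemployment rate reaches a target.
# Population figures are illustrative; the actual calculator also
# adjusts for the payroll/household survey employment ratio.

def months_to_target(pop, lfpr, employed, jobs_per_month, target_urate,
                     pop_growth_per_month=180_000, max_months=600):
    """Count months until the unemployment rate falls to the target."""
    for months in range(max_months + 1):
        labor_force = pop * lfpr
        unemployment_rate = 1 - employed / labor_force
        if unemployment_rate <= target_urate:
            return months
        employed += jobs_per_month
        pop += pop_growth_per_month
    raise ValueError("target not reached within max_months")

# Illustrative mid-2013 magnitudes: civilian noninstitutional population
# around 245.5 million, LFPR 63.5%, unemployment rate about 7.6%.
pop = 245_500_000
lfpr = 0.635
employed = pop * lfpr * (1 - 0.076)
print(months_to_target(pop, lfpr, employed, 191_000, 0.07))
```

Under these assumptions the 7 percent threshold is crossed after about a year, which squares with the mid-2014 reading of the projections.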

Note also the implication of this statement...

[T]he Committee decided to keep the target range for the federal funds rate at 0 to 1/4 percent and currently anticipates that this exceptionally low range for the federal funds rate will be appropriate at least as long as the unemployment rate remains above 6-1/2 percent, inflation between one and two years ahead is projected to be no more than a half percentage point above the Committee's 2 percent longer-run goal, and longer-term inflation expectations continue to be well anchored.

...which certainly aids in understanding this information, from the last Summary of Economic Projections:

Macroblog_2013-07-05C

I will leave it to the principals to articulate whether today’s report materially changes anything contained in last month’s projections. In the meantime, enjoy your weekend.

By Dave Altig, executive vice president and research director of the Atlanta Fed


July 5, 2013 in Economics, Employment, Federal Reserve and Monetary Policy, Forecasts, Labor Markets | Permalink


Comments

The analysis describing the decline in the unemployment rate to 6.25-percent in July 2015 assumes that "the labor force participation rate stays at 63.5 percent."

Other than spring 2013, the last time the LFPR was lower than 63.5-percent was in May 1979 ... 34 years ago.  So I don't challenge your arithmetic, but find it highly improbable that the LFPR will stabilize at current levels as the economy expands. People flood into the labor market when jobs become easier to find.

The last time the unemployment rate was at (about) 6.25-percent was in October 2008, at which time the LFPR stood at 66-percent.  In the previous business cycle, the LFPR remained above 67-percent for an extended period between 1997 and 2001.

According to the Jobs Calculator, monthly job growth of 190,000 and a LFPR of 66% would bring the unemployment rate down to 6.25-percent about 94 months from now ... in mid-2021.

Posted by: Thomas Wyrick | July 08, 2013 at 09:33 PM


May 16, 2013

Labor Costs, Inflation Expectations, and the Affordable Care Act: What Businesses Are Telling Us

The Atlanta Fed’s May survey of businesses showed little overall concern about near-term inflation. Year-ahead unit cost expectations averaged 2 percent, down a tenth from April and on par with business inflation expectations at this time last year.

OK, we’re going to guess this observation doesn’t exactly knock you off your chair. But here’s something we’ve been keeping an eye on that you might find interesting. When we ask firms about what role, if any, labor costs are likely to play in their prices over the next 12 months, an increasing proportion have been telling us they see a potential for upward price pressure coming from labor costs (see the chart).



To investigate further, we posed a special question to our Business Inflation Expectations (BIE) panel regarding their expectations for compensation growth over the next 12 months: “Projecting ahead over the next 12 months, by roughly what percentage do you expect your firm’s average compensation per worker (including benefits) to change?”

We got a pretty large range of responses, but on average, firms told us they expect average compensation growth—including benefits—of 2.8 percent. That’s about a percentage point higher than the average over the past year (as estimated by either the index of compensation per hour or the employment cost index). But a 2.8 percent rise is also about a percentage point below average compensation growth before the recession. We’re inclined to read the survey as confirmation that labor markets are improving and are expected to improve further over the coming year. But we’re not inclined to interpret the survey data as an indication that the labor market is nearing full employment.

We’ve also been hearing more lately about the potential for the Affordable Care Act (ACA) to have a significant influence on labor costs and, presumably, to provide some upward price pressure. Indeed, several of our panelists commented on their concern about the influence of the ACA when they completed their May BIE survey. So can we tie any of this expected compensation growth to the ACA, a significant share of which is scheduled to go into effect eight months from now?

Because a disproportionate impact from the ACA will fall on firms that employ 50 or more workers, we separated our panel into firms with 50 or more employees, and those employing fewer than 50 workers. What we see is that average expected compensation growth is the same for the bigger employers and smaller employers. Moreover, the big firms in our sample report the same inflation expectation as the smaller firms.

But the data reveal that the bigger firms are a little more uncertain about their unit cost projections for the year ahead. OK, it’s not a big difference, but it is statistically significant. So while their cost and compensation expectations are not yet being affected by the prospect of the ACA, the act might be influencing their uncertainty about those potential costs.
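One simple way to compare uncertainty across the two groups is a ratio of sample variances in the reported year-ahead projections. The figures below are invented for illustration; they are not BIE survey responses:

```python
# Compare forecast dispersion across two groups of firms with a simple
# variance ratio (F-statistic). The sample values are invented for
# illustration; they are not actual BIE survey data.
import statistics

def variance_ratio(sample_a, sample_b):
    """Ratio of the larger sample variance to the smaller."""
    va, vb = statistics.variance(sample_a), statistics.variance(sample_b)
    return max(va, vb) / min(va, vb)

big_firms   = [1.0, 2.5, 3.0, 0.5, 2.0, 3.5, 1.5, 2.8]  # unit cost projections, %
small_firms = [1.8, 2.0, 2.2, 1.9, 2.1, 2.3, 1.7, 2.0]

f = variance_ratio(big_firms, small_firms)
print(round(f, 1))
```

With real survey data, the ratio would be far closer to one than in this exaggerated example; the point is only that a dispersion statistic, not a difference in means, is what picks up the big firms' extra uncertainty.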



By Mike Bryan, vice president and senior economist,

Brent Meyer, economist, and

Nicholas Parker, senior economic research analyst, all in the Atlanta Fed’s research department


May 16, 2013 in Business Inflation Expectations, Economics, Health Care, Inflation Expectations, Labor Markets, Pricing | Permalink


Comments

Maybe we're finally reaching the point where firms can no longer expropriate productivity gains. If you look at the total hourly compensation for non-supervisory workers vs. productivity, the last 40 years have more or less seen the gains made during the Great Compression utterly obliterated. Now that we're back to Gilded-Age levels of income distribution, it may be that we've reached an equilibrium.

Posted by: Valerie Keefe | May 19, 2013 at 12:22 PM
