
macroblog

August 12, 2014

Are We There Yet?

Editor’s note: This macroblog post was published yesterday with some content inadvertently omitted. Below is the complete post. We apologize for the error.

Anyone who has undertaken a long road trip with children will be familiar with the frequent “are we there yet?” chorus from the back seat. So, too, it might seem on the long post-2007 monetary policy road trip. When will the economy finally look like it is satisfying the Federal Open Market Committee’s (FOMC) dual mandate of price stability and full employment? The answer varies somewhat across the FOMC participants. The difference in perspectives on the distance still to travel is implicit in the range of implied liftoff dates for the FOMC’s short-term interest-rate tool in the Summary of Economic Projections (SEP).

So how might we go about assessing how close the economy truly is to meeting the FOMC’s objectives of price stability and full employment? In a speech on July 17, President James Bullard of the St. Louis Fed laid out a straightforward approach, as outlined in a press release accompanying the speech:

To measure the distance of the economy from the FOMC’s goals, Bullard used a simple function that depends on the distance of inflation from the FOMC’s long-run target and on the distance of the unemployment rate from its long-run average. This version puts equal weight on inflation and unemployment and is sometimes used to evaluate various policy options, Bullard explained.

We think that President Bullard’s quadratic-loss-function approach is a reasonable one. Chart 1 shows what you get using this approach, assuming a goal of year-over-year personal consumption expenditure inflation at 2 percent, and the headline U-3 measure of the unemployment rate at 5.4 percent. (As the U.S. Bureau of Labor Statistics defines unemployment, U-3 measures the total unemployed as a percent of the labor force.) This rate is about the midpoint of the central tendency of the FOMC’s longer-run estimate for unemployment from the June SEP.
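President Bullard's exact functional form isn't reproduced in the press release, but the description pins down its ingredients. A minimal sketch, assuming equal weights and a root-of-squared-gaps distance (the 1.5 percent inflation and 6.2 percent unemployment readings in the example are illustrative mid-2014 values, not chart data):

```python
from math import sqrt

INFLATION_TARGET = 2.0  # percent, year-over-year PCE inflation goal
U3_LONGRUN = 5.4        # percent, midpoint of the FOMC's longer-run estimate

def policy_distance(inflation, unemployment,
                    pi_star=INFLATION_TARGET, u_star=U3_LONGRUN):
    """Equal-weighted distance of (inflation, unemployment) from the goals."""
    return sqrt((inflation - pi_star) ** 2 + (unemployment - u_star) ** 2)

# At the goals the distance is zero; away from them it grows symmetrically.
print(round(policy_distance(1.5, 6.2), 2))  # 0.94
```

A series of these distances, computed month by month, is the sort of thing a chart like chart 1 traces out.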

Chart 1: Progress toward Objectives: U-3 Gap

Notice that the policy objective gap increased dramatically during the recession, but is currently at a low value that’s close to precrisis levels. On this basis, the economy has been on a long, uncomfortable trip but is getting pretty close to home. But other drivers of the monetary policy minivan may be assessing how far there is still to travel using a road map different from chart 1’s. For example, Atlanta Fed President Dennis Lockhart has highlighted the role of involuntary part-time work as a signal of slack that is not captured in the U-3 unemployment rate measure. Indeed, the last FOMC statement noted that

Labor market conditions improved, with the unemployment rate declining further. However, a range of labor market indicators suggests that there remains significant underutilization of labor resources.

So, while acknowledging the decline in U-3, the Committee is also indicating that other labor market indicators may point to somewhat greater residual slack in the labor market. For example, suppose we used the broader U-6 measure to compute the distance left to travel in President Bullard’s formula. The U-6 unemployment measure counts individuals who are marginally attached to the labor force as unemployed and, importantly, also counts involuntarily part-time workers as unemployed. One simple way to incorporate the U-6 gap is to compute the average difference between U-6 and U-3 prior to 2007 (excluding the 2001 recession), which was 3.9 percentage points, and add that to the U-3 longer-run estimate of 5.4 percent, giving an estimated longer-run U-6 rate of 9.3 percent. Chart 2 shows what you get if you run the numbers through President Bullard’s formula using this U-6 adjustment (scaling the U-6 gap by the ratio of the U-3 and U-6 steady-state estimates to put it on a U-3 basis).
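The arithmetic of that adjustment can be sketched directly. The longer-run numbers below are the ones quoted in the text; the 12.2 percent U-6 reading in the example is hypothetical:

```python
U3_LONGRUN = 5.4        # percent, longer-run U-3 estimate
PRE2007_AVG_DIFF = 3.9  # percentage points, average U-6 minus U-3 before 2007
U6_LONGRUN = U3_LONGRUN + PRE2007_AVG_DIFF  # 9.3 percent

def scaled_u6_gap(u6_rate):
    """U-6 gap, rescaled by the ratio of steady-state U-3 to U-6 so it
    can be compared with (or substituted for) a U-3 gap in the formula."""
    return (u6_rate - U6_LONGRUN) * (U3_LONGRUN / U6_LONGRUN)

# Example: a hypothetical U-6 reading of 12.2 percent
print(round(scaled_u6_gap(12.2), 2))  # 1.68
```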

Chart 2: Progress toward Objectives: U-3 Gap versus U-6 Gap

What the chart says is that, up until about four years ago, it didn’t really matter at all what your preferred measure of labor market slack was; they told a similar story because they tracked each other pretty closely. But currently, your view of how close monetary policy is to its goals depends quite a bit on whether you are a fan of U-3 or of U-6—or of something in between. I think you can put the Atlanta Fed’s current position as being in that “in-between” camp, or at least not yet willing to tell the kids that home is just around the corner.

In an interview last week with the Wall Street Journal, President Lockhart effectively put some distance between his own view and those who see the economy as being close to full employment. The Journal’s Real Time Economics blog quoted Lockhart:

“I’m not ruling out” the idea the Fed may need to raise short-term interest rates earlier than many now expect, Mr. Lockhart said in an interview with The Wall Street Journal. But, at the same time, “I’m a little bit cautious” about the policy outlook, and still expect that when the first interest rate hike comes, it will likely happen somewhere in the second half of next year.

“I remain one who is looking for further validation that we are on a track that is going to make the path to our mandate objectives pretty irreversible,” Mr. Lockhart said. “It’s premature, even with the good numbers that have come in ... to draw the conclusion that we are clearly on that positive path,” he said.

Mr. Lockhart said the current unemployment rate of 6.2% will likely continue to decline and tick under 6% by the end of the year. But, he said, there remains evidence of underlying softness in the job sector, and, he also said, while inflation shows signs of firming, it remains under the Fed’s official 2% target.

Our view is that the current monetary policy journey has made considerable progress toward its objectives. But the trip is not yet complete, and the road ahead remains potentially bumpy. In the meantime, I recommend these road-trip sing-along selections.

By John Robertson, a vice president and senior economist in the Atlanta Fed’s research department


August 12, 2014 in Economics, Employment, Federal Reserve and Monetary Policy, Inflation, Labor Markets, Monetary Policy, Pricing, Unemployment | Permalink


Comments

Major problems with U6 include the fact that someone working 34 hours who wants to work 35 or more is considered unemployed (not partially unemployed) -- a very loose definition of an unemployed person. Also, some policymakers conflate the marginally attached with discouraged workers. Only one-third of the marginally attached are discouraged about job prospects (the other two-thirds didn't look for work because of illness, school, etc. -- i.e., for reasons monetary policy cannot address). So there are very good reasons for President Bullard's objective function to be based on U3 rather than U6. Additionally, what policymakers should consider, to follow through with your analogy, is this: when you arrive at your destination, should you still have the accelerator pressed to the floor? Or does it not make sense to let off the gas a bit as you approach your destination (to avoid driving the minivan right through your home)?

Posted by: Conrad DeQuadros | August 14, 2014 at 12:57 PM


August 08, 2014

Getting There?

To say that last week was somewhat eventful on the macroeconomic data front is probably an exercise in understatement. Relevant numbers on GDP growth (past and present), employment and unemployment, and consumer price inflation came in quick succession.

These data provide some of the context for our local Federal Open Market Committee participant’s comments this week (for example, in the Wall Street Journal’s Real Time Economics blog, with similar remarks made in an interview on CNBC’s Closing Bell). From that Real Time Economics blog post:

Although the economy is clearly growing at a respectable rate, Federal Reserve Bank of Atlanta President Dennis Lockhart said Wednesday it is premature to start planning an early exit from the central bank’s ultra-easy policy stance.

“I’m not ruling out” the idea the Fed may need to raise short-term interest rates earlier than many now expect, Mr. Lockhart said in an interview with The Wall Street Journal. But, at the same time, “I’m a little bit cautious” about the policy outlook, and still expect that when the first interest rate hike comes, it will likely happen somewhere in the second half of next year.

“I remain one who is looking for further validation that we are on a track that is going to make the path to our mandate objectives pretty irreversible,” Mr. Lockhart said. “It’s premature, even with the good numbers that have come in...to draw the conclusion that we are clearly on that positive path,” he said.

Why so “cautious”? Here’s the Atlanta Fed staff’s take on the state of things, starting with GDP:

With the annual benchmark revision in hand, 2013 looks like the real deal, the year that the early bet on an acceleration of growth to the 3 percent range finally panned out. Notably, fiscal drag (following the late-2012 budget deal), which had been our go-to explanation of why GDP appeared to have fallen short of expectations once again, looks much less consequential on revision.

Is 2014 on track for a repeat (or, more specifically, comparable performance looking through the collection of special factors that weighed on the first quarter)? The second-quarter bounce of real GDP growth to near 4 percent seems encouraging, but we are not yet overly impressed. Final sales—a number that looks through the temporary contribution of changes in inventories—clocked in at a less-than-eye-popping 2.3 percent annual rate.

Furthermore, given the significant surprise in the first-quarter final GDP report when the medical-expenditure-soaked Quarterly Services Survey was finally folded in, we’re inclined to be pretty careful about over-interpreting the second quarter this early. It’s way too early for a victory dance.

Regarding labor markets, here is our favorite type of snapshot, courtesy of the Atlanta Fed’s Labor Market Spider Chart:

Atlanta Fed Labor Market Spider Chart

There is a lot to like in that picture. Leading indicators, payroll employment, vacancies posted by employers, and small business confidence are fully recovered relative to their levels at the end of the Great Recession.

On the less positive side, the numbers of people who are marginally attached or who are working part-time while desiring full-time hours remain elevated, and the overall job-finding rate is still well below prerecession levels. Even so, these indicators are noticeably better than they were at this time last year.

That year-over-year improvement is an important observation: the period from mid-2012 to mid-2013 showed little progress in the broader measures of labor-market performance that we place in the resource “utilization” category. During the past year, these broad measures have improved at the same relative pace as the standard unemployment statistic.

We have been contending for some time that part-time for economic reasons (PTER) is an important factor in understanding ongoing sluggishness in wage growth, and we are not yet seeing anything much in the way of meaningful wage pressures:

Total Private Earnings, year/year % change, sa

There was, to be sure, a second-quarter spike in the employment cost index (ECI) measure of labor compensation growth, but that increase followed a sharp dip in the first quarter. Maybe the most recent ECI reading is telling us something that hourly earnings are not, but that still seems like a big maybe. Outside of some specific sectors and occupations (in manufacturing, for example), there is not much evidence of accelerating wage pressure in either the data or in anecdotes we get from our District contacts. We continue to believe that wage growth is most consistent with the view that labor market slack persists, and underlying inflationary pressures (from wage costs, at least) are at bay.

To be sure, it is dubious to claim that wages help much in predicting inflation going forward (as shown, for example, in work from the Chicago Fed, confirming earlier research from our colleagues at the Cleveland Fed). And in any event, we are inclined to agree that the inflation outlook has, in fact, firmed up. At this time last year, it was hard to argue that the inflation trend was moving in the direction of the Committee’s objective (or even that it was not actually declining).

But here again, a declaration that the risks have clearly shifted in the direction of overshooting the FOMC’s inflation goals seems wildly premature. Transitory factors have clearly elevated recent statistics. The year-over-year inflation rate is still only 1.5 percent, and by most cuts of the data, the trend still looks as close to that level as to 2 percent.

'Trends' in the June Core PCE

We do expect measured inflation trends to continue to move in the direction of 2 percent, but sustained performance toward that objective is still more conjecture than fact. (By the way, if you are bothered by the appeal to a measure of core personal consumption expenditures in that chart above, I direct you to this piece.)

All of this is by way of explaining why we here in Atlanta are “a little bit cautious” about joining any chorus singing from the we’re-moving-on-up songbook. Paraphrasing from President Lockhart’s comments this week, the first steps to policy normalization don’t have to wait until the year-over-year inflation rate is consistently at 2 percent, or until all of the slack in the labor market is eliminated. But it is probably prudent to be fairly convinced that progress to those ends is unlikely to be reversed.

We may be getting there. We’re just not quite there yet.

By Dave Altig, executive vice president and research director of the Atlanta Fed


August 8, 2014 in Economic conditions, Economics, Employment, Federal Reserve and Monetary Policy, GDP, Inflation, Labor Markets | Permalink


April 28, 2014

New Data Sources: A Conversation with Google's Hal Varian

In recent years, there has been an explosion of new data coming from places like Google, Facebook, and Twitter. Economists and central bankers have begun to realize that these data may provide valuable insights into the economy that inform and improve the decisions made by policy makers.

As chief economist at Google and emeritus professor at UC Berkeley, Hal Varian is uniquely qualified to discuss the issues surrounding these new data sources. Last week he was kind enough to take some time out of his schedule to answer a few questions about these data, the benefits of using them, and their limitations.

Mark Curtis: You've argued that new data sources from Google can improve our ability to "nowcast." Can you describe what this means and how the enormous amount of data that Google collects can be used to better understand the present?
Hal Varian: The simplest definition of "nowcasting" is "contemporaneous forecasting," though I do agree with David Hendry that this definition is probably too simple. Over the past decade or so, firms have spent billions of dollars to set up real-time data warehouses that track business metrics on a daily level. These metrics could include retail sales (like Wal-Mart and Target), package delivery (UPS and FedEx), credit card expenditure (MasterCard's SpendingPulse), employment (Intuit's small business employment index), and many other economically relevant measures. We have worked primarily with Google data, because it's what we have available, but there are lots of other sources.

Curtis: The ability to "nowcast" is also crucially important to the Fed. In his December press conference, former Fed Chairman Ben Bernanke stated that the Fed may have been slow to acknowledge the crisis in part due to deficient real-time information. Do you believe that new data sources such as Google search data might be able to improve the Fed's understanding of where the economy is and where it is going?
Varian: Yes, I think that this is definitely a possibility. The real-time data sources mentioned above are a good starting point. Google data seems to be helpful in getting real-time estimates of initial claims for unemployment benefits, housing sales, and loan modification, among other things.
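A toy version of this kind of nowcast can be written in a few lines. The weekly figures below are invented for illustration, and a real exercise (in the spirit of Choi and Varian's "Predicting the Present with Google Trends") would use far more observations and richer lag structures:

```python
from statistics import mean

# Hypothetical weekly data: a search-query index (available in real time)
# and initial unemployment claims in thousands (published with a lag).
search_idx = [48.0, 47.0, 55.0, 60.0, 57.0, 52.0]
claims = [325.0, 322.0, 348.0, 365.0, 355.0, 338.0]

# Fit y = a + b*x by ordinary least squares on the history we have.
mx, my = mean(search_idx), mean(claims)
b = sum((x - mx) * (y - my) for x, y in zip(search_idx, claims)) / \
    sum((x - mx) ** 2 for x in search_idx)
a = my - b * mx

# "Nowcast" the not-yet-published week from this week's search activity.
print(round(a + b * 50.0, 1))  # 331.7
```

The point is only the mechanics: a real-time series stands in for an official one until the official number arrives.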

Curtis: Janet Yellen stated in her first press conference as Fed Chair that the Fed should use other labor market indicators beyond the unemployment rate when measuring the health of labor markets. (The Atlanta Fed publishes a labor market spider chart incorporating a variety of indicators.) Are there particular indicators that Google produces that could be useful in this regard?
Varian: Absolutely. Queries related to job search seem to be indicative of labor market activity. Interestingly, queries having to do with killing time also seem to be correlated with unemployment measures!

Curtis: What are the downsides or potential pitfalls of using these types of new data sources?
Varian: First, the real measures—like credit card spending—are probably more indicative of actual outcomes than search data. Search is about intention, and spending is about transactions. Second, there can be feedback from news media and the like that may distort the intention measures. A headline story about a jump in unemployment can stimulate a lot of "unemployment rate" searches, so you have to be careful about how you interpret the data. Third, we've only had one recession since Google has been available, and it was pretty clearly a financially driven recession. But there are other kinds of recessions having to do with supply shocks, like energy prices, or monetary policy, as in the early 1980s. So we need to be careful about generalizing too broadly from this one example.

Curtis: Given the proliferation of new data coming from Google, Twitter, and Facebook, do you think that this will limit, or even make obsolete, the role of traditional government statistical agencies such as the Census Bureau and the Bureau of Labor Statistics in the future? If not, do you believe there is the potential for collaboration between these agencies and companies such as Google?
Varian: The government statistical agencies are the gold standard for data collection. It is likely that real-time data can be helpful in providing leading indicators for the standard metrics, and supplementing them in various ways, but I think it is highly unlikely that they will replace them. I hope that the private and public sector can work together in fruitful ways to exploit new sources of real-time data in ways that are mutually beneficial.

Curtis: A few years ago, former Fed Chairman Bernanke challenged researchers when he said, "Do we need new measures of expectations or new surveys? Information on the price expectations of businesses—who are, after all, the price setters in the first instance—as well as information on nominal wage expectations is particularly scarce." Do data from Google have the potential to fill this need?
Varian: We have a new product called Google Consumer Surveys that can be used to survey a broad audience of consumers. We don't have ways to go after specific audiences such as business managers or workers looking for jobs. But I wouldn't rule that out in the future.

Curtis: MIT recently introduced a big-data measure of inflation called the Billion Prices Project. Can you see a big future in big data as a measure of inflation?
Varian: Yes, I think so. I know there are also projects looking at supermarket scanner data and the like. One difficulty with online data is that it leaves out gasoline, electricity, housing, large consumer durables, and other categories of consumption. On the other hand, it is quite good for discretionary consumer spending. So I think that online price surveys will enable inexpensive ways to gather certain sorts of price data, but it certainly won't replace existing methods.

By Mark Curtis, a visiting scholar in the Atlanta Fed's research department


April 28, 2014 in Economics, Forecasts, Technology, Web/Tech | Permalink


January 31, 2014

A Brief Interview with Sergio Rebelo on the Euro-Area Economy

Last month, we at the Atlanta Fed had the great pleasure of hosting Sergio Rebelo for a couple of days. While he was here, we asked Sergio to share his thoughts on a wide range of current economic topics. Here is a snippet of a Q&A we had with him about the state of the euro-area economy:

Sergio, what would you say was the genesis of the problems the euro area has faced in recent years?

The contours of the euro area’s problems are fairly well known. The advent of the euro gave peripheral countries—Ireland, Spain, Portugal, and Greece—the ability to borrow at rates that were similar to Germany's. This convergence of borrowing costs was encouraged through regulation that allowed banks to treat all euro-area sovereign bonds as risk free.

The capital inflows into the peripheral countries were not, for the most part, directed to the tradable sector. Instead, they financed increases in private consumption, large housing booms in Ireland and Spain, and increases in government spending in Greece and Portugal. The credit-driven economic boom led to a rise in labor costs and a loss of competitiveness in the tradable sector.

Was there a connection between the financial crisis in the United States and the sovereign debt crisis in the euro area?

Simply put, after Lehman Brothers went bankrupt, we had a sudden stop of capital flows into the periphery, similar to that experienced in the past by many Latin American countries. The periphery boom quickly turned into a bust.

What do you see as the role for euro area monetary policy in that context?

It seems clear that more expansionary monetary policy would have been helpful. First, it would have reduced real labor costs in the peripheral countries. In those countries, the presence of high unemployment rates moderates nominal wage increases, so higher inflation would have reduced real wages. Second, inflation would have reduced the real value of the debts of governments, banks, households, and firms. There might have been some loss of credibility on the part of the ECB [European Central Bank], resulting in a small inflation premium on euro bonds for some time. But this potential cost would have been worth paying in return for the benefits.

And did this happen?

In my view, the ECB did not follow a sufficiently expansionary monetary policy. In fact, the euro-area inflation rate has been consistently below 2 percent, and the euro is relatively strong when compared to a purchasing-power-parity benchmark. The euro area turned to contractionary fiscal policy as a panacea. There are good theoretical reasons to believe that—when the interest rate remains constant, so that the central bank does not cushion the fall in government spending—the multiplier effect of government spending cuts can be very large. See, for example, Gauti Eggertsson and Michael Woodford, “The Zero Bound on Interest Rates and Optimal Monetary Policy,” and Lawrence Christiano, Martin Eichenbaum, and Sergio Rebelo, “When Is the Government Spending Multiplier Large?”

Theory aside, the results of the austerity policies implemented in the euro area are clear. All of the countries that underwent this treatment are now much less solvent than in the beginning of the adjustment programs managed by the European Commission, the International Monetary Fund, and the ECB.

Bank stress tests have become a cornerstone of macroprudential financial oversight. Do you think they helped stabilize the situation in the euro area during the height of the crisis in 2010 and 2011?

No. Quite the opposite. I think the euro-area problems were compounded by the weak stress tests conducted by the European Banking Authority in 2011. Almost no banks failed, and almost no capital was raised. Banks largely increased their capital-to-asset ratios by reducing assets, which resulted in a credit crunch that added to the woes of the peripheral countries.

But we’re past the worst now, right? Is the outlook for the euro-area economy improving?

After hitting the bottom, a very modest recovery is under way in Europe. But the risk that a Japanese-style malaise will afflict Europe is very real. One useful step on the horizon is the creation of a banking union. This measure could potentially alleviate the severe credit crunch afflicting the periphery countries.

Thanks, Sergio, for this pretty sobering assessment.

By John Robertson, a vice president and senior economist in the Atlanta Fed’s research department

Editor’s note: Sergio Rebelo is the Tokai Bank Distinguished Professor of International Finance at Northwestern University’s Kellogg School of Management. He is a fellow of the Econometric Society, the National Bureau of Economic Research, and the Center for Economic Policy Research.


January 31, 2014 in Banking, Capital and Investment, Economics, Europe, Interest Rates, Monetary Policy | Permalink


December 23, 2013

Goodwill to Man

By pure coincidence, two interviews with Pennsylvania State University professor Neil Wallace have been published in recent weeks. One is in the December issue of the Federal Reserve Bank of Minneapolis’ excellent Region magazine. The other, conducted by Chicago Fed economist Ed Nosal and yours truly, is slated for the journal Macroeconomic Dynamics and is now available as a Federal Reserve Bank of Chicago working paper.

If you have any interest at all in the history of monetary theory over the past 40 years or so, I highly recommend to you these conversations. As Ed and I note of Professor Wallace in our introductory comments, very few people have such a coherent view of their own intellectual history, and fewer still have lived that history in such a remarkably consequential period for their chosen field.

Perhaps my favorite part of our interview was the following, where Professor Wallace reveals how he thinks about teaching economics, and macroeconomics specifically (link added):

If we were to construct an economics curriculum, independent of where we’ve come from, then what would it look like? The first physics I ever saw was in high school... I can vaguely remember something about frictionless inclined planes, and stuff like that. So that is what a first physics course is; it is Newtonian mechanics. So what do we have in economics that is the analogue of Newtonian mechanics? I would say it is the Arrow-Debreu general competitive model. So that might be a starting point. At the undergraduate level, do we ever actually teach that model?

[Interviewers] That means that you would not talk about money in your first course.

That is right. Suppose we taught the Arrow-Debreu model. Then at the end we’d have to say that this model has certain shortcomings. First of all, the equilibrium concept is a little hokey. It’s not a game, which is to say there are no outcomes associated with other than equilibrium choices. And second, where do the prices come from? You’d want to point out that the prices in the Arrow-Debreu model are not the prices you see in the supermarket because there’s no one in the model writing down the prices. That might take you to strategic models of trade. You would also want to point out that there are a lot of serious things in the world that we think we see that aren’t in the model: unemployment, money, and [an interesting notion of] firms aren’t in the Arrow-Debreu model. What else? Investing in innovation, which is critical to growth, isn’t in that model. Neither is asymmetric information. The curriculum, after this grounding in the analogue of Newtonian mechanics, which is the Arrow-Debreu model, would go into these other things. It would talk about departures from that theory to deal with such things; and it would describe unsolved problems.

So that’s a vision of a curriculum. Where would macro be? One way to think about macro is in terms of substantive issues. From that point of view, most of us would say macro is about business cycles and growth. Viewed in terms of the curriculum I outlined, business cycles and growth would be among the areas that are not in the Arrow-Debreu model. You can talk about attempts to shove them in the model, and why they fall short, and what else you can do.

Of the many things that I have learned from Professor Wallace, this one comes back to me again and again: Talk about how to get the things in the model that are essential to dealing with the unsolved problems, honestly assess why they fall short, and explore what else you can do. To me, this is not only a message of good science. It is one of intellectual generosity, the currency of good citizenship.

I was recently asked whether I align with “freshwater” or “saltwater” economics (roughly, I guess, whether I think of myself as an Arrow-Debreu type or a New Keynesian type). There are many similar questions that come up. Are you a policy “hawk” or a policy “dove”? Do you believe in old monetarism (willing to write papers with reduced-form models of money demand) or new monetarism (requiring, for example, some explicit statement about the frictions, or deviations from Arrow-Debreu, that give rise to money’s existence)?

What I appreciate about the Wallace formulation is that it asks us to avoid thinking in these terms. There are problems to solve. The models that we bring to those problems are not true or false. They are all false, and we—in the academic world and in the policy world—are on a common journey to figure out what we are missing and what else we can do.

It is deeply misguided to treat models as if they are immutable truths. All good economists appreciate this intellectually. And yet there is an awful lot of energy wasted, especially in the blogosphere, on casting aspersions at those who are perceived to be seeking answers within other theoretical tribes.

Some problems are well-suited to Newtonian mechanics, some are not. Some amendments to Arrow-Debreu are useful; some are not. And what is well-suited or useful in some circumstances may well be ill-suited or even harmful in others. Perhaps if we all acknowledge that none of us knows which is which 100 percent of the time, we can make just a little more progress on all those unsolved problems in the coming year. At a minimum, we would air our disagreements with a lot more civility.

Happy holidays.

By Dave Altig, executive vice president and research director at the Atlanta Fed


December 23, 2013 in Economics, Education, Monetary Policy | Permalink


Comments

This is a surprisingly simplistic point of view, c'mon. That particular debate is not about which model is right (all are wrong in one way or another, yes), but about what economists should do when their model turns out not to reflect real developments nearly as well as other models do.

Posted by: Konstantin | December 25, 2013 at 08:39 AM


September 23, 2013

The Dynamics of Economic Dynamism

Earlier today, Atlanta Fed President Dennis Lockhart gave a speech at the Creative Leadership Summit of the Louise Blouin Foundation. He posed the questions: Is the economic dynamism of the United States declining? Is America losing its economic mojo? He observed:

“... we see a picture in which fewer firms are expanding, and each expanding firm is adding fewer new jobs on average than in the past. Fewer firms are shrinking, and each is downsizing by less on average. Fewer people are being laid off or are quitting their job, and firms are hiring fewer people. In other words, the employment dynamics of the U.S. economy are slower.”

The decline in job creation and destruction was also the theme of this recent macroblog post by Mark Curtis, which featured some pretty nifty dynamic charts of trends in job creation and destruction by industry and geography.

Identifying the policy implications of these slower dynamics requires careful diagnosis of the causal factors underlying the trends. The cutting edge of economic research looking at this issue was featured at the 2013 Comparative Analysis of Enterprise Data Conference hosted last week by the Atlanta Census Research Data Center (ACRDC), which is housed at the Atlanta Fed and directed by one of our senior research economists, Julie Hotchkiss. Through the ACRDC, qualified researchers in Atlanta and around the Southeast can perform statistical analyses on non-public Census microdata.

The agenda and papers presented at the conference are located here. Some of the papers, I think, were particularly relevant to what President Lockhart discussed. A few examples:

“Reallocation in the Great Recession: Cleansing or Not?” by Lucia Foster and Cheryl Grim of the Center for Economic Studies at the U.S. Census Bureau and John Haltiwanger at the University of Maryland looked at the so-called “cleansing hypothesis,” in which recessions are not only periods of outsized job creation and destruction, but they are also periods in which the reallocation is especially productivity enhancing. They find that while previous recessions fit this pattern reasonably well, they do not see this kind of activity in the most recent recession. In fact, they find that in the manufacturing sector, the intensity of reallocation fell rather than rose (because of the especially sharp decline in job creation), and the reallocation that did occur was less productivity enhancing than in prior recessions.

“How Firms Respond to Business Cycles: The Role of Firm Age and Firm Size,” by Javier Miranda, Teresa Fort, John Haltiwanger and Ron Jarmin, looked at the varying impact of recessions on firms by size and age. They show that young businesses (which are typically small) exhibit very different cyclical dynamics than small/older businesses and are more sensitive to the cycle than larger/older businesses. The paper also explores explanations for the finding that young/small businesses were hit especially hard during the last recession. They identify the collapse in housing prices as a primary culprit, with the decline in job creation at young firms especially pronounced in states with a large drop in housing prices.

As a side note, although not presented at the conference, “The Secular Decline in Business Dynamism in the U.S.,” a new paper by Ryan Decker, John Haltiwanger, Ron Jarmin and Javier Miranda, analyzes the overall secular decline in job reallocation across industries. They find that changes in industry composition (the decline in manufacturing and rise of service industries) are not driving the decline. Instead, the primary driver seems to be the decline in the pace of entrepreneurship and the accompanying decline in the share of young firms in the economy.

Finally, Steve Davis of the University of Chicago talked about his joint research with John Haltiwanger, Kyle Handley, Ron Jarmin, Josh Lerner, and Javier Miranda on the role of private equity in employment dynamics. Private-equity critics claim that leveraged buyouts bring huge job losses. Davis shows that private-equity buyouts are indeed followed by a decline in net employment at the acquired firms relative to controls (similar firms that were not targets of a buyout). However, that net change pales in comparison with the gross job creation and destruction that typically occurs within the target firm after the buyout. In particular, he finds that although a buyout reduces employment at the firm's existing establishments (including through the sale of some establishments to other firms), jobs are created at new establishments within the firm through acquisitions and openings. Moreover, the authors show that this reallocation is generally productivity enhancing for the firm. Although the data used in the study go only through the mid-2000s, it seems reasonable to infer from the findings that the decline in private-equity deals during and since the last recession has contributed to the overall lower level of employment dynamics in this recovery.

The Comparative Analysis of Enterprise Data Conference was an excellent representation of the type of high-quality research being conducted on questions that go to the heart of the cyclical-versus-structural debate about the future course of the U.S. economy. While this is an exciting and important time for researchers in this field, it is troubling to learn that the programs that collect the data used in these types of studies are being trimmed because of federal budget cuts.

By John Robertson, vice president and senior economist in the Atlanta Fed’s research department


September 23, 2013 in Economics, Employment, Unemployment | Permalink


September 13, 2013

Job Reallocation over Time: Decomposing the Decline

One of the primary ways an economy expands is by quickly reallocating resources to the places where they are most productive. If new and productive firms are able to quickly grow and unproductive firms can quickly shrink, then the economy as a whole will experience faster growth and the many benefits (such as lower unemployment and higher wages) that are associated with that growth. Certain individuals may experience unemployment spells from this reallocation, but economists, starting with Joseph Schumpeter, have found that reallocation is associated with economic growth and wage growth, particularly for young workers.

Recently, a number of prominent economists such as John Haltiwanger have expressed concern that falling reallocation rates in the United States are a major contributor to the slow economic recovery. One simple way to quantify the speed of reallocation is to examine the job creation rate—defined as the number of new jobs in expanding firms divided by the total number of jobs in the economy—and the destruction rate, defined likewise but using the number of jobs lost by contracting firms. Chart 1 plots both the creation and the destruction rates of the U.S. economy starting in 1977. These measures track each other closely with creation rates exceeding destruction rates during periods of economic growth and vice versa during recessions. The most recent recession saw a particularly sharp decline in job creation (you can highlight the creation rate by clicking on the line), but it is clear this decline is part of a larger trend that far predates the current period. A decline in these rates could indicate less innovation or less labor market flexibility, both of which are likely to retard economic growth. Feel free to explore the measures for yourself using the figure’s interactivity.

To better understand these important trends we create a common variable called reallocation, which is defined as total jobs created plus total jobs destroyed, divided by total jobs in the economy. This formula creates one measure that describes how quickly jobs are moving from shrinking firms to expanding firms. Using data from the U.S. Census Bureau’s Business Dynamic Statistics, we examine differences in this variable across sectors and across states. Furthermore, using some basic data visualization tools, we can see how reallocation has evolved over time across these dimensions.
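The job-flow measures defined above can be sketched in a few lines of code. The firm-level employment figures here are invented for illustration (the actual numbers come from the Census Bureau's Business Dynamics Statistics), and the denominator follows the post's definition—total jobs in the economy—rather than the average-employment denominator common in the research literature.

```python
# Sketch of the job creation, destruction, and reallocation rates
# described above, using made-up firm-level employment data.

def job_flow_rates(emp_prev, emp_curr):
    """Compute creation, destruction, and reallocation rates from
    firm-level employment in two consecutive periods."""
    created = sum(max(c - p, 0) for p, c in zip(emp_prev, emp_curr))
    destroyed = sum(max(p - c, 0) for p, c in zip(emp_prev, emp_curr))
    total_jobs = sum(emp_curr)  # denominator: total jobs in the economy
    creation_rate = created / total_jobs
    destruction_rate = destroyed / total_jobs
    reallocation_rate = creation_rate + destruction_rate
    return creation_rate, destruction_rate, reallocation_rate

# Three hypothetical firms: one expanding, one shrinking, one flat
prev = [100, 80, 50]
curr = [120, 60, 50]
c, d, r = job_flow_rates(prev, curr)
print(c, d, r)  # 20 jobs created, 20 destroyed, against 230 total jobs
```

Note that even though net employment is unchanged in this toy example, the reallocation rate is positive—which is exactly why gross flows reveal dynamism that net job numbers hide.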

Chart 2 plots reallocation rates by industry from 1977 to 2011. The plot highlights the reallocation rate for all industries, but you can also select or deselect any industry to more clearly view how it has changed over time. Scrolling over the lines allows you to view the exact rates by industry in any time period. A few interesting patterns emerge. First, sectors have different levels of job reallocation in the cross section. Manufacturing stands out as having particularly low reallocation rates, probably the result of the large fixed capital requirements of production. Second, not all industries experienced sharp declines during this period. If you highlight the finance, insurance, and real estate sector, it is evident that reallocation rates actually increased for this sector until the most recent recession. Retail and construction, on the other hand, have experienced steady and significant declines during the past 35 years.

Chart 3 maps reallocation rates across states for the year 1977. This figure provides us with a cross-sectional view of geographical differences in reallocation rates. States with the highest reallocation rates are dark brown, and states with the lowest rates are light brown. You can click through the years to visually capture how these rates have changed over time for each state. Compare the color of the map in 1977 with the color in 2011. Scroll the mouse over any state to view that state’s reallocation rate in the particular year.

As with industries, states display clear cross-sectional differences in their reallocation rates. The highest rates are found in western states, Florida, and Texas, and the lowest are in the Midwest. Scrolling through the years shows that the decline in reallocation rates is common to the entire country.

Overall, these figures display a stark trend. The economy is reallocating jobs at much slower rates than 20 or even 10 years ago, and this decline is, with only a few exceptions, common across states and industries. Economists are just now starting to explore the causes of this trend, and a single, compelling explanation has yet to emerge. But some explanation is clearly in order and clearly important for economic policymakers, monetary and otherwise.

By Mark Curtis, a visiting scholar in the Atlanta Fed's research department

Please note that the charts and maps in this post were updated and improved on November 27, 2013.


September 13, 2013 in Economics, Employment, Labor Markets | Permalink


August 30, 2013

Still Waiting for Takeoff...

On Thursday, we got a revised look at the economy’s growth rate in the second quarter. While the 2.5 percent annualized rate was a significant upward revision from the preliminary estimate, it comes off a mere 1.1 percent growth rate in the first quarter. That combines for a subpar first-half growth rate of 1.8 percent. OK, it’s growth, but not as strong as one would expect for a U.S. expansion and clearly a disappointment to the many forecasters who had once (again) expected this to be the year the U.S. economy shakes itself out of the doldrums.

Now, we’re not blind optimists when it comes to the record of economic forecasts. We know well that the evidence says you shouldn’t get overly confident in your favorite economists’ prediction. Most visions of the economy’s future have proven to be blurry at best.

Still, we at the Atlanta Fed want to know how to best interpret this upward revision to the second-quarter growth estimate and how it affects our president’s baseline forecast “for a pickup in real GDP growth over the balance of 2013, with a further step-up in economic activity as we move into 2014.”

What we can say about the report is that the revised second-quarter growth estimate is a decided improvement from the first quarter and a modest bump up from the recent four-quarter growth trend (1.6 percent). And there are some positive indicators within the GDP components. For example, real exports posted a strong turnaround last quarter, presumably benefiting from Europe’s emerging from its recession. And the negative influence of government spending cuts, while still evident in the data, was much smaller than during the previous two quarters.  Oh, and business investment spending improved between the first and second quarters.

All good, but these data simply give us a better fix on where we were in the second quarter, not necessarily a good signal of where we are headed. To that we turn to our “nowcast” estimate for the third quarter based on the incoming monthly data (the evolution of which is shown in the table below).

A "nowcasting" exercise generates quarterly GDP estimates in real time. The technical details of this exercise are described here, but the idea is fairly simple. We use incoming data on 100-plus economic series to forecast 12 components of GDP for the current quarter. We then aggregate those forecasts of GDP components to get a current-quarter estimate of overall GDP growth.

We caution that, unlike others, our nowcast involves no judgmental interpretation of the incoming data. It is purely a statistical exercise: we let the data speak for themselves.
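A stylized version of the aggregation step described above might look like the following. The component names, expenditure shares, and growth forecasts below are illustrative assumptions only—the actual exercise, as the post notes, draws on 100-plus series to forecast 12 GDP components, and this sketch simplifies that to a share-weighted sum of component growth rates.

```python
# Toy GDP nowcast aggregation: combine component growth forecasts
# using expenditure shares. All numbers are hypothetical.

components = {
    # name: (share of GDP, forecast annualized real growth in percent)
    "consumption":     (0.68, 1.7),
    "business_inv":    (0.12, 2.0),
    "residential_inv": (0.03, 5.3),
    "government":      (0.19, 0.5),
    "net_exports":     (-0.02, 0.0),
}

# Share-weighted sum of component growth rates approximates
# each component's contribution to overall GDP growth
gdp_growth = sum(share * growth for share, growth in components.values())
print(round(gdp_growth, 2))  # annualized GDP growth estimate, percent
```

As new monthly data arrive, the component forecasts (and hence the headline nowcast) get revised—which is exactly the evolution the table in the post tracks.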

Given the first data point of July—the July jobs report—the nowcast for the third quarter was pretty bleak (1.1 percent). Things improved a few days later with the release of strong international trade data for June, and stepped up further with the June wholesale trade report. But the remainder of the recent data point to a third-quarter growth rate that is very close to the lackluster performance of the first half.


In his speech a few weeks ago, President Dennis Lockhart indicated what he was looking for as drivers for stronger growth in the second half of this year.

“I expect consumer activity to strengthen.”

Today’s read on real personal consumption expenditures (PCE) probably isn’t bolstering confidence in that view. Real PCE was virtually flat in July, undermining private forecasters’ expectations of a moderate gain. Our nowcast for real GDP slipped 0.5 percentage point to 1.4 percent on the basis of these data and pegged consumer spending growth at 1.7 percent for Q3—in line with Q2’s 1.8 percent gain.

“I expect business investment to accelerate somewhat.”

The July data were pretty disappointing on this score. The durable-goods numbers released a few days ago were quite weak, causing our nowcast, and those of the others we follow, to revise down the third-quarter growth estimate.

“I expect the rebound we have seen in the housing sector to continue.”

Check. Our nowcast wasn’t affected much by the housing starts data, but the existing-home sales numbers gave the estimate a positive boost. Our nowcast’s estimate of residential investment growth in the third quarter is well under what we saw in the second quarter. But at 5.3 percent, the rebound looks to be continuing.

“I expect the recent improvement in exports to last.”

Unfortunately, the July trade numbers don’t get reported until next week. So we’re going to mark this one as missing in action.  But as we said earlier, that June trade number was strong enough to cause our third-quarter nowcast to be revised up a bit.

“And I expect to see an easing of the public-sector spending drag at the federal, state, and local levels.”

Again, check. The July Treasury data indicated growth in government spending overall.

So the July data are a mixed bag: some positives, some disappointments, and some missing-in-actions. But if President Lockhart were to ask us (and something tells us he just might), we’re likely to say that on the basis of the July indicators, the “pickup in real GDP growth over the balance of 2013” isn’t yet very evident in the data.

This news isn’t likely to come as a big surprise to him. Again, here’s what he said publicly two weeks ago:

When I weigh the balance of risks around the medium-term outlook I laid out, I have some concerns about the potential for ambiguous or disappointing data. I also think that it is important to be realistic about the degree to which we are likely to have clarity in the near term about the direction of the economy. Both the quantity of information and the strength of the signal conveyed by the data will likely be limited. As of September, the FOMC will have in hand one more employment report, two reports on inflation, a revision to the second-quarter GDP data, and preliminary incoming signals about growth in the third quarter. I don't expect to have enough data to be sure of my outlook.

It’s still a little early to say with any confidence we won’t eventually see a pickup this quarter, and we can hope that the incoming August numbers show a more marked improvement. All we can say at this point is that after seeing most of the July data, it still feels like we’re stuck on the tarmac.

By Mike Bryan, vice president and senior economist,

Patrick Higgins, senior economist, and

Brent Meyer, economist, all in the Atlanta Fed's research department


August 30, 2013 in Data Releases, Economic Growth and Development, Economics, Forecasts, GDP | Permalink


August 19, 2013

Does Forward Guidance Reach Main Street?

The Federal Open Market Committee (FOMC) has been operating with two tools (well described in a recent speech by our boss here in Atlanta). The first is our large-scale asset purchase program, or QE to everyone outside of the Federal Reserve. The second is our forward guidance on the federal funds rate. Here’s what the fed funds guidance was following the July FOMC meeting:

[T]he Committee decided to keep the target range for the federal funds rate at 0 to 1/4 percent and currently anticipates that this exceptionally low range for the federal funds rate will be appropriate at least as long as the unemployment rate remains above 6-1/2 percent, inflation between one and two years ahead is projected to be no more than a half percentage point above the Committee's 2 percent longer-run goal, and longer-term inflation expectations continue to be well anchored. 

The quarterly projections of the June FOMC meeting participants give more specific guidance on the fed funds rate assuming “appropriate” monetary policy. All but one FOMC participant expects the funds rate to be lifted off the floor in 2015, with the median projection that the fed funds rate will be 1 percent by the end of 2015.



But forward guidance isn’t worth much if the public has a very different view of how long the fed funds rate will be held near zero. The Federal Reserve Bank of New York has a good read on Wall Street’s expectation for the federal funds rate. Its June survey of primary dealers (a set of institutions the Fed trades with when conducting open market operations) saw a 52 percent chance that the fed funds rate will rise from zero in 2015, and the median forecast of the group saw the fed funds rate at 0.75 percent at the end of 2015. In other words, the bond market is broadly in agreement with the fed funds rate projections made by FOMC meeting participants.

But what do we know about Main Street’s perspective on the fed funds rate? Do they even have an opinion on the subject?

Our perspective on Main Street comes from our panel of businesses who participate in the monthly Business Inflation Expectations (BIE) Survey. And we used our special question to the panel this month to see if we could gauge how, indeed whether, businesses have opinions about the future of the federal funds rate. Here’s the specific question we put to the group:

Currently the fed funds rate is near 0%. [In June, the Federal Reserve projected the federal funds rate to be 1% by the end of 2015.] Please assign a percentage likelihood to the following possible ranges for the federal funds rate at the end of 2015 (values should sum to 100%).

In the chart below, we plot the distribution of panelists’ median-probability forecast (the green bars) compared to the distribution of the FOMC’s June projection (we’ve simply smushed the FOMC’s dots into the appropriately categorized blue bars).

Seventy-five percent of our respondents had a median-probability forecast for the fed funds rate somewhere between 0.5 percent and 1.5 percent by the end of 2015. That compares closely with the 73 percent of June FOMC meeting participants whose projections fell in the same range.



You may have noticed in the above question a bracketed bit of information about the Federal Reserve’s forecast for the federal funds rate: “In June, the Federal Reserve projected the federal funds rate to be 1% by the end of 2015.” Actually, this bit of extra information was supplied only to half of our panel (selected at random). A comparison between these two panel subsets is shown in the chart below.


These two subsets are very similar. (If you squint, you might see that the green bars appear a little more diffuse, but this isn’t a statistically significant difference…we checked.) This result suggests that the extra bit of information we provided was largely extraneous. Our business panel seems to have already had enough information on which to make an informed prediction about the federal funds rate.
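The kind of check alluded to above ("we checked") can be sketched as a Pearson chi-square test on the two groups' bucketed responses. The counts below are hypothetical—the survey's respondent-level data aren't shown in the post—and the implementation is a minimal hand-rolled version, not the Atlanta Fed's actual procedure.

```python
# Pearson chi-square test of whether two respondent groups distribute
# their forecasts differently across fed funds rate buckets.

def chi_square_stat(counts_a, counts_b):
    """Pearson chi-square statistic for a 2 x k contingency table."""
    total_a, total_b = sum(counts_a), sum(counts_b)
    grand = total_a + total_b
    stat = 0.0
    for i in range(len(counts_a)):
        col = counts_a[i] + counts_b[i]  # column total for bucket i
        for obs, row_total in ((counts_a[i], total_a), (counts_b[i], total_b)):
            expected = row_total * col / grand
            stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical counts of median-probability forecasts by rate bucket
informed   = [5, 30, 35, 10]   # subset shown the Fed's 1% projection
uninformed = [8, 28, 32, 12]   # subset not shown it
stat = chi_square_stat(informed, uninformed)
# With k - 1 = 3 degrees of freedom, the 5% critical value is about 7.81
print(stat < 7.81)  # these invented counts are too similar to reject equality
```

A statistic below the critical value means we cannot reject the hypothesis that the two subsets draw from the same distribution—consistent with the post's finding that the extra information made no significant difference.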

Finally, the data shown in the two figures above are for those panelists who opted to answer the question we posed. Not every firm chose to make a prediction for the federal funds rate: with this month’s special question, we instructed our panelists to “Please feel free to leave this question blank if you have no opinion.” A significant number of our panelists exercised this option.

The typical nonresponse rate from the BIE survey special question is about 2 percent. This month, it was 22 percent—which suggests that an unusually high share of our panel had no opinion on the future of the fed funds rate. What does this mean? Well, it could mean that a significant share of Main Street businesses are confused by the FOMC’s communications and are therefore unable to form an opinion. But a high nonresponse rate could also mean that some segment of Main Street businesses don’t believe that forward guidance on the fed funds rate affects their businesses much.

Unfortunately, the data we have don’t put us in a very good position to distinguish between confusion and apathy. Besides, we’re optimistic sorts. We’re going to emphasize that 78 percent of those businesses we surveyed responded to the question, and that typical response lined up pretty well with the opinions of FOMC meeting participants and the expectations of Wall Street. So, while not everyone is dialed in to our forward guidance, Main Street seems to get it.

By Mike Bryan, vice president and senior economist,

Brent Meyer, economist, and

Nicholas Parker, senior economic research analyst, all in the Atlanta Fed's research department


August 19, 2013 in Business Inflation Expectations, Economics, Fed Funds Futures, Federal Reserve and Monetary Policy | Permalink


Comments

Forward guidance can prove to be an effective tool for monetary policy, especially, when it is first implemented, as it is unexpected as well. Later on, however, its impact is diminished as it is only the change in expected guidance that might have an impact.

Posted by: Javier | September 21, 2013 at 11:50 AM


July 08, 2013

Let’s Talk about Oil

Given its role in touching nearly every aspect of life across the globe, and given the high and volatile prices of the past half-decade, oil supply has been a constant topic of conversation. Yet the tone of that conversation has pivoted dramatically of late, from arguments about whether peak oil or sky-high oil prices could spur a global economic meltdown (anyone remember 2008?) to the shifting energy balance resulting from rapidly growing oil production in North America.

Chip Cummins and Russell Gold recently published a piece in the Wall Street Journal discussing how new supply from U.S. shale oil and Canadian oil sands is helping to steady global oil prices.

Crude prices have remained remarkably stable over the past year in the face of a long list of supply disruptions, from Nigerian oil theft to Syrian civil war to an export standoff between Sudan and South Sudan. The reason in large part is a thick new blanket of North American oil cushioning the markets.

This chart helps demonstrate how quickly the oil landscape in the United States has indeed changed. The U.S. Energy Information Administration (EIA) expects national crude oil production to exceed net oil imports later this year, marking a rapid turnaround from the trend of ever-increasing reliance on imports.



However, despite the increase in U.S. oil production, global oil prices have stabilized at relatively high levels, as the chart below shows.



Yet the two seemingly opposing narratives—that of high oil prices and that of an emerging oil and gas abundance—are fundamentally linked. In fact, had it not been for such high oil prices, this new surge in North American oil production might not have happened. It is much more difficult to rationalize drilling activity in deep offshore areas, hard shale, or tar sands—from which, by nature, oil is expensive to produce—without high oil prices. (West Texas Intermediate, or WTI, oil averaged $31 per barrel in 2003, which, even in real terms, is only about two-fifths of today’s price.) Analysts at Morgan Stanley estimate that the break-even point for Bakken (North Dakota) crude oil is about $70 per barrel and that even a price of $85 per barrel could squeeze out many unconventional producers.

What does all this mean for prices? Well, keep in mind that oil is a global commodity. So the roughly two million barrels of oil per day that have entered the market from the U.S. fracking boom represent a big shift domestically but only just over 2 percent of global oil consumption.
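The share calculation in the paragraph above is simple enough to verify directly; the two inputs (roughly 2 million barrels per day of new supply against about 88 million barrels per day of global consumption, the WSJ figure quoted later in the post) are approximations.

```python
# Back-of-the-envelope check: new U.S. fracking-boom supply
# as a share of global oil consumption.

new_us_supply = 2.0    # million barrels per day, approximate
global_demand = 88.0   # million barrels per day (WSJ figure)
share = new_us_supply / global_demand * 100
print(f"{share:.1f}% of global consumption")  # about 2.3%
```

Domestically the same 2 million barrels amounts to a far larger swing, which is why the turnaround in the import balance looks so dramatic in the EIA chart.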

And while the United States is seeing growing oil supplies and moderating demand, a different trend is taking place globally, with rising demand from China and other emerging economies coupled with declining supply from older fields and OPEC efforts to keep prices higher through production limits.

However, not everyone believes that higher prices are here to stay. Some analysts have begun to warn that a price crash may be looming. Paul Stevens, an energy specialist with Chatham House, argues that we may be headed for a replay of the price crash of 1986, when high prices triggered demand destruction while bringing new, more expensive sources of supply to the market from the North Sea and Alaska.

Only time will tell where global oil prices will ultimately shake out, but for now, the larger supply cushion has certainly been a welcome development in the United States. Back to the Wall Street Journal article:

The new supply...is acting as a shock absorber in a global supply chain that pumps 88 million barrels of oil to consumers each day. That helps everyone from manufacturers to motorists, by steadying fuel prices and making budgeting easier.

By Laurel Graefe, Atlanta Fed REIN director, and

Rebekah Durham, economic policy analysis specialist at the New Orleans Branch of the Atlanta Fed

Authors’ note: We didn’t touch on the difference between WTI and Brent oil prices in this post, despite the fact that the changing global oil production landscape has undoubtedly contributed to that spread. For those interested, we recommend some recent analysis from the Energy Information Administration on the narrowing spread between WTI and Brent.


July 8, 2013 in Economics, Energy, Pricing | Permalink


Comments

A useful distinction is between the equilibrium price and the spot price which is notoriously volatile, in part, because of geopolitical risks in the Middle East. Increased production sourced in N. America reduces those risks and by adding 'spare capacity' also reduces overall costs by obviating the need for contingency arrangements.

Posted by: van schayk | July 09, 2013 at 12:42 PM

You only briefly mention moderating demand in the US, but the change in demand is about the same as the change in supply (US production). There has been lots of talk about increased production, but very little about the decreased domestic demand. Some of this is due to decreased miles driven, while some is due to higher CAFE standards prompting many new models to have significantly higher mileage than prior models (20% better for Altimas, Mazdas & others). Increased production is important, but reduced demand is equally important, and will be a better long-term solution as we will continue to see improvements as the nationwide fleet improves its mileage.

http://www.eia.gov/countries/country-data.cfm?fips=US#pet

Posted by: JimC | July 10, 2013 at 09:48 AM
