March 28, 2013
The Same-Old, Same-Old Labor Market
In his March 24 Wall Street Journal piece on declining government payrolls, Sudeep Reddy offers up a key observation:
The cuts in the public-sector workforce—at the federal, state and local levels—marked the deepest retrenchment in government employment of civilians since just after World War II... down by about 740,000 jobs since the recession ended in June 2009. At the same time, the private sector has added more than 5.2 million jobs over the course of the recovery.
As the Journal article notes, the story of shrinking government employment combining with private-sector payroll expansion has been remarkably consistent for much of the recovery.
About a year ago, we provided a graphical illustration of postrecession employment patterns using payroll-employment "bubble charts." These charts measure postrecession average monthly employment changes by sector relative to the changes in the prerecession period from December 2001 through October 2007. Not a lot in that chart has changed over the intervening year (just as not a lot had changed in 2012 compared with 2011):
The stability in the employment picture across private industries, both relative to one another and relative to the precrisis pace of job gains, is just as notable as the changing fortunes of private versus public employment. In fact, the charts offer some pretty clear impressions:
- Virtually all private-sector industries have moved into positive employment-growth territory. That movement includes the construction and financial activities sectors, which have generally lagged the improvement in the rest of the economy. The only broad category still shedding jobs in the private sector has been the information industry—which includes publishing, motion picture production, telecommunications, data processing, and the like—an industry that was also shrinking in terms of employment over the decade leading up to the financial crisis.
- All of the private-sector bubbles in the charts are now close to, on, or above the 45-degree line, meaning that the average pace of monthly job creation in each sector is near, equal to, or greater than what prevailed during the last recovery.
- As the Reddy piece emphasizes, government employment has been in decline since early 2010, though the government sector as a whole retains its status as the sector with the largest employment share. (The size of the bubbles in the charts above represents the share of employment in each sector at the end of the period for which the graph is drawn.) But the chart also illustrates another key point of Reddy's article: To date, the decline in government employment has been concentrated in state and, especially, local government jobs. Until recently, job creation by the federal government, which is relatively small in the bigger scheme of things, has not deviated much from its prerecession pattern.
The last point brings us to this observation, from the WSJ article:
How the rest of the private sector responds to a shrinking of the federal government could play a bigger role in determining how the budget fight hits the workforce.
"The private sector in the U.S. is growing so much stronger than anyone had expected," said Bernard Baumohl of the Economic Outlook Group. "This organic growth is going to significantly offset the effect of the sequester in terms of economic output and employment."
It is worth pointing out that the monthly average of 17,000 state, local, and federal government jobs lost since March 2010 has been nearly matched by average monthly increases of better than 14,000 jobs in manufacturing, a sector that persistently shed jobs in the previous recovery. On net, this substitution of private-sector for public-sector employment has left overall payrolls growing by 175,000 to 185,000 jobs per month in both 2011 and 2012. To put one perspective on that figure, at current labor force participation rates (along with some other assumptions and caveats), that pace would be sufficient to reach the Federal Open Market Committee's 6.5 percent unemployment threshold by sometime in spring 2015 (as you can verify yourself with the Atlanta Fed's Jobs Calculator). That calculation raises the stakes somewhat on the matter of how "the rest of the private sector responds."
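The Jobs Calculator's arithmetic can be sketched in a few lines. Every input below (population, participation rate, monthly population growth) is an invented placeholder rather than the calculator's actual data, so the answer is illustrative only:

```python
# A stripped-down sketch of the Jobs Calculator's arithmetic. All inputs
# are hypothetical round numbers, not actual BLS data.

def months_to_target(pop, lfpr, unemp_rate, target_rate,
                     jobs_per_month, pop_growth_pm):
    """Count the months of steady payroll gains needed to pull the
    unemployment rate down to target, holding participation fixed.
    pop and jobs_per_month are in millions. Assumes payroll gains
    outpace labor force growth, so the target is eventually reached."""
    employed = pop * lfpr * (1.0 - unemp_rate)
    months = 0
    while True:
        pop *= 1.0 + pop_growth_pm      # population drifts up each month
        labor_force = pop * lfpr        # participation rate held fixed
        employed += jobs_per_month      # steady net job creation
        months += 1
        if 1.0 - employed / labor_force <= target_rate:
            return months

# Illustrative run: ~180,000 jobs a month starting from 7.7 percent
# unemployment, aiming for a 6.5 percent threshold.
m = months_to_target(pop=245.0, lfpr=0.635, unemp_rate=0.077,
                     target_rate=0.065, jobs_per_month=0.18,
                     pop_growth_pm=0.0006)
print(m)  # 21 months with these invented inputs
```

Note how sensitive the answer is to the population-growth assumption: with a static population the same pace of job creation reaches the threshold roughly twice as fast.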
By Dave Altig, executive vice president and research director of the Atlanta Fed
March 19, 2013
Being Ahead of the Curve: Not Always a Good Thing
Our friends at the New York Fed have a nifty interactive graphic that compares the unemployment rate, labor force participation rate, and employment-to-population ratio over the last five business cycles. You can even break these indicators down by gender, by age, or by a particular business cycle. (For a deeper dive, check out this post at Liberty Street Economics by Jonathan McCarthy and Simon Potter.) And though it’s not exactly late-breaking news, no matter which of the three indicators you look at, you can’t help but conclude that the most recent recession is an outlier.
The Beveridge curve, a fourth and particularly useful graphical device, depicts the relationship between job openings and the unemployment rate: in theory, it shows the vacancy rate one would expect in a steady-state economy, given an unemployment rate. (The Atlanta Fed’s magazine, EconSouth, discussed the Beveridge curve.) It, too, has stood out over the course of the most recent recovery, so much so that we think it warrants at least a second glance. There are a number of ways to estimate a Beveridge curve (see, for example, methods described by Gadi Barlevy of the Chicago Fed here and by the Richmond Fed’s Thomas Lubik here).
We use the method described by Barnichon et al. (2012) to estimate the solid curve in the first chart below. The square plots represent actual vacancy rate (y-axis) and unemployment rate (x-axis) combinations by month from December 2000, when the U.S. Bureau of Labor Statistics (BLS) began its Job Openings and Labor Turnover Survey (JOLTS), to January 2013, the most recent month of data available for both series.
Blue squares represent data from December 2000 to December 2009, when the “errors” between the actual plots and the estimated curve were below 2 percentage points; red squares represent data since January 2010, when the data jump above the predicted Beveridge curve by 2 percentage points or more (see the chart below).
In June 2012, Regis Barnichon and his coauthors concluded that the unemployment rate’s lackluster performance so far in the recovery was attributable to a shortfall in hires per vacancy. Since then, the vacancy rate has climbed its way back to its June 2008 level of 2.7 percent. However, the unemployment rate has clearly not returned to either its June 2008 level (5.6 percent) or where the Beveridge curve says it should be given this vacancy rate, which one might predict to be 5.5 percent using the methodology of Barnichon et al.
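For readers who want to experiment, the curve-fitting idea can be sketched with a much simpler specification than Barnichon et al. actually use. Everything below, the log-linear functional form and the (unemployment, vacancy) data points alike, is invented for illustration:

```python
import math

# Invented (unemployment rate, vacancy rate) pairs tracing a downward-
# sloping curve; real estimation would use the JOLTS data in the charts.
data = [(4.5, 3.2), (5.0, 3.0), (6.0, 2.6),
        (7.5, 2.2), (9.0, 1.9), (10.0, 1.8)]

def fit_loglog(points):
    """OLS fit of log(v) = a + b*log(u); returns (a, b)."""
    xs = [math.log(u) for u, v in points]
    ys = [math.log(v) for u, v in points]
    n = len(points)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

a, b = fit_loglog(data)
curve_v = math.exp(a) * 5.6 ** b   # fitted vacancy rate at 5.6% unemployment
print(b, curve_v)                  # b is negative: the curve slopes down
```

The "ahead of the curve" diagnosis in the text amounts to observing that recent actual points sit well above the vacancy rate such a fitted curve would predict at the prevailing unemployment rate.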
This “ahead of the curve” phenomenon has not gone unnoticed and has prompted some explanations. In a March 6, 2013, article in The New York Times (which also has some cool charts), Catherine Rampell posits that available positions are staying unfilled longer, while interview processes have become lengthier.
The next day, Rampell went into more detail about why we’re going “off the curve” in a New York Times Economix post. She cites skills mismatch and the skills atrophy of the long-term unemployed as factors limiting employers’ ability to fill positions (explanations she notes are incomplete, since we would then expect wages for highly coveted positions to rise significantly).
Rampell goes back to the explanation many of us continue to hear from business contacts: employers are unwilling to fill vacant positions because of economic and fiscal policy uncertainty. She quotes Stephen Davis of Chicago’s Booth School: “They’re taking longer to fill vacancies because they just feel less need to fill jobs now,” Davis said. “They recognize that in a slack labor market, there is an abundance of viable candidates. If something happens, and if they need to hire quickly, they know they can do that. That’s harder in a tight labor market.”
So maybe as labor markets “tighten up,” or perhaps if the speed by which they tighten up quickens, we’ll get back on the Beveridge curve. Only time, and several BLS releases, will tell.
By Patrick Higgins, an economist at the Atlanta Fed, and
Mark Carter, a senior economic analyst at the Atlanta Fed
March 11, 2013
You Say You’re a Homeowner and Not a Renter? Think Again.
As we’ve said before, we’re suckers for cool charts. The latest that caught our eye is the following one, originally created by the U.S. Bureau of Labor Statistics (BLS). It highlights the relative importance assigned to the various components of the consumer price index (CPI) and shows where increases in the index have come from over the past 12 months.
It probably won’t surprise anyone that the drop in gasoline prices (found in the transportation component) exerted downward pressure on the CPI last year, while the cost of medical care pushed the price index higher. What might surprise you is the size of that big, blue square labeled “housing.” Housing accounts for a little more than 40 percent of the CPI market basket and, given its weight, any change in this component significantly affects the overall index.
This raises the question: In light of the recent strength seen in the housing market—and notably the nearly 10 percent rise in home prices over the past 12 months—are housing costs likely to exert more upward pressure on the CPI?
Before we dive into this question, it’s important to understand that home prices do not directly enter into the computation of the CPI (or the personal consumption expenditures [PCE] price index, for that matter). This is because a home is an asset, and an increase in its value does not impose a “cost” on the homeowner. But there is a cost that homeowners face in addition to home maintenance and utilities, and that’s the implicit rent they give up by living in their home rather than renting it out. In effect, every homeowner is his or her own tenant, and the rent homeowners forgo each month is called the “owners’ equivalent rent” (or OER) in the CPI. OER represents about 24 percent of the CPI (and about 11 percent of the PCE price index). The CPI captures this OER cost (sensibly, in our view) by measuring the cost of home rentals (details here). So whether the robust rise in home prices will influence the behavior of the CPI this year depends on whether rising home prices influence home rents.
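The weight arithmetic is worth making explicit. In the toy basket below, the housing weight echoes the roughly 40 percent share cited above; all other weights and every inflation rate are invented for illustration:

```python
# Toy CPI-basket arithmetic. Only the housing weight follows the text;
# everything else is a hypothetical placeholder.
basket = {
    "housing":         (0.41,  2.2),   # (weight, yoy inflation in percent)
    "transportation":  (0.17, -1.5),
    "medical care":    (0.07,  3.5),
    "everything else": (0.35,  1.8),
}

def headline(basket):
    """Headline inflation as the weighted sum of component inflation rates."""
    return sum(w * pi for w, pi in basket.values())

def contribution(basket, name):
    """Percentage points a single component adds to the headline rate."""
    w, pi = basket[name]
    return w * pi

print(round(headline(basket), 3))                  # 1.522
print(round(contribution(basket, "housing"), 3))   # 0.902
```

The point of the exercise: with a weight this large, even a modest acceleration in the housing component moves the headline number by a noticeable fraction of itself.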
So what is likely to happen to OER given the continued increase in home prices? Well, higher home prices, in time, ought to cause home rents to rise, putting upward pressure on the CPI. Homes are assets to landlords, after all, and landlords (like all investors) require an adequate return on their investments. Let’s call this the “asset market influence” of home prices on home rents. But the rents that landlords charge also compete with homeownership. If renters decide to become homeowners, the rental market loses customers, which should push home rents in the opposite direction of home prices for a time. Let’s call this the “substitution influence” on rent prices.
Consider the following charts, which show three-month changes in home prices and home rents (the latter measured by the CPI’s OER). It’s a little hard to see a clear correlation between the two measures.
So we’ve separated these data into their trend and cycle components (using Hodrick-Prescott procedures, if you must know) shown in the following two charts. Now, if one takes the trend view, there is a clear positive relationship between home prices and home rents. This is consistent with the asset market influence described above. But also consider the detrended perspective. Here, home prices and home rents are pretty clearly negatively correlated. This, to us, looks like the substitution influence described above.
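For the curious, the trend/cycle split can be reproduced with a Hodrick-Prescott filter. The pure-Python solver below is only a sketch (in practice one would reach for a canned routine such as statsmodels' hpfilter); it minimizes the usual HP objective by solving the normal equations (I + λD'D)τ = y directly:

```python
def hp_filter(y, lamb=129600):
    """Hodrick-Prescott filter: solve (I + lamb * D'D) tau = y, where D is
    the (n-2) x n second-difference operator. Plain Gaussian elimination,
    so only suitable for short series (n >= 3). lamb=129600 is a common
    choice for monthly data."""
    n = len(y)
    A = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for k in range(n - 2):                 # row k of D: ... 1, -2, 1 ...
        d = {k: 1.0, k + 1: -2.0, k + 2: 1.0}
        for i, di in d.items():
            for j, dj in d.items():
                A[i][j] += lamb * di * dj
    b = [float(v) for v in y]
    for col in range(n):                   # forward elimination, pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            b[r] -= f * b[col]
            for j in range(col, n):
                A[r][j] -= f * A[col][j]
    tau = [0.0] * n
    for r in range(n - 1, -1, -1):         # back substitution
        s = sum(A[r][j] * tau[j] for j in range(r + 1, n))
        tau[r] = (b[r] - s) / A[r][r]
    cycle = [yi - ti for yi, ti in zip(y, tau)]
    return tau, cycle

# Sanity check: a perfectly linear series has zero second differences,
# so the filter returns it unchanged (all trend, no cycle).
series = [1.5 * t for t in range(12)]
trend, cycle = hp_filter(series)
print(max(abs(c) for c in cycle) < 1e-6)  # True
```

On a choppy series the same routine hands back a smooth trend and a cycle that carries the wiggles, which is exactly the decomposition plotted in the charts.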
So let’s get back to the question at hand. What do rising home prices mean for OER and, ultimately, the behavior of the CPI? Well, it’s rather hard to say because the link between home prices and OER isn’t particularly strong.
Not definitive enough for you? OK, how about this: We think the recent rise in home prices will more likely lean against the rise in OER for the near term as the growing demand for home ownership provides some competition to the rental market. But, in time, these influences will give way to the asset market fundamentals, and rents are likely to accelerate as returns on real estate investments are reaffirmed.
By Mike Bryan, vice president and senior economist, and
Nick Parker, economic research analyst, both in the Atlanta Fed’s research department
March 08, 2013
Will the Next Exit from Monetary Stimulus Really Be Different from the Last?
Suppose you run a manufacturing business—let's say, for example, widgets. Your customers are loyal and steady, but you are never completely certain when they are going to show up asking you to satisfy their widget desires.
Given this uncertainty, you consider two different strategies to meet the needs of your customers. One option is to produce a large quantity of widgets at once, store the product in your warehouse, and when a customer calls, pull the widgets out of inventory as required.
A second option is to simply wait until buyers arrive at your door and produce widgets on demand, which you can do instantaneously and in as large a quantity as you like.
Thinking only about whether you can meet customer demand when it presents itself, these two options are basically identical. In the first case you have a large inventory to support your sales. In the second case you have a large—in fact, infinitely large—"shadow" inventory that you can bring into existence in lockstep with demand.
I invite you to think about this example as you contemplate this familiar graph of the Federal Reserve's balance sheet:
I gather that a good measure of concern about the size of the Fed's (still growing) balance sheet comes from the notion that there is more inherent inflation risk with bank reserves that exceed $1.5 trillion than there would be with reserves somewhere in the neighborhood of $10 billion (which would be the ballpark value for the pre-crisis level of reserves).
I understand this concern, but I don't believe that it is entirely warranted. My argument is as follows: The policy strategy for tightening policy (or exiting stimulus) when the banking system is flush with reserves is equivalent to the strategy when the banking system has low (or even zero) reserves in the same way that the two strategies for meeting customer demand that I offered at the outset of this post are equivalent.
Here's why. Suppose, just for example, that bank reserves are literally zero and the Federal Open Market Committee (FOMC) has set a federal funds rate target of, say, 3 percent. Despite the fact that bank reserves are zero there is a real sense in which the potential size of the balance sheet—the shadow balance sheet, if you will—is very large.
The reason is that when the FOMC sets a target for the federal funds rate, it is sending very specific instructions to the folks from the Open Market Desk at the New York Fed, who run monetary policy operations on behalf of the FOMC. Those instructions are really pretty simple: If you have to inject more bank reserves (and hence expand the size of the Fed's balance sheet) to maintain the FOMC's funds rate target, do it.
To make sense of that statement, it is helpful to remember that the federal funds rate is an overnight interest rate that is determined by the supply and demand for bank reserves. Simplifying just a bit, the demand for reserves comes from the banking system, and the supply comes from the Fed. As in any supply and demand story, if demand goes up, so does the "price"—in this case, the federal funds rate.
In our hypothetical example, the Open Market Desk has been instructed not to let the federal funds rate deviate from 3 percent—at least not for very long. With such instructions, there is really only one thing to do in the case that demand from the banking system increases—create more reserves.
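Those marching orders can be caricatured in a few lines of code. The linear demand curve below is purely hypothetical; the point is only that, with a rate target, the quantity of reserves becomes whatever demand requires:

```python
# A toy version of the Desk's marching orders. The demand curve is an
# invented linear schedule, not a model of actual reserve demand.

def rate(reserves, demand_shift):
    """Hypothetical inverse demand for reserves: more reserves, lower rate."""
    return max(0.0, demand_shift - 0.5 * reserves)

def desk_supply(target_rate, demand_shift):
    """Reserves the Desk must supply so the funds rate sits at target."""
    return (demand_shift - target_rate) / 0.5

# As demand for reserves shifts up, the Desk supplies more reserves (the
# balance sheet expands), and the funds rate never moves off 3 percent.
for shift in (4.0, 5.0, 6.0):
    reserves = desk_supply(3.0, shift)
    print(shift, reserves, rate(reserves, shift))  # rate stays at 3.0
```

In this caricature the balance sheet is whatever size the target requires, which is the "shadow inventory" idea from the widget example.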
To put it in the terms of the business example I started out with, in setting a funds rate target the FOMC is giving the Open Market Desk the following marching orders: If customers show up, step up the production and meet the demand. The Fed's balance sheet in this case will automatically expand to meet bank reserve demand, just as the businessperson's inventory would expand to support the demand for widgets. As with the businessperson in my example, there is little difference between holding a large tangible inventory and standing ready to supply on demand from a shadow inventory.
Though the analogy is not perfect—in the case of the Fed's balance sheet, for example, it is the banks and not the business (that is, the Fed) that hold the inventory—I think the story provides an intuitive way to process the following comments (courtesy of Bloomberg) from Fed Chairman Ben Bernanke's congressional testimony last week:
"Raising the interest rate on reserves" when the balance sheet is large is the functional equivalent of raising the federal funds rate when the actual balance sheet is not so large, but the potential or shadow balance sheet is. In both cases, the strategy is to induce banks to resist deploying available reserves to expand deposit liabilities and credit. The only difference is that, in the former case, the available reserves are explicit, and in the latter case they are implicit.
The Monetary Policy Report that accompanied the Chairman's testimony contained a fairly thorough summary of costs that might be associated with continued monetary stimulus. Some of these in fact pertain to the size of the Fed's balance sheet. But, as the Chairman notes in the video clip above, when it comes to the mechanics of exiting from policy stimulus, the real challenge is the familiar one of knowing when it is time to alter course.
By Dave Altig, executive vice president and research director of the Atlanta Fed
March 01, 2013
What the Dual Mandate Looks Like
Sometimes simple, direct points are the most powerful. For me, the simplest and most direct points in Chairman Bernanke’s Senate testimony this week were contained in the following one minute and 49 seconds of video (courtesy of Bloomberg):
At about the 1:26 mark, the Chairman says:
So, our accommodative monetary policy has not really traded off one of [the FOMC’s mandated goals] against the other, and it has supported both real growth and employment and kept inflation close to our target.
To that point, here is a straightforward picture:
I concede that past results are no guarantee of future performance. And in his testimony, the Chairman was very clear that prudence dictates vigilance with respect to potential unintended consequences:
Highly accommodative monetary policy also has several potential costs and risks, which the committee is monitoring closely. For example, if further expansion of the Federal Reserve's balance sheet were to undermine public confidence in our ability to exit smoothly from our accommodative policies at the appropriate time, inflation expectations could rise, putting the FOMC's price stability objective at risk...
Another potential cost that the committee takes very seriously is the possibility that very low interest rates, if maintained for a considerable time, could impair financial stability. For example, portfolio managers dissatisfied with low returns may reach for yield by taking on more credit risk, duration risk, or leverage.
Concerns about such developments are fair and, as Mr. Bernanke makes clear, shared by the FOMC. Furthermore, the language around the Fed’s ultimate decision to end or alter the pace of its current open-ended asset-purchase program is explicitly cast in terms of an ongoing cost-benefit analysis. But anyone who wants to convince me that monetary policy actions have been contrary to our dual mandate is going to have to explain to me why that conclusion isn’t contradicted by the chart above.
By Dave Altig, executive vice president and research director of the Atlanta Fed
February 22, 2013
Nature Abhors an Output Gap
In The Washington Post, Neil Irwin highlights a shortcoming that I know all too well:
Throughout the halting economic recovery that began in 2009, the formal economic projections released by the Congressional Budget Office, White House Council of Economic Advisers, and Federal Reserve have displayed quite a consistent pattern: This year may be one of sluggish growth, they acknowledge. But stronger growth, of perhaps 3.5 percent, is just around the corner, and will arrive next year.
Consider, for example, the Fed's projections in November of 2009. Sure, growth would be slow in 2010, they held. But 2011 growth, they expected, would be 3.4 to 4.5 percent, and 2012 would see 3.5 to 4.8 percent growth. The actual levels of growth were 2 percent in 2011 and 1.5 percent in 2012.
What's amazing is that the Fed's newest projections, released in December of 2012, look like they could have been copy and pasted from 2009, just with the years changed: They forecast sluggish growth in 2013, 2.3 to 3 percent, followed by a pickup to 3 to 3.5 percent in 2014 and 3 to 3.7 percent in 2015.
I, for one, am guilty as charged, and feel pretty fortunate that the offense is not a hanging one. In fact, I don't think Irwin's indictment is overly harsh, and he is on the right track when he offers up this explanation for the last several years' persistently overly rosy projections:
Economic forecasters tend to look at past experience and extrapolate; in the past, when there has been a recession, the very forces that caused the recession become unwound, sowing the seeds for expansion...
Here is a basic fact about macroeconomic forecasting. The truly powerful driver of forecasts is mean reversion, which is the tendency of models to predict that gross domestic product (GDP) will move toward an average trend over time. This fact holds true whether we are talking about formal statistical analysis or the intuitive judgmental adjustments that all forecasters apply to their formal statistical models.
Forecasters are not completely robotic, of course. Irwin is correct when he says "forecasters tend to look at past experience and extrapolate," but forecasters do leaven past experience with incoming details that alter judgments about what is the mean—the "normal state," if you will—to which the economy will converge. But whatever that normal state is, our models insist that we will converge to it.
Nothing illustrates this property of forecasting reality better than this chart, which supplements the latest economic projections from the Congressional Budget Office:
The potential GDP line in that chart is the level of production that represents the structural path of the economy. Forecasters, no matter where they think that potential GDP line might be, all believe actual GDP will eventually move back to it. "Output gaps"—the shaded area representing the cumulative miss of actual GDP relative to its potential—simply won't last forever. And if that means GDP growth has to accelerate in the future (as it does when GDP today is below its potential)—well, that's just the way it is.
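The mean-reversion mechanics can be made concrete with a stylized calculation. All of the numbers below (trend growth, the speed at which the gap closes) are invented for illustration, not estimates:

```python
# Purely illustrative: 2.3 percent trend growth and a gap that closes
# 30 percent per year. Nothing here is an estimate of actual potential GDP.

def forecast_growth(gap, trend_growth=0.023, persistence=0.7, years=4):
    """Project GDP growth while an AR(1)-style output gap closes.
    gap is the log deviation of GDP from potential (negative = below)."""
    path = []
    for _ in range(years):
        new_gap = persistence * gap                  # gap shrinks each year
        path.append(trend_growth + (new_gap - gap))  # trend plus catch-up
        gap = new_gap
    return path

# Start 5 percent below potential: every forecast year shows above-trend
# growth, fading back toward 2.3 percent as the gap closes.
for g in forecast_growth(gap=-0.05):
    print(g)
```

The sketch makes Irwin's pattern mechanical: any negative starting gap forces the projected growth path above trend, and the bigger the assumed gap, the rosier the near-term forecast.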
Unfortunately, potential GDP is not so simple to divine. We have to guess (or, more generously, estimate) what it is. That guessing game has been harder than usual over the past several years. Here is the record of the CBO's potential GDP since 2009:
I think this picture is a fairly representative record of how views about the potential level of U.S. GDP have evolved over the past several years. What has not been resolved is the debate over what conclusions should be drawn from persistent overestimates of potential and serial misses to the high side on GDP projections.
Irwin seems to be of two minds. On the one hand he offers very structural-sounding reasons for poor forecasting experience:
... the financial-crisis-induced recession of 2008–2009 was so deep that it had deep-seated effects that go beyond those explained by those traditional relationships. It messed up the workings of the financial system, and banks are still trying to figure out what the new one looks like.
On the other hand, he makes appeals to very traditional explanations tied to deficient spending and insufficient policy stimulus (though even here structural change may be one reason that stimulus has been insufficient):
Breakdowns in the financial system mean that low-interest-rate policies from the Fed don't have their usual punch. An overhang of household debt means that consumers hold their wallets more than usual. Federal fiscal stimulus to offset those effects is now long over...
This much, in any event, is clear: Given any starting point where the level of GDP is below its potential level—that is, given an output gap—forecasts will include a bounce back in GDP growth above its long-run average, at least for a while. That's just the way it works.
If, contrary to conventional wisdom, you believe that true output gaps are much smaller than the CBO picture above suggests, you might want to take the under on GDP forecasts, betting that they will prove too optimistic once again.
By Dave Altig, executive vice president and research director of the Atlanta Fed
February 15, 2013
Promoting Job Creation: Don't Forget the Old Guys
In a provocative article posted this week, the American Enterprise Institute's James Pethokoukis concludes that the state of entrepreneurship in the United States is, disturbingly, weaker than ever. In particular, Pethokoukis documents a decline in jobs created by establishments less than one year old, a trend that began before the 2001 recession and has continued more or less unabated since. He specifically cites the following symptoms of trouble:
- Had small business come out of the recession maintaining just the rate of start-ups generated in 2007, according to McKinsey, the U.S. economy would today have almost 2.5 million more jobs than it does.
- There were fewer new firms formed in 2010 and 2011 than during the Great Recession.
- The rate of start-up job creation during 2010 and 2011—years that were technically in full recovery—was the lowest on record, according to economist Tim Kane of the Hudson Institute.
That last point appears to be all the more ominous given this observation from Tim Kane:
"...that startups create essentially all net new jobs. Existing employers, it turns out, tend to be net job losers, averaging net losses of 1 million workers per year."
Pethokoukis makes his case with political commentary that we don't endorse and don't find particularly helpful. But we won't argue with his conclusion that more entrepreneurial start-up activity would be a good thing. Nonetheless, we get a little concerned when the conversation jumps from data on net job creation and the role of start-ups and early life-cycle firms to policy conclusions that focus disproportionately on that class of businesses.
Here is the source of our concern: Though it is tempting to lump all "existing employers" into the basket of net job destroyers, there are existing firms that create jobs, and a few are doing so on a very large scale.
Take 2006, for instance. Based on data from the U.S. Census Bureau's Business Dynamics Statistics (BDS), new firms (businesses with a payroll that existed in March 2006 but not in March 2005) had about 3.5 million employees. This is the large net job creation by new firms reported by Kane. However, over the same year, expanding firms more than 10 years old added a whopping 11 million jobs—about three times as many as were created by new firms. Of course, some older firms were downsizing or closing—contracting mature firms destroyed an estimated 10 million jobs. So the net number of jobs created by older established firms looks somewhat less impressive than the record of those young start-ups. But in the overall picture, were the 11 million jobs created by the expanding older businesses really less important than the 3.5 million created by the newbies?
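To keep the gross and net magnitudes straight, here is that arithmetic spelled out (figures in millions, as cited above):

```python
# The 2006 job-creation arithmetic from the paragraph above, in millions.
new_firm_jobs = 3.5        # employment at firms less than one year old
mature_created = 11.0      # jobs added by expanding firms 10+ years old
mature_destroyed = 10.0    # jobs lost at contracting/closing mature firms

net_mature = mature_created - mature_destroyed
gross_ratio = mature_created / new_firm_jobs

print(net_mature)    # 1.0: mature firms look unimpressive on net...
print(gross_ratio)   # ...yet created roughly 3x the gross jobs of start-ups
```

The gross/net distinction is the whole argument: judged by net creation alone, mature firms barely register, even though they account for the bulk of gross hiring.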
It turns out that older firms also account for a large fraction of the job creation occurring in fast-growing firms, arguably a better characterization of entrepreneurship than newness. We found some compelling evidence reported in recent research by Akbar Sadeghi, James Spletzer, and David Talan. Using data from the U.S. Bureau of Labor Statistics' Business Employment Dynamics (BED), Sadeghi, Spletzer, and Talan find that older firms (those at least 10 years old) accounted for more than 40 percent of the employment created by high-growth firms (those with at least 20 percent annual employment gains between 2008 and 2011). A similar conclusion about the role of older, fast-growing firms is found in this earlier Kauffman Foundation report based on BDS data looking at the 1 percent of fastest-growing firms in the United States.
The point is not that start-up entrepreneurial activity is unimportant. It is vitally important. But in larger terms, we should recognize that all entrepreneurial activity is important, no matter what the age of the firm in which it occurs. Atlanta Fed President Dennis Lockhart highlighted this point in a speech delivered earlier this week at Instituto de Empresas in Madrid, Spain:
My bank's experience in trying to understand the role of small businesses, small-growth businesses, young businesses, and mature-growth businesses in job creation illustrates a key point, I think. In the pursuit of economic growth and increased employment, there is no silver bullet. Rather, the policy community should be pursuing an effective mix of policy elements (with focus in areas such as new business formation, labor rules, and regulatory efficiency, to name a few) that together catalyze a virtuous circle of innovation, growth, and employment.
Certainly, entrepreneurial risk-taking, whether by large, mature businesses or start-ups aimed at becoming growth companies, is part of the solution.
When it comes to promoting job creation, forgetting to throw mature businesses into the mix with start-ups is surely not the path to finding the best policy solutions.
February 05, 2013
2013 Business Hiring Plans: Employment, Effort, Hours, and Fiscal Uncertainty
How much is fiscal uncertainty holding back hiring? The answer seems to depend on whom you ask. Early in January, the Atlanta Fed spoke to 670 businesses in the Southeast about employment. Conditional on the respondents’ 2013 hiring plans (expand, hold steady, or contract), the following set of charts summarizes the results for how the businesses viewed activity relative to their own interpretation of “normal” along three dimensions: their current employment level, the amount of effort required from their staff per hour, and the average hours worked per employee. These questions were modeled on questions asked in the Atlanta Fed’s December 2012 Business Inflation Expectations Survey. In the following three charts, the green bars represent firms that said they planned to expand employment in 2013. The grey bars represent firms that said they did not plan to change their employment level in 2013, and the red bars represent firms that planned to reduce employment in 2013.
The first chart shows the results for current employment. Regardless of hiring plans over the next 12 months, most firms said they were currently at or below normal employment levels. Those planning on increasing employment over the next 12 months were a bit more likely to say they have already surpassed normal levels of employment than other firms, while those looking to shed employees were very likely to say their employment level is below normal employment levels.
Chart 2 shows that businesses are generally pushing hard along the effort dimension. Firms were quite likely to say that their staff’s effort per hour worked was currently at or above normal, whether or not they were planning to change employment in 2013.
Chart 3 shows that firms planning to expand were very likely to say that average hours worked were at or above normal (28 percent said hours were above normal, 60 percent about normal), whereas firms planning to contract were more likely to say that hours were at or below normal (48 percent about normal, 39 percent below normal).
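The chart 3 breakdowns are just conditional shares: the distribution of hours-versus-normal responses computed separately within each hiring-plan group. A minimal sketch of that tabulation, using invented toy responses (the firm data below are not the survey's):

```python
from collections import Counter

# Invented toy sample: (2013 hiring plan, current average hours vs. normal)
responses = [
    ("expand", "above"), ("expand", "about"), ("expand", "about"),
    ("steady", "about"), ("steady", "below"),
    ("contract", "below"), ("contract", "about"), ("contract", "below"),
]

def shares(plan):
    """Distribution of hours-vs-normal among firms with a given hiring plan."""
    counts = Counter(hours for p, hours in responses if p == plan)
    total = sum(counts.values())
    return {hours: n / total for hours, n in counts.items()}

# Each group's shares sum to 1; the survey reports these as percentages
print(shares("expand"))
print(shares("contract"))
```

With the real survey data, the same computation yields the quoted figures (for example, 28 percent "above" and 60 percent "about normal" among expanders).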
Taken together, these results suggest that some firms are approaching the limit of how far they can go along the intensive margins of effort and hours before they have to hire more workers. With effort elevated, as more firms increase average hours worked to above-normal levels, one might expect more hiring to follow.
Each business was also asked how uncertainty about future fiscal policy was affecting its hiring plans. Firms planning to reduce employment tended to cite fiscal uncertainty as having a negative impact on their hiring plans. However, for those firms, hours also tended to be well below normal, so it is unlikely that removing fiscal uncertainty would move many of those firms into expansion mode (although it may help stabilize their outlook).
In contrast, fiscal uncertainty was generally viewed as having less impact by those planning to expand employment and those planning to hold employment levels steady. Presumably, reducing fiscal uncertainty would move some of the firms planning to hold steady into expansion mode, and those planning to expand would do so a bit more. To get some idea of this potential, Chart 4 shows the responses by firms who reported above-normal effort per hour and above-normal average hours worked. About 40 percent of those businesses said that fiscal uncertainty had caused them to scale back their hiring plans.
It is unclear whether eliminating fiscal uncertainty would have a big impact on the hiring plans of these firms. But these results suggest that it sure couldn’t hurt.
By John Robertson, vice president and senior economist, and
Ellyn Terry, a senior economic analyst in the Atlanta Fed's research department
February 01, 2013
Just in case you were inclined to drop the "dismal" from the "dismal science," Northwestern University professor Robert Gordon has been doing his best to talk you out of it. His latest dose of glumness was offered up in a recent Wall Street Journal article that repeats an argument he has been making for a while now:
The growth of the past century wasn't built on manna from heaven. It resulted in large part from a remarkable set of inventions between 1875 and 1900...
This narrow time frame saw the introduction of running water and indoor plumbing, the greatest event in the history of female liberation, as women were freed from carrying literally tons of water each year. The telephone, phonograph, motion picture and radio also sprang into existence. The period after World War II saw another great spurt of invention, with the development of television, air conditioning, the jet plane and the interstate highway system…
Innovation continues apace today, and many of those developing and funding new technologies recoil with disbelief at my suggestion that we have left behind the era of truly important changes in our standard of living…
Gordon goes on to explain why he thinks potential growth-enhancing developments such as advances in healthcare, leaps in energy-production technologies, and 3-D printing are just not up to late-19th-century snuff in their capacity to better the lot of the average citizen. To paraphrase, your great-granddaddy's inventions beat the stuffing out of yours.
There has been a lot of commentary about Professor Gordon's body of work—just a few examples from the blogosphere include Paul Krugman, John Cochrane, Free Exchange (at The Economist), Gary Becker, and Thomas Edsall (who includes commentary from a collection of first-rate economists). Most of these posts note the current-day maladies that Gordon offers up to furrow the brow of the growth optimists. Among these are the following:
And inequality in America will continue to grow, driven by poor educational outcomes at the bottom and the rewards of globalization at the top, as American CEOs reap the benefits of multinational sales to emerging markets. From 1993 to 2008, income growth among the bottom 99% of earners was 0.5 points slower than the economy's overall growth rate.
Serious considerations, to be sure, but there is actually a chance that some of the "headwinds" that Gordon emphasizes are signs that something really big is afoot. In fact, Gordon's headwinds remind me of this passage, from a paper by economists Jeremy Greenwood and Mehmet Yorukoglu published about 15 years ago:
A simple story is told here that connects the rate of technological progress to the level of income inequality and productivity growth. The idea is this. Imagine that a leap in the state of technology occurs and that this jump is incarnated in new machines, such as information technologies. Suppose that the adoption of new technologies involves a significant cost in terms of learning and that skilled labor has an advantage at learning. Then the advance in technology will be associated with an increase in the demand for skill needed to implement it. Hence the skill premium will rise and income inequality will widen. In the early phases the new technologies may not be operated efficiently due to a dearth of experience. Productivity growth may appear to stall as the economy undertakes the (unmeasured) investment in knowledge needed to get the new technologies running closer to their full potential. The coincidence of rapid technological change, widening inequality, and a slowdown in productivity growth is not without precedence in economic history.
Greenwood and Yorukoglu go on to assess, in detail, how durable-goods prices, inequality, and productivity actually behaved in the first and second industrial revolutions. They conclude that game-changing technologies have, in history, been initially associated with falling capital prices, rising inequality, and falling productivity. Here is a representative chart, depicting the period (which was rich with technological advance) leading up to Gordon's (undeniably) golden age:
Source: Jeremy Greenwood and Mehmet Yorukoglu, "1974," Carnegie-Rochester Conference Series on Public Policy 46 (1997)
Greenwood and Yorukoglu conclude their study with this pointed question:
Plunging prices for new technologies, a surge in wage inequality, and a slump in the advance of labor productivity - could all this be the hallmark of the dawn of an industrial revolution? Just as the steam engine shook 18th-century England, and electricity rattled 19th-century America, are information technologies now rocking the 20th-century economy?
I don't know (and nobody knows) if the dark-before-the-dawn possibility described by Greenwood and Yorukoglu is the apt analogy for where the U.S. (and global) economy sits today. (Update: Clark Nardinelli also discussed this notion.) But I will bet you there was some commentator writing in 1870 who sounded an awful lot like Professor Gordon.
By Dave Altig, executive vice president and research director of the Atlanta Fed
January 18, 2013
Still a Skeptic: Addressing a Few Questions about Nominal GDP Targeting
In a comment to last week's post on inflation versus price-level targeting, David Beckworth asks the following (referring back to an even earlier post on nominal gross domestic product [NGDP] targeting):
You refer back to your previous post on NGDP level targeting, but fail to take note of the comments that respond to your concerns about it. Specifically, see the ones by Andy Harless and Gregor Bush. Would love to see your response to those ones. Do you have a response for them? I am listening if you have one.
Here is an excerpt from the Harless comment...
Most people who advocate NGDP targeting today advocate level path targeting, not growth rate targeting. I don't believe that your "historical justification" applies in this case. Indeed, I think it makes the case for level targeting (of either the price level or NGDP, but there are reasons to prefer the latter) relative to the current system which centers on a growth rate target for the price level (in other words, an inflation target).
...and here is the Bush comment:
Just to add to Andy's point, advocates of NGDP level targeting argue that it's precisely because of uncertainty around estimates [of] potential output [that] NGDP targeting should be adopted. They argue that [as] long as the central bank keeps nominal spending on, say, a 5% trend line, there will be neither demand side recessions (mass unemployment) nor high inflation. In other words, AD will be stable and this will produce a stable macroeconomic environment. Whether inflation is 2% and real output [grows] at 3% or inflation is 3% and real output grows at 2% is of no concern.
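The distinction Harless draws between level-path targeting and growth-rate targeting, and Bush's point about the inflation/real-growth split, both come down to simple arithmetic. A minimal sketch with illustrative numbers (nothing below is from the comments; the 5 percent trend is Bush's example):

```python
# Arithmetic of an NGDP *level* target on a 5% trend line.
base = 100.0   # nominal GDP index in year 0
trend = 1.05   # 5% target growth for the level path

# The level path the central bank commits to, year by year
target_path = [base * trend**t for t in range(4)]

# Suppose actual NGDP falls 3% short of the path in year 1
actual_y1 = target_path[1] * 0.97

# Growth-rate targeting treats the shortfall as bygones: resume 5% growth
rate_target_y2 = actual_y1 * trend

# Level targeting instead requires returning to the original path,
# implying temporary catch-up growth above 5%
level_target_y2 = target_path[2]
catch_up = level_target_y2 / actual_y1 - 1

print(f"path in year 2:           {target_path[2]:.2f}")
print(f"rate target for year 2:   {rate_target_y2:.2f}")
print(f"catch-up growth required: {catch_up:.1%}")

# Bush's point: a 5% nominal path pins down inflation plus real growth
# only in sum; the split between the two is left open.
for inflation in (0.02, 0.03):
    real_growth = (trend - 1) - inflation
    print(f"inflation {inflation:.0%} -> real growth {real_growth:.0%}")
```

The catch-up requirement is what level-targeting advocates count on to stabilize expectations after a shock; the open inflation/real-growth split is the part of the scheme taken up in the discussion below.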
In the post on NGDP targeting I was in fact thinking about level targeting, and Gregor Bush's last sentence gets to—in fact is—the heart of our disagreement. I am just not willing to concede that anchoring long-term inflation by saying something like "2 percent, 3 percent, whatever" is the path to sustaining central bank credibility. Over the longer term, inflation is the only thing that monetary policy can reliably deliver, as the Federal Open Market Committee (FOMC) has clearly articulated in its statement of longer-run goals and policy strategy:
The inflation rate over the longer run is primarily determined by monetary policy, and hence the Committee has the ability to specify a longer-run goal for inflation. The Committee judges that inflation at the rate of 2 percent, as measured by the annual change in the price index for personal consumption expenditures, is most consistent over the longer run with the Federal Reserve's statutory mandate...
The maximum level of employment is largely determined by nonmonetary factors that affect the structure and dynamics of the labor market. These factors may change over time and may not be directly measurable. Consequently, it would not be appropriate to specify a fixed goal for employment; rather, the Committee's policy decisions must be informed by assessments of the maximum level of employment, recognizing that such assessments are necessarily uncertain and subject to revision.
This excerpt does not imply, of course, that the Fed need slavishly pursue a numerical inflation target in the shorter run and, as I have pointed out before, in his last press conference Chairman Bernanke explicitly indicated that the FOMC does not intend to do so:
The Committee... intends to look through purely transitory fluctuations in inflation, such as those induced by short-term variations in the prices of internationally traded commodities, and to focus instead on the underlying inflation trend.
My price-level targeting post, co-authored with Mike Bryan, made exactly the point that, over the past couple of decades, the FOMC has essentially delivered on a 2 percent longer-term price-level growth objective while accepting plenty of shorter-term variability.
In the end, it is an open question whether credibility in delivering price stability, hard won in the '80s and early '90s, could be sustained if the FOMC says it does not care so much about the exact level of the average rate of inflation, even in the long run. To be truthful, I can't give you an answer to that question. But neither can the proponents of NGDP targeting. I just don't feel that this is an opportune time for an experiment.
Update: Scott Sumner responds.
By Dave Altig, executive vice president and research director of the Atlanta Fed