In October 1979, Federal Reserve Chairman Paul Volcker persuaded his FOMC colleagues to adopt a new policy framework that i) accepted responsibility for controlling inflation and ii) implemented new operating procedures to control the growth of monetary aggregates in an effort to restore price stability. These moves were strongly supported by monetarist-oriented economists, including the leadership and staff of the Federal Reserve Bank of St. Louis. The next three years saw inflation peak and then fall sharply, but also two recessions and considerable volatility in interest rates and money supply growth rates. This article reviews the episode through the lens of speeches and FOMC meeting statements of Volcker and St. Louis Fed president Lawrence Roos, and articles by Roos’ staff. The FOMC adopted monetarist principles to establish the Fed’s anti-inflation credibility, but Volcker was willing to accept deviations of money growth from the FOMC’s targets, unlike Roos, who viewed the targets as sacrosanct. The FOMC abandoned monetary aggregates in October 1982 but preserved the Fed’s commitment to price stability. The episode illustrates how Volcker used a change in operating procedures to alter policy fundamentally and later adapted the procedures to changed circumstances without abandoning the foundational features of the policy.
We use an incomplete markets economy to quantify the distribution of welfare gains and losses of the US “Volcker” disinflation. In the long run households prefer low inflation, but disinflation requires a transition period and a redistribution from net nominal borrowers to net nominal savers. Welfare costs may be significant for households with nominal liabilities. When calibrated to match the micro and macro moments of the early 1980s high-inflation environment and the actual changes in the nominal interest rate and inflation during the Volcker disinflation, nearly 60 percent of all households would prefer to avoid the disinflation. This share depends negatively on the liquidity value of money, positively on the average duration of nominal borrowing, and positively on the short-run increase in the real interest rate.
This paper seeks to explain several key components of the growing regional disparities in the U.S. since 1980: big cities saw a larger increase in the relative wages and relative supply of skilled workers and a smaller decline in business dynamism. These trends can be explained by differences across cities in the extent to which firms adopt new skill-biased technologies. With the introduction of a new skill-biased technology with high fixed costs but low marginal costs, firms endogenously adopt it more in big cities, in cities that offer abundant amenities for high-skilled workers, and in cities that are more productive in using high-skilled labor. Differences in technology adoption account for the regional divergence in the relative wages and supply of skilled workers and in business dynamism. I document that firms in big cities invest more intensively in Information and Communication Technology, consistent with patterns of technology adoption in the model.
What is the theoretical justification for taxing unspent money transfers in a recession? To examine this question, I study a model economy where fiat money is necessary as a medium of exchange and, incidentally, serves as a store of value. This latter property is shown to open the door to business cycles and depressions driven entirely by speculation. Unconditional money transfers do not guarantee escape from a psychologically-induced depression. I demonstrate how money transfers subject to a short expiration date do eliminate speculative equilibria. This hot money policy compares favorably to negative interest rate policy because the latter taxes all money savings whereas the former only threatens to tax gifted money.
We investigate the essentiality of credit and banking in a microfounded monetary model in which agents face heterogeneous idiosyncratic time preference shocks. Three main results arise from our analysis. First, the constrained-efficient allocation is unattainable without banks. Second, financial intermediation can improve the equilibrium allocation even at the Friedman rule because it relaxes the liquidity constraints of impatient borrowers. Third, changes in credit conditions are not necessarily neutral in a monetary equilibrium at the Friedman rule. If the debt limit is sufficiently low, money and credit are perfect substitutes and tightening the debt limit is neutral. As the debt limit increases, however, patient agents always hold money, but impatient agents prefer not to, since holding money is costly for them given the positive shadow rate they face. Borrowing, by contrast, is costless when interest rates are zero, and increasing the debt limit improves the allocation.
A model with two essential elements, sovereign default and distortionary fiscal and monetary policies, explains the interaction between sovereign debt, default risk and inflation in emerging countries. We derive conditions under which monetary policy is actively used to support fiscal policy and characterize the intertemporal tradeoffs that determine the choice of debt. We show that in response to adverse shocks to the terms of trade or productivity, governments reduce debt and deficits, and increase inflation and currency depreciation rates, matching the patterns observed in the data for emerging economies.
This article describes the origins and development of the federal funds market from its inception in the 1920s to the early 1950s. We present a newly digitized daily data series on the federal funds rate from April 1928 through June 1954. We compare the behavior of the funds rate with other money market interest rates and the Federal Reserve discount rate. Our federal funds rate series will enhance the ability of researchers to study an eventful period in U.S. financial history and to better understand how monetary policy was transmitted to banking and financial markets. For the 1920s and 1930s, our series is the best available measure of the overnight risk-free interest rate, better than the call money rate which many studies have used for that purpose. For the 1940s-1950s, our series provides new information about the transition away from wartime interest-rate pegs culminating in the 1951 Treasury-Federal Reserve Accord.
This paper takes a unique approach to the scenario where a resident terrorist group in a (fragile) developing nation poses a terrorism threat at home and abroad. The host developing nation’s proactive countermeasures against the resident terrorist group not only limit terrorism at home and abroad, but also bolster regime stability at home. A two-stage game is presented in which the developed country takes a leadership role to institute a tax-subsidy combination to discourage (encourage) proactive measures at home (abroad) in stage 1. Stage 2 involves both nations’ counterterrorism choices under alternative stage-1 public-policy packages. Unlike the extant literature, we explore corner and interior solutions in both stages based on the terrorists’ targeting preferences and the host nation’s regime-stability preferences. Surprisingly, the developed nation may profit from policy packages that reduce global counterterrorism while raising global terrorism. This outcome and others involve engineered counterterrorism burden shifting.
This paper decomposes the causal effect of government defense spending into: (i) a local (or direct) effect, and (ii) a spillover (or indirect) effect. Using state-level defense spending data, we show that a negative cross-state spillover effect explains the existing simultaneous findings of a low aggregate multiplier and a high local multiplier. We show that enlisting disaggregate data improves the precision of aggregate effect estimates, relative to using aggregate time series alone. Moreover, we compare two-step efficient GMM with two alternative moment weighting approaches used in existing research.
I present a model where work implies social interactions and the spread of a disease is described by an SIR-type framework. Upon the outbreak of a disease, individuals reduce social contacts at the cost of lower consumption. Private individuals do not internalize the effects of their decisions on the evolution of the epidemic, while the planner does. Specifically, the planner internalizes that an early reduction in contacts implies fewer infectious individuals in the future and, therefore, a lower risk of infection. This additional (relative to private individuals) benefit of reduced contacts implies that the planner's solution features more social distancing early in the epidemic. The planner also internalizes that some infectious individuals eventually recover and contribute further to a lower risk of infection. These mechanisms imply that the planner obtains a flatter infection curve than that generated by private individuals' responses.
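The flattening mechanism in an SIR-type framework can be illustrated with a minimal discrete-time simulation. The parameter values and the uniform contact-reduction policy below are illustrative assumptions, not taken from the paper; `sir_path` is a hypothetical helper, and distancing is modeled simply as scaling down the transmission rate.

```python
import numpy as np

def sir_path(beta, gamma, contact_cut, T=150, i0=1e-3):
    """Discrete-time SIR with a uniform reduction in social contacts.

    beta: baseline transmission rate; gamma: recovery rate;
    contact_cut: fraction by which contacts (and hence transmission)
    are reduced. Illustrative parameterization, not from the paper.
    Returns the peak share infectious and the final recovered share.
    """
    s, i, r = 1.0 - i0, i0, 0.0
    peak = i
    for _ in range(T):
        new_inf = (1.0 - contact_cut) * beta * s * i  # new infections this period
        new_rec = gamma * i                           # recoveries this period
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        peak = max(peak, i)
    return peak, r

# More distancing lowers peak prevalence, i.e., flattens the curve.
peak_private, _ = sir_path(beta=0.30, gamma=0.10, contact_cut=0.0)
peak_planner, _ = sir_path(beta=0.30, gamma=0.10, contact_cut=0.3)
```

With these made-up parameters, the planner-like path (larger `contact_cut`) reaches a lower peak share of infectious individuals than the private path, which is the sense in which the infection curve is "flatter."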
Short-term debt is commonly used to fund illiquid assets. A conventional view asserts that such arrangements are run-prone in part because redemptions must be processed on a first-come, first-served basis. This sequential service protocol, however, appears absent in the wholesale banking sector---and yet, shadow banks appear vulnerable to runs. We explain how banking arrangements that fund fixed-cost operations using short-term debt can be run-prone even in the absence of sequential service. Interventions designed to eliminate run risk may or may not improve depositor welfare. We describe how optimal policies vary under different conditions and compare these to recent policy interventions by the Securities and Exchange Commission and the Federal Reserve. We conclude that the conventional view concerning the societal benefits of liquidity transformation and its recommendations for prudential policy extend far beyond their application to depository institutions.
We measure labor demand and supply shocks at the sector level around the COVID-19 outbreak by estimating a Bayesian structural vector autoregression on monthly statistics of hours worked and real wages. Most sectors were subject to historically large negative labor supply and demand shocks in March and April, with substantial heterogeneity in the size of shocks across sectors. Our estimates suggest that two-thirds of the drop in the aggregate growth rate of hours in March and April 2020 are attributable to labor supply. We validate our estimates of supply shocks by showing that they are correlated with sectoral measures of telework.
This paper studies international trade policy during a pandemic. We consider a multi-sector small open economy model with essential and non-essential goods. Essential goods provide utility relative to a reference consumption level, and a pandemic consists of an increase in this reference level along with higher import and export prices of these goods. The economy produces domestic varieties of both types of goods subject to sectoral adjustment costs, and varieties are traded internationally subject to trade barriers. We find that trade provides limited relief to the increased demand for essential goods: import prices increase, limiting access, while domestic producers reallocate domestic sales toward exports. We find that international trade policy changes can mitigate these effects. The optimal unilateral trade policy response to the pandemic is to subsidize imports of essential goods while taxing exports, leading to increased consumption of essential goods in the short-run. These findings are consistent with evidence on changes in trade barriers across countries during the COVID-19 pandemic.
The largest economic cost of the COVID-19 pandemic could arise if it changed behavior long after the immediate health crisis is resolved. A common explanation for such a long-lived effect is the scarring of beliefs. We show how to quantify the extent of such belief changes and determine their impact on future economic outcomes. We find that the long-run effect of the COVID crisis depends crucially on whether bankruptcies and changes in habit make existing capital obsolete. A policy that avoided most permanent separation of workers from capital could generate a much larger benefit than originally thought, easily 180% of annual GDP in present value.
This paper studies the optimal maturity structure for government debt when markets for liquidity insurance are incomplete or non-competitive. There is no fiscal risk. Government debt in the model solves a dynamic inefficiency. Issuing debt in short and long maturities solves a liquidity insurance problem, but optimal yield curve policy is only possible if long-duration debt is rendered illiquid. Optimal policy is implementable through treasury operations only--adjustments in the primary deficit are not necessary.
A positive national debt is often rationalized either by the assumption of dynamic inefficiency in an overlapping-generations (OLG) model, or by the hypothesis of heterogeneous-agents and incomplete-markets (HAIM) in an infinite horizon model. Both assumptions imply insufficient private liquidity to support private saving and investment, thus calling for a positive level of public debt to improve social welfare. However, since public debt is often financed by distortionary future taxes, optimal debt and tax policies ought to be studied jointly in a single framework. In this paper we use a primal Ramsey approach to analytically characterize optimal debt and tax policy in an OLG-HAIM model. We show that (i) public debt can be a liability instead of net wealth, despite insufficient private liquidity to support private saving and investment, and (ii) such a debt policy can dramatically change the government's optimal tax scheme since the sign and magnitude of the optimal quantity of debt dictate the sign and magnitude of optimal taxes as well as the priority order of tax tools (such as a labor tax vs. a capital tax) in financing the public debt.
I study the effects of the 2020 coronavirus outbreak in the United States and subsequent fiscal policy response in a nonlinear DSGE model. The pandemic is a shock to the utility of contact-intensive services that propagates to other sectors via general equilibrium, triggering a deep recession. I use a calibrated version of the model that matches the path of the US unemployment rate in 2020 to analyze different types of fiscal policies. I find that the pandemic shock changes the ranking of policy multipliers. Unemployment benefits are the most effective tool to stabilize income for borrowers, who are the hardest hit during a pandemic, while liquidity assistance programs are the most effective if the policy objective is to stabilize employment in the affected sector. I also study the effects of the $2.2 trillion CARES Act of 2020.
In this paper we present and describe a large, quarterly-frequency macroeconomic database. The data provided are closely modeled on those used in Stock and Watson (2012a). As in our previous work on FRED-MD, our goal is simply to provide a publicly available source of macroeconomic “big data” that is updated in real time using the FRED database. We show that factors extracted from this data set exhibit similar behavior to those extracted from the original Stock and Watson data set. The dominant factors are shown to be insensitive to outliers, but outliers do affect the relative influence of the series as indicated by leverage scores. We then investigate the role unit root tests play in the choice of transformation codes with an emphasis on identifying instances in which the unit root-based codes differ from those already used in the literature. Finally, we show that factors extracted from our data set are useful for forecasting a range of macroeconomic series and that the choice of transformation codes can contribute substantially to the accuracy of these forecasts.
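The factor extraction underlying this kind of exercise is typically done by principal components on a standardized panel. The sketch below is a generic illustration of that step under a standard approximate factor model with simulated data; it is not the FRED-QD code, and `extract_factors` and all numbers are assumptions for demonstration.

```python
import numpy as np

def extract_factors(X, k):
    """Principal-components factors from a stationary T x N panel X.

    Each series is standardized, then the first k left singular
    vectors (scaled by their singular values) serve as estimated
    factors. Illustrative sketch of the textbook procedure.
    """
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)
    factors = U[:, :k] * S[:k]   # T x k estimated factors
    loadings = Vt[:k].T          # N x k estimated loadings
    return factors, loadings

# Simulated panel: 50 series driven by 2 common factors plus noise.
rng = np.random.default_rng(0)
T, N = 200, 50
f_true = rng.standard_normal((T, 2))
lam = rng.standard_normal((N, 2))
X = f_true @ lam.T + 0.5 * rng.standard_normal((T, N))
F, L = extract_factors(X, k=2)
```

In this simulated setting the first two principal-components factors recover most of the common variation, which is the property the paper checks when comparing its factors to those from the original Stock and Watson data set.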
New vehicle sales in the U.S. fell nearly 40 percent during the past recession, causing significant job losses and unprecedented government interventions in the auto industry. This paper explores three potential explanations for this decline: increasing oil prices, falling home values, and falling household income expectations. First, we use the historical macroeconomic relationship between oil prices and vehicle sales to show that the oil price spike explains roughly 15 percent of the auto sales decline between 2007 and 2009. Second, we establish that declining home values explain only a small portion of the observed reduction in household new vehicle sales. Using a county-level panel from the episode, we find (1) a one-dollar fall in home values reduced household new vehicle spending by 0.5 to 0.7 cents and overall new vehicle spending by 0.9 to 1.2 cents and (2) falling home values explain between 16 and 19 percent of the overall new vehicle spending decline. Next, examining state-level data for 1997-2016, we find (3) the short-run responses of new vehicle consumption to home value changes are larger in the 2005-2011 period relative to other years, but at longer horizons (e.g., 5 years) the responses are similar across the two sub-periods and (4) the service flow from vehicles, as measured by miles traveled, responds very little to house price shocks. We also detail the sources of the differences between our findings (1) and (2) and those in existing research. Third, we establish that declining expectations of current and future income potentially played an important role in the auto market's collapse. We build a permanent income model augmented to include infrequent repeated car buying. Our calibrated model matches the pre-recession distribution of auto vintages and the liquid-wealth-to-income ratio, and exhibits a large vehicle sales decline in response to a mild decline in expected permanent income due to a transitory slowdown in income growth.
In response to the shock, households delay replacing existing vehicles, allowing them to smooth the effects of the income shock without significantly adjusting the service flow from their vehicles. Augmenting our model with a richer set of household expectations allows us to match 65 percent of the overall new vehicle spending decline (i.e. roughly the portion of the decline not explained by oil prices and falling home values). Combining our negative results regarding housing wealth and oil prices with our positive model-based findings, we interpret the auto market collapse as consistent with existing permanent income based approaches to durable goods purchases (e.g., Leahy and Zeira (2005)).
This article examines how the real economy and inflation and inflation expectations evolved in response to the six tightening episodes enacted by the FOMC since 1983. The findings indicate that the sixth episode (2015-2018) differed in several key dimensions compared with the previous five episodes. In the first five episodes, the data show the FOMC was generally tightening into a strengthening economy with building price pressures. In contrast, in the final episode the FOMC began its tightening regime during a deceleration in economic activity and with headline and core inflation remaining well below the FOMC’s 2 percent inflation target. Moreover, both short- and long-term inflation expectations were drifting lower. These developments helped explain why there was a one-year gap between the first and second increases in the federal funds target rate in the final episode. Another key difference is that in three of the first five episodes, the FOMC continued to tighten after the yield curve inverted; a recession then followed shortly thereafter. However, in the final episode, the FOMC ended its tightening policy about eight months before the yield curve inverted.
In a canonical model of heterogeneous agents with precautionary saving motives, Aiyagari (1995) breaks the classical result of zero capital tax obtained in representative-agent models. Aiyagari argues that with capital overaccumulation the optimal long-run capital tax should be strictly positive in order to achieve aggregate allocative efficiency suggested by the modified golden rule (MGR). In this paper, we find that, depending on the sources of capital overaccumulation, capital taxation may not be the most efficient means to restore the MGR when government debt is feasible. To demonstrate our point, we study optimal policy mix in achieving the socially optimal (MGR) level of aggregate capital stock in an infinite horizon heterogeneous-agents incomplete-markets economy where capital may be overaccumulated for two distinct reasons: (i) precautionary savings and (ii) production externalities. By solving the Ramsey problem analytically along the entire transitional path, we reveal that public debt and capital taxation play very distinct roles in dealing with the overaccumulation problem. The Ramsey planner opts neither to use a capital tax to correct the overaccumulation problem if it is caused solely by precautionary saving—regardless of the feasibility of public debt—nor to use debt (financed by consumption tax) to correct the overaccumulation problem if it is caused solely by production externality (such as pollution)—regardless of the feasibility of a capital tax. The key is that the MGR has two margins: an intratemporal margin pertaining to the marginal product of capital (MPK) and an intertemporal margin pertaining to the time discount rate. To achieve the MGR, the Ramsey planner needs to equate not only the private MPK with the social MPK but also the interest rate with the time discount rate—neither of which is equalized in a competitive equilibrium. 
Yet public debt and a capital tax are each effective in targeting only one of these two margins, not both.
The closing of a busy airport has large effects on noise and economic activity. Using a unique dataset, we examine the effects of closing Denver’s Stapleton Airport on nearby housing markets. We find evidence of immediate anticipatory price effects upon announcement, but no price changes at closing and little evidence of upward trending prices between announcement and closing. However, after airport closure, more higher income and fewer black households moved into these locations, and developers built higher quality houses. Finally, post-closing, these demographic and housing stock changes had substantial effects on housing prices, even after restricting the sample to sales of pre-existing housing.
We use an analytically tractable, heterogeneous-agent incomplete-markets model to show that the Ramsey planner's decision to finance stochastic public expenditures implies a departure from tax smoothing and an endogenous mean-reverting force to support positive debt growth despite the government's precautionary saving motives. Specifically, the government's attempt to balance the competing incentives between its own precautionary saving (tax smoothing) and households' precautionary saving (individual consumption smoothing)---even at the cost of extra tax distortion---implies an endogenous, soft lower bound on the stochastic unit-root dynamics of optimal taxes and public debt.
US payroll employment data come from a survey and are subject to revisions. While revisions are generally small at the national level, they can be large enough at the state level to alter assessments of current economic conditions. Users must therefore exercise caution in interpreting state employment data until they are “benchmarked” against administrative data 5–16 months after the reference period. This paper develops a state-space model that predicts benchmarked state employment data in real time. The model has two distinct features: 1) an explicit model of the data revision process and 2) a dynamic factor model that incorporates real-time information from other state-level labor market indicators. We find that the model reduces the average size of benchmark revisions by about 11 percent. When we optimally average the model’s predictions with those of existing models, the model reduces the average size of the revisions by about 14 percent.
The wave of sovereign defaults in the early 1980s and the string of debt crises in subsequent decades have fostered proposals involving policy interventions in sovereign debt restructurings. The global financial crisis and the recent global pandemic have further reignited this discussion among academics and policymakers. A key question about these policy proposals for debt restructurings that has proved hard to handle is how they influence the behavior of creditors and debtors. We address this challenge by evaluating policy proposals in a quantitative sovereign default model that incorporates two essential features of debt: maturity choice and debt renegotiation in default. We find, first, that a rule that tilts the distribution of creditor losses during restructurings toward holders of long-maturity bonds reduces short-term yield spreads, lowering the probability of a sovereign default by 25 percent. Second, issuing GDP-indexed bonds exclusively during restructurings also reduces the probability of default, especially of defaults in the five years following a debt restructuring. The policies lead to welfare improvements and reductions in haircuts of similar magnitude when implemented separately. When jointly implemented, they reinforce each other’s welfare gains, suggesting strong complementarity between the two policies.
Gino Gancia, Giacomo Ponzetto and Jaume Ventura have written an extremely interesting paper on a topic that is very timely for the global economy. In this article, I will first argue that GPV have succeeded in formalizing their hypothesis and that, while the model provides very suggestive analytical results, additional work can and should be done with it, especially with regard to changes in the relative weights of incumbent countries. Second, I will comment on the potential insights if the rest of the world is modeled more realistically. Third, I will call for extending the baseline model to incorporate additional aspects beyond trade, such as investment and immigration flows, which appear to be relevant for the story of the European Union and its discontents. Fourth, I will add my non-European perspective on using the model to understand the story of the European Union.
We propose a method to decompose changes in the tax structure into orthogonal components measuring the level and progressivity of taxes. The level shock is similar to tax shocks found in the empirical literature--increasing the tax level is contractionary. On the other hand, an increase in tax progressivity is expansionary. When tax progressivity increases, the bottom of the income distribution experiences an increase in disposable income. Agents at the low end of the income distribution who have high marginal propensity to consume offset the decrease in consumption by the savers at the high end of the income distribution. In the medium term, the economic expansion benefits those at the top of the income distribution: Capital gains they experience from the boom offset income losses from the increase in tax progressivity. The net result is that an increase in progressivity leads to an increase in income inequality, contrary to what conventional wisdom might suggest. We interpret these results as evidence in favor of trickle up, not trickle down, economics.
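One way to see a level-versus-progressivity split is as an orthogonal projection of tax-rate changes onto a uniform shift and an income tilt. The regression below is a stylized stand-in for the paper's decomposition, with made-up numbers; `level_progressivity` and the revenue-neutral example are assumptions for illustration only.

```python
import numpy as np

def level_progressivity(tax_change, income_pctile):
    """Split tax-rate changes across the income distribution into an
    orthogonal 'level' component (uniform shift) and a 'progressivity'
    component (tilt in income). Stylized illustration, not the paper's
    identification scheme.
    """
    x = income_pctile - income_pctile.mean()    # demeaned: orthogonal to a constant
    level = tax_change.mean()                   # average (uniform) shift
    progressivity = (x @ tax_change) / (x @ x)  # slope of the change in income
    return level, progressivity

pct = np.linspace(0.0, 1.0, 101)
# A pure tilt: cut rates below the median, raise them above it.
change = 0.04 * (pct - 0.5)
lvl, prog = level_progressivity(change, pct)
```

Because the tilt is constructed to average to zero, the level component is (numerically) zero and the entire change loads on the progressivity component, matching the idea of a progressivity shock with no change in the tax level.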
The global financial crisis of the past decade has shaken the research and policy worlds out of their belief that housing markets are mostly benign and immaterial for understanding economic cycles. Instead, a growing consensus recognizes the central role that housing plays in shaping economic activity, particularly during large boom and bust episodes. This article discusses the latest research regarding the causes, consequences, and policy implications of housing crises with a broad focus that includes empirical and structural analysis, insights from the 2000's experience in the United States, and perspectives from around the globe. Even with the significant degree of heterogeneity in legal environments, institutions, and economic fundamentals over time and across countries, several common themes emerge to guide current and future thinking in this area.
There has been much interest recently in the role of household long-term (mortgage) debt in the transmission of monetary policy. This paper offers a tractable framework that integrates the long-term debt channel with the standard New Keynesian channel, providing a tool for monetary policy analysis that reflects the recent debates in the literature. As the model includes both short- and long-term debt, it provides a useful laboratory for the analysis of monetary policy operating not only through short-term actions, as has traditionally been done in the literature, but also through expected, persistent changes in its stance.
This paper investigates the determinants of international technology licensing using data for 41 countries during 1996-2012. A multi-country model of innovation and international technology licensing yields a dynamic structural gravity equation for royalty payments as a function of fundamentals, including: (i) imperfect intellectual property protection and (ii) tax havens. The gravity equation is estimated using nonlinear methods. The model’s fundamentals account for 56% of the variation in royalty payments. Counterfactual analysis sheds light on the role of intellectual property rights and tax havens in international technology licensing.