In this paper we present and describe a large, quarterly-frequency macroeconomic database. The data provided are closely modeled on those used in Stock and Watson (2012a). As in our previous work on FRED-MD, our goal is simply to provide a publicly available source of macroeconomic “big data” that is updated in real time using the FRED database. We show that factors extracted from this data set exhibit similar behavior to those extracted from the original Stock and Watson data set. The dominant factors are shown to be insensitive to outliers, but outliers do affect the relative influence of the series as indicated by leverage scores. We then investigate the role unit root tests play in the choice of transformation codes with an emphasis on identifying instances in which the unit root-based codes differ from those already used in the literature. Finally, we show that factors extracted from our data set are useful for forecasting a range of macroeconomic series and that the choice of transformation codes can contribute substantially to the accuracy of these forecasts.
New vehicle sales in the U.S. fell nearly 40 percent during the past recession, causing significant job losses and unprecedented government interventions in the auto industry. This paper explores three potential explanations for this decline: increasing oil prices, falling home values, and falling household income expectations. First, we use the historical macroeconomic relationship between oil prices and vehicle sales to show that the oil price spike explains roughly 15 percent of the auto sales decline between 2007 and 2009. Second, we establish that declining home values explain only a small portion of the observed reduction in household new vehicle sales. Using a county-level panel from the episode, we find (1) a one-dollar fall in home values reduced household new vehicle spending by 0.5 to 0.7 cents and overall new vehicle spending by 0.9 to 1.2 cents and (2) falling home values explain between 16 and 19 percent of the overall new vehicle spending decline. Next, examining state-level data for 1997-2016, we find (3) the short-run responses of new vehicle consumption to home value changes are larger in the 2005-2011 period relative to other years, but at longer horizons (e.g., 5 years) the responses are similar across the two sub-periods and (4) the service flow from vehicles, as measured by miles traveled, responds very little to house price shocks. We also detail the sources of the differences between our findings (1) and (2) and those in existing research. Third, we establish that declining current income and falling expectations of future income potentially played an important role in the auto market's collapse. We build a permanent income model augmented to include infrequent repeated car buying. Our calibrated model matches the pre-recession distribution of auto vintages and the liquid-wealth-to-income ratio, and it exhibits a large vehicle sales decline in response to a mild decline in expected permanent income due to a transitory slowdown in income growth.
In response to the shock, households delay replacing existing vehicles, allowing them to smooth the effects of the income shock without significantly adjusting the service flow from their vehicles. Augmenting our model with a richer set of household expectations allows us to match 65 percent of the overall new vehicle spending decline (i.e., roughly the portion of the decline not explained by oil prices and falling home values). Combining our negative results regarding housing wealth and oil prices with our positive model-based findings, we interpret the auto market collapse as consistent with existing permanent-income-based approaches to durable goods purchases (e.g., Leahy and Zeira (2005)).
This article examines how the real economy, inflation, and inflation expectations evolved in response to the six tightening episodes enacted by the FOMC since 1983. The findings indicate that the sixth episode (2015-2018) differed in several key dimensions from the previous five episodes. In the first five episodes, the data show the FOMC was generally tightening into a strengthening economy with building price pressures. In contrast, in the final episode the FOMC began its tightening regime during a deceleration in economic activity and with headline and core inflation remaining well below the FOMC’s 2 percent inflation target. Moreover, both short- and long-term inflation expectations were drifting lower. These developments help explain why there was a one-year gap between the first and second increases in the federal funds target rate in the final episode. Another key difference is that in three of the first five episodes, the FOMC continued to tighten after the yield curve inverted; a recession then followed shortly thereafter. However, in the final episode, the FOMC ended its tightening policy about eight months before the yield curve inverted.
In a canonical model of heterogeneous agents with precautionary saving motives, Aiyagari (1995) breaks the classical result of zero capital tax obtained in representative-agent models. Aiyagari argues that with capital overaccumulation the optimal long-run capital tax should be strictly positive in order to achieve the aggregate allocative efficiency suggested by the modified golden rule (MGR). In this paper, we find that, depending on the sources of capital overaccumulation, capital taxation may not be the most efficient means to restore the MGR when government debt is feasible. To demonstrate our point, we study the optimal policy mix for achieving the socially optimal (MGR) level of aggregate capital stock in an infinite-horizon heterogeneous-agent incomplete-markets economy where capital may be overaccumulated for two distinct reasons: (i) precautionary savings and (ii) production externalities. By solving the Ramsey problem analytically along the entire transitional path, we reveal that public debt and capital taxation play very distinct roles in dealing with the overaccumulation problem. The Ramsey planner opts neither to use a capital tax to correct overaccumulation caused solely by precautionary saving (regardless of the feasibility of public debt) nor to use debt (financed by a consumption tax) to correct overaccumulation caused solely by a production externality, such as pollution (regardless of the feasibility of a capital tax). The key is that the MGR has two margins: an intratemporal margin pertaining to the marginal product of capital (MPK) and an intertemporal margin pertaining to the time discount rate. To achieve the MGR, the Ramsey planner needs to equate not only the private MPK with the social MPK but also the interest rate with the time discount rate, neither of which is equalized in a competitive equilibrium.
Yet public debt and a capital tax are each effective in adjusting only one of the two margins, not both.
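A compact statement of the two MGR margins discussed above, written in generic notation that is not necessarily the paper's:

```latex
% Modified golden rule: the net marginal product of capital equals the
% planner's time discount rate (abstracting from growth), so two
% equalities must hold at the social optimum:
%   (i)  intratemporal: private MPK = social MPK (no externality wedge),
%   (ii) intertemporal: interest rate = time discount rate.
\[
  \underbrace{F_K(K,N) - \delta}_{\text{social net MPK}} \;=\; \rho ,
  \qquad
  r \;=\; \rho ,
\]
% where F_K is the marginal product of capital, \delta is depreciation,
% r is the equilibrium interest rate, and \rho is the time discount rate.
```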
The closing of a busy airport has large effects on noise and economic activity. Using a unique dataset, we examine the effects of closing Denver’s Stapleton Airport on nearby housing markets. We find evidence of immediate anticipatory price effects upon announcement, but no price changes at closing and little evidence of upward trending prices between announcement and closing. However, after airport closure, more higher-income and fewer black households moved into these locations, and developers built higher-quality houses. Finally, post-closing, these demographic and housing stock changes had substantial effects on housing prices, even after restricting the sample to sales of pre-existing housing.
We use an analytically tractable, heterogeneous-agent incomplete-markets model to show that the Ramsey planner's decision to finance stochastic public expenditures implies a departure from tax smoothing and an endogenous mean-reverting force to support positive debt growth despite the government's precautionary saving motives. Specifically, the government's attempt to balance the competing incentives between its own precautionary saving (tax smoothing) and households' precautionary saving (individual consumption smoothing)---even at the cost of extra tax distortion---implies an endogenous, soft lower bound on the stochastic unit-root dynamics of optimal taxes and public debt.
US payroll employment data come from a survey and are subject to revisions. While revisions are generally small at the national level, they can be large enough at the state level to alter assessments of current economic conditions. Users must therefore exercise caution in interpreting state employment data until they are “benchmarked” against administrative data 5–16 months after the reference period. This paper develops a state-space model that predicts benchmarked state employment data in real time. The model has two distinct features: 1) an explicit model of the data revision process and 2) a dynamic factor model that incorporates real-time information from other state-level labor market indicators. We find that the model reduces the average size of benchmark revisions by about 11 percent. When we optimally average the model’s predictions with those of existing models, the model reduces the average size of the revisions by about 14 percent.
The wave of sovereign defaults in the early 1980s and the string of debt crises in subsequent decades have fostered proposals involving policy interventions in sovereign debt restructurings. The global financial crisis and the recent global pandemic have further reignited this discussion among academics and policymakers. A key question about these policy proposals for debt restructurings that has proved hard to handle is how they influence the behavior of creditors and debtors. We address this challenge by evaluating policy proposals in a quantitative sovereign default model that incorporates two essential features of debt: maturity choice and debt renegotiation in default. We find, first, that a rule that tilts the distribution of creditor losses during restructurings toward holders of long-maturity bonds reduces short-term yield spreads, lowering the probability of a sovereign default by 25 percent. Second, issuing GDP-indexed bonds exclusively during restructurings also reduces the probability of default, especially of defaults in the five years following a debt restructuring. The policies lead to welfare improvements and reductions in haircuts of similar magnitude when implemented separately. When jointly implemented, they reinforce each other’s welfare gains, suggesting strong complementarity between the two policies.
Gino Gancia, Giacomo Ponzetto, and Jaume Ventura have written an extremely interesting paper on a topic that is very timely for the global economy. In this article, I will first argue that GPV have succeeded in formalizing their hypothesis and that, while the paper provides very suggestive analytical results, additional work can and should be done with the model, especially with regard to changes in the relative weights of incumbent countries. Second, I will comment on the potential insights if the rest of the world is modeled more realistically. Third, I will call for extending the baseline model to incorporate additional aspects beyond trade, such as investment and immigration flows, which appear to be relevant for the story of the European Union and its discontents. Fourth, I will add my non-European perspective on using the model to understand the story of the European Union.
We propose a method to decompose changes in the tax structure into orthogonal components measuring the level and progressivity of taxes. Similar to tax shocks found in the existing empirical literature, the level shock is contractionary. The tax progressivity shock is expansionary: Increasing tax progressivity raises (lowers) disposable income at the bottom (top) end of the income distribution by shifting the tax burden from the bottom to the top. If agents' marginal propensity to consume falls with income, the rise in consumption at the bottom more than compensates for the decline in consumption at the top. The resulting increase in output and consumption leads to rising capital gains for those at the high end of the income distribution that more than offset their losses from higher income taxes. The net result is that increased progressivity leads to an increase in income inequality, contrary to what conventional wisdom might suggest. We interpret these results as evidence in favor of trickle-up, not trickle-down, economics.
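The marginal-propensity-to-consume logic behind the progressivity result can be illustrated with a minimal arithmetic sketch (the MPC values and the one-dollar transfer are hypothetical illustrations, not the paper's estimates):

```python
# Hypothetical illustration: a revenue-neutral progressivity increase
# shifts $1 of tax burden from a low-income household to a high-income
# household. If the marginal propensity to consume (MPC) falls with
# income, aggregate consumption rises on net.

def aggregate_consumption_change(transfer, mpc_bottom, mpc_top):
    """Change in total consumption when `transfer` dollars of disposable
    income move from the top (MPC = mpc_top) to the bottom
    (MPC = mpc_bottom)."""
    return transfer * mpc_bottom - transfer * mpc_top

# Illustrative MPCs: 0.8 at the bottom, 0.2 at the top.
delta_c = aggregate_consumption_change(1.0, 0.8, 0.2)
print(delta_c > 0)  # positive: the bottom's extra spending dominates
```

The sign flips only if the MPC ordering flips, which is the sense in which the expansionary effect of the progressivity shock hinges on MPCs declining with income.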
The global financial crisis of the past decade has shaken the research and policy worlds out of their belief that housing markets are mostly benign and immaterial for understanding economic cycles. Instead, a growing consensus recognizes the central role that housing plays in shaping economic activity, particularly during large boom and bust episodes. This article discusses the latest research regarding the causes, consequences, and policy implications of housing crises with a broad focus that includes empirical and structural analysis, insights from the 2000s experience in the United States, and perspectives from around the globe. Even with the significant degree of heterogeneity in legal environments, institutions, and economic fundamentals over time and across countries, several common themes emerge to guide current and future thinking in this area.
There has been much recent interest in the role of long-term household (mortgage) debt in the transmission of monetary policy. This paper offers a tractable framework that integrates the long-term debt channel with the standard New-Keynesian channel, providing a tool for monetary policy analysis that reflects the recent debates in the literature. As the model includes both short- and long-term debt, it provides a useful laboratory for the analysis of monetary policy operating not only through short-term actions, as has been done traditionally in the literature, but also through expected, persistent changes in its stance.
This paper investigates the determinants of international technology licensing using data for 41 countries during 1996-2012. A multi-country model of innovation and international technology licensing yields a dynamic structural gravity equation for royalty payments as a function of fundamentals, including (i) imperfect intellectual property protection and (ii) tax havens. The gravity equation is estimated using nonlinear methods. The model’s fundamentals account for 56% of the variation in royalty payments. Counterfactual analysis sheds light on the role of intellectual property rights and tax havens in international technology licensing.
The Bretton Woods international financial system, which was in place from roughly 1949 to 1973, is the most significant modern policy experiment to attempt to simultaneously manage international payments, international capital flows, and international currency values. This paper uses an international macroeconomic accounting methodology to study the Bretton Woods system and finds that it: (1) significantly distorted both international and domestic capital markets and hence the accumulation and allocation of capital; (2) significantly slowed the reconstruction of Europe, albeit while limiting the indebtedness of European countries. Our results also provide support for the utility of the accounting methodology in that it finds a sharp change in the behavior of domestic and international capital market wedges that coincides with the breakdown of the system.
While conditional forecasting has become prevalent both in the academic literature and in practice (e.g., bank stress testing, scenario forecasting), its applications typically focus on continuous variables. In this paper, we merge elements from the literature on the construction and implementation of conditional forecasts with the literature on forecasting binary variables. We use the Qual-VAR [Dueker (2005)], whose joint VAR-probit structure allows us to form conditional forecasts of the latent variable which can then be used to form probabilistic forecasts of the binary variable. We apply the model to forecasting recessions in real-time and investigate the role of monetary and oil shocks on the likelihood of two U.S. recessions.
This paper examines why there is house price comovement across some U.S. metropolitan areas (MSAs), and which MSAs cluster together for each of these reasons. Past studies have attributed common recessions in different regions as possible explanations for comovement. We explore other channels, and find some clusters based on common industry concentration (such as information technology), developable land area, as well as a cluster of MSAs that are desirable for retirees (in the sun belt). We find seven clusters of MSAs, where each cluster experiences idiosyncratic house price downturns, plus one distinct national house price cycle. Notably, only the housing downturn associated with the Great Recession spread across all the MSAs in our sample; all other house price downturns remained contained to a single cluster. We also identify MSA economic and geographic characteristics that correlate with cluster membership, which implies comovement due to mobility of residents. In addition, while prior research has found housing and business cycles to be related closely at the national level, we find very different house price comovement and employment comovement across clusters and across MSAs.
Countries have widely imposed fiscal rules designed to constrain government spending and ensure fiscal responsibility. This paper studies the effectiveness and welfare implications of revenue, deficit, and debt rules when governments are discretionary and prone to overspending. The optimal prescription is a revenue ceiling coupled with a balanced-budget requirement. For the U.S., the optimal revenue ceiling is about 15% of output, 3 percentage points below the postwar average. Most of the benefits can still be reaped with a milder constraint or with escape clauses during adverse times. Imposing a single fiscal rule allows governments to comply without necessarily curbing spending; on their own, revenue ceilings are only mildly effective, while deficit and debt rules are altogether ineffective.
We compile a new database of grocery prices in Argentina. We find uniform pricing both within and across regions (i.e., prices barely vary across stores within a chain). In line with uniform pricing, prices in stores of chains operating in one region react to changes in regional employment, while prices in multi-region chains do not. Using a quantitative regional model with multi-region firms and uniform pricing, we find that the elasticity of prices to a regional shock is one-half the elasticity to an aggregate shock. This result highlights that some caution may be necessary when using regional shocks to estimate aggregate elasticities.
Using Japanese firm data covering the Japanese financial crisis in the early 1990s, we find that exporters' domestic sales declined more significantly than their foreign sales, which in turn declined more significantly than non-exporters' sales. This stylized fact provides a new litmus test for the different theories proposed in the literature to explain a trade collapse associated with a financial crisis. In this paper we embed Melitz's (2003) model into a tractable DSGE framework with incomplete financial markets and endogenous credit allocation to explain both the Japanese firm-level data and the well-documented aggregate trade collapses during financial crises in world economic history. The model highlights the role of credit reallocation between non-exporters and exporters as the main mechanism explaining exporters' behavior and trade collapse following a financial crisis.
We document considerable heterogeneity among the unemployed, especially when the unemployed are divided by eligibility for and receipt of unemployment insurance (UI). We study the implications of this heterogeneity for UI’s insurance-incentive trade-off using a heterogeneous-agent job-search model capable of matching the wealth and income differences that distinguish UI recipients from non-recipients. Insurance benefits are larger for UI recipients, who are predominantly wealth-poor. Meanwhile, incentive costs are non-monotonic in wealth because the poorest individuals, who value employment, exhibit weak responses. Differential elasticities imply that accounting for the composition of recipients is material to aligning model predictions with empirical estimates.
This paper examines the reliability of survey data on business incomes, valuations, and rates of return, which are key inputs for studies of wealth inequality and entrepreneurial choice. We compare survey responses of business owners with available data from administrative tax records, brokered private business sales, and publicly traded company filings and document problems due to nonrepresentative samples and measurement errors across several surveys, subsamples, and years. We find that the discrepancies are economically relevant for the statistics of interest. We investigate reasons for these discrepancies and propose corrections for future survey designs.
I document a small spousal earnings response to the job displacement of the family head. The response is even smaller in recessions, when earnings losses are larger and additional insurance is valuable. Using cross-state differences in transfer generosity, I find that generous transfers substantially crowd out the spousal earnings response. To study its policy implications, I develop an incomplete markets model with family labor supply and aggregate fluctuations, where predicted labor supply elasticities to taxes and transfers are in line with empirical estimates both in aggregate and across income groups. Counterfactual experiments indeed reveal that generous transfers in recessions discourage spousal earnings. I show that the optimal policy features procyclical means-tested and countercyclical employment-tested transfers, unlike the existing policy that maintains generous transfers of both types in recessions. Abstracting from the incentive costs of transfers on the spousal labor supply changes both the level and the cyclicality of optimal transfers.
We construct a search model where sellers post prices and produce goods of unknown quality. A match reveals the quality of the seller. Buyers rate sellers based on quality. We show that unrated sellers charge a low price to attract buyers and that highly rated sellers post a high price and sell with a higher probability than unrated sellers. We find that welfare is higher with a ratings system. Using data on Airbnb rentals, we show that Superhosts and hosts with high ratings 1) charge higher prices, 2) have higher occupancy rates, and 3) earn higher revenue than average hosts.
We investigate claims made in Giacomini and White (2006) and Diebold (2015) regarding the asymptotic normality of a test of equal predictive ability. A counterexample is provided in which, instead, the test statistic diverges with probability one under the null.
We present a monopolistic competition model to analyze the effects of own-nation and neighboring-nation terrorism on a nation’s imports. The theoretical analysis shows that own-nation terrorism may leave the relative price of imports unaffected, but neighboring-nation terrorism must raise the relative price, reducing imports. We find that a 10% increase in terrorist attacks in a neighboring nation reduces a country’s imports from the rest of the world by approximately $320 million USD, on average. Mediation analysis shows that trading delays are a potential channel through which the trade costs of terrorism are transmitted to a neighbor.
We document three facts: (i) Higher educated workers are more likely to moonlight; (ii) conditional on education, workers with higher wages are less likely to moonlight; and (iii) the prevalence of moonlighting is declining over time for all education groups. We develop an equilibrium model of the labor market to explain these patterns. A dominating income effect explains the negative correlation of moonlighting with productivity in the cross section and the downward trend over time. A higher part-to-full time pay differential for skilled workers (a comparative advantage) explains the positive correlation with education. We provide empirical evidence of the comparative advantage using CPS data. We calibrate the model to 1994 cross-sectional data and assess its ability to reproduce the 2017 data. The driving forces are productivity variables and the proportion of skilled workers. The model accounts for 56% of the moonlighting trend for skilled workers, and 67% for unskilled workers.
We argue that the fiscal multiplier of government purchases in incomplete markets models is nonlinear in the spending shock, in contrast to the multiplier in complete markets models and what is assumed in most of the literature. In particular, the multiplier is increasing in the spending shock, with large positive shocks having the largest multiplier and large negative shocks having the smallest multiplier. The mechanism hinges on the relationship between fiscal shocks, their form of financing, and the response of labor supply across the wealth distribution. The model predicts that the aggregate labor supply elasticity is increasing in the fiscal shock, and this holds regardless of whether shocks are deficit- or balanced-budget financed. Our findings are consistent with aggregate fiscal consolidation data across 15 OECD countries over time. Furthermore, we find evidence of our mechanism in microdata for the US.
We analyze the propagation of recessions across countries. We construct a model that allows for multiple qualitative state variables in a vector autoregression (VAR) setting. The VAR structure allows us to include country-level variables to determine whether policy also propagates across countries. We consider two different versions of the model. One version assumes the discrete state of the economy (expansion or recession) is observed. The other assumes that the state of the economy is unobserved and must be inferred from movements in economic growth. We apply the model to Canada, Mexico, and the United States to test if spillover effects were similar before and after the North American Free Trade Agreement (NAFTA). We find that trade liberalization has increased the degree of business cycle propagation across the three countries.
We construct a dynamic general equilibrium model with occupation mobility, human capital accumulation, and endogenous assignment of workers to tasks to quantitatively assess the aggregate impact of automation and other task-biased technological innovations. We extend recent quantitative general equilibrium Roy models to a setting with dynamic occupational choices and human capital accumulation. We provide a set of conditions for the problem of workers to be written in recursive form and provide a sharp characterization of the optimal mobility of individual workers and of the aggregate supply of skills across occupations. We craft our dynamic Roy model in a production setting where multiple tasks within occupations are assigned to workers or machines. We solve for the balanced-growth path and characterize the aggregate transitional dynamics following task-biased technological innovations. In our quantitative analysis of the impact of task-biased innovations in the U.S. since 1980, we find that they account for increased aggregate output on the order of 75% and for a much higher dispersion in earnings. If the U.S. economy had larger barriers to mobility, it would have experienced less job polarization but substantially higher inequality and lower output, as occupation mobility has provided an "escape" for the losers from automation.
The Lerner index is widely used to assess firms' market power. However, estimation and interpretation present several challenges, especially for banks, which tend to produce multiple outputs and operate with considerable inefficiency. We estimate Lerner indices for U.S. banks for 2001-18 using nonparametric estimators of the underlying cost and profit functions, controlling for inefficiency, and incorporating banks' off-balance-sheet activities. We find that mis-specification of cost or profit functional forms can seriously bias Lerner index estimates, as can failure to account for inefficiency and off-balance-sheet output.
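For reference, the Lerner index mentioned above takes the standard textbook form (generic symbols, not necessarily the paper's notation):

```latex
% Lerner index for bank i: the markup of output price over marginal
% cost, expressed as a share of price. L_i = 0 under perfect
% competition; larger values indicate greater market power.
\[
  L_i \;=\; \frac{P_i - MC_i}{P_i},
\]
% where P_i is the output price and MC_i is marginal cost; the paper
% recovers MC_i from nonparametrically estimated cost and profit
% functions, adjusting for inefficiency and off-balance-sheet output.
```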