The Bretton Woods international financial system, which was in place from roughly 1949 to 1973, is the most significant modern policy experiment to attempt to simultaneously manage international payments, international capital flows, and international currency values. This paper uses an international macroeconomic accounting methodology to study the Bretton Woods system and finds that it: (1) significantly distorted both international and domestic capital markets and hence the accumulation and allocation of capital; and (2) significantly slowed the reconstruction of Europe, albeit while limiting the indebtedness of European countries. Our results also support the utility of the accounting methodology: it detects a sharp change in the behavior of domestic and international capital market wedges that coincides with the breakdown of the system.
While conditional forecasting has become prevalent both in the academic literature and in practice (e.g., bank stress testing, scenario forecasting), its applications typically focus on continuous variables. In this paper, we merge elements from the literature on the construction and implementation of conditional forecasts with the literature on forecasting binary variables. We use the Qual-VAR [Dueker (2005)], whose joint VAR-probit structure allows us to form conditional forecasts of the latent variable, which can then be used to form probabilistic forecasts of the binary variable. We apply the model to forecasting recessions in real time and investigate the role of monetary and oil shocks in the likelihood of two U.S. recessions.
This paper examines house price diffusion across metropolitan areas in the United States. We develop a generalization of the Hamilton and Owyang (2012) Markov-switching model that incorporates direct regional spillovers through a spatial weighting matrix. The Markov-switching framework accommodates house price movements that occur because downturns are similarly timed across MSAs. The inclusion of the spatial weighting matrix improves fit compared to a standard endogenous clustering model. We find seven clusters of MSAs that experience idiosyncratic recessions, plus one distinct national house price cycle. Notably, only the housing downturn associated with the Great Recession spread across all of the MSAs in our sample; other house price downturns remained contained within a single cluster. Previous research has found that housing cycles and business cycles are intertwined. To examine this potential relationship, we apply our spatial Markov-switching model to employment growth data. We find that house price comovement and employment comovement are distinct across cities.
Countries have widely imposed fiscal rules designed to constrain government spending and ensure fiscal responsibility. This paper studies the effectiveness and welfare implications of revenue, deficit, and debt rules when governments are discretionary and profligate. The optimal prescription is a revenue ceiling coupled with a balanced-budget requirement. For the U.S., the optimal revenue ceiling is about 15% of output, 3 percentage points below the postwar average, and yields welfare gains equivalent to 10% of consumption. Most of the benefits can still be reaped with a milder constraint or escape clauses during adverse times. Imposing a single fiscal rule allows governments to comply without necessarily curbing spending; on their own, revenue ceilings are only mildly effective, while deficit and debt rules are altogether ineffective.
During the Great Recession, the collapse of consumption across the U.S. varied greatly but systematically with house-price declines. We find that financial distress among U.S. households amplified the sensitivity of consumption to house-price shocks. We uncover two essential facts: (1) the decline in house prices led to an increase in household financial distress prior to the decline in income during the recession, and (2) at the zip-code level, the prevalence of financial distress prior to the recession was positively correlated with house-price declines at the onset of the recession. Using a rich estimated dynamic model to measure the financial distress channel, we find that these two facts amplify the aggregate drop in consumption by 7 percent and 45 percent, respectively.
We compile a new database of grocery prices in Argentina, with over 9 million observations per day. We find uniform pricing both within and across regions—i.e., prices for a given product scarcely vary across stores of a chain. Uniform pricing implies that prices would not change with regional conditions or shocks, particularly so if chains operate in several regions. We confirm this hypothesis using employment data. While prices in stores of chains operating almost exclusively in one region do react to changes in regional employment, stores of chains that operate in many regions do not. Finally, using a quantitative regional model with multi-region firms and uniform pricing, we find that the elasticity of prices to a regional shock is almost one-half smaller than the elasticity to an aggregate shock. This result highlights that some caution may be necessary when using regional shocks to estimate aggregate elasticities, particularly when the relevant prices are set uniformly across regions.
Using Japanese firm data covering the Japanese financial crisis in the early 1990s, we find that exporters' domestic sales declined more significantly than their foreign sales, which in turn declined more significantly than non-exporters' sales. This stylized fact provides a new litmus test for different theories proposed in the literature to explain a trade collapse associated with a financial crisis. In this paper we embed the Melitz (2003) model into a tractable DSGE framework with incomplete financial markets and endogenous credit allocation to explain both the Japanese firm-level data and the well-documented collapses of aggregate trade during financial crises in world economic history. The model highlights the role of credit reallocation between non-exporters and exporters as the main mechanism explaining exporters' behavior and trade collapse following a financial crisis.
We study optimal unemployment insurance (UI) over the business cycle using a heterogeneous agent job search model with aggregate risk and incomplete markets. We validate the model-implied micro and macro labor market elasticities to changes in UI generosity against existing estimates and reconcile divergent empirical findings. We show that generating the observed demographic differences between UI recipients and non-recipients is critical in determining the magnitudes of these elasticities. We find that the optimal policy features countercyclical replacement rates with average generosity close to current U.S. policy but adopts drastically longer payment durations reminiscent of European policies.
This paper examines the reliability of survey data on business incomes, valuations, and rates of return, which are key inputs for studies of wealth inequality and entrepreneurial choice. We compare survey responses of business owners with available data from administrative tax records, brokered private business sales, and publicly traded company filings and document problems due to nonrepresentative samples and measurement errors across several surveys, subsamples, and years. We find that the discrepancies are economically relevant for the statistics of interest. We investigate reasons for these discrepancies and propose corrections for future survey designs.
I document a small spousal earnings response to the job displacement of the family head. The response is even smaller in recessions, when earnings losses are larger and additional insurance is most valuable. I investigate whether the small response is an outcome of the crowding-out effects of government transfers. To accomplish this, I use an incomplete markets model with family labor supply and aggregate fluctuations where predicted spousal labor supply elasticities with respect to transfers are in line with microeconomic estimates both in aggregate and across subpopulations. Counterfactual experiments indeed reveal that generous transfers in recessions discourage the spousal labor supply significantly. I then show that the optimal policy features procyclical means-tested and countercyclical employment-tested transfers, unlike the existing policy that maintains generous transfers of both types in recessions. Abstracting from the incentive costs of transfers on the spousal labor supply changes both the level and cyclicality of optimal transfers.
We construct a search model where sellers post prices and produce goods of unknown quality. A match reveals the quality of the seller. Buyers rate sellers based on quality. We show that unrated sellers charge a low price to attract buyers and that highly rated sellers post a high price and sell with a higher probability than unrated sellers. We find that welfare is higher with a ratings system. Using data on Airbnb rentals, we show that Superhosts and hosts with high ratings: 1) charge higher prices, 2) have a higher occupancy rate, and 3) earn higher revenue than average hosts.
We investigate claims made in Giacomini and White (2006) and Diebold (2015) regarding the asymptotic normality of a test of equal predictive ability. A counterexample is provided in which, instead, the test statistic diverges with probability one under the null.
We present a monopolistic competition model to analyze the effects of own nation and neighboring nation terrorism on a nation’s imports. The theoretical analysis shows that own nation terrorism may leave the relative price of imports unaffected, but neighboring nation terrorism must raise the relative price, reducing imports. We find that a 10% increase in terrorist attacks in a neighboring nation reduces a country’s imports from the rest of the world by approximately $320 million USD, on average. Mediation analysis shows that trading delays are a potential channel through which the trade costs of terrorism transmit to a neighbor.
The proportion of multiple jobholders (moonlighters) is negatively correlated with productivity (wages) in cross-sectional and time series data, but positively correlated with education. We develop a model of the labor market to understand these seemingly contradictory facts. An income effect explains the negative correlation with productivity, while a comparative advantage of skilled workers explains the positive correlation with education. We provide empirical evidence of the comparative advantage in CPS data. We calibrate the model to 1994 data on multiple jobholdings and assess its ability to reproduce the 2017 data. There are three exogenous driving forces: productivity, number of children, and the proportion of skilled workers. The model accounts for 68.7% of the moonlighting trend for college-educated workers, and overpredicts it by 33.7% for high school-educated workers. Counterfactual experiments reveal the contribution of each exogenous variable.
We argue that the fiscal multiplier of government purchases is nonlinear in the spending shock, in contrast to what is assumed in most of the literature. In particular, the multiplier of a fiscal consolidation is decreasing in the size of the consolidation. We empirically document this fact using aggregate fiscal consolidation data across 15 OECD countries. We show that a neoclassical life-cycle, incomplete markets model calibrated to match key features of the U.S. economy can explain this empirical finding. The mechanism hinges on the relationship between fiscal shocks, their form of financing, and the response of labor supply across the wealth distribution. The model predicts that the aggregate labor supply elasticity is increasing in the fiscal shock, and this holds regardless of whether shocks are deficit- or balanced-budget financed. We find evidence of our mechanism in microdata for the US.
In this paper, we analyze the propagation of recessions across countries. We construct a model with multiple qualitative state variables that evolve in a VAR setting. The VAR structure allows us to include country-level variables to determine whether policy also propagates across countries. We consider two different versions of the model. One version assumes the discrete state of the economy (expansion or recession) is observed. The other assumes that the state of the economy is unobserved and must be inferred from movements in economic growth. We apply the model to Canada, Mexico, and the U.S. to test if spillover effects were similar before and after NAFTA. We find that trade liberalization has increased the degree of business cycle propagation across the three countries.
We construct a dynamic general equilibrium model with occupation mobility, human capital accumulation, and endogenous assignment of workers to tasks to quantitatively assess the aggregate impact of automation and other task-biased technological innovations. We extend recent quantitative general equilibrium Roy models to a setting with dynamic occupational choices and human capital accumulation. We provide a set of conditions for the problem of workers to be written in recursive form and provide a sharp characterization of the optimal mobility of individual workers and of the aggregate supply of skills across occupations. We craft our dynamic Roy model in a production setting where multiple tasks within occupations are assigned to workers or machines. We solve for the balanced-growth path and characterize the aggregate transitional dynamics following task-biased technological innovations. In our quantitative analysis of the impact of task-biased innovations in the U.S. since 1980, we find that they account for an increase in aggregate output on the order of 75% and for a much higher dispersion in earnings. If the U.S. economy had larger barriers to mobility, it would have experienced less job polarization but substantially higher inequality and lower output, as occupation mobility has provided an "escape" for the losers from automation.
The Lerner index is widely used to assess firms' market power. However, estimation and interpretation present several challenges, especially for banks, which tend to produce multiple outputs and operate with considerable inefficiency. We estimate Lerner indices for U.S. banks for 2001-18 using nonparametric estimators of the underlying cost and profit functions, controlling for inefficiency, and incorporating banks' off-balance-sheet activities. We find that mis-specification of cost or profit functional forms can seriously bias Lerner index estimates, as can failure to account for inefficiency and off-balance-sheet output.
In this note we use simple examples and associated simulations to investigate the size and power properties of tests of predictive ability described in Giacomini and White (2006; Econometrica). While we find that the tests can be accurately sized and powerful in large enough samples, we identify details associated with the tests that are not otherwise apparent from the original text. In order of importance, these include (i) the proposed test of equal finite-sample unconditional predictive ability is not asymptotically valid under the fixed scheme, (ii) for the same test, but when the rolling scheme is used, very large bandwidths are sometimes required when estimating long-run variances, and (iii) when conducting the proposed test of equal finite-sample conditional predictive ability, conditional heteroskedasticity is likely present when lagged loss differentials are used as instruments.
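The size experiments the note describes can be illustrated with a small Monte Carlo under squared-error loss. The function names, the Bartlett (Newey-West) long-run variance estimator, the bandwidth choice, and the null design below are illustrative assumptions for a sketch, not the note's actual setup.

```python
import numpy as np

def long_run_variance(d, bandwidth):
    """Newey-West (Bartlett-kernel) estimate of the long-run variance of d."""
    d = d - d.mean()
    T = len(d)
    v = np.dot(d, d) / T
    for k in range(1, bandwidth + 1):
        w = 1.0 - k / (bandwidth + 1.0)          # Bartlett weight
        v += 2.0 * w * np.dot(d[k:], d[:-k]) / T  # lag-k autocovariance
    return v

def equal_predictive_ability_stat(loss1, loss2, bandwidth=4):
    """t-type statistic for equal unconditional predictive ability.

    Under the null the mean loss differential is zero, and the statistic
    is asymptotically standard normal given a valid long-run variance.
    """
    d = np.asarray(loss1) - np.asarray(loss2)
    T = len(d)
    return np.sqrt(T) * d.mean() / np.sqrt(long_run_variance(d, bandwidth))

# Monte Carlo size check under the null: two equally accurate forecasts,
# so squared-error losses have the same distribution.
rng = np.random.default_rng(0)
n_sims, T = 500, 200
rejections = 0
for _ in range(n_sims):
    e1 = rng.standard_normal(T)  # forecast errors, method 1
    e2 = rng.standard_normal(T)  # forecast errors, method 2
    t_stat = equal_predictive_ability_stat(e1**2, e2**2)
    rejections += abs(t_stat) > 1.96
print(f"empirical size at nominal 5%: {rejections / n_sims:.3f}")
```

In this iid null design the empirical rejection rate should land near the nominal 5% level; the note's point (ii) is precisely that under a rolling estimation scheme the required `bandwidth` can be far larger than conventional choices like the one used here.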
We study the comovement of international business cycles in a time series clustering model with regime-switching. We extend the framework of Hamilton and Owyang (2012) to include time-varying transition probabilities to determine what drives similarities in business cycle turning points. We find four groups, or "clusters", of countries which experience idiosyncratic recessions relative to the global cycle. Additionally, we find the primary indicators of international recessions to be fluctuations in equity markets and geopolitical uncertainty. In out-of-sample forecasting exercises, we find that our model is an improvement over standard benchmark models for forecasting both aggregate output growth and country-level recessions.
We study nominal GDP targeting as optimal monetary policy in a simple and stylized model with a credit market friction. The macroeconomy we study has considerable income inequality, which gives rise to a large private sector credit market. There is an important credit market friction because households participating in the credit market use non-state contingent nominal contracts (NSCNC). We extend previous results in this model by allowing for substantial intra-cohort heterogeneity. The heterogeneity is substantial enough that we can approach measured Gini coefficients for income, financial wealth, and consumption in the U.S. data. We show that nominal GDP targeting continues to characterize optimal monetary policy in this setting. Optimal monetary policy repairs the distortion caused by the credit market friction and so leaves heterogeneous households supplying their desired amount of labor, a type of "divine coincidence" result. We also further characterize monetary policy in terms of nominal interest rate adjustment.
What are the quantitative macroeconomic effects of the countercyclical capital buffer (CCyB)? I study this question in a nonlinear DSGE model with occasional financial crises, which is calibrated and combined with US data to estimate sequences of structural shocks. Raising capital buffers during leverage expansions can reduce the frequency of crises by more than half. A quantitative application to the 2007-08 financial crisis shows that a CCyB in the 2.5% range (as in the Federal Reserve's current framework) could have greatly mitigated the financial panic of 2008, for a cumulative gain of 29% in aggregate consumption. The threat of raising capital requirements is effective even if this tool is not used in equilibrium.
We build a tractable heterogeneous-agent incomplete-markets model with quasi-linear preferences to address a set of long-standing issues in the optimal Ramsey taxation literature. The tractability of our model enables us to analytically prove the existence of a Ramsey steady state and establish several novel results: (i) In the absence of any redistributional effects of capital taxation or lump-sum transfers, the optimal capital tax is exclusively zero in a Ramsey steady state regardless of the modified golden rule (MGR) and government debt limits. (ii) Whether the MGR holds or not depends critically on the government's capacity to issue debt but has no bearing on the planner's long-run capital tax scheme. (iii) The optimal debt-to-GDP ratio, however, is determined by a positive wedge times the MGR saving rate: The wedge is decreasing in the strength of individuals' self-insurance positions and approaches zero when the idiosyncratic risk vanishes or markets are complete. (iv) The assumption of the existence of a Ramsey steady state commonly made in the existing literature is not innocuous: When a Ramsey steady state does not exist but is erroneously assumed to exist, the MGR always appears to "hold" and the implied "optimal" long-run capital tax is strictly positive. (v) Along the transition path toward a Ramsey steady state, the optimal capital tax depends positively on the elasticity of intertemporal substitution. The key insight behind our results is that in the absence of any redistributional effects, taxing capital in the steady state permanently hinders individuals' self-insurance positions, and thus the Ramsey planner opts to issue debt rather than impose a steady-state capital tax to correct the capital-overaccumulation problem. However, if the demand for debt approaches infinity when the interest rate approaches the time discount rate, a Ramsey steady state may not exist; thus, the MGR can fail to hold in a Ramsey equilibrium whenever the government encounters a binding debt limit.
The Great Recession was a deep downturn with long-lasting effects on credit, employment, and output. While narratives about its causes abound, the persistence of GDP below pre-crisis trends remains puzzling. We propose a simple persistence mechanism that can be quantified and combined with existing models. Our key premise is that agents do not know the true distribution of shocks, but use data to estimate it non-parametrically. Then, transitory events, especially extreme ones, generate persistent changes in beliefs and macro outcomes. Embedding this mechanism in a neoclassical model, we find that it endogenously generates persistent drops in economic activity after tail events.
In this study, we develop and apply a new methodology for obtaining accurate and equitable property value assessments. This methodology adds a time dimension to the Geographically Weighted Regressions (GWR) framework; we call it Time-Geographically Weighted Regressions (TGWR). That is, when generating assessed values, we consider sales that are close in both time and space to the designated unit. We think this is an important improvement over GWR, since it increases the number of comparable sales that can be used to generate assessed values. Furthermore, units that sold at an earlier time but are spatially near the designated unit are likely to be closer in value than units that sold at a similar time but are farther away geographically, because location is such an important determinant of house value. We apply this new methodology to sales data for residential properties in 50 municipalities in Connecticut for 1994-2013 and 145 municipalities in Massachusetts for 1987-2012. This allows us to compare results over a long time period and across municipalities in two states. We find that TGWR performs better than OLS with fixed effects and leads to less regressive assessed values than OLS. In many cases, TGWR performs better than GWR that ignores the time dimension. In at least one specification, several suburban and rural towns meet the IAAO Coefficient of Dispersion cutoffs for acceptable accuracy.
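The core idea of weighting comparable sales by both spatial and temporal proximity can be sketched as follows. The Gaussian kernel, the bandwidths, the function names, and the synthetic data are illustrative assumptions for a sketch, not the estimator or data used in the study.

```python
import numpy as np

def tgwr_weights(coords, times, target_coord, target_time, h_space, h_time):
    """Kernel weights that decay with both spatial and temporal distance
    from the target unit (the property being assessed)."""
    d_space = np.linalg.norm(coords - target_coord, axis=1)
    d_time = np.abs(times - target_time)
    return np.exp(-0.5 * ((d_space / h_space) ** 2 + (d_time / h_time) ** 2))

def tgwr_fit(X, y, w):
    """Weighted least squares: coefficients local to the target unit,
    solved from the normal equations (X'WX) beta = X'Wy."""
    Xw = X * w[:, None]
    return np.linalg.solve(X.T @ Xw, Xw.T @ y)

# Synthetic sales: locations on a 10x10 grid (e.g., miles), sale dates over
# 20 years, and one hedonic characteristic besides the intercept.
rng = np.random.default_rng(1)
n = 300
coords = rng.uniform(0.0, 10.0, size=(n, 2))
times = rng.uniform(0.0, 20.0, size=n)
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([100.0, 25.0]) + rng.normal(scale=5.0, size=n)

# Local coefficients for a unit at the center of the grid, mid-sample in time.
w = tgwr_weights(coords, times,
                 target_coord=np.array([5.0, 5.0]), target_time=10.0,
                 h_space=2.0, h_time=3.0)
beta_local = tgwr_fit(X, y, w)
```

Setting `h_time` very large collapses the weights to ordinary GWR, which makes the comparison in the study concrete: the time dimension trades a larger effective pool of comparables against the staleness of older sales, governed by the temporal bandwidth.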
Using Lorenz-type curves, means tests, ordinary least squares, and locally weighted regressions (LWR), we examine the relative burdens of whites, blacks, and Hispanics in Georgia from road and air traffic noise. We find that whites bear less noise than either blacks or Hispanics and that blacks tend to experience more traffic noise than Hispanics. While every Metropolitan Statistical Area (MSA) showed that blacks experienced relatively more noise than average, such a result did not hold for Hispanics in roughly half of the MSAs. We find much heterogeneity across Census tracts using LWR. For most Census tracts, higher black and Hispanic population shares are associated with increased noise. However, 5.5 percent of the coefficients for blacks and 18.9 percent for Hispanics suggest larger population shares are associated with less noise. The noise LWR marginal effects for black populations across most tracts in the state are consistent with diminishing marginal noise from additional black population, while those in Atlanta exhibit diminishing marginal noise for Hispanics. In many regions of the state where the potential for health-damaging noise exists, we find relatively high disproportionality in noise experienced by the black and Hispanic populations compared to the rest of the overall population. Our findings underscore the importance of using nonparametric estimation approaches to unveil spatial heterogeneity in applied urban and housing economics analyses.
Should a central bank take over the provision of e-money, a circulable electronic liability? We discuss how e-money technology changes the tradeoff between public and private provision, and the tradeoff between e-money and a central bank's existing liabilities like bank notes and reserves. The tradeoffs depend on i) the technological setup of the e-money system (as a token or an account; centralized or decentralized); ii) the potential improvement in the implementation and transmission of monetary policy; iii) the risks to safety and privacy from cyber attacks; and iv) the uncertain impact on banks' efficiency and financial stability. The most compelling argument for central banks to issue e-money is to address competition problems in the banking sector.
Financial network structure is an important determinant of systemic risk. This paper examines how the U.S. interbank network evolved over a long and important period that included two key events: the founding of the Federal Reserve and the Great Depression. Banks established connections to correspondents that joined the Federal Reserve in cities with Fed offices, initially reducing overall network concentration. The network became even more focused on Fed cities during the Depression, as survival rates were higher for banks with more existing connections to Fed cities, and as survivors established new connections to those cities over time.
Liquidity shocks transmitted through interbank connections contributed to bank distress during the Great Depression. New data on interbank connections reveal that banks were much more likely to close when their correspondents closed. Further, after the Federal Reserve was established, banks’ management of cash and capital buffers was less responsive to network risk, suggesting that banks expected the Fed to reduce network risk. Because the Fed’s presence removed the incentives for the most systemically important banks to maintain capital and cash buffers that had protected against liquidity risk, it likely contributed to the banking system’s vulnerability to contagion during the Depression.