We investigate claims made in Giacomini and White (2006) and Diebold (2015) regarding the asymptotic normality of a test of equal predictive ability. A counterexample is provided in which, instead, the test statistic diverges with probability one under the null.
We present a monopolistic competition model to analyze the effects of own-nation and neighboring-nation terrorism on a nation’s imports. The theoretical analysis shows that own-nation terrorism may leave the relative price of imports unaffected, but neighboring-nation terrorism must raise the relative price, reducing imports. We find that a 10% increase in terrorist attacks in a neighboring nation reduces a country’s imports from the rest of the world by approximately $320 million, on average. Mediation analysis shows that trading delays are a potential channel through which the trade costs of terrorism are transmitted to a neighbor.
The proportion of multiple jobholders (moonlighters) is negatively correlated with productivity (wages) in cross-sectional and time series data, but positively correlated with education. We develop a model of the labor market to understand these seemingly contradictory facts. An income effect explains the negative correlation with productivity, while a comparative advantage of skilled workers explains the positive correlation with education. We provide empirical evidence of the comparative advantage in CPS data. We calibrate the model to 1994 data on multiple jobholdings and assess its ability to reproduce the 2017 data. There are three exogenous driving forces: productivity, the number of children, and the proportion of skilled workers. The model accounts for 68.7% of the moonlighting trend for college-educated workers and overpredicts it by 33.7% for high school-educated workers. Counterfactual experiments reveal the contribution of each exogenous variable.
We argue that the fiscal multiplier of government purchases is nonlinear in the spending shock, in contrast to what is assumed in most of the literature. In particular, the multiplier of a fiscal consolidation is decreasing in the size of the consolidation. We empirically document this fact using aggregate fiscal consolidation data across 15 OECD countries. We show that a neoclassical life-cycle, incomplete markets model calibrated to match key features of the U.S. economy can explain this empirical finding. The mechanism hinges on the relationship between fiscal shocks, their form of financing, and the response of labor supply across the wealth distribution. The model predicts that the aggregate labor supply elasticity is increasing in the fiscal shock, and this holds regardless of whether shocks are deficit- or balanced-budget financed. We find evidence of our mechanism in microdata for the US.
In this paper, we analyze the propagation of recessions across countries. We construct a model with multiple qualitative state variables that evolve in a VAR setting. The VAR structure allows us to include country-level variables to determine whether policy also propagates across countries. We consider two different versions of the model. One version assumes the discrete state of the economy (expansion or recession) is observed. The other assumes that the state of the economy is unobserved and must be inferred from movements in economic growth. We apply the model to Canada, Mexico, and the U.S. to test whether spillover effects were similar before and after NAFTA. We find that trade liberalization has increased the degree of business cycle propagation across the three countries.
We construct a dynamic general equilibrium model with occupation mobility, human capital accumulation, and endogenous assignment of workers to tasks to quantitatively assess the aggregate impact of automation and other task-biased technological innovations. We extend recent quantitative general equilibrium Roy models to a setting with dynamic occupational choices and human capital accumulation. We provide a set of conditions for the workers' problem to be written in recursive form and a sharp characterization of the optimal mobility of individual workers and of the aggregate supply of skills across occupations. We craft our dynamic Roy model in a production setting where multiple tasks within occupations are assigned to workers or machines. We solve for the balanced-growth path and characterize the aggregate transitional dynamics following task-biased technological innovations. In our quantitative analysis of the impact of task-biased innovations in the U.S. since 1980, we find that they account for an increase in aggregate output on the order of 75% and for a much higher dispersion in earnings. If the U.S. economy had had larger barriers to mobility, it would have experienced less job polarization but substantially higher inequality and lower output, as occupation mobility has provided an "escape" for the losers from automation.
The Lerner index is widely used to assess firms' market power. However, estimation and interpretation present several challenges, especially for banks, which tend to produce multiple outputs and operate with considerable inefficiency. We estimate Lerner indices for U.S. banks for 2001-18 using nonparametric estimators of the underlying cost and profit functions, controlling for inefficiency, and incorporating banks' off-balance-sheet activities. We find that mis-specification of cost or profit functional forms can seriously bias Lerner index estimates, as can failure to account for inefficiency and off-balance-sheet output.
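The index itself is the standard markup formula L = (P - MC)/P. A minimal sketch, with purely illustrative prices and marginal costs rather than values from the banking data above:

```python
# Minimal sketch of the Lerner index, L = (P - MC) / P.
# The inputs below are illustrative numbers, not estimates from the paper,
# which recovers marginal cost from nonparametric cost/profit functions.

def lerner_index(price, marginal_cost):
    """Markup of price over marginal cost, as a share of price."""
    return (price - marginal_cost) / price

# A bank pricing near marginal cost vs. one with substantial market power.
print(lerner_index(1.00, 0.95))  # ~0.05: little market power
print(lerner_index(1.00, 0.60))  # ~0.40: substantial market power
```

The abstract's point is that the marginal-cost input is the hard part: a mis-specified cost function, or one that ignores inefficiency and off-balance-sheet output, biases MC and hence L.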
In this note we use simple examples and associated simulations to investigate the size and power properties of tests of predictive ability described in Giacomini and White (2006; Econometrica). While we find that the tests can be accurately sized and powerful in large enough samples, we identify details associated with the tests that are not otherwise apparent from the original text. In order of importance, these include: (i) the proposed test of equal finite-sample unconditional predictive ability is not asymptotically valid under the fixed scheme; (ii) for the same test, when the rolling scheme is used, very large bandwidths are sometimes required when estimating long-run variances; and (iii) when conducting the proposed test of equal finite-sample conditional predictive ability, conditional heteroskedasticity is likely present when lagged loss differentials are used as instruments.
We study the comovement of international business cycles in a time series clustering model with regime-switching. We extend the framework of Hamilton and Owyang (2012) to include time-varying transition probabilities to determine what drives similarities in business cycle turning points. We find four groups, or "clusters", of countries that experience idiosyncratic recessions relative to the global cycle. Additionally, we find that the primary indicators of international recessions are fluctuations in equity markets and geopolitical uncertainty. In out-of-sample forecasting exercises, we find that our model improves on standard benchmark models for forecasting both aggregate output growth and country-level recessions.
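As a sketch of what a time-varying transition probability looks like, the snippet below maps the indicators named above into the probability of remaining in expansion through a logistic link. The coefficients and functional form are illustrative assumptions, not estimates from the model:

```python
import math

# Illustrative time-varying transition probability for a regime-switching
# model: the chance of staying in expansion depends on covariates via a
# logistic link. Coefficients b0, b1, b2 are made-up values for illustration.

def stay_in_expansion_prob(equity_return, geopolitical_uncertainty,
                           b0=2.0, b1=5.0, b2=-3.0):
    """Logistic transition probability driven by the two indicators."""
    z = b0 + b1 * equity_return + b2 * geopolitical_uncertainty
    return 1.0 / (1.0 + math.exp(-z))

# Falling equity markets and rising uncertainty lower the probability of
# remaining in expansion, pushing the cluster toward recession.
print(stay_in_expansion_prob(0.05, 0.1))    # calm times: high probability
print(stay_in_expansion_prob(-0.20, 0.8))   # stress: much lower probability
```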
We study nominal GDP targeting as optimal monetary policy in a simple and stylized model with a credit market friction. The macroeconomy we study has considerable income inequality, which gives rise to a large private sector credit market. There is an important credit market friction because households participating in the credit market use non-state contingent nominal contracts (NSCNC). We extend previous results in this model by allowing for substantial intra-cohort heterogeneity. The heterogeneity is substantial enough that we can approach measured Gini coefficients for income, financial wealth, and consumption in the U.S. data. We show that nominal GDP targeting continues to characterize optimal monetary policy in this setting. Optimal monetary policy repairs the distortion caused by the credit market friction and so leaves heterogeneous households supplying their desired amount of labor, a type of "divine coincidence" result. We also further characterize monetary policy in terms of nominal interest rate adjustment.
What are the quantitative macroeconomic effects of the countercyclical capital buffer (CCyB)? I study this question in a nonlinear DSGE model with occasional financial crises, which is calibrated and combined with US data to estimate sequences of structural shocks. Raising capital buffers during leverage expansions can reduce the frequency of crises by more than half. A quantitative application to the 2007-08 financial crisis shows that the CCyB in the 2.5% range (as in the Federal Reserve's current framework) could have greatly mitigated the financial panic of 2008, for a cumulative gain of 29% in aggregate consumption. The threat of raising capital requirements is effective even if this tool is not used in equilibrium.
We build a tractable heterogeneous-agent incomplete-markets model with quasi-linear preferences to address a set of long-standing issues in the optimal Ramsey taxation literature. The tractability of our model enables us to analytically prove the existence of a Ramsey steady state and establish several novel results: (i) In the absence of any redistributional effects of capital taxation or lump-sum transfers, the optimal capital tax is exclusively zero in a Ramsey steady state, regardless of the modified golden rule (MGR) and government debt limits. (ii) Whether the MGR holds or not depends critically on the government's capacity to issue debt but has no bearing on the planner's long-run capital tax scheme. (iii) The optimal debt-to-GDP ratio, however, is determined by a positive wedge times the MGR saving rate; the wedge is decreasing in the strength of individuals' self-insurance positions and approaches zero when the idiosyncratic risk vanishes or markets are complete. (iv) The assumption of the existence of a Ramsey steady state commonly made in the existing literature is not innocuous: When a Ramsey steady state does not exist but is erroneously assumed to exist, the MGR always appears to "hold" and the implied "optimal" long-run capital tax is strictly positive. (v) Along the transition path toward a Ramsey steady state, the optimal capital tax depends positively on the elasticity of intertemporal substitution. The key insight behind our results is that in the absence of any redistributional effects, taxing capital in the steady state permanently hinders individuals' self-insurance positions, so the Ramsey planner opts to issue debt rather than impose a steady-state capital tax to correct the capital-overaccumulation problem. However, if the demand for debt approaches infinity as the interest rate approaches the time discount rate, a Ramsey steady state may not exist; thus, the MGR can fail to hold in a Ramsey equilibrium whenever the government encounters a binding debt limit.
The Great Recession was a deep downturn with long-lasting effects on credit, employment, and output. While narratives about its causes abound, the persistence of GDP below pre-crisis trends remains puzzling. We propose a simple persistence mechanism that can be quantified and combined with existing models. Our key premise is that agents do not know the true distribution of shocks but use data to estimate it non-parametrically. Transitory events, especially extreme ones, then generate persistent changes in beliefs and macro outcomes. Embedding this mechanism in a neoclassical model, we find that it endogenously generates persistent drops in economic activity after tail events.
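The premise can be illustrated with a simple kernel density estimate: a single extreme observation raises the perceived likelihood of tail events, and the revision fades only slowly as calm data accumulate. The code below is an illustrative sketch, not the authors' quantification:

```python
import math

# Sketch of the belief mechanism: agents estimate the shock distribution
# non-parametrically, so one tail event persistently shifts beliefs.
# All data and the bandwidth are assumptions for illustration.

def gaussian_kernel_density(data, x, bandwidth):
    """Kernel density estimate of the shock distribution at point x."""
    n = len(data)
    return sum(math.exp(-0.5 * ((x - d) / bandwidth) ** 2) for d in data) / (
        n * bandwidth * math.sqrt(2 * math.pi))

# A calm history of small shocks, then one extreme negative shock.
history = [0.01 * ((-1) ** i) for i in range(50)]    # small shocks near zero
density_before = gaussian_kernel_density(history, -0.2, 0.05)
history.append(-0.2)                                  # the tail event
density_after = gaussian_kernel_density(history, -0.2, 0.05)

# The estimated tail density jumps after one observation and stays elevated
# until the event is diluted by many more calm observations.
print(density_before < density_after)  # True
```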
In this study, we develop and apply a new methodology for obtaining accurate and equitable property value assessments. This methodology adds a time dimension to the Geographically Weighted Regressions (GWR) framework, which we call Time-Geographically Weighted Regressions (TGWR). That is, when generating assessed values, we consider sales that are close in both time and space to the designated unit. We think this is an important improvement over GWR, since it increases the number of comparable sales that can be used to generate assessed values. Furthermore, units that sold at an earlier time but are spatially near the designated unit are likely to be closer in value than units that sold at a similar time but are farther away geographically, because location is such an important determinant of house value. We apply this new methodology to sales data for residential properties in 50 municipalities in Connecticut for 1994-2013 and 145 municipalities in Massachusetts for 1987-2012. This allows us to compare results over a long time period and across municipalities in two states. We find that TGWR performs better than OLS with fixed effects and leads to less regressive assessed values than OLS. In many cases, TGWR performs better than GWR that ignores the time dimension. In at least one specification, several suburban and rural towns meet the IAAO Coefficient of Dispersion cutoffs for acceptable accuracy.
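The core TGWR idea, down-weighting comparable sales by both geographic distance and time elapsed, can be sketched as a product of kernels; the Gaussian forms and bandwidths below are assumptions for illustration, not the paper's specification:

```python
import math

# Illustrative space-time weight behind TGWR: each comparable sale is
# down-weighted both by geographic distance and by time elapsed, and the
# resulting weights would enter a weighted regression. Kernel shapes and
# bandwidths (h_space, h_time) are hypothetical choices, not the paper's.

def tgwr_weight(dist_km, years_apart, h_space=5.0, h_time=2.0):
    """Gaussian kernel in space times a Gaussian kernel in time."""
    w_space = math.exp(-0.5 * (dist_km / h_space) ** 2)
    w_time = math.exp(-0.5 * (years_apart / h_time) ** 2)
    return w_space * w_time

# A nearby sale from three years ago can outweigh a distant contemporaneous
# one, matching the abstract's point that location dominates house value.
near_but_old = tgwr_weight(dist_km=1.0, years_apart=3.0)
far_but_recent = tgwr_weight(dist_km=10.0, years_apart=0.0)
print(near_but_old > far_but_recent)  # True
```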
Using Lorenz-type curves, means tests, ordinary least squares, and locally weighted regressions (LWR), we examine the relative burdens of whites, blacks, and Hispanics in Georgia from road and air traffic noise. We find that whites bear less noise than either blacks or Hispanics and that blacks tend to experience more traffic noise than Hispanics. While every Metropolitan Statistical Area (MSA) showed that blacks experienced relatively more noise than average, such a result did not hold for Hispanics in roughly half of the MSAs. We find much heterogeneity across Census tracts using LWR. For most Census tracts, higher black and Hispanic population shares are associated with increased noise. However, 5.5 percent of the coefficients for blacks and 18.9 percent for Hispanics suggest larger population shares are associated with less noise. The noise LWR marginal effects for black populations across most tracts in the state are consistent with diminishing marginal noise from additional black population, while those in Atlanta exhibit diminishing marginal noise for Hispanics. In many regions of the state where the potential for health-damaging noise exists, we find relatively high disproportionality in noise experienced by the black and Hispanic populations compared to the rest of the population. Our findings underscore the importance of using nonparametric estimation approaches to unveil spatial heterogeneity in applied urban and housing economics analyses.
Should a central bank take over the provision of e-money, a circulable electronic liability? We discuss how e-money technology changes the tradeoff between public and private provision, and the tradeoff between e-money and a central bank's existing liabilities like bank notes and reserves. The tradeoffs depend on i) the technological setup of the e-money system (as a token or an account; centralized or decentralized); ii) the potential improvement in the implementation and transmission of monetary policy; iii) the risks to safety and privacy from cyber attacks; and iv) the uncertain impact on banks' efficiency and financial stability. The most compelling argument for central banks to issue e-money is to address competition problems in the banking sector.
Financial network structure is an important determinant of systemic risk. This paper examines how the U.S. interbank network evolved over a long and important period that included two key events: the founding of the Federal Reserve and the Great Depression. Banks established connections to correspondents that joined the Federal Reserve in cities with Fed offices, initially reducing overall network concentration. The network became even more focused on Fed cities during the Depression, as survival rates were higher for banks with more existing connections to Fed cities, and as survivors established new connections to those cities over time.
Liquidity shocks transmitted through interbank connections contributed to bank distress during the Great Depression. New data on interbank connections reveal that banks were much more likely to close when their correspondents closed. Further, after the Federal Reserve was established, banks’ management of cash and capital buffers was less responsive to network risk, suggesting that banks expected the Fed to reduce network risk. Because the Fed’s presence removed the incentives for the most systemically important banks to maintain capital and cash buffers that had protected against liquidity risk, it likely contributed to the banking system’s vulnerability to contagion during the Depression.
We examine international stock return comovements of country-industry portfolios. Our model allows comovements to be driven by a global and a cluster component, with the cluster membership endogenously determined. Results indicate that country-industry portfolios tend to cluster mainly within geographical areas that can include one or more countries. The cluster compositions substantially changed over time, with the emergence of clusters among European countries from the early 2000s. The cluster component was the main driver of country-industry portfolio returns for most of the sample, except from the mid-2000s to the mid-2010s when the global component had a more prominent role.
The entry of baby boomers into the labor market in the 1970s slowed the growth of physical and human capital per worker because young workers have little of both. Thus, the baby boom could have contributed to the 1970s productivity slowdown. I build and calibrate a model à la Huggett et al. (2011) with exogenous population and TFP to evaluate this theory. The baby boom accounts for 75% of the slowdown in the period 1964-69, 25% in 1970-74, and 2% in 1975-79. The retiring of baby boomers may cause a 2.8 percentage point decline in productivity growth between 2020 and 2040, ceteris paribus.
We consider the effects of uncertainty shocks in a nonlinear VAR that allows uncertainty to have amplification effects. When uncertainty is relatively low, fluctuations in uncertainty have small, linear effects. In periods of high uncertainty, the effect of a further increase in uncertainty is magnified. We find that uncertainty shocks in this environment have a more pronounced effect on real economic variables. We also conduct counterfactual experiments to determine the channels through which uncertainty acts. Uncertainty propagates both through the household consumption channel and through businesses delaying investment, contributing substantially to the decline in GDP observed after uncertainty shocks. Finally, we find evidence of the ability of systematic monetary policy to mitigate the adverse effects of uncertainty shocks.
We study the endogenous determination of corporate debt maturity in a setting with default risk. We assume that firms must access the bond market and issue debt with a flexible structure (coupon, face value, and maturity). Initially, the firm is in a low-growth/illiquid state that requires debt refinancing when the debt matures. Since lenders do not refinance projects with positive but small net present value, firms may be forced to default in this first phase; we call this liquidity risk. The technology is such that earnings can switch to a higher (but riskier) level. In this second phase, firms have access to the equity market but may default if this is the best option; we call this strategic default risk. In the model, optimal maturity balances these two risks. We show that firms with poor prospects and firms in more unstable industries will choose shorter maturities even when it is feasible to issue longer debt. The model also offers predictions on how asset maturity, asset salability, and leverage influence maturity. Even though our model is extremely stylized, we find that its predictions are roughly consistent with the evidence. Moreover, it offers some insights into the factors that determine the structure of debt.
Leading into a debt crisis, interest rate spreads on sovereign debt rise before the economy experiences a decline in productivity, suggesting that news about future economic developments may play an important role in these episodes. In a VAR estimation, a news shock has a larger contemporaneous impact on sovereign credit spreads than a comparable shock to labor productivity. A quantitative model of news and sovereign debt default with endogenous maturity choice generates impulse responses and a variance decomposition similar to the empirical VAR estimates. The dynamics of the economy after a bad news shock share some features of a productivity shock and others of sudden stop events. However, unlike episodes of sudden stops, long-term debt does not shield the country from bad news shocks, and it may even exacerbate default risk. Finally, an increase in the precision of news allows the government to improve its debt maturity management, especially during periods of high stress in credit markets, and thus face lower yield spreads while increasing the amount of debt.
In this paper we document stylized facts about the relationship between international oil price swings, sovereign risk, and the macroeconomic performance of oil-exporting economies. We show that even though being a bigger oil producer decreases sovereign risk (because it increases a country’s ability to repay), having more oil reserves increases sovereign risk by making autarky more attractive. We develop a small open economy model of sovereign risk with incomplete international financial markets, in which optimal oil extraction and sovereign default interact. We use the model to understand the mechanisms behind the empirical facts and show that it supports them.
This paper quantifies the long-run effects of reducing capital gains taxes on aggregate investment. We develop a dynamic general equilibrium model with heterogeneous firms, which face discrete capital gains tax rates based on firm size. We calibrate our model by targeting micro moments and a difference-in-differences estimate of the capital stock response based on the institutional setting and policy reform in Korea. We find that the reform that reduced the capital gains tax rates for a subset of firms substantially increased investment in the short run, and capturing general equilibrium price responses is important to quantify the long-run aggregate outcomes.
This paper proposes an equilibrium model to explain the positive and sizable term premia observed in the data. We introduce a slow mean-reverting process for consumption growth and a segmented asset market mechanism with heterogeneous trading technology into an otherwise standard heterogeneous-agent general equilibrium model. First, the slow mean-reverting consumption growth process implies that the expected consumption growth rate is only slightly countercyclical, and the process can exhibit a near-zero first-order autocorrelation, as seen in the data. The very small countercyclicality of the expected consumption growth rate implies that long-term bonds are risky and hence that term premia are positive. Second, the segmented asset market mechanism amplifies the magnitude of the term premia, since the aggregate risk is concentrated in a small fraction of marginal traders who demand high risk premia. For sensitivity analysis, the role of each assumption is further investigated by removing each factor one by one.
To study long-run large-scale early childhood policies, this paper incorporates early childhood investments into a standard general-equilibrium (GE) heterogeneous-agent overlapping-generations model. After estimating it using US data, we show that an RCT evaluation of a short-run small-scale early childhood program in the model predicts effects on children's education and income that are similar to the empirical evidence. A long-run large-scale program, however, yields welfare gains that are twice as large, even after considering GE and taxation effects. Key to this difference is that investing in a child not only improves her skills but also creates a better parent for the next generation.
We study the impact of research collaborations in coauthorship networks on research output and how optimal funding can maximize it. Through the links in the collaboration network, researchers create spillovers not only to their direct coauthors but also to researchers indirectly linked to them. We characterize the equilibrium when agents collaborate in multiple and possibly overlapping projects. We bring our model to the data by analyzing the coauthorship network of economists registered in the RePEc Author Service. We rank the authors and research institutions according to their contribution to the aggregate research output and thus provide a novel ranking measure that explicitly takes into account the spillover effect generated in the coauthorship network. Moreover, we analyze funding instruments for individual researchers as well as research institutions and compare them with the economics funding program of the National Science Foundation. Our results indicate that, because current funding schemes do not take into account the availability of coauthorship network data, they are ill-designed to take advantage of the spillover effects generated in scientific knowledge production networks.
A wide range of heterodox theories claim that banks are special because they create money in the act of lending. Put another way, banks can create the funding they need ex nihilo, whereas all other agencies must first acquire the funding they need from other parties. Mainstream economic theory largely agrees with this assessment, but questions its theoretical and empirical relevance, preferring to view banks as one of many potentially important actors in the financial market. In this paper, I develop a formal economic model in an attempt to make these ideas precise. The model lends some support to both views on banking.