This paper studies the synchronization of financial cycles across 17 advanced economies over the past 150 years. The comovement in credit, house prices, and equity prices has reached historical highs in the past three decades. The sharp increase in the comovement of global equity markets is particularly notable. We demonstrate that fluctuations in risk premiums, and not risk-free rates and dividends, account for a large part of the observed equity price synchronization after 1990. We also show that U.S. monetary policy has come to play an important role as a source of fluctuations in risk appetite across global equity markets. These fluctuations are transmitted across both fixed and floating exchange rate regimes, but the effects are more muted in floating rate regimes.
This paper answers fundamental questions that have preoccupied modern economic thought since the 18th century. What is the aggregate real rate of return in the economy? Is it higher than the growth rate of the economy and, if so, by how much? Is there a tendency for returns to fall in the long-run? Which particular assets have the highest long-run returns? We answer these questions on the basis of a new and comprehensive dataset for all major asset classes, including—for the first time—total returns to the largest, but oft-ignored, component of household wealth, housing. The annual data on total returns for equity, housing, bonds, and bills cover 16 advanced economies from 1870 to 2015, and our new evidence reveals many insights and puzzles.
Higher capital ratios are unlikely to prevent a financial crisis. This is empirically true both for the entire history of advanced economies between 1870 and 2013 and for the post-WW2 period, and holds both within and between countries. We reach this startling conclusion using newly collected data on the liability side of banks’ balance sheets in 17 countries. As a solvency indicator, the capital ratio has no value as a crisis predictor; but we find that liquidity indicators such as the loan-to-deposit ratio and the share of non-deposit funding do signal financial fragility, although they add little predictive power relative to that of credit growth on the asset side of the balance sheet. However, higher capital buffers have social benefits in terms of macro-stability: recoveries from financial crisis recessions are much quicker with higher bank capital.
The trilemma of international finance explains why interest rates in countries that fix their exchange rates and allow unfettered cross-border capital flows are largely outside the monetary authority’s control. Using historical panel data since 1870 and exploiting the trilemma mechanism to construct an external instrument for exogenous monetary policy fluctuations, we show that monetary interventions have very different causal impacts, and hence implied inflation-output trade-offs, according to whether: (1) the economy is operating above or below potential; (2) inflation is low, thereby bringing nominal rates closer to the zero lower bound; and (3) there is a credit boom in mortgage markets. We use several adjustments to account for potential spillover effects including a novel control function approach. The results have important implications for monetary policy.
In advanced economies, a century-long near-stable ratio of credit to GDP gave way to rapid financialization and surging leverage in the last forty years. This “financial hockey stick” coincides with shifts in foundational macroeconomic relationships beyond the widely-noted return of macroeconomic fragility and crisis risk. Leverage is correlated with central business cycle moments, which we can document thanks to a decade-long international and historical data collection effort. More financialized economies exhibit somewhat less real volatility, but also lower growth, more tail risk, as well as tighter real-real and real-financial correlations. International real and financial cycles also cohere more strongly. The new stylized facts that we discover should prove fertile ground for the development of a new generation of macroeconomic models with a prominent role for financial factors.
The manner in which firms respond to shocks reflects fundamental features of labor, capital, and commodity markets, as well as advances in finance and technology. Such features are integral to constructing models of the macroeconomy. In this paper we document secular shifts in the margins firms use, in aggregate, to adjust to shocks that have consequences for the economy’s cyclical behavior. These new business cycle facts on the comovement of output and its inputs are a natural complement to analyzing output and its expenditure components. Our findings shed light on the changing cyclicality of productivity in response to different shocks.
This paper introduces new nonparametric statistical methods to evaluate zero-cost investment strategies. We focus on directional trading strategies, risk-adjusted returns, and the investor’s decisions under uncertainty as the core of our analysis. By relying on classification tools with a long tradition in the sciences and biostatistics, we can provide a tighter connection between model-based risk characteristics and the no-arbitrage conditions for market efficiency. Moreover, we extend the methods to multicategorical settings, such as when the investor can sometimes take a neutral position. A variety of inferential procedures are provided, many of which are illustrated with applications to excess equity returns and to currency carry trades.
Published Articles (Refereed Journals and Volumes)
Semiparametric estimates of monetary policy effects: string theory revisited
Forthcoming in Journal of Business and Economic Statistics | With Angrist and Kuersteiner
The Time for Austerity: Estimating the Average Treatment Effect of Fiscal Policy
Economic Journal 126(590), February 2016, 219-255 | With Taylor
After the Global Financial Crisis, a controversial rush to fiscal austerity followed in many countries. Yet research on the effects of austerity on macroeconomic aggregates was and still is unsettled, mired in the difficulty of identifying multipliers from observational data. This article reconciles seemingly disparate estimates of multipliers within a unified and state-contingent framework. We achieve identification of causal effects with new propensity-score based methods for time series data. Using this novel approach, we show that austerity is always a drag on growth, and especially so in depressed economies: a 1% of GDP fiscal consolidation translates into a loss of 3.5% of real GDP over five years when implemented in a slump, rather than just 1.8% in a boom.
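The propensity-score logic behind this identification strategy can be sketched in a few lines. The following is a minimal illustrative example with simulated data, not the paper's estimator or data: a confounder drives both the probability of "treatment" (a fiscal consolidation) and the outcome, and inverse propensity weighting recovers the true average treatment effect; all variable names and parameter values here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000
x = rng.standard_normal(n)                      # confounder (e.g., state of the cycle)
p = 1.0 / (1.0 + np.exp(-x))                    # true propensity to consolidate
d = (rng.random(n) < p).astype(float)           # treatment indicator
y = -2.0 * d + x + rng.standard_normal(n)       # outcome; true treatment effect = -2

# Inverse propensity weighting: reweight each observation by the inverse
# probability of the treatment it actually received, so treated and control
# groups become comparable despite the confounder x.
ate = np.mean(d * y / p) - np.mean((1.0 - d) * y / (1.0 - p))
```

A naive difference in means here would be biased because consolidations happen more often when x is high; the weighting removes that selection. In practice the propensities are unknown and must themselves be estimated, e.g., by a logit first stage.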
This paper unveils a new resource for macroeconomic research: A long-run dataset covering disaggregated bank credit for 17 advanced economies since 1870. The new data show that the share of mortgages on banks’ balance sheets doubled in the course of the twentieth century, driven by a sharp rise of mortgage lending to households. Household debt to asset ratios have risen substantially in many countries. Financial stability risks have been increasingly linked to real estate lending booms, which are typically followed by deeper recessions and slower recoveries. Housing finance has come to play a central role in the modern macroeconomy.
Journal of Monetary Economics 76, December 2015, S1-S20 | With Schularick and Taylor
What risks do asset price bubbles pose for the economy? This paper studies bubbles in housing and equity markets in 17 countries over the past 140 years. History shows that not all bubbles are alike. Some have enormous costs for the economy, while others blow over. We demonstrate that what makes some bubbles more dangerous than others is credit. When fueled by credit booms, asset price bubbles increase financial crisis risks; upon collapse they tend to be followed by deeper recessions and slower recoveries. Credit-financed housing price bubbles have emerged as a particularly dangerous phenomenon.
This paper provides a historical overview of financial crises and their origins. The objective is to discuss a few of the modern statistical methods that can be used to evaluate predictors of these rare events. The problem involves the prediction of binary events, and therefore fits modern statistical learning, signal processing theory, and classification methods. The discussion also emphasizes the need for statistics and computational techniques to be supplemented with economics. The success of a forecast in this environment hinges on the economic consequences of the actions taken as a result of the forecast, rather than on typical statistical metrics of prediction accuracy.
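The classification perspective described above is commonly summarized with signal-detection statistics such as the area under the ROC curve. The following is a generic illustrative sketch on simulated data, not the paper's application: the AUC equals the probability that a randomly chosen crisis observation receives a higher predictor score than a randomly chosen non-crisis observation.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 2000
crisis = (rng.random(n) < 0.1).astype(int)      # rare binary event
# Hypothetical predictor: informative but noisy, shifted up in crisis periods
signal = np.where(crisis == 1, rng.normal(1.0, 1.0, n), rng.normal(0.0, 1.0, n))

def roc_auc(y, s):
    """AUC = probability that a random positive outranks a random negative
    (ties counted as one half), computed by direct pairwise comparison."""
    pos, neg = s[y == 1], s[y == 0]
    greater = (pos[:, None] > neg[None, :]).mean()
    ties = (pos[:, None] == neg[None, :]).mean()
    return greater + 0.5 * ties

auc = roc_auc(crisis, signal)
```

As the abstract stresses, a high AUC alone is not the end of the story: the value of a crisis forecast depends on the economic costs of false alarms versus missed crises, which a purely statistical metric does not capture.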
What do the behavior of monkeys in captivity and the financial system have in common? The nodes in such social systems relate to each other through multiple networks, including a keystone network, rather than just one network. Each network in the system has its own topology, and the interactions among the system’s networks change over time. In such systems, the lead-in to a crisis appears to be characterized by a decoupling of the networks from the keystone network. This decoupling can also be seen in the crumbling of the keystone’s power structure toward a more horizontal hierarchy. This paper develops nonparametric methods for describing the joint model of the latent architecture of interconnected networks in order to describe this process of decoupling, and hence provide an early warning system of an impending crisis.
This note examines labor market performance across countries through the lens of Okun’s Law. We find that after the 1970s but prior to the global financial crisis of the 2000s, the Okun’s Law relationship between output and unemployment became more homogeneous across countries. These changes presumably reflected institutional and technological changes. But, at least in the short term, the global financial crisis undid much of this convergence, in part because the affected countries adopted different labor market policies in response to the global demand shock.
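For readers unfamiliar with it, the Okun's Law relationship is just a regression of the change in the unemployment rate on real GDP growth. The following is a minimal sketch on simulated data, not the note's cross-country dataset; the coefficient values are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
g = rng.normal(2.5, 2.0, 200)                       # real GDP growth, percent
# Simulated Okun relationship: du_t = a + b * g_t + e_t with a "true" b of -0.4
du = 1.0 - 0.4 * g + rng.normal(0.0, 0.3, 200)      # change in unemployment rate

# OLS estimate of the Okun coefficient b (expected to be negative:
# faster growth lowers unemployment)
X = np.column_stack([np.ones_like(g), g])
a_hat, b_hat = np.linalg.lstsq(X, du, rcond=None)[0]
```

Cross-country convergence of the kind the note documents would show up as these estimated slopes clustering around a common value across countries over a given sample period.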
This paper investigates the problem of constructing prediction regions for forecast trajectories 1 to H periods into the future, that is, a path forecast. When the null model is only approximate, or completely unavailable, one can neither derive the usual analytic expressions nor resample from the null model. In this context, this paper derives a method for constructing approximate rectangular regions for simultaneous probability coverage that correct for serial correlation in the case of elliptical distributions. In both Monte Carlo studies and an empirical application to the Greenbook path forecasts of growth and inflation, the performance of this method is compared to the performances of the Bonferroni approach and the approach that ignores simultaneity.
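The simultaneity problem the paper addresses is easy to demonstrate numerically. The sketch below assumes independent forecast errors purely for simplicity (the paper's contribution is precisely the correction for serial correlation, which this toy example omits): pointwise 95% bands cover the whole H-step path far less than 95% of the time, while a Bonferroni adjustment restores, conservatively, joint coverage.

```python
import numpy as np

rng = np.random.default_rng(5)
H, trials = 8, 20000
z_point = 1.960    # 97.5% standard normal quantile: pointwise 95% band
z_bonf = 2.734     # 1 - 0.05/(2*8) quantile: Bonferroni-adjusted band

# Simulate standardized forecast errors for the whole path (independence
# across horizons is an assumption of this sketch, not of the paper)
errs = rng.standard_normal((trials, H))

# Fraction of trials in which the ENTIRE path stays inside the band
cover_point = (np.abs(errs) < z_point).all(axis=1).mean()
cover_bonf = (np.abs(errs) < z_bonf).all(axis=1).mean()
```

With H = 8 independent horizons, the pointwise band's joint coverage is roughly 0.95^8 (about two thirds), which is why a path forecast requires simultaneous regions rather than a collection of marginal bands.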
This paper codifies in a systematic and transparent way a historical chronology of business cycle turning points for Spain reaching back to 1850 at annual frequency, and 1939 at monthly frequency. Such an exercise would be incomplete without assessing the new chronology itself and against others—this we do with modern statistical tools of signal detection theory. We also use these tools to determine which of several existing economic activity indexes provide a better signal on the underlying state of the economy. We conclude by evaluating candidate leading indicators and hence construct recession probability forecasts up to 12 months in the future.
This paper studies the role of credit in the business cycle, with a focus on private credit overhang. Based on a study of the universe of over 200 recession episodes in 14 advanced countries between 1870 and 2008, we document two key facts of the modern business cycle: financial-crisis recessions are more costly than normal recessions in terms of lost output; and for both types of recession, more credit-intensive expansions tend to be followed by deeper recessions and slower recoveries. In addition to unconditional analysis, we use local projection methods to condition on a broad set of macroeconomic controls and their lags. Then we study how past credit accumulation impacts the behavior of not only output but also other key macroeconomic variables such as investment, lending, interest rates, and inflation. The facts that we uncover lend support to the idea that financial factors play an important role in the modern business cycle.
The carry trade is the investment strategy of going long in high-yield target currencies and short in low-yield funding currencies. Recently, this naive trade has seen very high returns for long periods, followed by large crash losses after large depreciations of the target currencies. Based on low Sharpe ratios and negative skew, these trades could appear unattractive, even when diversified across many currencies. But more sophisticated conditional trading strategies exhibit more favorable payoffs. We apply novel (within economics) binary-outcome classification tests to show that our directional trading forecasts are informative, and out-of-sample loss-function analysis to examine trading performance.
The critical conditioning variable, we argue, is the fundamental equilibrium exchange rate (FEER). Expected returns are lower, all else equal, when the target currency is overvalued. Like traders, researchers should incorporate this information when evaluating trading strategies. When we do so, some questions are resolved: negative skewness is purged, and market volatility (VIX) is uncorrelated with returns; other puzzles remain: the more sophisticated strategy has a very high Sharpe ratio, suggesting market inefficiency.
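The payoff profile of the naive carry trade described above, high average returns punctuated by occasional crashes, can be reproduced with a toy simulation. The sketch below is purely illustrative and uses made-up parameters, not the paper's currency data: returns are the interest differential plus the exchange rate move, with a small probability of a large target-currency depreciation.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
rate_diff = 0.02 + rng.normal(0.0, 0.005, n)     # target-minus-funding rate differential
crash = rng.random(n) < 0.05                     # rare large depreciation of the target
fx_move = np.where(crash,
                   rng.normal(-0.10, 0.03, n),   # crash: big loss on the long position
                   rng.normal(0.005, 0.02, n))   # normal times: mild appreciation
ret = rate_diff + fx_move                        # carry trade excess return

# The two summary statistics the abstract emphasizes
sharpe = ret.mean() / ret.std()
skew = ((ret - ret.mean()) ** 3).mean() / ret.std() ** 3
```

The simulated series earns a positive Sharpe ratio but is strongly negatively skewed, the "picking up nickels in front of a steamroller" pattern that conditioning on fundamentals such as the FEER is meant to mitigate.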
Frictions and perturbations may influence currency values in the short run, but it is generally acknowledged that real exchange rates eventually settle toward equilibrium. The puzzle then is how gradually this parity is reached given the fluidity of foreign exchange markets. Persistent differences in the relative productivity of countries—a broad characterization of the Harrod–Balassa–Samuelson hypothesis—may help explain this puzzle. This article introduces methods to estimate equilibrium adjustment paths semiparametrically, and then sorts out how each of these components influences the dynamics of exchange rates. This is done in a dynamic panel setting by introducing novel local projection methods for cointegrated systems. Productivity shocks affect dynamics, and after adjusting for these factors, adjustment toward equilibrium is relatively rapid.
Do external imbalances increase the risk of financial crises? This paper studies the experience of 14 developed countries over 140 years (1870-2008). It exploits the long-run data set in a number of different ways. First, the paper applies new statistical tools to describe the temporal and spatial patterns of crises and identifies five episodes of global financial instability in the past 140 years. Second, it studies the macroeconomic dynamics before crises and shows that credit growth tends to be elevated and short-term interest rates depressed relative to the “natural rate” in the run-up to global financial crises. Third, the paper shows that recessions associated with crises lead to deeper slumps and stronger turnarounds in imbalances than during normal recessions. Finally, the paper asks to what extent external imbalances help predict financial crises. The overall result is that credit growth emerges as the single best predictor of financial instability. External imbalances have played an additional role, but more so in the pre-WWII era of low financialization than today.
The stability of the solution path in a macroeconomic model implies that it admits a Wold representation. This Wold representation can be estimated semiparametrically by local projections and used to estimate the model’s parameters by minimum distance techniques even when the stochastic process for the solution path is unknown or unconventional. We name this two-step estimation procedure “projection minimum distance” and investigate its statistical properties for the broad class of models where the mapping between Wold coefficients and parameters is linear. This includes many situations with likelihood score functions nonlinear in the parameters that would otherwise require numerical optimization routines.
The Business Cycle Dating Committee of the National Bureau of Economic Research provides a historical chronology of business cycle turning points. We investigate three central aspects of this chronology. How skillful is the Dating Committee when classifying economic activity into expansions and recessions? Which indices of economic conditions best capture the current but unobservable state of the business cycle? And which indicators best predict future turning points, and at what horizons? We answer each of these questions in detail using methods specifically designed to assess classification ability. In the process, we clarify several important features of the chronology.
A path forecast refers to the sequence of forecasts 1 to H periods into the future. A summary of the range of possible paths the predicted variable may follow for a given confidence level requires construction of simultaneous confidence regions that adjust for any covariance between the elements of the path forecast. This paper shows how to construct such regions with the joint predictive density and Scheffé’s (1953) S-method. In addition, the joint predictive density can be used to construct simple statistics to evaluate the local internal consistency of a forecasting exercise of a system of variables. Monte Carlo simulations demonstrate that these simultaneous confidence regions provide approximately correct coverage in situations where traditional error bands, based on the collection of marginal predictive densities for each horizon, are vastly off the mark. The paper showcases these methods with an application to the most recent monetary episode of interest rate hikes in the U.S. macroeconomy.
Inference about an impulse response is a multiple testing problem with serially correlated coefficient estimates. This paper provides a method to construct simultaneous confidence regions for impulse responses and conditional bands to examine significance levels of individual impulse response coefficients given propagation trajectories. The paper also shows how to constrain a subset of impulse response paths to anchor structural identification and how to formally test the validity of such identifying constraints. Simulation and empirical evidence illustrate the new techniques. A broad summary of asymptotic analytic formulas is provided to make the methods easy to implement with commonly available software.
This paper introduces methods to compute impulse responses without specification and estimation of the underlying multivariate dynamic system. The central idea consists in estimating local projections at each period of interest rather than extrapolating into increasingly distant horizons from a given model, as is done with vector autoregressions (VARs). The advantages of local projections are numerous: (1) they can be estimated by simple regression techniques with standard regression packages; (2) they are more robust to misspecification; (3) joint or point-wise analytic inference is simple; and (4) they easily accommodate experimentation with highly nonlinear and flexible specifications that may be impractical in a multivariate context. Therefore, these methods are a natural alternative to estimating impulse responses from VARs. Monte Carlo evidence and an application to a simple, closed-economy, new-Keynesian model clarify these numerous advantages.
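The local projection idea lends itself to a very short implementation: run a separate OLS regression of the outcome at each horizon h on the current value (plus lags as controls), and read the impulse response off the slope coefficients. The sketch below is a minimal univariate illustration on a simulated AR(1), for which the true response at horizon h is rho**h; it is not the paper's application.

```python
import numpy as np

def local_projection_irf(y, max_h, lags=1):
    """Estimate the response of y at horizons 0..max_h to its own innovation
    by running, for each h, a separate OLS regression of y_{t+h} on y_t
    (with additional lags as controls). No VAR needs to be specified."""
    T = len(y)
    irf = []
    for h in range(max_h + 1):
        rows = range(lags - 1, T - h)
        X = np.array([[1.0] + [y[t - j] for j in range(lags)] for t in rows])
        Y = np.array([y[t + h] for t in rows])
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        irf.append(beta[1])          # slope on y_t = response at horizon h
    return np.array(irf)

# Illustration: simulated AR(1), whose true impulse response at h is rho**h
rng = np.random.default_rng(0)
rho, T = 0.8, 5000
y = np.zeros(T)
eps = rng.standard_normal(T)
for t in range(1, T):
    y[t] = rho * y[t - 1] + eps[t]

irf = local_projection_irf(y, max_h=4)
```

Each horizon is a standalone regression, which is why the method is robust to misspecification at long horizons: unlike a VAR, errors in the one-step model are not compounded by iterated extrapolation.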
This paper investigates the effects of temporal aggregation when the aggregation frequency is variable and possibly stochastic. The results that we report include, as a particular case, the well-known results on fixed-interval aggregation, such as when monthly data are aggregated into quarters. A variable aggregation frequency implies that the aggregated process will exhibit time-varying parameters and non-spherical disturbances, even when these characteristics are absent from the original model. Consequently, we develop methods for specification and estimation of the aggregate models and show with an example how these methods perform in practice.
This paper measures the degree of monetary policy interdependence between major industrialized countries from a new perspective. The analysis uses a special data set on central bank issued policy rate targets for 14 OECD countries. Methodologically, our approach is novel in that we separately examine monetary interdependence due to (1) the coincidence in time of when policy actions are executed from (2) the nature and magnitude of the policy adjustments made. The first of these elements requires that the timing of events be modeled with a dynamic discrete duration design. The discrete nature of the policy rate adjustment process that characterizes the second element is captured with an ordered response model. The results indicate there is significant policy interdependence among these 14 countries during the 1980-1998 sample period. This is especially true for a number of European countries which appeared to respond to German policy during our sample period. A number of other countries appeared to respond to U.S. policy, though this number is smaller than that suggested in preceding studies. Moreover, the policy harmonization we find appears to work through channels other than formal coordination agreements.
On February 4, 1994, the Federal Reserve began the practice of announcing changes in the targeted level for the federal funds rate immediately after such decisions were made. This paper investigates to what extent the policy of “the announcement” affected a key ingredient in the monetary transmission mechanism: the term structure of nominally risk-free, Treasury securities. We find that term rates react much more in unison during announcement days than at any other time. Moreover, the practice of circumscribing almost all changes in the federal funds rate target to Federal Open Market Committee (FOMC) meeting dates regiments the formation of market expectations in the overnight rate and the price discovery process of term rates, thus facilitating the Fed’s goal of controlling long-term rates.
This paper shows that greater uncertainty about monetary policy can lead to a decline in nominal interest rates. In the context of a limited participation model, monetary policy uncertainty is modeled as a mean preserving spread in the distribution for the money growth process. This increase in uncertainty lowers the yield on short-term maturity bonds because the household sector responds by increasing liquidity in the banking sector. Long-term maturity bonds also have lower yields but this decrease is a result of the effect that greater uncertainty has on the nominal intertemporal rate of substitution, which is a convex function of money growth. We examine the nature of these relations empirically by introducing the GARCH-SVAR model, a multivariate generalization of the GARCH-M model. The predictions of the model are broadly supported by the data: higher uncertainty in the federal funds rate can lower the yields of the three- and six-month treasury bill rates.
This paper shows that high-frequency, irregularly spaced, foreign exchange (FX) data can generate nonnormality, conditional heteroskedasticity, and leptokurtosis when aggregated into fixed-interval calendar time, even when these features are absent in the original DGP. Furthermore, we introduce a new approach to modeling these high-frequency irregularly spaced data based on the Poisson regression model. The new model is called the autoregressive conditional intensity model and it has the advantage of being simple and of maintaining the calendar timescale. To illustrate the virtues of this approach, we examine a classical issue in FX microstructure: the variation in information content as a function of fluctuations in the intensity of activity levels.
A Model for the Federal Funds Rate Target
Journal of Political Economy 110(5), July 2002, 1135-1167 | With Hamilton
This paper is a statistical analysis of the manner in which the Federal Reserve determines the level of the federal funds rate target, one of the most publicized and anticipated economic indicators in the financial world. The paper introduces new statistical tools for forecasting a discrete-valued time series such as the target, and suggests that these methods, in conjunction with a focus on the institutional details of how the target is determined, can significantly improve on standard VAR forecasts of the effective federal funds rate. We further show that the news that the Fed has changed the target has substantially different statistical content from the news that the Fed failed to make an anticipated target change, causing us to challenge some of the conclusions drawn from standard linear VAR impulse-response functions.
Testing Nonlinearity: Decision Rules for Choosing between Logistic and Exponential STAR Models
Spanish Economic Review 3, 2001, 193-209 | With Escribano
A new LM specification procedure to choose between Logistic and Exponential Smooth Transition Autoregressive (STAR) models is introduced. The new decision rule has better properties than those previously available in the literature when the model is ESTAR and similar properties when the model is LSTAR. A simple natural extension of the usual LM-test for linearity is introduced and evaluated in terms of power. Monte Carlo simulations and empirical evidence are provided in support of our claims.
Random Time Aggregation in Partial Adjustment Models
Journal of Business and Economic Statistics 7(3), July 1999, 382-396
How is econometric analysis (of partial adjustment models) affected by the fact that, while data collection is done at regular, fixed intervals of time, economic decisions are made at random intervals of time? This paper addresses this question by modeling the economic decision-making process as a general point process. Under random-time aggregation: (1) inference on the speed of adjustment is biased, since adjustments are a function of the intensity of the point process and the proportion of adjustment; (2) inference on the correlation with exogenous variables is generally downward biased; and (3) a non-constant intensity of the point process gives rise to a general class of regime-dependent time series models. An empirical application to test the production smoothing-buffer stock model of inventory behavior illustrates, in practice, the effects of random-time aggregation.
Book Review: ‘New Introduction to Multiple Time Series Analysis’ by Helmut Lutkepohl
Econometric Reviews 29(2), 2010, 243-246
Open Market Operations
In International Encyclopedia of the Social Sciences, 2nd edition | MacMillan Reference/Thomson-Gale, 2007
North Coast River Loading Study: Road Crossing on Small Streams
In Report prepared for the Division of Environmental Analysis | California Department of Transportation, 2002 | With et al.
FRB St. Louis Review 83(4), July 2001, 113-137 | With Hoover
Boletín Inflación y Análisis Económico: Predicción y Diagnóstico 68, June 2000
Improved Testing and Specification of Smooth Transition Regression Models
In Dynamic Modeling and Econometrics in Economics and Finance, Vol 1, Nonlinear Time Series Analysis of Economic and Financial Data, ed. by Rothman | Kluwer Academic Press, 1998. 289-319 | With Escribano
La Política Monetaria en los Estados Unidos: El Objetivo de los Tipos de Fondos Federales