With business leverage at record levels, the effects of corporate debt overhang on growth and investment have become a prominent concern. In this paper, we study the effects of corporate debt overhang based on long-run cross-country data covering the near universe of modern business cycles. We show that business credit booms typically do not leave a lasting imprint on the macroeconomy. Quantile local projections indicate that business credit booms do not affect the economy’s tail risks either. Yet in line with theory, we find that the economic costs of corporate debt booms rise when inefficient debt restructuring and liquidation impede the resolution of corporate financial distress and make it more likely that corporate zombies creep along.
The fiscal “multiplier” measures how many additional dollars of output are gained or lost for each dollar of fiscal stimulus or contraction. In practice, the multiplier at any point in time depends on the monetary policy response and existing conditions in the economy. Using the IMF fiscal consolidations dataset for identification and a new decomposition-based approach, we show how to quantify the importance of these monetary-fiscal interactions. In the data, the fiscal multiplier varies considerably with monetary policy: it can be zero, or as large as 2 depending on the monetary offset. More generally, we show how to decompose the typical macro impulse response function by extending local projections to carry out the well-known Blinder-Oaxaca decomposition. This provides a convenient way to evaluate the effects of policy, state-dependence, and balance conditions for identification.
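The Blinder-Oaxaca decomposition that this abstract extends to impulse responses splits a mean outcome gap into a part explained by different covariates and a part due to different coefficients. A minimal cross-sectional sketch in Python (the simulated data and the two-fold split evaluated at group B's coefficients are illustrative assumptions, not the paper's local-projection implementation):

```python
import numpy as np

def oaxaca(XA, yA, XB, yB):
    """Two-fold Blinder-Oaxaca decomposition of the mean outcome gap:
    an 'explained' part from different covariate means (at group B's
    coefficients) and an 'unexplained' part from different coefficients
    (at group A's covariate means)."""
    bA, *_ = np.linalg.lstsq(XA, yA, rcond=None)
    bB, *_ = np.linalg.lstsq(XB, yB, rcond=None)
    gap = yA.mean() - yB.mean()
    explained = (XA.mean(0) - XB.mean(0)) @ bB
    unexplained = XA.mean(0) @ (bA - bB)
    return gap, explained, unexplained

# toy data: group A has a higher covariate mean AND a larger slope than group B
rng = np.random.default_rng(5)
n = 10_000
xA = rng.normal(1.0, 1.0, n); XA = np.column_stack([np.ones(n), xA])
xB = rng.normal(0.0, 1.0, n); XB = np.column_stack([np.ones(n), xB])
yA = 1.0 + 2.0 * xA + rng.standard_normal(n)
yB = 1.0 + 1.0 * xB + rng.standard_normal(n)
gap, explained, unexplained = oaxaca(XA, yA, XB, yB)
```

Because both regressions include a constant, the two components add up to the raw gap exactly by construction.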
Business cycles are costlier and stabilization policies could be more beneficial than widely thought. This paper introduces a new test to show that all business cycles are asymmetric and resemble “mini-disasters.” By this we mean that growth is pervasively fat-tailed and non-Gaussian. Using long-run historical data, we show empirically that this is true for advanced economies since 1870. Focusing on peacetime eras, we develop a tractable local projection framework to estimate consumption growth paths for normal and financial-crisis recessions. Introducing random coefficient local projections (RCLP) we get an easy and transparent mapping from the estimates to a calibrated simulation model with disasters of variable severity. Simulations show that substantial welfare costs arise not just from the large rare disasters, but also from the smaller but more frequent mini-disasters in every cycle. On average, and in post-WW2 data, even with low risk aversion, households would sacrifice about 15 percent of consumption to avoid such cyclical fluctuations.
What are the medium- to long-term effects of pandemics? How do they differ from other economic disasters? We study major pandemics using the rates of return on assets stretching back to the 14th century. Significant macroeconomic after-effects of pandemics persist for decades, with real rates of return substantially depressed, in stark contrast to what happens after wars. Our findings are consistent with the neoclassical growth model: capital is destroyed in wars, but not in pandemics; pandemics instead may induce relative labor scarcity and/or a shift to greater precautionary savings.
Is the effect of monetary policy on the productive capacity of the economy long lived? Yes, in fact we find such impacts are significant and last for over a decade based on: (1) merged data from two new international historical databases; (2) identification of exogenous monetary policy using the macroeconomic trilemma; and (3) improved econometric methods. Notably, the capital stock and total factor productivity (TFP) exhibit hysteresis, but labor does not. Money is non-neutral for a much longer period of time than is customarily assumed. A New Keynesian model with endogenous TFP growth can reconcile all these empirical observations.
Interest rates in major advanced economies have drifted down and in greater unison over the past few decades. A country’s rate of interest can be thought of as reflecting movements in the global neutral rate of interest, the domestic neutral rate, and the stance of monetary policy. Only the latter is controlled by the central bank. Estimates from a state space New Keynesian model show that central bank policy explains less than half of the variation in interest rates. The rest of the time, the central bank is catching up to trends dictated by productivity growth, demography, and other factors outside of its control.
The risk premium puzzle is worse than you think. Using a new database for the U.S. and 15 other advanced economies from 1870 to the present that includes housing as well as equity returns (to capture the full risky capital portfolio of the representative agent), standard calculations using returns to total wealth and consumption show that housing returns in the long run are comparable to those of equities, and yet housing returns have lower volatility and lower covariance with consumption growth than equities. The same applies to a weighted total-wealth portfolio, and over a range of horizons. As a result, the implied risk aversion parameters for housing wealth and total wealth are even larger than those for equities, often by a factor of 2 or more. We find that more exotic models cannot resolve these even bigger puzzles, and we see little role for limited participation, idiosyncratic housing risk, transaction costs, or liquidity premiums.
The manner in which firms respond to shocks reflects fundamental features of labor, capital, and commodity markets, as well as advances in finance and technology. Such features are integral to constructing models of the macroeconomy. In this paper we document secular shifts in the margins firms use, in aggregate, to adjust to shocks that have consequences for the economy’s cyclical behavior. These new business cycle facts on the comovement of output and its inputs are a natural complement to analyzing output and its expenditure components. Our findings shed light on the changing cyclicality of productivity in response to different shocks.
This paper introduces new nonparametric statistical methods to evaluate zero-cost investment strategies. We focus on directional trading strategies, risk-adjusted returns, and the investor’s decisions under uncertainty as the core of our analysis. By relying on classification tools with a long tradition in the sciences and biostatistics, we can provide a tighter connection between model-based risk characteristics and the no-arbitrage conditions for market efficiency. Moreover, we extend the methods to multicategorical settings, such as when the investor can sometimes take a neutral position. A variety of inferential procedures are provided, many of which are illustrated with applications to excess equity returns and to currency carry trades.
Published Articles (Refereed Journals and Volumes)
What is the relationship between bank capital, the risk of a financial crisis, and its severity? This article introduces the first comprehensive analysis of the long-run evolution of the capital structure of modern banking using newly constructed data for banks’ balance sheets in 17 countries since 1870. In addition to establishing stylized facts on the changing funding mix of banks, we study the nexus between capital structure and financial instability. We find no association between higher capital and lower risk of banking crisis. However, economies with better capitalized banking systems recover faster from financial crises as credit begins to flow back more readily.
The trilemma of international finance explains why interest rates in countries that fix their exchange rates and allow unfettered cross-border capital flows are outside the monetary authority’s control. Based on this exogenous source of variation, we show that monetary interventions have large and significant effects using historical panel data since 1870. The causal effect of these interventions depends on whether: (1) the economy is above or below potential; (2) inflation is low; and (3) there is a credit boom in mortgage markets. Several novel control function adjustments account for potential spillover effects. The results have important implications for monetary policy.
The Phillips curve remains central to stabilization policy. Increasing financial linkages, international supply chains, and managed exchange rate policy have given core currencies an outsized influence on the domestic affairs of world economies. We exploit such influence as a source of exogenous variation to examine the effects of the recent financial crisis on the Phillips curve mechanism. Using a difference-in-differences approach, and comparing countries before and after the 2008 financial crisis sorted by whether they endured or escaped the crisis, we are able to assess the evolution of the Phillips curve globally.
What is the aggregate real rate of return in the economy? Is it higher than the growth rate of the economy and, if so, by how much? Is there a tendency for returns to fall in the long run? Which particular assets have the highest long-run returns? We answer these questions on the basis of a new and comprehensive data set for all major asset classes, including housing. The annual data on total returns for equity, housing, bonds, and bills cover 16 advanced economies from 1870 to 2015, and this new evidence reveals many findings and puzzles.
This paper studies the synchronization of financial cycles across 17 advanced economies over the past 150 years. The comovement in credit, house prices, and equity prices has reached historical highs in the past three decades. While comovement of credit and house prices increased in line with growing real sector integration, comovement of equity prices has increased above and beyond growing real sector integration. The sharp increase in the comovement of global equity markets is particularly notable. We demonstrate that fluctuations in risk premiums, and not risk-free rates and dividends, account for a large part of the observed equity price synchronization after 1990. We also show that US monetary policy has come to play an important role as a source of fluctuations in risk appetite across global equity markets. These fluctuations are transmitted across both fixed and floating exchange rate regimes, but the effects are more muted in floating rate regimes.
We develop flexible semiparametric time series methods that are then used to assess the causal effect of monetary policy interventions on macroeconomic aggregates. Our estimator captures the average causal response to discrete policy interventions in a macro-dynamic setting, without the need for assumptions about the process generating macroeconomic outcomes. The proposed procedure, based on propensity score weighting, easily accommodates asymmetric and nonlinear responses. Application of this estimator to the effects of monetary restraint shows the Fed to be an effective inflation fighter. Our estimates of the effects of monetary accommodation, however, suggest the Federal Reserve’s ability to stimulate real economic activity is more modest. Estimates for recent financial crisis years are similar to those for the earlier, pre-crisis period.
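The propensity-score weighting idea behind such estimators can be illustrated with a textbook inverse-propensity-weighted contrast. The simulation below, with a known propensity score and a single confounder, is a hedged toy example, not the paper's time-series procedure:

```python
import numpy as np

def ipw_ate(y, d, p):
    """Inverse propensity weighted average treatment effect:
    mean of d*y/p - (1-d)*y/(1-p)."""
    y, d, p = map(np.asarray, (y, d, p))
    return np.mean(d * y / p - (1 - d) * y / (1 - p))

# simulation: treatment probability depends on a confounder z
rng = np.random.default_rng(1)
n = 200_000
z = rng.standard_normal(n)
p = 1 / (1 + np.exp(-z))                    # true propensity score
d = (rng.uniform(size=n) < p).astype(float)  # treatment indicator
y = 2.0 * d + z + rng.standard_normal(n)     # true effect = 2; z confounds

naive = y[d == 1].mean() - y[d == 0].mean()  # biased upward by confounding
ate = ipw_ate(y, d, p)                       # close to the true effect of 2
```

Reweighting by the inverse propensity score removes the selection on z that contaminates the naive mean comparison.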
After the Global Financial Crisis, a controversial rush to fiscal austerity followed in many countries. Yet research on the effects of austerity on macroeconomic aggregates was and still is unsettled, mired in the difficulty of identifying multipliers from observational data. This article reconciles seemingly disparate estimates of multipliers within a unified and state-contingent framework. We achieve identification of causal effects with new propensity-score based methods for time series data. Using this novel approach, we show that austerity is always a drag on growth, and especially so in depressed economies: a 1% of GDP fiscal consolidation translates into a loss of 3.5% of real GDP over five years when implemented in a slump, rather than just 1.8% in a boom.
Two separate narratives have emerged in the wake of the Global Financial Crisis. One interpretation speaks of private financial excess and the key role of the banking system in leveraging and deleveraging the economy. The other emphasizes the public sector balance sheet and worries about the risks of lax fiscal policy. However, the two may interact in important and understudied ways. This paper examines the co-evolution of public and private sector debt in advanced countries from 1870 to 2012. We find that in advanced economies financial crises are not preceded by public debt build-ups nor are they more likely when public debt is high. However, history shows that high levels of public debt tend to exacerbate the effects of private sector deleveraging after financial crises. The economic costs of financial crises rise substantially if large private sector credit booms are unwound at times when the public sector has little capacity to pursue macroeconomic and financial stabilization.
This paper unveils a new resource for macroeconomic research: A long-run dataset covering disaggregated bank credit for 17 advanced economies since 1870. The new data show that the share of mortgages on banks’ balance sheets doubled in the course of the twentieth century, driven by a sharp rise of mortgage lending to households. Household debt to asset ratios have risen substantially in many countries. Financial stability risks have been increasingly linked to real estate lending booms, which are typically followed by deeper recessions and slower recoveries. Housing finance has come to play a central role in the modern macroeconomy.
What risks do asset price bubbles pose for the economy? This paper studies bubbles in housing and equity markets in 17 countries over the past 140 years. History shows that not all bubbles are alike. Some have enormous costs for the economy, while others blow over. We demonstrate that what makes some bubbles more dangerous than others is credit. When fueled by credit booms, asset price bubbles increase financial crisis risks; upon collapse they tend to be followed by deeper recessions and slower recoveries. Credit-financed housing price bubbles have emerged as a particularly dangerous phenomenon.
Is there a link between loose monetary conditions, credit growth, house price booms, and financial instability? This paper analyzes the role of interest rates and credit in driving house price booms and busts with data spanning 140 years of modern economic history in the advanced economies. We exploit the implications of the macroeconomic policy trilemma to identify exogenous variation in monetary conditions: countries with fixed exchange rate regimes often see fluctuations in short-term interest rates unrelated to home economic conditions. We use novel instrumental variable local projection methods to demonstrate that loose monetary conditions lead to booms in real estate lending and house price bubbles; these, in turn, materially heighten the risk of financial crises. Both effects have become stronger in the postwar era.
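The instrumental-variable logic (base-country interest rate movements serving as exogenous variation for domestic monetary conditions under a peg) reduces, in the just-identified case, to the standard IV ratio estimator. A stylized sketch with simulated data, not the paper's LP-IV implementation:

```python
import numpy as np

def iv_2sls(y, x, z):
    """Just-identified IV estimate: cov(z, y) / cov(z, x)."""
    y, x, z = map(np.asarray, (y, x, z))
    return np.cov(z, y)[0, 1] / np.cov(z, x)[0, 1]

# simulation: x is endogenous (correlated with the error u); z is a valid
# instrument (correlated with x, independent of u)
rng = np.random.default_rng(2)
n = 100_000
z = rng.standard_normal(n)
u = rng.standard_normal(n)
x = z + u + rng.standard_normal(n)       # endogenous regressor
y = 1.5 * x + u                          # true coefficient is 1.5

ols = np.cov(x, y)[0, 1] / np.var(x)     # biased upward by corr(x, u)
iv = iv_2sls(y, x, z)                    # close to the true 1.5
```

The OLS slope absorbs the endogenous correlation between x and u, while the instrument isolates only the exogenous variation in x.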
This paper provides a historical overview of financial crises and their origins. The objective is to discuss a few of the modern statistical methods that can be used to evaluate predictors of these rare events. The problem involves the prediction of binary events, and therefore fits modern statistical learning, signal processing theory, and classification methods. The discussion also emphasizes the need for statistics and computational techniques to be supplemented with economics. The success of a forecast in this environment hinges on the economic consequences of the actions taken as a result of the forecast, rather than on typical statistical metrics of prediction accuracy.
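Binary-event prediction of the kind described here is commonly summarized by the ROC curve and its area (AUC), the probability that a randomly chosen positive case outranks a randomly chosen negative one. A small sketch of the rank-based AUC computation; the simulated "activity index" is purely illustrative:

```python
import numpy as np

def roc_auc(scores, labels):
    """Area under the ROC curve: probability that a randomly chosen positive
    observation outranks a randomly chosen negative one (ties count half)."""
    scores, labels = np.asarray(scores), np.asarray(labels)
    pos, neg = scores[labels == 1], scores[labels == 0]
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

# toy activity index: systematically higher readings in expansions
rng = np.random.default_rng(3)
expansion = rng.normal(1.0, 1.0, 500)    # index values during expansions
recession = rng.normal(-1.0, 1.0, 500)   # index values during recessions
scores = np.concatenate([expansion, recession])
labels = np.concatenate([np.ones(500), np.zeros(500)])
auc = roc_auc(scores, labels)            # well above the 0.5 coin-flip level
```

An AUC of 0.5 corresponds to an uninformative signal; the closer to 1, the better the classifier separates the two states.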
What do the behavior of monkeys in captivity and the financial system have in common? The nodes in such social systems relate to each other through multiple and keystone networks, not just one network. Each network in the system has its own topology, and the interactions among the system’s networks change over time. In such systems, the lead into a crisis appears to be characterized by a decoupling of the networks from the keystone network. This decoupling can also be seen in the crumbling of the keystone’s power structure toward a more horizontal hierarchy. This paper develops nonparametric methods for describing the joint model of the latent architecture of interconnected networks in order to describe this process of decoupling, and hence provide an early warning system of an impending crisis.
This note examines labor market performance across countries through the lens of Okun’s Law. We find that after the 1970s but prior to the global financial crisis of the 2000s, the Okun’s Law relationship between output and unemployment became more homogeneous across countries. These changes presumably reflected institutional and technological changes. But, at least in the short term, the global financial crisis undid much of this convergence, in part because the affected countries adopted different labor market policies in response to the global demand shock.
This paper investigates the problem of constructing prediction regions for forecast trajectories 1 to H periods into the future (a path forecast). When the null model is only approximate, or completely unavailable, one can neither derive the usual analytic expressions nor resample from the null model. In this context, this paper derives a method for constructing approximate rectangular regions for simultaneous probability coverage that correct for serial correlation in the case of elliptical distributions. In both Monte Carlo studies and an empirical application to the Greenbook path-forecasts of growth and inflation, the performance of this method is compared to the performances of the Bonferroni approach and an approach that ignores simultaneity.
This paper codifies in a systematic and transparent way a historical chronology of business cycle turning points for Spain reaching back to 1850 at annual frequency, and 1939 at monthly frequency. Such an exercise would be incomplete without assessing the new chronology itself and against others—this we do with modern statistical tools of signal detection theory. We also use these tools to determine which of several existing economic activity indexes provide a better signal on the underlying state of the economy. We conclude by evaluating candidate leading indicators and hence construct recession probability forecasts up to 12 months in the future.
This paper studies the role of credit in the business cycle, with a focus on private credit overhang. Based on a study of the universe of over 200 recession episodes in 14 advanced countries between 1870 and 2008, we document two key facts of the modern business cycle: financial-crisis recessions are more costly than normal recessions in terms of lost output; and for both types of recession, more credit-intensive expansions tend to be followed by deeper recessions and slower recoveries. In addition to unconditional analysis, we use local projection methods to condition on a broad set of macroeconomic controls and their lags. Then we study how past credit accumulation impacts the behavior of not only output but also other key macroeconomic variables such as investment, lending, interest rates, and inflation. The facts that we uncover lend support to the idea that financial factors play an important role in the modern business cycle.
The carry trade is the investment strategy of going long in high-yield target currencies and short in low-yield funding currencies. Recently, this naive trade has seen very high returns for long periods, followed by large crash losses after large depreciations of the target currencies. Based on low Sharpe ratios and negative skew, these trades could appear unattractive, even when diversified across many currencies. But more sophisticated conditional trading strategies exhibit more favorable payoffs. We apply novel (within economics) binary-outcome classification tests to show that our directional trading forecasts are informative, and out-of-sample loss-function analysis to examine trading performance.
The critical conditioning variable, we argue, is the fundamental equilibrium exchange rate (FEER). Expected returns are lower, all else equal, when the target currency is overvalued. Like traders, researchers should incorporate this information when evaluating trading strategies. When we do so, some questions are resolved: negative skewness is purged, and market volatility (VIX) is uncorrelated with returns; other puzzles remain: the more sophisticated strategy has a very high Sharpe ratio, suggesting market inefficiency.
Frictions and perturbations may influence currency values in the short run, but it is generally acknowledged that real exchange rates eventually settle toward equilibrium. The puzzle then is how gradually this parity is reached given the fluidity in foreign exchange markets. Persistent differences in the relative productivity of countries—a broad characterization of the Harrod–Balassa–Samuelson hypothesis—may help explain this puzzle. This article introduces methods to estimate equilibrium adjustment paths semiparametrically, and then sort out how each of these components influences the dynamics of exchange rates. This is done in a dynamic panel setting by introducing novel local projections methods for cointegrated systems. Productivity shocks affect dynamics, and after adjusting for these factors, adjustment toward equilibrium is relatively rapid.
Do external imbalances increase the risk of financial crises? This paper studies the experience of 14 developed countries over 140 years (1870-2008). It exploits the long-run data set in a number of different ways. First, the paper applies new statistical tools to describe the temporal and spatial patterns of crises and identifies five episodes of global financial instability in the past 140 years. Second, it studies the macroeconomic dynamics before crises and shows that credit growth tends to be elevated and short-term interest rates depressed relative to the “natural rate” in the run-up to global financial crises. Third, the paper shows that recessions associated with crises lead to deeper slumps and stronger turnarounds in imbalances than during normal recessions. Finally, the paper asks to what extent external imbalances help predict financial crises. The overall result is that credit growth emerges as the single best predictor of financial instability. External imbalances have played an additional role, but more so in the pre-WWII era of low financialization than today.
The stability of the solution path in a macroeconomic model implies that it admits a Wold representation. This Wold representation can be estimated semiparametrically by local projections and used to estimate the model’s parameters by minimum distance techniques even when the stochastic process for the solution path is unknown or unconventional. We name this two-step estimation procedure “projection minimum distance” and investigate its statistical properties for the broad class of models where the mapping between Wold coefficients and parameters is linear. This includes many situations with likelihood score functions nonlinear in the parameters that would otherwise require numerical optimization routines.
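When the mapping from Wold coefficients to structural parameters is linear, the second-stage minimum distance step has a closed form. A generic sketch of that step; the toy linear map A, the noise scale, and the identity weighting matrix are assumptions made for illustration:

```python
import numpy as np

def minimum_distance(b_hat, A, W):
    """Linear minimum distance: choose theta minimizing
    (b_hat - A @ theta)' W (b_hat - A @ theta); closed-form solution."""
    AtW = A.T @ W
    return np.linalg.solve(AtW @ A, AtW @ b_hat)

rng = np.random.default_rng(6)
theta_true = np.array([0.7, -0.3])
A = rng.standard_normal((20, 2))            # known linear map from parameters
b = A @ theta_true                          # to the Wold/impulse coefficients
b_hat = b + 0.01 * rng.standard_normal(20)  # noisy first-stage estimates
W = np.eye(20)                              # efficient W would be the inverse
theta = minimum_distance(b_hat, A, W)       # covariance of b_hat
```

The point of the two-step structure is that b_hat can come from local projections without ever specifying the stochastic process for the solution path.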
The Business Cycle Dating Committee of the National Bureau of Economic Research provides a historical chronology of business cycle turning points. We investigate three central aspects of this chronology. How skillful is the Dating Committee when classifying economic activity into expansions and recessions? Which indices of economic conditions best capture the current but unobservable state of the business cycle? And which indicators best predict future turning points, and at what horizons? We answer each of these questions in detail using methods specifically designed to assess classification ability. In the process, we clarify several important features of the business cycle chronology.
A path forecast refers to the sequence of forecasts 1 to H periods into the future. A summary of the range of possible paths the predicted variable may follow for a given confidence level requires construction of simultaneous confidence regions that adjust for any covariance between the elements of the path forecast. This paper shows how to construct such regions with the joint predictive density and Scheffé’s (1953) S-method. In addition, the joint predictive density can be used to construct simple statistics to evaluate the local internal consistency of a forecasting exercise of a system of variables. Monte Carlo simulations demonstrate that these simultaneous confidence regions provide approximately correct coverage in situations where traditional error bands, based on the collection of marginal predictive densities for each horizon, are vastly off the mark. The paper showcases these methods with an application to the most recent monetary episode of interest rate hikes in the U.S. macroeconomy.
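The gap between pointwise bands and simultaneous path coverage is easy to see by simulation. Below, a Scheffé-style rectangular band (using the hardcoded chi-square(4) 95% critical value) is compared with naive marginal bands over a serially correlated forecast-error path; the AR(1)-type covariance is an illustrative assumption:

```python
import numpy as np

def scheffe_band(sigma, crit):
    """Conservative rectangular band half-widths in the Scheffe spirit:
    sqrt(chi-square critical value) times each horizon's standard deviation."""
    return np.sqrt(crit) * np.sqrt(np.diag(sigma))

# forecast errors for an H-step path with AR(1)-type serial correlation
rng = np.random.default_rng(4)
H, rho = 4, 0.8
sigma = np.array([[rho ** abs(i - j) for j in range(H)] for i in range(H)])
errs = rng.multivariate_normal(np.zeros(H), sigma, size=50_000)

crit_chi2 = 9.4877                       # chi-square(4), 95% critical value
marg = 1.96 * np.sqrt(np.diag(sigma))    # pointwise 95% bands
sch = scheffe_band(sigma, crit_chi2)

# fraction of simulated paths lying entirely inside each band
inside_marg = np.mean(np.all(np.abs(errs) <= marg, axis=1))  # below 0.95
inside_sch = np.mean(np.all(np.abs(errs) <= sch, axis=1))    # at least 0.95
```

The marginal bands cover each horizon 95% of the time yet fail to contain the whole path at that rate, which is exactly the failure of "traditional error bands" the abstract describes.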
Inference about an impulse response is a multiple testing problem with serially correlated coefficient estimates. This paper provides a method to construct simultaneous confidence regions for impulse responses and conditional bands to examine significance levels of individual impulse response coefficients given propagation trajectories. The paper also shows how to constrain a subset of impulse response paths to anchor structural identification and how to formally test the validity of such identifying constraints. Simulation and empirical evidence illustrate the new techniques. A broad summary of asymptotic analytic formulas is provided to make the methods easy to implement with commonly available software.
This paper introduces methods to compute impulse responses without specification and estimation of the underlying multivariate dynamic system. The central idea consists in estimating local projections at each period of interest rather than extrapolating into increasingly distant horizons from a given model, as is done with vector autoregressions (VARs). The advantages of local projections are numerous: (1) they can be estimated by simple regression techniques with standard regression packages; (2) they are more robust to misspecification; (3) joint or point-wise analytic inference is simple; and (4) they easily accommodate experimentation with highly nonlinear and flexible specifications that may be impractical in a multivariate context. Therefore, these methods are a natural alternative to estimating impulse responses from VARs. Monte Carlo evidence and an application to a simple, closed-economy, New Keynesian model clarify these numerous advantages.
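The mechanics are as simple as advantage (1) suggests: one least-squares regression per horizon, with the horizon-h coefficient on the shock read off directly. A minimal sketch, where the AR(1) data-generating process and the directly observed shock are simplifying assumptions for illustration:

```python
import numpy as np

def local_projection_irf(y, x, lags=2, horizons=8):
    """Impulse response of y to shock x by local projections: for each
    horizon h, regress y_{t+h} on x_t plus lagged-y controls, and keep
    the coefficient on x_t."""
    T = len(y)
    irf = []
    for h in range(horizons + 1):
        rows, rhs = [], []
        for t in range(lags, T - h):
            # regressors: constant, shock, and lagged y controls
            rows.append([1.0, x[t]] + [y[t - j] for j in range(1, lags + 1)])
            rhs.append(y[t + h])
        X, Y = np.asarray(rows), np.asarray(rhs)
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        irf.append(beta[1])          # coefficient on the shock at horizon h
    return np.array(irf)

# simulated AR(1) with an observed shock: the true IRF is rho**h
rng = np.random.default_rng(0)
rho, T = 0.5, 5000
eps = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = rho * y[t - 1] + eps[t]
irf = local_projection_irf(y, eps, lags=2, horizons=4)
```

Each horizon is a separate regression, so no iteration on an estimated model is ever needed, which is the source of the robustness-to-misspecification claim.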
This paper investigates the effects of temporal aggregation when the aggregation frequency is variable and possibly stochastic. The results that we report include, as a particular case, the well-known results on fixed-interval aggregation, such as when monthly data are aggregated into quarters. A variable aggregation frequency implies that the aggregated process will exhibit time-varying parameters and non-spherical disturbances, even when these characteristics are absent from the original model. Consequently, we develop methods for specification and estimation of the aggregate models and show with an example how these methods perform in practice.
This paper measures the degree of monetary policy interdependence between major industrialized countries from a new perspective. The analysis uses a special data set on central bank issued policy rate targets for 14 OECD countries. Methodologically, our approach is novel in that we separate monetary interdependence due to (1) the coincidence in time of when policy actions are executed from that due to (2) the nature and magnitude of the policy adjustments made. The first of these elements requires that the timing of events be modeled with a dynamic discrete duration design. The discrete nature of the policy rate adjustment process that characterizes the second element is captured with an ordered response model. The results indicate there is significant policy interdependence among these 14 countries during the 1980-1998 sample period. This is especially true for a number of European countries, which appeared to respond to German policy during our sample period. A number of other countries appeared to respond to U.S. policy, though this number is smaller than that suggested in preceding studies. Moreover, the policy harmonization we find appears to work through channels other than formal coordination agreements.
On February 4, 1994, the Federal Reserve began the practice of announcing changes in the targeted level for the federal funds rate immediately after such decisions were made. This paper investigates to what extent the policy of “the announcement” affected a key ingredient in the monetary transmission mechanism: the term structure of nominally risk-free Treasury securities. We find that term rates react much more in unison during announcement days than at any other time. Moreover, the practice of circumscribing almost all changes in the federal funds rate target to Federal Open Market Committee (FOMC) meeting dates regiments the formation of market expectations in the overnight rate and the price discovery process of term rates, thus facilitating the Fed’s goal of controlling long-term rates.
This paper shows that greater uncertainty about monetary policy can lead to a decline in nominal interest rates. In the context of a limited participation model, monetary policy uncertainty is modeled as a mean preserving spread in the distribution for the money growth process. This increase in uncertainty lowers the yield on short-term maturity bonds because the household sector responds by increasing liquidity in the banking sector. Long-term maturity bonds also have lower yields, but this decrease is a result of the effect that greater uncertainty has on the nominal intertemporal rate of substitution, which is a convex function of money growth. We examine the nature of these relations empirically by introducing the GARCH-SVAR model, a multivariate generalization of the GARCH-M model. The predictions of the model are broadly supported by the data: higher uncertainty in the federal funds rate can lower the yields of the three- and six-month Treasury bill rates.
This paper shows that high-frequency, irregularly spaced, foreign exchange (FX) data can generate nonnormality, conditional heteroskedasticity, and leptokurtosis when aggregated into fixed-interval calendar time, even when these features are absent in the original DGP. Furthermore, we introduce a new approach to modeling these high-frequency irregularly spaced data based on the Poisson regression model. The new model is called the autoregressive conditional intensity model and it has the advantage of being simple and of maintaining the calendar timescale. To illustrate the virtues of this approach, we examine a classical issue in FX microstructure: the variation in information content as a function of fluctuations in the intensity of activity levels.
A Model of the Federal Funds Rate Target
Journal of Political Economy 110(5), July 2002, 1135-1167 | With Hamilton
This paper is a statistical analysis of the manner in which the Federal Reserve determines the level of the federal funds rate target, one of the most publicized and anticipated economic indicators in the financial world. The paper introduces new statistical tools for forecasting a discrete-valued time series such as the target, and suggests that these methods, in conjunction with a focus on the institutional details of how the target is determined, can significantly improve on standard VAR forecasts of the effective federal funds rate. We further show that the news that the Fed has changed the target has substantially different statistical content from the news that the Fed failed to make an anticipated target change, causing us to challenge some of the conclusions drawn from standard linear VAR impulse-response functions.
Testing Nonlinearity: Decision Rules for Choosing between Logistic and Exponential STAR Models
Spanish Economic Review 3, 2001, 193-209 | With Escribano
A new LM specification procedure to choose between Logistic and Exponential Smooth Transition Autoregressive (STAR) models is introduced. The new decision rule has better properties than those previously available in the literature when the model is ESTAR and similar properties when the model is LSTAR. A simple natural extension of the usual LM test for linearity is introduced and evaluated in terms of power. Monte Carlo simulations and empirical evidence are provided in support of our claims.
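The auxiliary-regression form of such LM linearity tests (in the Terasvirta tradition) fits a linear AR model, regresses its residuals on the AR terms interacted with powers of the transition variable, and forms T times R-squared. A sketch under illustrative assumptions (the lag order, power of the expansion, and the two simulated processes are all choices made here, not the paper's exact decision rule):

```python
import numpy as np

def lm_linearity_test(y, lags=1, power=3):
    """LM-type test of linearity against a STAR alternative: regress the
    linear-AR residuals on the AR regressors augmented with interactions
    x * s**k, k = 1..power, where s = y_{t-1} is the transition variable.
    The statistic (T - lags) * R^2 is chi-square(lags * power) under H0."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    yt = y[lags:]
    Xlin = np.column_stack([np.ones(T - lags)] +
                           [y[lags - j: T - j] for j in range(1, lags + 1)])
    s = y[lags - 1: T - 1]                       # transition variable y_{t-1}
    b, *_ = np.linalg.lstsq(Xlin, yt, rcond=None)
    u = yt - Xlin @ b                            # linear-model residuals
    Xaux = np.column_stack([Xlin] +
                           [Xlin[:, 1:] * (s ** k)[:, None]
                            for k in range(1, power + 1)])
    g, *_ = np.linalg.lstsq(Xaux, u, rcond=None)
    e = u - Xaux @ g
    return (T - lags) * (1 - (e @ e) / (u @ u))

# a linear AR(1) versus an LSTAR process whose slope shifts with y_{t-1}
rng = np.random.default_rng(7)
T = 2000
eps = rng.standard_normal(T)
y_lin = np.zeros(T)
y_star = np.zeros(T)
for t in range(1, T):
    y_lin[t] = 0.5 * y_lin[t - 1] + eps[t]
    G = 1.0 / (1.0 + np.exp(-2.0 * y_star[t - 1]))   # logistic transition
    y_star[t] = (0.9 - 1.2 * G) * y_star[t - 1] + eps[t]

lm_lin = lm_linearity_test(y_lin)    # small: linearity not rejected
lm_star = lm_linearity_test(y_star)  # large: STAR-type nonlinearity detected
```

Under linearity the statistic is approximately chi-square with lags*power degrees of freedom, so large values reject the linear null in favor of a STAR alternative.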
Random Time Aggregation in Partial Adjustment Models
Journal of Business and Economic Statistics 17(3), July 1999, 382-396
How is econometric analysis (of partial adjustment models) affected by the fact that, while data collection is done at regular, fixed intervals of time, economic decisions are made at random intervals of time? This paper addresses this question by modeling the economic decision-making process as a general point process. Under random-time aggregation: (1) inference on the speed of adjustment is biased, since adjustments are a function of the intensity of the point process and the proportion of adjustment; (2) inference on the correlation with exogenous variables is generally downward biased; and (3) a non-constant intensity of the point process gives rise to a general class of regime-dependent time series models. An empirical application to test the production smoothing-buffer stock model of inventory behavior illustrates, in practice, the effects of random-time aggregation.
In advanced economies, a century-long near-stable ratio of credit to GDP gave way to rapid financialization and surging leverage in the last forty years. This “financial hockey stick” coincides with shifts in foundational macroeconomic relationships beyond the widely noted return of macroeconomic fragility and crisis risk. Leverage is correlated with central business cycle moments, which we can document thanks to a decade-long international and historical data collection effort. More financialized economies exhibit somewhat less real volatility, but also lower growth, more tail risk, as well as tighter real-real and real-financial correlations. International real and financial cycles also cohere more strongly. The new stylized facts that we discover should prove fertile ground for the development of a new generation of macroeconomic models with a prominent role for financial factors.
FRB St. Louis Review 83(4), July 2001, 113-137 | With Hoover
Boletín Inflación y Análisis Económico: Predicción y Diagnóstico 68, June 2000
Improved Testing and Specification of Smooth Transition Regression Models
In Dynamic Modeling and Econometrics in Economics and Finance, Vol 1, Nonlinear Time Series Analysis of Economic and Financial Data, ed. by Rothman | Kluwer Academic Press, 1998. 289-319 | With Escribano
La Política Monetaria en los Estados Unidos: El Objetivo de los Tipos de Fondos Federales [Monetary Policy in the United States: The Federal Funds Rate Target]