We explore banks’ reactions to a shock arising from their exposure to the sharp oil price declines of 2014. Exposed banks tightened credit on corporate lending and on mortgages to be retained on balance sheet, while expanding credit for securitized mortgages. Banks therefore rebalanced their portfolios to lower their risk, rather than scaling back the size of their balance sheets or reducing lending uniformly. Thus, when assessing the implications of bank stress for the broader economy, one must consider banks’ overall strategy rather than focusing on isolated parts of the balance sheet.
Stress testing has become an important component of macroprudential regulation, yet its goals and implementation are still being debated, reflecting the difficulty of designing such frameworks in the context of enormous model uncertainty. We illustrate methods for responding to possible misspecifications in models used for assessing bank vulnerabilities. We show how ‘exponential tilting’ allows the incorporation of external judgment, captured in moment conditions, into a forecasting model as a partial correction for misspecification. We also make use of methods from robust control to seek the most relevant dimensions in which a regulator’s forecasting model might be misspecified – a search for a ‘worst case’ model that is a ‘twisted’ version of the regulator’s initial forecasting model. Finally, we show how the two approaches can be blended, so that one can search for a worst case model subject to restrictions on its properties, informed by the regulator’s judgment. We demonstrate the methods using the New York Fed’s CLASS model, a top-down capital stress testing framework that projects the effect of macroeconomic scenarios on U.S. banking firms.
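The exponential-tilting step can be illustrated with a minimal numerical sketch (the data, moment condition, and target below are hypothetical and are not taken from the CLASS model): given draws from a baseline forecasting distribution, we reweight them by exp(λ·g(x)) and solve for λ so that a judgmental moment condition holds.

```python
import numpy as np

rng = np.random.default_rng(0)

# Baseline forecasting model: draws of some bank outcome, e.g. a capital
# ratio forecast (assumed numbers, purely for illustration).
x = rng.normal(loc=8.0, scale=2.0, size=50_000)

# External judgment expressed as a moment condition: the regulator believes
# the mean forecast should be 7.0 rather than 8.0 (an assumed target).
g = lambda x: x      # moment function
target = 7.0

# Exponentially tilted weights w_i ∝ exp(lam * g(x_i)); solve for lam so the
# tilted mean of g hits the target, using Newton's method (the derivative of
# the tilted mean with respect to lam is the tilted variance of g).
lam = 0.0
for _ in range(100):
    w = np.exp(lam * g(x))
    w /= w.sum()
    m = np.sum(w * g(x))                # tilted mean of g
    v = np.sum(w * g(x) ** 2) - m ** 2  # tilted variance of g
    lam -= (m - target) / v

w = np.exp(lam * g(x))
w /= w.sum()
print(np.sum(w * x))  # tilted mean, pulled down to the judgmental target
```

The tilted weights distort the baseline distribution as little as possible (in relative-entropy terms) while enforcing the regulator’s moment condition, which is what makes the correction “partial” rather than a wholesale replacement of the model.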
Despite the general consensus that stress testing has been useful in financial and macroprudential regulation, testing techniques are still being debated. This paper proposes using robust forecasting analysis to construct adverse scenarios from a benchmark model combined with a modified worst-case distribution. These scenarios give regulators a way to identify vulnerabilities while acknowledging that models may be misspecified in unknown ways.
What determines the frequency domain properties of a stochastic process? How much risk comes from high frequencies, business cycle frequencies, or low frequency swings? If these properties are under the influence of an agent who is compensated by a principal according to the distribution of risk across frequencies, then the nature of this contracting problem will affect the spectral properties of the endogenous outcome. We imagine two thought experiments. In the first, the principal – a regulator – is myopic with regard to certain frequencies: his understanding of the true process is intermediated through a filter, and the agent chooses to hide risk by shifting power from frequencies to which the regulator is attuned to those to which he is not. Thus, the regulator is fooled into thinking there has been an overall reduction in risk when, in fact, there has simply been a frequency shift. In the second thought experiment, the regulator is not myopic, but simply cares more about risk from certain frequencies, perhaps due to the preferences of the constituents he represents or because certain types of market incompleteness make certain frequencies of risk more damaging. We model this intuition by positing a filter design problem for the agent, and also by positing a particular type of portfolio selection problem in which the agent chooses among investment projects with different spectral properties. While abstract, these models suggest important implications for macroprudential policy and regulatory arbitrage.
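The first thought experiment can be sketched numerically (all parameters below are assumed for illustration, not taken from the paper): two processes carry identical total risk, but a principal who observes the world only through a low-pass filter perceives far less risk in the one whose power sits at high frequencies.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 100_000

# Two AR(1) processes x_t = phi * x_{t-1} + e_t with the SAME unconditional
# variance (innovations scaled by sqrt(1 - phi^2)), but opposite persistence:
# phi > 0 concentrates spectral power at low frequencies, phi < 0 at high ones.
def ar1(phi, n):
    e = rng.normal(scale=np.sqrt(1 - phi ** 2), size=n)
    x = np.empty(n)
    x[0] = e[0]
    for t in range(1, n):
        x[t] = phi * x[t - 1] + e[t]
    return x

low_freq = ar1(0.9, T)    # risk concentrated at low frequencies
high_freq = ar1(-0.9, T)  # same total risk, concentrated at high frequencies

# Total (unconditional) risk is essentially identical:
print(low_freq.var(), high_freq.var())  # both close to 1

# But a myopic principal who sees the data only through a low-pass filter
# (here, a 12-period moving average) perceives very different risk levels:
k = np.ones(12) / 12
seen_low = np.convolve(low_freq, k, mode="valid").var()
seen_high = np.convolve(high_freq, k, mode="valid").var()
print(seen_low, seen_high)  # high-frequency risk is largely invisible
```

An agent rewarded on the filtered series thus has an incentive to shift power toward frequencies the filter attenuates, which is precisely the regulatory-arbitrage channel the abstract describes.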
Published Articles (Refereed Journals and Volumes)
Consumption-based asset-pricing models have experienced success in recent years by augmenting the consumption process in “exotic” ways. Two notable examples are the Long-Run Risk and rare disaster frameworks. Such models are difficult to characterize from consumption data alone. Accordingly, concerns have been raised regarding their specification. Acknowledging that both phenomena are naturally subject to ambiguity, we show that an ambiguity-averse agent may behave as if Long-Run Risk and disasters exist even if they do not or exaggerate them if they do. Consequently, prices may be misleading in characterizing these phenomena since they encode a pessimistic perspective of the data-generating process.
We study an investor who is unsure of the dynamics of the economy. Not only are parameters unknown, but the investor does not even know what order model to estimate. She estimates her consumption process nonparametrically, allowing potentially infinite-order dynamics, and prices assets using a pessimistic model that minimizes lifetime utility subject to a constraint on statistical plausibility. The equilibrium is exactly solvable and the pricing model always includes long-run risks. With risk aversion of 4.7, the model matches major facts about asset prices, consumption, and dividends. The paper provides a novel link between ambiguity aversion and nonparametric estimation.
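The generic mechanics of a pessimistic model chosen under a plausibility constraint can be sketched as follows. This is a simplified stand-in, not the paper’s exactly solvable equilibrium: we take relative entropy as the plausibility measure (an assumption) and hypothetical utility draws, and find the worst-case tilt exp(−θV) whose KL divergence from the point-estimate model exhausts a given budget.

```python
import numpy as np

rng = np.random.default_rng(2)

# Draws of lifetime utility under the investor's point-estimate model
# (hypothetical numbers, for illustration only).
V = rng.normal(loc=1.0, scale=0.5, size=100_000)

# The utility-minimizing model within a relative-entropy ball of radius eta
# around the point estimate has density proportional to exp(-theta * V);
# larger theta means more pessimism and a larger KL divergence.
eta = 0.05

def tilt(theta):
    w = np.exp(-theta * V)
    return w / w.sum()

def kl(theta):
    w = tilt(theta)
    return np.sum(w * np.log(w * len(V)))  # KL(worst case || point estimate)

# Bisect on theta so the pessimistic model exactly uses the entropy budget.
lo, hi = 0.0, 10.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if kl(mid) < eta else (lo, mid)
theta = 0.5 * (lo + hi)

w = tilt(theta)
print(np.sum(w * V), V.mean())  # pessimistic expected utility < point estimate
```

Pricing assets under the tilted weights `w` rather than the point estimate is what encodes the “pessimistic perspective of the data-generating process” into prices.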
In a real business cycle model, an agent’s fear of model misspecification interacts with stochastic volatility to induce time-varying worst-case scenarios. These time-varying worst-case scenarios capture a notion of animal spirits, in which the probability distributions used to evaluate decision rules and price assets do not necessarily reflect the fundamental characteristics of the economy. Households entertain a pessimistic view of the world, and their pessimism varies with the overall level of volatility in the economy, implying an amplification of the effects of volatility shocks. By using perturbation methods and Monte Carlo techniques, we extend the class of models analyzed with robust control methods to include the sort of nonlinear production-based DSGE models that are popular in academic research and policymaking practice.