Spring 2016

  • 2/11 Yacine Ait-Sahalia, Princeton (12-1, Skilling 193)
    “Principal Components Analysis of High Frequency Data”
    We develop the necessary methodology to conduct principal component analysis at high frequency. We construct estimators of realized eigenvalues, eigenvectors, and principal components and provide the asymptotic distribution of these estimators. Empirically, we study the high frequency covariance structure of the constituents of the S&P 100 Index using as little as one week of high-frequency data at a time. The explanatory power of the high frequency principal components varies over time. During the recent financial crisis, the first principal component becomes increasingly dominant, explaining up to 60% of the variation on its own, while the second principal component drives the common variation of financial sector stocks.
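    As a much-simplified illustration of the underlying idea (not the authors' estimators, which come with a full asymptotic theory), one can form a realized covariance matrix from a window of intraday returns and examine the share of total variation captured by the leading eigenvalue. All data and variable names below are illustrative.

```python
import numpy as np

def realized_pca(returns):
    """Eigendecompose the realized covariance of a (T, N) matrix of
    intraday returns. Returns eigenvalues in descending order and the
    fraction of total variation explained by the first principal
    component."""
    cov = returns.T @ returns              # realized covariance (sums of outer products)
    eigvals, eigvecs = np.linalg.eigh(cov) # eigh returns ascending eigenvalues
    eigvals = eigvals[::-1]                # reorder to descending
    share_first = eigvals[0] / eigvals.sum()
    return eigvals, share_first

# Toy example: 390 one-minute returns on 10 assets driven by one common factor,
# so the first principal component should dominate.
rng = np.random.default_rng(0)
factor = rng.normal(scale=0.001, size=(390, 1))
returns = factor @ np.ones((1, 10)) + rng.normal(scale=0.0005, size=(390, 10))
eigvals, share = realized_pca(returns)
```

    With a single strong common factor, the first eigenvalue explains most of the realized variation, mirroring the dominance of the first principal component the abstract documents during the crisis.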
  • 4/7 Pierre Spatz, Murex (4:30-5:30, 200-203)
    “Effects of GPU, AAD and XVA on the Future Computing Architecture of Banks”
    The 2008 crisis dramatically changed the way banks approach financial computing. While the complexity and diversity of traded products have been reduced, trading volumes and regulatory computation needs have exploded even as budgets have tightened, and we see no relief ahead. Several solutions have been implemented to cope with today’s workload, including GPUs (powerful parallel coprocessors), AAD (adjoint algorithmic differentiation), or both. All of these methods require at least a partial rewrite of existing code. We will report on our experience, examine how well each solution fits different test cases on current and future hardware, and extrapolate what the future calculation servers of banks will look like.
  • 4/14  Shota Ishii, State Street Global Exchange, GX Labs (4:30-5:30, 200-203)
    “Data Driven Analysis of Multi-Asset Class Portfolios”
    Longer-horizon, multi-asset class portfolios characterize those of large pensions and sovereign funds, which drive some of the largest flows in the market.  Meanwhile, much of financial theory from the 1950s to the 1990s was driven by examining only the behavior of equities, particularly in the United States.  The rise of high-performance computing using a broader array of market and non-market data is allowing us to create richer models for understanding the sources of performance in these portfolios.  We explore some of the tools, methodologies, and new areas of potential exploration for these complex multi-asset class portfolios through the lens of contemporary computing and data science.
  • 4/21 Charles-Albert Lehalle, Capital Fund Management (4:30-5:30, 200-203)
    “Optimal Trading”
    We will go from the role of the financial system, described as a large network of intermediaries, to a fine-grained description of high-frequency market makers. The role of regulation in the recent transformation of participants’ practices will also be discussed. The viewpoint taken is that of a practitioner or researcher who has to put models in place. Existing models will be reviewed, and new challenges and the stakes of possible improvements will be discussed. Important stylized facts and mechanisms that models should reproduce will be presented.
  • 4/28 Simon Scheidegger, Stanford Hoover and University of Zürich (4:30-5:30, 200-203)
    “Pricing American options under multi-factor models with recursive adaptive sparse expectations”
    We introduce a fast and scalable numerical framework for pricing American options under multi-factor models. It has competitive algorithmic complexity for long maturities, scales well to high-dimensional settings, and is parallelizable in order to further speed up the time-to-solution process. We approximate the transition kernel on an adaptive sparse grid and recursively apply an expectation operator to build the value function with a low number of points. Our algorithm accurately delivers the price function of the contract in the presence of discrete dividends. We implement an American put option pricer under stochastic volatility and analyze its performance.
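    The core recursion (apply an expectation operator backward in time, taking the maximum with immediate exercise at each step) can be sketched in one dimension on a dense lattice; the paper's contribution is doing this on adaptive sparse grids in high dimensions, which the toy below does not attempt. Parameter values are illustrative.

```python
import math

def american_put_binomial(S0, K, r, sigma, T, steps=500):
    """Price an American put by backward induction on a CRR binomial
    lattice: at each node, compare the discounted continuation value
    with immediate exercise. A dense one-dimensional analogue of
    recursively applying an expectation operator on a grid."""
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    p = (math.exp(r * dt) - d) / (u - d)   # risk-neutral up probability
    disc = math.exp(-r * dt)
    # Terminal payoffs at maturity: node j has j up-moves out of `steps`
    values = [max(K - S0 * u**j * d**(steps - j), 0.0) for j in range(steps + 1)]
    for n in range(steps - 1, -1, -1):
        for j in range(n + 1):
            cont = disc * (p * values[j + 1] + (1 - p) * values[j])
            exercise = max(K - S0 * u**j * d**(n - j), 0.0)
            values[j] = max(cont, exercise)    # early-exercise decision
    return values[0]

price = american_put_binomial(S0=100, K=100, r=0.05, sigma=0.2, T=1.0)
```

    The early-exercise maximum at every node is what makes the problem recursive rather than a single expectation, which is why the complexity of the grid representation matters so much in higher dimensions.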
  • 5/5 Eric Aldrich, UC Santa Cruz (4:30-5:30, 200-203)
    “The Flash Crash: A New Deconstruction”
    On May 6, 2010, in the span of a mere four and a half minutes, the Dow Jones Industrial Average lost approximately 1,000 points. In the following fifteen minutes it recovered essentially all of its losses. This “Flash Crash” occurred in the absence of fundamental news that could explain the observed price pattern and is generally viewed as the result of endogenous factors related to the complexity of modern equity market trading. We present the first analysis of the entire order book at millisecond granularity, and not just of executed transactions, in an effort to explore the causes of the Flash Crash. We also examine information flows as reflected in a variety of data feeds provided to market participants during the Flash Crash. While assertions relating to causation of the Flash Crash must be accompanied by significant disclaimers, we suggest that it is highly unlikely that, as alleged by the United States Government, Navinder Sarao’s spoofing orders, even if illegal, could have caused the Flash Crash, or that the crash was a foreseeable consequence of his spoofing activity. Instead, we find that the explanation offered by the joint CFTC-SEC Staff Report, which relies on prevailing market conditions combined with the introduction of a large equity sell order implemented in a particularly dislocating manner, is consistent with the data. We offer a simulation model that formalizes the process by which large sell orders of the sort observed in the CFTC-SEC Staff Report, combined with prevailing market conditions, could generate a Flash Crash in the absence of fundamental information. Our research also documents the emergence of heretofore unobserved anomalies in market data feeds that correlate very closely with the initiation of and recovery from the Flash Crash. Our analysis of these data feed anomalies is ongoing as we attempt to discern whether they were a symptom of the rapid trading that accompanied the Flash Crash or whether they were causal in the sense that they rationally contributed to traders’ decisions to withdraw liquidity and then restore it after the anomalies were resolved.
  • 5/12 Robert Anderson, UC Berkeley (4:30-5:30, 200-203)
    “PCA with Model Misspecification”
    The theoretical justifications for Principal Component Analysis (PCA) typically assume that the data is IID over the estimation window. In practice, this assumption is routinely violated in financial data. We examine the extent to which PCA-like procedures can be justified in the presence of two specific kinds of misspecification present in financial data: time-varying volatility, and the presence of regimes in factor loadings. Joint work with Stephen W. Bianchi (Berkeley).
  • 5/19 Mikhail Chernov, UCLA (4:30-5:30, 200-203)
    “Macroeconomic-driven Prepayment Risk and the Valuation of Mortgage-Backed Securities”
    We introduce a reduced-form modeling framework for mortgage-backed securities in which we solve for the implied prepayment function from the cross section of market prices. From the implied prepayment function, we find that prepayment rates are driven not only by interest rates, but also by two macroeconomic factors: turnover and rate response. Intuitively, turnover represents prepayments for exogenous reasons like employment-related moves, household income shocks, and foreclosures, while rate response reflects frictions faced by borrowers in refinancing into a lower rate. We find that the implied turnover and rate response measures are in fact significantly related to macroeconomic measures such as consumption growth, the unemployment rate, housing values, credit availability, and market uncertainty. Implied prepayments are substantially higher than actual prepayments, providing direct evidence of significant prepayment risk premia in mortgage-backed security prices. We analyze the properties of the prepayment risk premium and find that it is almost entirely due to compensation for turnover risk. We also find evidence that mortgage-backed security prices were significantly affected by Fannie Mae credit risk and the Federal Reserve’s Quantitative Easing Programs.
  • 5/26 Svetlana Bryzgalova, Stanford (4:30-5:30, 200-203)
    “Spurious Factors in Linear Asset Pricing Models”
    When a risk factor has small covariance with asset returns, risk premia in linear asset pricing models are no longer identified. Weak factors, similar to weak instruments, make the usual estimation techniques unreliable. When included in the model, they generate spuriously high significance levels for their own risk premia estimates, inflate overall measures of fit, and may crowd out the impact of the true sources of risk. I develop a new approach to the estimation of cross-sectional asset pricing models that: a) provides simultaneous model diagnostics and parameter estimates; b) automatically removes the effect of spurious factors; c) restores consistency and asymptotic normality of the parameter estimates, as well as the accuracy of standard measures of fit; and d) performs well in both small and large samples. I provide new insights into the pricing ability of various factors proposed in the literature. In particular, I identify a set of robust factors (e.g. the Fama-French factors, among others), and a set that suffers from severe identification problems rendering the standard assessment of their pricing performance unreliable (e.g. consumption growth, human capital proxies, and others).

Archive: Fall 2015

  • 9/24 Bruce Cahan, Stanford (4:30-5:30, Education 334)
    “Models of Ethical Leadership in Banking”
  • 10/1 Markus Pelger, Stanford (4:30-5:30, Education 334)
    “Understanding Systematic Risk: A High-Frequency Approach”
    Under a large dimensional approximate factor model for asset returns, I use high-frequency data for the S&P 500 firms to estimate the time-varying latent continuous and jump factors. I estimate four very persistent continuous systematic factors, which can be approximated very well by a market, oil, finance and electricity portfolio. There is only one persistent jump market factor. Using implied volatilities, I find one persistent market and a temporary finance volatility factor. Based on the estimated factors, I decompose the leverage effect, i.e. the correlation of the asset return with its volatility, into a systematic and an idiosyncratic component. The negative leverage effect is mainly driven by the systematic component.
  • 10/5 Peter Carr, Morgan Stanley (2:30-3:30, Packard 202) NOTE UNUSUAL TIME/ROOM
    “Analogies between Bond Yields and Implied Volatilities”
    Intuitions about bond yields can be used to model implied volatilities and vice versa. In particular, we link the shape of the yield curve to the graph of (normal) implied volatilities across strikes.
  • 10/15 Financial Technology Panel (4:30-5:30, Education 334)
    “Evolution of Global Derivatives Capital Markets: Industry Practitioner’s Panel Discussion”
    Panelists: Patrick Flannery (Co-Founder and CEO at May Street LLC), Paul Riley (Senior Software Developer at Vatic Labs LLC), and Tim Levandoski (Vice President of Business Development at Eurex Exchange)
    Once purely the realm of traditional economics and finance, today’s global capital markets continue to be influenced by a variety of new factors and approaches to trading. This soup-to-nuts panel discussion highlights nuances along the entire trade cycle from the perspective of derivatives industry insiders. From the most effective software and hardware used by latency-sensitive participants, to the various skill sets sought by today’s trading firms, to the approaches used for trading different products at distributed venues, this panel will provide valuable insights relevant for successful participation in today’s rapidly changing marketplace.
  • 10/22 Philipp Strack, UC Berkeley (4:30-5:30, Education 334)
    “Risk-Taking and Financial Competition: The impact of Fund Manager Compensation on Social Welfare”
    We analyze the welfare consequences of competition between fund managers who are paid based on their relative performance in a standard financial markets model à la Black-Scholes. In the unique symmetric equilibrium, fund managers use investment strategies that create endogenous risk. This risk is unrelated to the uncertainty generated by the assets traded in the underlying financial market. The excessive risk-taking of fund managers leads to welfare losses if investors are risk-averse. Increasing competition between fund managers always increases the endogenous risk and decreases social welfare. In examples, we show that welfare losses can be up to 40% of the total wealth invested in funds.
  • 10/26 Gustavo Schwenkler, Boston University (3:10-4:10, Hewlett 103) NOTE UNUSUAL TIME/ROOM
    “Efficient Parameter Estimation for Multivariate Jump-Diffusions”
    This paper develops an unbiased and computationally efficient Monte-Carlo estimator of the transition density of a multivariate jump-diffusion process. The drift, volatility, jump intensity, and jump magnitude are allowed to be state-dependent and non-affine. Most importantly, it is not necessary that the variance-covariance matrix can be diagonalized using a change of variable or change of time. Our density estimator facilitates the parametric estimation of multivariate jump diffusion models based on low frequency data. The parameter estimators we propose have the same asymptotic behavior as maximum likelihood estimators under mild conditions that can be verified using our density estimator. Numerical case studies illustrate our results. Joint work with François Guay.
  • 11/5 Ashby Monk, Stanford University (4:30-5:30, Education 334)
    “What’s Wrong with Finance and How to Fix It”
  • 11/19 Andrew Lo, MIT (4:30-5:30, Hewlett 200) NOTE UNUSUAL ROOM
    “Can Financial Engineering Cure Cancer?”
    Funding for biomedical innovation has been declining at the same time that breakthroughs in translational medicine seem to be occurring at ever increasing rates. One explanation for this counterintuitive state of affairs is that greater scientific knowledge in biomedicine can actually lead to greater economic risk for biopharma investors. While the impact of the Human Genome Project, high-throughput screening, and genetic biomarkers has been tremendously positive for clinicians and their patients, it has also increased the cost and complexity of the drug development process, causing investors to shift their assets to more attractive investment opportunities. In this talk, Prof. Lo will review this novel interpretation of recent trends in the biopharma industry and describe how financial engineering—portfolio theory, securitization, credit default swaps, and other tools of modern finance—can be used to reduce the risk and increase the attractiveness of biomedical innovation so as to bridge the “Valley of Death” in translational medicine.
  • 11/30 Torben Andersen, Northwestern University (2:30-3:30, Huang HIVE) NOTE UNUSUAL TIME/ROOM
    “Pricing Short-Term Market Risk: Evidence from Weekly Options”
    We study short-term market risks implied by weekly S&P 500 index options. The introduction of weekly options has dramatically shifted the maturity profile of traded options over the last five years, with a substantial proportion now having expiry within one week. Economically, this reflects a desire among investors for actively managing their exposure to very short-term risks. Such short-dated options provide an easy and direct way to study market volatility and jump risks. Unlike longer-dated options, they are largely insensitive to the risk of intertemporal shifts in the economic environment, i.e., changes in the investment opportunity set. Adopting a novel general semi-nonparametric approach, we uncover variation in the shape of the negative market jump tail risk which is not spanned by market volatility. Incidents of such tail shape shifts coincide with serious mispricing of standard parametric models for longer-dated options. As such, our approach allows for easy identification of periods of heightened concerns about negative tail events on the market that are not always “signaled” by the level of market volatility and elude standard asset pricing models.

Archive: Winter 2015

  • 2/11 Johan Walden, UC Berkeley (4:15-5:15, Y2E2 111)
    “Recovery with Unbounded Diffusion Processes”
    We analyze the problem of recovering the pricing kernel and real probability distribution from observed option prices, when the state variable is an unbounded diffusion process. We derive necessary and sufficient conditions for recovery. In the general case, these conditions depend on the properties of the diffusion process, but not on the pricing kernel. We also show that the same conditions determine whether recovery works in practice, when the continuous problem is approximated on a bounded or discrete domain without further specification of boundary conditions. Altogether, our results suggest that recovery is possible for many interesting diffusion processes on unbounded domains.
  • 2/18 Gustavo Schwenkler, Boston University (4:15-5:15, Y2E2 111)
    “The systemic externalities of benchmarking”
    We consider a market consisting of one retail investor and two institutional investors. The investment opportunities consist of a short term bond and two risky stocks whose prices may jump. Unlike the retail investor, institutional investors measure the performance of their portfolios relative to a benchmark. We provide closed-form expressions for portfolio weights and asset prices. Despite having heterogeneous investors, our institutional investors end up constructing similar portfolios. They borrow from the retail investor and invest strongly in the benchmarked stock, increasing the share of institutional holdings in the benchmarked stock. This effect distorts the risk profiles of all assets. We analyze the implications of our results for value-at-risk of each investor and the aggregate, and establish an optimal rule for benchmarking. We also provide empirical evidence in support of our findings.
  • 2/25 Alireza Tahbaz-Salehi, Columbia Business School (4:15-5:15, Y2E2 111)
    “Financial Intermediation Networks”
    We study a dynamic model of financial intermediation in which interbank lending is subject to moral hazard, where intermediaries can divert funds towards inefficient projects. We show that despite the presence of moral hazard, secured lending contracts can discipline the investment choices of all market participants — even those with whom they are not directly contracting — thus partially overcoming market frictions. Our results provide a characterization of the relationship between the intermediation capacity of the system on the one hand, and the extent of moral hazard, the distribution of collateral and the network of interbank relationships on the other. We use this characterization to show that due to the recursive nature of the moral hazard problem, small changes in fundamentals may result in significant drops in the financial system’s intermediation capacity, leading to a complete credit freeze. This is joint work with Marco Di Maggio.
  • 3/4 Samim Ghamami, Federal Reserve Board  (4:15-5:15, Y2E2 111)
    “Derivatives Pricing under Bilateral Counterparty Default Risk: Path-Independent Probabilistic Valuation”
    We consider the problem of valuing a contingent claim under bilateral counterparty default risk in a reduced-form setting similar to that of Duffie and Huang [1996] and Duffie and Singleton [1999]. The probabilistic valuation formulas derived under this framework usually cannot be used for practical pricing due to their recursive path dependencies. Instead, finite-difference type methods are used to solve the quasi-linear partial differential equations that equivalently represent the claim value function. By restricting how the risk-free rate and the two credit spreads depend on the underlying uncertainty, we develop path-independent probabilistic valuation formulas that have closed-form solutions or can lead to computationally more efficient pricing schemes. In our framework the claim can be efficiently valued under so-called wrong-way risk. Our proposed modeling of wrong-way risk is more insightful than the widely used incorporation of wrong-way risk in credit value adjustment (CVA) calculations driven by expected-discounted-loss type formulas, which do not represent the market price of counterparty credit risk. This is joint work with Peter Carr.
  • 3/11 Agostino Capponi, Columbia University (4:15-5:15, Y2E2 111)
    “Price Contagion through Balance Sheet Linkages”
    We study price linkages between assets held by financial institutions that are required to maintain fixed capital requirements over time.  We consider a market consisting of two sectors: banking and nonbanking. Firms in the banking sector actively manage their leverage ratios to conform with pre-specified target levels. The nonbanking sector consists of institutions, such as mutual funds, money market funds, insurance companies, and pension funds, that do not actively target a fixed leverage. We show that banks’ deleveraging activities may amplify asset return shocks and lead to large fluctuations in realized returns. They can cause spill-over effects, where assets held by leverage-targeting banks can experience hikes or drops caused by shocks to otherwise unrelated assets held by the same banks. Our analysis thus suggests that regulatory policies aimed at stabilizing the financial system by imposing capital constraints on banks may have unintended consequences. Fire-sale externalities are produced if leverage-targeting banks become too large relative to the nonbanking sector, as measured by elasticity-weighted assets. We show that these effects can be mitigated by encouraging banks to implement asset allocation strategies with higher exposure to liquid, rather than illiquid, assets.

Archive: Autumn 2014

  • 9/11 Jin-Chuan Duan, National University of Singapore (11:00-12:00, 103 Littlefield)
    “Actuarial Spread with Applications”
    The Actuarial Spread (AS) is a CDS-spread-like measure summarizing the information embedded in the term structure of physical probabilities of default (PDs). The AS is constructed as if it were a CDS premium rate for risk-neutral market participants and relies only on a term structure of physical PDs in conjunction with a recovery rate assumption. In short, it is a quantity computed solely on the basis of the actuarial valuation principle. Considering that mergers and acquisitions occur at a rate many times higher than that of defaults and bankruptcies, the AS design factors in both default and other-exit probabilities. In addition, succession upon the exit of a reference obligor is explicitly treated. The term structure of default and other-exit probabilities needed for the AS can be estimated with the forward-intensity corporate default prediction model of Duan, Sun and Wang (2012). With the AS in place, we analyze a year-long time series of daily CDS spreads for Eastman Kodak prior to its Chapter 11 bankruptcy filing on January 19, 2012. The results suggest that the log-ratio of the CDS spread over its corresponding AS is highly predictable by its lagged value, and this predictive relationship forms a good basis for empirical pricing of CDS by decomposition. I contend that the AS is a better risk benchmarking tool because (1) liquid CDS contracts are quite limited and greatly subject to supply-demand imbalance, and (2) the CDS price contains a risk premium, making it an indirect measure of physical default risk. The daily updated AS and PD for a range of tenors and on over 60,000 exchange-listed firms in 106 economies are made available as a “public good” by the Credit Research Initiative (CRI) at the Risk Management Institute of the National University of Singapore. I will briefly introduce this comprehensive resource.
  • 9/24 Stephan Ludwig, d-fine (4:15-5:15, Mitchell B67)
    “Change of financial networks through central clearing and collateralized banking”
    The introduction of central clearing through the Dodd-Frank Act in the US and EMIR in the EU fundamentally changes the structure of interbank networks and risk management. Driven by the shock of bank defaults during the financial crisis, regulators implemented stabilizing components to absorb single defaults and enable more transparency. This talk presents a few fundamental changes brought about by central clearing. The focus is on introducing the methods of margining, collateral management, and default funds in order to ask: how does central clearing impact financial stability, and at what cost? The goal is to provide input for creating new network models for one of the most-discussed topics in today’s financial markets.
  • 10/1 Christopher Jones, Chief Investment Officer, Financial Engines (4:15-5:15, Mitchell B67)
    “Financial Engines and the scale economics of personalized wealth management”
    Christopher Jones, Chief Investment Officer and founding employee of Financial Engines, discusses the role of advanced simulation and optimization techniques in enabling cost-effective, personalized investment advice and management for individual investors of modest means. Founded in 1996 by Nobel Laureate William Sharpe and former SEC Commissioner and Stanford Law professor Joseph Grundfest, Financial Engines is now the largest independent registered investment advisor (RIA) in the country. The firm provides investment advice and management to more than 1MM individual investors and manages approximately $100B on a discretionary basis. Jones will discuss the technology behind Financial Engines and how the company grew to its current market position. Mr. Jones received master’s degrees in Business Technology and Engineering-Economic Systems as well as a bachelor’s degree in Economics from Stanford University.
  • 10/8 Anders Trolle, EPFL Lausanne (4:15-5:15, Mitchell B67)
    “Liquidity Risk in Credit Default Swap Markets”
    We show that liquidity risk is priced in the cross section of returns on credit default swaps (CDSs). Liquidity risk is defined as covariation between CDS returns and a liquidity factor that captures innovations to CDS market liquidity. Market-wide CDS illiquidity is measured by aggregating deviations of credit index levels from their no-arbitrage values implied by the index constituents’ CDS spreads, and the liquidity factor is the return on a diversified portfolio of index arbitrage strategies. Liquidity risk increases CDS spreads and the expected excess returns earned by sellers of credit protection. Our benchmark model implies that liquidity risk accounts for approximately 20% of CDS spreads, on average.
  • 10/15 Romuald Elie, University Paris-Est (4:15-5:15, Mitchell B67)
    “A stochastic control approach to pricing and hedging in incomplete markets”
    We will see how mathematical tools from optimal stochastic control theory allow one to compute the indifference utility or the super-hedging price of a claim in an incomplete market. In particular, we will focus on the impact of portfolio constraints, such as short-sale prohibitions, in multidimensional local volatility models. In the one-dimensional case, we will observe that super-hedging a claim simply boils down to replicating a proper ‘facelift’ transform of the claim. We will also provide alternatives to very costly super-hedging prices by computing quantile hedging prices in a dynamically consistent manner.
  • 10/22 James Patterson, Head of Capital One Labs (4:15-5:15, Mitchell B67)
    “Inventing the Future of Money”
    Throughout history, technology has played a transformational role in how people relate to and use money.  We are living in another time of great transformation as money moves from atoms to bits.  But many other factors beyond technology will separate the winners from the losers.  James Patterson, head of Capital One Labs, the experimental product and technology arm of Capital One, will share his experiences on reimagining the future of money.
  • 11/5 Lisa Goldberg, UC Berkeley (4:15-5:15, Mitchell B67)
    “The Cost of Financing US Equity Through Futures”
    Investors can lever through futures markets without explicitly borrowing money.  In this empirical study, we investigate the historical attractiveness of futures financing compared to explicit borrowing.  Specifically, we estimate the spread between the interest rate implicit in US equity futures (FIR) over the Eurodollar Deposit Rate (EDR) between January 1996 and August 2013.  Prior to the enactment of the Commodity Futures Modernization Act (CFMA) on December 21, 2000, the spread was positive and statistically significant, suggesting that financing through futures was unattractive during that period.   Since the enactment of the CFMA, the spread has been negative and statistically significant, suggesting that financing leverage through futures has been more attractive than through borrowing for more than a decade.  Notably, the level of the FIR-EDR spread differed materially during the pre-crisis, crisis, and post-crisis regimes.  The FIR had a positive and statistically significant spread over the T-bill rate throughout the period, indicating that financing leverage through futures is not free. This is joint work with Robert Anderson and Nicolas Gunther.
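    The implicit financing rate can be illustrated with the textbook cost-of-carry relation F = S·exp((r − q)·T), inverted for r; the paper's FIR estimation is an empirical exercise well beyond this identity. The numbers and the benchmark rate below are hypothetical.

```python
import math

def futures_implied_rate(futures_price, spot, dividend_yield, maturity_years):
    """Back out the financing rate implicit in an equity index futures
    price from the cost-of-carry relation F = S * exp((r - q) * T),
    i.e. r = ln(F / S) / T + q. A stylized approximation, not the
    paper's estimation procedure."""
    return math.log(futures_price / spot) / maturity_years + dividend_yield

# Hypothetical inputs: spot 2000, 2% dividend yield, 3-month future at 2010.
fir = futures_implied_rate(2010.0, 2000.0, 0.02, 0.25)
# Comparing against a (hypothetical) 1.5% deposit rate gives the sign of the
# spread the abstract studies: positive means futures financing is expensive.
spread_over_benchmark = fir - 0.015
```

    In the abstract's terms, a negative FIR-EDR spread, as observed post-CFMA, would mean the implied rate comes out below the deposit rate, making leverage via futures cheaper than explicit borrowing.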
  • 11/12 Gerry Tsoukalas, Wharton  (4:15-5:15, Mitchell B67)
    “Operationalizing Financial Covenants”
    We study the interplay between financial covenants and the operational decisions of a firm that obtains financing through a secured (asset-based) lending contract. While it is widely held that covenants serve to protect lenders, the specific ways in which a borrowing firm can adapt its operations in response have not been studied. We characterize the product market conditions, involving demand distribution, growth potential, profit margin, and product depreciation rate, under which covenants are necessary, and argue that these are routinely met in practice. Furthermore, we show that covenants are not substitutable by other contractual terms, such as interest rates and loan limits, and provide operational insights for their optimal design. We discuss when covenants ensure that system-optimal decisions are taken in equilibrium, and show that operational flexibility can impact their effectiveness in a surprising, non-monotonic way.

Archive: Spring 2014

  • 5/28 Ronnie Sircar, Princeton University (4:15-5:15, 380-380Y)
    “Energy Production and Mean Field Games”
    One way to view energy markets is as competition between producers from different fuels and technologies with markedly varied characteristics.  For instance, oil is relatively cheap to extract, but in diminishing supply and polluting. Solar power is more expensive to set up, but essentially inexhaustible and clean. We construct dynamic oligopoly models of competition between heterogeneous energy producers to try and understand how the changing landscape may affect energy prices and supply. We discuss how continuous time Bertrand and Cournot competitions, in which firms producing similar goods compete with one another by setting prices or quantities respectively, can be analyzed as continuum dynamic mean field games under the constraint of finite supplies (or exhaustible resources). The continuum game is characterized by a coupled system of partial differential equations: a backward HJB PDE for the value function, and a forward Kolmogorov PDE for the density of players. We find that, in accordance with the two-player game, a large degree of competitive interaction causes firms to slow down production. The continuum system can therefore be used as an effective approximation to even small player dynamic games.
  • 5/21 Kenneth Judd, Stanford University (4:15-5:15, 380-380Y)
    “Numerical methods for solving dynamic portfolio problems”
    Numerical methods for optimization, approximation, and quadrature are combined to solve multi-asset portfolio problems with transaction costs. Problems with portfolios of bonds, stocks, and options are solved, and error bounds for the computed solutions are small. In the absence of transaction costs, options are redundant securities and their creation has no value. A common claim is that options are socially valuable if there are transaction costs (including bid-ask spreads) in asset markets. We apply our methods to estimate the social value of introducing options in the presence of transaction costs. This is joint work with Yongyang Cai.
  • 5/14 Viktor Todorov, Northwestern University (4:15-5:15, 380-380Y)
    “The Risk Premia Embedded in Index Options”
    We study the dynamic relation between aggregate stock market risks and risk premia via an exploration of the time series of equity-index option surfaces. The analysis is based on estimating a general parametric asset pricing model for the risk-neutral equity market dynamics using a panel of options on the S&P 500 index, while remaining fully nonparametric about the actual evolution of market risks. We find that the risk-neutral jump intensity, which controls the pricing of left tail risk, cannot be spanned by the market volatility (and its components), so an additional factor is required to account for its dynamics. This tail factor has no incremental predictive power for future equity return volatility or jumps beyond what is captured by the current and past level of volatility. In contrast, the novel factor is critical in predicting the future market excess returns over horizons up to one year, and it explains a large fraction of the future variance risk premium. We contrast our findings with those implied by structural asset pricing models that seek to rationalize the predictive power of option data. Relative to those studies, our findings suggest a wider wedge between the dynamics of equity market risks and the corresponding risk premia with the latter typically displaying a far more persistent reaction following market crises.
  • 4/23 Paul Glasserman, Columbia Business School (4:15-5:15, 380-380Y)
    “Robust Monte Carlo, Model Risk, and Counterparty Risk”
    Simulation methodology has traditionally focused on measuring and reducing sampling error in simulating well-specified models; it has given less attention to quantifying the effect of model error or model uncertainty. But simulation actually lends itself well to bounding this sort of model risk. In particular, if the set of alternative models consists of all models within a certain “distance” of a baseline model, then the potential effect of model risk can be estimated at low cost within a simulation of the baseline model. I will illustrate this approach to making Monte Carlo robust with examples from finance, where concerns about model risk have received heightened attention. The problem of bounding “wrong-way risk” in counterparty risk presents a related question in which model uncertainty is limited to the nature of the dependence between two otherwise certain marginal models for market and credit risk. The effect of uncertain dependence can be bounded through a convenient combination of simulation and optimization. This talk is based on work with Xingbo Xu and Linan Yang.
  • 4/16 Timothy Levandoski, Eurex (4:15-5:15, 380-380Y)
    “The Role of an International Derivatives Exchange: Futures & Options Products and Trading Technology”
    We will introduce the exchange-traded derivatives markets and discuss the role of an international exchange in terms of products and technology. We will begin with an overview of Eurex as one of the largest derivatives exchanges in the world, then compare the similarities and differences between the U.S. and European market models. Further topics include the international nature of the global financial markets; current issues such as financial reform and high-frequency trading; and trading strategies that market participants implement to address the risk-reward trade-off.
  • 4/9 Stefan Weber, Leibniz Universität Hannover (4:15-5:15, 380-380Y)
    “Distribution-based Risk Measures and Their Implementation”
    Banks and insurance companies typically use distribution-based risk measures for the evaluation of their downside risks. The statistical and numerical properties of these functionals are thus important. Recently, some authors emphasized the significance of the elicitability of risk measures, a notion closely related to Huber’s M-estimators and quantile regression. The talk characterizes elicitable distribution-based risk measures, analyzes their generalized Hampel-robustness, and explains their relationship to stochastic approximation theory.
  • 4/3 Jaksa Cvitanic, Caltech (4:15-5:15, Hewlett 101)
    “Moral Hazard in Dynamic Risk Management”
    We consider a contracting problem in which a principal hires an agent to manage a risky project. When the agent chooses volatility components of the output process and the principal observes the output continuously, the principal can compute the quadratic variation of the output, but not the individual components. This leads to moral hazard with respect to the risk choices of the agent. Using a very recent theory of singular changes of measures for Ito processes, we formulate the principal-agent problem in this context and solve it in the case of CARA preferences. In that case, the optimal contract is linear in the contractible sources of risk: the output, the quadratic variation of the output, and the cross-variations between the output and the contractible risk sources. Thus, path-dependent contracts naturally arise when there is moral hazard with respect to risk management. We also provide comparative statics via numerical examples. This is joint work with N. Touzi and D. Possamai.
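    The distance-based model-risk bound in the Glasserman abstract above (4/23) can be illustrated with a small sketch. As assumptions of ours (not details from the talk): we take relative entropy as the “distance” and obtain the worst-case alternative model by exponentially tilting the baseline samples, so that both the worst-case expected loss and the implied distance come out of a single baseline simulation.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)  # losses simulated under the baseline model P

def tilted_mean(x, theta):
    """Expected loss under the exponentially tilted alternative model Q,
    together with the relative entropy KL(Q || P) that Q sits from the
    baseline -- both estimated from the same baseline samples."""
    w = np.exp(theta * (x - x.max()))  # unnormalized likelihood ratio, stabilized
    w /= w.mean()                      # normalize so that E_P[w] = 1
    mean_q = (w * x).mean()            # E_Q[loss] via importance weighting
    kl = (w * np.log(w)).mean()        # KL(Q || P) = E_P[w log w]
    return mean_q, kl

# Sweeping theta traces out worst-case expected losses as a function of the
# model-risk budget, at the cost of one simulation of the baseline model.
for theta in (0.0, 0.25, 0.5):
    m, kl = tilted_mean(x, theta)
    print(f"theta={theta:4.2f}  worst-case mean={m:6.3f}  KL={kl:6.4f}")
```

    At theta = 0 the alternative coincides with the baseline (zero distance); larger theta buys a larger worst-case mean at a larger relative-entropy distance.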

Archive: Winter 2014

  • 3/12 Damir Filipovic, EPFL Lausanne (4:15-5:15, Sequoia 200)
    “Linear-Rational Term Structure Models”
    We introduce the class of linear-rational term structure models, where the state price density is modeled such that bond prices become linear-rational functions of the current state. This class is highly tractable with several distinct advantages: i) ensures non-negative interest rates, ii) easily accommodates unspanned factors affecting volatility and risk premia, and iii) admits analytical solutions to swaptions. For comparison, affine term structure models can match either i) or ii), but not both simultaneously, and never iii). A parsimonious specification of the model with three term structure factors and at least two unspanned factors has a very good fit to both interest rate swaps and swaptions since 1997. In particular, the model captures well the dynamics of the term structure and volatility during the recent period of near-zero interest rates. (Download Paper)
  • 3/5 Rafael Mendoza-Arriaga, UT Austin (4:15-5:15, Sequoia 200)
    “Random Time Changes in Quantitative Finance”
    Subjecting a stochastic process to a random time change is a classical technique in stochastic analysis. In this talk we survey our recent work on using random time changes as a flexible model-building tool for designing stochastic processes that possess rich features required for empirical realism in financial modeling. These features include state-dependent and time-inhomogeneous jumps and stochastic volatility. Moreover, our modeling framework offers analytical and computational tractability which are required for operational purposes. We sketch applications across credit, commodity, equity, power generation systems and insurance.
  • 2/26 Hemant Shah, Risk Management Solutions (joint with ETL Seminar) (4:30-5:30, NVIDIA Auditorium, Huang)
    Since spinning-out RMS from Stanford in 1989, Hemant Shah has built the company to over 1,000 employees providing innovative catastrophe modeling solutions to the insurance and financial services sectors. Join us as Shah discusses his entrepreneurial path and insights on growing a company from the ground up.
  • 2/19 Samim Ghamami, Federal Reserve Board (4:15-5:15, Huang 305)
    “Static Models of Central Counterparty Risk”
    The 2009 G20 clearing mandate has increased the importance of central counterparty (CCP) risk management substantially. International standard setting bodies have outlined a set of principles for CCP risk management; they have also devised CCP risk capital requirements on clearing members for their central counterparty exposures. There is still no consensus among CCP regulators, bank regulators, and CCPs on how central counterparty risk should be measured coherently in practice. A conceptually sound and coherent definition of the CCP risk capital in the absence of a unifying CCP risk measurement framework may not be possible. Incoherent CCP risk capital requirements can create an obscure environment for central clearing that may subsequently disincentivize the clearing mandate. Based on novel applications of well-known mathematical models in finance, this paper introduces a model-based framework for the default waterfall resources of typical derivatives CCPs. The proposed CCP risk measurement framework can be viewed as a common ground for CCPs, their direct clearing members, bank regulators, and CCP regulators. It can be used for a risk-sensitive, model-based definition of CCP risk capital based on which less risk-sensitive methods can be developed and evaluated.
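    The linear-rational construction in the Filipovic abstract above (3/12) can be written out schematically. The notation below is ours, following the generic form of such models: a factor process whose conditional mean is affine, and a state price density that is linear in the factors.

```latex
% Schematic linear-rational model (illustrative notation, not the paper's):
% state price density linear in the factor process Z_t,
\zeta_t = e^{-\alpha t}\bigl(\phi + \psi^{\top} Z_t\bigr),
\qquad
\mathbb{E}_t[Z_T] = \theta + e^{-\kappa (T-t)}\,(Z_t - \theta),
% so that zero-coupon bond prices are linear-rational in the current state:
P(t,T) \;=\; \frac{\mathbb{E}_t[\zeta_T]}{\zeta_t}
       \;=\; e^{-\alpha (T-t)}\,
         \frac{\phi + \psi^{\top}\,\mathbb{E}_t[Z_T]}{\phi + \psi^{\top} Z_t}.
```

    Because the conditional mean of the factors is affine in the current state, the bond price is a ratio of two functions linear in Z_t, which is where the tractability claimed in the abstract comes from.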

Archive: Fall 2013

  • 11/15 Tom Hurd, McMaster University
    “Illiquidity and Insolvency Cascades in the Interbank Network”
    The great crisis of 2007-08 and the ongoing Euro crisis have highlighted the need for better mathematical and economic understanding of financial systemic risk. Are there “toy models” of systemic risk that are amenable to an exact probabilistic analysis? How do these models work, how useful are they, and what are some of the conclusions that can be drawn from them? As an illustration of some of the complex issues that can be addressed, I will show how to obtain results on large graph asymptotics for systemic risk in a model in which two kinds of contagion, insolvency and illiquidity, act in opposite directions in the network.
  • 11/6 Darrell Duffie, Stanford GSB
    “The Design of Libor and Other Interest Rate Benchmarks”
    Libor is a global system of interest rate benchmarks that are referenced in financial contracts whose total notional amount exceeds 300 trillion dollars. (Yes, that is trillion, not billion.) Because Libor has been mis-reported in various attempts to manipulate financial markets, new interest rate benchmarks, and new methods for estimating benchmarks based on market transactions, are being developed. This talk will discuss the benchmark design problem, drawing on work in progress for the Financial Stability Board. The modeling issues involve both economic theory and statistics.
  • 11/4 Nan Chen, Chinese University of Hong Kong
    “Interconnected Balance Sheets, Market Liquidity, and the Amplification Effects in a Financial System”
    This paper investigates two amplification effects through which a financial system can turn individual defaults into a systemic catastrophe. In our model, financial institutions are interconnected via two mutually reinforcing channels: their balance sheets are linked directly by holding debt claims against each other, and they share the market liquidity used to liquidate assets to meet debt liabilities when they face distress. Formulating the model as an optimization problem with equilibrium constraints, we characterize how the topological structure of the system and asset liquidation interact to amplify systemic risk. Two multipliers, a network multiplier and a liquidity multiplier, are identified in this work to capture these amplification effects. The model has a significant computational advantage and can be solved efficiently through a fixed-point algorithm based on linear complementarity techniques. This research also builds a close connection between the study of financial systemic risk and the literature on stochastic networks. Furthermore, we discuss some policy implications drawn from our numerical experiments on data from the European Banking Authority’s 2011 stress test. This is a joint work with David D. Yao (Columbia) and Xin Liu (CUHK).
  • 10/30 Stan Uryasev, University of Florida
    “The Fundamental Risk Quadrangle in Risk Management, Optimization, and Statistical Estimation”
    This presentation discusses the “Fundamental Quadrangle of Risk” framework including basic mathematical objects: Errors, Regrets, Risks, and Deviations. This framework suggests a consistent approach for defining and optimizing stochastic functions in various application areas. Random variables that stand for cost, loss or damage must be confronted in numerous situations. Dealing with them systematically in risk management, optimization and statistics is the theme of this presentation, which brings together ideas coming from many different areas. Measures of risk can be used to quantify the hazard in a random variable by a single value. Such quantification of risk can be portrayed on a higher level as generated from penalty-type expressions of “regret” about the mix of potential outcomes. A trade-off between an up-front level of hazard and the uncertain residual hazard underlies that derivation. Regret is the mirror image of utility, a concept for dealing with gains instead of losses. Measures of error can associate with any hazard variable a “statistic” along with a “deviation” which quantifies the variable’s nonconstancy. Measures of deviation, on the other hand, are paired closely with measures of risk exhibiting “aversity.” A direct correspondence can furthermore be identified between measures of error and measures of regret. The Fundamental Quadrangle of Risk puts all of this together in a unified scheme.
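    As a concrete instance of the quadrangle described in the Uryasev abstract above (10/30), the “quantile quadrangle” pairs CVaR (the risk) with the quantile (the statistic) through a regret-type minimization formula. A minimal numerical sketch, with sample size, grid, and variable names chosen by us for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
losses = rng.standard_normal(20_000)  # stand-in loss distribution
alpha = 0.95

# Rockafellar-Uryasev formula: CVaR_a(X) = min_c { c + E[(X - c)+] / (1 - a) }.
# The minimizing c is the a-quantile of X -- the "statistic" of the quadrangle.
grid = np.linspace(0.0, 3.0, 301)
excess = np.maximum(losses[None, :] - grid[:, None], 0.0)
objective = grid + excess.mean(axis=1) / (1.0 - alpha)

c_star = grid[objective.argmin()]  # approximately VaR_a (the quantile statistic)
cvar = objective.min()             # CVaR_a (the risk measure)
print(f"VaR_{alpha}: {c_star:.3f}   CVaR_{alpha}: {cvar:.3f}")
```

    The same one-dimensional minimization thus delivers both corners of the quadrangle at once: its minimizer is the quantile statistic and its minimum value is the CVaR risk.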