Risk management is a foundational discipline for the prudent conduct of investment management, and remaining effective requires ongoing evolution and adaptation. In The World of Risk Management, an expert team of contributors that includes Nobel laureates Robert C. Merton and Harry M. Markowitz addresses the important issues arising in the practice of risk management.
A common thread among these distinguished articles is a rigorous theoretical or conceptual basis. Illustrated with full-color figures throughout, they discuss topics ranging from broad policy considerations to detailed how-to prescriptions, providing professionals and academics with useful, practical implementations.
Chapter Summary - This paper explores a functional approach to financial system design in which financial functions, rather than institutions, are the “anchors” of such systems, and the institutional structure of each system and its changes are determined within the theory. It offers a rudimentary synthesis of the neoclassical, neoinstitutional, and behavioral perspectives on finance to describe a process that drives changes in the institutional structures of financial systems over time and to explain their differences across geopolitical borders. The theory holds that within an existing institutional structure, when transaction costs or dysfunctional financial behavioral patterns cause equilibrium asset prices and risk allocations to depart significantly from those in the “frictionless,” rational-behavior neoclassical model, new financial institutions, financial markets, and supporting infrastructure such as regulatory and accounting rules evolve that tend to offset the resulting inefficiencies. Thus, market frictions and behavioral finance predictions, along with technological progress, are central to explaining financial system design and predicting its future evolution. However, in the longer-run equilibrium, after offsetting institutional structures have had time to develop, the predictions of the neoclassical model, albeit as a reduced form, will be approximately valid for asset prices and resource allocations.
Chapter Summary - Although liquidity has long been recognized as one of the most significant drivers of financial innovation, the collapse of several high-profile hedge funds such as Askin Capital Management in 1994 and Long-Term Capital Management in 1998 has refocused the financial industry on the importance of liquidity in the investment management process. Many studies have made considerable progress in defining liquidity, measuring the cost of immediacy and price impact, deriving optimal portfolio rules in the presence of transaction costs, investigating the relationship between liquidity and arbitrage, and estimating liquidity risk premia in the context of various partial and general equilibrium asset-pricing models. However, relatively little attention has been paid to the more practical problem of integrating liquidity directly into the portfolio construction process. In this paper, we attempt to remedy this state of affairs by modeling liquidity using simple measures such as trading volume and percentage bid/offer spreads, and then introducing these measures into the standard mean–variance portfolio optimization process to yield optimal mean–variance–liquidity portfolios. We begin by proposing several measures of the liquidity of an individual security, from which we define the liquidity of a portfolio as the weighted average of the individual securities’ liquidities. Using these liquidity metrics, we then construct three types of “liquidity-optimized” portfolios.
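The three liquidity-optimized constructions are not detailed in this summary, but the weighted-average portfolio liquidity measure, and one natural way to fold it into the mean–variance objective as a reward term, can be sketched as follows. This is a minimal illustration with hypothetical function names and parameters, not the authors' implementation:

```python
import numpy as np

def portfolio_liquidity(weights, liquidity):
    # Portfolio liquidity defined as the weighted average of the
    # individual securities' (normalized) liquidity measures.
    return float(np.dot(weights, liquidity))

def mv_liquidity_weights(mu, cov, liq, risk_aversion=4.0, liq_pref=0.5):
    # Hypothetical sketch: maximize  w'mu - (risk_aversion/2) w'Σw + liq_pref * w'liq.
    # The unconstrained optimum is Σ^{-1}(mu + liq_pref*liq)/risk_aversion;
    # we then clip to long-only positions and rescale to sum to one.
    raw = np.linalg.solve(cov, mu + liq_pref * liq) / risk_aversion
    raw = np.clip(raw, 0.0, None)
    return raw / raw.sum()
```

In practice each security's liquidity metric (e.g., trading volume, or the negative of its percentage bid/offer spread) would first be normalized to a common scale such as [0, 1] so the liquidity-preference parameter is comparable across universes.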
Chapter Summary - The purpose of this paper is to provide an overview of some of the risk management techniques currently in use and then to propose the corporate model approach to managing the enterprise risks of the firm. Section 1 reviews the current practices considered most effective in risk management for life insurers. In a similar fashion, Section 2 describes the practices for property/casualty insurers. Section 3 discusses the challenges these current practices face in the present environment and describes the corporate model approach to dealing with them. Finally, Section 4 contains the conclusions.
Chapter Summary - The risk surrounding the market’s rate of return (change in dollar value divided by initial dollar value) is roughly stationary across time. To maintain constant dollar risk, investors concerned with their terminal wealth must sell when the stock market rises and buy when it falls. This frequent trading is probably the reason why few investors have tried to time diversify. Consider an asset whose dollar gains and losses are in one-to-one correspondence with the stock market’s rate of return: if the risk surrounding the latter is indeed stationary across time, then the risk surrounding the former will also be stationary. Using this principle and elementary calculus, we derive such an asset. Although an asset with constant dollar risk does not exist in nature, it can be approximated with actual investment positions. The key to the approximation is the fact that a diversified asset’s beta expresses a power relation between its value and the market level.
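The elementary calculus is not shown in this summary; a plausible reconstruction, assuming the stated one-to-one correspondence between dollar gains and the market's rate of return, is:

```latex
% Dollar gains in one-to-one correspondence with the market's rate of return:
dV \;=\; k\,\frac{dM}{M}
\quad\Longrightarrow\quad
V(M) \;=\; k\ln M + c .

% By contrast, a diversified asset with constant beta satisfies
\frac{dV}{V} \;=\; \beta\,\frac{dM}{M}
\quad\Longrightarrow\quad
V(M) \;\propto\; M^{\beta} ,
```

which is the power relation mentioned in the summary: combinations of constant-beta positions can therefore be chosen to approximate the logarithmic value profile, whose dollar response to market moves is independent of the market level.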
Chapter Summary - Optimal portfolio choice is the central problem of equity portfolio management, asset allocation, and financial planning. Common optimality criteria such as the long-term geometric mean, utility function estimation, and return probability objectives have important theoretical or practical limitations. A portfolio choice framework consisting of resampled efficient portfolios and geometric mean analysis is a practical alternative for many situations of investment interest. Mean–variance optimization, the typical framework for defining an efficient portfolio set in practice, is sensitive to estimation error and exhibits poor out-of-sample performance characteristics. Resampled efficiency, a generalization of mean–variance efficiency, improves out-of-sample performance on average and has important additional practical benefits. Geometric mean analysis gives the distribution of the multiperiod financial consequences of single-period efficient investments, allowing investors to visualize the tradeoffs between risk and return and to assess an appropriate level of risk. While Monte Carlo financial planning is a more flexible framework, geometric mean analysis may be less error prone, theoretically justifiable, and convenient. Controversies that have limited geometric mean analysis applications are resolvable by improved understanding of distributional properties and rational decision-making issues.
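The core idea of resampled efficiency (re-estimate the inputs on resampled histories, optimize each time, and average the resulting weights) can be sketched as follows. This is a simplified bootstrap illustration with hypothetical parameters; Michaud's actual procedure involves constraints and rank-association details not reproduced here:

```python
import numpy as np

def resampled_weights(returns, risk_aversion=4.0, n_draws=200, seed=0):
    # Sketch of weight resampling: bootstrap the return history, re-estimate
    # mean and covariance, solve the mean-variance problem for each draw,
    # and average the optimal weights across draws.
    rng = np.random.default_rng(seed)
    t, n = returns.shape
    ws = []
    for _ in range(n_draws):
        sample = returns[rng.integers(0, t, size=t)]    # bootstrap resample
        mu = sample.mean(axis=0)
        cov = np.cov(sample, rowvar=False) + 1e-8 * np.eye(n)  # ridge for stability
        w = np.linalg.solve(cov, mu) / risk_aversion    # unconstrained MV optimum
        w = np.clip(w, 0.0, None)                       # crude long-only restriction
        ws.append(w / w.sum() if w.sum() > 0 else np.full(n, 1.0 / n))
    return np.mean(ws, axis=0)
```

Averaging across draws smooths the extreme corner allocations that single-sample mean–variance optimization tends to produce, which is the mechanism behind the improved average out-of-sample behavior described above.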
Chapter Summary - This paper explores a novel algorithm for the pricing of derivative securities. There are now hundreds of different types of derivative securities, each with its own peculiar characteristics. Yet no single approach works for every type of contract, and indeed the literature in finance is replete with a vast number of different pricing models. The goal in this paper is to propose a novel pricing model tailored to some derivatives of more recent interest, for which dominant models do not as yet exist. The algorithm is based on a Markov chain Monte Carlo approach, developed in a different context by Sinclair and Jerrum (1989). While the use of Monte Carlo methods is well established for pricing derivatives, our approach differs in several respects: it uses backtracking to prevent the accumulation of errors in importance sampling; it has rigorously provable error bounds; and it is, in principle, applicable to derivative pricing on any nonrecombining lattice. In addition to describing the algorithm, we also present some initial experimental results that illustrate its application to a simple barrier option pricing problem.
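The MCMC-with-backtracking algorithm itself is beyond the scope of a summary, but the kind of barrier option problem it is applied to can be priced with a plain Monte Carlo baseline. The sketch below, with hypothetical parameters, is the standard simulation approach, not the paper's algorithm:

```python
import numpy as np

def down_and_out_call_mc(s0, k, barrier, r, sigma, t,
                         n_steps=252, n_paths=20000, seed=0):
    # Price a down-and-out call under geometric Brownian motion: simulate
    # paths and zero out the payoff of any path that touches the barrier
    # at one of the discrete monitoring dates before expiry.
    rng = np.random.default_rng(seed)
    dt = t / n_steps
    z = rng.standard_normal((n_paths, n_steps))
    log_paths = np.cumsum((r - 0.5 * sigma**2) * dt
                          + sigma * np.sqrt(dt) * z, axis=1)
    s = s0 * np.exp(log_paths)
    alive = s.min(axis=1) > barrier                 # knocked out if barrier hit
    payoff = np.where(alive, np.maximum(s[:, -1] - k, 0.0), 0.0)
    return float(np.exp(-r * t) * payoff.mean())
```

Error bounds for such a plain estimator are only statistical (standard errors), which is precisely the gap the paper's rigorously bounded MCMC approach is meant to address.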
Chapter Summary - Many practitioners are bewildered by the fact that the ex post active risks of their portfolios are often significantly higher than the ex ante tracking errors estimated by risk models. Why do risk models tend to underestimate active risk? The answer to this question has important implications for active management, in the areas of risk management, information ratio estimation, and manager selection. We present an answer to this puzzle. We show that there is an additional source of active risk that is unique to each strategy. It is unique because its contribution to active risk depends on the variability of the strategy’s information coefficient through time. We name this risk the strategy risk. Consequently, the true active risk must consist of both the strategy risk and the risk-model tracking error, and the active risk is often different from, and in many cases significantly higher than, the risk-model tracking error. Based on this result, we further show that a consistent estimate of the information ratio is the ratio of the average information coefficient to the standard deviation of the information coefficient. We provide corroborating empirical evidence in support of our analysis and demonstrate the practicality of our findings. Specifically, we show how the understanding of strategy risk leads to more accurate ex ante forecasts of active risk and information ratio.
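The proposed information ratio estimator, the ratio of the average information coefficient to its standard deviation through time, is straightforward to compute. A minimal sketch, assuming a hypothetical series of realized period-by-period ICs:

```python
import numpy as np

def information_ratio_from_ic(ic_series):
    # Consistent IR estimate described above: mean of the information
    # coefficient divided by its standard deviation through time.
    ic = np.asarray(ic_series, dtype=float)
    return float(ic.mean() / ic.std(ddof=1))
```

A strategy whose IC is positive on average but highly variable through time thus earns a lower information ratio than a strategy with the same average IC and a steadier signal, which is the strategy-risk effect in miniature.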
Chapter Summary - Money markets (Kidwell, Peterson and Blackwell, 1997) are generally described as short-term markets for liquidity in which the lenders that provide the liquidity demand debt securities with low default risk and high marketability. Recent evidence shows that both repo rates (Griffiths and Winters, 1997) and commercial paper rates (Musto, 1997) increase dramatically prior to the year-end and that the identified changes are consistent with a preferred habitat for liquidity at the year-end. Musto (1997) suggests that the price of risk in commercial paper may increase at the year-end. Using daily rates on 7-day, 15-day, and 30-day nonfinancial commercial paper from two different risk classes (AA and A2/P2), we find, across all terms and for both risk classes, that rates increase when a security begins to mature in the new year and that rates decline across the year-end, with the decline beginning a few days before the end of the year. These changes are consistent with the hypothesis of a year-end preferred habitat for liquidity. In addition, we find that the spread between the two risk classes, across all terms, increases at the same time, indicating that the price of risk also increases at the year-end. In other words, when the lenders in the commercial paper market need their cash at the year-end, they increase the rate charged for commercial paper across all borrowers, but they increase the rate more for higher-risk borrowers.
Chapter Summary - This paper reports an experiment that tests two proposals for handling the fact that historical means, variances, and covariances are themselves noisy. One method is that of Michaud (1998). The other is an implementation of the diffuse Bayes approach widely discussed in texts and tracts on Bayesian inference. The experiment contains a simulated referee and two simulated players, namely a Michaud player and a Bayes player. The referee selects a “true” probability distribution of returns on eight asset classes. Given this probability distribution, the referee generates 217 monthly observations for the eight asset classes. These observations are handed to each player, who then proceeds in its prescribed manner. The object of each player is to pick a portfolio that maximizes a specified function of portfolio mean and variance. This process is repeated for three different objective functions, for 100 historical samples drawn from a given truth, and for 10 truths. One of the investor objectives is long-run growth; the other two are alternative “utility functions.” The two players, and therefore their methodologies, are evaluated in terms of their ability to provide portfolios that give the greatest value to the objective function, and their ability to estimate how well they have done. The results of the experiment have implications for the relative merits of the two methodologies, and for probable weaknesses in other methods of estimating the inputs to an MPT portfolio analysis.
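The referee/player design can be sketched as a simulation harness. The version below is a deliberately simplified, hypothetical single-truth setup with a naive plug-in player; the actual experiment's Michaud and Bayes players, multiple truths, and multiple objectives are not reproduced:

```python
import numpy as np

def run_experiment(n_assets=8, n_obs=217, n_samples=100,
                   risk_aversion=3.0, seed=0):
    # Referee fixes a hypothetical "truth" (mean vector and covariance).
    rng = np.random.default_rng(seed)
    true_mu = rng.normal(0.005, 0.002, n_assets)
    a = rng.normal(0.0, 0.02, (n_assets, n_assets))
    true_cov = a @ a.T + 1e-4 * np.eye(n_assets)   # positive definite

    def score(w):
        # Referee scores a portfolio against the TRUE moments:
        # objective = w'mu - (risk_aversion/2) w'Σw
        return w @ true_mu - 0.5 * risk_aversion * (w @ true_cov @ w)

    # Best achievable score, from the true-moment optimal portfolio.
    best = score(np.linalg.solve(true_cov, true_mu) / risk_aversion)

    scores = []
    for _ in range(n_samples):
        # Referee hands the player a simulated history; this naive player
        # simply plugs the sample estimates into the same formula.
        hist = rng.multivariate_normal(true_mu, true_cov, n_obs)
        mu_hat = hist.mean(axis=0)
        cov_hat = np.cov(hist, rowvar=False) + 1e-8 * np.eye(n_assets)
        scores.append(score(np.linalg.solve(cov_hat, mu_hat) / risk_aversion))
    return float(np.mean(scores)), float(best)
```

The gap between the plug-in player's average score and the true-moment optimum is exactly the estimation-error cost that both the Michaud and diffuse-Bayes procedures are designed to shrink.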
Chapter Summary - Fund managers now commonly try to beat specific benchmarks (e.g., the S&P 500), and the widespread dissemination of return statistics on both index and actively managed funds makes it plausible that some individual investors may also be trying to do so. Academics now commonly evaluate fund performance by the size of the “alpha” from a multifactor generalization of the familiar Capital Asset Pricing Model, i.e., the size of the intercept in a linear regression of the fund’s returns on the returns of a broad-based market index and other “factor” portfolios (e.g., those proposed in the influential work of Eugene Fama and Kenneth French). This paper argues, theoretically and empirically, that these two seemingly disparate facts may be closely connected. Specifically, the attempt of fund managers and/or individual investors to beat benchmark portfolios may cause those benchmarks (or proxies for them) to appear in the multifactor performance evaluation models advocated by academics. This casts additional doubt on the currently problematic academic presumption that the non-market factors proxy for predictors of fundamental risks that can affect future investment opportunities. Instead, the non-market factors in the Fama and French equity fund performance evaluation model may proxy for growth-oriented index portfolios, which some try to beat, and value-oriented index portfolios, which others try to beat.
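The multifactor alpha described here is simply the intercept of an OLS regression of fund returns on factor returns. A minimal sketch with hypothetical, noiseless data (Fama-French-style factors assumed):

```python
import numpy as np

def multifactor_alpha(fund_excess, factor_returns):
    # Alpha = intercept of an OLS regression of the fund's excess returns
    # on factor returns (e.g., market, SMB, HML in a Fama-French setup).
    x = np.column_stack([np.ones(len(fund_excess)), factor_returns])
    coefs, *_ = np.linalg.lstsq(x, fund_excess, rcond=None)
    return coefs[0], coefs[1:]   # (alpha, factor loadings)
```

The paper's point is about interpretation rather than mechanics: if the non-market columns of `factor_returns` behave like benchmark portfolios that managers are trying to beat, a positive alpha may reflect benchmark-beating behavior rather than compensation for fundamental risk.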