Welcome to Jim Chen's SSRN abstracts. This is a human-friendly display of the RSS feed for Mr. Chen's official SSRN page (http://ssrn.com/author=68651), reorganized in reverse chronological order rather than by number of downloads.
REVISION: Sinking, Fast and Slow: Bifurcating Beta in Financial and Behavioral Space
(August 2, 2015)
Modern portfolio theory accords symmetrical treatment to all deviations from expected return, positive or negative. This assumption is vulnerable on both descriptive and behavioral grounds. Many of the predictive flaws in contemporary finance stem from mathematically elegant but empirically flawed Gaussian models. In reality, returns are skewed. The presumption that returns and volatility are symmetrical also defies human behavior. Losing hurts worse than winning feels good; investors do not react equally to upside gain and downside loss. Moreover, correlation tightening during bear markets, not offset by changes in correlation during bull markets, suggests that standard diversification strategies may erode upside returns without providing adequate protection during times of stress.
This article outlines mathematical tools for calculating volatility, variance, covariance, correlation, and beta, not merely across the entire spectrum of returns, but also on either side of mean ...
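The bifurcation the abstract describes can be sketched in a few lines. The sketch below is an illustration, not the article's own code; the function names and the zero threshold are assumptions. It splits paired asset and benchmark returns at a threshold and computes a conventional sample beta on each side:

```python
def beta(asset, bench):
    """Sample beta: cov(asset, bench) / var(bench)."""
    n = len(asset)
    ma, mb = sum(asset) / n, sum(bench) / n
    cov = sum((a - ma) * (b - mb) for a, b in zip(asset, bench)) / (n - 1)
    var = sum((b - mb) ** 2 for b in bench) / (n - 1)
    return cov / var

def bifurcated_beta(asset, bench, threshold=0.0):
    """Downside and upside beta, splitting observations on which side
    of the threshold (the mean or a target return) the benchmark falls."""
    down = [(a, b) for a, b in zip(asset, bench) if b < threshold]
    up = [(a, b) for a, b in zip(asset, bench) if b >= threshold]
    return beta(*zip(*down)), beta(*zip(*up))
```

When correlations tighten in downturns, the downside beta exceeds the upside beta; that spread is the asymmetry the article measures.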
REVISION: The Promise and the Peril of Parametric Value-at-Risk (VaR) Analysis
(July 8, 2015)
Leptokurtosis, or the risk lurking in “fat tails,” poses the deepest epistemic threat to economic forecasting. Parametric value-at-risk (VaR) models are extremely vulnerable to kurtosis in excess of the levels associated with a normal, Gaussian distribution. This article provides step-by-step guidance on the use of Student’s t-distribution to enhance the statistical robustness of VaR forecasts. For degrees of freedom greater than 4, Student’s t-distribution can emulate any level of kurtosis exceeding that of a Gaussian distribution. Because VaR is elicitable from historical data, observed levels of excess kurtosis can inform the proper use of Student’s t-distribution to measure value-at-risk. In addition, the calculation of parametric VaR according to the number of degrees of freedom implied by historical levels of excess kurtosis leads directly to the corresponding value of expected shortfall. Conducted in this fashion, parametric VaR not only exploits the elicitability of that ...
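The mapping from observed excess kurtosis to degrees of freedom can be made concrete. In this minimal sketch (illustrative only; the helper names are assumptions), a Student's t-distribution with ν > 4 has excess kurtosis 6/(ν − 4), so a sample excess kurtosis κ implies ν = 4 + 6/κ, and the t variate must be rescaled by √((ν − 2)/ν) so its standard deviation matches the sample's:

```python
import math

def implied_dof(excess_kurtosis):
    """Degrees of freedom implied by sample excess kurtosis:
    for Student's t with nu > 4, excess kurtosis = 6 / (nu - 4)."""
    if excess_kurtosis <= 0:
        raise ValueError("requires kurtosis above the Gaussian level")
    return 4.0 + 6.0 / excess_kurtosis

def t_scale(sigma, nu):
    """Rescaling factor: Var(t_nu) = nu / (nu - 2), so multiplying a
    t quantile by sigma * sqrt((nu - 2) / nu) matches the sample sd."""
    return sigma * math.sqrt((nu - 2.0) / nu)
```

With ν in hand, a library quantile function such as scipy.stats.t.ppf supplies the cutoff, and parametric VaR at confidence level α follows from the rescaled quantile plus the sample mean.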
New: Legal Signal Processing
(June 4, 2015)
It makes far more economic sense to prepare for disaster in advance than it does to stage heroic relief efforts after calamity strikes. For reasons rooted in politics and emotion, the law does exactly the opposite. Ad hoc relief, as expensive as it is spontaneous, dominates disaster law and policy.
The President’s unilateral power to declare a federal disaster under the Stafford Act invites political manipulation. To test whether presidential disaster declarations track the four-year presidential electoral cycle, this paper devises a generalized polynomial and multi-sinusoidal model for detecting cyclical patterns. This model draws heavily upon Fourier analysis and digital signal processing.
Presidential disaster declarations since 1953 reveal not one but two forms of periodicity. As expected, a “short wave” of four years shows how disaster declarations track the presidential election cycle. The effect is most pronounced not in election years (when declarations do spike), but in ...
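One building block of such a model can be sketched directly: projecting a demeaned annual count series onto a sine/cosine pair at a candidate period, which is a single-frequency Fourier coefficient. This is an illustration, not the paper's full polynomial and multi-sinusoidal model; the function name and the choice to demean rather than fit a polynomial trend are assumptions:

```python
import math

def cycle_amplitude(series, period):
    """Amplitude of the sinusoidal component at `period`, obtained by
    projecting the demeaned series onto cos and sin (one DFT bin)."""
    n = len(series)
    mean = sum(series) / n
    a = 2 / n * sum((x - mean) * math.cos(2 * math.pi * t / period)
                    for t, x in enumerate(series))
    b = 2 / n * sum((x - mean) * math.sin(2 * math.pi * t / period)
                    for t, x in enumerate(series))
    return math.hypot(a, b)
```

A pronounced presidential cycle in declaration counts would show up as a large amplitude at period = 4 relative to neighboring periods.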
New: Gini's Crossbow
(May 21, 2015)
The Gini coefficient remains a popular gauge of inequality throughout the social and natural sciences because it is visually striking and geometrically intuitive. It measures the “gap” between a hypothetically equal distribution of income or wealth and the actual distribution. But not all inequality curves yielding the same Gini coefficient are unequal in the same way. The Lorenz asymmetry coefficient, a second-order measure of asymmetry, provides further information about the distribution of income or wealth. To add even more interpretive power, this paper proposes a new angular measure derived from the Lorenz asymmetry coefficient. Adjusted azimuthal asymmetry is the angular distance of the Lorenz asymmetry coefficient from the axis of symmetry, divided by the maximum angular distance that can be attained for any given Gini coefficient.
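Both the first- and second-order measures are easy to compute from raw data. A minimal sketch (illustrative only; the simplified Lorenz asymmetry formula below ignores interpolation for observations exactly equal to the mean):

```python
def gini(values):
    """Gini coefficient via the sorted-rank formula."""
    xs = sorted(values)
    n, total = len(xs), sum(xs)
    ranked = sum((i + 1) * x for i, x in enumerate(xs))
    return 2 * ranked / (n * total) - (n + 1) / n

def lorenz_asymmetry(values):
    """S = F(mu) + L(mu): the population share earning below the mean
    plus the income share that group holds. S = 1 when the Lorenz
    curve is symmetric; departures from 1 indicate which tail of the
    distribution drives the measured inequality."""
    xs = sorted(values)
    mu = sum(xs) / len(xs)
    below = [x for x in xs if x < mu]
    return len(below) / len(xs) + sum(below) / sum(xs)
```

Two distributions can share a Gini coefficient yet differ in S, which is exactly the extra information the second-order measure supplies.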
REVISION: Leaps, Metes, and Bounds: Innovation Law and Its Logistics
(March 6, 2015)
Economic analysis of technological innovation, diffusion, and decline often proceeds according to sigmoid (S-shaped) models, either directly or as a component in more elaborate mathematical representations of the creative process. Three distinct aspects of American innovation policy — Aereo’s failed attempt to retransmit television broadcasts, agricultural biotechnology, and network neutrality — invite analysis according to one variant or another of the logistic function. Innovation and legal policies designed to foster it follow the leaps, metes, and bounds of sigmoid functions.
Part I introduces the logistic function as the simplest analytical expression of a sigmoid function. Its parameters provide very clear interpretations grounded in physical principles. Part II evaluates the Aereo controversy and agricultural biotechnology as instances of logistic substitution between competing products. The deployment of plant-incorporated pesticides and herbicide-resistant crops ...
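The three-parameter logistic that Part I introduces can be stated in one line. In the sketch below (the parameter names are conventional, not the article's), K is the saturation ceiling, r the growth rate, and t0 the inflection point where the curve reaches K/2:

```python
import math

def logistic(t, K=1.0, r=1.0, t0=0.0):
    """Logistic function K / (1 + exp(-r * (t - t0))): sigmoid in t,
    inflecting at t0, where its value is exactly K / 2."""
    return K / (1.0 + math.exp(-r * (t - t0)))
```

In a diffusion reading, K is the addressable market, r the adoption speed, and t0 the takeoff midpoint.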
REVISION: Creamskimming and Competition
(February 21, 2015)
The concept of “creamskimming” arises with regularity in the law of regulated industries. As a rhetorical weapon, the term “creamskimming” readily conjures images of the sort of putatively destructive competition that regulatory commissions are charged with patrolling. As a result, allegations of creamskimming have become a standard weapon in the legal arsenal of incumbent firms seeking to resist competitive entry. At an extreme, incumbent firms will characterize all forms of competitive entry as creamskimming. Sound regulatory responses to these allegations therefore depend on a proper understanding of the creamskimming concept.
This article proposes a definition of creamskimming that will help state and federal regulatory agencies distinguish genuine objections to proposed competitive entry from reflexive (and often improper) efforts to shield incumbent firms from competition. “Creamskimming” should be defined as “the practice of targeting only the customers that are the ...
REVISION: Modeling Citation and Download Data in Legal Scholarship
(February 2, 2015)
Impact factors provide a measure of influence among law reviews and the schools that publish them. Downloads from the Social Science Research Network (SSRN) serve a similar function. Bibliometrics is rapidly emerging as a preferred alternative to more subjective assessments of academic prestige and influence. Law should embrace this trend.
This paper evaluates the underlying mathematics of law review impact factors and per-author SSRN download rates by institution. Both of these measures follow the sort of stretched exponential distribution that characterizes many right-skewed distributions found in the natural and social sciences. Indeed, an ordinary exponential distribution — that is, a stretched exponential distribution with an exponent of 1 — generates strikingly accurate, even beautiful, models of both phenomena. Mindful of physicist Hermann Weyl's admonition that any choice between truth and beauty should favor beauty, I freely admit to sacrificing some ...
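The stretched exponential family the paper fits has survival function exp(−(x/λ)^β), with β = 1 recovering the ordinary exponential. A minimal sketch (illustrative; the function names are assumptions), including the standard linearization log(−log S) = β log x − β log λ used to estimate the exponent:

```python
import math

def survival(x, lam, beta):
    """Stretched-exponential survival P(X > x) = exp(-(x/lam)**beta)."""
    return math.exp(-((x / lam) ** beta))

def fit_beta(xs, surv):
    """Least-squares slope of log(-log S) against log x, which equals
    the stretching exponent beta under the model."""
    u = [math.log(x) for x in xs]
    v = [math.log(-math.log(s)) for s in surv]
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    return (sum((a - mu) * (b - mv) for a, b in zip(u, v))
            / sum((a - mu) ** 2 for a in u))
```

A fitted β near 1 is what licenses the paper's use of the ordinary exponential as a model of impact factors and download rates.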
New: The Algebra of Financial Asymmetry: A Schematic Approach to Semideviation and Semivariance
(January 19, 2015)
Modern portfolio theory remains the dominant paradigm of financial risk management. Behavioral economics, however, targets one of modern portfolio theory’s greatest pitfalls: its symmetrical view of all deviations from expected return, positive or negative, as if investors found excess returns as troubling as failures to meet a targeted level of returns. This article evaluates a range of measures designed to gauge financial risk through semideviation or semivariance: the Sortino ratio, Morningstar's upside and downside capture ratios, and the omega and kappa measures.
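The simplest of these measures can be sketched directly. Below is an illustrative downside-deviation and Sortino calculation (conventions vary; this version averages squared shortfalls over all observations and defaults to a zero target, both assumptions):

```python
import math

def downside_deviation(returns, target=0.0):
    """Root-mean-square shortfall below the target, averaged over all
    observations (the usual Sortino-ratio convention)."""
    n = len(returns)
    return math.sqrt(sum(min(0.0, r - target) ** 2 for r in returns) / n)

def sortino(returns, target=0.0):
    """Mean excess return over the target per unit of downside deviation."""
    mean = sum(returns) / len(returns)
    return (mean - target) / downside_deviation(returns, target)
```

Unlike the Sharpe ratio, upside surprises do not inflate the denominator, so a fund with volatile gains but contained losses scores higher.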
REVISION: Weighted-Average Methodologies for Evaluating Bar Examination Passage Rates
(December 8, 2014)
There are few truly “national” law schools in the United States. Most American law schools have a “dominant” state bar: at nearly every school, more graduates take the bar examination administered by one state than any other state’s examination. The American Bar Association and U.S. News and World Report's law school rankings rely on the bar passage rate for the single largest cohort within a school’s graduating class. But this modal passage rate is a misleading measure of a school’s overall bar passage performance, and it fails to facilitate direct comparisons of bar examination performance across schools.
To evaluate the overall bar examination performance of the graduates of any law school, I propose the use of weighted-average methodologies. Ideally, we should be able to measure, by use of weighted averages, each school’s bar passage z-score. Since the data needed to conduct proper standard scoring is ...
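The basic weighted-average construction is straightforward. An illustrative sketch (the data shapes and function names are assumptions): weight each jurisdiction's result by the number of the school's graduates who sat there, both for the school's raw rate and for its margin over each state's overall rate:

```python
def weighted_rate(cohorts):
    """cohorts: (takers, school_rate) per jurisdiction.
    Takers-weighted overall passage rate for the school."""
    total = sum(t for t, _ in cohorts)
    return sum(t * r for t, r in cohorts) / total

def weighted_excess(cohorts):
    """cohorts: (takers, school_rate, state_rate). Takers-weighted
    average of the school's margin over each state's overall rate --
    a crude stand-in when the data for full z-scoring is unavailable."""
    total = sum(t for t, _, _ in cohorts)
    return sum(t * (s - st) for t, s, st in cohorts) / total
```

The excess measure lets two schools with dominant bars in easy and hard jurisdictions be compared on a common footing.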
REVISION: Price-Level Regulation and Its Reform
(December 6, 2014)
Price-level, or “price-cap,” regulation offers an alluring alternative to the traditional technique of monitoring a regulated firm’s profits. Part II of this article contrasts price-level regulation with conventional cost-of-service ratemaking and with Ramsey pricing. Price-level regulation stands as a market-based, incentive-driven “third way” between traditional regulation and complete deregulation. Part III provides formal specifications of price-level regulation. Although some jurisdictions have set price caps according to operating cost and rate-of-return calculations that closely parallel the corresponding steps in conventional ratemaking, this article will focus on price-level methodologies that combine an economy-wide measure of inflation with an x-factor reflecting total factor productivity within a regulated industry.
Part IV addresses the simpler component of price-level regulation, the choice of an inflation index. Part V devotes detailed attention to the treatment of the ...
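The inflation-minus-X mechanics reduce to a one-line recursion: each year's price ceiling is the prior ceiling adjusted by measured inflation less the productivity offset. A minimal sketch (illustrative; holding I and X constant is a simplifying assumption, since both are periodically reset in practice):

```python
def price_cap_path(p0, inflation, x_factor, years):
    """Price ceiling path under P_t = P_{t-1} * (1 + I - X)."""
    caps = [p0]
    for _ in range(years):
        caps.append(caps[-1] * (1 + inflation - x_factor))
    return caps
```

With 3% inflation and a 1% x-factor, the cap rises 2% a year; an x-factor above inflation forces nominal price declines, which is where the incentive mechanism bites.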
REVISION: Correlation, Coverage, and Catastrophe: The Contours of Financial Preparedness for Disaster
(December 4, 2014)
Laws regulating financial preparedness for catastrophe reveal the actuarial suppositions underlying disaster law and policy. This article examines three facets of catastrophic risk transfer. First, it shows how risk transfer emerges as the preeminent tool for managing catastrophic risk: measures sufficient for ordinary risks break down as the probability of loss plummets while the magnitude of potential loss increases. Second, it examines one alternative risk transfer mechanism by which insurance companies have sought to deepen their financial reserves in anticipation of correlated risks. Correlation among risks, the primary obstacle to functional insurance markets for catastrophic coverage, emerges in new form as the motivation for catastrophe bonds — and as these instruments’ leading pitfall. Finally, this article explores constraints on public intervention into disaster insurance. Along the dimensions of space, time, and human behavior, policies compensating individuals for ...
REVISION: Indexing Inflation: The Impact of Methodology on Econometrics and Macroeconomic Policy
(October 17, 2014)
Because there is no perfect gauge of inflation, the macroeconomic enterprise of indexing inflation ultimately dissolves into a choice among imperfect methodologies. But that choice still matters. This article will highlight the practical significance of methodological choices made in the course of indexing inflation. It will focus on two different indexes of inflation in the United States: the Consumer Price Index (CPI) and the implicit price deflator of the gross domestic product (IPD). This article identifies a long-term gap in these competing indexes’ measurement of inflation. To explain why the CPI appears to overstate inflation, relative to the IPD, by roughly two-thirds of a percentage point each year, this article more fully describes the distinct methodologies underlying the CPI and the IPD. Lawmakers should adopt the implicit price deflator of the GDP, or some other inflation index that shares its best methodological features, as the best practicable measure of real ...
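The stakes of a two-thirds-point annual gap compound quickly. A sketch of the arithmetic (illustrative; the 0.67-point figure is the article's approximation of the CPI–IPD divergence):

```python
def cumulative_divergence(annual_gap_pp, years):
    """Compounded divergence between two indexes whose measured annual
    inflation differs by annual_gap_pp percentage points each year."""
    return (1 + annual_gap_pp / 100) ** years - 1
```

Over a 30-year horizon, roughly 0.67 points a year compounds to a divergence of more than 20 percent, which is why indexed benefits, tax brackets, and contracts are so sensitive to the choice of index.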
New: Measuring Gaps Between Hypothetical Investment Returns and Actual Investor Returns
(September 23, 2014)
Actual investor returns from mutual funds lag behind hypothetical returns based on a fixed initial investment and reinvestment of all distributions. This gap arises from behaviorally driven errors in timing. The nonproprietary literature has emphasized the relationship of this gap to overall returns on stocks and mutual funds. This article seeks to address more directly the relationship of behaviorally driven gaps in investment returns to downside risk, upside gain, and overall volatility. Documenting the existence of this gap across the universe of publicly traded securities — not only in the aggregate, but also on a security-by-security basis — may provide a legal basis for requiring mutual fund and exchange-traded fund managers to compute and disclose that gap.
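The gap the abstract describes is the spread between a fund's time-weighted (buy-and-hold) return and its investors' money-weighted (internal-rate-of-return) return. A minimal sketch (illustrative; the names and the bisection solver are assumptions, and contributions are assumed non-negative so the excess function is monotone):

```python
def time_weighted(period_returns):
    """Geometric (buy-and-hold) return over the whole horizon."""
    growth = 1.0
    for r in period_returns:
        growth *= 1 + r
    return growth - 1

def money_weighted(contributions, final_value, tol=1e-10):
    """Per-period IRR via bisection: contributions[t] is the cash the
    investor adds at the start of period t; solves for the rate at
    which the future value of all contributions equals final_value."""
    n = len(contributions)

    def excess(rate):
        fv = sum(c * (1 + rate) ** (n - t)
                 for t, c in enumerate(contributions))
        return fv - final_value

    lo, hi = -0.99, 10.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if excess(mid) > 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2
```

An investor who adds money after a good period and holds through a bad one earns a money-weighted return below the fund's time-weighted return; that spread, computed security by security, is the disclosure the article contemplates.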
REVISION: Bioprospect Theory
(September 3, 2014)
Conventional wisdom treats biodiversity and biotechnology as rivalrous values. The global south is home to most of earth's vanishing species, while the global north holds the capital and technology needed to develop this natural wealth. The south argues that intellectual property laws enable the industrialized north to commit biopiracy. By contrast, the United States has characterized calls for profit-sharing as a threat to the global life sciences industry. Both sides magnify the dispute, on the apparent consensus that commercial exploitation of genetic resources holds the key to biodiversity conservation.
Both sides of this debate misunderstand the relationship between biodiversity and biotechnology. Both sides have overstated the significance of bioprospecting. It is misleading to frame the issue as whether intellectual property can coexist with the international legal framework for preserving biodiversity. Any lawyer can reconfigure intellectual property to embrace all ...
REVISION: Indexing Inflation: Why Methodology Matters in Econometrics and Macroeconomic Policymaking
(August 18, 2014)
Because there is no perfect gauge of inflation, the macroeconomic enterprise of indexing inflation ultimately dissolves into a choice among imperfect methodologies. But that choice still matters. This article will highlight the practical significance of methodological choices made in the course of indexing inflation. It will focus on two different indexes of inflation in the United States: the Consumer Price Index (CPI) and the implicit price deflator of the gross domestic product (IPD). This article identifies a long-term gap in these competing indexes’ measurement of inflation. To explain why the CPI appears to overstate inflation, relative to the IPD, by roughly two-thirds of a percentage point each year, this article more fully describes the distinct methodologies underlying the CPI and the IPD. Lawmakers should adopt the implicit price deflator of the GDP, or some other inflation index that shares its best methodological features, as the best practicable measure of real ...