Monograph feed
https://feeds.library.caltech.edu/groups/Social-Science-Working-Papers/monograph.rss
A Caltech Library Repository Feed
http://www.rssboard.org/rss-specification
Generator: python-feedgen
Language: en
Last build: Wed, 07 Aug 2024 18:12:11 -0700

Market Power and Transferable Property Rights
https://resolver.caltech.edu/CaltechAUTHORS:20171004-153511320
DOI: 10.7907/mspfh-0sr83
The appeal of using markets as a means of allocating scarce resources stems in large part from the assumption that a market will approximate the competitive ideal. When competition is not a foregone conclusion, the question naturally arises as to how a firm might manipulate the market to its own advantage. This paper analyzes the issue of market power in the context of markets for transferable property rights. First, a model is developed which explains how a single firm with market power might exercise its influence. This is followed by an examination of the model in the context of a particular policy problem.

Dimensions of Parallelism: Some Policy Applications of Experimental Methods
https://authors.library.caltech.edu/records/7ch9g-6sv81
<p>[Introduction] The term "parallelism" refers to a vague notion about how observations of simple laboratory phenomena can help one understand and predict the behavior of a complicated and changing world. Of what use are experimental results to someone who is interested in something vastly larger and more complicated, perhaps fundamentally different than anything that can be studied in a laboratory setting? Questions such as this and the related notion of parallelism have probably existed from the earliest development of scientific experimental methodology, and although I found the term in a paper by Vernon Smith (1980), the notion itself pervades all branches of science and engineering.</p><p>The purpose of this chapter is to isolate some examples of how the issue of parallelism has been approached in economics. The chapter outlines several strategies that have been employed in attempts to use experimental research in actual policy decision making. The topic to be explored is how issues have been posed in these policy-related studies so that experimental methods could be applied. The discussion is limited to 10 instances in which I have been involved personally at some level.</p><p>Many different opinions exist about experimental methodology and the relationship between laboratory work, field studies, and policy decisions. The opinions are strongly held and are just as likely to be held by those with no experience at all in applying the methods as by those with much experience. For example, the textbook by Samuelson and Nordhaus (1983, p. 8) boldly claims that experiments in economics are impossible. Presumably these authors believe that some sort of field study is the only way to approach an application of experimental methods. Referee reports frequently reflect methodological philosophies and related concepts of parallelism.
Every experimentalist who has submitted a paper to a professional journal has read a referee report aggressively claiming that the experiments had nothing to do with the "real world" or that the experiments were not "relevant" for some reason or another. My impression is that such critics have very narrow views about the connections between laboratory and naturally occurring situations, and they approach experimental methods with unrealistic expectations about what can be learned from applications.</p><p>This impression brings me to my point: Economists should keep an open mind about experimental methodology and should judge work by the statements of results rather than by methodological principles. Methodological principles should evolve from our experiences with what works and what does not work. That point is reflected in the title and organization of this chapter. The topic is policy research as opposed to basic research. The issues are: What was attempted, what seemed to work and why, what was a flop and why?</p><p>The examples are organized according to what seems to have been the principal strategy for using the experiments. Each strategy can be viewed as a "dimension" or form of parallelism between policy problems and laboratory experiments. Five different strategies are identifiable. Each section treats a different strategy. The discussion includes a general description of the strategy, the context of the policy problem, and the role of the experiments in the final policy decision if any decision resulted.</p>

An Experimental Study of the Centipede Game
https://resolver.caltech.edu/CaltechAUTHORS:20160405-154023040
We report on a series of experiments in which individuals play a version of the centipede game. In this game, two players alternately get a chance to take the larger portion of a continually escalating pile of money. As soon as one person takes, the game ends with that player getting the larger portion of the pile, and the other player getting the smaller portion. If one views the experiment as a complete information game, all standard game theoretic equilibrium concepts predict the first mover should take the large pile on the first round. The experimental results show that this does not occur.
An alternative explanation for the data can be given if we reconsider the game as a game of incomplete information in which there is some uncertainty over the payoff functions of the players. In particular, if the subjects believe there is some small likelihood that the opponent is an altruist, then in the equilibrium of this incomplete information game, players adopt mixed strategies in the early rounds of the experiment, with the probability of taking increasing as the pile gets larger. We investigate how well a version of this model explains the data observed in the centipede experiments.

Finding All Equilibria
https://resolver.caltech.edu/CaltechAUTHORS:20170726-162258184
DOI: 10.7907/acfse-rbn61
I present a simple and fast algorithm that finds all the pure-strategy Nash equilibria in games with strategic complementarities. This is the first non-trivial algorithm for finding all pure-strategy Nash equilibria.

A bargaining model of legislative policy-making
https://resolver.caltech.edu/CaltechAUTHORS:20160526-102600263
DOI: 10.7907/2e007-xnw21
We present a general model of legislative bargaining in which the status quo is an
arbitrary point in a multidimensional policy space. In contrast to other bargaining models,
the status quo is not assumed to be "bad," and delay may be Pareto efficient. We
prove existence of stationary equilibria. The possibility of equilibrium delay depends on
four factors: risk aversion of the legislators, the dimensionality of the policy space, the
voting rule, and the possibility of transfers across districts. If legislators are risk averse,
if there is more than one policy dimension, and if voting is by majority rule, for example,
then delay will almost never occur. In one dimension, delay is possible if and only if the
status quo lies in the core of the voting rule, and then it is the only possible outcome.
This "core selection" result yields a game-theoretic foundation for the well-known median
voter theorem. Our comparative statics analysis yields two noteworthy insights: (i) if the
status quo is close to the core, then equilibrium policy outcomes will also be close to the
core (a moderate status quo produces moderate policy outcomes), and (ii) if legislators
are patient, then equilibrium proposals will be close to the core (legislative patience leads
to policy moderation).

The Economics of Social Networks
https://resolver.caltech.edu/CaltechAUTHORS:20110714-120443155
DOI: 10.7907/7r5g6-w1565
The science of social networks is a central field of sociological study, a major application of random graph theory, and an emerging area of study by economists, statistical
physicists and computer scientists. While these literatures are (slowly) becoming aware
of each other, and on occasion drawing from one another, they are still largely distinct in
their methods, interests, and goals. Here, my aim is to provide some perspective on the
research from these literatures, with a focus on the formal modeling of social networks
and the two major types of models: those based on random graphs and those based on
game theoretic reasoning. I highlight some of the strengths, weaknesses, and potential
synergies between these two network modeling approaches.

Diffusion on Social Networks
https://resolver.caltech.edu/CaltechAUTHORS:20170801-110359771
DOI: 10.7907/0tk6s-m5v26
We analyze a model of diffusion on social networks. Agents are connected according to an undirected graph (the network) and choose one of two actions (e.g., either to adopt a new behavior or technology or not to adopt it). The return to each of the actions depends on how many neighbors an agent has, which actions the agent's neighbors choose, and some agent-specific cost and benefit parameters. At the outset, a small portion of the population is randomly selected to adopt the behavior. We analyze whether the behavior spreads to a larger portion of the population. We show that there is a threshold where "tipping" occurs: if a large enough initial group is selected then the behavior grows and spreads to a significant portion of the population, while otherwise the behavior collapses so that no one in the population chooses to adopt the behavior. We characterize the tipping threshold and the eventual portion that adopts if the threshold is surpassed. We also show how the threshold and adoption rate depend on the network structure. Applications of the techniques introduced in this paper include marketing, epidemiology, technological transfers, and information transmission, among others.

A Test for Monotone Comparative Statics
https://resolver.caltech.edu/CaltechAUTHORS:20101008-110658112
DOI: 10.7907/q3yjn-rzc82
In this paper we design an econometric test for monotone comparative statics (MCS) often found in models with multiple equilibria. Our test exploits the observable implications of the MCS prediction: that the extreme (high and low) conditional quantiles of the dependent variable increase monotonically with the explanatory variable. The main contribution of the paper is to derive a likelihood-ratio test, which, to the best of our knowledge, is the first econometric test of MCS proposed in the literature. The test is an asymptotic "chi-bar squared" test for order restrictions on intermediate conditional quantiles. The key features of our approach are: (1) it does not require estimating the underlying nonparametric model relating the dependent and explanatory variables to the latent disturbances; (2) it makes few assumptions on the cardinality, location or probabilities over equilibria. In particular, one can implement our test without assuming an equilibrium selection rule.

You won't harm me if you fool me
https://resolver.caltech.edu/CaltechAUTHORS:20101008-110010257
DOI: 10.7907/fjybj-cn073
A decision maker faces a new theory of how certain events unfold over time. The theory matters for choices she needs to make, but possibly the theory is a fabrication. We show that there is a test which is guaranteed to pass a true theory, and which is also conservative: A false theory will only pass when adopting it over the decision maker's initial theory would not cause substantial harm; if the agent is fooled she will not be harmed.
We also study a society of conservative decision makers with different initial theories. We uncover pathological instances of our test: a society collectively rejects most theories, be they true or false. But we also find well-behaved instances of our test, collectively accepting true theories and rejecting false ones. Our test builds on tests studied in the literature in the context of non-strategic inspectors.

When does aggregation reduce uncertainty aversion?
https://resolver.caltech.edu/CaltechAUTHORS:20101008-104052139
DOI: 10.7907/ssk1n-sn790
We study the problem of uncertainty sharing within a household: "risk sharing" in a context of Knightian uncertainty. A household shares uncertain prospects using a social welfare function. We characterize the social welfare functions such that the household is collectively less averse to uncertainty than each member, and satisfies the Pareto principle and an independence axiom. We single out the sum of certainty equivalents as the unique member of this family which provides quasiconcave rankings over risk-free allocations.

Designing Experiments with Computational Testbeds: Effects of Convergence Speed in Coordination Games
https://resolver.caltech.edu/CaltechAUTHORS:20170727-134909825
DOI: 10.7907/gfcyn-5ad35
Using a computational testbed, we theoretically predict and experimentally show that in the minimum effort coordination game, as the cost of effort increases, 1) the game converges to lower effort levels, 2) the convergence speed increases, and 3) the average payoff is not monotonically decreasing. In fact, the average profit is a U-shaped curve as a function of cost. Therefore, contrary to intuition, one can obtain a higher average profit by increasing the cost of effort.

Ambiguity Aversion in Asset Market: Experimental Study of Home Bias
https://resolver.caltech.edu/CaltechAUTHORS:20170727-135623631
DOI: 10.7907/7t9dp-ckn73
The equity market home bias occurs when investors over-invest in their home country assets. The equity market home bias is a paradox because investors are not hedging their risk optimally. Even with unrealistic levels of risk aversion, the equity market home bias cannot be explained using the standard mean-variance model. We propose ambiguity aversion as the behavioral explanation. We design six experiments using real world assets and derivatives to show the relationship between ambiguity aversion and home bias. We test for ambiguity aversion by showing that the investor's subjective probability is sub-additive. The results from the experiments provide support for the assertion that ambiguity aversion is related to the equity market home bias paradox.

Implications of Pareto Efficiency for Two-Agent (Household) Choice
https://resolver.caltech.edu/CaltechAUTHORS:20101008-103311343
DOI: 10.7907/szdsc-6bp13
We study when two-member household choice behavior is compatible with Pareto optimality. We ask when an external observer of household choices, who does not know the individuals' preferences, can rationalize the choices as being Pareto-optimal. Our main contribution is to reduce the problem of rationalization to a graph-coloring problem. As a result, we obtain simple tests for Pareto optimal choice behavior. In addition to the tests, and using our graph-theoretic representation, we show that Pareto rationalization is equivalent to a system of quadratic equations being solvable.

Testable Implications of Gross Substitutes in Demand
https://resolver.caltech.edu/CaltechAUTHORS:20170726-171803006
DOI: 10.7907/c4yzh-awq44
We present a non-parametric "revealed-preference test" for gross substitutes in demand.

Information Gatekeepers: Theory and Experimental Evidence
https://resolver.caltech.edu/CaltechAUTHORS:20170726-170755087
DOI: 10.7907/n9sr9-wbh87
We consider a model where two adversaries can spend resources in acquiring public information about the unknown state of the world in order to influence the choice of a decision maker. We characterize the sampling strategies of the adversaries in the equilibrium of the game. We show that, as the cost of information acquisition for one adversary increases, that person collects less evidence whereas the other adversary collects more evidence. We then test the results in a controlled laboratory setting. The behavior of subjects is close to the theoretical predictions. Mistakes are relatively infrequent (15%). They occur in both directions, with more over-sampling (39%) than under-sampling (8%). The main difference with the theory is the smooth decline in sampling around the theoretical equilibrium. Comparative statics are also consistent with the theory, with adversaries sampling more when their own cost is low and when the other adversary's cost is high. Finally, there is little evidence of learning over the 40 matches of the experiment.

The Not-So-Popular Branch: Bicameralism as a Counter-Majoritarian Device
https://resolver.caltech.edu/CaltechAUTHORS:20170726-165417007
DOI: 10.7907/xf3pj-3zw13
We estimate a model of voting in Congress that allows for dispersed information about the quality of proposals in an equilibrium context. The results highlight the effects of bicameralism on policy outcomes. In equilibrium, the Senate imposes an endogenous supermajority rule on members of the House. We estimate this supermajority rule to be about four-fifths on average across policy areas. Moreover, our results indicate that the value of the information dispersed among legislators is significant, and that in equilibrium a large fraction of House members (40-50%) vote in accordance with their private information. Taken together, our results imply a highly conservative Senate, in the sense that proposals are enacted into law only when it is extremely likely that their quality is high.

Clearinghouses for Two-Sided Matching: An Experimental Study
https://resolver.caltech.edu/CaltechAUTHORS:20101008-102338754
DOI: 10.7907/jereh-qfx04
We study the performance of two-sided matching clearinghouses in the laboratory. Our experimental design mimics the Gale-Shapley (1962) mechanism, utilized to match hospitals and interns, schools and pupils, etc., with an array of preference profiles. Several insights come out of our analysis. First, only 48% of the observed match outcomes are fully stable. Furthermore, among those markets ending at a stable outcome, a large majority culminates in the best stable matching for the receiving side. Second, contrary to the theory, participants on the receiving side of the algorithm rarely truncate their true preferences. In fact, it is the proposers who do not make offers in order of their preference, frequently skipping potential partners. Third, market characteristics affect behavior and outcomes: both the cardinal representation and the span of the core influence whether outcomes are stable or close to stable, as well as the number of turns it takes markets to converge to the final outcome.

The Axiomatic Structure of Empirical Content
https://resolver.caltech.edu/CaltechAUTHORS:20101008-094039001
DOI: 10.7907/7yp6a-gf196
In this paper, we provide a formal framework for studying the empirical content of a given theory. We define the falsifiable closure of a theory to be the least weakening of the theory that makes only falsifiable claims. The falsifiable closure is our notion of empirical content. We prove that the empirical content of a theory can be exactly captured by a certain kind of axiomatization, one that uses axioms which are universal negations of conjunctions of atomic formulas. The falsifiable closure operator has the structure of a topological closure, which has implications, for example, for the behavior of joint vis-à-vis single hypotheses.
The ideas here are useful for understanding theories whose empirical content is well-understood (for example, we apply our framework to revealed preference theory, and Afriat's theorem), but they can also be applied to theories with no known axiomatization. We present an application to the theory of multiple selves, with a fixed finite set of selves and where selves are aggregated according to a neutral rule satisfying independence of irrelevant alternatives. We show that multiple selves theories are fully falsifiable, in the sense that they are equivalent to their empirical content.

Political Institutions and the Dynamics of Investment
https://resolver.caltech.edu/CaltechAUTHORS:20170726-162558894
DOI: 10.7907/0s53y-9sr42
We present a theoretical model of the provision of a durable public good over an infinite horizon. In each period, there is a societal endowment of which each of n districts owns a share. This endowment can either be invested in the public good or consumed. We characterize the planner's optimal solution and time path of investment and consumption. We then consider alternative political mechanisms for deciding on the time path, and analyze the Markov perfect equilibrium of these mechanisms. One class of these mechanisms involves a legislature where representatives of each district bargain with each other to decide how to divide the current period's societal endowment between investment in the public good and transfers to each district. The second class of mechanisms involves the districts making independent decisions for how to divide their own share of the endowment between consumption and investment. We conduct an experiment to assess the performance of these mechanisms, and compare the observed allocations to the Markov perfect equilibrium.

Contracts vs. Salaries in Matching
https://resolver.caltech.edu/CaltechAUTHORS:20101008-093110872
DOI: 10.7907/8swa2-84668
Firms and workers may sign complex contracts that govern many aspects of their interactions. I show that when firms regard contracts as substitutes, bargaining over contracts can be understood as bargaining only over wages. Substitutes is the assumption commonly used to guarantee the existence of stable matchings of workers and firms.

Revealed Preference Tests using Supermarket Data: the Money Pump
https://resolver.caltech.edu/CaltechAUTHORS:20100927-114449846
DOI: 10.7907/rq8c5-k8y17
We use a money pump argument to measure deviations from the revealed preference axioms. Using a panel data set of food expenditures, we find a large number of violations of the weak axiom of revealed preference. The money pump costs are small, which indicates that the violations of revealed preference are not severe. While most households' behavior deviates from rationality, by our measure they are close to being rational.

The Value of Information in the Court. Get it Right, Keep it Tight
https://resolver.caltech.edu/CaltechAUTHORS:20160331-160352489
We estimate an equilibrium model of decision-making in the US Supreme Court which
takes into account both private information and ideological differences between justices.
We present a measure of the value of information in the court: the probability that a
justice votes differently than she would have in the absence of case-specific
information. Our results suggest a sizable value of information: in roughly 44% of cases,
justices' initial leanings are changed by their personal assessments of the case. Our results
also confirm the increased politicization of the Supreme Court in the last quarter century.
We perform counterfactual simulations to draw implications for institutional design.

Extreme Walrasian Dynamics: The Gale Example in the Lab
https://resolver.caltech.edu/CaltechAUTHORS:20100928-153906850
DOI: 10.7907/7x1yf-9nx90
We study the classic Gale (1963) economy using laboratory markets. Tatonnement theory
predicts prices will diverge from an equitable interior equilibrium towards infinity or zero
depending only on initial prices. The inequitable equilibria determined by these dynamics
give all gains from exchange to one side of the market. Our results show surprisingly strong
support for these predictions. In most sessions one side of the market eventually outgains the
other by more than twenty times, leaving the disadvantaged side to trade for mere pennies.
We also find preliminary evidence that these dynamics are sticky, resisting exogenous
interventions designed to reverse their trajectories.

A Reverse Auction for Toxic Assets
https://resolver.caltech.edu/CaltechAUTHORS:20100928-152231463
DOI: 10.7907/vm7yk-h0q12
The proposed 2008 TARP auction was intended to facilitate rapid purchases by the U.S.
Department of the Treasury of a wide array of mortgage-backed securities in order to remove
these "toxic assets" from the portfolios of financially stressed banks. The Treasury had
selected a Reference Price design whereby bids to sell different securities would be
normalized or "scored" by reference prices that reflect estimates of relative values.
Although the auction was suspended just prior to the 2008 Presidential election, we
continued with a series of laboratory experiments aimed at evaluating the performance of
Reference Price auctions relative to several benchmark conditions. The experimental
results indicate that a simple Reference Price auction can be an effective mechanism for
avoiding serious effects of adverse selection and strategic bid manipulation, even when
reference prices are not set accurately. An econometric analysis of bidding patterns and
auction outcomes reveals how underlying behavior produces efficiency differences in this
common-value framework.

Competitive Equilibrium in Markets for Votes
https://resolver.caltech.edu/CaltechAUTHORS:20100924-163224980
DOI: 10.7907/7qftr-6q446
We develop a competitive equilibrium theory of a market for votes. Before voting on a
binary issue, individuals may buy and sell their votes with each other. We define ex ante
vote-trading equilibrium, identify weak sufficient conditions for existence, and construct one such equilibrium. We show that this equilibrium must always result in dictatorship and that the market generates welfare losses, relative to simple majority voting, if the committee is large enough. We test the theoretical implications by implementing a competitive vote market in the laboratory using a continuous open-book multi-unit double auction.

General Revealed Preference Theory
https://resolver.caltech.edu/CaltechAUTHORS:20100924-161538346
DOI: 10.7907/65jz8-7zk78
We provide general conditions under which an economic theory has a universal axiomatization: one that leads to testable implications. Roughly speaking, if we obtain a universal axiomatization when we assume that unobservable parameters (such as preferences) are observable, then we can obtain a universal axiomatization purely on observables. The result "explains" classical revealed preference theory, as applied to individual rational choice. We obtain new applications to Nash equilibrium theory and Pareto optimal choice.

The winner's curse: experiments with buyers and with sellers
https://resolver.caltech.edu/CaltechAUTHORS:20100924-152310913
DOI: 10.7907/zz70j-d2260
This paper explores the winner's curse phenomenon as it was studied experimentally by Kagel and Levin. Experiments with the winner's curse are complicated by the fact that subjects can lose money and the experimenter has only a limited means of collecting it from them. Thus subjects enjoy only limited liability, which has theoretical implications for behavior. In the Kagel and Levin experiments subjects were removed from the bidders' competition after losses reached a predetermined value. This experimental procedure has unknown implications for the results, so ambiguity exists about whether the winner's curse was actually observed. In this study their results were replicated in an environment in which subjects were not removed. The case in which competitors are sellers is also studied. Bankruptcy cannot be a problem in sellers' competition. In both cases the winner's curse is observed. Thus limited liability cannot be an explanation for the phenomenon reported by Kagel and Levin. In addition, the paper examines the bidding behavior of all individuals and shows that this behavior does not fit any of the tested theories at either the aggregate or individual level. The "winner's curse" did not disappear over time during the conduct of the research.

Beyond Gingles: influence districts and the pragmatic tradition in voting rights law
https://resolver.caltech.edu/CaltechAUTHORS:20100924-155441806
DOI: 10.7907/xjvb7-9m505
Should minority voters who are not numerous enough to form a majority of an electoral district have a legal right to protection against vote dilution? This question of "influence districts" is not new, but has not yet been definitively resolved by judicial decisions. This paper examines the logic, history, and law related to influence districts.
Any proposal for a legal stance on the question of influence districts should continue the dominant line of tradition of Congress and the courts, rather than contravene it. Therefore, after an introduction, Section II of this paper traces what I term the "practical" or "pragmatic" tradition in voting rights law from the passage of the Reconstruction Constitutional Amendments through the 1982 amendments to the Voting Rights Act and the nearly simultaneously-issued U.S. Supreme Court decision in Rogers v. Lodge. Beginning in 1870, Congress, and later the courts, rejected an abstract, formulaic, "bright-line" approach to voting rights law except during the period of massive discrimination and disfranchisement. Both Congress and the Supreme Court went beyond protecting the bare right of members of minority groups to vote. Instead, they realized that to cast an effective vote, African-Americans and others had to be sheltered from violence, intimidation, and fraud, and they had to be free to speak and organize. In the 1940s, courts insisted on nondiscrimination in primaries, and in the late 1960s, they helped guarantee the right to be free of recently established discriminatory electoral structures.
The courts and Congress refused to accept two proffered bright lines: one drawn, in effect, between voting per se and everything else, and the other guaranteeing proportional representation. Rather, they adopted the less precise, but more nuanced, "totality of the circumstances" test for proving both intent and effect.
In Section III of the paper, I discuss the three-pronged test outlined in Thornburg v. Gingles. Even though Gingles is sometimes interpreted to imply that courts need pay no attention to minority groups that cannot form effective majorities of electoral districts, I point out that Justice Brennan's opinion in Gingles specifically refuses to foreclose that question and argue that both the logic of the opinion and contemporary political experience contravene the alleged implication. More specifically, I suggest that it is wrong for courts to isolate the first prong of the Gingles test from the other two. Viewed as interconnected, the three parts of the test do not preclude a consideration of the question of influence districts. Indeed, election data from both hypothetical and actual examples demonstrate that there is no possible theoretical division between influence districts and control districts. I conclude that there is no bright line in Gingles.
Section IV of the paper takes a very brief look at some federal court opinions concerning influence districts, concentrating on the Garza, Armour, and Springfield Park District cases. Their diverse analyses, and the criticisms that can be made of them, suggest two different but more systematic approaches to the influence district problem - a "results" approach and an "intent" approach - which I flesh out in Section V. In both approaches, I concentrate on totality-of-the-circumstances standards, in line with the pragmatic tradition, Congress's intent in extending and amending the Voting Rights Act in 1982, and the Supreme Court's decisions in White v. Regester and Rogers v. Lodge. Finally, I attempt to respond generally to criticisms of protecting the interests of small minority groups. I conclude that both the value of bright-line standards and the dangers of relaxing them have been exaggerated.https://resolver.caltech.edu/CaltechAUTHORS:20100924-155441806A Revealed Preference Approach to Computational Complexity in Economics
https://resolver.caltech.edu/CaltechAUTHORS:20100928-150942227
DOI: 10.7907/cm0gg-20z31
One of the main building blocks of economics is the theory of the consumer, which postulates that consumers are utility maximizing. However, from a computational perspective, this model is called into question because the task of utility maximization subject to a budget constraint is computationally hard in the worst case under reasonable assumptions. In this paper, we study the empirical consequences of strengthening consumer choice theory to enforce that utilities are computationally easy to maximize. We prove the possibly surprising result that computational constraints have no empirical consequences whatsoever for consumer choice theory. That is, a data set is consistent with a utility maximizing consumer if and only if it is consistent
with a utility maximizing consumer having a utility function that can be maximized in strongly
polynomial time.
Our result motivates a general approach for posing questions about the empirical content of computational constraints: the revealed preference approach to computational complexity. The approach complements the conventional worst-case view of computational complexity in important ways, and is methodologically close to mainstream economics.https://resolver.caltech.edu/CaltechAUTHORS:20100928-150942227Strategic Voting in a Jury Trial with Plea Bargaining
https://resolver.caltech.edu/CaltechAUTHORS:20100928-154905812
DOI: 10.7907/rsgw3-8nh44
We study the criminal court process focusing on the interaction between plea bargaining
and jury trials. We model plea bargaining such that a prosecutor makes a take-it-or-leave-it offer and a defendant, who is either guilty or innocent, pleads either guilty
or not guilty. If the defendant pleads not guilty, the case goes to a jury trial, which
follows a strategic voting model. Plea bargaining produces a bias in which the defendant
is less likely to be guilty if the case goes to trial, which in turn alters the jurors' voting
behavior. Conversely, anticipated jury trial outcomes affect a prosecutor and a defendant
while they participate in a plea bargain. We find that the equilibrium behavior
in a court with plea bargaining and a jury trial resembles the equilibrium behavior in
the separate jury model, though jurors may act as if they echo the prosecutor's preference
against convicting the innocent and acquitting the guilty. We also compare two
voting paradigms, unanimity and non-unanimity. The unanimity rule is inferior to non-unanimity
because the ex-ante punishment delivered to the innocent or undelivered to
the guilty by the unanimity rule does not vanish as the size of the jury gets large.https://resolver.caltech.edu/CaltechAUTHORS:20100928-154905812Aggregate Matchings
https://resolver.caltech.edu/CaltechAUTHORS:20101008-101034676
DOI: 10.7907/rrg6w-3dj69
This paper characterizes the testable implications of stability for aggregate matchings. We consider data on matchings where individuals are aggregated, based on their observable characteristics, into types, and we know how many agents of each type match. We derive stability conditions for an aggregate matching, and, based on these, provide a simple necessary and sufficient condition for an observed aggregate matching to be rationalizable (i.e. such that preferences can be found so that the observed aggregate matching is stable). Subsequently, we derive moment inequalities based on the stability conditions, and provide an empirical illustration using the cross-sectional marriage distributions across the US states.https://resolver.caltech.edu/CaltechAUTHORS:20101008-101034676Aspirations and growth: a model where the income of others acts as a reference point
https://resolver.caltech.edu/CaltechAUTHORS:20170726-163941193
DOI: 10.7907/taq1c-c6a27
We study an OLG model in which the average income of the society acts as a reference point for the agents' utility on consumption. To model this we use the functional form developed in behavioral economics to study reference-dependence: prospect theory. We then assume that: 1) the utility function is convex in an interval before the reference point; 2) the utility function is not differentiable at the reference point, and it is steeper below than above the reference point. We argue that this reference-dependence causes the economy to admit multiple equilibria, and we show that in any of these equilibria in finite time the wealth distribution will become, and remain, either polarized or of perfect equality. We then study growth rates and show that, if we look at the equilibria with the highest growth, then the society that grows the most is the one that starts with perfect equality. If we look at the equilibria with the lowest growth for each economy, however, then the society with a small amount of initial inequality is the one that grows (strictly) the most, while a society with perfect equality is the one that grows the least. All of these growth rates are weakly higher than the growth rate of a corresponding economy without reference-dependence.https://resolver.caltech.edu/CaltechAUTHORS:20170726-163941193Speculative Overpricing in Asset Markets with Information Flows
https://resolver.caltech.edu/CaltechAUTHORS:20170726-172615506
DOI: 10.7907/370gk-wbz86
In this paper, we derive and experimentally test a theoretical model of speculation in multi-period asset markets with public information flows. The speculation arises from the traders' heterogeneous posteriors as they make different inferences from sequences of public information. This leads to overpricing in the sense that price exceeds the most optimistic belief about the real value of the asset. We find evidence of speculative overpricing in both incomplete and complete markets, where the information flow is a gradually revealed sequence of imperfect public signals about the state of the world. We also find evidence of asymmetric price reaction to good news and bad news, another feature of equilibrium price dynamics under our model. Markets with a relaxed short-sale constraint exhibit less overpricing.https://resolver.caltech.edu/CaltechAUTHORS:20170726-172615506Misconceptions and Game Form Recognition of the BDM Method: Challenges to Theories of Revealed Preference and Framing
https://resolver.caltech.edu/CaltechAUTHORS:20140403-155709665
DOI: 10.7907/x40kz-8b124
This study explores the tension between the standard economic theory of preference and non-standard theories of preference that are motivated by an underlying theory of framing. A simple experiment was performed to measure a known preference, the value of a card that can be exchanged for $2 cash. The measurement does not produce the known preference and instead reports a preference that has properties often cited in support of non-standard preference theories and framing. Close examination reveals that the divergence of the measured preference from the known preference reflects a mistake, arising from some subjects' misconception of the game form. We conclude that choice data should not be granted an unqualified interpretation of preference revelation. Mistakes in choices, obscured by a possible error at the foundations of the theory of framing, can masquerade as having been produced by non-standard preferences.https://resolver.caltech.edu/CaltechAUTHORS:20140403-155709665Transcript of a Five-Member Committee Experiment
https://resolver.caltech.edu/CaltechAUTHORS:20131219-120212562
DOI: 10.7907/ezztk-1nb98
The following pages contain the transcript of a committee
experiment of the type first introduced by Fiorina and Plott [1].
The subjects, law students in their second year of study, were very
articulate, sensitive to moral concerns, and skilled in strategic
behavior. Their conversations, debates and reflections might provide
a good source of data for scholars seeking to explain the equilibrium
decision made by similar groups. The transcript may also provide
some experience for those interested in field applications of laboratory
results with the type of social environment within which mathematical
models are known to work.https://resolver.caltech.edu/CaltechAUTHORS:20131219-120212562On Game Solutions and Revealed Preference Theory
https://resolver.caltech.edu/CaltechAUTHORS:20131218-143710230
DOI: 10.7907/bmxc6-e7576
Several connections between the concepts underlying the theory of revealed preference and the concepts underlying solutions to cooperative games have been established by Wilson. In this paper we provide some new connections. Wilson established the relationship between the solution concept of Von Neumann and Morgenstern and the strongest forms of rational choice found in Richter and Hansson. Here, for the case of finite sets of alternatives, we provide connections with weaker "degrees" of rationality found in Plott, Richter, and Sen.https://resolver.caltech.edu/CaltechAUTHORS:20131218-143710230Rationality and Relevance in Social Choice Theory
https://resolver.caltech.edu/CaltechAUTHORS:20131218-152844304
DOI: 10.7907/erhr3-gcv11
The central argument of this paper is that concepts such as "social preference," "social rationality," "public interest," "social benefits" and "social welfare" are unnecessary for the development and application of welfare economics principles and the design and/or modification of political economic processes. The primary reasons for using these constructions as offered by Samuelson and Arrow are misleading if not simply wrong. The features of the concepts which make their use compelling are also features of other approaches to problems. Furthermore, since the tools themselves automatically restrict analysis to a rather "uninteresting" family of political-economic processes, their use may even be detrimental to the development of a relevant body of theory.https://resolver.caltech.edu/CaltechAUTHORS:20131218-152844304An Experimental Analysis of the Structure of Legal Fees:
American Rule vs. English Rule
https://resolver.caltech.edu/CaltechAUTHORS:20140325-161757272
DOI: 10.7907/7xn4b-ze373
The expanding volume of lawsuits and the ballooning of legal expenditures in recent years have
attracted the interest, concern, and even anger of the American public and politicians. These
developments have led law makers to consider alternative legal fee allocation rules as methods for
administering justice more efficiently. Under the traditional American rule, parties to a lawsuit
must each pay their own legal expenses. One reform proposal is the English rule, under which the
losing party must pay the prevailing party's attorney fees in addition to her own expenses. To
evaluate the different effects of these two rules on litigant behavior and legal outcomes, we
conduct a theoretical and experimental analysis of environments which can be interpreted as legal
disputes in which the probability of winning a lawsuit is partially determined by the legal
expenditures of the litigants and partially determined by the inherent merits of the case. We
investigate decisions regarding trial expenditure and examine the effects of the two allocation
rules on pretrial issues of suit and settlement. The data demonstrate that game theoretic
equilibrium models produce good qualitative predictions of the relative institutional response to
changes in the allocation rule and to differences in such parameters as case merit and lawyer
productivity. In our most significant result, we find that the English rule produces significantly
higher expenditure at trial than the American rule. On the other hand, the frequency of trial is
significantly lower under the English rule. Combining these two effects, we find that average
expenditure per legal dispute is higher under the English rule than under the American rule.https://resolver.caltech.edu/CaltechAUTHORS:20140325-161757272An Internal Fuel Efficiency Credit Market Mechanism for Meeting the CAFE Standard: Internalizing a Regulation Caused Externality
https://resolver.caltech.edu/CaltechAUTHORS:20140224-133248343
DOI: 10.7907/xy6p6-yfx04
The paper develops and analyzes an internal market based mechanism that enables a decentralized enterprise to meet the conditions of the Corporate Average Fuel Economy (CAFE) regulations. Divisions that produce vehicles with fuel economy (miles per gallon) above the regulatory requirement receive Fuel Efficiency Credits (FEC). These credits can be sold in an internal FEC market to divisions that produce vehicles with fuel economy levels below the regulatory requirement. The FEC available for sale by fuel efficient vehicle production and the FEC needed as a condition of production of fuel inefficient vehicles are tied to the respective fuel efficiency levels. Experimental tests demonstrate that the enterprise can achieve near profit-maximum levels while continuing to operate through decentralized profit centers. The FEC market "internalizes" the externality across divisions created by the CAFE regulation. The behavioral model supported by the data suggests that the policy can be successfully crafted to include multiple firms trading FECs.https://resolver.caltech.edu/CaltechAUTHORS:20140224-133248343The Stingy Shift Explained as a Majority Rule Equilibrium
https://resolver.caltech.edu/CaltechAUTHORS:20140304-155330028
DOI: 10.7907/x942g-s2266
[Introduction] Baron, Roper and Baron (1974) claim that group decisions
regarding contributions to a charitable cause sometimes represent
generally stingier options than the options picked by individuals
when choosing alone. Their study and interpretations are consistent
with a sizable social psychological literature which postulate:o;
that "choice shifts" of various kinds occur as a result of group
decision. Theories used to explain "choice shifts" usually rest
on principles of group decision involving concepts like cultural
values, responsibility, leadership, etc. In the present case, for
example, the diffusion of personal responsibility for uncharitable
behavior was offered as one explanation for smaller mean donations
by groups.https://resolver.caltech.edu/CaltechAUTHORS:20140304-155330028The FCC Rules for the 700MHZ Auction: A Potential Disaster
https://resolver.caltech.edu/CaltechAUTHORS:20140317-134505845
DOI: 10.7907/arjxm-55823
In July 2000 the FCC issued the rules to govern the upcoming 700MHz auction. The rules are a
departure from the auction architectures previously used by the FCC. Rather than bidding
only on individual licenses, the auction participants will be able to bid on combinations or
packages of licenses. Several combinatorial auction processes exist in the literature and testing
demonstrates that such processes have a potential for substantially increasing the efficiency of
the auction. While combinatorial auction systems have been studied in various forms, indeed a
particular auction architecture was developed and studied extensively for the FCC, the rules that
emerged from the FCC deliberations are unlike any that have ever been implemented before.
The purpose of this note is to call attention to the fact that the particular rules developed by the
FCC hold the potential for tarnishing the long history of successful auctions within the FCC.
The questions posed in the pages that follow are: (i) Will the auction perform efficiently? (ii)
Does the FCC have the tools to accelerate the auction or hasten its timely termination? (iii) Will
the auction architecture scale up? The thesis of this paper is that the answer to all three of these
fundamental questions is "no". At base, the auction rules rest on an inappropriate set of
principles. The principles and the intuition drawn from those principles might serve well when
the auction is restricted to bids on individual items but when the bids can be on packages of
items the principles simply do not apply.
The first section of this note outlines the rules. The second section lists problems that can evolve
from the implementation of the rules. The third section contains observations about the sources
of the problems caused by the rules and the final section suggests changes in the rules that will
remove all of the problems listed.https://resolver.caltech.edu/CaltechAUTHORS:20140317-134505845Information Aggregation in Experimental Asset Markets: Traps and Misaligned Beliefs
https://resolver.caltech.edu/CaltechAUTHORS:20140317-134842763
DOI: 10.7907/c02mr-kz141
The capacity of markets to aggregate information has been conclusively demonstrated, but the limitations of that capacity have still not been fully explored. In this paper, we demonstrate the existence of "information traps". These traps appear to be a sort of equilibrium in which information existing in the market does not become revealed in prices. The foundation for the equilibrium is a pattern of misaligned beliefs in which each person's actions are based upon mistaken beliefs about the information held by others. The mistakes, themselves, have a type of mutual compatibility and cannot become revealed by the price discovery process because individuals have no incentives or resources to adjust. Attempts to probe the nature of the phenomena involved two-period markets with contingent claim instruments, experienced participants, and unlimited short-selling opportunities.https://resolver.caltech.edu/CaltechAUTHORS:20140317-134842763Principles of Continuous Price Determination In An Experimental Environment With Flows Of Random Arrivals And Departures
https://resolver.caltech.edu/CaltechAUTHORS:20140317-145139816
DOI: 10.7907/4zdjm-c8812
A new experimental market environment is developed. Continuously arriving incentives replace the traditional period structure. The issue posed is whether classical principles of market behavior apply when the environment is constantly changing. Three broad results emerge. (1) Natural "flow" generalizations of the laws of demand and supply exist and dictate much of the market behavior. (2) Two different classes of laws operate: the "temporal equilibrium", which is based on the parameters that exist in the market at a moment and the "flow competitive equilibrium," which reflects the probabilistic structure of the parameters. (3) The markets exhibit extraordinarily high levels of efficiency.https://resolver.caltech.edu/CaltechAUTHORS:20140317-145139816Information Aggregation Mechanisms: Concept, Design and Implementation for a Sales Forecasting Problem
https://resolver.caltech.edu/CaltechAUTHORS:20140317-135547085
DOI: 10.7907/n9y0a-a5y79
Information Aggregation Mechanisms are economic mechanisms designed explicitly for the purpose of collecting and aggregating information. The modern theory of rational expectations, together with the techniques and results of experimental economics, suggest that a set of properly designed markets can be a good information aggregation mechanism. The paper reports on the deployment of such an information aggregation mechanism inside Hewlett-Packard Corporation for the purpose of making sales forecasts. Results show that IAMs performed better than traditional methods employed inside Hewlett-Packard. The structure of the mechanism, the methodology, and the results are reported.https://resolver.caltech.edu/CaltechAUTHORS:20140317-135547085Toward a Theory of Professional Diagnosis and Service: Consumer Behavior
https://resolver.caltech.edu/CaltechAUTHORS:20140304-164416349
DOI: 10.7907/5n4yg-fw786
A model is developed for situations in which consumers depend upon producers of a good or service for information which has an impact on their demand. Nonsupply sources of information do not exist and consumers are forced to rely on comparisons between suppliers as their only check on potential fraud. The optimal search strategies are characterized and some of the implications for the resulting patterns of advice are analyzed.https://resolver.caltech.edu/CaltechAUTHORS:20140304-164416349A Computerized Laboratory Market System and Research Support Systems for the Multiple Unit Double Auction
https://resolver.caltech.edu/CaltechAUTHORS:20140320-161443776
DOI: 10.7907/n7dgx-ah530https://resolver.caltech.edu/CaltechAUTHORS:20140320-161443776Private R&D and Second Sourcing in Procurement: An Experimental Study
https://resolver.caltech.edu/CaltechAUTHORS:20140311-132705679
DOI: 10.7907/stvhg-z4356
This study focuses on two topics in government procurement: second sourcing, and private research and development (R&D) investment. A simple theoretical framework is developed to analyze the likely effects on private R&D and procurement prices of recent proposals regarding competition in procurement and the associated data-rights policy. The framework is also used to demonstrate a major flaw in the current methodology used in the evaluation of the benefits of second sourcing. In procurement environments where private R&D is an important factor and potential sellers have commercial markets which may be adversely affected, second sourcing may reduce competition in the initial procurement stage. Experimental methods are used to test for the existence of the effect.https://resolver.caltech.edu/CaltechAUTHORS:20140311-132705679A Review of Decision Theoretic Literature with Implications Regarding Governmental Research and Development Policies
https://resolver.caltech.edu/CaltechAUTHORS:20140304-162217916
DOI: 10.7907/g2m80-7v364
N/Ahttps://resolver.caltech.edu/CaltechAUTHORS:20140304-162217916Local Telephone Exchanges, Regulation, and Entry
https://resolver.caltech.edu/CaltechAUTHORS:20140320-164955793
DOI: 10.7907/n2m6h-wsy26
This paper explores the relationship between technology and the policies that govern competition in the local telephone business. Analysis of competition policies requires a "long run" modeling perspective in which not only the entry and exit of firms are allowed but also changes in the nature of the investment in the underlying network technology, such as the network backbone and its topology. This long run perspective requires a focus on the sources and conditions of joint production and public goods that exist in the production process, and on how they are influenced by the finance of the business and by the constraints policies place on firms to provide services that are not profitable.https://resolver.caltech.edu/CaltechAUTHORS:20140320-164955793Asset Bubbles and Rationality: Additional Evidence from Capital Gains Tax Experiments
https://resolver.caltech.edu/CaltechAUTHORS:20140317-152859749
DOI: 10.7907/661k1-aq830
[Introduction] The remarkable phenomenon of bubbles and crashes in laboratory asset markets was first
discovered and reported in Smith et al (1988). Subsequent research inquired about the robustness
of the phenomenon and how it might be explained. One interpretation of the data is that public
knowledge of rationality is lacking in the subjects, which leads to a type of individually rational,
bubble creating speculation as part of an attempt to acquire capital gains. A different
interpretation is that subjects begin with a type of confusion or mistaken understanding about this
particular environment and that such "irrationality" at the individual level initiates the bubble,
which could be sustained by a lack of common knowledge of rationality even after all confusion
becomes removed during the process of participating in the market. This paper explores these
two ideas through the study of experiments in which a capital gains tax is imposed that makes
speculation for capital gains unprofitable except under extreme circumstances.https://resolver.caltech.edu/CaltechAUTHORS:20140317-152859749From Non Market Attitudes to Market Behavior: Laboratory Market Experiments in Moscow, and the Hvatat Property of Human Behavior
https://resolver.caltech.edu/CaltechAUTHORS:20140320-165915758
This paper reports on laboratory market experiments that were conducted in Moscow
during the fall of 1992. Two broad concerns guided the design of the experiments. The
first concern is the obvious cultural differences between people in Russia and those in the
west where traditional laboratory market experiments have been conducted. Most
economists who have conducted experiments would assume that cultural background
would not play a major part in the equilibration process, and that the law of supply and
demand would operate essentially the same in all cultures and at all times. As Russia is
undergoing a dramatic social change and before an orientation to market attitudes
permeates the society, a unique opportunity presented itself to test this assumption about
the universal nature of the laws of the market.
In the language of the experimenters, this first concern is whether or not some of the
known properties of markets that operate in western culture are robust to major changes
in the culture of the subject population. The properties of markets that are of interest are
the fact of equilibration, the influence of price ceilings on equilibration and the influence
of the asymmetries in the shapes of the demand and supply curves on the direction of
convergence to equilibrium. The second concern reflects a desire to explore some more
recently discovered aspects of market behavior. A transactions cost was added to the
market and the question posed was whether or not the cost frustrated or biased
equilibration (Jamison and Plott, 1997).
The results of the experiments were not as anticipated. The standard convergence process
was observed but when the data were applied to test the asymmetric rent hypothesis or
the hypothesis regarding the nonbinding price controls, some surprises surfaced. In
particular the data were not the same as had been observed in other experiments. This
paradoxical failure of previous results to generalize prompted a detailed investigation into
the behavior of individuals in the experiments. The hope was to find the causes of the discrepancies. A special section of the paper is devoted to a conjecture that resulted from
the ex-post examination of the data and is offered as an explanation.
The paper is organized as follows. In the first section the purposes of the project are
outlined in greater detail. The second section contains the experimental design,
procedures and parameters. The third section contains the predictions of alternative
models as applied to the special case of the parameters of the experiments. The fourth
section contains the results of the experiments. The fifth section contains the post
experiment speculations about the possible explanations of the surprising dynamic
behavior. Here the idea of Hvatat, which means "to grab" in Russian, is introduced and
explored. The final section is a summary of conclusions.https://resolver.caltech.edu/CaltechAUTHORS:20140320-165915758Transparency Versus Back-Room Deals in Bargaining
https://resolver.caltech.edu/CaltechAUTHORS:20160229-145809760
DOI: 10.7907/0g2gw-ayy10
We design an experiment to study the effects of transparency on bargaining processes. We show that whether transparency arises endogenously depends on the degree of competition between subjects. In a competitive setting there is no transparency: subjects use private communication channels to compete for favors from those in power and establish backroom deals. In the absence of competition the bargaining process is transparent: subjects communicate publicly and outcomes are more egalitarian. We further show that in a competitive setting, imposing transparency by requiring all communication to be public reduces the observed competition between subjects and leads to more egalitarian outcomes.https://resolver.caltech.edu/CaltechAUTHORS:20160229-145809760A Parimutuel-like Mechanism from Information Aggregation: A Field Test Inside Intel
https://resolver.caltech.edu/CaltechAUTHORS:20140403-161916829
DOI: 10.7907/ws7ab-8k442
Field tests of a new Information Aggregation Mechanism (IAM) developed via laboratory experimental methods were implemented inside Intel Corporation for sales forecasting. The IAM, which incorporates selected features of parimutuel betting, is uniquely designed to collect and quantize as probability distributions any dispersed, subjectively held information that might exist. The tests demonstrate the robustness of experimental results and the practical usefulness of the IAM. The IAM yields predicted distributions of future sales that are very accurate at short horizons; indeed, more accurate than Intel's official in-house forecast 59% of the time. A symmetric game model suggests why the IAM works.https://resolver.caltech.edu/CaltechAUTHORS:20140403-161916829Duality in dynamic discrete choice models
https://resolver.caltech.edu/CaltechAUTHORS:20160329-093057029
DOI: 10.7907/4gxmb-6an66
Using results from convex analysis, we investigate a novel approach to identification and estimation of discrete choice models which we call the "Mass Transport Approach" (MTA). We show that the conditional choice probabilities and the choice-specific payoffs in these models are related in the sense of conjugate duality, and that the identification problem is a mass transport problem. Based on this, we propose a new two-step estimator for these models; interestingly, the first step of our estimator involves solving a linear program which is identical to the classic assignment (two-sided matching) game of Shapley and Shubik (1971). The application of convex-analytic tools to dynamic discrete choice models, and the connection with two-sided matching models, is new in the literature.https://resolver.caltech.edu/CaltechAUTHORS:20160329-093057029Average Choice
https://resolver.caltech.edu/CaltechAUTHORS:20160321-140851688
DOI: 10.7907/gnvny-ekt49
This is an investigation of stochastic choice when only the average of the choices is observable, for example when one observes aggregate sales numbers from a store but not the frequency with which each item was purchased. The focus of our work is on the Luce model, also known as the Logit model. We show that a simple path independence property of average choice uniquely characterizes the Luce model. We also characterize the linear Luce model, using similar tools. A linear version of the Luce model is used most frequently in empirical work by applied economists.
Our characterization is based on the property of path independence, which runs counter to early impossibility results on path independent choice. From an empirical perspective, our results provide a small-sample advantage over the tests of Luce's model that rely on estimating choice frequencies.https://resolver.caltech.edu/CaltechAUTHORS:20160321-140851688Call Market Experiments: Efficiency and Price Discovery through Multiple Calls and Emergent Newton Adjustments
https://resolver.caltech.edu/CaltechAUTHORS:20160222-141654454
DOI: 10.7907/6kery-xsc22
We study multiple-unit, laboratory experimental call markets in which orders are cleared by a single price at a scheduled "call". The markets are independent trading "days" with two calls each day preceded by continuous and public order flow. Markets approach the competitive equilibrium over time. The price formation dynamics operate through the flow of bids and asks configured as the "jaws" of the order book with contract execution structured by an underlying mathematical principle, the Newton method for solving systems of equations. Thus, both excess demand and its slope play a systematic role in call market price discovery.https://resolver.caltech.edu/CaltechAUTHORS:20160222-141654454Two Information Aggregation Mechanisms for Predicting the Opening Weekend Box Office Revenues of Films: Boxoffice Prophecy and Guess of Guesses
https://resolver.caltech.edu/CaltechAUTHORS:20160222-140722119
DOI: 10.7907/bgxee-k1z71
Successful field tests were conducted on two new Information Aggregation Mechanisms (IAMs). The mechanisms collected information held as intuitions about opening weekend box office revenues for movies in Australia. Participants were film school students. One mechanism is similar to parimutuel betting that produces a probability distribution over box office amounts. Except for "art house films", the predicted distribution is indistinguishable from the actual revenues. The second mechanism is based on guesses of the guesses of others and applied when incentives for accuracy could not be used. It tested well against data and contains information not encompassed by the first mechanism.https://resolver.caltech.edu/CaltechAUTHORS:20160222-140722119Multiple Items, Ascending Price Auctions: An Experimental Examination of Alternative Auction Sequences
https://resolver.caltech.edu/CaltechAUTHORS:20160222-134834747
DOI: 10.7907/7ds1y-hk484
The paper investigates the revenue and efficiency of different ascending price auction architectures for the sale of three items to five bidders. Four architectures are studied: two different sequences of single-item auctions, simultaneous auctions with a common countdown clock, and simultaneous auctions with item-specific countdown clocks. A countdown clock measures the time until the auction closes but resets with each new bid. The environment contains independent private values, no uncertainty about own preferences, no information about others' preferences, and a one-unit budget constraint. The Nash equilibrium best response with straightforward bidding fits both dynamic and outcome data well. When non-unique Nash equilibria exist, as in the case of simultaneous markets with a common clock, the social value maximizing Nash equilibrium emerges as the equilibrium selection. Both total revenue and efficiencies depend on the architecture as predicted by the Nash model, with the exception of the independent clocks architecture, which performs poorly on all dimensions.https://resolver.caltech.edu/CaltechAUTHORS:20160222-134834747Mechanism Design with Public Goods: Committee Karate, Cooperative Games, and the Control of Social Decisions through Subcommittees
https://resolver.caltech.edu/CaltechAUTHORS:20160222-140238779
DOI: 10.7907/v28ye-tjm02
Axioms from social choice theory and the core of cooperative games in effectiveness form are used to design an organization that influences a voting group to choose the alternative preferred by a designer. The designer has information about individual preferences and can dictate organization but cannot dictate choice. The designer's influence works through decision centers (subcommittees). Subcommittee memberships, subcommittee separation, the alternatives available to the subcommittees, the chairpersons and voting rules can be used to create games with appropriate configurations of cores that result in group decisions according to the designer's wishes. The institutions leave considerable flexibility to subcommittee decisions and appear to be fair. Manipulation is not detected. Core alternatives emerge as the group choice. Conflicting individual preferences enable organizational structures such that a wide range of alternatives can be made the solution. Experiments demonstrate that the resulting model is a very accurate predictor of the group choice.https://resolver.caltech.edu/CaltechAUTHORS:20160222-140238779The Process of Choice in Guessing Games
https://resolver.caltech.edu/CaltechAUTHORS:20160229-095936436
DOI: 10.7907/avqen-78z78
This paper employs a new experimental design to provide insight into strategic choice in one-shot games. We incentivize and observe provisional choices in the 2/3 guessing game in the period after the structure of the game has been communicated. Early selections in these "strategic choice process" data provide insight into naive (L0) play, and support the standard assumption that such choices average 50. While average strategic sophistication rises over time, we identify significant individual differences in this respect. These differences appear to be broad-based: those whose strategic sophistication grows most in our experiment also perform best at separate learning tasks.https://resolver.caltech.edu/CaltechAUTHORS:20160229-095936436Flip-Flopping, Primary Visibility and the Selection of Candidates
https://resolver.caltech.edu/CaltechAUTHORS:20160229-095607758
DOI: 10.7907/5tzpw-t0696
We present an incomplete information model of two-stage elections in which candidates can choose different platforms in primaries and general elections. Voters do not directly observe the chosen platforms, but infer the candidates' ideologies from observing candidates' campaigns. The ability of voters to detect candidates' types depends on the visibility of the race. This model captures two patterns: the post-primary moderation effect, in which candidates pander to the party base during the primary and shift to the center in the general election; and the divisive-primary effect, which refers to the detrimental effect of hard-fought primaries on a party's general-election prospects.https://resolver.caltech.edu/CaltechAUTHORS:20160229-095607758Stochastic Choice and Preferences for Randomization
https://resolver.caltech.edu/CaltechAUTHORS:20160229-095152009
DOI: 10.7907/77dwy-0rp12
We conduct an experiment to investigate the origin of stochastic choice and to differentiate between the three main classes of models that account for it: Random Expected Utility; Mistakes; and Deliberate Randomization. Subjects face the same questions multiple times in two ways: 1) following the literature, with repetitions distant from each other; 2) in a novel treatment, with repetitions in a row, telling subjects that questions will be repeated. A large majority of subjects exhibited stochastic choice in both cases, and stochasticity is strongly correlated in the two cases. Our results support the class of models of Deliberate Randomization.https://resolver.caltech.edu/CaltechAUTHORS:20160229-095152009The Political Economy of Public Debt: A Laboratory Study
https://resolver.caltech.edu/CaltechAUTHORS:20160301-115133535
DOI: 10.7907/fbzrt-67w98
This paper reports the results from a laboratory experiment designed to study political distortions in the accumulation of public debt. A legislature bargains over the levels of a public good and of district specific transfers in two periods. The legislature can issue or purchase risk-free bonds in the first period and the level of public debt creates a dynamic linkage across policymaking periods. In line with the theoretical predictions, we find that public policies are inefficient and efficiency is increasing in the size of the majority requirement, with higher investment in public goods and lower debt associated with larger majority requirements. Also in line with the theory, we find that debt is lower when the probability of a negative shock to the economy in the second period is higher, evidence that legislators use debt to smooth consumption. The experiment also highlights two phenomena that are not predicted by standard theories and have not been previously documented. First, balancing the budget in each period appears to be a focal point for some legislators, leading to lower distortions than predicted. Second, higher majority requirements induce significant delays in reaching an agreement.https://resolver.caltech.edu/CaltechAUTHORS:20160301-115133535External Validation of Voter Turnout Models by Concealed Parameter Recovery
https://resolver.caltech.edu/CaltechAUTHORS:20160301-134313074
DOI: 10.7907/x0b0n-1k851
We conduct a model validation analysis of several behavioral models of voter turnout, using laboratory data. We call our method of model validation concealed parameter recovery, where estimation of a model is done under a veil of ignorance about some of the experimentally controlled parameters — in this case voting costs. We use quantal response equilibrium as the underlying, common structure for estimation, and estimate models of instrumental
voting, altruistic voting, expressive voting, and ethical voting. All the models except the ethical model recover the concealed parameters reasonably well. We also report the results of a counterfactual analysis based on the recovered parameters, to compare the policy implications of the different models about the cost of a subsidy to increase turnout.https://resolver.caltech.edu/CaltechAUTHORS:20160301-134313074The Dynamic Free Rider Problem: A Laboratory Study
https://resolver.caltech.edu/CaltechAUTHORS:20160301-135235371
DOI: 10.7907/5wr2k-ph607
We report the results from an experiment designed explicitly to study the Markov Perfect Equilibrium (MPE) dynamics of free riding behavior in the accumulation of a durable public good. We consider two cases: economies with reversibility (RIE), where the agents can either increase or decrease the accumulated stock; and economies with irreversibility (IIE), where contributions are non-negative. Our findings support the key qualitative prediction of MPE: IIE converges to an accumulated level of public good that is an order of magnitude higher than RIE. We also find that the accumulation path is inefficiently slow in both RIE and IIE, and the public good is significantly under-provided.https://resolver.caltech.edu/CaltechAUTHORS:20160301-135235371Quantal Response and Nonequilibrium Beliefs Explain Overbidding in Maximum-Value Auctions
https://resolver.caltech.edu/CaltechAUTHORS:20160301-134857624
DOI: 10.7907/pr9cf-1xg79
We report an experiment on a simple common value auction to investigate the extent to which bidding can be explained by quantal response equilibrium, in combination with different assumptions about the structure of bidder beliefs: the cursed equilibrium model and models that posit levels of strategic sophistication. Using a structural estimation approach, we find a close correspondence between the theoretical predictions of those models and experimental behavior. The basic pattern of average bids in the data consists of a combination of overbidding for low signals and value-bidding for higher signals. The logit QRE model with heterogeneous bidders fits this pattern reasonably well. Combining quantal response with either cursed beliefs (CE-QRE) or a level-k of strategic sophistication (LK-QRE, CH-QRE) leads to a close match with the data. All these variations on quantal response models predict minimal differences in average bidding behavior across different versions of the game, consistent with the experimental findings. Finally, we reanalyze data from an earlier experiment on the same auction by Ivanov, Levin and Niederle (2010). While their data exhibit much more variance compared with ours, we still find that these models fit their data reasonably well, even in the presence of the extreme overbidding observed in that experiment. Overall, our study indicates that the winner's curse phenomenon in this auction is plausibly attributable to limits on strategic thinking combined with quantal response.https://resolver.caltech.edu/CaltechAUTHORS:20160301-134857624What Makes Voters Turn Out: The Effects of Polls and Beliefs
https://resolver.caltech.edu/CaltechAUTHORS:20160229-144052263
DOI: 10.7907/v0mdy-epq83
We use laboratory experiments to test for one of the foundations of the rational voter paradigm: that voters respond to probabilities of being pivotal. We exploit a setup that entails stark theoretical effects of information concerning the preference distribution (as revealed through polls) on costly participation decisions. We find that voting propensity increases systematically with subjects' predictions of their preferred alternative's advantage. Consequently, pre-election polls do not exhibit the detrimental welfare effects that extant theoretical work predicts. They lead to more participation by the expected majority and generate more landslide elections.https://resolver.caltech.edu/CaltechAUTHORS:20160229-144052263Trading Votes for Votes. A Decentralized Matching Algorithm
https://resolver.caltech.edu/CaltechAUTHORS:20160301-115651807
DOI: 10.3386/w21645
Vote-trading is common practice in committees and group decision-making. Yet we know very little about its properties. Inspired by the similarity between the logic
of sequential rounds of pairwise vote-trading and matching algorithms, we explore three central questions that have parallels in the matching literature: (1) Does a stable allocation of votes always exist? (2) Is it reachable through a decentralized algorithm? (3) What welfare properties does it possess? We prove that a stable allocation exists and is always reached in a finite number of trades, for any number of voters and issues, for any separable preferences, and for any rule on how trades are prioritized. Its welfare properties, however, are guaranteed to be desirable only under specific conditions. A laboratory experiment confirms that stability has predictive
power on the vote allocation achieved via sequential pairwise trades, but lends only weak support to the dynamic algorithm itself.https://resolver.caltech.edu/CaltechAUTHORS:20160301-115651807Ignorance and Bias in Collective Decisions
https://resolver.caltech.edu/CaltechAUTHORS:20160301-133946961
DOI: 10.7907/6fx75-j5292
We study theoretically and experimentally a committee with common interests. Committee members do not know which of two alternatives is best, but each member can privately acquire a costly signal before casting a vote under either majority or unanimity rule. In the experiment, as predicted by Bayesian equilibrium, voters are more likely to acquire information under majority rule, and attempt to counter the bias in favor of one alternative under unanimity rule. Contrary to the Bayesian equilibrium predictions, however, many committee members vote when uninformed. Moreover, uninformed voting is strongly associated with a lower propensity to acquire information. We show that an equilibrium model of subjective prior beliefs can account for both these phenomena, and provides a good overall fit to the observed patterns of behavior both in terms of rational ignorance and biases.https://resolver.caltech.edu/CaltechAUTHORS:20160301-133946961How Cheap Talk Enhances Efficiency in Public Goods Games
https://resolver.caltech.edu/CaltechAUTHORS:20160301-120023499
DOI: 10.7907/99d56-9zp69
This paper uses a Bayesian mechanism design approach to investigate the effects of communication in a threshold public goods game. Individuals have private information about contribution costs. If at least some fraction of the group make a discrete contribution, a public benefit accrues to all members of the group. We experimentally implement three different communication structures prior to the decision move: (a) simultaneous exchange of binary messages, (b) larger finite numerical message space and (c) unrestricted text chat. We obtain theoretical bounds on the efficiency gains that are obtainable under these different communication structures. In an experiment with three person groups and a threshold of two, we observe significant efficiency gains only with the richest of these communication structures, where participants engage in unrestricted text chatting. In that case, the efficiency bounds implied by mechanism design theory are achieved.https://resolver.caltech.edu/CaltechAUTHORS:20160301-120023499Testable Implications of Translation Invariance and Homotheticity: Variational, Maxmin, CARA and CRRA preferences
https://resolver.caltech.edu/CaltechAUTHORS:20160308-151531257
DOI: 10.7907/wfa41-5z837
We provide revealed preference axioms that characterize models of translation invariant preferences. In particular, we characterize the models of variational, maxmin, CARA and CRRA utilities. In each case we present a revealed preference axiom that is satisfied by a dataset if and only if the dataset is consistent with the corresponding utility representation. Our results complement traditional exercises in decision theory that take preferences as primitive.https://resolver.caltech.edu/CaltechAUTHORS:20160308-151531257Static and Dynamic Underinvestment: an Experimental Investigation
https://resolver.caltech.edu/CaltechAUTHORS:20160229-144613925
DOI: 10.7907/dv3vt-ztf54
In this paper we design a stylized version of an environment with public goods, dynamic linkages, and legislative bargaining. Our theoretical framework studies the provision of a durable public good as a modified version of Battaglini et al. (2012). We develop an experimental design that allows us to disentangle inefficiencies that would result in a one-shot world (static inefficiencies) from extra inefficiencies that emerge in an environment in which decisions in the present affect the future (dynamic inefficiencies). We solve for efficiency and also characterize the bargaining equilibrium, a symmetric stationary subgame-perfect equilibrium, which is the most common concept used in applied work. The experimental results indicate that subjects do react to dynamic linkages and, as such, we find evidence of both static and dynamic inefficiencies.
In fact, the quantitative predictions of the model with respect to the share of dynamic inefficiencies are closest to the data when dynamic linkages are high. To the extent that behavior is different from what is predicted by the model, a systematic pattern emerges, namely the use of strategic cooperation whereby subjects increase the efficiency of period one proposals by selectively punishing, in period two, subjects who did not propose efficient allocations.https://resolver.caltech.edu/CaltechAUTHORS:20160229-144613925Political Reputations and Campaign Promises
https://resolver.caltech.edu/CaltechAUTHORS:20160314-151357281
DOI: 10.7907/fkwdm-vm028
We analyze conditions under which candidates' reputations may affect the beliefs of the voters over what policy will be implemented by the winning candidate of an election. We develop a model of repeated elections with complete information in which candidates are purely ideological. We analyze an equilibrium in which voters' strategies involve a credible threat to punish candidates who renege on their campaign promises, and in which all campaign promises are believed by voters and honored by candidates. We characterize the maximal credible campaign promises and show that the degree to which promises are credible in equilibrium is an increasing function of the value of a candidate's reputation.https://resolver.caltech.edu/CaltechAUTHORS:20160314-151357281Testable Implications of Exponential Discounting
https://resolver.caltech.edu/CaltechAUTHORS:20160321-132448266
DOI: 10.7907/4jx8b-zxb60
We develop a behavioral axiomatic characterization of exponentially discounted utility (EDU) over consumption streams. The data are an individual agent's behavior in the market: a finite collection of purchases across periods. We show that such behavior satisfies a "revealed preference axiom" if and only if there exists an EDU model (a discount rate per period and a concave utility function over money) that accounts for the given intertemporal consumption.https://resolver.caltech.edu/CaltechAUTHORS:20160321-132448266The Perception-Adjusted Luce Model
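The EDU functional being axiomatized can be written out in a few lines; this is a hedged sketch in which the square-root utility is an arbitrary concave choice for illustration, not the paper's.

```python
# A minimal sketch of the EDU model: a per-period discount factor delta and a
# concave utility over money. math.sqrt is an arbitrary concave choice used
# only for illustration.
import math

def edu(stream, delta, u=math.sqrt):
    """Exponentially discounted utility of a finite consumption stream:
    sum over periods t of delta**t * u(x_t)."""
    return sum((delta ** t) * u(x) for t, x in enumerate(stream))
```

The revealed preference test in the paper asks whether some pair (delta, u) of this form rationalizes the observed purchases.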
https://resolver.caltech.edu/CaltechAUTHORS:20160321-134531088
DOI: 10.7907/spc2v-qkd82
We develop an axiomatic model that builds on Luce's (1959) model to incorporate a role for perception. We identify agents' "perception priorities" from their violations of Luce's axiom of independence from irrelevant alternatives. Using such perception priorities, we adjust choice probabilities to account for the effects of perception. Our axiomatization requires that the agents' adjusted random choice conforms to Luce's model. Our model can explain the attraction, compromise, and similarity effects, which are very well-documented behavioral phenomena in individual choice.https://resolver.caltech.edu/CaltechAUTHORS:20160321-134531088Strategic Uncertainty and Unraveling in Matching Markets
https://resolver.caltech.edu/CaltechAUTHORS:20160321-134856275
DOI: 10.7907/qmt4b-v3n68
We present a theoretical explanation of inefficient early matching in matching markets. Our explanation is based on strategic uncertainty and strategic unraveling. We identify a negative externality imposed on the rest of the market by agents who make early offers. As a consequence, an agent may make an early offer because she is concerned that others are making early offers. Yet other agents make early offers because they are concerned that others worry about early offers; and so on and so forth. The end result is that any given agent is more likely to make an early offer than a later offer.https://resolver.caltech.edu/CaltechAUTHORS:20160321-134856275Ordinal and cardinal solution concepts for two-sided matching
https://resolver.caltech.edu/CaltechAUTHORS:20160321-134222786
We characterize solutions for two-sided matching, both in the transferable- and in the nontransferable-utility frameworks, using a cardinal formulation. Our approach makes the comparison of the matching models with and without transfers particularly transparent. We introduce the concept of a no-trade matching to study the role of transfers in matching. A no-trade matching is one in which the availability of transfers does not affect the outcome.https://resolver.caltech.edu/CaltechAUTHORS:20160321-134222786Testable Implications of Bargaining Theories
https://resolver.caltech.edu/CaltechAUTHORS:20160322-142103902
DOI: 10.7907/emff8-xea35
We develop the testable implications of well-known theories of bargaining over money. Given a finite data set of bargaining outcomes, where utility functions are unknown, we ask if a given theory could have generated the observations. When the data come with a fixed disagreement point, we show that the Nash, utilitarian, and the egalitarian max-min bargaining solutions are all observationally equivalent. These theories are in turn characterized by a simple test of comonotonicity of bargaining outcomes.
When the disagreement point is allowed to vary, we characterize the testable implications of the equal gain/loss egalitarian solution. The main application of our result is to testing the tax code for compliance with the principle of equal loss. For other theories, we introduce a general method based on the study of real solutions to systems of polynomial inequalities.https://resolver.caltech.edu/CaltechAUTHORS:20160322-142103902The Core Matchings of Markets with Transfers
https://resolver.caltech.edu/CaltechAUTHORS:20160321-141503758
DOI: 10.7907/mzhde-p8e50
We characterize the structure of the set of core matchings of an assignment game (a two-sided market with transfers). Such a set satisfies a property we call consistency. Consistency of a set of matchings states that, for any matching v, if, for each agent i there exists a matching μ in the set for which μ(i) = v(i), then v is in the set. A set of matchings satisfies consistency if and only if there is an assignment game for which all elements of the set maximize the surplus. We also identify conditions under which we can assume the assignment game has nonnegative values.https://resolver.caltech.edu/CaltechAUTHORS:20160321-141503758Response Time and Utility
https://resolver.caltech.edu/CaltechAUTHORS:20160321-135336141
DOI: 10.7907/fbmth-qyz68
Response time is the time an agent needs to make a decision. One fundamental finding in psychology and neuroscience is that, in a binary choice, response time is shorter as the difference between the utilities of the two options becomes larger. We consider situations in which utilities are not observed but rather inferred from revealed preferences, that is, from subjects' choices. Given data on subjects' choices and the time taken to make those choices, we give conditions on the data that characterize the property that response time is decreasing in utility differences.https://resolver.caltech.edu/CaltechAUTHORS:20160321-135336141General Luce Model
https://resolver.caltech.edu/CaltechAUTHORS:20160323-155625592
DOI: 10.7907/s6fev-wn796
We extend the Luce model of discrete choice theory to satisfactorily handle zero-probability choices. The Luce model (or the Logit model) is the most widely applied and used model in stochastic choice, but it struggles to explain choices that are never made. The Luce model requires that if an alternative y is never chosen when x is available, then there is no set of alternatives from which y is chosen with positive probability: y cannot be chosen even from sets of alternatives that exclude x. We relax this assumption. In our model, if an alternative y is never chosen when x is available, then we infer that y is dominated by x. While dominated by x, y may still be chosen with positive probability, even with high probability, when grouped with a comparable set of alternatives.https://resolver.caltech.edu/CaltechAUTHORS:20160323-155625592Social History and Agricultural Productivity: The Paris Basin, 1450-1800
https://resolver.caltech.edu/CaltechAUTHORS:20160321-095731969
DOI: 10.7907/f7ts1-4xm48
This paper uses a sample of leases and a new method to examine total factor productivity in the Paris Basin during the years 1450-1789. After defending the methodology, the paper analyzes the results from the sample, which should dispel the myth of agricultural stagnation in Old-Regime France, at least in the Paris Basin.https://resolver.caltech.edu/CaltechAUTHORS:20160321-095731969Economic Theory and Sharecropping in Early Modern France
https://resolver.caltech.edu/CaltechAUTHORS:20160321-095554842
DOI: 10.7907/d8p0s-tkx41
This paper uses a simple economic model of contract choice to explain the growth of sharecropping in sixteenth- and seventeenth-century France, a topic that figures in much of the social and economic history of the period. The theory turns out to fit both qualitative and quantitative evidence, and although the results are as yet only preliminary, the theory does provide a better account of the spread of sharecropping than the explanations early modern historians have tended to rely upon.https://resolver.caltech.edu/CaltechAUTHORS:20160321-095554842New Evidence for an Old Controversy: Scattered Landholdings and Open Fields
https://resolver.caltech.edu/CaltechAUTHORS:20160321-100029690
DOI: 10.7907/rw4ty-5r351
We bring new evidence to bear on McCloskey's argument that farmers in the open fields reduced risk by scattering their land holdings. The new evidence is the grain output from a number of plots of land in two French villages, Onnaing and Quarouble, during the years 1701-1790. When combined with prices and wages, the output figures provide financial returns for each plot of land, and financial theory then allows us to construct land portfolios that minimize portfolio variance for a given mean return. The virtue of using returns (rather than simple output correlations) is that the returns take into account the price fluctuations farmers encountered. They also allow us to distinguish the benefits of scattering from those produced by crop diversification and they do so with greater accuracy than the output figures. In the end, the returns demonstrate that scattering of land holdings provided relatively little insurance. The real reduction in risk came not from scattering but from the diversification across crops inherent in the three-field system.https://resolver.caltech.edu/CaltechAUTHORS:20160321-100029690Pious Bequests in Wills: A Statistical Analysis
https://resolver.caltech.edu/CaltechAUTHORS:20160321-095011052
DOI: 10.7907/8k9fz-2dq44
This paper employs maximum likelihood methods to analyze the increase in pious bequests in early modern French wills. Other historians have described the rise of pious bequests in wills, but no one has used multivariate statistical methods to explain the phenomenon. It turns out that at any level of wealth pious bequests rose over the course of the seventeenth century and that the bequests were most pronounced among the literate and women. The paper argues that the increase in pious bequests was a mark of growing support for the Counter Reformation, which attracted an inordinate number of supporters in educated circles and in the female population.https://resolver.caltech.edu/CaltechAUTHORS:20160321-095011052Social History and Taxes: the Case of Early Modern France
https://resolver.caltech.edu/CaltechAUTHORS:20160321-095244068
DOI: 10.7907/h71ej-w4525
[Introduction] Apart from a flurry of interest in tax revolts ten years ago, social historians of early modern Europe have by and large ignored taxation. Their neglect is perhaps understandable, given that social history itself arose as a revolt against traditional political history and all that it entailed, including the operations of the fisc. The fact that details of early modern fiscal systems often lie interred in tedious administrative histories or that many political historians themselves seem to overlook matters of interest to social historians of course only compounds the problem.https://resolver.caltech.edu/CaltechAUTHORS:20160321-095244068Voting and Lottery Drafts as Efficient Public Goods Mechanisms
https://resolver.caltech.edu/CaltechAUTHORS:20160328-160830650
DOI: 10.7907/x8d9n-q6988
This paper characterizes interim efficient mechanisms for public good production and cost allocation in a two-type environment with risk-neutral, quasi-linear preferences and fixed-size projects, where the distribution of the private good, as well as the public goods decision, affects social welfare. An efficient public good decision can always be accomplished by a majority voting scheme, where the number of "YES" votes required depends on the welfare weights in a simple way. The results are shown to have a natural geometry and an intuitive interpretation. We also extend these results to allow for restrictions on feasible transfer rules, ranging from the traditional unlimited transfers to the extreme case of no transfers.
For a range of welfare weights, an optimal scheme is a two-stage procedure which combines a voting stage with a second stage where an even-chance lottery is used to determine who pays. We call this the "lottery draft mechanism". Since such a cost-sharing scheme does not require transfers, it follows that in many cases transfers are not necessary to achieve the optimal allocation. For other ranges of welfare weights the second stage is more complicated, but the voting stage remains the same. If transfers are completely infeasible, randomized voting rules may be optimal. The paper also provides a geometric characterization of the effects of voluntary participation constraints.https://resolver.caltech.edu/CaltechAUTHORS:20160328-160830650Simple Two-Stage Inference for A Class of Partially Identified Models
https://resolver.caltech.edu/CaltechAUTHORS:20160330-152952577
DOI: 10.7907/3gn2w-wxz77
This note proposes a new two-stage estimation and inference procedure for a class of partially identified models. The procedure can be considered an extension of classical minimum distance estimation procedures to accommodate inequality constraints and partial identification. It involves no tuning parameter, is nonconservative and is conceptually and computationally simple. The class of models includes models of interest to applied researchers, including the static entry game, a voting game with communication and a discrete mixture model.https://resolver.caltech.edu/CaltechAUTHORS:20160330-152952577Random Projection Estimation of Discrete-Choice Models With Large Choice Sets
https://resolver.caltech.edu/CaltechAUTHORS:20160329-095921634
DOI: 10.7907/9g8y6-hqs08
We introduce sparse random projection, an important dimension-reduction tool from machine learning, for the estimation of discrete-choice models with high-dimensional choice sets. Initially, the high-dimensional data are compressed into a lower-dimensional Euclidean space using random projections. Subsequently, estimation proceeds using cyclic monotonicity moment inequalities implied by the multinomial choice model; the estimation procedure is semi-parametric and does not require explicit distributional assumptions to be made regarding the random utility errors. The random projection procedure is justified via the Johnson-Lindenstrauss Lemma: the pairwise distances between data points are preserved during data compression, which we exploit to show convergence of our estimator. The estimator works well in a computational simulation and in an application to a supermarket scanner dataset.https://resolver.caltech.edu/CaltechAUTHORS:20160329-095921634Testing the Quantal Response Hypothesis
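A sparse random projection of the kind the abstract invokes can be sketched as follows; this is an illustration of the general Achlioptas-style construction, not the authors' implementation, and all names here are hypothetical.

```python
# Illustrative sketch only (not the authors' implementation): a sparse
# random projection from dimension d down to k. Entries are
# sqrt(3/k) * {+1 w.p. 1/6, 0 w.p. 2/3, -1 w.p. 1/6}, a standard scheme that
# preserves pairwise distances in expectation (the Johnson-Lindenstrauss idea).
import random

def sparse_projection_matrix(k, d, seed=0):
    """Draw a k x d sparse random projection matrix."""
    rng = random.Random(seed)
    scale = (3.0 / k) ** 0.5
    # the choice list encodes probabilities 1/6, 2/3, 1/6 for +1, 0, -1
    return [[scale * rng.choice([1, 0, 0, 0, 0, -1]) for _ in range(d)]
            for _ in range(k)]

def project(matrix, x):
    """Compress a d-dimensional point x to k dimensions."""
    return [sum(r * xi for r, xi in zip(row, x)) for row in matrix]
```

Because most entries are zero, the projection is cheap to apply even when the original choice set, and hence d, is very large.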
https://resolver.caltech.edu/CaltechAUTHORS:20160329-095654596
DOI: 10.7907/2ajb4-d0492
This paper develops a formal test for consistency of players' behavior in a series of games with the quantal response equilibrium (QRE). The test exploits a characterization of the equilibrium choice probabilities in a QRE as the gradient of a convex function, which thus satisfies the cyclic monotonicity inequalities. Our testing procedure utilizes
recent econometric results for moment inequality models. We assess the performance of the test using both Monte Carlo simulation and lab experimental data from a series of generalized matching pennies games. Our experimental findings are consistent with the literature: the joint hypothesis of QRE, risk neutrality and player role homogeneity is rejected in the pooled data, but cannot be rejected in the individual data for over half of the subjects. By considering subsets of cyclic monotonicity inequalities, our approach also highlights the nature of QRE consistency violations.https://resolver.caltech.edu/CaltechAUTHORS:20160329-095654596To Score or Not to Score? Estimates of a Sponsored Search Auction Model
https://resolver.caltech.edu/CaltechAUTHORS:20160329-095057403
DOI: 10.7907/gjyrs-nap73
We estimate a structural model of a sponsored search auction. To accommodate the "position paradox", we relax the assumption, common in the literature, that click volumes decrease with position rank. Using data from "Website X", one of the largest online marketplaces in China, we find that merchants of different qualities adopt different bidding strategies: high-quality merchants bid more aggressively for informative keywords, while low-quality merchants are more likely to be sorted to the top positions for value keywords. Counterfactual evaluations show that the price trend becomes steeper after moving to a score-weighted generalized second price auction, with much higher prices obtained for the top position but lower prices for the other positions. Overall, there is only a very modest change in total revenue from introducing popularity scoring, despite the intent in bid scoring to reward popular merchants with price discounts.https://resolver.caltech.edu/CaltechAUTHORS:20160329-095057403Estimating Multinomial Choice Models using Cyclic Monotonicity
https://resolver.caltech.edu/CaltechAUTHORS:20160329-095352718
DOI: 10.7907/njsz3-40831
This paper proposes a new identification and estimation approach to semi-parametric multinomial choice models that easily applies to not only cross-sectional settings but also panel data settings with unobservable fixed effects. Our approach is based on cyclic monotonicity, which is a defining feature of the random utility framework underlying multinomial choice models. From the cyclic monotonicity property, we derive identifying
inequalities without requiring any shape restriction for the distribution of the random utility shocks. These inequalities point identify model parameters under straightforward assumptions on the covariates. We propose a consistent estimator based on these inequalities, and apply it to a panel data set to study the determinants of the demand for bathroom tissue.https://resolver.caltech.edu/CaltechAUTHORS:20160329-095352718A Simple Estimator for Dynamic Models with Serially Correlated Unobservables
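The cyclic monotonicity property invoked in this abstract has a concrete form: if choice probabilities are the gradient of a convex function of the utility index, then every cycle of utility vectors u_1, …, u_L (with u_{L+1} = u_1) satisfies Σ_t p(u_t)·(u_{t+1} − u_t) ≤ 0. The helper below is a hypothetical illustration, not the paper's estimator; it measures the largest violation of these inequalities over all cycles through a small set of observed (utility, choice-probability) pairs.

```python
import itertools
import numpy as np

def cyclic_monotonicity_violation(utilities, probs):
    """Largest violation of the cyclic monotonicity inequalities over
    all cycles through the observed pairs.  For the gradient of a
    convex function, every cycle satisfies
        sum_t p(u_t) . (u_{t+1} - u_t) <= 0,
    so a positive return value flags a violation."""
    n = len(utilities)
    worst = 0.0
    for L in range(2, n + 1):
        for cyc in itertools.permutations(range(n), L):
            total = sum(
                probs[cyc[t]] @ (utilities[cyc[(t + 1) % L]] - utilities[cyc[t]])
                for t in range(L)
            )
            worst = max(worst, total)
    return worst
```

As a sanity check, logit choice probabilities (the softmax, which is the gradient of the convex log-sum-exp function) satisfy every cyclic monotonicity inequality exactly, so the function returns no violation for such data.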
https://resolver.caltech.edu/CaltechAUTHORS:20160330-124651115
DOI: 10.7907/6372h-8ss62
We present a method for estimating Markov dynamic models with unobserved state variables which can be serially correlated over time. We focus on the case where all the model variables have discrete support. Our estimator is simple to compute because it is noniterative and involves only elementary matrix manipulations. Our estimation method is nonparametric, in that no parametric assumptions on the distributions of the unobserved state variables or the laws of motion of the state variables are required. Monte Carlo simulations show that the estimator performs well in practice, and we illustrate its use with a dataset of doctors' prescriptions of pharmaceutical drugs.https://resolver.caltech.edu/CaltechAUTHORS:20160330-124651115Preference Identification
https://resolver.caltech.edu/CaltechAUTHORS:20170707-095244159
DOI: 10.48550/arXiv.1807.11585
An experimenter seeks to learn a subject's preference relation. The experimenter produces pairs of alternatives. For each pair, the subject is asked to choose. We argue that, in general, large but finite data sets do not give close approximations of the subject's preference, even when countably infinitely many data points are enough to infer the preference perfectly. We then provide sufficient conditions on the set of alternatives, preferences, and sequences of pairs so that the observation of finitely many choices allows the experimenter to learn the subject's preference with arbitrary precision. The sufficient conditions are strong, but encompass many situations of interest. And while preferences are approximated, we show that it is harder to identify utility functions. We illustrate our results with several examples, including expected utility, and preferences in the Anscombe-Aumann model.https://resolver.caltech.edu/CaltechAUTHORS:20170707-095244159ACE: A Combinatorial Market Mechanism
https://resolver.caltech.edu/CaltechAUTHORS:20170707-110629443
DOI: 10.7907/mte8a-n4r44
In 1990 the South Coast Air Quality Management District created a tradable emissions program to more efficiently manage the extremely bad emissions in the
Los Angeles basin. The program created 136 different assets that an environmental engineer could use to cover emissions in place of installing expensive abatement equipment. Standard markets could not deal with this complexity and little trading occurred. A new combinatorial market was created in response and operated successfully for many years. That market design, called ACE (approximate competitive equilibrium), is described in detail and its successful performance in practice is analyzed.https://resolver.caltech.edu/CaltechAUTHORS:20170707-110629443Candidate entry and political polarization: An experimental study
https://resolver.caltech.edu/CaltechAUTHORS:20170707-100521521
DOI: 10.7907/92153-cfm32
We report the results of a laboratory experiment based on a citizen‐candidate model with private information about ideal points. Inefficient political polarization is observed in all treatments; that is, citizens with extreme ideal points enter as candidates more often than moderate citizens. Second, less entry occurs, with even greater polarization, when voters have directional information about candidates' ideal points, using ideological party labels. Nonetheless, this directional information is welfare enhancing because the inefficiency from greater polarization is outweighed by lower total entry costs and better voter information. Third, entry rates are decreasing in group size and the entry cost. These findings are all implied by properties of the unique symmetric Bayesian equilibrium of the entry game. Quantitatively, we observe too little (too much) entry when the theoretical entry rates are high (low). This general pattern of observed biases in entry rates is implied by logit quantal response equilibrium.https://resolver.caltech.edu/CaltechAUTHORS:20170707-100521521Nowhere to Run, Nowhere to Hide: Asset Diversification in a Flat World
https://resolver.caltech.edu/CaltechAUTHORS:20170707-102156656
DOI: 10.7907/5e9f8-dp598
We present new international diversification indexes across equity, sovereign debt, and real estate. The indexes reveal a marked and near ubiquitous decline in diversification potential across asset classes and markets for the post-2000 period. Analysis of panel data suggests that the decline is related to higher levels of market credit risk and volatility as well as to technology and communications innovation as proxied by internet diffusion. The decline in diversification opportunity is associated with sharply higher levels of investment risk.https://resolver.caltech.edu/CaltechAUTHORS:20170707-102156656Empirical Evidence of Overbidding in M&A Contests
https://resolver.caltech.edu/CaltechAUTHORS:20170707-101356502
DOI: 10.7907/t9c90-47y20
Surprisingly few papers have attempted to develop a direct empirical test for overbidding in M&A contests. We develop such a test grounded on a necessary condition for profit maximizing bidding behavior. The test is not subject to endogeneity concerns. Our results strongly support the existence of overbidding. We provide evidence that overbidding is related to conflicts of
interest, but also some indirect evidence that it arises from failing to fully account for the winner's curse.https://resolver.caltech.edu/CaltechAUTHORS:20170707-101356502The Effects of Income Mobility and Tax Persistence on Income Redistribution and Inequality
https://resolver.caltech.edu/CaltechAUTHORS:20170707-111918164
DOI: 10.7907/15m59-zva74
We explore the effect of income mobility and the persistence of redistributive tax policy on the level of redistribution in democratic societies. An infinite-horizon theoretical model is developed, and the properties of the equilibrium tax rate and the degree of after-tax inequality are characterized. Mobility and stickiness of tax policy are both negatively related to the equilibrium tax rate. However, neither is sufficient by itself. Social mobility has no effect on equilibrium taxes if tax policy is voted on in every period, and tax persistence has no effect in the absence of social mobility. The two forces are complementary. Tax persistence leads to higher levels of post-tax inequality, for any amount of mobility. The effect of mobility on inequality is less clear-cut and depends on the degree of tax persistence. A laboratory experiment is conducted to test the main comparative static predictions of the theory, and the results are generally supportive.https://resolver.caltech.edu/CaltechAUTHORS:20170707-111918164Manipulation
https://resolver.caltech.edu/CaltechAUTHORS:20170707-094613945
DOI: 10.7907/wxbw4-tfx47
[Introduction] Systematic opportunities for manipulation emerge as a by-product of the structure of all group decision processes. Theory suggests that no process is immune. The study of manipulation provides principles and insights about how parts of complex decision systems work together and how changes in one part can have broad impact. Thus, manipulation strategies are derived from many features of voting processes. Often they are the product of changes in the decision environment, including rules, procedures and influence on others, in order to achieve a specific purpose. The issues and variables go beyond an individual's own voting strategy within a specific setting and whether or not preferences are truthfully revealed – an issue often studied. Hopefully, the insights can lead to avenues for improvements to decision processes and thus produce a better understanding of process vulnerabilities.https://resolver.caltech.edu/CaltechAUTHORS:20170707-094613945Subset Optimization for Asset Allocation
https://resolver.caltech.edu/CaltechAUTHORS:20170725-111618998
DOI: 10.7907/vrg1e-9xr80
Subset optimization provides a new algorithm for asset allocation that is particularly useful in settings with many securities and short return histories. Rather than optimizing weights for all N securities jointly, subset optimization constructs Complete Subset Portfolios (CSPs) that naively aggregate many "Subset Portfolios," each optimizing weights over a subset of only N̂ randomly selected securities. With known means and variances, the complete subset efficient frontier for different subset sizes characterizes CSPs' utility loss due to satisficing, which generally decreases with N̂. In finite samples, the bound on CSPs' expected out-of-sample performance loss due to sampling error generally increases with N̂. By balancing this tradeoff, CSPs' expected out-of-sample performance dominates both the 1/N rule and sample-based optimization. Simulation and backtest experiments illustrate CSPs' robust performance against existing asset allocation strategies.https://resolver.caltech.edu/CaltechAUTHORS:20170725-111618998Communication Among Voters Benefits the Majority Party
https://resolver.caltech.edu/CaltechAUTHORS:20170725-105126699
DOI: 10.7907/33gtv-w8n81
How does communication among voters affect turnout? And who benefits from it? In a laboratory experiment in which subjects, divided into two competing parties, choose between costly voting and abstaining, we study three pre-play communication treatments: No Communication, a control; Public Communication, where all voters exchange public messages through computer chat; and Party Communication, where messages are also exchanged but only within one's own party. Our main finding is that communication always benefits the majority party by increasing its expected turnout margin and, hence, its expected margin of victory and probability of winning the election. Party communication increases overall turnout, while public communication increases turnout with a high voting cost but decreases it with a low voting cost. With communication, we find essentially no support for the standard Nash equilibrium predictions and limited consistency with correlated equilibrium.https://resolver.caltech.edu/CaltechAUTHORS:20170725-105126699Multi-Utilitarianism in Two-Agent Quasilinear Social Choice
https://resolver.caltech.edu/CaltechAUTHORS:20170731-163356703
DOI: 10.7907/jsqq1-8qq48
We introduce a new class of rules for resolving quasilinear social choice problems. These rules extend those of Green. We call such rules multi-utilitarian rules. Each multi-utilitarian rule is associated with a probability measure over the set of weighted utilitarian rules, and is derived as the expectation with respect to this measure. These rules are characterized by the axioms of efficiency, translation invariance, monotonicity, continuity, and additivity. By adding recursive invariance, we obtain a class of asymmetric rules generalizing those Green characterizes. A multi-utilitarian rule satisfying strong monotonicity has an associated probability measure with full support.https://resolver.caltech.edu/CaltechAUTHORS:20170731-163356703Hedonism vs. Nihilism: No Arbitrage and Tests of Urban Economic Models
https://resolver.caltech.edu/CaltechAUTHORS:20170801-150130331
DOI: 10.7907/k41v7-g5s87
We present two notions of "no arbitrage" in urban economic models and show that there is no model satisfying both. The standard hedonic housing model of urban economics and its generalizations are consistent with the first of these, but inconsistent with the second. We present a model consistent with the second notion of "no arbitrage" and a continuum of models consistent with neither notion that are observationally
equivalent to the standard model, even if the utility function of consumers is known. Only one of these is the standard model. Thus, the available tests of the standard model cannot provide much evidence of its validity. Finally, we examine nonlinear price systems consistent with the second notion of "no arbitrage" and their welfare consequences.https://resolver.caltech.edu/CaltechAUTHORS:20170801-150130331The Relevance of a Choice of Auction Format in a Competitive Environment
https://resolver.caltech.edu/CaltechAUTHORS:20170731-145236001
DOI: 10.7907/tkn9d-mbv20
We examine the relevance of an auction format in a competitive environment by analyzing and comparing uniform and discriminatory price auctions with many bidders. We show that if the number of objects for sale is small relative to the number of bidders, then all equilibria of both auctions are approximately efficient and lead to approximately the same revenue. When the number of objects for sale is proportional to the number of bidders, then the particulars of the auction format matter. All equilibria of the uniform auction are efficient while all of the equilibria of the discriminatory auction are inefficient. The relative revenue rankings of the auction formats can go in either direction, depending on the specifics of the environment.https://resolver.caltech.edu/CaltechAUTHORS:20170731-145236001Counting Combinatorial Choice Rules
https://resolver.caltech.edu/CaltechAUTHORS:20170731-131625843
DOI: 10.7907/5zgfd-ahq27
I count the number of combinatorial choice rules that satisfy certain properties: Kelso-Crawford substitutability, and independence of irrelevant alternatives. The results are important for two-sided matching theory, where agents are modeled by combinatorial choice rules with these properties. The rules are a small, and asymptotically vanishing, fraction of all choice rules. But they are still exponentially more numerous than the preference relations over individual agents—which has positive implications for the Gale-Shapley algorithm of matching theory.https://resolver.caltech.edu/CaltechAUTHORS:20170731-131625843A Voting Model of Federal Standards with Externalities
https://resolver.caltech.edu/CaltechAUTHORS:20170801-134757637
DOI: 10.7907/7cdr5-27t20
This paper proposes a framework for studying policy making in a federal system in the presence of spillover externalities. Local jurisdictions choose local policies by majority rule subject to standards that are set by majority rule at the federal level. We characterize the induced preferences of voters for federal policies, prove existence of local majority rule equilibrium, provide an example of nonexistence of global majority rule equilibrium, and explore the welfare properties of federal standards in the presence of spillovers.https://resolver.caltech.edu/CaltechAUTHORS:20170801-134757637A Model of Elections with Spatial and Distributive Preferences
https://resolver.caltech.edu/CaltechAUTHORS:20170731-132542621
DOI: 10.7907/fszkv-dad98
This paper introduces a model of elections as games in which voters have preferences over a public good (policy platforms) and a private good (transfers). The model produces the standard social choice results such as core convergence and policy separation. Furthermore, by introducing transfers, I am able to make more precise predictions about candidate locations and their dynamics than is possible under the standard spatial model. Another purpose of this paper is to examine the creation of favored groups in elections.
Ultimately, it is important to characterize political behavior while considering the different preferences that might exist in the constituents. By incorporating utility for private goods into standard utility assumptions, this model introduces these considerations into the standard spatial model, allowing us to have a richer and more nuanced look into elections.https://resolver.caltech.edu/CaltechAUTHORS:20170731-132542621Secure Implementation Experiments: Do Strategy-proof Mechanisms Really Work?
https://resolver.caltech.edu/CaltechAUTHORS:20170801-151328081
DOI: 10.7907/hgxxb-qc166
Strategy-proofness, requiring that truth-telling is a dominant strategy, is a standard concept used in social choice theory. Saijo et al. (2003) argue that this concept has serious drawbacks. In particular, announcing one's true preference may not be a unique dominant strategy, and almost all strategy-proof mechanisms have a continuum of Nash equilibria. For only a subset of strategy-proof mechanisms do the set of Nash equilibria
and the set of dominant strategy equilibria coincide. For example, this double coincidence occurs in the Groves mechanism when preferences are single-peaked. We report experiments using two strategy-proof mechanisms where one of them has a large number of Nash equilibria, but the other has a unique Nash equilibrium. We found clear differences in the rate of dominant strategy play between the two.https://resolver.caltech.edu/CaltechAUTHORS:20170801-151328081Is the Status Quo Relevant in a Representative Democracy?
https://resolver.caltech.edu/CaltechAUTHORS:20170731-163819462
DOI: 10.7907/rmqqz-v4303
This work studies the effect of the value of the status quo on candidates' decisions and policy outcomes in a representative democracy with endogenous candidates. Following the citizen-candidate model due to Besley and Coate (1997), we show, for a unidimensional policy issue and for both an odd and even number of citizens, that some equilibria only hold for certain values of the status quo policy. In particular, we find that a moderate status quo rules out equilibrium outcomes in which there is an uncontested candidate and that two-candidate equilibria exist more generally when the number of citizens is even.https://resolver.caltech.edu/CaltechAUTHORS:20170731-163819462A Theory of Stability in Many-to-many Matching Markets
https://resolver.caltech.edu/CaltechAUTHORS:20170731-153916899
DOI: 10.7907/c57yy-fm360
We develop a theory of stability in many-to-many matching markets. We give conditions under which the setwise-stable set, a core-like concept, is nonempty and can be approached through an algorithm. The setwise-stable set coincides with the pairwise-stable set, and with the predictions of a non-cooperative bargaining model. The setwise-stable set possesses the canonical conflict/coincidence-of-interest properties from many-to-one and one-to-one models. The theory parallels the standard theory of stability for many-to-one and one-to-one models. We provide results for a number of core-like solutions, besides the setwise-stable set.https://resolver.caltech.edu/CaltechAUTHORS:20170731-153916899Dynamic Urban Models: Agglomeration and Growth
https://resolver.caltech.edu/CaltechAUTHORS:20170801-144041981
DOI: 10.7907/941k4-0x975
Theoretical models of urban growth are surveyed in a common framework. Exogenous growth models, in which growth of some capital stock is assumed to be a function of investment, are examined first. Then endogenous growth models, where use of some factor by a firm increases the productivity of other firms, are studied. These are all models with perfect competition among agents. Next, models with imperfect competition are discussed. There are two varieties: those employing a monopolistic competition approach to product differentiation, and those employing explicit externalities but lacking some markets. Finally, avenues for future research are explored. Correlations between agglomeration and growth in the various models are compared with those in the data.https://resolver.caltech.edu/CaltechAUTHORS:20170801-144041981Strategy-proof Sharing
https://resolver.caltech.edu/CaltechAUTHORS:20170801-134753475
DOI: 10.7907/f8k4x-e9y30
We consider the problem of sharing a good, where agents prefer more to less. In this environment, we prove that a sharing rule satisfies strategy-proofness if and only if it has the quasi-constancy property: no one changes her own share by changing her announcements. Next, by constructing a system of linear equations, we provide a way to find all of the strategy-proof sharing rules, and identify a necessary and sufficient condition for the existence of a non-constant, strategy-proof sharing rule. Finally, we show that it is only the equal sharing rule that satisfies strategy-proofness and symmetry.https://resolver.caltech.edu/CaltechAUTHORS:20170801-134753475Comparative Statics, English Auctions, and the Stolper-Samuelson Theorem
https://resolver.caltech.edu/CaltechAUTHORS:20170731-162537566
DOI: 10.7907/14ndq-cem43
Changes in the parameters of an n-dimensional system of equations induce changes in its solutions. For a class of such systems, we determine the qualitative change in solutions given certain qualitative changes in parameters. Our methods and results are elementary yet useful. They highlight the existence of a common thread, our "own effect" assumption, in formally diverse areas of economics. We discuss several applications; among them, we establish the existence of efficient equilibria in English auctions with interdependent valuations, and a version of the Stolper-Samuelson Theorem for an nxn trade model.https://resolver.caltech.edu/CaltechAUTHORS:20170731-162537566On the Informational Inefficiency of Discriminatory Price Auctions
https://resolver.caltech.edu/CaltechAUTHORS:20170731-144544163
DOI: 10.7907/kf5pq-qm872
We analyze bidding behavior in large discriminatory price auctions where the number of objects is a non-trivial proportion of the number of bidders. Bidders observe private signals that are affiliated with the common value. We show that the average price in the auction is biased downward from the expected value of the objects, even in the competitive limit. In particular, we show that conditional on relatively low signals, bidders bid the expected value of the objects conditional on their information and winning, while bids at higher signals
flatten out and are below the expected value conditional on winning.https://resolver.caltech.edu/CaltechAUTHORS:20170731-144544163Group Play in Games and the Role of Consent in Network Formation
https://resolver.caltech.edu/CaltechAUTHORS:20170731-141927016
DOI: 10.7907/a7xa5-mmq08
We study games played between groups of players, where a given group decides which strategy it will play through a vote by its members. When groups consist of two voting players, our games can also be interpreted as network-formation games. In experiments on Stag Hunt games, we find that the structure of the voting rule completely determines which equilibrium is played, independently of the payoff structure. Thus, we find a stark contrast between how groups and individuals play our games, with payoffs playing a much more important role in equilibrium selection in the latter case. We also explore play between groups where one member of each group dictates the play of that group. We find that the dictator tends to play a less risky strategy when choosing for a group than when playing only for himself or herself.
We develop a new solution concept, robust-belief equilibrium, which explains the data that we observe. We provide results showing that this solution concept has application beyond the particular games in our experiments.https://resolver.caltech.edu/CaltechAUTHORS:20170731-141927016Virtual Repeated Implementation
https://resolver.caltech.edu/CaltechAUTHORS:20170731-162153289
DOI: 10.7907/3q77x-a0h59
We show that in the context of repeated implementation, any social choice rule which realizes all alternatives for a positive (yet arbitrarily small) amount of time is Nash implementable. The results complement those of the virtual implementation literature.https://resolver.caltech.edu/CaltechAUTHORS:20170731-162153289Sequencing Strategies in Large, Competitive, Ascending Price Automobile Auctions: An experimental examination
https://resolver.caltech.edu/CaltechAUTHORS:20170801-104840850
DOI: 10.7907/n0e45-k4k76
This paper reports on a large-scale field experiment testing strategies available to a seller participating in simultaneous, competitive, sequential ascending-price automobile auctions. Every other week, the seller offered approximately 120 vehicles for sale in an auction environment in which several competitive sellers offered on the order of 3,000 vehicles. The experiment tested various sequences in which the seller could offer the vehicles, such as high values first or low values first. Surprisingly, and contrary to intuition drawn from the theory of single-item and single-seller auctions, the worst performing sequence from those tested is for the seller to order vehicles from highest to lowest values. The best sequence is to group the vehicles by type and offer the low valued vehicles first and then move to offer the higher valued vehicles. Our conjecture is that this sequence reduces the competition with other sellers for the attention of specialized buyers.https://resolver.caltech.edu/CaltechAUTHORS:20170801-104840850Minorities and Storable Votes
https://resolver.caltech.edu/CaltechAUTHORS:20170801-092302519
DOI: 10.7907/ncqpj-fb226
The paper studies a simple voting system that has the potential to increase the power of minorities without sacrificing aggregate efficiency. Storable votes grant each voter a stock of votes to spend as desired over a series of binary decisions. By accumulating votes on issues that it deems most important, the minority can win occasionally. But because the majority typically can outvote it, the minority wins only if its strength of preference is high and the majority's strength of preference is low. The result is that with storable votes, aggregate efficiency either falls little or in fact rises. The theoretical predictions of our model are confirmed by a series of experiments: the frequency of minority victories, the relative payoff of the minority versus the majority, and the aggregate payoffs all match the theory.https://resolver.caltech.edu/CaltechAUTHORS:20170801-092302519The Formation of Networks with Transfers Among Players
https://resolver.caltech.edu/CaltechAUTHORS:20170731-140304247
DOI: 10.7907/ykz9t-00w66
We examine the formation of networks among a set of players whose payoffs depend on the structure of the network. We focus on games where players may bargain by promising or demanding transfer payments when forming links. We examine several variations of the transfer/bargaining aspect of link formation. One aspect is whether players can only make and receive transfers to other players to whom they are directly linked, or whether they can also subsidize links that they are not directly involved in. Another aspect is whether or not transfers related to a given link can be made contingent on the full resulting network or only on the link itself. A final aspect is whether or not players can pay other players to refrain from forming links. We characterize the networks that are supported under these variations and show how each of the above aspects is related
either to accounting for a specific type of externality, or to dealing with the combinatorial nature of network payoffs.https://resolver.caltech.edu/CaltechAUTHORS:20170731-140304247Production, Trade, Prices, Exchange Rates and Equilibration in Large Experimental Economies
https://resolver.caltech.edu/CaltechAUTHORS:20170731-151806094
DOI: 10.7907/e83m7-dtn86
We study market equilibration in laboratory economies that are larger and more complex than any that have been studied experimentally to date. Complexity derives from the fact that the economies are international in economic structure, with multiple input, output, and foreign exchange markets in operation. The economies have twenty-one markets and roughly fifty agents, and so are characterized by several hundred equations. In spite of the complexity and interdependence of the economy, the results demonstrate the substantial power of the general equilibrium model of perfect competition to predict the direction of movement of market-level variables. Empirical patterns in the convergence process are explored and described.https://resolver.caltech.edu/CaltechAUTHORS:20170731-151806094Attack Politics: Who Goes Negative and Why?
https://resolver.caltech.edu/CaltechAUTHORS:20170801-102048204
DOI: 10.7907/p4aew-npv33
I introduce a formal model of campaign strategy to show when candidates will engage in negative campaigning and how it can affect election results. The model separates campaign strategies by target (self or opponent) and dimension (issue or character), and defines negative campaigning as attacking one's opponent on the character dimension. Whether candidates choose negative campaigning depends upon three factors: the preconceptions of the voter, the voter's preferred dimension, and the candidate types. I show that eliminating negative campaigning has an ambiguous effect on voter welfare. In some cases, eliminating the negative option can hurt superior candidates.https://resolver.caltech.edu/CaltechAUTHORS:20170801-102048204A General Characterization of Interim Efficient Mechanisms for Independent Linear Environments
https://resolver.caltech.edu/CaltechAUTHORS:20170731-152920042
DOI: 10.7907/fwfvf-9g770
We consider the class of Bayesian environments with independent types, and utility functions which are both quasi-linear in a private good and linear in a one-dimensional private-value type parameter. We call these independent linear environments. For these environments, we fully characterize interim efficient allocation rules which satisfy interim incentive compatibility and interim individual rationality constraints. We also prove that they correspond to decision rules based on virtual surplus maximization, together with the appropriate incentive taxes. We demonstrate how these techniques can be applied easily to the design of auctions, markets, bargaining rules, public good provision, and assignment problems.https://resolver.caltech.edu/CaltechAUTHORS:20170731-152920042Preference Aggregation under Uncertainty: Savage vs. Pareto
https://resolver.caltech.edu/CaltechAUTHORS:20170731-155251093
DOI: 10.7907/7mjtb-5d987
Following Mongin, we study social aggregation of subjective expected utility preferences in a Savage framework. We argue that each of Savage's P3 and P4 is incompatible with the strong Pareto property. A representation theorem for social preferences satisfying Pareto indifference and conforming to the state-dependent expected utility model is provided.https://resolver.caltech.edu/CaltechAUTHORS:20170731-155251093A Social Choice Lemma on Voting over Lotteries with Applications to a Class of Dynamic Games
https://resolver.caltech.edu/CaltechAUTHORS:20170801-153654831
DOI: 10.7907/eqx93-z9g89
We prove a lemma characterizing majority preferences over lotteries on a subset of Euclidean space. Assuming voters have quadratic von Neumann-Morgenstern utility representations, and assuming existence of a majority undominated (or "core") point, the core voter is decisive: one lottery is majority-preferred to another if and only if this is the preference of the core voter. Several applications of this result to dynamic voting games are discussed.https://resolver.caltech.edu/CaltechAUTHORS:20170801-153654831What Matchings Can Be Stable? The Refutability of Matching Theory
https://resolver.caltech.edu/CaltechAUTHORS:20170801-105611460
DOI: 10.7907/pwgkb-16d48
When can a collection of matchings be stable, if preferences are unknown? This question lies behind the refutability of matching theory. A preference profile rationalizes a collection of matchings if the matchings are stable under the profile. Matching theory is refutable if there are observations of matchings that cannot be rationalized. I show that the theory is refutable, and provide a characterization of the matchings that can be rationalized.https://resolver.caltech.edu/CaltechAUTHORS:20170801-105611460Efficiency, Equity, and Timing of Voting Mechanisms
https://resolver.caltech.edu/CaltechAUTHORS:20170801-091439642
DOI: 10.7907/2mhxz-jsh85
We compare the behavior of voters under simultaneous and sequential voting rules when voting is costly and information is incomplete. In many political institutions, ranging from small committees to mass elections, voting is sequential, which allows some voters to know the choices of earlier voters. For a stylized model, we generate a variety of predictions about the relative efficiency and participation equity of these two systems, which we test using controlled laboratory experiments. Most of the qualitative predictions are supported by the data, but there are significant departures from the predicted equilibrium strategies, in both the sequential and simultaneous voting games. We find a tradeoff between information aggregation, efficiency, and equity in sequential voting: a sequential voting rule aggregates information better than simultaneous voting and is more efficient in some information environments, but sequential voting is inequitable because early voters pay greater participation costs.https://resolver.caltech.edu/CaltechAUTHORS:20170801-091439642Learning Dynamics in Mechanism Design: An Experimental Comparison of Public Goods Mechanisms
https://resolver.caltech.edu/CaltechAUTHORS:20170731-161011632
DOI: 10.7907/hnkz4-k2r54
In a repeated-interaction public goods economy, dynamic behavior may affect the efficiency of various mechanisms thought to be efficient in one-shot games. Inspired by results obtained in previous experiments, the current paper proposes a simple best response model in which players' beliefs are functions of previous strategy profiles. The predictions of the model are found to be highly consistent with new experimental data from five mechanisms with various types of equilibria. Interesting properties of a 2-parameter Vickrey-Clarke-Groves mechanism help to draw out this result. The simplicity of the model makes it useful in predicting dynamic stability of other mechanisms.https://resolver.caltech.edu/CaltechAUTHORS:20170731-161011632Spatial Competition Between Two Candidates of Different Quality: The Effects of Candidate Ideology and Private Information
https://resolver.caltech.edu/CaltechAUTHORS:20170801-141851171
DOI: 10.7907/v19cb-txp80
This paper examines competition in a spatial model of two-candidate elections, where one candidate enjoys a quality advantage over the other candidate. The candidates care about winning and also have policy preferences. There is two-dimensional private information. Candidate ideal points as well as their tradeoffs between policy preferences and winning are private information. The distribution of this two-dimensional type is common knowledge. The location of the median voter's ideal point is uncertain, with a distribution that is commonly known by both candidates. Pure strategy equilibria always exist in this model. We characterize the effects of increased uncertainty about the median voter, the effect of candidate policy preferences, and the effects of changes in the distribution of private information. We prove that the distribution of candidate policies approaches the mixed equilibrium of Aragones and Palfrey (2002a), when both candidates' weights on policy preferences go to zero.https://resolver.caltech.edu/CaltechAUTHORS:20170801-141851171Mental Processes and Strategic Equilibration: An fMRI Study of Selling Strategies in Second Price Auctions
https://resolver.caltech.edu/CaltechAUTHORS:20170731-150528277
DOI: 10.7907/qh08g-khx44
This study is the first to attempt to isolate a relationship between cognitive activity and equilibration to a Nash equilibrium. Subjects, while undergoing fMRI scans of brain activity, participated in second price auctions against a single competitor following a predetermined strategy that was unknown to the subject. For this auction there is a unique strategy that maximizes the subject's earnings, which is also a Nash equilibrium of the associated game-theoretic model of the auction. As with all games, the bidding strategies of subjects participating in second price auctions most often do not reflect the equilibrium bidding strategy at first but, with experience, typically exhibit a process of equilibration, or convergence toward the equilibrium. This research is focused on the process of convergence.https://resolver.caltech.edu/CaltechAUTHORS:20170731-150528277Reduced Form Auctions Revisited
https://resolver.caltech.edu/CaltechAUTHORS:20170731-164441362
DOI: 10.7907/4f7kp-cze06
This note uses Farkas's Lemma to prove new results on the implementability of general, asymmetric auctions, and to provide simpler proofs of known results for symmetric auctions. The tradeoff is that type spaces are taken to be finite.https://resolver.caltech.edu/CaltechAUTHORS:20170731-164441362Interim Efficient Mechanism Design with Interdependent Valuations
https://resolver.caltech.edu/CaltechAUTHORS:20170801-104202489
DOI: 10.7907/x5vbk-r4382
We consider the class of Bayesian environments with one-dimensional private signals and interdependent valuations. For these environments, we fully characterize the set of interim efficient mechanisms which satisfy interim incentive compatibility and interim individual rationality constraints. In these mechanisms, the allocation rules assign probability one to an alternative that maximizes the sum of all agents' virtual valuations, defined for these economic settings, and the transfer functions depend on agents' welfare weights. This set of mechanisms is compelling since interim efficient mechanisms are the best in the sense that there is no other incentive compatible and individually rational mechanism that is preferred by each agent.https://resolver.caltech.edu/CaltechAUTHORS:20170801-104202489Nomination Processes and Policy Outcomes
https://resolver.caltech.edu/CaltechAUTHORS:20170801-112425972
DOI: 10.7907/q99jt-2ty33
We model and compare three different processes by which political parties nominate candidates for a general election: nominations by party leaders, nominations by a vote of party members, and nominations by a spending competition among potential candidates. We show that in equilibrium, non-median outcomes can result when two parties compete using nominations via any of these processes. We also show that more extreme outcomes can emerge from spending competition than from nominations by votes or by party leaders. When voters (and potential nominees) are free to switch political parties, then median outcomes ensue when nominations are decided by a vote but not when nominations are decided by spending competition.https://resolver.caltech.edu/CaltechAUTHORS:20170801-112425972Design Improved Parimutuel-type Information Aggregation Mechanisms: Inaccuracies and the Long-Shot Bias as Disequilibrium Phenomena
https://resolver.caltech.edu/CaltechAUTHORS:20170728-164034910
DOI: 10.7907/ztse6-z0b19
Information Aggregation Mechanisms (IAMs) based on parimutuel-type betting systems can aggregate information from complex environments. However, the performance of previously studied systems leaves something to be desired due to possible bluffing, strategic timing of decisions, and a so-called long-shot bias. This paper demonstrates that two modifications of parimutuel systems improve information aggregation performance by removing disinformation due to strategic behavior and by removing misleading disequilibrium behavior. The experiments also demonstrate that the so-called long-shot bias results from disequilibrium behavior rather than having roots in the psychology of the individuals.https://resolver.caltech.edu/CaltechAUTHORS:20170728-164034910Budget Balancedness and Optimal Income Taxation
https://resolver.caltech.edu/CaltechAUTHORS:20170801-152541438
DOI: 10.7907/5d8t8-qfj73
We make two main contributions to the theory of optimal income taxation. First, assuming conditions sufficient for existence of a Pareto optimal income tax and public goods mechanism, we show that if agents' preferences satisfy an extended notion of single crossing called capacity constrained single crossing, then there exists a Pareto optimal income tax and public goods mechanism that
is budget balancing. Second, we show that, even without capacity constrained single crossing, existence of a budget balancing, Pareto optimal income tax and public goods mechanism is guaranteed if the set of agent types contains no atoms.https://resolver.caltech.edu/CaltechAUTHORS:20170801-152541438Contestable Leaderships: Party Discipline and Vote Buying in Legislatures
https://resolver.caltech.edu/CaltechAUTHORS:20170801-102925452
DOI: 10.7907/f6486-5ae40
This paper examines the institutional determinants of discipline in legislative parties, building on the premise that leaders need to maintain support within the organization to continue leading. Payments distributed by the incumbent on the spot increase the value of promises of future benefits by fostering individuals' perceived chances that the incumbent will retain her position. The main result of the paper shows, in fact, that the party leader can use promises of future benefits to induce members to vote for a position disliked by the majority of the party only if she also distributes benefits on the spot.https://resolver.caltech.edu/CaltechAUTHORS:20170801-102925452On behavioral complementarity and its implications
https://resolver.caltech.edu/CaltechAUTHORS:20170728-161413630
DOI: 10.7907/zxcag-rzk39
We study the behavioral definition of complementary goods: if the price of one good increases, demand for a complementary good must decrease. We obtain its full implications for observable demand behavior (its testable implications), and for the consumer's underlying preferences. We characterize those data sets which can be generated by rational preferences exhibiting complementarities. In a model in which income results from selling an endowment (as in general equilibrium models of exchange economies), the notion is surprisingly strong and is essentially equivalent to Leontief preferences. In the model of nominal income, the notion describes a class of preferences whose extreme cases are Leontief and Cobb-Douglas.https://resolver.caltech.edu/CaltechAUTHORS:20170728-161413630Diffusion of Behavior and Equilibrium Properties in Network Games
https://resolver.caltech.edu/CaltechAUTHORS:20170801-085146282
DOI: 10.7907/b57qw-xmy24
We analyze games on social networks where agents select one of two actions (whether or not to adopt a new technology, withdraw money from the bank, become politically active, etc.). Each agent's payoff from each of the two actions depends on how many neighbors she has, the distribution of actions among her neighbors, and a possibly idiosyncratic cost for each of the actions. We analyze the diffusion of behavior when in each period agents choose a best response to last period's behavior. We characterize how the equilibrium points of such a process and their stability depend on the network architecture, the distribution of costs, and the payoff structure. We also illustrate how the dynamics of behavior depend on the number of neighbors that agents have. Our results have implications for, and applications to, marketing, epidemiology, financial contagion, and technology adoption.https://resolver.caltech.edu/CaltechAUTHORS:20170801-085146282Heterogeneous Quantal Response Equilibrium and Cognitive Hierarchies
https://resolver.caltech.edu/CaltechAUTHORS:20170801-093456313
DOI: 10.7907/1dxq5-hze86
We explore an equilibrium model of games where players' choice behavior is given by logit response functions, but their payoff responsiveness is heterogeneous. We extend the definition of quantal response equilibrium to this setting, calling it heterogeneous quantal response equilibrium (HQRE), and prove existence under weak conditions. We generalize HQRE to allow for limited insight, in which players can only imagine others with low responsiveness. We identify a formal connection between this new equilibrium concept, called truncated quantal response equilibrium (TQRE), and the Cognitive Hierarchy (CH) model. We show that CH can be approximated arbitrarily closely by TQRE. We report a series of experiments comparing the performance of QRE, HQRE, TQRE and CH. A surprise is that the fits of the models are quite close across a variety of matrix and dominance-solvable asymmetric information betting games. The key link is that in the QRE approaches, strategies with higher expected payoffs are chosen more often than strategies with lower expected payoffs. In CH this property is not built into the model, but generally holds true in the experimental data.https://resolver.caltech.edu/CaltechAUTHORS:20170801-093456313Electoral Competition with Privately-Informed Candidates
https://resolver.caltech.edu/CaltechAUTHORS:20170801-142748283
DOI: 10.7907/qghv9-ecr54
We consider a model of elections in which two office-motivated candidates receive private signals about the location of the median voter's ideal point prior to taking policy positions. We show that at most one pure strategy equilibrium exists and provide a sharp characterization, if one exists: After receiving a signal, each candidate locates at the median of the distribution of the median voter's location, conditional on the other candidate receiving the same signal. It follows that a candidate's position, conditional on his/her signal, is a biased estimate of the true median, with candidate positions tending to the extremes of the policy space. We provide sufficient conditions for the existence of a pure strategy equilibrium. Essentially, the pure strategy equilibrium exists if for each signal, the other candidate is sufficiently likely to receive a signal in the same direction that is at least as extreme. Though the electoral game exhibits discontinuous payoffs for the candidates, we prove that mixed strategy equilibria exist generally, that equilibrium expected payoffs are continuous in the parameters of the model, and that mixed strategy equilibria are upper hemicontinuous. We investigate the robustness of the median voter theorem to private information: Pure strategy equilibria may fail to exist in models close to the Downsian model, but mixed strategy equilibria must, and they will be "close" to the Downsian equilibrium. Finally, we provide bounds on the support of mixed strategy equilibria and restrictions on possible atoms of equilibrium mixed strategies.https://resolver.caltech.edu/CaltechAUTHORS:20170801-142748283Voting Blocs, Coalitions and Parties
https://resolver.caltech.edu/CaltechAUTHORS:20170801-095722280
DOI: 10.7907/crqbr-xmr36
In this paper I study the strategic implications of coalition formation in an assembly. A coalition forms a voting bloc to coordinate the voting behavior of its members, acting as a single player and affecting the policy outcome. I prove that there exist stable endogenous voting bloc structures, and in an assembly with two parties I show how the incentives to form a bloc depend on the types of the agents, the sizes of the parties, and the rules the blocs use to aggregate their preferences. I also provide an empirical application of the model to the US Supreme Court, showing that justices face a strategic incentive to coalesce into voting blocs.https://resolver.caltech.edu/CaltechAUTHORS:20170801-095722280Fairness, or Just Gambling on It? An Experimental Analysis of the Gift Exchange Game
https://resolver.caltech.edu/CaltechAUTHORS:20170731-155937511
DOI: 10.7907/7qnhm-ztm07
Fehr, Kirchsteiger and Riedl experimentally test a labor market in which worker effort levels are chosen after wages are set. They observe high wages and effort levels in the repeated game, contrary to the equilibrium prediction. In a similar experimental test of lemons markets, Lynch, Miller, Plott and Porter find support for the equilibrium prediction. The current paper finds more evidence of repeated game effects than in previous studies. In a model of incomplete information regarding the reciprocal nature of other players, the FKR design is shown to be conducive to reputation effects while the LMPP design is not.https://resolver.caltech.edu/CaltechAUTHORS:20170731-155937511Auctioning off the Agenda: Bargaining in Legislatures with Endogenous Scheduling
https://resolver.caltech.edu/CaltechAUTHORS:20170728-170556066
DOI: 10.7907/wkd74-vbm27
There are many examples of allocation problems where the final allocation affects more than one agent, but the models developed to study them typically allow for side payments between agents. However, there are political economy applications where it is hard to imagine monetary transfers between the agents, at least not legal ones. In this paper we propose a general political economic framework for the study of allocation problems with externalities without side payments. We consider a setup with complete information and we formulate the problem as one where the status quo describes an initial allocation that can be altered in a sequence of proposals. The number of these proposals is restricted. In the context of our main application, bidding for slots on a legislative agenda, such a restriction can be interpreted as scarcity of plenary time for considering the possible bills to move the policy. The intuition for our model comes out of framing the problem as a special type of a multi-good auction. We show that equilibria generically exist within the general model.https://resolver.caltech.edu/CaltechAUTHORS:20170728-170556066The Effect of Voter Identification Laws on Turnout
https://resolver.caltech.edu/CaltechAUTHORS:20170728-165011731
DOI: 10.2139/ssrn.1084598
Since the passage of the "Help America Vote Act" in 2002, nearly half of the states have adopted a variety of new identification requirements for voter registration and participation by the 2006 general election. There has been little analysis of whether these requirements reduce voter participation, especially among certain classes of voters. In this paper we document the effect of voter identification requirements on registered voters as they were imposed in states in the 2000 and 2004 presidential elections, and in the 2002 and 2006 midterm elections. Looking first at trends in the aggregate data, we find no evidence that voter identification requirements reduce participation. Using individual-level data from the Current Population Survey across these elections, however, we find that the strictest forms of voter identification requirements—combination requirements of presenting an identification card and positively matching one's signature with a signature either on file or on the identification card, as well as requirements to show picture identification—have a negative impact on the participation of registered voters relative to the weakest requirement, stating one's name. We also find evidence that the stricter voter identification requirements depress turnout to a greater extent for less educated and lower income populations, for both minorities and non-minorities.https://resolver.caltech.edu/CaltechAUTHORS:20170728-165011731Existence of Equilibrium in Single and Double Private Value Auctions
https://resolver.caltech.edu/CaltechAUTHORS:20170731-142826926
DOI: 10.7907/bn99c-1r239
We show existence of equilibria in distributional strategies for a wide class of private value auctions, including the first general existence result for double auctions. The set of equilibria is invariant to the tie-breaking rule. The model incorporates multiple unit demands, all standard pricing rules, reserve prices, entry costs, and stochastic demand and supply. Valuations can be correlated and asymmetrically distributed. For double auctions, we show further that at least one equilibrium involves a positive volume of trade. The existence proof establishes new connections among existence techniques for discontinuous Bayesian games.https://resolver.caltech.edu/CaltechAUTHORS:20170731-142826926Supermodular Bayesian Implementation: Learning and Incentive Design
https://resolver.caltech.edu/CaltechAUTHORS:20170728-171530308
DOI: 10.7907/t8mnr-wr958
I develop supermodular implementation under incomplete information. Supermodular implementable social choice functions (scfs) are those that are Bayesian implementable with mechanisms that induce a supermodular game. If a mechanism induces a supermodular game, agents may learn to play some equilibrium in a dynamic setting. The paper has two parts. The first part is concerned with sufficient conditions for (truthful) supermodular implementability in quasilinear environments. There, I describe a constructive way of modifying a mechanism so that it supermodularly implements a scf. I prove that any Bayesian implementable decision rule that satisfies a joint condition with the valuation functions, requiring their composition to produce bounded substitutes, is (truthfully) supermodular implementable. This joint condition is always satisfied on finite type spaces; it is also satisfied by C decision rules and valuation functions on a compact type space. Second, I show that allocation-efficient decision rules are (truthfully) supermodular implementable with balanced transfers. Third, I establish that C^2 Bayesian implementable decision rules satisfying a dimensionality condition are (truthfully) supermodular implementable with an induced game whose interval prediction is the smallest possible. The second part provides a Supermodular Revelation Principle.https://resolver.caltech.edu/CaltechAUTHORS:20170728-171530308On the Weights of Nations: Assigning Voting Weights in a Heterogeneous Union
https://resolver.caltech.edu/CaltechAUTHORS:20170731-134203644
DOI: 10.7907/hd4q4-hvz64
Consider a voting procedure where the countries, states, or districts comprising a union each elect representatives who then participate in later votes at the union level on their behalf. The countries, provinces, and states may vary in their populations and composition. If we wish to maximize the total expected utility of all agents in the union, how should the votes of the representatives of the different countries, states, or districts be weighted at the union level? We provide a simple characterization of the efficient voting rule in terms of the weights assigned to different districts and the voting threshold (how large a qualified majority is needed to induce change versus the status quo). Next, in the context of a model of the correlation structure of agents' preferences, we analyze how voting weights relate to the population size of a country. We then analyze the voting weights in the Council of the European Union under the Nice Treaty and the recently proposed constitution, contrast them under different versions of our model, and compare them to the weights derived from poll data.https://resolver.caltech.edu/CaltechAUTHORS:20170731-134203644Self-correcting Information Cascades
https://resolver.caltech.edu/CaltechAUTHORS:20170731-133225091
DOI: 10.7907/3dd4s-jz194
In laboratory experiments, information cascades are ephemeral phenomena, collapsing soon after they form, and then reforming again. These formation/collapse/reformation cycles occur frequently and repeatedly. Cascades may be reversed (collapse followed by a cascade on a different state) and more often than not, such a reversal is self-correcting: the cascade switches from the incorrect to the correct state. Past experimental work focused on relatively short horizons, where these interesting dynamics are rarely observed. We present experiments with a longer horizon, and also investigate the effect of signal informativeness. We propose a theoretical model, based on quantal response equilibrium, where temporary and self-correcting cascades arise as equilibrium phenomena. The model also predicts the systematic differences we observe experimentally in the dynamics, as a function of signal informativeness. We extend the basic model to include a parameter measuring base rate neglect and find it to be a statistically significant factor in the dynamics, resulting in somewhat faster rates of social learning.https://resolver.caltech.edu/CaltechAUTHORS:20170731-133225091An Experimental Study of Storable Votes
https://resolver.caltech.edu/CaltechAUTHORS:20170731-170114634
DOI: 10.7907/m1m03-52j79
The storable votes mechanism is a method of voting for committees that meet periodically to consider a series of binary decisions. Each member is allocated a fixed budget of votes to be cast as desired over the multiple decisions. Voters are induced to spend more votes on those decisions that matter to them most, shifting the ex ante probability of winning away from decisions they value less and towards decisions they value more, typically generating welfare gains over standard majority voting with non-storable votes. The equilibrium strategies have a very intuitive feature (the number of votes cast must be monotonic in the voter's intensity of preferences) but are otherwise difficult to calculate, raising questions of practical implementation. In our experiments, realized efficiency levels were remarkably close to theoretical equilibrium predictions, while subjects adopted monotonic but off-equilibrium strategies. We are led to conclude that concerns about the complexity of the game may have limited practical relevance.https://resolver.caltech.edu/CaltechAUTHORS:20170731-170114634The Swing Voter's Curse in the Laboratory
https://resolver.caltech.edu/CaltechAUTHORS:20170801-090500575
DOI: 10.7907/7b0ph-rw267
This paper reports the first laboratory study of the swing voter's curse and provides insights on the larger theoretical and empirical literature on "pivotal voter" models. Our experiment controls for different information levels of voters, as well as the size of the electorate, the distribution of preferences, and other theoretically relevant parameters. The design varies the share of partisan voters and the prior belief about a payoff relevant state of the world. Our results support the equilibrium predictions of the Feddersen-Pesendorfer model, and clearly reject the notion that voters in the laboratory use naive decision-theoretic strategies. The voters act as if they are aware of the swing voter's curse and adjust their behavior to compensate. While the compensation is not complete and there is some heterogeneity in individual behavior, we find that aggregate outcomes, such as efficiency, turnout, and margin of victory, closely track the theoretical predictions.https://resolver.caltech.edu/CaltechAUTHORS:20170801-090500575Secure Implementation: Strategy-Proof Mechanisms Reconsidered
https://resolver.caltech.edu/CaltechAUTHORS:20170731-164437007
DOI: 10.7907/2zwtm-v1486
Strategy-proofness, requiring that truth-telling is a dominant strategy, is a standard concept in social choice theory. However, the concept of strategy-proofness has serious drawbacks. First, announcing one's true preference may not be a unique dominant strategy, and using the wrong dominant strategy may lead to the wrong outcome. Second, almost all strategy-proof mechanisms have a continuum of Nash equilibria, most of which produce the wrong outcome. Third, experimental evidence shows that most strategy-proof mechanisms do not work well. We argue that a possible solution to this dilemma is to require double implementation in Nash equilibrium and in dominant strategies, which we call secure implementation. We characterize environments where secure implementation is possible, and compare it with dominant strategy implementation. An interesting example of secure implementation is a Groves mechanism when preferences are single-peaked.https://resolver.caltech.edu/CaltechAUTHORS:20170731-164437007Sequential Entry in Many-to-one Matching Markets
https://resolver.caltech.edu/CaltechAUTHORS:20170728-162757118
DOI: 10.7907/bwpm5-t6p55
We study sequential bargaining in many-to-one matching markets. We show that there is an advantage to entering late in the market, and that the last agent to enter the market will receive his or her best partner in a stable matching, extending the results of Blum and Rothblum (2002) and Cechlárová (2002) for the marriage model. We also discuss the relation between sequential bargaining and a possible alternative formulation based on the NTU Shapley value.https://resolver.caltech.edu/CaltechAUTHORS:20170728-162757118Endogenous Entry and Self-selection in Private Value Auctions: An Experimental Study
https://resolver.caltech.edu/CaltechAUTHORS:20170731-171226910
DOI: 10.7907/y1s3v-psz75
This paper presents the results of an experimental study of endogenous entry and bidding behavior in first-price independent private value auctions. In the first stage N potential bidders simultaneously decide whether to participate in an auction or to claim a fixed outside option. At this stage all potential bidders know N, the distribution of possible values, and the value of the outside option. In the second stage, each entering bidder submits a bid after learning their own private value for the object and the number of entering bidders. We find evidence of a self-selection effect, as predicted by an equilibrium model of heterogeneous risk averse bidders. The theoretical model predicts that bidding in the auction will be lower with endogenous entry because only the less risk averse bidders will choose to enter. We also find that entry decreases with the value of the outside option, as predicted. One surprising finding is that we observe over-entry relative to the theoretical predictions.https://resolver.caltech.edu/CaltechAUTHORS:20170731-171226910The Compromise Game: Two-sided Adverse Selection in the Laboratory
https://resolver.caltech.edu/CaltechAUTHORS:20170801-094538005
DOI: 10.7907/xv853-bxx57
We analyze a game of two-sided private information characterized by extreme adverse selection, and study a special case in the laboratory. Each player has a privately known "strength" and can decide to fight or compromise. If either chooses to fight, there is a conflict; the stronger player receives a high payoff and the weaker player receives a low payoff. If both choose to compromise, conflict is avoided and each receives an intermediate payoff. The only equilibrium in both the sequential and simultaneous versions of the game is for players to always fight, independent of their own strength. In our experiment, we observe, among other things, (i) frequent compromise, (ii) little evidence of learning, and (iii) different behavior between first, second and simultaneous movers. We explore several models in an attempt to understand the reasons underlying these anomalous choices, including quantal response equilibrium, cognitive hierarchy, and cursed equilibrium.https://resolver.caltech.edu/CaltechAUTHORS:20170801-094538005The Impact of Minority Representation on Policy Outcomes: Evidence from the U.S. States
https://resolver.caltech.edu/CaltechAUTHORS:20170728-142011149
DOI: 10.7907/nvwwt-0cv87
Over the last forty years, racial minorities in the United States have made substantial progress in achieving greater representation in legislatures. However, there is surprisingly little empirical evidence as to whether this has had any direct impact on policy outcomes. Exploiting two instances of 'exogenous shocks' that led to large increases in the number of African American legislators, this paper empirically tests the relationship between descriptive representation of minorities and substantive representation of their interests. Examining school district-level data from the 1970s through the late 1990s across the United States, the paper finds statistically robust evidence that the political representation of African Americans is associated with a more equitable allocation of state aid to school districts, which suggests that representation of traditionally underrepresented groups can lead to tangible changes in public policy. The results are robust to controls for the effects of other political and demographic factors.https://resolver.caltech.edu/CaltechAUTHORS:20170728-142011149Strongly Stable Networks
https://resolver.caltech.edu/CaltechAUTHORS:20170802-140336366
DOI: 10.7907/f3hqc-xpm67
We analyze the formation of networks among individuals. In particular, we examine the existence of networks that are stable against changes in links by any coalition of individuals. We show that to investigate the existence of such strongly stable networks one can restrict focus on a component-wise egalitarian allocation of value. We show that when such strongly stable networks exist they coincide with the set of efficient networks (those maximizing the total productive value). We show that the existence of strongly stable networks is equivalent to core existence in a derived cooperative game and use that result to characterize the class of value functions for which there exist strongly stable networks via a "top convexity" condition on the value function on networks. We also consider a variation on strong stability where players can make side payments, and examine situations where value functions may be non-anonymous, depending on player labels.https://resolver.caltech.edu/CaltechAUTHORS:20170802-140336366Turnout and Power Sharing
https://resolver.caltech.edu/CaltechAUTHORS:20170727-102658422
DOI: 10.7907/sh7hr-xth11
Differences in electoral rules and/or legislative, executive or legal institutions across countries induce different mappings from election outcomes to distributions of power. We explore how these different mappings affect voters' participation in a democracy. Assuming heterogeneity in the cost of voting, the effect of such institutional differences on turnout depends on the distribution of voters' preferences for the parties: when the two parties have similar support, turnout is higher in a winner-take-all system than in a power sharing system; the result is reversed when one side has a larger base. The results obtained in the rational voter model are shown to continue to hold in other models of turnout such as ethical voter models and voter mobilization models. Finally, the theoretical comparative results are validated by an experiment, comparing the Levine and Palfrey (2007) results with new data on power sharing elections.https://resolver.caltech.edu/CaltechAUTHORS:20170727-102658422Using Information From Trading In Trading And Portfolio Management: Ten Years Later
https://resolver.caltech.edu/CaltechAUTHORS:20170802-161529767
DOI: 10.7907/52ppd-wdc18
The centerpiece of this paper is a comprehensive analysis of all of the trading by a large US pension fund in 1991. The first version was published in the Summer 1995 issue of the Journal of Investing and later incorporated in the AIMR's CFA reading. This updated version, requested by the AIMR, reflects the technological and market structure changes of the last ten years, and adds a new empirical analysis of all the 2001 trading by a $7 billion US equity manager.
Trading technology has changed in many ways, yet many of the same characteristics of institutional trading are seen just as strongly as they were before. The attention given to large difficult orders still shows in lower than expected trading costs. Small "no-brainer" orders still represent the largest component of overall trading costs. These are precisely the type of costs that modern electronic trading systems are designed to reduce.
The idea that trading costs were a substantial drag on performance was relatively novel in 1991. Today, it is a central concern for many managers, including those profiled here.https://resolver.caltech.edu/CaltechAUTHORS:20170802-161529767Information aggregation in standing and ad hoc committees
https://resolver.caltech.edu/CaltechAUTHORS:20170728-145558781
DOI: 10.7907/s6x5b-ztg37
This paper reports results from a laboratory experiment comparing voting behavior and decision making efficiency in standing and ad hoc committees, where decisions are made by unanimity rule. We also compare sequential and simultaneous (secret ballot) voting procedures. The data are remarkably consistent across treatments, in both qualitative (comparative statics) and quantitative terms. The different procedures and the ad hoc or standing nature of the committees generally do not seem to lead to the selection of different equilibria, with the exception of some evidence of bandwagon effects in the sequential procedure.https://resolver.caltech.edu/CaltechAUTHORS:20170728-145558781Non-Excludable Public Good Experiments
https://resolver.caltech.edu/CaltechAUTHORS:20170801-170555282
DOI: 10.7907/znask-yjk17
We conducted a two-stage game experiment with a non-excludable public good. In the first stage, two subjects choose simultaneously whether or not they commit to contributing nothing to provide a pure public good. In the second stage, knowing the other subject's commitment decision, subjects who chose not to commit in the first stage choose contributions to the public good. We found no support for the evolutionary stable strategy equilibrium, and the proportion of subjects who did not commit to contributing nothing increased as rounds advanced; that is, the free-riding rate declined over time. Furthermore, this behavior arose not from altruism or kindness among subjects, but from spite.https://resolver.caltech.edu/CaltechAUTHORS:20170801-170555282Social Networks in Determining Employment and Wages: Patterns, Dynamics, and Inequality
https://resolver.caltech.edu/CaltechAUTHORS:20170802-140339953
DOI: 10.7907/b2v43-0c139
We develop a model where agents obtain information about job opportunities through an explicitly modeled network of social contacts. We show that an improvement in the employment status of either an agent's direct or indirect contacts leads to an increase in the agent's employment probability and expected wages, in the sense of first order stochastic dominance. A similar effect results from an increase in the network contacts of an agent. In terms of dynamics and patterns, we show that employment is positively correlated across time and agents, and the
same is true for wages. Moreover, unemployment exhibits persistence in the sense of duration dependence: the probability of obtaining a job decreases in the length of time that an agent has been unemployed. Finally, we examine inequality between two groups. If staying in the labor market is costly (in opportunity costs, education costs, or skills maintenance) and one group starts with a worse employment status or a smaller network, then that group's drop-out rate will be higher and their employment prospects and wages will be persistently below that of the other group.https://resolver.caltech.edu/CaltechAUTHORS:20170802-140339953Envy-Freeness and Implementation in Large Economies
https://resolver.caltech.edu/CaltechAUTHORS:20170801-165622147
DOI: 10.7907/vhrmj-xdr58
We show that asymptotic envy-freeness is a necessary condition for a form of robust approximate implementation in large economies.https://resolver.caltech.edu/CaltechAUTHORS:20170801-165622147A Measure of Bizarreness
https://resolver.caltech.edu/CaltechAUTHORS:20170728-160719330
DOI: 10.7907/jsfjx-92s55
We introduce a path-based measure of convexity to be used in assessing the compactness of legislative districts. Our measure is the probability that a district will contain the shortest path between a randomly selected pair of its points. The measure is defined relative to exogenous political boundaries and population distributions.https://resolver.caltech.edu/CaltechAUTHORS:20170728-160719330Why Was It Europeans Who Conquered the World?
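The path-based measure just described lends itself to a direct Monte Carlo estimate. The sketch below is my own illustration, not the authors' implementation (polygons and function names are hypothetical): it estimates the probability that the straight segment between two uniformly random interior points stays inside a district, scoring a convex rectangle at 1 and a bent, L-shaped district strictly lower.

```python
import random

def point_in_polygon(x, y, poly):
    """Ray-casting test: is (x, y) strictly inside the polygon (vertex list)?"""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge crosses the horizontal ray at height y
            if x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
                inside = not inside
    return inside

def convexity_measure(poly, trials=2000, steps=50, seed=0):
    """Monte Carlo estimate of the probability that the segment between
    two uniformly random interior points lies entirely inside the polygon."""
    rng = random.Random(seed)
    xs = [p[0] for p in poly]
    ys = [p[1] for p in poly]

    def sample_interior():
        while True:  # rejection sampling from the bounding box
            x = rng.uniform(min(xs), max(xs))
            y = rng.uniform(min(ys), max(ys))
            if point_in_polygon(x, y, poly):
                return x, y

    hits = 0
    for _ in range(trials):
        (ax, ay), (bx, by) = sample_interior(), sample_interior()
        # Check intermediate points along the segment for containment.
        if all(point_in_polygon(ax + t / steps * (bx - ax),
                                ay + t / steps * (by - ay), poly)
               for t in range(1, steps)):
            hits += 1
    return hits / trials

square = [(0, 0), (4, 0), (4, 4), (0, 4)]               # a compact "district"
ell = [(0, 0), (4, 0), (4, 1), (1, 1), (1, 4), (0, 4)]  # a bent "district"
```

For the square, every sampled segment stays inside, so the estimate is 1; for the L-shape, segments connecting the two arms routinely leave the district, pushing the score well below 1.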
https://resolver.caltech.edu/CaltechAUTHORS:20170725-155356195
DOI: 10.7907/8cgh9-55682
By the eighteenth century, Europeans dominated the military technology of gunpowder weapons, which had enormous advantages for fighting war at a distance and conquering other parts of the world. Their dominance, however, was surprising, because the technology had originated in China and been used with expertise in Asia and the Middle East. To account for European prowess with gunpowder weapons, historians have often invoked competition, but competition alone cannot explain why Europeans pushed this technology further than anyone else. The answer lies in the peculiar form that military competition took in western Europe: it was a winner-take-all tournament, and a simple model of the tournament shows why it led European rulers to spend heavily on improving the
gunpowder technology, and why political incentives and military conditions kept such a tournament from developing elsewhere in the world. As a result, rulers elsewhere in Eurasia had much less reason to advance the gunpowder technology or to catch up with the Europeans. The consequences were huge, from colonialism to the slave trade and even the Industrial Revolution.https://resolver.caltech.edu/CaltechAUTHORS:20170725-155356195The Mathematics and Statistics of Voting Power
https://resolver.caltech.edu/CaltechAUTHORS:20170802-153714426
DOI: 10.7907/hpnm8-cmw66
In an election, voting power—the probability that a single vote is decisive—is affected by the rule for aggregating votes into a single outcome. Voting power is important for studying political representation, fairness and strategy, and has been much discussed in political science. Although power indexes are often considered as mathematical definitions, they ultimately depend on statistical models of voting. Mathematical calculations of voting power usually have been performed under the model that votes are decided by coin flips. This simple model has interesting implications for weighted elections, two-stage elections (such as the U.S. Electoral College) and coalition structures. We discuss empirical failings of the coin-flip model of voting and consider, first, the implications for voting power and, second, ways in which votes could be modeled more realistically. Under the random voting model, the standard deviation of the average of n votes is proportional to 1/√n, but under more general models it can take forms such as cn^(−α) or √(a − b log n). Voting power calculations under more realistic models present research challenges in modeling and computation.https://resolver.caltech.edu/CaltechAUTHORS:20170802-153714426Equilibrium Agenda Formation
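The coin-flip benchmark in this abstract is easy to reproduce: with the other n−1 votes decided by fair coins, a single vote is pivotal exactly when the rest split evenly, and the exact probability tracks the familiar √(2/πn) scaling. A minimal sketch (the function name is my own, not from the paper):

```python
from math import comb, sqrt, pi

def decisive_probability(n):
    """P(a single vote is pivotal) in an n-voter majority election (n odd)
    when the other n - 1 voters each flip a fair coin: pivotality requires
    an exact (n - 1)/2 to (n - 1)/2 split among them."""
    k = (n - 1) // 2
    return comb(n - 1, k) / 2 ** (n - 1)

# Compare the exact probability with the sqrt(2 / (pi * n)) approximation.
for n in (11, 101, 1001):
    print(n, decisive_probability(n), sqrt(2 / (pi * n)))
```

Note that `math.comb` (Python 3.8+) keeps the binomial coefficient exact, so the computation is reliable even for large n.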
https://resolver.caltech.edu/CaltechAUTHORS:20170802-133318577
DOI: 10.7907/wpdgx-d8t61
We develop a definition of equilibrium for agenda formation in general voting settings. The definition is independent of any protocol. We show that the set of equilibrium outcomes for any Pareto efficient voting rule is uniquely determined, and in fact coincides with that of the outcomes generated by considering all full agendas. Under voting by successive elimination (or amendment), the set of equilibrium outcomes corresponds with the Banks set. We also examine the implications in several specific settings and show that studying equilibrium agendas can lead to sharp predictions, in contrast with well-known "chaos" theorems.https://resolver.caltech.edu/CaltechAUTHORS:20170802-133318577Inequality aversion and risk aversion
https://resolver.caltech.edu/CaltechAUTHORS:20170727-151232544
DOI: 10.7907/4b9sj-yrr47
This note shows that for two social welfare functions which are inequality averse and anonymous, if one is more inequality averse than the other, it induces a more risk averse household through optimal sharing than the other. We present examples showing that this comparative static can be reversed if either the property of inequality aversion or anonymity is dropped.https://resolver.caltech.edu/CaltechAUTHORS:20170727-151232544On the Concentration of Allocations and Comparisons of Auctions in Large Economies
https://resolver.caltech.edu/CaltechAUTHORS:20170802-144326047
DOI: 10.7907/x6e1x-crv92
We analyze competitive pressures in a sequence of auctions with a growing number of bidders, in a model that includes private and common valuations as special cases. We show that the key determinant of bidders' surplus (and implicitly auction revenue) is how the goods are distributed. In any setting and sequence of auctions where the allocation of good(s) is concentrated among a shrinking proportion of the population, the winning bidders enjoy no surplus in the limit. If instead the good(s) are allocated in a dispersed manner so that a non-vanishing proportion of the bidders obtain objects,
then in any of a wide class of auctions bidders enjoy a surplus that is bounded away from zero. Moreover, under dispersed allocations, the format of the auction matters. If bidders have constant marginal utilities for objects up to some limit, then uniform price auctions lead to higher revenue than discriminatory auctions. If agents have decreasing marginal utilities for objects, then uniform price auctions are asymptotically efficient, while discriminatory auctions are asymptotically inefficient. Finally, we show that in some cases where dispersed allocations are efficient, revenue may increase by bundling goods at the expense of efficiency.https://resolver.caltech.edu/CaltechAUTHORS:20170802-144326047The control of game form recognition in experiments: Understanding dominant strategy failures in a simple two person "guessing" game
https://resolver.caltech.edu/CaltechAUTHORS:20170728-154142982
DOI: 10.7907/75t80-xz320
The paper focuses on instructions and procedures as the reasons that subjects fail to behave according to the predictions of game theory as observed in two-person guessing game experiments. In this game, each of two people must simultaneously choose a number between 0 and 100. The winner is the person whose chosen number is the closest to 2/3 of the average of the two numbers. The weakly dominant strategy is zero. Because of the simplicity of the game (once it is understood), the widespread failure of subjects to choose the weakly dominant strategy has been interpreted as evidence of some fundamental inability to behave strategically.
The experiments reported here demonstrate that the failure to act strategically is related to how the game is presented. Several different presentations are studied. Some subjects fail to recognize the game form when it is presented abstractly. When the game is transformed into a simple isomorphic game and presented in a familiar context, subjects do choose weakly dominant strategies. Suggestions for better experimental control are given.https://resolver.caltech.edu/CaltechAUTHORS:20170728-154142982Salary Competition in Matching Markets with Private Information
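The dominant-strategy logic in this game is mechanical enough to verify by brute force: 2/3 of the average of two numbers always lies below their midpoint, so the lower number is always weakly closer to the target, and zero weakly dominates. A small sketch under that reading (the function is my own illustration, not from the paper):

```python
def winner(a, b):
    """Winning choice in the two-person 2/3-of-the-average guessing game:
    the target is (2/3) * (a + b) / 2 = (a + b) / 3, and the number closer
    to the target wins (ties broken toward a)."""
    target = (a + b) / 3
    return a if abs(a - target) <= abs(b - target) else b

# The lower number always wins, so choosing 0 is weakly dominant.
assert all(winner(a, b) == min(a, b)
           for a in range(101) for b in range(101) if a != b)
```

The exhaustive check over all unequal pairs in {0, …, 100} confirms that no choice ever beats a strictly lower one.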
https://resolver.caltech.edu/CaltechAUTHORS:20170727-111603591
DOI: 10.7907/h5k3q-z3618
We analyze a game in which firms with private information compete for workers by making a single salary offer. Once salaries are chosen, firms make offers to workers, who care only about salary. Firms and workers are matched according to the Gale-Shapley deferred acceptance algorithm that is central to the theory of two-sided matching. For a two-firm, two-worker model, we prove existence of a Bayesian Nash equilibrium in which each firm type chooses a salary according to a continuous distribution with interval support in the salary space. We find a 'separation' of types in equilibrium, in the sense that between two types with a common most preferred worker, one type always makes higher offers than the other type. The type that makes the higher offers depends on the relative marginal values attached to the workers by the different firm types. Moreover, more 'popular' workers attract higher average equilibrium salaries.
We also consider an extension of the model to larger markets by replicating the two-firm, two-worker case, in order to examine the effects of market size on competition and equilibrium salaries. In the limit, there is no aggregate uncertainty about the realization of firm types. We characterize the equilibria for this limit case in which there is a continuum of firms and a continuum of workers divided into two equal-sized worker classes. Finally, we conjecture the existence and convergence of the sequence of equilibria in finite replicated markets to the corresponding continuum equilibrium as the number of replications approaches infinity.https://resolver.caltech.edu/CaltechAUTHORS:20170727-111603591Endogenous Games and Mechanisms: Side Payments Among Players
https://resolver.caltech.edu/CaltechAUTHORS:20170802-133314873
DOI: 10.7907/98zeh-g6585
We characterize the outcomes of games when players may make binding offers of strategy contingent side payments before the game is played. This does not always lead
to efficient outcomes, despite complete information and costless contracting. The characterizations are illustrated in a series of examples, including voluntary contribution public good games, Cournot and Bertrand oligopoly, principal-agent problems, and commons games, among others.https://resolver.caltech.edu/CaltechAUTHORS:20170802-133314873Money metric utilitarianism
https://resolver.caltech.edu/CaltechAUTHORS:20170727-154342486
DOI: 10.7907/mxpzf-mv393
We discuss a method of ranking allocations in economic environments which applies when we do not know the names or preferences of individual agents. We require that two allocations can be ranked with the knowledge only of their aggregate bundles and community indifference sets, a condition we refer to as aggregate independence. We also postulate a basic Pareto and continuity property, and a property stating that when two disjoint economies and allocations are put together, the ranking in the large economy should be consistent with the rankings in the two smaller economies (reinforcement). We show that a ranking method satisfies these axioms if and only if there is a probability measure over the strictly positive prices for which the rule ranks allocations on the basis of the random-price money-metric utilitarian rule. This is a rule which computes the money-metric utility for each agent at each price, sums these, and then takes an expectation according to the probability measure.https://resolver.caltech.edu/CaltechAUTHORS:20170727-154342486Legislative bargaining and the dynamics of public investment
https://resolver.caltech.edu/CaltechAUTHORS:20170727-112335038
DOI: 10.7907/sffq8-8mn50
We present a legislative bargaining model of the provision of a durable public good over an infinite horizon. In each period, there is a societal endowment which can either be invested in the public good or consumed. We characterize the optimal public policy, defined by the time path of investment and consumption. In each period, a legislature with representatives of each of "n" districts bargains over the current period's endowment for investment in the public good and transfers to each district. We analyze the Markov perfect equilibrium under different voting "q"-rules where "q" is the number of yes votes required for passage. We show that the efficiency of the public policy is increasing in "q" because higher "q" leads to higher investment in the public good and less pork. We examine the theoretical equilibrium predictions by conducting a laboratory experiment with five-person committees that compares three alternative voting rules: unanimity (q=5); majority (q=3); and dictatorship (q=1).https://resolver.caltech.edu/CaltechAUTHORS:20170727-112335038The Dynamics of Distributive Politics
https://resolver.caltech.edu/CaltechAUTHORS:20170728-155330544
DOI: 10.7907/qn20c-9s527
We study dynamic committee bargaining over an infinite horizon with discounting. In each period a committee proposal is generated by a random recognition rule, the committee chooses between the proposal and a status quo by majority rule, and the voting outcome in period t becomes the status quo in period t+1. We study symmetric Markov equilibria of the resulting game and conduct an experiment to test hypotheses generated by the theory for pure distributional (divide-the-dollar) environments. In
particular, we investigate the effects of concavity in the utility functions, the existence of a Condorcet winning alternative, and the discount factor (committee "impatience"). We report several new findings. Voting behavior is selfish and myopic. Status quo outcomes have great inertia. There are strong treatment effects that are in the direction predicted by the Markov equilibrium. We find significant evidence of concave utility functions.https://resolver.caltech.edu/CaltechAUTHORS:20170728-155330544Congestion at Locks on Inland Waterways: An Experimental Testbed of a Policy of Tradable Priority Permits for Lock Access
https://resolver.caltech.edu/CaltechAUTHORS:20170802-115222530
DOI: 10.7907/w98ts-h9q90
This research is focused on the problem of congestion at locks on the inland waterways of the United States, and particularly on the Mississippi and Illinois Rivers. The current policy of first-come-first-served exacerbates the problem and adds to delays and inefficiency. An alternative policy of marketable priority access permits is proposed and studied. The dimensions of the policy relative to the needs of operators are discussed. Well established economic theory suggests that the system of marketable priority permits will increase the economic efficiency with which locks operate, and that endowing current operators with these permits will increase their profitability. A testbed experiment was conducted to illustrate how the principles operate. The policy objective of increased efficiency is observed, thereby establishing proof of principle. More importantly, the policy works according to all of the many predictions that theory holds, thereby establishing design consistency. Not only is the value of system use increased: prices converge to competitive levels, the removal of delay for certain classes of permits shifts system use toward higher-valued activities, and operator profitability rises. In the testbed, the policy produces the desired outcomes and it does so for understandable reasons.https://resolver.caltech.edu/CaltechAUTHORS:20170802-115222530The Impact of Race and Ideology on Voting: Does Race Still Matter?
https://resolver.caltech.edu/CaltechAUTHORS:20170728-143605015
DOI: 10.7907/xzwky-5kg48
Why do barriers to minority legislative representation persist? This paper asks to what extent low levels of black office-holding are attributable purely to candidate race, as opposed to candidate ideology. Using a unique data set that contains information on candidates' ideological positions coupled with extensive individual voting data, the paper tests whether candidate race exerts an independent and significant influence on vote choice that cannot be explained away by candidate ideology. Estimating a model of individual vote choice, I find that candidate race is largely irrelevant for most white voters when the effects of candidate ideology are taken into account. This is especially the case when the ideological distance between two candidates is large. This implies that when candidates are ideologically distinct, voters are unlikely to cross their party line to support a candidate whom they would not normally support, regardless of their willingness to vote for a black candidate. However, when candidates are ideologically close, white voters are more likely to vote for a white candidate who is pitted against a black candidate. Similarly, I find that the race of candidates works as a strong negative cue for white voters with no party affiliations. This suggests that white voters are likely to use candidate race as a voting cue when the party cue is absent or weak.https://resolver.caltech.edu/CaltechAUTHORS:20170728-143605015Cooperation Without Immediate Reciprocity: An Experiment in Favor Exchange
https://resolver.caltech.edu/CaltechAUTHORS:20170727-102025513
DOI: 10.7907/0t6fj-gad06
This paper presents experimental evidence concerned with behavior in indefinite horizon two-person dynamic favor exchange games. Subjects interact in pairs in continuous time and occasionally one of them receives an opportunity to provide a favor to her partner. The effects of changing the benefit of receiving a favor and the arrival rate of opportunities to do a favor are studied when the opportunities are privately observed. Also considered are the impacts of informational access to a partner's opportunities on efficiency and the overall behavior of individuals with respect to "obvious" state variables.https://resolver.caltech.edu/CaltechAUTHORS:20170727-102025513Choosing How to Choose: Self-Stable Majority Rules
https://resolver.caltech.edu/CaltechAUTHORS:20170802-145707366
DOI: 10.7907/c634h-t3t66
We consider the endogenous choice of a voting rule, characterized by the majority size needed to elect change over the status quo, by a society who will use the rule to make future decisions. Under simple assumptions on the uncertainty concerning the future alternatives that will be voted upon, voters have induced preferences over voting rules that are single-peaked and intermediate. We explore the existence of self-stable voting rules, i.e., voting rules such that there is no alternative rule that would beat the given voting rule if the given voting rule is used to choose between the rules. There are situations where self-stable voting rules do not exist. We explore conditions that guarantee existence, as well as issues relating to efficiency and constitutional design.https://resolver.caltech.edu/CaltechAUTHORS:20170802-145707366Testing Models with Multiple Equilibria by Quantile Methods
https://resolver.caltech.edu/CaltechAUTHORS:20170802-110804051
DOI: 10.7907/acyxk-m6t07
We derive an econometric test for the presence of monotone comparative statics in models with multiple equilibria. The test applies to many economic models, such as single-person decision, industrial organization, macroeconomic and game-theory models. These models have complementarities between exogenous and endogenous variables. We show that, as a result, extreme (large and small) conditional quantiles of the endogenous variable are increasing in the exogenous variable. We develop a likelihood-ratio test based on estimates for the conditional quantiles of the endogenous variable, which is an asymptotic extension of Bartholomew's (1959a,b) "chi-bar squared" test. Our assumptions are weak; we remain agnostic about the cardinality, location and probabilities of the equilibrium set, and make no restrictions on the equilibrium-selection procedure.https://resolver.caltech.edu/CaltechAUTHORS:20170802-110804051Gerrymandering Roll-Calls: Votes, Decisions, and Partisan bias in Congress, 1879-2000
https://resolver.caltech.edu/CaltechAUTHORS:20170801-164857645
DOI: 10.7907/pt2hj-d5610
We argue that the standard toolbox used in electoral studies to assess the bias and responsiveness of electoral systems can also be used to assess the bias and responsiveness of legislative systems. We consider which items in the toolbox are the most appropriate for use in the legislative setting, then apply them to estimate levels of bias in the U.S. House from 1879 to 2000. Our results indicate a systematic bias in favor of the majority party over this period, with the strongest bias arising during the period of "Czar rule" (51st-60th Congresses, 1889-1910) and during the post-packing era (87th-106th Congresses, 1961-2000). This finding is consistent with the majority party possessing a significant advantage in setting the agenda.
"The definition of alternatives is the supreme instrument of power."
-–E. E. Schattschneider (1960, p. 86).https://resolver.caltech.edu/CaltechAUTHORS:20170801-164857645Categorical Cognition: A psychological model of categories and identification in decision making
https://resolver.caltech.edu/CaltechAUTHORS:20170802-150938037
DOI: 10.7907/5bmpz-kta12
This paper introduces a psychological notion of categorization into economics and derives its implications for economic decision making. We show, using a tractable model of social cognition, that a decision maker, in (efficiently) assigning past experiences to
https://resolver.caltech.edu/CaltechAUTHORS:20170802-154832632
DOI: 10.7907/8mv8v-a7476
We characterize the core many-to-one matchings as fixed points of a map. Our characterization gives an algorithm for finding core allocations; the algorithm is efficient and simple to implement. Our characterization does not require substitutable preferences, so it is separate from the structure needed for the non-emptiness of the core. When preferences are substitutable, our characterization gives a simple proof of the lattice structure of core matchings, and it gives a method for computing the join and meet of two core matchings.https://resolver.caltech.edu/CaltechAUTHORS:20170802-154832632Allocation Rules for Network Games
https://resolver.caltech.edu/CaltechAUTHORS:20170801-162757854
DOI: 10.7907/md7ss-0dq66
Previous allocation rules for network games, such as the Myerson Value, implicitly or
explicitly take the network structure as fixed. In many situations, however, the network
structure can be altered by players. This means that the value of alternative network
structures (not just sub-networks) can and should influence the allocation of value among players on any given network structure. I present a family of allocation rules that incorporate information about alternative network structures when allocating value.https://resolver.caltech.edu/CaltechAUTHORS:20170801-162757854Information Aggregation and Strategic Abstention in Large Laboratory Elections
https://resolver.caltech.edu/CaltechAUTHORS:20170727-151856770
DOI: 10.7907/b2f8f-w2r02
This paper compares strategic abstention in ad hoc committees versus standing committees. Ad hoc committees meet only once and then dissolve, while standing committees meet repeatedly. The central finding from this study is that most of the predictions of swing voter curse theory hold up in large elections conducted under controlled laboratory conditions. There is significant abstention and significant balancing of partisans by uninformed voters, and vote balancing increases with the partisan imbalance. Elections with no partisan imbalance successfully aggregate information and lead to efficient outcomes. Consistent with swing voter curse theory, this efficiency falls off as partisan imbalance increases, but to a significantly greater extent than is predicted in equilibrium. It is instructive to compare these findings in large elections with results from the elections reported in Battaglini et al. (2007). All of the qualitative results are the same, concerning the comparative statics, balancing, and abstention. One slight difference is that there was less (irrational) voting for α in the small elections than in the large elections, except for the π = 5/9, m = 0 treatment, where we observed 20% voting for α in the small elections, compared with 10% voting for α in the large elections. These differences were reflected in slightly different efficiency results between small and large elections, with the comparisons mirroring the differences in voting for α: more (irrational) α voting results in lower efficiency. We conclude that this scaled-up study successfully replicates the initial swing voter's curse experiment reported in Battaglini et al. (2007), obtaining very similar findings in laboratory committees that are three times the size of those in the original study. The one caveat is that we found evidence of a slight increase in irrational nonequilibrium behavior (voting for α) in the larger elections. Whether this trend would continue as election size is further scaled up is an open question in a swing voter's curse environment.https://resolver.caltech.edu/CaltechAUTHORS:20170727-151856770The Strategy-Proof Provision of Public Goods under Congestion and Crowding Preferences
https://resolver.caltech.edu/CaltechAUTHORS:20170802-141721922
DOI: 10.7907/tbv76-57j49
We examine the strategy-proof provision of excludable public goods when agents care not only about the level of provision of a public good, but also about the number of consumers. We show that on such domains strategy-proof and efficient social choice functions satisfying an outsider independence condition must be rigid in that they must always assign a fixed number of consumers, regardless of individual desires to participate. The fixed number depends on the attitudes of agents regarding group size - being small when congestion effects dominate (individuals prefer to have fewer other consumers) and large when cost sharing effects dominate (agents prefer to have more consumers). A hierarchical rule selects which consumers participate and a variation of a generalized median rule selects the level of the public good. Under heterogeneity in agents' views on the optimal number of consumers, strategy-proof, efficient, and outsider independent social choice functions are much more limited and in an important case must be dictatorial.https://resolver.caltech.edu/CaltechAUTHORS:20170802-141721922An Empirical Bayes Approach to Estimating Ordinal Treatment Effects
https://resolver.caltech.edu/CaltechAUTHORS:20170727-161733786
DOI: 10.7907/wn44k-gkd25
Ordinal variables—categorical variables with a defined order to the categories, but without equal spacing between them—are frequently used in social science applications. Although a good deal of research exists on the proper modeling of ordinal response variables, there is not a clear directive as to how to model ordinal treatment variables. The usual approaches found in the literature for using ordinal treatment variables are either to use fully unconstrained, though additive, ordinal group indicators or to use a numeric predictor constrained to be continuous. Generalized additive models are a useful exception to these assumptions (Beck and Jackman 1998). In contrast to the generalized additive modeling approach, we propose the use of a Bayesian shrinkage estimator to model ordinal treatment variables. The estimator we discuss in this paper allows the model to contain both individual group level indicators and a continuous predictor. In contrast to traditionally used shrinkage models that pull the data toward a common mean, we use a linear model as the basis. Thus, each individual effect can be arbitrary, but the model "shrinks" the estimates toward a linear ordinal framework according to the data. We demonstrate the estimator on two political science examples: the impact of voter identification requirements on turnout (Alvarez, Bailey, and Katz 2007), and the impact of the frequency of religious service attendance on the liberality of abortion attitudes (e.g., Singh and Leahy 1978, Tedrow and Mahoney 1979, Combs and Welch 1982).https://resolver.caltech.edu/CaltechAUTHORS:20170727-161733786Strategic Voting in Sequential Committees
https://resolver.caltech.edu/CaltechAUTHORS:20170728-153442527
DOI: 10.7907/gzv0k-1wq94
We consider strategic voting with incomplete information and partially common values in sequential committees. A proposal is considered against the status quo in one committee, and only upon its approval advances for consideration in a second committee. Committee members (i) are privately and imperfectly informed about an unobservable state of nature which is relevant to their payoffs, and (ii) have a publicly observable bias with which they evaluate information. We show that the tally of votes in the originating committee can aggregate and transmit relevant information for members of the second committee in equilibrium, provide conditions for the composition and size of committees under which this occurs, and characterize all three classes of voting equilibria with relevant informative voting.https://resolver.caltech.edu/CaltechAUTHORS:20170728-153442527Modeling Dynamics in Time-Series–Cross-Section Political Economy Data
https://resolver.caltech.edu/CaltechAUTHORS:20170727-141913417
DOI: 10.7907/zx3dd-bq275
This paper deals with a variety of dynamic issues in the analysis of time-series–cross-section (TSCS) data. While the issues raised are more general, we focus on applications to political economy. We begin with a discussion of specification and lay out the theoretical differences implied by the various types of time series models that can be estimated. It is shown that there is nothing pernicious in using a lagged dependent variable and that all dynamic models either implicitly or explicitly have such a variable; the differences between the models relate to assumptions about the speeds of adjustment of measured and unmeasured variables. When adjustment is quick it is hard to differentiate between the various models; with slower speeds of adjustment the various models make sufficiently different predictions that they can be tested against each other. As the speed of adjustment gets slower and slower, specification (and estimation) gets more and more tricky. We then turn to a discussion of estimation. It is noted that models with both a lagged dependent variable and serially correlated errors can easily be estimated; it is only OLS that is inconsistent in this situation. We then show, via Monte Carlo analysis, that for typical TSCS data fixed effects with a lagged dependent variable performs about as well as the much more complicated Kiviet estimator, and better than the Anderson-Hsiao estimator (both designed for panels).https://resolver.caltech.edu/CaltechAUTHORS:20170727-141913417Empirically Evaluating the Electoral College
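The inconsistency of OLS when a lagged dependent variable is combined with serially correlated errors, noted in the TSCS abstract above, is easy to verify in a small simulation; the parameter values below are illustrative, not taken from the paper:

```python
import numpy as np

# Illustrative parameters (not from the paper): true AR coefficient 0.5,
# AR(1) error autocorrelation 0.5
rho, phi, T = 0.5, 0.5, 200_000
rng = np.random.default_rng(1)
eps = rng.normal(size=T)

u = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    u[t] = phi * u[t - 1] + eps[t]   # serially correlated error
    y[t] = rho * y[t - 1] + u[t]     # lagged dependent variable

# OLS of y_t on y_{t-1} ignores the serial correlation in u and is
# inconsistent: it converges to the lag-1 autocorrelation of y,
# (rho + phi) / (1 + rho * phi) = 0.8, rather than to rho = 0.5
Y, X = y[1:], y[:-1]
rho_ols = (X @ Y) / (X @ X)
print(rho_ols)
```

Running this gives an estimate near 0.8, far from the true coefficient of 0.5, which is exactly the kind of bias the more careful estimators discussed in the paper are designed to avoid.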
https://resolver.caltech.edu/CaltechAUTHORS:20170802-162109260
DOI: 10.7907/5pr6t-3a795
The 2000 U.S. presidential election has once again rekindled interest in possible electoral reform, including the possible elimination of the Electoral College. Most arguments against the Electoral College have either been based on anecdotal evidence from particular elections or on highly stylized formal models. We take a very different approach here. We develop a set of statistical models based on historical election results to evaluate the Electoral College as it has performed in practice. Thus, while we do not directly address the normative question of the value of the U.S. Electoral College, this paper does provide the necessary tools and evidence to make such an evaluation. We show that when one performs such an analysis there is not much basis to argue for reforming the Electoral College. We first show that while the Electoral College may once have been biased against the Democrats, given the current distribution of voters, neither party is advantaged by the system. Further, the electoral vote will differ from the popular vote only when the average vote shares are very close to one half. We then show that while there has been much temporal variation in voting power over the last several decades, the voting power of individual citizens would not likely increase under a popular vote system of electing the president.https://resolver.caltech.edu/CaltechAUTHORS:20170802-162109260Standard Voting Power Indexes Don't Work: An Empirical Analysis
https://resolver.caltech.edu/CaltechAUTHORS:20170802-162810725
DOI: 10.7907/8999e-yz237
Voting power indexes such as that of Banzhaf (1965) are derived, explicitly or implicitly, from the assumption that all votes are equally likely (i.e., random voting). That assumption can be generalized to hold that the probability of a vote being decisive in a jurisdiction with n voters is proportional to 1/√n.
We test and reject this hypothesis empirically, using data from several different U.S. and European elections. We find that the probability of a decisive vote is approximately proportional to 1/n. The random voting model (or its generalization, the square-root rule) overestimates the probability of close elections in larger jurisdictions. As a result, classical voting power indexes make voters in large jurisdictions appear more powerful than they really are.
The most important political implication of our result is that proportionally weighted voting systems (that is, each jurisdiction gets a number of votes proportional to n) are basically fair. This contradicts the claim in the voting power literature that weights should be approximately proportional to √n.https://resolver.caltech.edu/CaltechAUTHORS:20170802-162810725Understanding Price Controls and Non-Price Competition with Matching Theory
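The random-voting benchmark discussed in the voting power abstract above is concrete enough to check directly: with an even number n of other voters each voting 50/50, the probability of an exact tie is C(n, n/2)/2^n, and Stirling's formula gives the √(2/πn) approximation behind the square-root rule. A quick numerical sketch (illustrative only, not the paper's empirical analysis):

```python
from math import comb, sqrt, pi

def p_decisive(n):
    # Chance that n other voters split exactly evenly (n even), when
    # each votes for either side independently with probability 1/2
    return comb(n, n // 2) / 2 ** n

for n in (100, 1000, 10000):
    # exact tie probability vs the sqrt(2 / (pi * n)) approximation
    print(n, p_decisive(n), sqrt(2 / (pi * n)))
```

The two columns agree closely, confirming that the random-voting model makes the probability of a decisive vote fall off like 1/√n; the paper's empirical point is that real elections behave more like 1/n.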
https://resolver.caltech.edu/CaltechAUTHORS:20170727-094902937
DOI: 10.7907/z3ed3-whz07
[No abstract]https://resolver.caltech.edu/CaltechAUTHORS:20170727-094902937The Effect of Candidate Quality on Electoral Equilibrium: An Experimental Study
https://resolver.caltech.edu/CaltechAUTHORS:20170802-160404405
DOI: 10.7907/w2rzf-2gt58
When two candidates of different quality compete in a one dimensional policy space, the equilibrium outcomes are asymmetric and do not correspond to the median. There are three main effects. First, the better candidate adopts more centrist policies than the worse candidate. Second, the equilibrium is statistical, in the sense that it predicts a probability distribution of outcomes rather than a single degenerate outcome. Third, the equilibrium varies systematically with the level of uncertainty about the location of the median voter. We test these three predictions using laboratory experiments, and find strong support for all three. We also observe some biases and show that they can be explained by quantal response equilibrium.https://resolver.caltech.edu/CaltechAUTHORS:20170802-160404405The Testable Implications of Zero-sum Games
https://resolver.caltech.edu/CaltechAUTHORS:20170727-143423913
DOI: 10.7907/k145j-6cx97
We study Nash-rationalizable joint choice behavior under the restriction to zero-sum games. We show that interchangeability of choice behavior is the only additional condition which distinguishes zero-sum games from general non-cooperative games with respect to testable implications. This observation implies that in some sense interchangeability is not only a necessary but also a sufficient property which differentiates zero-sum games.https://resolver.caltech.edu/CaltechAUTHORS:20170727-143423913Separation of Decisions in Group Identification
https://resolver.caltech.edu/CaltechAUTHORS:20170802-101737968
DOI: 10.7907/zvwn9-g1983
We study a model of group identification in which individuals' opinions as to the membership of a group are aggregated to form a list of the group's members. Potential aggregation rules are studied through the axiomatic approach. We introduce two axioms, meet separability and join separability, each of which requires the list of members generated by the aggregation rule to be independent of whether the question of membership in a group is separated into questions of membership in two other groups. We use these axioms to characterize a class of "one vote" rules, in which one opinion determines whether an individual is considered to be a member of a group. We then use this characterization to provide new axiomatizations of the liberal rule, in which each individual determines for himself whether he is a member of the group, as the only non-degenerate anonymous rule satisfying the meet separability and join separability axioms.https://resolver.caltech.edu/CaltechAUTHORS:20170802-101737968The Linking of Collective Decisions and Efficiency
https://resolver.caltech.edu/CaltechAUTHORS:20170801-163637851
DOI: 10.7907/1s9xs-94q22
For groups that must make several decisions of similar form, we define a simple and general mechanism that is designed to promote social efficiency. The mechanism links the various decisions by forcing agents to budget their representations of preferences so that the frequency of preferences across problems conforms to the underlying distribution of preferences. We show that as the mechanism operates over a growing number of decisions, the welfare costs of incentive constraints completely disappear. In addition, as the number of decisions being linked grows, a truthful strategy is increasingly successful and secures the efficient utility level for an agent.https://resolver.caltech.edu/CaltechAUTHORS:20170801-163637851Non-Existence of Equilibrium in Vickrey, Second-Price, and English Auctions
https://resolver.caltech.edu/CaltechAUTHORS:20170802-114410645
DOI: 10.7907/et2fq-5nh70
A simple example shows that equilibria can fail to exist in second price (Vickrey) and English auctions when there are both common and private components to bidders' valuations and private information is held on both dimensions. The example shows that equilibrium only exists in the extremes of pure private and pure common values, and that existence in standard models is not robust to a slight perturbation.https://resolver.caltech.edu/CaltechAUTHORS:20170802-114410645Political Bias and War
https://resolver.caltech.edu/CaltechAUTHORS:20170802-103618263
DOI: 10.7907/zqmcq-znk73
We examine the incentives for countries to go to war as they depend on the comparison between how much their pivotal decision-makers have at risk and how much they stand to gain from a war. How this ratio compares to the country at large is what we term "political bias." If there is no political bias, then there are always payments that one country would like to make to the other that will avoid a war in the presence of commitment or enforceability of peace treaties. If there is a bias on the part of one or both countries, then war can result and in some cases cannot be prevented by any transfer payments. We examine how war technology and relative wealth levels interact with political bias in determining whether countries make transfers, go to war, and form alliances. Our results shed some new light on the uneven contender paradox and the interpretation of the "democratic peace".https://resolver.caltech.edu/CaltechAUTHORS:20170802-103618263Choice and individual welfare
https://resolver.caltech.edu/CaltechAUTHORS:20170728-135604147
DOI: 10.7907/rma1m-0gq76
We propose an abstract method of systematically assigning a "rational" ranking of outcomes to choice data which may not be rationalizable in any sense. An individual welfare functional maps stochastic choice functions into weak orders. A stochastic choice function gives the empirical frequency of choices for any possible opportunity set (framing factors may also be incorporated into the model); we call such general choice functions "choice distributions." We require that for any two alternatives x and y, if our individual welfare functional recommends x over y given two distinct choice distributions, then it also recommends x over y for any mixture of the two choice distributions. Together with some mild technical requirements, such an individual welfare functional must weight every opportunity set and assign a utility to each alternative x which is the sum across all opportunity sets of the weighted probability of x being chosen from the set. It therefore requires us to have a "prior view" about how important a choice of x from a given opportunity set is.https://resolver.caltech.edu/CaltechAUTHORS:20170728-135604147A citizen candidate model with private information and unique equilibrium
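The representation described in the individual-welfare abstract above — each alternative's utility is the sum, across opportunity sets, of the weighted probability of being chosen from the set — can be sketched in a few lines; the choice distribution and weights below are made up purely for illustration:

```python
def welfare_utility(choice_dist, weights, alternatives):
    # Utility of x: sum over opportunity sets S of weight(S) * Pr(x chosen from S).
    # The weights encode the "prior view" about how informative a choice
    # of x from each opportunity set is.
    return {
        x: sum(weights[S] * probs.get(x, 0.0) for S, probs in choice_dist.items())
        for x in alternatives
    }

# Hypothetical choice frequencies over two opportunity sets
dist = {
    frozenset({"x", "y"}): {"x": 0.7, "y": 0.3},
    frozenset({"x", "y", "z"}): {"x": 0.4, "y": 0.4, "z": 0.2},
}
w = {frozenset({"x", "y"}): 0.5, frozenset({"x", "y", "z"}): 0.5}
u = welfare_utility(dist, w, ["x", "y", "z"])
print(u)  # x gets the highest utility (0.55), then y (0.35), then z (0.10)
```

Ranking alternatives by these utilities yields the weak order the functional assigns, even though the underlying choice data need not be rationalizable.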
https://resolver.caltech.edu/CaltechAUTHORS:20170727-163611649
DOI: 10.7907/gdsn6-c9x32
We study a citizen candidate model where citizen ideal points are private information and each ideal point is an independent draw from a uniform distribution. We characterize the equilibrium as a function of the entry cost, the office-holding benefit, and the number of citizens. In contrast to the standard citizen candidate models, equilibrium is unique. Entry is from the extremes of the distribution. A citizen enters if and only if her ideal point is greater than or equal to some critical distance from the expected median. Expected policies are more extreme as entry costs increase or office-holding benefits decrease, and as the number of citizens increases.https://resolver.caltech.edu/CaltechAUTHORS:20170727-163611649Volunteering and Image Concerns
https://resolver.caltech.edu/CaltechAUTHORS:20170728-144440439
DOI: 10.7907/zrzw5-m2b82
We design an experiment to analyze the impact of image concerns and material incentives on volunteering. Our design retains the advantages of laboratory control while incorporating field context by engaging subjects in an actual nonprofit's operation. We find that working in a public setting significantly increases volunteering. Monetary incentives have little impact, although they are slightly more effective in a private setting. Our results suggest that organizations have more to gain by catering to volunteers' image concerns than by providing monetary benefits.https://resolver.caltech.edu/CaltechAUTHORS:20170728-144440439Like Father, Like Son: Social Networks, Human Capital Investment, and Social Mobility
https://resolver.caltech.edu/CaltechAUTHORS:20170802-113029102
DOI: 10.7907/mc7yt-v4v02
We build a model where an individual sees higher returns to investments in human capital when their neighbors in a social network have higher levels of human capital. We show that the correlation of human capital across generations of a given family is directly related to the sensitivity of individual investment decisions to the state of the social network. Increasing the sensitivity leads to increased intergenerational correlation, as well as more costly investment decisions on average in the society. We calibrate a simple threshold version of the model to data from a variety of EU nations. We also show how directly analyzing sensitivity of decisions to social circumstances can lead to information that is not captured by intergenerational correlation.https://resolver.caltech.edu/CaltechAUTHORS:20170802-113029102What's age to do with it? Supreme Court appointees and the long run location of the Supreme Court median justice
https://resolver.caltech.edu/CaltechAUTHORS:20170727-113314415
DOI: 10.7907/5j2s1-sgj23
[No abstract]https://resolver.caltech.edu/CaltechAUTHORS:20170727-113314415Incentive Compatibility of Large Centralized Matching Markets
https://resolver.caltech.edu/CaltechAUTHORS:20170727-114350596
DOI: 10.7907/vgfp1-8w504
This paper discusses the strategic manipulation of stable matching mechanisms. We provide a model of a two-sided matching market, in which a firm hires a worker, and each of them receives non-transferable utility. Assuming that the utilities are randomly drawn from underlying distributions, we measure the likelihood of differences in utilities from different stable matchings. Our key finding is that in large markets, most agents are close to being indifferent among partners in different stable matchings. Specifically, as the number of firms and workers becomes large, the expected proportion of firms and workers whose utilities from all stable matchings are within an arbitrarily small difference of one another converges to one. It is known that the utility gain by manipulating a stable matching mechanism is limited by the difference between utilities from the most and the least preferred stable matchings. Thus, the finding also implies that the expected proportion of agents who may obtain a significant utility gain from manipulation vanishes in large markets. This result reconciles successful stable mechanisms in practice with the theoretical concerns about strategic manipulation.https://resolver.caltech.edu/CaltechAUTHORS:20170727-114350596A Bayesian multinomial probit analysis of voter choice in Chile's 2005 Presidential election
https://resolver.caltech.edu/CaltechAUTHORS:20170728-134342343
DOI: 10.7907/b3tkd-t8744
The profound transformations in Chile's party structure since 1989 have led several authors to examine the main cleavages shaping the partisan divide and the impact of different factors on citizens' party preferences. No study, however, has examined the influence of these factors on actual vote choice. We implement a Bayesian multinomial probit model to analyze voter choice in Chile's 2005 election. We show that the authoritarian-democratic cleavage dominated voter choice, with socio-demographic variables playing a less important role. We also find that the presence of a second conservative candidate significantly affected citizens' electoral behavior, increasing the support for the right and influencing the electoral outcome in a way that cannot be accounted for by analyses focused exclusively on citizens' party identification.https://resolver.caltech.edu/CaltechAUTHORS:20170728-134342343Vote trading with and without party leaders
https://resolver.caltech.edu/CaltechAUTHORS:20170727-100308276
DOI: 10.7907/4nb9v-8rb51
Two groups of voters of known sizes disagree over a single binary decision to be taken by simple majority. Individuals have different, privately observed intensities of preferences and before voting can buy or sell votes among themselves for money. We study the implication of such trading for outcomes and welfare when trades are coordinated by the two group leaders and when they take place anonymously in a competitive market. The theory has strong predictions. In both cases, trading falls short of full efficiency, but for opposite reasons: with group leaders, the minority wins too rarely; with market trades, the minority wins too often. As a result, with group leaders, vote trading improves over no-trade; with market trades, vote trading can be welfare reducing. All predictions are strongly supported by experimental results.https://resolver.caltech.edu/CaltechAUTHORS:20170727-100308276Market Design for Fishery IFQ Programs
https://resolver.caltech.edu/CaltechAUTHORS:20170727-150205866
DOI: 10.7907/74cgh-0j750
I examine the impact of market design on the performance of a cap-and-trade program for Individual Fishing Quotas. In equilibrium, neither the term of the quota, the number of years for which it is valid, nor the method of initial allocation, granting or selling, has a differential effect on the profitability of the fishery or the quality of the environment. However, the term of the quota and the method of initialization can have a big impact on the price discovery process and whether equilibrium is attained. Because of this, both the fishery and the environment can be significantly better off with a mixture of historically based grants and auctions with some form of limited term quotas. I also discuss some additional benefits from an initialization process that generates some revenue for the public. Section 5 contains a summary.https://resolver.caltech.edu/CaltechAUTHORS:20170727-150205866Quasi-Maximum Likelihood Estimation for Conditional Quantiles
https://resolver.caltech.edu/CaltechAUTHORS:20170802-144322831
DOI: 10.7907/6c82y-05842
In this paper we derive the asymptotic distribution of a new class of quasi-maximum likelihood estimators (QMLE) based on a 'tick-exponential' family of densities. We show that the 'tick-exponential' assumption is a necessary and sufficient condition for a QMLE to be consistent for the parameters of a correctly specified model of a given conditional quantile. Hence, the role of this family of densities in conditional quantile estimation is analogous to the role of the linear-exponential family in conditional mean estimation. The 'tick-exponential' QMLEs are shown to be asymptotically normal with an asymptotic covariance matrix that has a novel form, not seen in earlier work, and which accounts for possible model misspecification. For practical purposes, we show that the maximization of the 'tick-exponential' (quasi) log-likelihood can conveniently be carried out by using standard gradient-based optimization techniques. More importantly, we provide a consistent estimator for the asymptotic covariance matrix based on the "scores" of the log-likelihood, which allows us to compute conditional quantile confidence intervals.https://resolver.caltech.edu/CaltechAUTHORS:20170802-144322831A Survey of Models of Network Formation: Stability and Efficiency
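For readers unfamiliar with quantile estimation, the standard building block behind estimators like the QMLE above is the "tick" (check) loss ρ_τ(u) = u·(τ − 1{u < 0}), whose minimizer over a constant is the τ-quantile of the data. A minimal numerical sketch of that fact (illustrative only, not the paper's estimator):

```python
import numpy as np

def check_loss(q, data, tau):
    # "Tick" (check) loss: mean of u * (tau - 1{u < 0}) for u = data - q;
    # its minimizer over constants q is the tau-quantile of the data
    u = data - q
    return np.mean(u * (tau - (u < 0)))

rng = np.random.default_rng(0)
data = rng.normal(size=10001)
tau = 0.25

# Crude grid search for the minimizer of the empirical check loss
grid = np.linspace(-3, 3, 601)
losses = [check_loss(q, data, tau) for q in grid]
q_hat = grid[int(np.argmin(losses))]
print(q_hat, np.quantile(data, tau))  # the two agree up to grid resolution
```

In practice one would minimize with a gradient-based routine, as the paper suggests, rather than a grid; the point is only that minimizing the tick loss recovers the sample quantile.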
https://resolver.caltech.edu/CaltechAUTHORS:20170801-140704727
DOI: 10.7907/c4y9t-bg336
I survey the recent literature on the formation of networks. I provide definitions of network games, a number of examples of models from the literature, and discuss some of what is known about the (in)compatibility of overall societal welfare with individual incentives to form and sever links.https://resolver.caltech.edu/CaltechAUTHORS:20170801-140704727Policy-based abstention in Brazil's 2002 presidential election
https://resolver.caltech.edu/CaltechAUTHORS:20170728-132603569
DOI: 10.7907/xy2j9-bmc50
This paper implements a unified model of individual abstention and vote choice to analyze policy-based alienation and indifference in Brazil's 2002 presidential election. The results indicate that both alienation and indifference depressed turnout, with indifference contributing slightly more to voter abstention. Also, the determinants of alienation and indifference differed considerably, the former being determined by structural factors such as voters' information and perceived efficacy levels, while the latter was related to short-term aspects such as parties' mobilization efforts. More importantly, evidence shows that while alienation and indifference were strongly influenced by attitudinal and protest variables, they were also affected by citizens' evaluation of candidates' ideological locations. The main conclusion is that abstention in Brazil's 2002 election had a policy-driven component and that spatial considerations played a substantive role in citizens' electoral behavior, a fact that has been overlooked in previous research on the determinants of abstention in Latin America.https://resolver.caltech.edu/CaltechAUTHORS:20170728-132603569Objective Lotteries as Ambiguous Objects: Allais, Ellsberg, and Hedging
https://resolver.caltech.edu/CaltechAUTHORS:20170727-103944043
DOI: 10.7907/b4ecq-6re88
We derive axiomatically a model in which the Decision Maker can exhibit simultaneously both the Allais and the Ellsberg paradoxes in the standard setup of Anscombe and Aumann (1963). Using the notion of 'subjective', or 'outcome' mixture of Ghirardato et al. (2003), we define a novel form of hedging for objective lotteries, and introduce a novel axiom which is a generalized form of preference for hedging. We show that this axiom, together with other standard ones, is equivalent to a representation in which the agent reacts to ambiguity using multiple priors, as in the MaxMin Expected Utility model of Gilboa and Schmeidler (1989), generating Ellsberg-like behavior, while at the same time she also treats objective lotteries as 'ambiguous objects,' and uses a fixed (and unique) set of priors to evaluate them – generating Allais-like behavior. We show that this representation is equivalent to one in which the agent evaluates lotteries using a set of concave rank-dependent utility functionals. A comparative notion of preference for hedging is also introduced.https://resolver.caltech.edu/CaltechAUTHORS:20170727-103944043A Simple Axiomatization of Quantiles on the Domain of Distribution Functions
https://resolver.caltech.edu/CaltechAUTHORS:20170802-104836154
DOI: 10.7907/ykyzr-fyn87
In an environment in which the primitive is the space of distribution functions, we characterize the quantile functions by the axioms of ordinal covariance, monotonicity with respect to first order stochastic dominance, and upper semicontinuity. We show how to characterize Value at Risk (VaR) in a similar manner.https://resolver.caltech.edu/CaltechAUTHORS:20170802-104836154Markets, Technology and Trust
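Ordinal covariance, the first axiom named in the quantile abstract above, is easy to illustrate numerically: applying a strictly increasing transform to the data transforms the quantile the same way. A quick check on simulated data (illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=10001)
tau = 0.9

# phi = exp is strictly increasing, so the tau-quantile of exp(X)
# should equal exp applied to the tau-quantile of X (ordinal covariance)
q = np.quantile(x, tau)
q_transformed = np.quantile(np.exp(x), tau)
print(q_transformed, np.exp(q))  # the two values coincide
```

The mean, by contrast, fails this property (E[exp(X)] ≠ exp(E[X]) in general), which is part of why quantiles are singled out by ordinal axioms.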
https://resolver.caltech.edu/CaltechAUTHORS:20170802-152325635
DOI: 10.7907/kwvc9-x4m33
[No abstract]https://resolver.caltech.edu/CaltechAUTHORS:20170802-152325635No Trade
https://resolver.caltech.edu/CaltechAUTHORS:20170728-151244423
DOI: 10.7907/h4tdm-q9r95
We investigate, in a simple bilateral bargaining environment, the extent to which asymmetric information can induce individuals to engage in exchange where trade is not mutually profitable. We first establish a no-trade theorem for this environment. A laboratory experiment is conducted, where trade is found to occur between 16% and 32% of the time, depending on the specific details of the environment and trading mechanism. In most cases, buyers gain from such exchange, at the expense of sellers. An equilibrium model with naïve, or "cursed," beliefs accounts for some of the behavioral findings, but open questions remain.https://resolver.caltech.edu/CaltechAUTHORS:20170728-151244423The Free Rider Problem: a Dynamic Analysis
https://resolver.caltech.edu/CaltechAUTHORS:20170727-105218172
DOI: 10.3386/w17926
We present a dynamic model of free riding in which n infinitely lived agents choose between private consumption and contributions to a durable public good g. We characterize the set of continuous Markov equilibria in economies with reversibility, where investments can be positive or negative; and in economies with irreversibility, where investments are non-negative and g can only be reduced by depreciation. With reversibility, there is a continuum of equilibrium steady states: the highest equilibrium steady state of g is increasing in n, and the lowest is decreasing. With irreversibility, the set of equilibrium steady states converges to the highest steady state possible with reversibility, as depreciation converges to zero. We also show that in economies with reversibility there are always non-monotonic equilibria in which g converges to the steady state with damped oscillations; and there can be equilibria with persistent limit cycles.https://resolver.caltech.edu/CaltechAUTHORS:20170727-105218172The Nature of Collusion Facilitating and Collusion Breaking Power of Simultaneous Ascending Price and Simultaneous Descending Price Auctions
https://resolver.caltech.edu/CaltechAUTHORS:20170727-144612067
DOI: 10.7907/be3gz-yhh47
This paper demonstrates that a robust, tacit collusion evolves quickly in a collusion incubator environment but is destroyed by the simultaneous, descending price auction. Theories of collusion-producing behavior, along with the detail of the states on which strategies are conditioned, lead to a deeper understanding of how tacit collusion evolves and its necessary conditions. These theories explain how the descending price auction destroys the collusion. The experiments proceed by conducting simultaneous ascending price auctions in the collusion incubator and then, once tacit collusion has developed, switching to the descending price auction. The change moved prices from collusive levels to near-competitive levels.
The "Reasonable Man" and other legal standards
https://resolver.caltech.edu/CaltechAUTHORS:20170728-152555784
DOI: 10.7907/52q8c-zjc32
In the common law of negligence, an individual breaches the duty of due care if she fails to act as would a "reasonable man" under the circumstances. A natural question, first posed by Rubinstein [8], is whether the "reasonable man" can be derived from the views of actual agents. Rubinstein introduced an axiomatic model and showed that there does not exist a non-dictatorial aggregation method which satisfies several normatively appealing properties. I introduce a new model based on a different understanding of the "reasonable man" and provide a characterization of the "union rule", the most inclusive view of reasonableness satisfying a basic Pareto criterion. The union rule requires that a jury must unanimously agree to find a defendant liable for negligence.
Improving Coordination and Cooperation Through Competition
https://resolver.caltech.edu/CaltechAUTHORS:20170727-140736706
DOI: 10.7907/kp9pn-h4663
We use game theory and experimental economics approaches to analyze the relationship between corporate culture and the persistent performance differences among seemingly similar enterprises. First, we show that competition leads to higher minimum effort levels in the minimum effort coordination game. This implies that organizations with a competitive institutional design are more likely to achieve better coordination, and hence better performance outcomes. Furthermore, we show that organizations with better coordination also exhibit higher rates of cooperation in the prisoner's dilemma game. This supports the theory that the high-efficiency culture developed in coordination games acts as a focal point for the outcome of a subsequent prisoner's dilemma game. In turn, we argue that these endogenous features of culture, developed from coordination and cooperation, can help explain the persistent performance differences.
Finding a Walrasian equilibrium is easy for a fixed number of agents
https://resolver.caltech.edu/CaltechAUTHORS:20170727-110237088
DOI: 10.7907/wwxc6-hbb24
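As an illustrative aside to the abstract below: a minimal sketch of what "finding a Walrasian equilibrium" means computationally. This is a simple bisection (tâtonnement-style) price search in a made-up two-agent, two-good Cobb-Douglas exchange economy, not the paper's polynomial-time algorithm; the preference shares and endowments are invented for illustration.

```python
# Toy sketch (not the paper's algorithm): bisection search for the
# market-clearing relative price in a hypothetical two-agent, two-good
# Cobb-Douglas exchange economy, with the price of good 2 normalized to 1.

def excess_demand_good1(p):
    """Aggregate excess demand for good 1 at relative price p."""
    # (Cobb-Douglas expenditure share on good 1, endowment of goods 1 and 2)
    agents = [(0.3, (1.0, 0.0)), (0.6, (0.0, 1.0))]
    demand = sum(a * (p * e1 + e2) / p for a, (e1, e2) in agents)
    supply = sum(e1 for _, (e1, _) in agents)
    return demand - supply

def solve_price(lo=0.01, hi=100.0, tol=1e-10):
    """Excess demand is strictly decreasing in p here, so bisection finds its unique zero."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if excess_demand_good1(mid) > 0:
            lo = mid  # demand exceeds supply: raise the price
        else:
            hi = mid
    return (lo + hi) / 2

p_star = solve_price()
print(round(p_star, 4))  # → 0.8571 (analytically 6/7 for this economy)
```

With Cobb-Douglas demands the equilibrium is available in closed form, which makes the numerical answer easy to check; the paper's contribution concerns the much harder general case.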
In this work, we study the complexity of finding a Walrasian equilibrium. Our main result gives an algorithm which can compute an approximate Walrasian equilibrium, in an exchange economy with general but well-behaved utility functions, in time that is polynomial in the number of goods when the number of agents is held constant. This result has applications to macroeconomics and finance, where applications of Walrasian equilibrium theory tend to deal with many goods but a fixed number of agents.
A Characterization of Strategic Complementarities
https://resolver.caltech.edu/CaltechAUTHORS:20170802-152808493
DOI: 10.7907/waxp1-j6f13
I characterize games for which there is an order on strategies such that the game has strategic complementarities. I prove that, with some qualifications, games with a unique equilibrium have complementarities if and only if Cournot best-response dynamics has no cycles; and that all games with multiple equilibria have complementarities.
As applications of my results, I show: (1) that generic 2×2 games either have no pure-strategy equilibria or have complementarities; and (2) that generic two-player infinite ordinal potential games have complementarities.
Network Architecture, Salience and Coordination
https://resolver.caltech.edu/CaltechAUTHORS:20170727-165659587
DOI: 10.7907/m9p9a-hyz82
This paper reports the results of an experimental investigation of monotone games with imperfect information. Players are located at the nodes of a network and observe the actions of other players only if they are connected by the network. These games have many sequential equilibria; nonetheless, the behavior of subjects in the laboratory is predictable. The network architecture makes some strategies salient and this in turn makes the subjects' behavior predictable and facilitates coordination on efficient outcomes. In some cases, modal behavior corresponds to equilibrium strategies.
Correcting for Survey Misreports using Auxiliary Information with an Application to Estimating Turnout
https://resolver.caltech.edu/CaltechAUTHORS:20170727-155558097
DOI: 10.7907/r5ehj-emp92
Misreporting is a problem that plagues researchers who use survey data. In this paper, we develop a parametric model that corrects for misclassified binary responses using information on the misreporting patterns obtained from auxiliary data sources. The model is implemented within the Bayesian framework via Markov chain Monte Carlo (MCMC) methods, and can be easily extended to address other problems exhibited by survey data, such as missing response and/or covariate values. While the model is fully general, we illustrate its application in the context of estimating models of turnout using data from the American National Election Studies.
A Statistical Model of Abstention under Compulsory Voting
https://resolver.caltech.edu/CaltechAUTHORS:20170727-171149551
DOI: 10.7907/gy1y6-ndp05
Invalid voting and electoral absenteeism are two important sources of abstention in compulsory voting systems. Previous studies in this area have not considered the correlation between the two variables and have ignored the compositional nature of the data, potentially leading to infeasible results and discarding helpful information from an inferential standpoint. In order to overcome these problems, this paper develops a statistical model that accounts for the compositional and hierarchical structure of the data and addresses robustness concerns raised by the use of small samples that are typical in the literature. The model is applied to analyze invalid voting and electoral absenteeism in Brazilian legislative elections between 1945 and 2006 via MCMC simulations. The results show considerable differences in the determinants of both forms of non-voting: while invalid voting was strongly positively related both to political protest and to the existence of important informational barriers to voting, the influence of these variables on absenteeism is less evident. Comparisons based on posterior simulations indicate that the model developed in this paper fits the dataset better than several alternative modeling approaches and leads to different substantive conclusions regarding the effect of different predictors on both sources of abstention.
A Contraction Principle for Finite Global Games
https://resolver.caltech.edu/CaltechAUTHORS:20170802-112102549
DOI: 10.7907/60sh5-87w07
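As a generic numerical illustration of the contraction-mapping logic invoked in the abstract below (the map `T` here is an arbitrary made-up contraction, not the paper's joint best-response operator): iterating a contraction from any two starting points converges to the same, unique fixed point, which is what delivers uniqueness of equilibrium.

```python
import math

# Toy contraction: x -> 1 + 0.5*cos(x). Its derivative is bounded by 0.5 < 1,
# so by the Banach fixed-point theorem it has exactly one fixed point, and
# iteration converges to it from every starting value.
def T(x):
    return 1.0 + 0.5 * math.cos(x)

def iterate_to_fixed_point(x, n=200):
    for _ in range(n):
        x = T(x)
    return x

a = iterate_to_fixed_point(-10.0)
b = iterate_to_fixed_point(25.0)
print(abs(a - b) < 1e-12, abs(a - T(a)) < 1e-12)  # → True True
```

The distance to the fixed point shrinks by at least the contraction factor at each step, so 200 iterations reach machine precision from either start.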
I provide a new proof of uniqueness of equilibrium in a wide class of global games. I show that the joint best response in these games is a contraction. The uniqueness result then follows as a corollary of the contraction property. Furthermore, the contraction mapping approach provides a revealing intuition for why uniqueness arises: complementarities in games generate multiplicity of equilibria, but the global games structure dampens complementarities so that only one equilibrium exists. I apply my result to show that uniqueness is obtained through contraction in currency crisis models and in Diamond's search model.
The Design and Testing of Information Aggregation Mechanisms: A Two-Stage Parimutuel IAM
https://resolver.caltech.edu/CaltechAUTHORS:20170802-105433880
DOI: 10.7907/sjj5s-1xs10
The research reported here is focused on the design of new information aggregation mechanisms (IAMs). These are competitive processes designed for collecting and aggregating dispersed information, held in the form of impressions and beliefs, that might otherwise be impossible to obtain. The research explores alternative institutional forms of IAMs and how they work.
That specially designed markets can aggregate information is well documented in the literature, as are the problems encountered when parimutuel-type systems are employed in an information aggregation capacity. This research is focused on new mechanisms, unlike any found evolving naturally, which mitigate these problems. These new mechanisms speed the process through which information is revealed, reduce deceptive behavior, and reduce the instances of substantially incorrect aggregation (i.e., bubbles). The paper finds that a special, "two-stage" parimutuel mechanism is an improvement over previously studied parimutuel mechanisms. The two-stage parimutuel, on average, makes a prediction closer to that predicted using all available information. The mechanism suffers from fewer mirages (bubbles) than do previous parimutuel structures, and it produces indicators for assessing the reliability of the information produced.
Supermodularizability
https://resolver.caltech.edu/CaltechAUTHORS:20170802-103018989
DOI: 10.7907/fhtsk-6aa08
We study the ordinal content of assuming supermodularity, including conditions under which a binary relation can be represented by a supermodular function. When applied to revealed-preference relations, our results imply that supermodularity is sometimes not refutable: a consumer's choices can be rationalized with a supermodular utility function if they can be rationalized with a monotonic utility function. Hence, supermodularity is not empirically distinguishable from monotonicity. We present applications to assortative matching, decision under uncertainty, and the testing of production technologies.
Postcards from the NSF
https://resolver.caltech.edu/CaltechAUTHORS:20170802-134836111
DOI: 10.7907/2fmxx-mx706
We provide an overview of the workings of the National Science Foundation and the proposal review process, as well as some guidance in writing proposals for funding.
On the Nature of Competition in Alternative Electoral Systems
https://resolver.caltech.edu/CaltechAUTHORS:20170728-140717217
DOI: 10.7907/fpkkh-jah28
In this paper we argue that the number of candidates running for public office, their ideological differentiation, and the intensity of campaign competition are all naturally intertwined, and jointly determined in response to the incentives provided by the electoral system. We propose a simple general equilibrium model that integrates these elements in a unitary framework, and provide a comparison between majoritarian and proportional electoral systems.
Experiments in Political Economy
https://resolver.caltech.edu/CaltechAUTHORS:20170726-150555535
DOI: 10.7907/c54mq-f2c90
[No abstract]
Scheduling Auctions and Proto-Parties in Legislatures
https://resolver.caltech.edu/CaltechAUTHORS:20170726-143507538
DOI: 10.7907/7gq8e-c4j23
We consider the impact of the scarcity of plenary time in legislatures, both on the outcome of the legislative bargaining process and on the organization of the legislature itself. We do so by developing a novel model that we call scheduling auctions. In the model, the legislature is charged with allocating a fixed budget. Members can propose an allocation, and the scheduling agent decides which one of the possible proposals will be considered by the entire legislature in plenary session for an up-or-down vote. We show in this simple setting that deciding which member should be selected as the scheduling agent endogenously induces the creation of nascent political parties that we call proto-parties. We also show that these legislative structures have positive welfare implications.
Policy Reversals in Risk Management: The Effect of Refined Analyses
https://resolver.caltech.edu/CaltechAUTHORS:20170807-145422885
DOI: 10.7907/40mdh-rmj22
Reversals of policy recommendations occur in risk management when the social decision maker aggregates individuals' subjective utilities for the outcomes of a risky policy measure. The level of detail with which these outcomes are described can significantly affect the resulting policy recommendation. The choice of the level of detail at which we conduct an analysis therefore amounts to an implicit value judgment. Moreover, the power to fix the level of the analysis implies partial control over policy recommendations. We propose an alternative, decision-theoretically sound approach to risk management.
Endowment effect theory, subject misconceptions and enhancement effect theory: A reply to Isoni, Loomes and Sugden
https://resolver.caltech.edu/CaltechAUTHORS:20170726-154849155
DOI: 10.7907/wvna0-v6911
[No abstract]
On multiple discount rates
https://resolver.caltech.edu/CaltechAUTHORS:20170726-084059845
DOI: 10.7907/v6dnf-qtv34
We propose a theory of intertemporal choice that is robust to specific assumptions on the discount rate. One class of models requires that one utility stream be chosen over another if and only if its discounted value is higher for all discount factors in a set. Another model focuses on an average discount factor. Yet another model is pessimistic, and evaluates a flow by the lowest available discounted value.
Savage in the Market
https://resolver.caltech.edu/CaltechAUTHORS:20170726-151722235
DOI: 10.7907/wx589-46888
We develop a behavioral axiomatic characterization of Subjective Expected Utility (SEU) under risk aversion. We take as given an individual agent's behavior in the market: a finite collection of asset purchases with corresponding prices. We show that such behavior satisfies a "revealed preference axiom" if and only if there exists an SEU model (a subjective probability over states and a concave utility function over money) that accounts for the given asset purchases.
Estimating Dynamic Discrete Choice Models Via Convex Analysis
https://resolver.caltech.edu/CaltechAUTHORS:20170727-090045366
DOI: 10.7907/vgp0z-00y54
Using results from convex analysis, we characterize the identification and estimation of dynamic discrete-choice models based on the random utility framework. Based on these insights, we propose a new two-step estimator for these models, which is easily applicable to models in which the utility shocks may not derive from an extreme-value distribution, and may be correlated with each other and with the state variables. Monte Carlo results demonstrate the good performance of this estimator, and we provide a short application using the dynamic bus engine replacement model in Rust (1987).
Correlated Equilibria in Voter Turnout Games
https://resolver.caltech.edu/CaltechAUTHORS:20170726-114216491
DOI: 10.7907/s8476-bzv73
Communication is fundamental to elections. This paper extends canonical voter turnout models to include any form of communication, and characterizes the resulting set of correlated equilibria. In contrast to previous research, high-turnout equilibria exist in large electorates and uncertain environments. This difference arises because communication can be used to coordinate behavior in such a way that voters find it incentive compatible to always follow their signals past the communication stage. The equilibria have expected turnout of at least twice the size of the minority for a wide range of positive voting costs, and show intuitive comparative statics on turnout: it varies with the relative sizes of different groups, and decreases with the cost of voting. This research provides a general microfoundation for group-based theories of voter mobilization, or voting driven by communication on a network.
Stable Aggregation of Preferences
https://resolver.caltech.edu/CaltechAUTHORS:20170807-152622185
DOI: 10.7907/3eqbb-2y409
We arrive at new conclusions for social choice theory by considering the process in which we refine decision-theoretic models and account for previously irrelevant parameters of a decision situation (cf. Savage's `small worlds'). Suppose that, for each individual, we consider a coarse-grained and a fine-grained decision-theoretic model, both of which are consistent with each other in a sense to be defined. We desire any social choice rule to be stable under refinements, in the sense that the group choice based on fine-grained individual models and the group choice based on coarse-grained individual models agree for choices among coarse-grained alternatives. For ex ante aggregation, we find that stability is ubiquitous since it follows from independence of irrelevant alternatives. In ex post aggregation, individuals' utilities are pooled separately from their beliefs before the group's choice function is constructed. We find that any `non-exceptional' rule (e.g., any Pareto optimal rule) for ex post aggregation must be unstable. If the rule is, in addition, independent of irrelevant alternatives, we find an infinite series of reversals of binary group choices. We consider applications to risk management and the theory of consensus formation.
Identifying Treatment Effects under Data Combination
https://resolver.caltech.edu/CaltechAUTHORS:20170726-154328187
DOI: 10.7907/dzksf-x2220
We consider the identification of counterfactual distributions and treatment effects when the outcome variables and conditioning covariates are observed in separate datasets. Under the standard selection-on-observables assumption, the counterfactual distributions and treatment effect parameters are no longer point identified. However, applying the classical monotone rearrangement inequality, we derive sharp bounds on the counterfactual distributions and policy parameters of interest.
An Agnostic and Practically Useful Estimator of the Stochastic Discount Factor
https://resolver.caltech.edu/CaltechAUTHORS:20170726-110645423
DOI: 10.7907/b7hye-e5g06
We propose an estimator for the stochastic discount factor (SDF) which is agnostic because it does not require macroeconomic proxies or preference assumptions. It depends only on observed asset returns. Nonetheless, it is immune to the form of the multivariate return distribution, including the distribution's factor structure. Putting our estimator to work, we find that a unique positive SDF prices all U.S. asset classes and satisfies the Hansen/Jagannathan variance bound. In contrast, the Chinese and Indian equity markets do not share the same SDF and hence do not seem to be integrated.
Choosing with the Worst in Mind: A Reference-Dependent Model
https://resolver.caltech.edu/CaltechAUTHORS:20170726-141339637
DOI: 10.7907/v2zmh-ess90
We develop an axiomatic model of reference-dependent preferences in which reference points are menu-dependent. In particular, we focus on choices from menus of two-attribute alternatives, where the reference point for a given menu is the vector of dimension-wise minimums of the menu. We characterize this model by two weakenings of the Weak Axiom of Revealed Preference (WARP) in addition to standard axioms. Our model is consistent with the attraction effect and the compromise effect, well-known deviations from rational choice theory, and it provides a connection between the two effects and diminishing sensitivity, a widely used behavioral property in economics. We apply the model to two different contexts, intertemporal choice and risky choice, in which diminishing sensitivity has interesting implications. In intertemporal choice, the main implication of the model is that borrowing constraints produce a psychological pressure to move away from the constraints even if they are not binding. This implication provides a rationale for two empirical puzzles of life-cycle consumption profiles. In risky choice, the model allows different degrees of risk aversion.
Resolving the Errors-in-Variables Bias in Risk Premium Estimation
https://resolver.caltech.edu/CaltechAUTHORS:20170726-114922004
DOI: 10.7907/p4161-89e21
The Fama-MacBeth (1973) rolling-β method is widely used for estimating risk premiums, but its inherent errors-in-variables bias remains an unresolved problem, particularly when using individual assets or macroeconomic factors. We propose a solution with a particular instrumental variable: a β calculated from alternate observations. The resulting estimators are unbiased. In simulations, we compare this new approach with several existing methods. The new approach corrects the bias even when the sample period is limited. Moreover, our proposed standard errors are unbiased, and lead to correct rejection size in finite samples.
Economic Value of EWA Lite: A Functional Theory of Learning in Games
https://resolver.caltech.edu/CaltechAUTHORS:20170807-135200613
DOI: 10.7907/fksby-kk523
EWA Lite is a one-parameter theory of learning in normal-form games. It approximates the free parameters in an earlier model (EWA) with functions of experience. The theory is tested on seven different games and compared to other learning and equilibrium theories. Either EWA Lite or parameterized EWA predicts best, though one kind of reinforcement learning predicts well in games with mixed-strategy equilibria. Belief learning models fit worst. The economic value of theories is measured by how much more subjects would have earned if they had followed theory recommendations. EWA Lite and EWA add the most economic value in every game but one.
How much does a vote count? Voting power, coalitions, and the Electoral College
https://resolver.caltech.edu/CaltechAUTHORS:20170807-140415393
DOI: 10.7907/4q8jx-vrx44
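As an illustrative aside to the abstract below: a brute-force check of the random-voting claim in a tiny electorate. With 9 voters and all vote profiles equally likely, the average probability of being decisive is exactly enumerated under direct majority rule and under a hypothetical "electoral college" of three blocks of three; the direct rule yields the higher average, consistent with the result described. The 9-voter setup is invented for illustration.

```python
from itertools import product

# Average decisiveness under the random-voting assumption: enumerate all
# 2^9 vote profiles and count, for each voter, the profiles where flipping
# that voter's ballot changes the election outcome.
def pivot_prob(outcome, n=9):
    pivots = 0
    for votes in product([0, 1], repeat=n):
        for i in range(n):
            flipped = votes[:i] + (1 - votes[i],) + votes[i + 1:]
            if outcome(votes) != outcome(flipped):
                pivots += 1
    return pivots / (n * 2 ** n)

def direct(votes):
    """Simple majority of the ballots."""
    return sum(votes) > len(votes) // 2

def blocks(votes):
    """Winner-take-all in three blocks of three, then majority of blocks."""
    wins = sum(direct(votes[3 * k:3 * k + 3]) for k in range(3))
    return wins >= 2

print(round(pivot_prob(direct), 4), round(pivot_prob(blocks), 4))  # → 0.2734 0.25
```

The exact values are 70/256 ≈ 0.2734 for direct majority (a voter is pivotal when the other eight split 4–4) and 1/4 for the block system, so coalitions lower average decisiveness in this toy example.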
In an election, the probability that a single voter is decisive is affected by the electoral system, that is, the rule for aggregating votes into a single outcome. Under the assumption that all votes are equally likely (i.e., random voting), we prove that the average probability of a vote being decisive is maximized under a popular-vote (or simple majority) rule and is lower under any coalition system, such as the U.S. Electoral College system, no matter how complicated. Forming a coalition increases the decisive vote probability for the voters within a coalition, but the aggregate effect of coalitions is to decrease the average decisiveness of the population of voters. We then review results on voting power in an electoral college system. Under the random voting assumption, it is well known that the voters with the highest probability of decisiveness are those in large states. However, we show using empirical estimates of the closeness of historical U.S. Presidential elections that voters in small states have been advantaged because the random voting model overestimates the frequencies of close elections in the larger states. Finally, we estimate the average probability of decisiveness for all U.S. Presidential elections from 1960 to 2000 under three possible electoral systems: popular vote, electoral vote, and winner-take-all within Congressional districts. We find that the average probability of decisiveness is about the same under all three systems.
Seeking Alpha? It's a Bad Guideline for Portfolio Optimization
https://resolver.caltech.edu/CaltechAUTHORS:20170726-105902733
DOI: 10.7907/96363-0k862
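As an illustrative aside to the abstract below: a sketch of the marginal claim that an asset's alpha gives the sign of the change in the benchmark's Sharpe ratio for an infinitesimal tilt toward the asset. The excess-return series are invented purely for illustration; the point about finite changes possibly going the other way is the paper's, not demonstrated here.

```python
# Hypothetical excess returns, for illustration only.
rb = [0.020, -0.010, 0.030, 0.000, 0.015, -0.005]   # benchmark
ra = [0.035, -0.015, 0.055, 0.015, 0.025, -0.005]   # candidate asset

def mean(xs):
    return sum(xs) / len(xs)

def cov(xs, ys):
    mx, my = mean(xs), mean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

def sharpe(w):
    """Sharpe ratio of the benchmark tilted by weight w toward the asset."""
    port = [b + w * (a - b) for a, b in zip(ra, rb)]
    return mean(port) / cov(port, port) ** 0.5

beta = cov(ra, rb) / cov(rb, rb)
alpha = mean(ra) - beta * mean(rb)                 # CAPM-style alpha vs. benchmark
d_sharpe = (sharpe(1e-6) - sharpe(0.0)) / 1e-6    # numerical marginal Sharpe change

print(round(alpha, 4), (alpha > 0) == (d_sharpe > 0))  # → 0.0049 True
```

Analytically, the derivative of the Sharpe ratio at the benchmark equals alpha divided by the benchmark's return standard deviation, so the two signs must agree; the numerical derivative confirms this on the made-up data.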
Alpha is the most popular measure for evaluating the performance of both individual assets and funds. The alpha of an asset with respect to a given benchmark portfolio measures the change in the portfolio's Sharpe ratio driven by a marginal increase in the asset's portfolio weight. Thus, alpha indicates which assets should be marginally over- or underweighted relative to the benchmark weights, and by how much. This study shows that alpha is actually a bad guideline for portfolio optimization. The reason is that alpha only measures the effects of infinitesimal changes in the portfolio weights. For small but finite changes, which are those relevant to investors, the optimal weight adjustments are almost unrelated to the alphas. In fact, in many cases the optimal adjustment is in the opposite direction of alpha: it may be optimal to reduce the weight of an asset with a positive alpha, and vice versa. Rather than employing alphas as a guideline, one can do much better by direct optimization with the desired constraint on the distance from the benchmark portfolio weights.
An "Enhanced" Corporate Valuation Model: Theory and Empirical Tests
https://resolver.caltech.edu/CaltechAUTHORS:20170726-090437231
DOI: 10.7907/a836f-dae61
In this paper, we develop an enhanced corporate valuation model based on the implied cost of equity capital (ICC). We argue that the enhanced approach extends the standard market multiples and discounted cash flow (DCF) approaches to corporate valuation. Specifically, it incorporates positive aspects of the market comparables and DCF approaches while mitigating the shortcomings of both. Unlike the traditional market comparables approach, the enhanced approach takes account of the full term structure of earnings forecasts. It does so by using the ICC calculated for the comparable companies as an "enhanced multiple" which translates the entire stream of cash flow forecasts into a value estimate. Unlike the DCF approach, it does not require estimation of the cost of equity capital. As such, it avoids the complexity and uncertainty associated with estimating the cost of equity capital. In our empirical tests, we find the enhanced approach to be more accurate than either of the two traditional approaches.
Information Flow and Expected Inflation: An Empirical Analysis
https://resolver.caltech.edu/CaltechAUTHORS:20170726-105408407
DOI: 10.7907/arqb7-jjw44
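As an illustrative aside to the abstract below: the break-even inflation (BEI) arithmetic and the ranking of days by absolute BEI change can be sketched in a few lines. The yield figures are invented for illustration, not data from the paper.

```python
# Hypothetical 10-year yields in percent, one entry per trading day.
nominal = [4.20, 4.25, 4.18, 4.30, 4.28]   # 10-year Treasury note yields
tips    = [1.90, 1.88, 1.95, 1.92, 1.93]   # 10-year TIPS yields

# BEI = nominal yield minus TIPS (real) yield; then rank days by |change in BEI|.
bei = [n - t for n, t in zip(nominal, tips)]
changes = [bei[i] - bei[i - 1] for i in range(1, len(bei))]
ranked = sorted(range(len(changes)), key=lambda i: -abs(changes[i]))

print([round(c, 2) for c in changes])  # → [0.07, -0.14, 0.15, -0.03]
print(ranked[0])                        # day index of the largest |ΔBEI| → 2
```

The news search in the paper is then run on the days at the top of such a ranking.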
This paper begins by ranking the absolute value of changes in the 10-year break-even inflation (BEI) rate, calculated using 10-year Treasury notes and 10-year TIPS. Next, a news search is conducted to determine what inflation-related information was released on days when the change in the BEI was greatest. The goal of the analysis is not only to see what information is associated with large changes in the BEI, but also to gain insight into the extent to which market participants accept the three competing theories of price determination: the classic monetary theory, the fiscal theory, and a "Keynesian" model that combines central bank setting of interest rates with the Phillips curve. I find that there was no mention of the money supply, the demand for money, or the rate of monetary growth on any of the days on which there was a large change in the BEI. Further, I find that there was only one mention of the impact of government debt on a day when the BEI changed substantially. In comparison, there were 53 news items on large-change days that either explicitly discussed Federal Reserve policy regarding interest rates or focused on the interaction between Fed policy, economic activity and expected inflation. This suggests that market participants accept the "Keynesian" model of price determination.
TECHNICAL REPORT: The Instability of Ex Post and Robust Aggregation without State-Consequence Separation
https://resolver.caltech.edu/CaltechAUTHORS:20170807-144804723
DOI: 10.7907/q7czj-z2a62
We prove the results of Hild (2001a, 2001b) in a framework that does not presuppose the separation of states and consequences. We introduce additional assumptions to make up for the structure that is lost by abandoning the representation of acts as functions from states to consequences.
The Positive Foundation of the Common Prior Assumption
https://resolver.caltech.edu/CaltechAUTHORS:20170807-133652174
DOI: 10.7907/mjxvs-zg731
The existence of a common prior is a property of the state space used to model the players' asymmetric information. We show that this property is not just a technical artifact of the model, but that it is immanent to the players' beliefs. To this end, we devise a condition, phrased solely in terms of the players' mutual beliefs about the basic, objective issues of possible uncertainty, which is equivalent to the existence of a common prior.
The Pro-Competitive Effect of Campaign Limits in Non-Majoritarian Elections
https://resolver.caltech.edu/CaltechAUTHORS:20170726-160811555
DOI: 10.7907/v8d44-qfk46
We study a model of elections in non-majoritarian systems that captures the link between competition in policies and competition in campaign spending. We argue that the overall competitiveness of the political arena depends both on the endogenous number of parties contesting the election and the endogenous level of campaign spending. These two dimensions are linked together through their combined effect on the total equilibrium level of political rents. We illustrate the key insights of the model through the analysis of two major electoral institutions: campaign spending limits and compulsory voting. In particular, we show that under some conditions spending caps and compulsory voting can be pro-competitive, leading to an increase in the number of parties contesting the elections.
Words Get in the Way: The Effect of Deliberation in Collective Decision-Making
https://resolver.caltech.edu/CaltechAUTHORS:20170726-161227615
DOI: 10.7907/95v1p-kkr41
We estimate a model of strategic voting with incomplete information in which committee members (judges in the U.S. courts of appeals) have the opportunity to communicate before casting their votes. The model is characterized by multiple equilibria and partial identification of model parameters. We obtain confidence regions for these parameters using a two-step estimation procedure that flexibly allows for characteristics of the alternatives and the individuals. To quantify the effects of deliberation on outcomes, we compare the probability of mistakes in the court under deliberation with a counterfactual of no pre-vote communication. We find that for most configurations of the court in the confidence set, in the best-case scenario deliberation produces a small potential gain in the effectiveness of the court, and in the worst case it leads to large potential losses.
Market Microstructure Design and Flash Crashes: A Simulation Approach
https://resolver.caltech.edu/CaltechAUTHORS:20170727-091614199
DOI: 10.7907/dtabn-mjz96
We study the consequences of regulatory interventions in limit order markets that aim at stabilizing the market after the occurrence of a "flash crash". We use a simulation platform that creates random arrivals of trade orders, which allows us to analyze subtle features of liquidity and price variability under various market structures. The simulations are performed under a continuous double-auction microstructure and under alternatives, including imposing minimum resting times, shutting off trading for a period of time, and switching to call auction mechanisms. We find that the latter is the most effective in restoring the liquidity of the book and the recovery of the price level. However, one has to be cautious about possible long-term consequences of the intervention for the traders' strategies.
Ambiguity from the Differential Viewpoint
https://resolver.caltech.edu/CaltechAUTHORS:20170802-160400905
DOI: 10.7907/8ckw7-86588
The objective of this paper is to show how ambiguity, and a decision maker (DM)'s response to it, can be modeled formally in the context of a very general decision model.
In the first part of the paper we introduce an "unambiguous preference" relation derived from the DM's preferences, and show that it can be represented by a set of probability measures. We give this set a simple differential interpretation and argue that it represents the DM's perception of the "ambiguity" present in the decision problem. Given this notion of ambiguity, we show that preferences can be represented so as to provide an intuitive representation of ambiguity attitudes.
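As a rough illustration of the multiple-priors representation described above, the "unambiguous preference" relation can be thought of as unanimity across the set of probability measures: one act is unambiguously preferred to another when every prior in the set ranks it at least as high. The sketch below is illustrative only; the states, acts, and prior set are invented for the example and are not taken from the paper.

```python
# Sketch: an "unambiguous preference" as unanimity over a set of priors.
# Two states; an act is a payoff vector (utility per state). The prior set
# below is a hypothetical perceived-ambiguity set, not the paper's.

def expected_utility(act, prior):
    """Expected utility of an act under one prior."""
    return sum(p * u for p, u in zip(prior, act))

def unambiguously_preferred(act_f, act_g, priors):
    """f is unambiguously preferred to g iff every prior in the set agrees."""
    return all(expected_utility(act_f, q) >= expected_utility(act_g, q)
               for q in priors)

# Perceived ambiguity: probability of state 1 ranges over {0.3, 0.5, 0.7}.
priors = [(p, 1 - p) for p in (0.3, 0.5, 0.7)]

f = (1.0, 1.0)   # constant act: immune to ambiguity
g = (0.0, 2.0)   # bet on state 2: evaluation varies across priors

print(unambiguously_preferred(f, g, priors))  # prints False: no unanimity
```

The constant act and the bet are unranked by the unanimity relation, which is exactly the kind of incompleteness that a set of priors (rather than a single prior) captures.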
In the second part of the paper we provide some extensions and "applications" of these ideas. We present an axiomatic characterization of the α-MEU decision rule. We also consider a simple dynamic choice setting and characterize the updating rule that revises every prior in the aforementioned set by Bayes's rule; i.e., the generalized Bayesian updating rule.
Testable Implications of Quasi-Hyperbolic and Exponential Time Discounting
https://resolver.caltech.edu/CaltechAUTHORS:20170726-142753192
DOI: 10.7907/je1j3-any80
We present the first revealed-preference characterizations of the models of exponential time discounting, quasi-hyperbolic time discounting, and other time-separable models of consumers' intertemporal decisions. The characterizations provide non-parametric revealed-preference tests, which we take to the data using the results of a recent experiment conducted by Andreoni and Sprenger (2012). In these data, we find that fewer than half the subjects are consistent with exponential discounting, and only a few more are consistent with quasi-hyperbolic discounting.
Candidate Entry and Political Polarization: An Anti-Median Voter Theorem
https://resolver.caltech.edu/CaltechAUTHORS:20170725-171230569
DOI: 10.7907/v1seg-1j538
We study a candidate entry model with private information about ideal points. We fully characterize the unique symmetric equilibrium of the entry game, and show that only relatively "extreme" citizen types enter the electoral competition as candidates, whereas more "moderate" types never enter. This generally leads to substantial political polarization, even when the electorate is not polarized and citizens understand that they vote for more extreme candidates. Our results are robust with respect to changes in the implementation of a default policy if no citizen runs for office. We show that polarization increases in the costs of entry and the degree of risk aversion, and decreases in the benefits from holding office. Finally, we provide a simple limiting characterization of the unique symmetric equilibrium when the number of citizens goes to infinity. In the limit, only the most extreme citizens, with ideal points at the boundary of the policy space, become candidates.
Local Institutions and the Dynamics of Community Sorting
https://resolver.caltech.edu/CaltechAUTHORS:20170726-141049380
DOI: 10.7907/4x0qz-0bm25
This paper studies the dynamics by which populations with heterogeneous preferences for local public good provision sort themselves into communities. I conduct laboratory experiments to consider which institutions may best facilitate efficient self-organization when residents are able to move freely between locations. I find that institutions requiring all residents of a community to pay equal taxes enable subjects to sort into stable, homogeneous communities. However, populations can find themselves stuck at local, inefficient equilibria. Though sorted, residents may fail to attain the level of public good provision best suited to them, and the system dynamics are crucial for determining whether subjects reach optimally-designed communities. When residents are able to vote for local tax policies with their ballots as well as with their feet, the inefficient local equilibria are eliminated, and each community converges to the most efficient outcome for its population.
Full Stock Payment Marginalization in M&A Transactions
https://resolver.caltech.edu/CaltechAUTHORS:20170726-084800488
DOI: 10.7907/n6h0d-41t82
The number of merger and acquisition (M&A) transactions paid fully in stock in the U.S. market declined sharply after 2001, when pooling and goodwill amortization were abolished by the Financial Accounting Standards Board. Did this accounting rule change really have such far-reaching implications? Using a differences-in-differences test and Canada as a counterfactual, this study reveals that it did. We also report several other results confirming the role of the pooling abolishment, including (i) that the decrease in full stock payment relates to CEO incentives and (ii) that previously documented determinants of the M&A mode of payment cannot explain the post-pooling-abolishment pattern. These results are also robust to controls for various factors, such as the Internet bubble, the exclusion of cross-border deals, the presence of Canadian cross-listed firms, the use of a constant sample of acquirers across the pooling and post-pooling-abolishment periods, the use of Europe as an alternative counterfactual, and controls for the SEC Rule 10b-18 share repurchase safe harbor amendments of 2003.
Pure strategy Nash equilibria in non-zero sum colonel Blotto games
https://resolver.caltech.edu/CaltechAUTHORS:20170726-144614816
DOI: 10.7907/ayttm-ahe73
We analyze a Colonel Blotto game in which opposing parties have differing relative intensities (i.e., the game is non-zero sum). We characterize the colonels' payoffs that sustain a pure strategy equilibrium and present an algorithm that reaches the equilibrium actions (when they exist). Finally, we show that the set of games with a pure strategy equilibrium is non-empty.
Community Dynamics in the Lab: Congestion, Public Good Provision, and Local Instability
https://resolver.caltech.edu/CaltechAUTHORS:20170726-135655672
DOI: 10.7907/xr5pr-rr692
I study the dynamics of voluntary local public good provision in a free-mobility environment when agents differ substantially in the benefit they receive from the public good provided within their community. I find that subjects move in response to both provision and community composition but that the growth and stability of these communities are dictated by movement costs and crowding. When the public good is congestible, such that returns are lower for larger populations, communities are characterized by instability, cyclical fluctuations in local provision, and a dynamic in which low demanders continually chase high demanders through locations. When congestion is eliminated, agents with different preferences sometimes co-exist, but chronic, inefficient movement persists, suggesting that instability is driven by intrinsic preferences for community composition, as well as by sensitivity to congestion. While communities with high entry fees primarily attract those with high public good returns, segregation is not sufficient for overcoming free-riding.
Marshall and Walras, Disequilibrium Trades and the Dynamics of Equilibriation in the Continuous Double Auction Market
https://resolver.caltech.edu/CaltechAUTHORS:20170725-165340096
DOI: 10.7907/da3nf-75y29
Prices and quantities converge to the theoretical competitive equilibria in continuous, double auction markets. The double auction is not a tatonnement mechanism: disequilibrium trades take place. The absence of any influence of disequilibrium trades, which have the capacity to change the theoretical equilibrium, appears to be due to a property found in the Marshallian model of single market adjustments. The Marshallian model incorporates a principle of self-organizing coordination that mysteriously determines the sequence in which specific pairs of agents trade in an environment in which market identities and agent preferences are not public. Disequilibrium trades along the Marshallian path of trades do not change the theoretical equilibrium. The substance of this paper is to demonstrate that the Marshallian principle captures a natural tendency of the adjustment in single, continuous, double auction markets and to suggest how it takes place. The Marshallian model of quantity adjustment and the Walrasian model of market price adjustment can be seen as companion theories that explain the allocation and price processes of a market. The Marshallian model explains the evolution of the allocation, who will meet and trade, and the Walrasian excess demand explains the evolution of prices when they do.
A Bayesian Unobservable/Observable Source Separation Model and Activation Determination in fMRI
https://resolver.caltech.edu/CaltechAUTHORS:20170807-141138268
DOI: 10.7907/eqmw1-2yf39
In functional magnetic resonance imaging, the most important question to be answered is which statistical method to use in analyzing the data. The statistical analysis is the most crucial task because it determines the statistical activations which are interpreted. In computing the activation level, a standard method is to select reference functions that are assumed to be known and perform a multiple regression of the time courses on them and a linear trend. In performing the multiple regression, t or F statistics are computed in each voxel, and the voxels are colored accordingly. But the most important question is: How do we choose the reference functions? Several different functions have been suggested. This paper, based on Bayesian source separation, determines the underlying source reference functions statistically: instead of assuming they are known, it assesses a prior distribution for them along with the other parameters. Both Gibbs sampling and iterated conditional modes algorithms are used to determine marginal posterior mean and joint maximum a posteriori values of the parameters, along with statistical activation levels. It was found that the underlying response can be statistically determined and that the iterated conditional modes algorithm performed better than Gibbs sampling.
A Model for Bayesian Source Separation with the Overall Mean
https://resolver.caltech.edu/CaltechAUTHORS:20170807-144027269
DOI: 10.7907/dvf2j-xbd71
Typically in source separation models the overall mean as well as the mean of the sources are assumed to be zero. This paper assumes a nonzero overall mean and a nonzero source mean, and quantifies available prior knowledge regarding them and the other parameters. This prior knowledge is incorporated into the inferences along with the current data in the Bayesian approach to source separation. Vague, conjugate normal, and generalized conjugate normal distributions are used to quantify knowledge for the overall mean vector. Algorithms for estimating the parameters of the model from the joint posterior distribution are derived using both Gibbs sampling (a Markov chain Monte Carlo method) and the iterated conditional modes algorithm (a deterministic optimization technique), for marginal posterior mean and maximum a posteriori estimates respectively. This is a methodological paper which outlines the model without the use of a numerical example.
How to cut a cake healthily
https://resolver.caltech.edu/CaltechAUTHORS:20170807-133322634
DOI: 10.7907/se2dq-wme73
The Sliding Knife procedure, as well as the Cake Cutting and Fair Border existence theorems, stated for additive evaluations, holds unchanged for concave ones.
Existence and Testable Implications of Extreme Stable Matchings
https://resolver.caltech.edu/CaltechAUTHORS:20170726-142324889
DOI: 10.7907/4mcm3-0zj41
We investigate the testable implications of the theory that markets produce matchings that are optimal for one side of the market; i.e. stable extremal matchings. A leading justification for the theory is that markets proceed as if the deferred acceptance algorithm were in place. We find that the theory of stable extremal matching is observationally equivalent to requiring that there be a unique matching, or that the matching be consistent with unrestricted monetary transfers. We also present results on rationalizing a matching as the median stable matching.
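For reference, the deferred acceptance algorithm invoked above has a standard textbook form. The sketch below is the classic proposer-optimal Gale-Shapley procedure with made-up preference lists; the paper's own setting (aggregate and random matchings, strong stability) is more general than this two-sided one-to-one case.

```python
# Sketch of (proposer-side) deferred acceptance; it produces the stable
# matching that is optimal for the proposing side. Preference lists here
# are hypothetical examples, not data from the paper.

def deferred_acceptance(proposer_prefs, receiver_prefs):
    """proposer_prefs/receiver_prefs: dict name -> list of names, best first."""
    rank = {r: {p: i for i, p in enumerate(prefs)}
            for r, prefs in receiver_prefs.items()}
    next_choice = {p: 0 for p in proposer_prefs}   # next index to propose to
    engaged = {}                                   # receiver -> current proposer
    free = list(proposer_prefs)
    while free:
        p = free.pop()
        r = proposer_prefs[p][next_choice[p]]      # best not-yet-tried receiver
        next_choice[p] += 1
        if r not in engaged:
            engaged[r] = p                         # tentatively accept
        elif rank[r][p] < rank[r][engaged[r]]:     # r prefers the newcomer
            free.append(engaged[r])
            engaged[r] = p
        else:
            free.append(p)                         # rejected; p tries again
    return {p: r for r, p in engaged.items()}

proposers = {"a": ["x", "y"], "b": ["x", "y"]}
receivers = {"x": ["b", "a"], "y": ["a", "b"]}
print(deferred_acceptance(proposers, receivers))   # {'a': 'y', 'b': 'x'}
```

In this example both proposers try "x" first; "x" keeps its preferred proposer "b", and "a" settles for "y", yielding a matching with no blocking pair.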
We work with a general model of matching, which encompasses aggregate and random matchings as special cases. As a consequence, we need to work with a notion of strong stability, and extend the standard theory on the existence and structure of extremal matchings.
Equilibrium Tax Rates and Income Redistribution: A Laboratory Study
https://resolver.caltech.edu/CaltechAUTHORS:20170726-144342297
DOI: 10.7907/0xp5n-z9471
This paper reports results from a laboratory experiment that investigates the Meltzer-Richard model of equilibrium tax rates, inequality, and income redistribution. We also extend that model to incorporate social preferences in the form of altruism and inequality aversion. The experiment varies the amount of inequality and the collective choice procedure to determine tax rates. We report four main findings. First, higher wage inequality leads to higher tax rates. The effect is significant and large in magnitude.
Second, the average implemented tax rates are almost exactly equal to the theoretical ideal tax rate of the median wage worker. Third, we do not observe any significant differences in labor supply or average implemented tax rates between a direct democracy institution and a representative democracy system where tax rates are determined by candidate competition. Fourth, we observe negligible deviations from labor supply behavior or voting behavior in the directions implied by altruism or inequality aversion.
Roemer on Equality of Opportunity
https://resolver.caltech.edu/CaltechAUTHORS:20170802-171353717
DOI: 10.7907/w4db8-jgy82
We critically discuss John Roemer's normative criterion of equal opportunity and advance opportunity dominance as an alternative criterion for the evaluation of policies. For Roemer, strict equality of opportunity obtains when people, irrespective of circumstances beyond their control, have the same ability to achieve advantage through their free choices. To implement this idea, Roemer sorts people with similar circumstances into types and takes their free choices to be represented by their behaviour relative to other members of the same type or, as Roemer calls it, by their 'relative effort'. He then proposes that society should maximize the average advantage of all those whose circumstances cause them to be least well-off relative to others who have expended the same degree of relative effort.
We argue that typing and the relative effort metric conflate the factors for which we do and do not want to hold individuals responsible, whenever these factors are statistically correlated. Moreover, Roemer's rule for policy selection burdens the concept of equal opportunity with foreign distributive principles. Pointing to an inconsistency in Roemer's argument, we also note that his selection rule violates his own requirement of Pareto-optimality. In response to these difficulties, we advance the criterion of opportunity dominance, which is Pareto optimal, maintains conceptual purity and does not conflate the factors for which individuals should and should not be held responsible. This criterion determines a set of candidate policies that are undominated in opportunity and from which a final policy must be selected by additional, conceptually distinct principles of distributive justice.
The Foundations of Latino Voter Partisanship: Evidence from the 2000 Election
https://resolver.caltech.edu/CaltechAUTHORS:20170807-150102415
DOI: 10.7907/18d7a-3sg98
Traditionally, the Latino electorate has been considered Democratic in partisan affiliation. However, during the 2000 presidential election the Republican Party made many efforts to court Latino voters, suggesting that Latino voters may be becoming more Republican in orientation. Using a telephone survey of Latino likely voters conducted during the 2000 election, we examine three different sets of correlates of Latino voter partisanship: social and demographic, issue and ideological, and economic. We find that Latino voter partisanship is strongly structured by social and demographic, as well as issue and ideological, factors. We also find that while it is unlikely that changes in economic factors or abortion attitudes will significantly change which parties the different Latino national-origin groups identify with, it is possible that changes in ideological positions regarding the role of government in providing social services could result in significant changes in Latino party identification.
The Willingness to Pay/Willingness to Accept Gap, the "Endowment Effect," Subject Misconceptions and Experimental Procedures for Eliciting Valuations
https://resolver.caltech.edu/CaltechAUTHORS:20170802-163705473
DOI: 10.7907/61zef-vpy88
We conduct experiments to explore the possibility that subject misconceptions, as opposed to a particular theory of preferences referred to as the "endowment effect," account for reported gaps between willingness to pay ("WTP") and willingness to accept ("WTA"). Two facts are evident in the literature. First, there is no consensus regarding the nature or robustness of the WTA-WTP gap. Second, while experimenters are very concerned to avoid subject misconceptions, there is no consensus about their fundamental properties or how they might be avoided. Instead, experimenters have revealed different conceptions of the phenomenon through different types of experimental procedures and controls. Such controls involve the role of anonymity, elicitation mechanisms, practice, training and binding outcome experiences applied separately or in different combinations.
The resulting pattern of research leaves open the possibility that the widely differing reports of a gap between WTP and WTA could be due to an incomplete science regarding subject misconceptions. The lack of a theory of misconceptions is replaced by what we will call a "revealed theory" methodology, in which theories implicit in experimental procedures found in the literature are at the heart of the new experimental design. Thus, the approach reported here reflects an attempt to simultaneously control for all dimensions of concern found in the literature. To this end our procedures modify the Becker-DeGroot-Marschak mechanism used in previous studies to elicit values. In addition, our procedures supplement commonly used procedures by providing extensive training on the elicitation mechanism before subjects provide WTP and WTA responses. Experiments were conducted using both lotteries and mugs, goods frequently used in endowment effect experiments. Using the modified procedures, we find no support for the hypothesis that WTA is significantly greater than WTP. In addition, we find no support that an observed gap can be convincingly interpreted as an endowment effect, and conclude that further evidence is required before convincing interpretations of any observed gap can be advanced.
A characterization of combinatorial demand
https://resolver.caltech.edu/CaltechAUTHORS:20170725-153042529
DOI: 10.7907/qrejc-dsd67
We prove that combinatorial demand functions are characterized by two properties: continuity and the law of demand.
Exploiting Myopic Learning
https://resolver.caltech.edu/CaltechAUTHORS:20170725-172657983
DOI: 10.7907/4dere-a0z63
I develop a framework in which a principal can exploit myopic social learning in a population of agents in order to implement social or selfish outcomes that would not be possible under the traditional fully-rational agent model. Learning in this framework takes a simple form of imitation, or replicator dynamics, a class of learning dynamics that often leads the population to converge to a Nash equilibrium of the underlying game. To illustrate the approach, I give a wide class of games for which the principal can always obtain strictly better outcomes than the corresponding Nash solution and show how such outcomes can be implemented. The framework is general enough to accommodate many scenarios, and powerful enough to generate predictions that agree with empirically-observed behavior.
Can Relational Contracts Survive Stochastic Interruptions?
https://resolver.caltech.edu/CaltechAUTHORS:20170726-134336781
DOI: 10.7907/ymtpq-vtd88
This paper investigates the robustness of the "two-tiered labor market" experimental results of Brown, Falk and Fehr (2004) by subjecting relationships to stochastic interruptions. Using two different subject pools, we first replicate the basic pattern of high quality private contracting and low quality public contracting. We then study the impact of exogenous random 'downturns' in which firms cannot hire workers for three periods. Our hypothesis is that (1) job rents are lower in downturns and (2) this will lower wages and effort, unless strong re-connection norms exist. We do find that job rents are lower, but surprisingly, the downturns do not harm aggregate market efficiency. Stochastic interruptions delay the formation of relationships, necessitating the use of public offers, which increases the competitiveness of the short term market. The high-tier (private) market responds by raising wages, thus increasing average worker surplus per trade. We also find evidence that 50-50 pre-downturn worker-firm surplus sharing predicts post-downturn re-connections.
A Bayesian Model to Incorporate Jointly Distributed Generalized Prior Information on Means and Loadings in Factor Analysis
https://resolver.caltech.edu/CaltechAUTHORS:20170807-154328220
DOI: 10.7907/8k65a-3cd06
A Bayesian factor analysis model is outlined in which prior knowledge regarding the model parameters is quantified using a prior distribution and incorporated into the inferences along with the data. Recent work (Rowe, 2000a; Rowe, 2000b; Rowe, 2000c) has focused on the population mean and considered vague, conjugate and generalized conjugate distributions when it was taken to be independent of the factor loadings. More recent work (Rowe, 2001) has taken the population mean and factor loadings to be jointly distributed and used a conjugate prior distribution. In this paper, the population mean vector and the factor loadings are taken to be jointly distributed and a generalized conjugate distribution is used. As mentioned in Press (1982), Rothenburg (1963) pointed out that with a conjugate prior distribution, the elements in the covariance matrices are constrained and may not be rich enough to permit complete freedom of assessment. The generalized conjugate distribution permits complete freedom of assessment. Parameters are estimated by Gibbs sampling and iterated conditional modes algorithms.
To Elect or to Appoint? Bias, Information, and Responsiveness of Bureaucrats and Politicians
https://resolver.caltech.edu/CaltechAUTHORS:20170726-153100245
DOI: 10.7907/nhpgb-s3b25
In this paper, we address empirically the trade-offs involved in choosing between bureaucrats and politicians. In order to do this, we need to map institutions of selection and retention of public officials to the type of public officials they induce. We do this by specifying a collective decision-making model, and exploiting its equilibrium information to obtain estimates of the unobservable types. We focus on criminal decisions across US states' Supreme Courts. We find that justices who are shielded from voters' influence ("bureaucrats") on average (i) have better information, (ii) are more likely to change their preconceived opinions about a case, and (iii) are more effective (make fewer mistakes) than their elected counterparts ("politicians"). We evaluate how performance would change if the courts replaced majority rule with unanimity rule.
A Subjective Spin on Roulette Wheels
https://resolver.caltech.edu/CaltechAUTHORS:20170802-170854246
DOI: 10.7907/wwvcd-ff363
We provide a behavioral foundation to the notion of 'mixture' of acts, which is used to great advantage in the decision setting introduced by Anscombe and Aumann.
Our construction allows one to formulate mixture-space axioms even in a fully subjective setting, without assuming the existence of randomizing devices. This simplifies the task of developing axiomatic models which only use behavioral data. Moreover, it is immune from the difficulty that agents may 'distort' the probabilities associated with randomizing devices.
For illustration, we present simple subjective axiomatizations of some models of choice under uncertainty, including the maxmin expected utility model of Gilboa and Schmeidler, and Bewley's model of choice with incomplete preferences.
Nonparametric Learning Rules from Bandit Experiments: The Eyes have it!
https://resolver.caltech.edu/CaltechAUTHORS:20170726-145343662
DOI: 10.7907/wm7ct-f2g27
How do people learn? We assess, in a distribution-free manner, subjects' learning and choice rules in dynamic two-armed bandit learning experiments. To aid in identification and estimation, we use auxiliary measures of subjects' beliefs, in the form of their eye movements during the experiment. Our estimated choice probabilities and learning rules have some distinctive features, notably that subjects tend to update in a non-smooth manner following choices made in accordance with current beliefs. Moreover, the beliefs implied by our nonparametric learning rules are closer to those from a (non-Bayesian) reinforcement learning model than to those from a Bayesian learning model.
Modeling the Change of Paradigm: Non-Bayesian Reactions to Unexpected News
https://resolver.caltech.edu/CaltechAUTHORS:20170726-155556701
DOI: 10.7907/86vbk-me539
Despite its normative appeal and widespread use, Bayes' rule has two well-known limitations: first, it does not predict how agents should react to information to which they assigned probability zero; second, sizable empirical evidence documents how agents systematically deviate from its prescriptions by overreacting to information to which they assigned a positive but small probability. By replacing Dynamic Consistency with a novel axiom, Dynamic Coherence, we characterize an alternative updating rule that is not subject to these limitations, but at the same time coincides with Bayes' rule for "normal" events. In particular, we model an agent with a utility function over consequences, a prior over priors ρ, and a threshold. In the first period she chooses the prior that maximizes the prior over priors ρ, à la maximum likelihood. As new information is revealed: if the chosen prior assigns to this information a probability above the threshold, she follows Bayes' rule and updates it. Otherwise, she goes back to her prior over priors ρ, updates it using Bayes' rule, and then chooses the new prior that maximizes the updated ρ. We also extend our analysis to the case of ambiguity aversion.
An Evolutionary Perspective on Goal Seeking and Escalation of Commitment
https://resolver.caltech.edu/CaltechAUTHORS:20170807-155245396
DOI: 10.7907/kknwr-kpf56
Maximizing the probability of bypassing an aspiration level, and taking increasing risks to recover previous losses, are well-documented behavioral tendencies. They are compatible with individual utility functions that are S-shaped, as suggested in Prospect Theory (Kahneman and Tversky 1979). We explore evolutionary foundations for such preferences. Idiosyncratic innovative activity, while individually risky, enhances the fitness of society because it provides hedging against aggregate disasters that might occur if everybody had pursued the same course of action. In order that individuals choose the socially optimal dosage of innovative activity, the individuals' preferences should make them strive to improve upon the on-going convention, even if it implies taking gambles that reduce their expected achievements. We show how, in a formal model, the preferences that will be selected for in the course of evolution lead to maximizing the probability of bypassing an aspiration level. Furthermore, when comparing choices with the same probability of achieving this goal, preference is indeed established by maximizing the expected utility of an S-shaped utility function, exhibiting risk loving below the aspiration level and risk aversion beyond it.
The Hubris Hypothesis: Empirical Evidence
https://resolver.caltech.edu/CaltechAUTHORS:20170726-142038067
DOI: 10.7907/3mbyx-mv752
The Hubris Hypothesis is grounded on a failure to adequately account for the winner's curse, which leads to overbidding. Surprisingly, few papers have attempted to develop a direct empirical test of the presence of overbidding in M&A contests. We develop two such tests in this paper. Our results strongly support the existence of overbidding.
Indecision Theory: Quality of Information and Voting Behavior
https://resolver.caltech.edu/CaltechAUTHORS:20170807-152618695
DOI: 10.7907/x45sh-c4n27
In this paper we show how to incorporate quality of information into a model of voting behavior. We do so in the context of the turnout decision of instrumentally rational voters who differ in their quality of information, which we refer to as ambiguity. Ambiguity is reflected by the fact that the voter's beliefs are given by a set of probabilities, each of which represents in the voter's mind a different possible scenario.
We show that in most elections voters who satisfy the Bayesian model do not strictly prefer abstaining over voting for one of the candidates. In contrast, a voter who is averse to ambiguity considers abstention strictly optimal when the candidates' policy positions are both ambiguous and they are "ambiguity complements". Abstaining is preferred since it is tantamount to mixing the prospects embodied by the two candidates, thus enabling the voter to "hedge" the candidates' ambiguity.https://resolver.caltech.edu/CaltechAUTHORS:20170807-152618695An Improved Statistical Model for Multiparty Electoral Data
https://resolver.caltech.edu/CaltechAUTHORS:20170807-153549706
DOI: 10.7907/c5d2w-qmn15
Katz and King (1999) develop a model for predicting or explaining aggregate electoral results in multiparty democracies. Their model is, in principle, analogous to what least squares regression provides American politics researchers in that two-party system. Katz and King applied their model to three-party elections in England and revealed a variety of new features of incumbency advantage and of where each party draws its support. Although the mathematics of their statistical model covers any number of political parties, it is computationally very demanding, and hence slow and numerically imprecise, with more than three. The original goal of our work was to produce an approximate method that runs faster in practice with many parties without making too many theoretical compromises. As it turns out, the method we offer here improves on Katz and King's model (in bias, variance, numerical stability, and computational speed) even when the latter is computationally feasible. We also offer easy-to-use software that implements our suggestions.https://resolver.caltech.edu/CaltechAUTHORS:20170807-153549706An Experimental Comparison Between Free Negotiation and a Multi-issue Point Mechanism
https://resolver.caltech.edu/CaltechAUTHORS:20170726-142302581
DOI: 10.7907/bb0nx-nmq19
We propose a multi-issue point mechanism to be used in conflict resolution situations. This mechanism extracts "gains from trade" inherent in the parties' differing valuations of the various issues where conflict exists. In order to assess the performance of this mechanism vis-a-vis unconstrained communication, we run a series of controlled laboratory experiments and find that both mechanisms reach similar levels of welfare, but the multi-issue point mechanism allows subjects to reach an agreement more swiftly. In order to analyse the performance of both mechanisms in detail, we introduce a classical measure of conflict and see that when conflict is highest, free negotiation sustains the most losses: (1) subjects need more time to reach an agreement; (2) the likelihood of gridlock (no agreement) increases.https://resolver.caltech.edu/CaltechAUTHORS:20170726-142302581Fair Kidney Allocation Based on Waiting Time
https://resolver.caltech.edu/CaltechAUTHORS:20170807-143221840
DOI: 10.7907/jdkna-neq07
We study the allocation of cadaveric donor kidneys for transplantation based solely on waiting time. Despite first impressions, this simple allocation rule turns out to possess very attractive ethical and medical properties. Current allocation rules do not consider criteria such as sex, age, and race, perhaps for fear of morally unacceptable allocations, although certain combinations of these criteria are known to affect graft survival rates. We demonstrate that allocation by waiting time automatically protects disadvantaged patient types and puts them in a near-optimal position. The inclusion of sex, age, and race will therefore not lead to morally unacceptable allocations. This allows individual patients to improve the expected survival time of their graft relative to the status quo without being penalized by the allocation rule. Moreover, decisions about when to start compromising on expected graft survival rates in favour of shorter waiting times are made locally by patients and their medical advisers rather than by a centralized protocol.https://resolver.caltech.edu/CaltechAUTHORS:20170807-143221840A Note on Impossibility Theorems and Seniority Rules
https://resolver.caltech.edu/CaltechAUTHORS:20170807-134702103
DOI: 10.7907/yrset-1fg13
The purpose of a social choice rule is to resolve conflicts among the preferences of a group of individuals. We should therefore require a social choice rule not to remain indecisive between alternatives for which individuals have conflicting preferences. Suppose we also adopt the requirements of a universal domain, strict Pareto optimality, and independence of irrelevant alternatives. We then obtain the existence of a dictatorship (for binary choices) already under the weak consistency assumption that the group's choice function must always generate a preference relation that is acyclical over triples of alternatives. In contrast to other theorems, this result holds without any restrictions on the size of the group and without the axiom of positive responsiveness. Under the same consistency condition, we furthermore obtain an axiomatic characterization of seniority rules, also known as lexical dictatorships.https://resolver.caltech.edu/CaltechAUTHORS:20170807-134702103A Model for Bayesian Factor Analysis with Jointly Distributed Means and Loadings
https://resolver.caltech.edu/CaltechAUTHORS:20170807-160113179
DOI: 10.7907/6htq7-smm75
In the Bayesian approach to factor analysis, available prior knowledge regarding the model parameters is quantified in the form of prior distributions and incorporated into the inferences along with the data. The incorporation of prior knowledge has the added consequence of eliminating the ambiguity of rotation and the need for model constraints found in the traditional factor analysis model. A focus of recent work (Rowe 2000a, 2000b, and 2000c) has been on quantifying and incorporating available prior knowledge when estimating the population mean. This previous work has considered vague, conjugate, and generalized conjugate distributions for the population mean. In this paper, unlike previous work, the population mean vector and the factor loading matrix are taken to be jointly distributed, which allows available interrelated prior information to be quantified and incorporated with the data. The model parameters are estimated by Gibbs sampling and iterated conditional modes algorithms.https://resolver.caltech.edu/CaltechAUTHORS:20170807-160113179Communication in Multilateral Bargaining
https://resolver.caltech.edu/CaltechAUTHORS:20170726-152606828
DOI: 10.7907/q0zaq-zax13
One of the most robust phenomena in the experimental literature on multilateral bargaining is the failure of proposers to extract equilibrium rents. However, all previous experiments have overlooked the fact that outside the lab committee members are allowed to - and do - engage in sometimes intense communication processes prior to voting on a proposal. We conduct an experimental test of the Baron-Ferejohn model in which we allow committee members to engage in unrestricted cheap-talk communication before a proposal is submitted. We find that proposers extract a significantly higher share of resources when communication is allowed. Communication increases proposer power through two channels. First, it mitigates the uncertainty surrounding the amount a coalition member is willing to accept. Second, it allows potential coalition members to compete for a place in the coalition by lowering this stated price.https://resolver.caltech.edu/CaltechAUTHORS:20170726-152606828Estimation of Random Coefficients Logit Demand Models with Interactive Fixed Effects
https://resolver.caltech.edu/CaltechAUTHORS:20170726-151203719
DOI: 10.7907/safgt-rvv10
We extend the Berry, Levinsohn and Pakes (BLP, 1995) random coefficients discrete-choice demand model, which underlies much recent empirical work in IO. We add interactive fixed effects in the form of a factor structure on the unobserved product characteristics. The interactive fixed effects can be arbitrarily correlated with the observed product characteristics (including price), which accommodates endogeneity and, at the same time, captures strong persistence in market shares across products and markets. We propose a two-step least squares-minimum distance (LS-MD) procedure to calculate the estimator. Our estimator is easy to compute, and Monte Carlo simulations show that it performs well. We consider an empirical application to US automobile demand.https://resolver.caltech.edu/CaltechAUTHORS:20170726-151203719Can Housing Risk be Diversified? A Cautionary Test from the Housing Boom and Bust
https://resolver.caltech.edu/CaltechAUTHORS:20170726-112029656
DOI: 10.7907/mva5a-pt004
This study evaluates the effectiveness of geographic diversification in reducing housing investment risk. To characterize diversification potential, we estimate spatial correlation and integration among 401 US metropolitan housing markets. The 2000s boom brought a marked uptrend in housing market integration associated with eased residential lending standards and rapid growth in private mortgage securitization. As boom turned to bust, macro factors, including employment and income fundamentals, contributed importantly to the upward trend in housing return integration. Portfolio simulations reveal substantially lower diversification potential and higher risk in the wake of increased market integration.https://resolver.caltech.edu/CaltechAUTHORS:20170726-112029656Bayesian Source Separation with Jointly Distributed Mean and Mixing Coefficients via MCMC and ICM
https://resolver.caltech.edu/CaltechAUTHORS:20170807-133255278
DOI: 10.7907/9bv1k-zdq54
Recent source separation work has described a model which assumes a nonzero overall mean and incorporates prior knowledge regarding it. This is significant because source separation models that have previously been presented have assumed that the overall mean is zero. However, this work specified that the prior distribution which quantifies available prior knowledge regarding the overall mean be independent of the mixing coefficient matrix. The current paper generalizes this work by quantifying available prior information regarding the overall mean and mixing matrix with the use of joint prior distributions. This prior knowledge in the prior distributions is incorporated into the inferences along with the current data. Conjugate normal and generalized conjugate normal distributions are used. Algorithms are presented for estimating the parameters of the model from the joint posterior distribution using both Gibbs sampling, a Markov chain Monte Carlo method, and the iterated conditional modes algorithm, a deterministic optimization technique, yielding marginal posterior mean and maximum a posteriori estimates respectively.https://resolver.caltech.edu/CaltechAUTHORS:20170807-133255278Improved Methods for Detecting Acquirer Skills
https://resolver.caltech.edu/CaltechAUTHORS:20170726-082321868
DOI: 10.7907/ktv09-kht25
Large merger and acquisition (M&A) samples feature the pervasive presence of repetitive acquirers. They offer an attractive empirical context for revealing the presence of acquirer skills (persistent superior performance). But M&A panel data are quite heterogeneous; just a few acquirers undertake many M&As. Does this feature affect statistical inference? To investigate the issue, our study relies on simulations based on real data sets. The results suggest the existence of a bias, such that extant statistical support for the presence of acquirer skills appears compromised. We introduce a new resampling method to detect acquirer skills with attractive statistical properties (size and power) for samples of acquirers that complete at least five acquisitions. The proposed method confirms the presence of acquirer skills, but only for a marginal fraction of the acquirer population. This result is robust to endogenous attrition and varying time periods between successive transactions. Claims that acquirer skills are a first-order factor explaining acquirers' cross-sectional cumulated abnormal returns appear overstated.https://resolver.caltech.edu/CaltechAUTHORS:20170726-082321868On the probability of the competitive equilibrium being globally stable: The C.E.S. example
https://resolver.caltech.edu/CaltechAUTHORS:20170802-170857684
DOI: 10.7907/ecygz-2mc16
This paper extends an analysis proposed by Hirota (1981) to a class of economies with C.E.S. utility functions that include Scarf's (1960) second example as a special case and shows by the use of numerical methods that (i) a Walrasian price adjustment mechanism converges to an equilibrium with very high probability and (ii) the weak axiom of revealed preference for market excess demands is satisfied with high probability, but gross substitutability is rarely satisfied. Also, this paper suggests a possible interpretation of a Walrasian price adjustment that is based on the observations of experiments done by Anderson, Plott, Shimomura and Granat (2000).https://resolver.caltech.edu/CaltechAUTHORS:20170802-170857684Symmetric play in repeated allocation games
https://resolver.caltech.edu/CaltechAUTHORS:20170727-092437236
DOI: 10.7907/eb78v-njd23
We study symmetric play in a class of repeated games when players are patient. We show that, while the use of symmetric strategy profiles essentially does not restrict the set of feasible payoffs, the set of equilibrium payoffs is an interesting proper subset of the feasible and individually rational set. We also provide a theory of how rational individuals play these games, identifying particular strategies as focal through the considerations of Pareto optimality and simplicity. We report experiments that support many aspects of this theory.https://resolver.caltech.edu/CaltechAUTHORS:20170727-092437236The Instability of Robust Aggregation
https://resolver.caltech.edu/CaltechAUTHORS:20170725-161917556
DOI: 10.7907/vmr0n-5k854
We discuss the feasibility of Levi's (1990) robust mode of aggregating individuals' evaluations of acts into a social choice function. We examine the process in which we refine decision-theoretic models and account for previously irrelevant parameters of a decision situation (cf. Savage's `small worlds'). Suppose that, for each individual, we consider a coarse-grained and a fine-grained decision-theoretic model, both of which are consistent with each other in a sense to be defined. We desire any social choice rule to be stable under refinements in the sense that the group choice based on fine-grained individual models and the group choice based on coarse-grained individual models agree for choices among coarse-grained alternatives. We find that any stable robust social choice rule must collapse back into ex ante aggregation. We also provide sufficient conditions, such as Pareto optimality, under which robust aggregation leads to an infinite series of reversals of group choice. For ex ante aggregation, we find that stability is ubiquitous and that it follows from independence of irrelevant alternatives.https://resolver.caltech.edu/CaltechAUTHORS:20170725-161917556How to control controlled school choice
https://resolver.caltech.edu/CaltechAUTHORS:20170727-090931100
DOI: 10.7907/982gm-51w96
We characterize choice rules for schools that regard students as substitutes, while at the same time expressing preferences for the diversity composition of the student body. The stable (or fair) assignment of students to schools requires the latter to regard the former as substitutes. Such a requirement is in conflict with the reality of schools' preferences for a diverse student body. We show that the conflict can be useful, in the sense that certain unique rules emerge from imposing both considerations.https://resolver.caltech.edu/CaltechAUTHORS:20170727-090931100The Propagation of Shocks Across International Equity Markets: A Microstructure Perspective
https://resolver.caltech.edu/CaltechAUTHORS:20170726-114925483
DOI: 10.7907/emqdm-bfm65
We study the high-frequency propagation of shocks across international equity markets. We identify intraday shocks to stock prices, liquidity, and trading activity for 12 equity markets around the world based on non-parametric jump statistics at the 5-minute frequency from 1996 to 2011. Shocks to prices are prevalent and large, with regular spillovers across markets – even within the same 5-minute interval. We find that price shocks are predominantly driven by information rather than liquidity. Consistent with the information channel, price shocks do not revert and often occur around macroeconomic news announcements. Liquidity shocks tend to be isolated events that are neither associated with price shocks nor with liquidity shocks on other markets. Our results challenge the widespread view that liquidity plays an important role in the origination and propagation of financial market shocks.https://resolver.caltech.edu/CaltechAUTHORS:20170726-114925483Social Preferences under Uncertainty: Equality of Opportunity vs. Equality of Outcome
https://resolver.caltech.edu/CaltechAUTHORS:20170727-093210149
DOI: 10.7907/efypn-vdn31
This paper introduces a model of inequality aversion that captures, with a single parameter, a preference for equality of ex-ante expected payoffs relative to a preference for equality of ex-post payoffs. On deterministic allocations, the model reduces to the model of Fehr and Schmidt (1999). The model provides a unified explanation for recent experiments on probabilistic dictator games and dictator games under a veil of ignorance. Moreover, the model can describe experiments on a preference for efficiency, which seem inconsistent with inequality aversion. We also apply the model to the optimal tournament. Finally, we provide a behavioral foundation of the model.https://resolver.caltech.edu/CaltechAUTHORS:20170727-093210149Incomplete Information
https://resolver.caltech.edu/CaltechAUTHORS:20170807-134049226
DOI: 10.7907/5jwkt-1se17
In interactive contexts such as games and economies, it is important to take account not only of what the players believe about substantive matters (such as payoffs), but also of what they believe about the beliefs of other players. Two different but equivalent ways of dealing with this matter, the semantic and the syntactic, are set forth. Canonical and universal semantic systems are then defined and constructed, and the concepts of common knowledge and common priors formulated and characterized. The last two sections discuss relations with Bayesian games of incomplete information and their applications, and with interactive epistemology -- the theory of multi-agent knowledge and belief as formulated in mathematical logic.https://resolver.caltech.edu/CaltechAUTHORS:20170807-134049226Mixed Equilibrium in a Downsian Model with a Favored Candidate
https://resolver.caltech.edu/CaltechAUTHORS:20170807-165400595
DOI: 10.7907/d5xt1-9dh42
This paper examines competition in the standard one-dimensional Downsian model of two-candidate elections, but where one candidate (A) enjoys an advantage over the other candidate (D). Voters' preferences are Euclidean, but any voter will vote for candidate A over candidate D unless D is closer to her ideal point by some fixed distance δ. The location of the median voter's ideal point is uncertain, and its distribution is commonly known by both candidates. The candidates simultaneously choose locations to maximize the probability of victory. Pure strategy equilibria often fail to exist in this model, except under special conditions on δ and the distribution of the median ideal point. We solve for the essentially unique symmetric mixed equilibrium, show that candidate A adopts more moderate policies than candidate D, and obtain some comparative statics results about the probability of victory and the expected distance between the two candidates' policies.https://resolver.caltech.edu/CaltechAUTHORS:20170807-165400595Is there a Gender Gap in Fiscal Political Preferences?
https://resolver.caltech.edu/CaltechAUTHORS:20170807-170314367
DOI: 10.2139/ssrn.240502
This paper examines the relationship between attitudes on potential uses of the budget surplus and gender. Survey results show relatively weak support overall for using a projected surplus to reduce taxes, with respondents much likelier to prefer increased social spending on education or social security. There is a significant gender gap, with men being far more likely than women to support tax cuts or paying down the national debt. Given a menu of particular types of tax cuts, women are marginally more likely to favor childcare relief or working poor tax credits, whereas men are marginally more likely to favor capital gains reduction or tax rate cuts. When primed that the tax laws are biased against two-worker families, men significantly change their preferences, moving from support for general tax rate cuts to support for working poor tax relief, but not to childcare relief. One of the strongest results to emerge is that women are far more likely than men to express no opinion or to confess ignorance about fiscal matters. Both genders give more "no opinion" answers in the face of priming, but men more so than women. Further research will explore this no opinion/uncertainty aspect.https://resolver.caltech.edu/CaltechAUTHORS:20170807-170314367Aggregation and Dynamics of Survey Responses: The Case of Presidential Approval
https://resolver.caltech.edu/CaltechAUTHORS:20170807-164540519
DOI: 10.7907/kdsp6-nwb86
In this paper we critique much of the empirical literature on the important political science concept of presidential approval. Much of the recent research on presidential approval has focused on the dynamic nature of approval; arguments have raged about whether presidential approval is integrated, co-integrated, or fractionally integrated. We argue that none of these time-series concepts, imported from an econometrics literature which has fundamentally different types of data than do political scientists, can apply to the presidential approval time series. Instead, we advocate careful use of aggregated approval as a time-series cross-section, or the use of individual-level survey responses. Ultimately most of the important hypotheses political scientists wish to test regarding presidential approval involve individual voters or citizens; thus we argue that using the appropriate data unit is the best methodology.https://resolver.caltech.edu/CaltechAUTHORS:20170807-164540519Strategic Learning and Teaching
https://resolver.caltech.edu/CaltechAUTHORS:20170807-171037465
DOI: 10.7907/byfnc-zek61
Decision makers learn from experience and this learning affects their future decisions. But how? Using game theory, the authors explore how people factor in the payoffs of past decisions in their current choices. The authors introduce the concept of experience-weighted attraction (EWA) to explain how people learn from their experience. Sophisticated players can use this concept to learn which strategies work and "outguess" their rivals. And by understanding how rivals and customers learn, managers can take actions that "teach" rivals and customers what you want them to believe—reassuring partners and intimidating competitors.https://resolver.caltech.edu/CaltechAUTHORS:20170807-171037465Emergence of Endogenous Legal Institutions: The Rural Charters in Northern Italy
https://resolver.caltech.edu/CaltechAUTHORS:20170807-161707482
DOI: 10.7907/4yv4f-9rd82
Common-pool resources create a well-known social dilemma, and to solve the problem the recent literature in economics has focused on how repeated interaction can promote informal cooperation without the need for formal legal or political institutions. This paper examines a particular example of a common resource: common property in alpine communities of Northern Italy between the 13th and the 19th century. There, rather than relying on repeated interaction alone, users created formal mechanisms that regulated behavior and access to the common property via quotas and time restrictions. Because the formal institutions existed side by side with the sort of repeated interaction that would breed informal cooperation, there was a paradoxical redundancy of institutions.
On one hand, formal regulations were probably the best way to limit the overuse of the commons. We consider the tradeoff between developing formal regulations versus relying on informal cooperation. Under certain conditions, the cost of building formal institutions is repaid by a large gain in efficiency. On the other hand, the users themselves had to create and administer the formal institutions, and since the benefits of formal regulations are a public good, each individual has an incentive to free ride. The collective action problem of providing regulatory services was surmounted thanks to the repeated interaction among users. The paradox is thus resolved.https://resolver.caltech.edu/CaltechAUTHORS:20170807-161707482On the Evolutionary Emergence of Optimism
https://resolver.caltech.edu/CaltechAUTHORS:20170807-160109793
DOI: 10.7907/n7jaw-cf482
Successful individuals were frequently found to be overly optimistic. These findings are puzzling, as one could expect that realists would perform best in the long run. We show, however, that in a large class of strategic interactions of either cooperation or competition, the equilibrium payoffs of optimists may be higher than those of realists. This is because the very fact of being optimistic changes the game, and drives the adversary to change her equilibrium behavior, possibly to the benefit of the optimist. Suppose, then, that a population consists initially of individuals with various perceptional tendencies, pessimists and optimists to various extents, as well as of realists. Individuals meet in pairs to interact, and more successful tendencies proliferate faster. We show that as time goes by, some moderate degree of optimism will take over, and outnumber all other tendencies.https://resolver.caltech.edu/CaltechAUTHORS:20170807-160109793Range Convexity and Ambiguity Averse Preferences
https://resolver.caltech.edu/CaltechAUTHORS:20170808-163116323
DOI: 10.7907/mmrke-86374
We show that range convexity of beliefs, a 'technical' condition that appears naturally in axiomatizations of preferences in a Savage-like framework, imposes some unexpected restrictions when modelling ambiguity averse preferences. That is, when it is added to a mild condition, range convexity makes the preferences collapse to subjective expected utility as soon as they satisfy structural conditions that are typically used to characterize ambiguity aversion.https://resolver.caltech.edu/CaltechAUTHORS:20170808-163116323Banking Industry Structure, Competition, and Performance: Does Universality Matter?
https://resolver.caltech.edu/CaltechAUTHORS:20170808-163959852
DOI: 10.7907/0etxd-7qh20
By studying the German universal banking system in the pre-World War I period, in comparison with its American and British counterparts, this paper investigates whether universality (the combination of commercial and investment banking services) influences banking industry concentration, levels of market power, or financial performance of banks. The short answer is "no". First, given that the UK's specialized commercial banking sector was structured very similarly to the German universal industrial banking sector, and that neither system was extremely concentrated in the pre-war era, the paper argues that universality does not necessarily or uniquely propagate concentration. Second, on average, German universal banks behaved no less competitively than their American counterparts in the provision of loan services. Structural price markup models, as well as reduced-form Rosse-Panzar tests, demonstrate little deviation from competitive pricing in either country. The findings therefore indicate that universality does not lead to appreciable market power, in either an absolute or a relative sense. These same results also imply that banking industry concentration, at least up to the moderately high levels found in Germany, does not in itself produce anti-competitive behavior. The empirical results, though contradictory to common wisdom about German universal banking, are easily motivated by the theoretical literature in industrial organization. Finally, estimates of returns on equity and on assets suggest only slight international differences in average returns over extended periods, but large deviations in individual years. Adjusting for prevailing rates on government bonds, commercial loans, or commercial deposits narrows the gaps further. Universality is not linked with superior profitability, whether the hypothesized source is efficiency (economies of scope) or monopoly power. 
These three sets of findings may assuage fears that deregulation in American banking could lead to excessive concentration and therefore collusive behavior. At the same time, the results may lower hopes of significant efficiency gains from broadening the scope of services.https://resolver.caltech.edu/CaltechAUTHORS:20170808-163959852A Bayesian Factor Analysis Model with Generalized Prior Information
https://resolver.caltech.edu/CaltechAUTHORS:20170807-164537348
DOI: 10.7907/cjc86-9jc10
In the Bayesian approach to factor analysis, available prior knowledge regarding the model parameters is quantified in the form of prior distributions and incorporated into the inferences. The incorporation of prior knowledge has the added consequence of eliminating the ambiguity of rotation found in the traditional factor analysis model. Previous Bayesian factor analysis work (Press & Shigemasu 1989, Press 1998, Rowe 2000a, and Rowe 2000b) has considered mainly natural conjugate prior distributions for the model parameters. As is mentioned in Press (1982), Rothenburg (1963) pointed out that with a natural conjugate prior distribution, the elements in the covariance matrices are constrained and thus may not be rich enough to permit freedom of assessment. In this paper, generalized natural conjugate distributions, which permit complete freedom of assessment, are used to quantify and incorporate available prior information.https://resolver.caltech.edu/CaltechAUTHORS:20170807-164537348Inducing Liquidity in Thin Financial Markets through Combined-Value Trading Mechanisms
https://resolver.caltech.edu/CaltechAUTHORS:20170808-135334937
DOI: 10.7907/yk4m6-yrv98
Previous experimental research has shown that thin financial markets fail to fully equilibrate, in contrast with thick markets. A specific type of market risk is conjectured to be the reason, namely, the risk of partial execution of desired portfolio rearrangements in a system of parallel, unconnected double auction markets. This market risk causes liquidity to dry up before equilibrium is reached. To verify the conjecture, we organized markets directly as a portfolio trading mechanism, allowing agents to better coordinate their orders across securities. The mechanism is an implementation of the combined-value trading (CVT) system. We present evidence that our portfolio trading mechanism facilitates equilibration to the same extent as thick markets do. As in thick markets, the emergence of equilibrium pricing cannot be attributed to chance. Inspection of order submission and trade activity reveals that subjects manage to exploit the direct linkages between markets presented by the CVT system.https://resolver.caltech.edu/CaltechAUTHORS:20170808-135334937The First Use of a Combined Value Auction for Transportation Services
https://resolver.caltech.edu/CaltechAUTHORS:20170808-141429666
DOI: 10.7907/1xbgs-71j14
Sears, Roebuck and Co. is one of the largest procurers of trucking services in the world through its wholly-owned subsidiary, Sears Logistics Services (SLS). SLS controls supply chain elements that originate at the vendor (manufacturer) through distribution centers to retail stores, and from vendor to distribution centers to cross dock facilities. This case examines a major change in the method Sears used in contracting for truckload carrier services for this supply chain. It provides a pioneering example of complex business to business e-Commerce.https://resolver.caltech.edu/CaltechAUTHORS:20170808-141429666Has The Cross-Section of Average Returns Always Been the Same? Evidence from Germany, 1881-1913
https://resolver.caltech.edu/CaltechAUTHORS:20170808-155743909
DOI: 10.7907/dn4fv-8ve95
The cross-section of average annual returns on German common stock in the period of 1881-1913 exhibits several of the patterns that have been observed in more recent U.S. data. Market beta is hardly important, and its explanatory power is swamped by size and the ratio of book value to market value. A book-to-market risk measure (covariance with a portfolio long in high book-to-market firms and short in low book-to-market firms) has no effect on the explanatory power of the book-to-market characteristic. But the size effect appears to be caused by selection bias in the sample. And the book-to-market effect is opposite that of the recent U.S. experience (and, hence, can certainly not be attributed to selection bias). Finally, a momentum portfolio constructed on the basis of the error of the basic 3-characteristic model (market beta, size and book-to-market) does not generate significant returns. These findings highlight the variability in the power of certain characteristics in explaining the cross section of average returns.https://resolver.caltech.edu/CaltechAUTHORS:20170808-155743909Bayesian consistent prior selection
https://resolver.caltech.edu/CaltechAUTHORS:20170808-133741084
DOI: 10.7907/2epdc-s7810
A subjective expected utility agent is given information about the state of the world in the form of a set of possible priors. She is allowed to condition her prior on this information. A set of priors may be updated according to Bayes' rule, prior-by-prior, upon learning that some state of the world has not obtained. We show that there exists no decision maker who obeys Bayes' rule, conditions her prior on the available information (by selecting a prior in the announced set), and updates the information prior-by-prior using Bayes' rule. The result implies that at least one of several familiar decision-theoretic "paradoxes" is a mathematical necessity.https://resolver.caltech.edu/CaltechAUTHORS:20170808-133741084A Crash Course in Implementation Theory
https://resolver.caltech.edu/CaltechAUTHORS:20170808-170041665
DOI: 10.7907/37vz7-4gj61
These lectures are meant to familiarize the audience with some of the fundamental results in the theory of implementation and provide a quick progression to some open questions in the literature.https://resolver.caltech.edu/CaltechAUTHORS:20170808-170041665Iterative Elimination of Weakly Dominated Strategies in Binary Voting Agendas with Sequential Voting
https://resolver.caltech.edu/CaltechAUTHORS:20170808-134301462
DOI: 10.7907/pamdw-qxg09
In finite perfect information extensive (FPIE) games, backward induction (BI) gives rise to all pure-strategy subgame perfect Nash equilibria, and iterative elimination of weakly dominated strategies (IEWDS) may give different outcomes for different orders of elimination. Several conjectures were recently posed in an effort to better understand the relationship between BI and IEWDS in FPIE games. Four of these problems regard binary voting agendas with sequential voting and two alternatives. Those problems are: (1) Assuming no indifferences, is the BI strategy profile, "always vote for my preferred alternative", guaranteed to survive IEWDS using exhaustive elimination? (2) Does any order of IEWDS leave only strategy profiles that generate paths of play consistent with BI? (3) Does there exist an order of IEWDS that leaves only strategy profiles that generate paths of play consistent with BI? (4) Does any order of IEWDS leave at least one strategy profile that generates a path of play consistent with BI? This paper proves all four conjectures. Moreover, the first conjecture is generalized to agendas with indifferences, the second and third conjectures are shown not to hold for binary voting agendas with more than two alternatives, and I comment on additional results related to the last three problems.https://resolver.caltech.edu/CaltechAUTHORS:20170808-134301462IPO Underpricing in Two Universes: Berlin, 1882-1892, and New York, 1998-2000
https://resolver.caltech.edu/CaltechAUTHORS:20170808-150536344
DOI: 10.7907/d0y4r-48224
Underpricing of new issues relates negatively to underwriter reputation in studies covering the US from the early 1970s to 1997 but positively in one study of IPOs from 1992-94. This paper investigates whether IPO underpricing depends on the organization of the financial system, whether underwriter reputation is a consistent indicator of firm quality and therefore (negatively) of underpricing, and whether this reputation effect also appears in completely different contexts. The study also looks for truncation in the observed returns distributions that may hint at price support activities on the part of underwriters. To answer these questions, the study presents evidence on new issues of stocks and their one-day returns in the Berlin market of 1882-1892 along with parallel data from the New York markets of 1998-2000.
Despite what appear to be major institutional differences between the US and Germany, underpricing and its correlates are remarkably similar in the two cases: median underpricing is nearly the same in the two samples. Strikingly, the link between underwriter reputation and underpricing is positive both in the U.S. of recent years and in Berlin of the 1880s. This finding is in stark contrast with those for the US in the 1970s and 80s. The trade-off between prices and rationing faced by underwriters might result in this instability in the reputation-underpricing link. Finally, the observed distributions of first-day returns for both markets display marked skewness toward positive values, a pattern that is consistent with left censoring and quite possibly with underwriter price supports.
These results support a number of conclusions: first, either underwriter reputation is a poor signal of firm quality or firm quality is positively related to underpricing in certain circumstances; second, the largest and most prestigious underwriters may exert market power or at least provide more or better service in return for their higher indirect costs; third, the relationship between underwriter reputation or market share and underpricing clearly varies dramatically over time and across countries; and finally, financial system design (in particular, universal and relationship banking) may have little impact on the performance of new issues markets. Given the German results, one should conclude either that significant information asymmetries exist despite universality and formal relationships in the banking system or that information problems are unnecessary for the emergence of underpricing.https://resolver.caltech.edu/CaltechAUTHORS:20170808-150536344Search in the Formation of Large Networks: How Random are Socially Generated Networks?
https://resolver.caltech.edu/CaltechAUTHORS:20170809-094225745
DOI: 10.7907/ca4tc-rv406
We present a model of network formation where entering nodes find other nodes to link to both completely at random and through search of the neighborhoods of these randomly met nodes. We show that this model exhibits the full spectrum of features that have been found to characterize large socially generated networks. Moreover, we derive the distribution of degree (number of links) across nodes, and show that while the upper tail of the distribution is approximately "scale-free," the lower tail may exhibit substantial curvature, just as in observed networks. We then fit the model to data from six networks. Besides offering a close fit of these diverse networks, the model allows us to impute the relative importance of search versus random attachment in link formation. We find that the fitted ratio of random meetings to search-based meetings varies dramatically across these applications. Finally, we show that as this random/search ratio varies, the resulting degree distributions can be completely ordered in the sense of second order stochastic dominance. This allows us to infer how the relative randomness in the formation process affects average utility in the network.https://resolver.caltech.edu/CaltechAUTHORS:20170809-094225745Risk, Ambiguity, and the Separation of Utility and Beliefs
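The entry process described in this abstract can be sketched as a small simulation. This is an illustrative reconstruction, not the paper's model: the function name, the parameters `m_rand` and `m_search`, and the two-node seed network are all assumptions made for the sketch.

```python
import random

def grow_network(n: int, m_rand: int, m_search: int, seed: int = 0) -> dict:
    """Sketch: each entering node links to m_rand uniformly random existing
    nodes, then to m_search nodes found by searching the neighborhoods of
    those randomly met nodes (parameter names are illustrative)."""
    rng = random.Random(seed)
    adj = {0: {1}, 1: {0}}  # assumed two-node seed network
    for new in range(2, n):
        met = rng.sample(range(new), min(m_rand, new))
        # search step: candidates are neighbors of the randomly met nodes
        candidates = set().union(*(adj[v] for v in met)) - set(met)
        searched = rng.sample(sorted(candidates), min(m_search, len(candidates)))
        adj[new] = set(met) | set(searched)
        for v in adj[new]:  # links are undirected
            adj[v].add(new)
    return adj

net = grow_network(200, 1, 1)
degrees = sorted(len(nbrs) for nbrs in net.values())
```

Varying the ratio of `m_rand` to `m_search` changes the shape of the degree distribution, which is the quantity the paper fits to observed networks.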
https://resolver.caltech.edu/CaltechAUTHORS:20170808-154249780
DOI: 10.7907/b28c4-0z245
We introduce and characterize axiomatically a general model of static choice under uncertainty, which is possibly the weakest model in which a separation of cardinal utility and a representation of beliefs is achieved. Most of the popular non-expected utility models in the literature are special cases of it.
To prove its usefulness, we show that the model can be used to generalize several well-known results on the characterization of risk aversion. Elsewhere, we have shown that it can be fruitfully applied to the problem of characterizing a notion of ambiguity aversion, as the separation of utility and beliefs that we achieve can be used to identify and remove aspects of risk attitude from the decision maker's behavior.https://resolver.caltech.edu/CaltechAUTHORS:20170808-154249780Existence of Equilibrium in Auctions and Discontinuous Bayesian Games: Endogenous and Incentive Compatible Sharing Rules
https://resolver.caltech.edu/CaltechAUTHORS:20170808-170732405
DOI: 10.7907/86a06-jka38
We consider discontinuous games with incomplete information. Auctions are a leading example. With standard tie breaking rules (or more generally, sharing rules), these games may not have equilibria. We consider sharing rules that depend on the private information of players. We show that there exists an equilibrium of an augmented game with an incentive compatible sharing rule in which players reveal their private information for the purpose of determining sharing. We also show that for a large class of private value auctions, ties never occur in the equilibrium of the augmented game. This establishes existence of equilibria in such auctions with standard tie breaking rules.https://resolver.caltech.edu/CaltechAUTHORS:20170808-170732405Trade rules for uncleared markets
https://resolver.caltech.edu/CaltechAUTHORS:20170808-143010615
DOI: 10.7907/tre32-rz293
We analyze markets in which the price of a traded commodity is such that the supply and the demand are unequal. Under standard assumptions, the agents then have single-peaked preferences on their consumption or production choices. For such markets, we propose a class of Uniform Trade rules, each of which determines the volume of trade as the median of total demand, total supply, and an exogenous constant. Then these rules allocate this volume "uniformly" on either side of the market. We evaluate these "trade rules" on the basis of some standard axioms in the literature. We show that they uniquely satisfy Pareto optimality, strategy-proofness, no-envy, and an informational simplicity axiom that we introduce. We also analyze the implications of anonymity, renegotiation-proofness, and voluntary trade on this domain.https://resolver.caltech.edu/CaltechAUTHORS:20170808-143010615Citizen Candidates Under Uncertainty
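The volume-determination step of the Uniform Trade rules described above can be written down directly. This is a sketch of that one step only; the "uniform" allocation of the volume across individual traders is omitted, and the example numbers are arbitrary.

```python
from statistics import median

def uniform_trade_volume(total_demand: float, total_supply: float,
                         constant: float) -> float:
    """Volume of trade: the median of total demand, total supply,
    and an exogenous constant, as in the abstract above."""
    return median([total_demand, total_supply, constant])

# The volume is capped by the short side whenever the constant is extreme:
print(uniform_trade_volume(10.0, 6.0, 0.0))    # 6.0
print(uniform_trade_volume(10.0, 6.0, 100.0))  # 10.0
print(uniform_trade_volume(10.0, 6.0, 8.0))    # 8.0
```

Taking the median means the exogenous constant only matters when it lies between total demand and total supply; otherwise the rule selects one of the two market sides.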
https://resolver.caltech.edu/CaltechAUTHORS:20170808-142212101
DOI: 10.7907/qyp6e-3b423
In this paper we make two contributions to the growing literature on "citizen-candidate" models of representative democracy. First, we add uncertainty about the total vote count. We show that in a society with a large electorate, where the outcome of the election is uncertain and where winning candidates receive a large reward from holding office, there will be a two-candidate equilibrium and no equilibria with a single candidate. Second, we introduce a new concept of equilibrium, which we term "sincere-strategic," and we show that with this refinement, the two equilibrium candidates will not be too extreme: one will lean to the left and the other to the right.https://resolver.caltech.edu/CaltechAUTHORS:20170808-142212101Sophisticated EWA Learning and Strategic Teaching in Repeated Games
https://resolver.caltech.edu/CaltechAUTHORS:20170808-151402454
DOI: 10.7907/acy85-62128
Most learning models assume players are adaptive (i.e., they respond only to their own previous experience and ignore others' payoff information) and that behavior is not sensitive to the way in which players are matched. Empirical evidence suggests otherwise. In this paper, we extend our adaptive experience-weighted attraction (EWA) learning model to capture sophisticated learning and strategic teaching in repeated games.
The generalized model assumes there is a mixture of adaptive learners and sophisticated players. An adaptive learner adjusts his behavior the EWA way. A sophisticated player rationally best-responds to her forecasts of all other behaviors. A sophisticated player can be either myopic or farsighted. A farsighted player develops multiple-period rather than single-period forecasts of others' behaviors and chooses to "teach" the other players by choosing a strategy scenario that gives her the highest discounted net present value.
We estimate the model using data from p-beauty contests and repeated trust games with incomplete information. The generalized model is better than the adaptive EWA model in describing and predicting behavior. Including teaching also allows an empirical learning-based approach to reputation formation which predicts better than a quantal-response extension of the standard type-based approach.https://resolver.caltech.edu/CaltechAUTHORS:20170808-151402454Consumers Networks and Search Equilibria
https://resolver.caltech.edu/CaltechAUTHORS:20170808-154749005
DOI: 10.7907/jr1yh-fb549
We explore the effect of local information sharing among consumers on market functioning. Consumers are embedded in a consumer network; they may search non-sequentially, at a cost, for price quotations, and the information gathered is non-excludable along direct links. We first show that when search costs are low, consumers randomize between searching for one and for two price quotations (high search intensity equilibrium). Otherwise, consumers randomize between searching for one price quotation and not searching at all (low search intensity equilibrium). In both equilibria consumers search less frequently in denser networks. The main result of the paper shows that when search costs are low, the expected price and social welfare increase, while consumer surplus decreases, as the consumer network becomes denser. These results are reversed when search costs are high.https://resolver.caltech.edu/CaltechAUTHORS:20170808-154749005An Explanation of Inefficient Redistribution: Transfers Insure Cohesive Groups
https://resolver.caltech.edu/CaltechAUTHORS:20170808-141549284
DOI: 10.7907/px2t2-tw114
Redistributive policies often sustain inefficient economic sectors. Economists routinely argue that governments should let the sectors collapse, and compensate the affected agents. We explain why governments may instead prefer the inefficient redistribution. If income shocks in a given sector are more correlated than in the rest of the economy, and redistribution is related to individuals' income, then by sustaining a sector, the government is also providing its agents with insurance. The agents would lose this insurance if they relocate to another sector. Government transfers to sectors with correlated incomes are therefore worth more than their monetary value. A preliminary analysis of the publicly-available data suggests that indeed agents in sectors that receive transfers are subject to more correlated income shocks. Our results imply that buying out inefficient sectors may not be the second-best policy when agents cannot fully insure themselves (markets are incomplete).https://resolver.caltech.edu/CaltechAUTHORS:20170801-162754834EWA Learning in Bilateral Call Markets
https://resolver.caltech.edu/CaltechAUTHORS:20170801-162754834
DOI: 10.7907/96ycp-0c965
This chapter extends the EWA learning model to bilateral call market games (also known as the "sealed-bid mechanism" in two-person bargaining). In these games, a buyer and seller independently draw private values from commonly-known distributions and submit bids. If the buyer's bid is above the seller's, they trade at the midpoint of the two bids; otherwise they don't trade. We apply EWA by assuming that players have value-dependent bidding strategies, and they partially generalize experience from one value/cost condition to another in response to the incentives from nonlinear optimal bid functions. The same learning model can be applied to other market institutions where subjects economize on learning by taking into consideration similarity between past experience and a new environment while still recognizing the difference in market incentives between them. The chapter also presents a new application of EWA to a "continental divide" coordination game, and reviews 32 earlier studies comparing EWA, reinforcement, and belief learning. The application shows the advantages of a generalized adaptive model of behavior that includes elements of reinforcement, belief-based and direction learning as special cases at some cost of complexity for the benefit of generality and psychological appeal. It is a good foundation to build upon to extend our understanding of adaptive behavior in more general games and market institutions. In future work, we should investigate the similarity parameters, ψ and ω, to better characterize their magnitude and significance in different market institutions.https://resolver.caltech.edu/CaltechAUTHORS:20170801-162754834Tacit Collusion in Auctions and Conditions for Its Facilitation and Prevention: Equilibrium Selection in Laboratory Experimental Markets
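The trading rule of the bilateral call market described above (trade at the midpoint when the buyer's bid is above the seller's) takes only a few lines. This sketches the stage-game trading rule, not the EWA learning model itself; the treatment of exact ties is an assumption, since the abstract only says "above".

```python
from typing import Optional

def call_market_trade(buyer_bid: float, seller_bid: float) -> Optional[float]:
    """Return the trade price, or None if no trade occurs.

    Trade happens at the midpoint of the two bids when the buyer's bid is
    strictly above the seller's (tie handling is an assumption)."""
    if buyer_bid > seller_bid:
        return (buyer_bid + seller_bid) / 2.0
    return None

print(call_market_trade(8.0, 6.0))  # 7.0
print(call_market_trade(5.0, 6.0))  # None
```

Because the trade price splits the difference between the two bids, each side has an incentive to shade its bid away from its true value, which is what makes the optimal bid functions nonlinear in the players' private values.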
https://resolver.caltech.edu/CaltechAUTHORS:20170808-160459422
DOI: 10.7907/8zxqw-8ar20
The paper studies bidder behavior in simultaneous, continuous, ascending price auctions. The purpose is to create (possibly extreme) conditions under which tacit collusion develops quickly, naturally, and reliably; to study models of its development; and then to study institutional and environmental remedies that would cause it to evolve into competitive behavior. Special environments were implemented with the purpose of creating good conditions for the development of tacit collusion. The special environments were based on a type of public, symmetrically "folded" preferences together with what we call "item-aligned" preferences. Once tacit collusion developed, remedies were implemented and the success of the remedies in promoting competitive behavior was studied.
The results are as follows. (1) The environmental conditions do foster tacit collusion. (2) The tacit collusion corresponds to the unique buyer Pareto Equilibrium of a game-theoretic model of the auction process. (3) Once tacit collusion developed, it proved remarkably robust to institutional changes that weakened it as an equilibrium of the game-theoretic model. (4) The only remedy that was clearly successful was a nonpublic change in the preferences of participants that destroyed the symmetrically "folded" and "item-aligned" patterns of preferences, creating head-to-head competition between two agents reminiscent of the concept of a "maverick".https://resolver.caltech.edu/CaltechAUTHORS:20170808-160459422Voluntary Implementation
https://resolver.caltech.edu/CaltechAUTHORS:20170808-165154639
DOI: 10.7907/q2nj0-z2j43
We examine Nash implementation when individuals cannot be forced to accept the outcome of a mechanism. Two approaches are studied. The first approach is static where a state-contingent participation constraint defines an implicit mapping from rejected outcomes into outcomes that are individually rational. We call this voluntary implementation and show that the constrained Walrasian correspondence is not voluntarily implementable. The second approach is dynamic where a mechanism is replayed if the outcome at any stage is vetoed by one of the agents. We call this stationary implementation and show that if players discount the future in any way, then the constrained Walrasian correspondence is stationarily implementable.https://resolver.caltech.edu/CaltechAUTHORS:20170808-165154639Asymmetries in Exchange Behavior Incorrectly Interpreted as Evidence of Prospect Theory
https://resolver.caltech.edu/CaltechAUTHORS:20170808-145011964
DOI: 10.7907/2g5fg-jhx98
Systematic asymmetries in exchange behavior have been widely interpreted as support for endowment effect theory, according to which loss aversion associated with ownership creates an asymmetry in valuations and exchange behavior. According to this theory, which is an application of prospect theory, parting with an endowed object produces a loss that is greater than the gain from acquiring another object of otherwise equal value. The results also have been cited as general support for prospect theory, of which loss aversion is a fundamental component. The experiments reported here suggest that such interpretations of observed exchange asymmetries are incorrect. While exchange asymmetries are readily observable, the data reported here support the claim that measurements of preferences are confounded by experimental procedures. In other words, the data suggest that experimental procedures lead to observed exchange asymmetries. In treatments for which endowment effect theory would predict exchange asymmetries, we observe no asymmetries when we eliminate alternative explanations related to procedures. Therefore, our results do not support the use of observed exchange asymmetries as evidence of loss aversion, endowment effect theory, or prospect theory, and call into question normative legal analyses grounded in these theories of decision making.https://resolver.caltech.edu/CaltechAUTHORS:20170808-145011964Electoral Competition with Entry
https://resolver.caltech.edu/CaltechAUTHORS:20170808-161419643
DOI: 10.7907/h7879-q9e10
By extending the established theoretical models of electoral competition with entry (e.g., Palfrey (1984)) to incorporate simultaneous competition for multiple districts, I produce a unique two-party equilibrium under plurality rule with non-centrist party platforms. This equilibrium also precludes entry of additional parties. This result is used to provide a domain for which Duverger's Law could be expected to apply. I also present new results under the run-off rule for both the single-district and multiple-district frameworks. In the single-district case, I find that for the run-off rule the model is more consistent with empirical observation than it is for the plurality rule, but that this performance is reversed when we consider multiple districts. The paper also sheds some light on how the different levels of elections in the U.S. and other systems relate to each other.https://resolver.caltech.edu/CaltechAUTHORS:20170808-161419643Proper Scoring Rules for General Decision Models
https://resolver.caltech.edu/CaltechAUTHORS:20170808-144357937
DOI: 10.7907/g3pys-vyw44
On the domain of Choquet expected utility preferences with risk neutral lottery evaluation and totally monotone capacities, we demonstrate that proper scoring rules do not exist. This implies the non-existence of proper scoring rules for any larger class of preferences (CEU with convex capacities, multiple priors). We also show that if an agent whose behavior conforms to the multiple priors model is faced with a scoring rule for a subjective expected utility agent, she will always announce a probability belonging to her set of priors; moreover, for any prior in the set, there exists such a scoring rule inducing the agent to announce that prior.https://resolver.caltech.edu/CaltechAUTHORS:20170808-144357937On Estimating the Mean In Bayesian Factor Analysis
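For contrast with the non-existence result above, it may help to recall why proper scoring rules do exist for subjective expected utility agents. The quadratic (Brier) rule is the standard example: with true belief p, the expected score of announcing q is maximized exactly at q = p. The numerical check below is illustrative only and is not taken from the paper.

```python
def expected_brier_score(p: float, q: float) -> float:
    """Expected Brier payoff of announcing q when the event has probability p."""
    score_if_event = 1.0 - (1.0 - q) ** 2   # payoff when the event occurs
    score_if_not = 1.0 - q ** 2             # payoff when it does not
    return p * score_if_event + (1.0 - p) * score_if_not

p = 0.3
best_q = max((q / 100 for q in range(101)),
             key=lambda q: expected_brier_score(p, q))
print(best_q)  # 0.3
```

The expected score is a concave quadratic in q with its peak at q = p, so truthful announcement is uniquely optimal; it is exactly this property that fails on the Choquet and multiple-priors domains studied in the paper.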
https://resolver.caltech.edu/CaltechAUTHORS:20170808-134717128
DOI: 10.7907/2dddb-wgm70
In the Bayesian factor analysis model (Press & Shigemasu, 1989), the sample size was assumed to be large enough to estimate the overall population mean by the sample mean. In this paper, the procedure of estimating the population mean by the sample mean is compared to estimating it along with the other parameters both by Gibbs sampling and Iterated Conditional Modes. Results show that even in small samples, the Gibbs sampling and iterated conditional modes estimates of the mean are for practical purposes identical to the sample mean. Thus, the population mean is adequately estimated by its sample value.https://resolver.caltech.edu/CaltechAUTHORS:20170808-134717128A Smart Market Solution to a Class of Back-Haul Transportation Problems: Concept and Experimental Testbeds
https://resolver.caltech.edu/CaltechAUTHORS:20170808-162232402
DOI: 10.7907/4g9yz-ft534
Back-haul problems occur in many areas of transportation. One-way rental of equipment often takes it from an area of high demand to an area of low demand. Examples include container rentals, car rentals and other cases of mobile equipment. The problem is to return the equipment to the location of need. Typically this is viewed as an administrative and scheduling problem. The approach taken here is "decentralized": a specially designed market organizes competition and information to minimize the cost of back-hauls without the direct intervention of administrative negotiations or command-and-control types of scheduling. Laboratory experimental methods are employed to test the concept and explore its limitations.https://resolver.caltech.edu/CaltechAUTHORS:20170808-162232402Economic, Political, and Legal Factors in Financial System Development: International Patterns in Historical Perspective
https://resolver.caltech.edu/CaltechAUTHORS:20170808-145405888
DOI: 10.7907/aq75t-8sd40
Financial systems are often described either as bank-based, universal, and relational or as market-based, specialized, and arm's-length; and for many years academics and policymakers have debated the relative merits of these different types of systems. This paper inquires into the underlying causes of financial system structure and development. Older theories held that financial institutions developed in relation to the economy's level of development. Newer work has brought political and legal factors to the fore: hypothesizing specific relationships between banking structure and state centralization and between financial development and legal tradition. This study classifies countries by type of financial system, and in doing so indicates that few banking systems fit the extreme paradigms of universal-relationship or specialized arm's-length banking. On the other hand, despite several cases of temporary upheaval, and recent widespread movement toward conglomeration, banking system structure has remained remarkably stable over the last 100 to 150 years.
Economic factors in the late nineteenth century provide relatively strong explanatory power for financial system development, market orientation, and banking structure on the eve of World War I and in the present day. Banking specialization and market orientation appear strongly associated with legal tradition, though it seems more likely that the three characteristics are jointly determined or that the legal system variable simply proxies for a close or historical tie to the exporter of many political-economic institutions, England. Legal orientation exerted little impact on financial institution growth at the turn of the century and provides no consistent prediction of real economic growth rates over the past 150 years. Finally, political structure relates significantly to market orientation but not to banking system design or legal tradition. Nonetheless, many individual country histories make it clear that political forces played important roles in shaping regulations that in turn altered the course of financial institutions and markets. The results here simply suggest that these political forces appeared inconsistently and had no traceable, uniform relationship to the overall political system in place in the nineteenth century.
The results underscore two principal themes: the weight of history in determining the growth and design of financial institutions and markets, and the importance of idiosyncratic forces that buffet institutions over time. Despite obvious connections among political, legal, economic, and financial institutions, robust, long-term, causal relationships often prove to be elusive.https://resolver.caltech.edu/CaltechAUTHORS:20170808-145405888Vote Buying
https://resolver.caltech.edu/CaltechAUTHORS:20170809-115042573
DOI: 10.7907/qzfzg-ttj55
We examine the consequences of vote buying, assuming this practice were allowed and free of stigma. Two parties competing in a binary election may purchase votes in a sequential bidding game via up-front binding payments and/or campaign promises (platforms) that are contingent upon the outcome of the election. We analyze the role of the parties' budget constraints and voter preferences. For instance, if only campaign promises are allowed, then the winning party depends not only on the relative size of the budgets, but also on the excess support of the party with the a priori majority, where the excess support is measured in terms of the (minimal) total utility of supporting voters who are in excess of the majority needed to win. If up-front vote buying is permitted, and voters care directly about how they vote (as a legislator would), then the determination of the winning party depends on a weighted comparison of the two parties' budgets plus half of the total utility of their supporting voters. These results suggest that vote buying can lead to an inefficient party winning in equilibrium. We find that under some circumstances, if parties' budgets are raised through donations, then vote buying can be efficient. Finally, we provide some results on vote buying in the face of uncertainty.https://resolver.caltech.edu/CaltechAUTHORS:20170809-115042573Objective Subjective Probabilities
https://resolver.caltech.edu/CaltechAUTHORS:20170808-151011915
DOI: 10.7907/vtw3e-k9969
This note shows that if the space of events is sufficiently rich and the subjective probability measure of each individual is non-atomic, then there is a σ-algebra of events on which everyone will have the same probability, and moreover, the range of these probabilities is the entire segment [0, 1].https://resolver.caltech.edu/CaltechAUTHORS:20170808-151011915The Kariba Case Study
https://resolver.caltech.edu/CaltechAUTHORS:20170808-151727675
DOI: 10.7907/5zrmp-vzn65
The Kariba Dam, completed during the second half of the 1950s, was the first mainstream dam built on the Zambezi River. Its construction was partially financed by the largest loan that the World Bank had given up until that time. Considered a successful project on the basis of cost-benefit analysis, even by affected people, Kariba also involved unacceptable environmental and social impacts. The involuntary resettlement of 57,000 people within the reservoir basin and immediately downstream from the dam was responsible for serious environmental degradation, which was one of a number of factors that left a majority of those resettled impoverished. Other factors included inadequate institutional capacity, inadequate opportunities, adverse rural-urban terms of trade, the war for Zimbabwe's independence, and the bankruptcy of the political economy of Zambia.
Built as a single-purpose hydro project, Kariba drastically altered, and regularized, the Zambezi's natural regime. That adversely affected the flood-recession agriculture of Zambian villagers living below the dam as well as the size and biodiversity of the Zambezi delta and the productivity of Mozambique's offshore fishery. Failure to properly draw down the Kariba and Cahora Bassa reservoirs prior to increased rainfall during the 1999-2000 and 2000-2001 rainy seasons caused significant downstream loss of life, crops, and village and urban infrastructure in February-March 2000 and 2001.https://resolver.caltech.edu/CaltechAUTHORS:20170808-151727675Incorporating Prior Knowledge Regarding the Mean in Bayesian Factor Analysis
https://resolver.caltech.edu/CaltechAUTHORS:20170808-134254087
DOI: 10.7907/3mbxp-d2q74
In the Bayesian factor analysis model (Press & Shigemasu, 1989), available knowledge regarding the model parameters is incorporated in the form of prior distributions. This has the added consequence of eliminating the ambiguity of rotation found in the traditional factor analysis model. In the model presented by Press and Shigemasu, a vague prior distribution was implicitly specified for the population mean. The sample size was assumed to be large enough to estimate the overall population mean by the sample mean. In this paper, available prior knowledge regarding the population mean is incorporated into the inferences in the form of a prior distribution. The population mean is estimated along with the other parameters by both Gibbs sampling and Iterated Conditional Modes.https://resolver.caltech.edu/CaltechAUTHORS:20170808-134254087Finite Perfect Information Extensive Games with Generic Payoffs
https://resolver.caltech.edu/CaltechAUTHORS:20170808-134943416
DOI: 10.7907/veeev-b3b51
In finite perfect information extensive (FPIE) games, backward induction (BI) gives rise to all pure-strategy subgame perfect Nash equilibria, and iterative elimination of weakly dominated strategies (IEWDS) may give different outcomes for different orders of elimination. Duggan recently posed several conjectures in an effort to better understand the relationship between BI and IEWDS in FPIE games. One conjecture states that the unique BI strategy profile in FPIE games with generic payoffs is guaranteed to survive IEWDS when all weakly dominated strategies are eliminated at every round. This paper exhibits a counterexample to this conjecture.https://resolver.caltech.edu/CaltechAUTHORS:20170808-134943416Quantiles and Medians
https://resolver.caltech.edu/CaltechAUTHORS:20170808-161550003
DOI: 10.7907/7azxy-rj342
We provide a list of functional equations that characterize quantile functions for collections of bounded and measurable functions. Our central axiom is ordinal covariance. When a probability measure is exogenously given, we characterize quantiles with respect to that measure through monotonicity with respect to stochastic dominance. When none is given, we characterize those functions which are simply ordinally covariant and monotonic as quantiles with respect to capacities; and we also find an additional condition for finite probability spaces that allows us to represent the capacity as a probability measure. Additionally requiring that a function be covariant under its negation results in a generalized notion of median. Finally, we show that all of our theorems continue to hold under the weaker notion of covariance under increasing, concave transformations. Applications to the theory of ranking infinite utility streams and to the theory of risk measurement are provided.https://resolver.caltech.edu/CaltechAUTHORS:20170808-161550003Consistent representative democracy
https://resolver.caltech.edu/CaltechAUTHORS:20170809-093454389
DOI: 10.7907/ee0ee-33s90
We study axioms which define "representative democracy" in an environment in which agents vote over a finite set of alternatives. We focus on a property that states that whether votes are aggregated directly or indirectly makes no difference. We call this property representative consistency. Representative consistency formalizes the idea that a voting rule should be immune to gerrymandering. We characterize the class of rules satisfying unanimity, anonymity, and representative consistency. We call these rules "partial priority rules." A partial priority rule can be interpreted as a rule in which each agent can "veto" certain alternatives. We investigate the implications of adding other axioms to the list specified above. We also study the partial priority rules in the context of specific economic models.https://resolver.caltech.edu/CaltechAUTHORS:20170809-093454389An axiomatic theory of political representation
https://resolver.caltech.edu/CaltechAUTHORS:20170809-092845210
DOI: 10.7907/vd421-sjs02
We discuss the theory of voting rules which are immune to gerrymandering. Our approach is axiomatic. We show that any rule that is unanimous, anonymous, and representative consistent must decide a social alternative as a function of the proportions of agents voting for each alternative, and must either be independent of this proportion, or be in one-to-one correspondence with the proportions. In an extended model in which voters can vote over elements of the unit interval, we introduce and characterize the quasi-proportional rules based on unanimity, anonymity, representative consistency, strict monotonicity, and continuity. We show that we can always (pointwise) approximate a single-member district quota rule with a quasi-proportional rule. We also establish that upon weakening strict monotonicity, the generalized target rules emerge.https://resolver.caltech.edu/CaltechAUTHORS:20170809-092845210Global Instability in Experimental General Equilibrium: The Scarf Example
https://resolver.caltech.edu/CaltechAUTHORS:20170808-153221745
DOI: 10.7907/pvhqy-8kz29
Scarf (1960) proposed a market environment and a model of dynamic adjustment in which the standard tatonnement price adjustment process orbits around, rather than converges to, the competitive equilibrium. Hirota (1981) characterized the price paths by the configuration of endowments. We explore the predictions of Scarf's model in a nontatonnement experimental double auction. We find that the average transaction prices in each period do follow the path predicted by the Scarf and Hirota models. When the model predicts prices will converge to the competitive equilibrium, our data converge; when the model predicts prices will orbit, our data orbit the equilibrium, and in the direction predicted by the model. Moreover, we observe a weak tendency for prices within a period to follow the path predicted by the model.https://resolver.caltech.edu/CaltechAUTHORS:20170808-153221745Policy Uncertainty, Electoral Securities and Redistribution
https://resolver.caltech.edu/CaltechAUTHORS:20170808-150121475
DOI: 10.7907/xk7az-rvd11
This paper investigates how uncertainty about the adoption of a redistribution policy affects political support for redistribution when individuals can trade policy contingent securities in the stock market. We show that the demand for redistribution is always smaller than in the case where no "policy-insurance market" is available. Consistent with the empirical evidence, our analysis implies that in economies with well-developed financial markets the level of redistribution decreases with the level of participation in these markets and with income inequality. We show that the existence of a policy insurance market may increase future expected inequality even if a majority of individuals are redistributing resources through private transfers.https://resolver.caltech.edu/CaltechAUTHORS:20170808-150121475Regular Quantal Response Equilibrium
https://resolver.caltech.edu/CaltechAUTHORS:20170809-091351332
DOI: 10.7907/jex2e-1q546
The structural Quantal Response Equilibrium (QRE) generalizes the Nash equilibrium by augmenting payoffs with random elements that are not removed in some limit. This approach has been widely used both as a theoretical framework to study comparative statics of games and as an econometric framework to analyze experimental and field data. The framework of structural QRE is flexible: it can be applied to arbitrary infinite games and incorporate very general error structures. Restrictions on the error structure are needed, however, to place testable restrictions on the data (Haile et al., 2004). This paper proposes a reduced-form approach, based on quantal response functions that replace the best-response functions underlying the Nash equilibrium. We define a regular QRE as a fixed point of quantal response functions that satisfies four axioms: continuity, interiority, responsiveness, and monotonicity. We show that these conditions are not vacuous and demonstrate with an example that they imply economically sensible restrictions on data consistent with laboratory observations. The reduced-form approach allows for a richer set of regular quantal response functions, which has proven useful for estimation purposes.https://resolver.caltech.edu/CaltechAUTHORS:20170809-091351332Throwing Out the Baby with the Bath Water: A Comment on Green, Yoon and Kim
https://resolver.caltech.edu/CaltechAUTHORS:20170731-170111519
DOI: 10.7907/mkbra-4vn16
[No abstract]https://resolver.caltech.edu/CaltechAUTHORS:20170731-170111519A Solution to Matching with Preferences over Colleagues
https://resolver.caltech.edu/CaltechAUTHORS:20170808-153108432
DOI: 10.7907/vtc5z-bnj02
We study many-to-one matchings, such as the assignment of students to colleges, where the students have preferences over the other students who would attend the same college. It is well known that the core of this model may be empty, without strong assumptions on agents' preferences. We introduce a method that finds all core matchings, if any exist. The method requires no assumptions on preferences. Our method also finds certain partial solutions that may be useful when the core is empty.https://resolver.caltech.edu/CaltechAUTHORS:20170808-153108432Experimental Testbedding of a Pollution Trading System: Southern California's RECLAIM Emissions Market
https://resolver.caltech.edu/CaltechAUTHORS:20170808-140630427
DOI: 10.7907/txk4w-c9032
[No abstract]https://resolver.caltech.edu/CaltechAUTHORS:20170808-140630427Post-Stratification without Population Level Information on the Post-Stratifying Variable, with Application to Political Polling
https://resolver.caltech.edu/CaltechAUTHORS:20170808-143149807
DOI: 10.7907/814rt-46j78
We investigate the construction of more precise estimates of a collection of population means using information about a related variable in the context of repeated sample surveys. The method is illustrated using poll results concerning presidential approval rating (our related variable is political party identification). We use post-stratification to construct these improved estimates, but since we do not have population-level information on the post-stratifying variable, we construct a model for the manner in which the post-stratifier develops over time. In this manner, we obtain more precise estimates without making possibly untenable assumptions about the dynamics of our variable of interest, the presidential approval rating.https://resolver.caltech.edu/CaltechAUTHORS:20170808-143149807Strategic analysis in complex networks with local externalities
https://resolver.caltech.edu/CaltechAUTHORS:20170808-155344395
DOI: 10.7907/xr8ja-nh893
In this paper, we discuss a model with local positive externalities on a complex random network that allows for wide heterogeneities among the agents. The situation can be analyzed as a game of incomplete information where each player's connectivity is her type. We focus on three paradigmatic cases in which the overall degree distribution is Poisson, exponential, and scale-free (given by a power law). For each of them, we characterize the equilibria and obtain interesting insights on the interplay between network topology and payoffs. For example, we reach the somewhat paradoxical conclusion that a broad degree distribution and/or too low a cost of effort render it difficult, if not impossible, to sustain an (efficient) high-effort configuration at equilibrium.https://resolver.caltech.edu/CaltechAUTHORS:20170808-155344395The Approximation of Efficient Public Good Mechanisms by Simple Voting Schemes
https://resolver.caltech.edu/CaltechAUTHORS:20170808-142250579
DOI: 10.7907/v811f-scj07
This paper compares the performance of simple voting rules, called referenda, to the performance of interim efficient mechanisms for the provision of a public good. In a referendum, voters simply vote for or against the provision of the public good, and production of the public good depends on whether or not the number of yes votes exceeds a prespecified threshold. Costs are shared equally. We show that for any interim efficient allocation rule, there exists a corresponding referendum that yields approximately the same total welfare when there are many individuals. Moreover, if there is a common value component to the voters' preferences, then there is a unique approximating referendum.https://resolver.caltech.edu/CaltechAUTHORS:20170808-142250579Market Stability: Backward Bending Supply in a Laboratory Experimental Market
https://resolver.caltech.edu/CaltechAUTHORS:20170810-150248660
DOI: 10.7907/jtj4g-2xr09
The paper investigates the stability properties of markets with backward-bending supply curves. Parameters are chosen so that the two classic models of price dynamics, the Walrasian model and the Marshallian model, give opposite predictions. The results are: (1) market instability can be observed; (2) in the backward-bending case stability is captured by the Walrasian model and the Marshallian model of dynamics is rejected. Previous experiments have demonstrated that the Marshallian model works in the forward-falling case. Thus, which theory of dynamics is appropriate for a market depends upon the underlying reasons for demand and supply shapes.https://resolver.caltech.edu/CaltechAUTHORS:20170810-150248660Collusion in private value ascending price auctions
https://resolver.caltech.edu/CaltechAUTHORS:20170810-155523315
DOI: 10.7907/aq3gx-3at57
We investigate bidder collusion in one-sided ascending price auctions without communication. If bidding rules in an English-type auction allow bidders to match each other's bids, collusion can be sustained as a Nash equilibrium of a one-shot auction game. Our earlier experiments show that in common value auctions with complete information, collusion does occur and is sustainable even when bidders cannot explicitly coordinate their strategies. In this study, we investigate the robustness of bidders' collusive behavior in private values, private information environments. We find that collusion still occurs as long as the bidders' gains from collusion are high.https://resolver.caltech.edu/CaltechAUTHORS:20170810-155523315Experience-Weighted Attraction Learning in Sender-Receiver Signaling Games
https://resolver.caltech.edu/CaltechAUTHORS:20170810-160858318
DOI: 10.7907/t3pzy-0et43
Recent experiments have indicated that it is possible to systematically lead subjects to less refined equilibria in signaling games. In this paper, we seek to understand the process by which this occurs using Camerer and Ho's Experience Weighted Attraction (EWA) model of learning in games. We first adapt the model to extensive-form signaling games by specifying that senders update the chosen message for both the realized and unrealized type, but do not update the unchosen message. We test this model against the choice reinforcement and belief-based special cases of EWA; the latter is of particular interest because it formalizes the story about convergence to less refined equilibria offered by Brandts and Holt. We also test a variety of models which update unchosen messages. We find that while the Brandts-Holt story captures the direction of switching from one strategy to another, it does not do a good job at capturing the rate at which the switching occurs. EWA does quite well at predicting the rate of switching, and is slightly bettered by the unchosen message models, which all perform equally well.https://resolver.caltech.edu/CaltechAUTHORS:20170810-160858318Equilibrium Equivalence with J Candidates and N Voters
https://resolver.caltech.edu/CaltechAUTHORS:20170810-144525092
DOI: 10.7907/c6dk5-phj88
In this paper, we examine the incentives facing candidates in the spatial voting model. We assume that voters' types are independent, but allow for nonidentical distributions across voters. Examining candidate positional equilibria as a function of voter behavior, we find that what we term p-symmetric strict p-local equilibria when candidates maximize expected plurality are also strict p-local equilibria when candidates maximize probability of victory. This result holds for arbitrary numbers of candidates and voters. We also show that, for generic type distributions, interior p-asymmetric equilibria under maximization of expected vote share are not equilibria under maximization of probability of victory.https://resolver.caltech.edu/CaltechAUTHORS:20170810-144525092Statistical Analysis of the Additive and Multiplicative Hypotheses of Multiple Exposure Synergy for Cohort and Case-Control Studies
https://resolver.caltech.edu/CaltechAUTHORS:20170810-152247165
DOI: 10.7907/x82vj-kw233
This paper considers hypotheses tests for synergistic relationships in epidemiological studies. Two hypotheses are considered. First, I develop tests of the additive hypothesis which states that the combined risk from two sources of exposure is the sum of each risk taken separately. I then develop tests for the hypothesis that a multiplicative relationship exists for the risks, i.e., that the combined risk is consistent with the multiplication of the individual risks. Following standard practice in epidemiological studies I consider tests for both case-referent and cohort (standardized mortality rate) type studies.https://resolver.caltech.edu/CaltechAUTHORS:20170810-152247165Spatial Competition with Three Firms: An Experimental Study
https://resolver.caltech.edu/CaltechAUTHORS:20170810-162532912
DOI: 10.7907/datyq-h1843
The paper reports the results of an experimental study of the three firm location problem. We compare the subjects' behavior in the experiments with the symmetric mixed strategy Nash equilibrium calculated by Shaked (1982). Overall, the findings are consistent with the equilibrium prediction. However, the subjects' locations were significantly more dispersed than predicted by the theory. Three alternative explanations of this phenomenon - inexperience, approximate equilibrium behavior and risk aversion - are suggested and evaluated for their predictive power. Special attention is paid to risk aversion.https://resolver.caltech.edu/CaltechAUTHORS:20170810-162532912Regulation, Taxation, and the Development of the German Universal Banking System, 1884-1913
https://resolver.caltech.edu/CaltechAUTHORS:20170808-145300983
DOI: 10.7907/6v6nr-e5p31
Previous researchers argue that the legal and regulatory environment helped shape the German financial system in the nineteenth and early-twentieth centuries, with particular emphasis on the damaging effects of the stock-exchange law of 1896. This paper finds that the stock exchange law of 1896 exerted little measurable impact on the growth and concentration of the universal banking system or on the business turnover of universal banks relative to securities markets. The paper also shows that the English commercial banking sector and the German universal banking sector underwent similar movements toward concentration between 1884 and 1920 (both accelerating after 1912), despite no corresponding regulatory changes in England, further suggesting that consolidation of universal banking resulted from factors other than the 1896 law.https://resolver.caltech.edu/CaltechAUTHORS:20170808-145300983Is the Sleeping Giant Awakening? Latinos and California Politics in the 1990's
https://resolver.caltech.edu/CaltechAUTHORS:20170810-154207193
DOI: 10.7907/f4t52-keg72
[No abstract]https://resolver.caltech.edu/CaltechAUTHORS:20170810-154207193I Know What You Did Last Quarter: Economic Forecasts of Professional Forecasters
https://resolver.caltech.edu/CaltechAUTHORS:20170810-145105143
DOI: 10.7907/6fsfe-rfc97
In this paper we examine data from the Survey of Professional Forecasters. We study nominal GDP, the unemployment rate, the Treasury bill rate, and the implicit price deflator beginning with the first quarter of 1992. Forecasts for a single time period appear several times in consecutive surveys. We study the revision of forecasts for fixed points in time. We find that the forecasts were not unbiased, but they were biased in directions one would expect, ex post. Revisions of expectations depend strongly on the most recently observed one-step forecast errors. For most series, lagged innovations do not enter the regression equations significantly, and constant terms are not significantly different from zero. Most forecasters seem to be using information on several series in their forecasts.https://resolver.caltech.edu/CaltechAUTHORS:20170810-145105143A Theory of Voting in Large Elections
https://resolver.caltech.edu/CaltechAUTHORS:20170810-163456027
DOI: 10.7907/5pq1n-gsc34
This paper provides a game-theoretic model of probabilistic voting and then examines the incentives faced by candidates in a spatial model of elections. In our model, voters' strategies form a Quantal Response Equilibrium (QRE), which merges strategic voting and probabilistic behavior. We first show that a QRE in the voting game exists for all elections with a finite number of candidates, and then proceed to show that, with enough voters and the addition of a regularity condition on voters' utilities, a Nash equilibrium profile of platforms exists when candidates seek to maximize their expected margin of victory. This equilibrium (1) consists of all candidates converging to the policy that maximizes the expected sum of voters' utilities, (2) exists even when voters can abstain, and (3) is unique when there are only two candidates.https://resolver.caltech.edu/CaltechAUTHORS:20170810-163456027Keeping an Eye on Your Neighbors: Agents Monitoring and Sanctioning One Another In a Common-Pool Resource Environment
https://resolver.caltech.edu/CaltechAUTHORS:20170810-133619486
DOI: 10.7907/h1wcf-ctc47
The role of a specific institution in avoiding a "tragedy of the commons" in a common-pool resource environment is studied experimentally. The resource users privately decide their own exploitation level and then, once the group outcome is revealed, can choose to select other individuals for inspection. At a cost, the inspector can view the decision of any individual. If the inspected individual has exploited the resource excessively, relative to a publicly known amount, a fine is imposed and paid to the inspector. The rules, called Carte di Regola, were modeled after an historical case of self-governed rural communities. The impact of the rules is a dramatic increase in efficiency over the no-rule case, but still less than 100%. As part of an attempt to understand the nature of the impact of the rules, the paper focuses on models of individual agents' choices. The patterns of results relative to the classical Nash model are similar to other experiments: the model does well, except that contributions to the public good are less than the Nash equilibrium amount. However, when the environment is changed to allow sanctions, contributions above the Nash equilibrium are observed. This paradoxical "flip" in behavior is explained by a nonclassical model in which spite plays a role in preferences and in which agents are heterogeneous. The model of asymmetric, other-regarding agents also does relatively well in predicting patterns of individual choices, such as the choice to inspect and sanction others. Efficiency improves with the strength of sanctions, and the patterns of individual choices are consistent with the nonclassical model.https://resolver.caltech.edu/CaltechAUTHORS:20170810-133619486Quantal Response Equilibrium and Overbidding in Private-Value Auctions
https://resolver.caltech.edu/CaltechAUTHORS:20170810-132644522
DOI: 10.7907/7m89d-44815
This paper reports the results of a private-values auction experiment in which expected costs of deviating from the Nash equilibrium bidding function are asymmetric, with the implication that upward deviations will be more likely in one treatment than in the other. Overbidding is observed in both treatments, but is more prevalent in the treatment where the costs of overbidding are lower. We specify and estimate a noisy (quantal response) model of equilibrium behavior. Estimated noise and risk aversion parameters are highly significant and consistent across treatments. The resulting two-parameter model tracks both the average bids and the distribution of bids remarkably well. Alternative explanations of overbidding are also considered. The estimates of parameters from a nonlinear probability weighting function yield a formulation that is essentially equivalent to risk aversion in this context. A model in which players experience a "joy of winning" provides a reasonable fit of the data but does significantly worse than the risk aversion model.https://resolver.caltech.edu/CaltechAUTHORS:20170810-132644522Why Did Proposition 227 Pass?
https://resolver.caltech.edu/CaltechAUTHORS:20170810-160250001
DOI: 10.7907/7aa2q-gbq89
[No abstract]https://resolver.caltech.edu/CaltechAUTHORS:20170810-160250001Platonic Mechanism Design
https://resolver.caltech.edu/CaltechAUTHORS:20170811-140442145
DOI: 10.7907/pn7sd-8v583
We characterize the class of tiered exchange functions for an assignment problem. We examine a model with a finite number of indivisible goods to be assigned to a finite number of individuals with status quo endowments. However, these individuals can be partitioned into tiers, and new axioms of social justice are developed to account for this tiering.https://resolver.caltech.edu/CaltechAUTHORS:20170811-140442145Repeated Implementation
https://resolver.caltech.edu/CaltechAUTHORS:20170811-163520727
DOI: 10.7907/b7sbx-5k980
In the traditional static implementation literature it is often impossible for implementors to enforce their optimal outcomes. When the choice is restricted to dominant-strategy implementation, only the dictatorial choices of one of the participants are implementable.
Repeated implementation problems are drastically different. This paper provides a strong implementation "folk theorem": for patient implementors, every outcome function they care about is dominant-strategy implementable.https://resolver.caltech.edu/CaltechAUTHORS:20170811-163520727Status Quo Bias in Bargaining: An extension of the Myerson Satterthwaite Theorem with an application to the Coase Theorem
https://resolver.caltech.edu/CaltechAUTHORS:20170811-153048325
DOI: 10.7907/fykjr-x5418
We use a generalized version of the Myerson-Satterthwaite theorem to study inefficiencies in bilateral bargaining over trade of an indivisible good, where there is two-sided private information on the valuations. We show that when preferences are convex and quasi-linear, and when the private information represents the magnitude of the utility gain or loss and follows a uniform distribution, the most efficient mechanism always exhibits a bias towards the status quo. In the case that utility functions are quadratic in the amount traded, we prove that for any incentive compatible direct mechanism, there is an expected bias towards the disagreement point. In other words, for the class of preferences we study, there is a strategic advantage to property rights in the Coase bargaining setup in the presence of incomplete information.https://resolver.caltech.edu/CaltechAUTHORS:20170811-153048325Citizenship and Political Representation in Contemporary California
https://resolver.caltech.edu/CaltechAUTHORS:20170811-144010317
DOI: 10.7907/q2n3t-eks05
[No abstract]https://resolver.caltech.edu/CaltechAUTHORS:20170811-144010317The econometrics and behavioral economics of escalation of commitment: A re-examination of Staw & Hoang's NBA data
https://resolver.caltech.edu/CaltechAUTHORS:20170811-142521643
DOI: 10.7907/t4j0a-rh635
We examine the phenomenon of escalation from an economist's perspective, emphasizing explanations which do not rule out rational behavior on the part of firms or agents. We argue that escalation cannot be established as a separate phenomenon unless these possible alternative explanations are properly accounted for. We present Staw and Hoang's (1995) study of NBA data as an instance in which evidence of escalation might be overturned upon more careful analysis. After performing several tests of our alternative explanations, we find that evidence of escalation persists, although it is weaker both in duration and magnitude.https://resolver.caltech.edu/CaltechAUTHORS:20170811-142521643Uncertainty and Candidate Personality Traits
https://resolver.caltech.edu/CaltechAUTHORS:20170811-143206034
DOI: 10.7907/h81hw-0pv50
[No abstract]https://resolver.caltech.edu/CaltechAUTHORS:20170811-143206034Gender and Tax
https://resolver.caltech.edu/CaltechAUTHORS:20170811-141048047
DOI: 10.2139/ssrn.172985
It is well known that there is a gender gap in American politics: that men and women in the aggregate vote differently for presidential candidates, for example. The precise determinants of the gap are less well known. Using existing data, mainly 1996 general election exit polls, this article explores the gender gap in relation to tax issues. It finds that while men and women have broadly similar attitudes or "primary preferences" about tax questions, the weighing of tax as an issue -- the "secondary preferences" -- differs, with men attaching more importance to tax as an issue than women. This result suggests, inter alia, that framing of political issues matters, and that a successful candidate may appeal differentially to each gender on the basis of different policy issues.https://resolver.caltech.edu/CaltechAUTHORS:20170811-141048047Elections and the Media: The Supply Side
https://resolver.caltech.edu/CaltechAUTHORS:20170811-145436321
DOI: 10.7907/g1h9c-77d46
This paper examines the role of the media in modern elections. In particular, the media industry is included in a traditional spatial voting framework. Consumers are modeled as random utility maximizers, and predictions are obtained, including an incumbency/celebrity advantage, emphasis of news concerning front-runners and unknown candidates, higher levels of coverage of volatile issues, higher levels of horse-race coverage in heterogeneous electorates, and lower levels of issue coverage in competitive news markets. The paper also includes a brief look at media coverage of Perot's candidacies in 1992 and 1996, illustrating how the model can be used "out of the box."https://resolver.caltech.edu/CaltechAUTHORS:20170811-145436321Information Cascades: Replication and an Extension to Majority Rule and Conformity Rewarding Institutions
https://resolver.caltech.edu/CaltechAUTHORS:20170811-134629257
DOI: 10.7907/dpd8c-fkn46
In a randomly determined order, each agent was given an independent, private signal about which of two states was selected by a random draw. After receiving the private signal, each agent made a publicly announced decision about the state. Thus, at the time of personal decision, each agent had a private signal and also knew the decisions of all preceding agents. The experiments focused on three different types of organization. (1) Agents were rewarded according to whether their announced decision was right or wrong. This "individualistic institution" is the one studied by Anderson and Holt (AER, 1997). Their discovery of information cascades is replicated. (2) Agents were rewarded according to whether a majority of announced decisions was right or wrong. Under this "majority rule institution" the incidence of information cascades is sharply reduced. (3) Agents were rewarded more for announcing a decision that matched the majority decision than for announcing a correct decision. This "conformity rewarding institution" is motivated by proceedings in which there is an incentive to produce reports that conform to the reports of others. Substantial information cascades are observed.https://resolver.caltech.edu/CaltechAUTHORS:20170811-134629257Financial System Structure and Industrialization: Reassessing the German Experience before World War I
https://resolver.caltech.edu/CaltechAUTHORS:20170811-163052701
DOI: 10.7907/mah9p-jwv61
Lack of both theoretical cogency and empirical evidence casts doubt on the Gerschenkronian paradigm of banking and industrial development. Social, political, and regulatory environments may shape financial systems, and institutions may persist beyond their usefulness. Central features of universal banking arose late in the German industrialization, if at all; those that did may not have stemmed from the banks' universal structure. In focusing on international differences among financial systems, traditional views on the relative benefits of universal banking may underestimate both the impact of non-institutional factors on development experiences and the similarities in the ultimate effects of disparate systems.
https://resolver.caltech.edu/CaltechAUTHORS:20170811-163052701
Coalition and Party Formation in a Legislative Voting Game
https://resolver.caltech.edu/CaltechAUTHORS:20170811-152345079
DOI: 10.7907/eyge5-sb455
We examine a legislative voting game where decisions are being made over both ideological and distributive dimensions, and legislators' preferences are separable over the two dimensions. In equilibrium legislators prefer to make proposals for the two dimensions together, rather than offering sequential proposals on the two dimensions separately. The equilibria exhibit interaction between the ideological and distributive dimensions and in any equilibrium there is a positive probability that a proposal is made and approved which excludes the median legislator (as defined over the ideological dimension), in contrast with a game where no distributive decision is being made. Moreover, in any stationary equilibrium there is more than one ideological decision that has a positive probability of being proposed and approved.
We show that legislators can gain from forming political parties, and consider examples where predictions can be made about the composition of parties. We discuss the impact of political parties on the outcome.
https://resolver.caltech.edu/CaltechAUTHORS:20170811-152345079
Measuring The Relative Impact of Issues and the Economy in Democratic Elections
https://resolver.caltech.edu/CaltechAUTHORS:20170811-132613359
DOI: 10.7907/swzy5-vb688
It is generally accepted that issues and economic outcomes influence elections. In this paper we analyze the relative importance of issues and the economy in Canadian elections. We estimate a model of the 1988 and 1993 Canadian elections in which we include voter evaluations of the parties on a variety of issues, and voter evaluations of the national economy and their personal finances. We demonstrate that it is possible to compare the effects of issues and the economy on election outcomes, and we place these results in the context of the impact of issues and the economy in several other democracies. We show that even in elections where other factors are dominant, we can still see the impact of economic voting. Given the tenuous connection between the actions of elected officials and macroeconomic outcomes, we argue that voters may be giving elected officials undue leeway in their non-economic policy-making functions.
https://resolver.caltech.edu/CaltechAUTHORS:20170811-132613359
Tax Return Preparers and Tax Evasion
https://resolver.caltech.edu/CaltechAUTHORS:20170811-160210655
DOI: 10.7907/4a0gr-cvp11
The IRS has determined that the largest amount of tax evasion is associated with a relatively small percentage of returns prepared by tax practitioners. Tax practitioners can generally serve in three roles—to assist aggressive tax planning and evasion, to act as agents for the IRS and enforce the tax code, or simply to be expensive outlets for tax return preparation. Do the distributional statistics lead to the conclusion that tax practitioners cause rather than divert additional tax evasion? The purpose of this paper is to address the causal connection between return preparation choice and evasion. We find that the return characteristics of those seeking practitioners are associated with an increased opportunity for tax evasion. But our analysis also shows that tax practitioners actually lower tax evasion below what it would be if an individual had sought another means of preparation, such as self-preparation.
https://resolver.caltech.edu/CaltechAUTHORS:20170811-160210655
The Impossibility of Compromise: Convexity and Uniqueness in Decision Making under Risk and Uncertainty
https://resolver.caltech.edu/CaltechAUTHORS:20170810-164741187
DOI: 10.7907/bjsyc-wnx58
The main set of results of the paper shows that for expected utility preferences, agreement of two preferences on one (non-extreme) indifference class implies their equality. We show that, besides expected utility preferences under (objective) risk, this uniqueness property holds for subjective expected utility preferences under uncertainty, in both Anscombe-Aumann's (partially subjective) and Savage's (fully subjective) frameworks. For these two frameworks, we present an analogous result for beliefs, showing that if two decision makers agree on a likelihood indifference class, they must have identical probabilities.
The second part of the paper shows two different sets of consequences of the uniqueness results. First, we study the preference aggregation results of the type pioneered by Harsanyi. We show that under the assumptions of those results, the existence of a weak form of agreement among the agents implies that all their preferences are identical. Then, we study the models which extend expected utility to incorporate ambiguity aversion, and show that for a class of these models a very weak condition is surprisingly equivalent to expected utility maximizing behavior.
https://resolver.caltech.edu/CaltechAUTHORS:20170810-164741187
An Experimental Study of Jury Decision Rules
https://resolver.caltech.edu/CaltechAUTHORS:20170811-153843207
DOI: 10.7907/49qpb-78b68
We present experimental results on individual decisions in juries. We consider the effect of three treatment variables: the size of the jury (three or six), the number of votes needed for conviction (majority or unanimity), and jury deliberation. We find evidence of strategic voting under the unanimity rule, where the form of strategic behavior involves a bias to vote guilty to compensate for the unanimity requirement. A large fraction of jurors vote to convict even when their private information indicates the defendant is more likely to be innocent than guilty. This is roughly consistent with the game theoretic predictions of Feddersen and Pesendorfer (FP) [1998]. While individual behavior is explained well by the game theoretic model, at the level of the jury decision, there are numerous discrepancies. In particular, contrary to the FP prediction, we find that in our experiments juries convict fewer innocent defendants under unanimity rule than under majority rule. We are able to simultaneously account for the individual and group data by using Quantal Response Equilibrium to model the error.
https://resolver.caltech.edu/CaltechAUTHORS:20170811-153843207
Plurality and Probability of Victory: Some Equivalence Results
https://resolver.caltech.edu/CaltechAUTHORS:20170811-135632584
DOI: 10.7907/a421h-ndm17
This paper examines decision-making by political candidates under three different objective functions. In particular, we are interested in when the optimal strategies for expected vote share, expected plurality, and probability of victory maximizing candidates coincide in simple plurality elections. It is shown here that if voters' behavior, conditional on the policies proposed by the candidates, is identical from the candidates' perspective, and candidates are restricted to choosing pure strategies, then all three objectives lead to the same best response function when there are two candidates and abstention is not allowed. We then provide a counterexample to Hinich's claim of general asymptotic equivalence in two candidate elections without abstention in which voter types are independently, but not identically, distributed. In addition, we provide a counterexample to general best response equivalence between these objective functions in two candidate elections in which abstention is allowed, but our other assumptions are satisfied. Finally, an example of why our result cannot be immediately extended to arbitrary numbers of candidates is provided.
https://resolver.caltech.edu/CaltechAUTHORS:20170811-135632584
Bounded Rationality in Individual Decision Making
https://resolver.caltech.edu/CaltechAUTHORS:20170811-161943874
DOI: 10.7907/n0gd5-at008
My goals in this paper are: (i) to give a pithy, opinionated summary of what has been learned about bounded rationality in individual decision making from experiments in economics and psychology (drawing on my 1995 Handbook of Experimental Economics chapter); and (ii) to mention some promising new directions for research that would be included if that chapter were written today.
https://resolver.caltech.edu/CaltechAUTHORS:20170811-161943874
Corporate Capital Structure and the Influence of Universal Banks in Pre-War Germany
https://resolver.caltech.edu/CaltechAUTHORS:20170811-161003249
DOI: 10.7907/jqv94-zbc83
Information asymmetries and conflicts of interest are theorized to inflate the cost of external finance, but formal bank relationships are thought to ameliorate such problems and may even lead to excessive leverage. Bank oversight is associated with slightly higher leverage but not with greater use of bank debt. Older and cash-rich firms have lower leverage and less bank debt, suggesting that information problems affected firms' financing decisions; but bank attachment appears not to alter these patterns. The findings suggest that bank oversight had little to do with leverage decisions, particularly short-term borrowing, in the later stages of the German industrialization.
https://resolver.caltech.edu/CaltechAUTHORS:20170811-161003249
Detecting Failures of Backward Induction: Monitoring Information Search in Sequential Bargaining
https://resolver.caltech.edu/CaltechAUTHORS:20170811-144320736
DOI: 10.7907/q4x6c-tg374
We ran three-round sequential bargaining experiments in which the perfect equilibrium offer was $1.25 and an equal split was $2.50. Subjects offered $2.11 to other subjects, $1.84 to "robot" players (who are known to play subgame perfectly), and $1.22 to robots after instruction in backward induction. Measures of information search showed that subjects did not look at the amounts being divided in different rounds in the correct order, or for the length of time, necessary for backward induction, unless they were specifically instructed. The results suggest that most of the departure from perfect equilibrium is due to limited computation and some is due to fairness.
https://resolver.caltech.edu/CaltechAUTHORS:20170811-144320736
The Revolution Against Affirmative Action in California: Politics, Economics, and Proposition 209
https://resolver.caltech.edu/CaltechAUTHORS:20170811-154822248
DOI: 10.7907/gmd9m-32904
We consider two possible explanations for the appeal of Proposition 209 to California voters during the 1996 election: economic anxiety and racial division. Voter support for this proposition has been attributed to racial differences in opinion and to economic anxiety caused by poor economic conditions in the state and the perceived threat that affirmative action presented in school admissions or the workplace. Because the presidential candidates campaigned on and debated the merits of affirmative action policy during this election, we incorporate this endogeneity into our analysis.
We develop two competing hypotheses to explain voter behavior: (1) if voters are blaming affirmative action for the state's economic conditions, then voters who believe that California's economic condition is poor or who perceive that their personal financial situation is worse will be more likely to support Proposition 209; and (2) if voters are, instead, divided along more traditional racial lines on the merits of affirmative action (winners versus losers), then whites, males, Republicans, and conservatives will be more likely to support Proposition 209, and other ethnic group members, females, Democrats, and liberals will be more likely to oppose Proposition 209.
To test these hypotheses, we analyze voter exit poll data from the 1996 California election. We utilize a two-stage logit model to allow for the endogeneity of candidate endorsements. We find support for the second of our two hypotheses. These findings cause us to conclude that racial division fueled by a fear of arbitrary exclusion prompted voter support for Proposition 209.
https://resolver.caltech.edu/CaltechAUTHORS:20170811-154822248
A Bargaining Model of Collective Choice
https://resolver.caltech.edu/CaltechAUTHORS:20170810-171219202
DOI: 10.7907/0pn8v-n7v55
We present a general model of legislative bargaining in which the status quo is an arbitrary point in a multidimensional policy space. In contrast to other bargaining models, the status quo is not assumed to be "bad," and delay may be Pareto efficient. We prove existence of stationary equilibria. The possibility of equilibrium delay depends on four factors: risk aversion of the legislators, the dimensionality of the policy space, the voting rule, and the possibility of transfers across districts. If legislators are risk averse, if there is more than one policy dimension, and if voting is by majority rule, for example, then delay will almost never occur. In one dimension, delay is possible if and only if the status quo lies in the core of the voting rule, and then it is the only possible outcome. This "core selection" result yields a game-theoretic foundation for the well-known median voter theorem. Our comparative statics analysis yields two noteworthy insights: (i) if the status quo is close to the core, then equilibrium policy outcomes will also be close to the core (a moderate status quo produces moderate policy outcomes), and (ii) if legislators are patient, then equilibrium proposals will be close to the core (legislative patience leads to policy moderation).
https://resolver.caltech.edu/CaltechAUTHORS:20170810-171219202
A New and Improved Design for Multi-Object Iterative Auctions
https://resolver.caltech.edu/CaltechAUTHORS:20170810-170308555
DOI: 10.7907/wve8f-fxz97
In this paper we present a new, improved design for multi-object auctions and report on the results of tests of that design. We merge the better features of two extant but very different auction processes: the Milgrom FCC design (see Milgrom (1995)) and the Adaptive User Selection Mechanism (AUSM) of Banks et al. (1989). Then, by adding one crucial new feature, we are able to create a new design, the Resource Allocation Design (RAD) auction process, which performs better than both. We are able to demonstrate, in both simple and complex environments, that the RAD auction achieves higher efficiencies, lower bidder losses, and faster times to completion without increasing the complexity of a bidder's problem.
https://resolver.caltech.edu/CaltechAUTHORS:20170810-170308555
The Evolution of Social and Economic Networks
https://resolver.caltech.edu/CaltechAUTHORS:20170811-141737687
DOI: 10.7907/z7sw0-gx712
We examine the dynamic formation and stochastic evolution of networks connecting individuals whose payoffs from an economic or social activity depend on the network structure in place. Over time, individuals form and sever links connecting themselves to other individuals based on the improvement the resulting network offers them relative to the current network. Such a process creates a sequence of networks that we call an 'improving path'. The changes made along an improving path make the individuals who added or deleted the relevant link(s) at each date better off. Such sequences of networks can cycle, and we study conditions on underlying allocation rules that characterize cycles.
Building on an understanding of improving paths, we consider a stochastic evolutionary process where in addition to intended changes in the network there is a small probability of unintended changes or errors. Predictions can be made regarding the relative likelihood that the stochastic evolutionary process will lead to any given network at some time. The evolutionary process selects from among the statically stable networks and cycles. We show that in some cases, the evolutionary process selects inefficient networks even though efficient ones are statically stable. We apply these results to the matching literature to show that there are contexts in which the evolutionarily stable networks coincide with the core stable networks, and thus achieve efficiency.
https://resolver.caltech.edu/CaltechAUTHORS:20170811-141737687
Prospect theory in the wild: Evidence from the field
https://resolver.caltech.edu/CaltechAUTHORS:20170811-150835361
DOI: 10.7907/p3ren-j9d62
The workhorses of economic analysis are simple formal models that can explain naturally occurring phenomena. Reflecting this taste, economists often say they will incorporate more psychological ideas into economics if those ideas can parsimoniously account for field data better than standard theories do. Taking this statement seriously, this article describes 10 regularities in naturally occurring data that are anomalies for expected utility theory but can all be explained by three simple elements of prospect theory: loss aversion, reflection effects, and nonlinear weighting of probability; moreover, the assumption is made that people isolate decisions (or edit them) from others they might be grouped with (Read, Loewenstein, and Rabin 1999; cf. Thaler, 1999). I hope to show how much success has already been had applying prospect theory to field data and to inspire economists and psychologists to spend more time in the wild.
https://resolver.caltech.edu/CaltechAUTHORS:20170811-150835361
A New Approach for Modeling Strategic Voting in Multiparty Elections
https://resolver.caltech.edu/CaltechAUTHORS:20170811-154700936
DOI: 10.7907/1ea5s-rn272
Whether voters vote strategically, using their vote to best further their interests, or vote sincerely, voting for their first choice among the alternatives, is a question of long-standing interest. We offer two innovations in searching for the answer to this question. First, we begin with a more consistent model of sincere voting in multiparty democratic systems than has been presented in the literature to date. Second, we incorporate new operationalizations of the objective potential for strategic behavior, distinct from those used in the past. We offer a test of strategic voting in the 1987 British General Election based on the variance in strategic setting across constituencies in Britain. We allow voters to use available information in deciding whether or not to cast a strategic vote. We estimate a lower level of strategic voting than many other methods have estimated. We also demonstrate that the use of self-reported vote motivation causes errors in estimating the amount of strategic voting, and that this problem is exacerbated the further from the election the self-report is obtained.
https://resolver.caltech.edu/CaltechAUTHORS:20170811-154700936
Equilibria in Campaign Spending Games: Theory and Data
https://resolver.caltech.edu/CaltechAUTHORS:20170814-154504840
DOI: 10.7907/v39vq-q2c70
This paper presents a formal game-theoretic model to explain the simultaneity problem that has made it difficult to obtain unbiased estimates of the effects of both incumbent and challenger spending in U.S. House elections. The model predicts a particular form of correlation between the expected closeness of the race and the level of spending by both candidates, which implies that the simultaneity problem should not be present in close races, and should be progressively more severe in the range of safe races that are empirically observed. This is confirmed by comparing simple OLS regressions of races that are expected to be close with races that are expected not to be close, using House incumbent races spanning two decades. The theory also implies that inclusion of a variable controlling for total spending should successfully produce reliable estimates using OLS. This is confirmed.
https://resolver.caltech.edu/CaltechAUTHORS:20170814-154504840
Economics, Entitlements and Social Issues: Voter Choice in the 1996 Presidential Election
https://resolver.caltech.edu/CaltechAUTHORS:20170811-170524063
DOI: 10.7907/80v4g-r5z56
Theory: Contemporary theories of presidential election outcomes, especially the economic voting and spatial issue voting models, are used to examine voter choice in the 1996 presidential election.
Hypotheses: First, we look at the effects of voter perceptions of the national economy on voter support for Clinton. Second, we look at the effects of candidate and voter positions on ideology and on a number of issues. Last, we examine whether respondents' views on other issues—social issues such as abortion as well as issues revolving around entitlements and taxation that were emphasized by the campaigns—played significant roles in this election.
Methods: Multinomial probit analysis of the 1996 National Election Studies data; simulations of counterfactual scenarios involving different macroeconomic conditions and different candidate issue platforms.
Results: The effects of economic perceptions are much greater than the effects of voter issue positions on the election outcome. Some social issues, namely abortion, did play a role in determining the election outcome. The presence of a third centrist candidate limited the ability of other candidates to improve their vote shares by moving in the issue space.
https://resolver.caltech.edu/CaltechAUTHORS:20170811-170524063
The Universal Banks and the Mobilization of Capital in Imperial Germany
https://resolver.caltech.edu/CaltechAUTHORS:20170814-152243289
DOI: 10.7907/5m1nv-tg993
Capital mobilization represents a serious obstacle to industrialization. By stimulating savings, matching savers and investors, and offering advice to entrepreneurs, universal banks are believed to have eased such problems during the later stages of the German industrialization. Using evidence on deposit taking, branching, and banks' liability structure as well as the extent of interlocking directorates between banks and industrial companies, this paper demonstrates that the universal banks played a limited part in capital mobilization and expansion until the start of the twentieth century. The paper shows, however, that networks of the provincial banks enveloped a broad range of industrial companies.
https://resolver.caltech.edu/CaltechAUTHORS:20170814-152243289
Do Voters Learn from Presidential Election Campaigns?
https://resolver.caltech.edu/CaltechAUTHORS:20170811-165812488
DOI: 10.7907/z572j-c8444
Theory: We present a model of voter campaign learning which is based on Bayesian learning models. This model assumes voters are imperfectly informed and that they incorporate new information into their existing perceptions about candidate issue positions in a systematic manner.
Hypothesis: Additional information made available to voters about candidate issue positions during the course of a political campaign will lead voters to have more precise perceptions of the issue positions of the candidates involved.
Data and Methods: We use panel survey data from the 1976 and 1980 presidential elections, combined with content analyses of the media during these same elections. Our primary analysis is conducted using random effects panel models.
Results: We find that during each of these campaigns many voters became better informed about the positions of candidates on many issues and that these changes in voter information are directly related to the information flow during each presidential campaign.
https://resolver.caltech.edu/CaltechAUTHORS:20170811-165812488
An Experimental Study of the Effect of Private Information in the Coase Theorem
https://resolver.caltech.edu/CaltechAUTHORS:20170814-133129502
DOI: 10.7907/h25ge-vez95
This paper investigates, in an experimental setting, the effect of private information on the Coase theorem's predictions of efficiency and allocative neutrality. For a two-person bargaining game, we find significantly more inefficiency and allocative asymmetry in the case of private information compared with the case of complete information. We also find substantial bargaining breakdown, which is not predicted by the Coase theorem. For the case of private information, the Coase theorem does not predict as well as a generalized version of the Myerson-Satterthwaite theorem, which predicts inefficiency, allocative non-neutrality in the direction of the disagreement point, and some bargaining breakdown.
https://resolver.caltech.edu/CaltechAUTHORS:20170814-133129502
Bank Structure and Growth: Insights from British and German Bank Balance Sheets Before World War I
https://resolver.caltech.edu/CaltechAUTHORS:20170814-135116494
DOI: 10.7907/z53xb-kba44
Financial institutions may enhance economic growth by raising the quantity, quality (productivity), and efficiency of investment. The structure of the German system is thought by many to have amplified these beneficial effects. Orthodox paradigms hold that through direct involvement with firms, the German universal banks funneled substantial amounts of financial capital into industry and credibly committed to behaving in the long-run interest of firms. At the same time, by avoiding such engagement with industrial companies, British banks are thought to have disadvantaged that country's economy with respect to its continental counterparts. This paper uses aggregate bank balance sheet data to investigate systematic differences in the financial makeup and activities of universal and specialized banks. The paper first measures the rate of expansion and the ultimate magnitude of capital mobilized by the British and German banks. It then investigates the makeup of banks' asset portfolios and estimates the extent of direct involvement in equity ownership by the two types of banks. The findings suggest that, compared to the British banks, the German banks maintained at least as much liquidity relative to their short-term liabilities and held approximately the same (small) proportion of their assets in the form of non-government securities. Furthermore, the German banks seem to have held only a limited number of industrial equities in their portfolios and often did so merely because of insufficient markets for new issues. The findings suggest that the commonly-perceived gulf between specialized and universal banking may exaggerate the real differences in the systems' influences on economic growth and industrial development.
https://resolver.caltech.edu/CaltechAUTHORS:20170814-135116494
The Resurgence of Nativism in California? The Case of Proposition 187 and Illegal Immigration
https://resolver.caltech.edu/CaltechAUTHORS:20170811-171514638
DOI: 10.7907/ycksa-pk633
Theory: We argue that support among California voters for Proposition 187 in 1994 was an example of cyclical nativism. This nativism was provoked primarily by California's economic downturn during the early 1990s.
Hypotheses: We develop four specific hypotheses to explain how poor economic conditions in California and the consequent nativistic sentiments would result in support for Proposition 187:
1. voters who believe that California's economic condition is poor will be more likely to support Proposition 187;
2. voters who perceive themselves as being economically threatened by illegal immigrants will be more likely to support Proposition 187;
3. voters with lower levels of education are more economically vulnerable and will be more likely to support Proposition 187;
4. voters in Southern California feel more directly affected by illegal immigration and will be more likely to support Proposition 187.
Methods: To test these hypotheses, we analyze voter exit poll data from the 1994 California election. We utilize a two-stage probit model to allow for the endogeneity which results from the politicization of illegal immigration during this election.
Results: We find support for our hypotheses in the data. These findings cause us to conclude that nativism, fueled by economic conditions, was a salient factor leading many Californians to support Proposition 187.
https://resolver.caltech.edu/CaltechAUTHORS:20170811-171514638
Analysis of Crossover and Strategic Voting
https://resolver.caltech.edu/CaltechAUTHORS:20170814-132747188
DOI: 10.7907/d9np1-r2b60
We undertake the analysis of primary elections from 1980 through 1996 using academic individual-level survey data, media exit polls, and aggregate election returns on a county-by-county basis. We come to the following conclusions:
1. there is very little crossover voting in general in United States primaries;
2. the difference in the amount of crossover voting between states with open primaries and closed primaries is not substantively large;
3. the amount of strategic behavior on the part of voters is extremely small.
https://resolver.caltech.edu/CaltechAUTHORS:20170814-132747188
Beyond Ordinary Logit: Taking Time Seriously in Binary Time-Series-Cross-Section Models
https://resolver.caltech.edu/CaltechAUTHORS:20170814-133839169
DOI: 10.7907/2fmk9-nvs52
Researchers typically analyze time-series-cross-section data with a binary dependent variable (BTSCS) using ordinary logit or probit. However, BTSCS observations are likely to violate the independence assumption of the ordinary logit or probit statistical model. It is well known that if the observations are temporally related, the results of an ordinary logit or probit analysis may be misleading. In this paper, we provide a simple diagnostic for temporal dependence and a simple remedy. Our remedy is based on the idea that BTSCS data is identical to grouped duration data. This remedy does not require the BTSCS analyst to acquire any further methodological skills, and it can be easily implemented in any standard statistical software package. While our approach is suitable for any type of BTSCS data, we provide examples and applications from the field of International Relations, where BTSCS data is frequently used. We use our methodology to re-assess Oneal and Russett's (1997) findings regarding the relationship between economic interdependence, democracy, and peace. Our analyses show that (1) their finding that economic interdependence is associated with peace is an artifact of their failure to account for temporal dependence, and (2) their finding that democracy inhibits conflict is upheld even taking duration dependence into account.
https://resolver.caltech.edu/CaltechAUTHORS:20170814-133839169
Electoral Incentives, Informational Asymmetries, and the Policy Bias Toward Special Interests
https://resolver.caltech.edu/CaltechAUTHORS:20170815-135745649
DOI: 10.7907/gma71-brm17
Political decisions are often biased in favor of special interests at the expense of the general public, and they are frequently inefficient in the sense that the losses incurred by the majority exceed the gains enjoyed by the minority. This paper provides an explanation based on informational asymmetries and the free rider problem: (i) incumbents increase their chances of re-election by biasing policy toward groups that are better able to monitor their activities; and (ii) smaller groups are better able to overcome the free rider problem of costly monitoring so that policy will be biased in their favor. A welfare analysis examines the effect of asymmetric monitoring on voter welfare. The inefficiencies created by the policy bias are offset by a positively-valued selection bias: incumbents of above-average quality are more likely to survive voter scrutiny than are low-quality types.
https://resolver.caltech.edu/CaltechAUTHORS:20170815-135745649
Federalism and Central Bank Autonomy: The Politics of German Monetary Policy, 1957-1992
https://resolver.caltech.edu/CaltechAUTHORS:20170815-141352052
DOI: 10.7907/14tjf-j0m20
Two channels of political control allow elected politicians to influence monetary policy. First, central bankers may accommodate political pressures to ward off political threats to the status, the structure, or the very existence of the central bank. Second, politicians may use their powers of appointment to ensure that central bank appointees share their electoral and party-political goals. This paper derives the monetary policy outcomes obtained as a function of the degree of central bank independence (zero, partial, or full) and of central bankers' types (partisans or technocrats).
Based on a case study of the 1957 and 1992 institutional changes to the German central banking system and a regression analysis covering the 1960-1989 period, I argue that the formal autonomy of the system is protected by its embeddedness in the institutions of German federalism and by the federalist components of its decentralized organizational structure. I conclude that the behavioral autonomy of the German central bank fluctuates over time with the party control of federalist veto points. The evidence is consistent with the hypothesis that the Bundesbank is staffed with non-partisan technocrats who are partially insulated from political pressures.
On Incentives and Updating in Agent Based Models
https://resolver.caltech.edu/CaltechAUTHORS:20170814-163151313
DOI: 10.7907/833jr-w8x29
[No abstract]
The Dynamics Of Equity Prices In Fallible Markets
https://resolver.caltech.edu/CaltechAUTHORS:20170814-140850379
DOI: 10.7907/xr38e-amk77
In an efficient securities market, prices correctly reflect news about future payoffs. This paper argues that there are two aspects to correctness: (i) correct updating of beliefs from news, (ii) correct prior beliefs. Traditionally, empirical research has implicitly insisted on both. Lucas' rational expectations equilibrium theory also assumes both, explicitly. Nevertheless, rationality requires only the former, not the latter. This paper develops restrictions on the random behavior of prices of equity-like contracts when (i) is maintained, but the market may have mistaken priors about the likelihood of the bankruptcy state, in violation of (ii). The restrictions are cast in the form of familiar martingale difference results. They do not necessarily restrict returns as traditionally computed, however. Most importantly, the restrictions appear only when the empiricist deliberately imposes a selection bias. In particular, the price histories of securities that are in the money at the terminal date are to be separated from those of securities that end out of the money (i.e., in the bankruptcy state). As a result, this paper also demonstrates that something can be learned about market efficiency from samples subject to survivorship bias or the Peso problem.
Parimutuel Betting Markets as Information Aggregation Devices: Experimental Results
https://resolver.caltech.edu/CaltechAUTHORS:20170815-151804081
DOI: 10.7907/fbqg5-zzh14
[No abstract]
An Epistemic Characterization of Extensive Form Rationalizability
https://resolver.caltech.edu/CaltechAUTHORS:20170810-154039653
DOI: 10.7907/py245-kxq87
We use an extensive form, universal type space to provide the following epistemic characterisation of extensive form rationalisability. Say that player i strongly believes event E if i is certain of E conditional on each of her information sets consistent with E. Our main contribution is to show that a strategy profile s is extensive form rationalisable if and only if there is a state in which s is played and (0) everybody is rational, (1) everybody strongly believes (0), (2) everybody strongly believes (0) & (1), (3) everybody strongly believes (0) & (1) & (2), .... This result also allows us to provide sufficient epistemic conditions for the backward induction outcome and to relate extensive form rationalisability and conditional common certainty of rationality.
Delegation and the Regulation of Risk
https://resolver.caltech.edu/CaltechAUTHORS:20170815-142256366
DOI: 10.7907/n04sf-8mg45
Political principals typically use low-cost "fire-alarm" signals transmitted by the media, interest groups, and disaffected constituents to monitor the activities of regulatory agencies. We argue that regulatory decision-making is biased and inconsistent if the instruments of political oversight are simple and the information flows to the principal are coarse relative to the complexity of the regulatory environment.
The Illusion of Leadership
https://resolver.caltech.edu/CaltechAUTHORS:20170815-142759861
DOI: 10.7907/1kpw8-3yp19
This paper reports experiments which examine attributions of leadership quality due to differences in situations. Subjects played an abstract coordination game which is analogous to some organizational situations. Previous research showed that when large groups play the game, they rarely coordinate on the Pareto-optimal (efficient) outcome, but small groups almost always coordinate on a Pareto-optimal outcome. After two or three periods of playing the game, one subject, randomly selected from among the participants to be the "leader" for the experiment, was instructed to make a speech exhorting others to choose the efficient action. Based on previous studies with small and large groups, we predicted that small groups would succeed in achieving efficiency and large groups would fail. Based on research on the fundamental attribution error in social psychology, we predicted that the leaders would be credited for the success of the small groups, and blamed for the failure of the large groups. The effects of the leaders in improving coordination were, as predicted, overshadowed by the situational variable (group size). Nonetheless, there was an "illusion of leadership": subjects attributed differences in outcomes between conditions to differences in the effectiveness of leaders. In a second experiment, subjects in both large and small groups coordinated efficiently, but gave less credit to the second leader than the first.
An Aggregate Nested Logit Model of Political Participation
https://resolver.caltech.edu/CaltechAUTHORS:20170814-170702184
DOI: 10.7907/7kcmg-s2990
This paper builds an aggregate nested model of political participation. We use a three-event model of voting behavior that incorporates the registration, turnout, and ballot completion decisions. The definitions of registration, turnout, and ballot completion used in this paper reflect the nested nature of the model. The concept of a nested model allows us to determine the elasticity of participation in total and for specific events within the model. We find that African-Americans register more often than non-Latino whites, turn out less for general elections, and complete less of the ballot. We also find that minorities do not vote in equal proportions to non-minorities even in those instances where they have the greatest chance of influencing the outcome, such as in primaries and mid-term elections.
A Statistical Model for Multiparty Electoral Data
https://resolver.caltech.edu/CaltechAUTHORS:20170814-155246837
DOI: 10.7907/ktgms-6m904
We propose an internally consistent and comprehensive statistical model for analyzing multiparty, district-level aggregate election data. This model can be used to explain or predict how the geographic distribution of electoral results depends upon economic conditions, neighborhood ethnic compositions, campaign spending, and other features of the election campaign or characteristics of the aggregate areas. We also provide several new graphical representations for help in data exploration, model evaluation, and substantive interpretation.
Although the model applies more generally, we use it to help resolve an important controversy over the size of and trend in the electoral advantage of incumbency in Great Britain. Contrary to previous analyses, which are all based on measures now known to be biased, we demonstrate that the incumbency advantage is small but politically meaningful. We also find that it differs substantially across the parties: about half a percent for the Conservatives, 1% for the Labour Party, and 3% for the Liberal Party and its successors. Also contrary to previous research, we show that these effects have not grown in recent years. Finally, we are able to estimate from which party each party's incumbency advantage is predominantly drawn.
Political Confederation
https://resolver.caltech.edu/CaltechAUTHORS:20170815-144929970
DOI: 10.7907/04xcr-91r09
Using a spatial model, we compare different rules for aggregating preferences across confederated districts, under the assumption that voters have private information and face uncertainty about the distribution of preferences of other voters.
Our model includes, as special cases, systems of local representation in national assemblies and parliaments and international legislatures with representatives from member states. We show how induced preferences over the degree of centralization and the method of representation (proportional vs. equal representation of districts) vary systematically across voters and districts, depending on such factors as relative size of the districts, the number of districts, and the variance of underlying policy preferences.
We show that each voter has an ideal confederation in which representation consists solely of equal representation. These ideal confederations vary across voters, but are independent of district size. Moderate voters prefer a higher degree of centralization than extreme voters. Preference for centralization is increasing in the number of districts and decreasing in the variance of voter ideal points.
With two districts, majority rule equilibria always exist and can have some degree of proportional representation. With three or more districts, majority rule equilibrium often fails to exist. Nonexistence arises due to cycling in the two-dimensional Centralization × Representation space of confederations.
Revolutionary Finance? Capital Mobilization and Utilization in Pre-War Germany and Italy
https://resolver.caltech.edu/CaltechAUTHORS:20170814-165512165
DOI: 10.7907/zbv81-mva62
Accumulation, mobilization, and efficient utilization of capital represent serious obstacles to industrialization. Financial institutions may ease these problems by stimulating savings, matching savers and investors, and offering business or investment advice to entrepreneurs. German-style universal banks are believed by many to have played a crucial part in overcoming all three capital-related stumbling blocks. Using evidence on deposit taking and branching, banks' liability structure, interlocking directorates between banks and industrial companies, and firm-level investment patterns, the paper investigates the role of German and Italian universal banks in the mobilization and efficient utilization of capital at the turn of the last century.
Linkage Politics
https://resolver.caltech.edu/CaltechAUTHORS:20170815-133930517
DOI: 10.7907/es2p3-hcg20
I extend the basic repeated Prisoners' Dilemma to allow for the linkage of punishment strategies across issues (issue linkage) as well as decentralized third-party enforcement (player linkage). I then synthesize the concepts of issue and player linkage to develop the notion of domestic-international linkage, which allows for the linkage of trigger strategy punishments across games played over different issues by different sets of players. In a two-level game, domestic and international cooperation may be reinforced by a punishment linkage: a defection in the domestic game may trigger a breakdown of international cooperation, and vice versa. I also examine the conditions under which the incentives to cooperate are stronger at the domestic level than at the international level, and vice versa. In this case domestic-international linkage allows for the credibility surplus to spill over to the level with the credibility deficit. Finally, I provide conditions under which governments are better off delinking domestic and international issues.
The Reapportionment Revolution and Bias in U.S. Congressional Elections
https://resolver.caltech.edu/CaltechAUTHORS:20170814-144530248
DOI: 10.7907/xt8ng-zdv36
We develop a simple formal model of the redistricting process that highlights the importance of two factors: first, partisan or bipartisan control of the redistricting process; second, the nature of the reversionary outcome, should the state legislature and governor fail to agree on a new districting plan. Using this model, we derive various predictions about the levels of partisan bias and responsiveness that should be observed under districting plans adopted under various constellations of partisan control of state government and reversionary outcomes, testing our predictions on postwar (1946-70) U.S. House electoral data. We find strong evidence that both partisan control and reversionary outcomes systematically affect the nature of a redistricting plan and the subsequent elections held under it. Further, we show that the well-known disappearance circa 1966 of what had been a long-time pro-Republican bias of about 6% in nonsouthern congressional elections can be explained completely by the changing composition of northern districting plans.
Financial Factors and Investment in Belgium, France, Germany and the UK: A Comparison Using Company Panel Data
https://resolver.caltech.edu/CaltechAUTHORS:20170815-154941010
DOI: 10.7907/pwmjz-b7d36
We construct company panel datasets for manufacturing firms in Belgium, France, Germany and the UK, covering the period 1978-89. These datasets are used to estimate a range of empirical investment equations, and to investigate the role played by financial factors in each country. A robust finding is that cash flow or profits terms appear to be both statistically and quantitatively more significant in the UK than in the three continental European countries. This is consistent with the suggestion that financial constraints on investment may be relatively severe in the more market-oriented UK financial system.
Choquet Rationality
https://resolver.caltech.edu/CaltechAUTHORS:20170814-163740702
DOI: 10.7907/tm8mm-xhq38
We provide a characterization of the consequences of the assumption that a decision maker with a given utility function is Choquet rational: She maximizes expected utility, but possibly with respect to non-additive beliefs, so that her preferences are represented by Choquet expected utility (CEU).
The characterization shows that this notion of rationality in general allows one to rationalize more choices than is possible when beliefs have to be additive. More surprisingly, we find that a considerable restriction on the types of beliefs allowed does not change the set of rational actions. We then remark on the relation between the predictions of the CEU model, of a similar model (the maxmin expected utility model), and those of subjective expected utility when the risk attitude of the decision maker is not known. We close with an application of the result to the definition of a solution concept (in the spirit of rationalizability) for strategic-form games.
The Effects of Payoff Magnitude and Heterogeneity on Behavior in 2 x 2 Games with Unique Mixed Strategy Equilibria
https://resolver.caltech.edu/CaltechAUTHORS:20170815-143845609
DOI: 10.7907/95xv9-jkv06
The Logit version of Quantal Response Equilibrium predicts that equilibrium behavior in games will vary systematically with payoff magnitudes, if all other factors are held constant (including the Nash equilibria of the game). We explore this in the context of a set of asymmetric 2x2 games with unique totally mixed strategy equilibria. The data provide little support for the payoff magnitude predictions of the Logit Equilibrium model. We extend the theoretical Quantal Response Equilibrium model to allow for heterogeneity, and find that the data fit the heterogeneous version of the theory significantly better.
Collusion in Multiple Object Simultaneous Auctions: Theory and Experiments
https://resolver.caltech.edu/CaltechAUTHORS:20170814-145511145
DOI: 10.7907/c0290-p2b47
The choice of strategies by bidders who are allowed to communicate in auctions is studied. Using the tools of mechanism design, the possible outcomes of communication between bidders participating in a series of simultaneous first-price auctions are investigated. A variety of mechanisms are incentive compatible when side payments are not allowed. When attention is restricted to mechanisms which rely only on bidders' ordinal rankings of markets, incentive compatibility is characterized and the ranking mechanism of Pesendorfer (1996) is shown to be interim incentive efficient. Laboratory experiments were conducted in order to investigate the existence, stability, and effect on bidder and seller surplus of cooperative agreements in multiple object simultaneous first-price auctions. Collusive agreements were stable in the laboratory. The choices of the experimental subjects often closely match the choices predicted by the ranking and serial dictator mechanisms presented earlier. However, a few notable exceptions raise interesting prospects for the theoretical development of models of cooperative behavior.
Unknown heterogeneity, the EC-EM algorithm, and Large T Approximation
https://resolver.caltech.edu/CaltechAUTHORS:20170815-150649504
DOI: 10.7907/5brrc-5y985
We study a panel structure with n subjects/entities being observed over T periods. We consider a class of models for each subject's data generating process, and allow for unknown heterogeneity. In other words, we do not know how many types we have, what the types are, and which subjects belong to each type. We propose a large T approximation to the posterior mode on the unknowns through the Estimation/Classification (EC) algorithm of El-Gamal and Grether (1995), which is linear in n, T, and the unknown number of types. If our class of models (likelihood functions) allows for a consistent asymptotically normal estimator under the assumption of homogeneity (number of types = 1), then the estimators obtained by our EC algorithm inherit those asymptotic properties as T ↑ ∞ and then as n ↑ ∞ (with a block-diagonal covariance matrix facilitating hypothesis-testing). We then propose a large T approximation to the EM algorithm to obtain posteriors on the subject classifications and diagnostics for the goodness of the large T approximation in the EC stage. If the large T approximation does not seem to be appropriate, then we suggest the use of the more computationally costly EM algorithm, or the even more costly full Bayesian updating. We illustrate the procedure with two applications to experimental data on probability assessments within a class of Probit and a class of Tobit models.
Participation in Direct Legislation: Evidence from the Voting Booth
https://resolver.caltech.edu/CaltechAUTHORS:20170814-155144144
DOI: 10.7907/v6txk-74k62
This study considers individual voting behavior on propositions. After controlling for voter and election specific attributes, we determine the effects of proposition attributes, such as proposition position and readability, on roll-off and voter fatigue. If proposition attributes affect voting behavior and if those attributes can be influenced by supporters, including interest groups, then any such potential advantages should be ameliorated in the interest of "equal" political representation. As an example, advantages of ballot position can be minimized by modifying the linkage between the qualification sequence and the ballot sequence.
Using individual level ballot data taken from Los Angeles county, we find that the proposition position is negatively related to the probability of voting on a proposition and the probability of voting "Yes" on bonds and initiatives. We also find that reading ease is positively related to the probability of voting on a proposition and the probability of voting "Yes" on bonds and initiatives.
Expectations and Learning in Iowa
https://resolver.caltech.edu/CaltechAUTHORS:20170815-145720985
DOI: 10.7907/1jcq6-r2h20
We study the rationality of learning and the biases in expectations in the Iowa Experimental Markets. Using novel tests developed in Bossaerts [1996], learning in the Iowa winner-take-all markets is found to be in accordance with the rules of conditional probability (Bayes' law). Hence, participants correctly update their beliefs using the available information. There is evidence, however, that beliefs do not satisfy the rational expectations restriction that they reflect the factual distribution of outcomes.
Party Fragmentation and Presidential Elections in Post-Communist Democracies
https://resolver.caltech.edu/CaltechAUTHORS:20170814-142407508
DOI: 10.7907/0sgsn-vwq02
Theory: Despite its controversial status as a stable governmental form, many of today's societies attempting to make the transition to democracy have chosen or will, for a variety of reasons, choose presidentialism. Meanwhile, the evidence suggests that the combination of presidentialism and multipartism is especially dangerous for democratic stability (Mainwaring 1994). The question this essay addresses, though, is whether presidential elections themselves serve to encourage a fragmented party system, at least in the initial stages of democratization.
Hypothesis: In transitional political systems presidential elections encourage party fragmentation, but in a way different from that of highly proportional purely parliamentary mechanisms. Specifically, parties proliferate to support the presidential aspirations of political elites.
Methods: Multivariate regression analysis on cross-sectional aggregate electoral data, supported by extensive outlier diagnostics and assessments of the role of country-specific effects. A nested model is used to discriminate among the secondary hypotheses. Controls include: parliamentary election rules (district magnitude, threshold for representation, adjustment districts, ballot structure), relative timing of presidential and parliamentary elections, and basic societal cleavage structure.
Results: Using as our data source the recent experiences of Central Europe and the European part of the former Soviet Union, we show that presidential elections consistently and significantly increase party fragmentation. At the same time, the data are consistent with the hypothesis that presidentialism does encourage the overall consolidation of party systems through voters' abandonment of some parties, akin to Duverger's 'psychological effect'.
In Defense of Unanimous Jury Verdicts: Communication, Mistrials, and Sincerity
https://resolver.caltech.edu/CaltechAUTHORS:20170814-143254954
DOI: 10.7907/d3qd9-b6105
It is a widely held belief among legal theorists that the requirement of unanimous jury verdicts in criminal trials reduces the likelihood of convicting an innocent defendant. This belief is, to a large extent, dependent upon the assumption that all jurors will vote sincerely based on their own impression of the trial evidence. Recent literature, however, has drawn this assumption into question, and simple models of jury procedure have been constructed in which, except under very strict conditions, it is never a Nash equilibrium for all jurors to vote sincerely. Moreover, Nash equilibrium behavior in these models leads to higher probabilities of both convicting an innocent defendant and acquitting a guilty defendant under unanimity rule than under a wide variety of alternative voting rules, including simple majority rule. The present paper extends these models by adding minimal enhancements that we argue bring the existing models closer to actual jury procedures. In particular, we separately analyze the implications of (1) incorporating the possibility of mistrial and (2) allowing limited communication among jurors. Under each of these enhancements, we identify general conditions under which sincere voting is, in fact, a Nash equilibrium. We further demonstrate that under such sincere voting equilibria, unanimous jury verdicts perform better than any alternative voting rule in terms of minimizing probability of trial error and maximizing expected utility.
On the Emergence of Cities
https://resolver.caltech.edu/CaltechAUTHORS:20170814-162639416
DOI: 10.7907/2rhct-60240
This paper considers the formation of cities in a simple model in which the preferences of agents depend on at most two characteristics of a location: its population and its average distance to the other agents. In such a simple model it is possible to recreate phenomena such as path dependency and centrally located cities which have been generated in more sophisticated models. Moreover, an example is provided in which cities emerge in the sense that the micro level preferences of agents do not appear to favor locating near or with other agents. When nonlinear effects are included, it is possible to show that even if efficient equilibria exist, they are not likely to occur and that there may exist extreme sensitivity to initial conditions. The model suggests that the mapping from individual preferences to population distributions merits further study.
IPO Post-Issue Markets: Questionable Predilections But Diligent Learners?
https://resolver.caltech.edu/CaltechAUTHORS:20170814-141831110
DOI: 10.7907/qz81g-90134
Efficiency in the IPO (Initial Public Offering) aftermarket is tested without imposing any restrictions on the priors about potential default at the issue date. Merging Ritter's extended dataset (which covers the period 1975-84) with the CRSP tapes, IPOs are followed up to ten years after issue. Across all IPOs, or when stratifying IPOs according to issue underpricing, industry affiliation or rank of entry in an industry, little evidence against rational price behavior is found. In contrast, the market clearly over-reacts to information about the eventual fate of low-priced issues. A suggestive relationship between irrational price behavior and subsequent takeover activity is uncovered.
Experience-weighted Attraction Learning in Normal Form Games
https://resolver.caltech.edu/CaltechAUTHORS:20170814-161157311
DOI: 10.7907/4e2kf-nme71
We describe a general model, 'experience-weighted attraction' (EWA) learning, which includes reinforcement learning and a class of weighted fictitious play belief models as special cases. In EWA, strategies have attractions which reflect prior predispositions, are updated based on payoff experience, and determine choice probabilities according to some rule (e.g., logit). A key feature is a parameter δ which weights the strength of hypothetical reinforcement of strategies which were not chosen according to the payoff they would have yielded. When δ = 0, choice reinforcement results. When δ = 1, levels of reinforcement of strategies are proportional to expected payoffs given beliefs based on past history. Another key feature is the growth rates of attractions. The EWA model controls the growth rates by two decay parameters, φ and ρ, which depreciate attractions and amount of experience separately. When φ = ρ, belief-based models result; when ρ = 0, choice reinforcement results.
Using three data sets, parameter estimates of the model were calibrated on part of the data and used to predict the rest. Estimates of δ are generally around .50, φ around 1, and ρ varies from 0 to φ. Choice reinforcement models often outperform belief-based models in the calibration phase and underperform in out-of-sample validation. Both special cases are generally rejected in favor of EWA, though sometimes belief models do better. EWA is able to combine the best features of both approaches, allowing attractions to begin and grow flexibly as choice reinforcement does, but reinforcing unchosen strategies substantially as belief-based models implicitly do.
Bank Securities Holdings and Industrial Finance Before World War I: Britain and Germany Compared
https://resolver.caltech.edu/CaltechAUTHORS:20170814-153221064
DOI: 10.7907/sjzfm-bq221
Orthodox paradigms hold that through direct involvement with firms, the German universal banks funneled substantial amounts of financial capital into industry during the half century before World War I. At the same time, by avoiding such engagement with industrial companies, British banks are thought to have disadvantaged that country's economy with respect to its continental and American competitors. Using balance sheet data for the British deposit banks and the German universal banks, this paper shows that the German and British banks held approximately the same proportion of their assets in the form of non-government securities. In addition, the paper uses details garnered from two of the largest German banks to demonstrate that the universal banks became directly involved in only a few companies, that the total of these equity holdings amounted to a small share of bank assets, and that often the shares remained on the banks' books only because of an insufficient market for new issues. Thus, the idea that the German universal banks purposely took long-term stakes in industrial companies in order to credibly commit to behaving in the long-run interest of the firms finds limited support from this analysis.
What Are Judicially Manageable Standards For Redistricting? Evidence From History
https://resolver.caltech.edu/CaltechAUTHORS:20170815-165147079
DOI: 10.7907/mhby7-xsf81
In the 1960s the courts adopted population-equality measures as a means to limit gerrymandering. In recent cases, the courts have begun to use geographical compactness standards for this same purpose. In this research note, I argue that unlike population-equality measures, compactness standards are not easily manageable by the judiciary.
I use a variety of compactness measures and population-equality measures to evaluate 349 district plans, comprising the 3390 U.S. Congressional districts created between 1789 and 1913. I find that different population-equality measures, even those with poor theoretical properties, produce very similar evaluations of plans. On the other hand, different compactness measures fail to agree about the compactness of most districts and plans.
In effect, the courts can use any convenient measure of population equality and obtain similar results, while the courts' choice of compactness measures will significantly change the evaluations in each case. Since there is no generally accepted single measure of compactness, this disagreement among measures raises concerns about whether compactness is a readily operationalizable notion, to use a social scientific formulation, or a judicially manageable one, to employ terms from law.
Explaining the Gender Gap in U.S. Presidential Elections, 1980-1992
https://resolver.caltech.edu/CaltechAUTHORS:20170815-162353811
DOI: 10.7907/9absa-ab280
This paper compares the voting behavior of women and men in presidential elections since 1980 to test competing explanations for the gender gap. We show that, consistent with prior research on individual elections, women placed more emphasis on the national economy than men, and men placed more emphasis on pocketbook voting than women. We add evidence showing that women have consistently more negative assessments of the economy than do men, suggesting that a part of what has been considered a Republican-Democratic gender gap is really an anti-incumbent bias on the part of women. Our multivariate analysis demonstrates that neither differences between men's and women's preferences nor emphasis on any single issue explains the significant gender gap in vote choice; but that a combination of respondent views on the economy, social programs, military action, abortion, and ideology can consistently explain at least three-fourths of the gender gap in the 1984, 1988, and 1992 elections. We also clarify the interpretation of partisan identification in explaining the gender gap.https://resolver.caltech.edu/CaltechAUTHORS:20170815-162353811Overconfidence and Excess Entry: An Experimental Approach
https://resolver.caltech.edu/CaltechAUTHORS:20170815-170009974
DOI: 10.7907/teefk-9jr62
Psychological studies show that most people are overconfident about their own relative abilities, and unreasonably optimistic about their futures (e.g. Shelly E. Taylor and J.D. Brown, 1988; Neil D. Weinstein, 1980). When assessing their position in a distribution of peers on almost any positive trait-- like driving ability (Ola Svenson, 1981), income prospects, or longevity-- a vast majority of people say they are above average although, of course, only half can be (if the trait is symmetrically distributed).
This paper explores whether optimistic biases could plausibly and predictably influence economic behavior in one particular setting-- entry into competitive games or markets. Many empirical studies show that most new businesses fail within a few years. For example, using plant level data from the U.S. Census of Manufacturers spanning 1963-1982, Timothy Dunne et al. (1988) estimated that 61.5 percent of all entrants exited within five years and 79.6 percent exited within 10 years. Most of these exits are failures (see also Dunne et al., 1989a, 1989b; D. Shapiro and R.S. Khemani, 1987).https://resolver.caltech.edu/CaltechAUTHORS:20170815-170009974Can Asset Markets be Manipulated? A Field Experiment with Racetrack Betting
https://resolver.caltech.edu/CaltechAUTHORS:20170815-154108288
DOI: 10.7907/6z163-c7z10
To test whether naturally-occurring markets can be strategically manipulated, $500 bets were made at a large racetrack, then cancelled. The net effects of these costless bets give clues about whether market participants react to information potentially contained in large bets. While the bets moved odds on "attack" horses visibly (compared to matched-pair control horses with similar pre-bet odds), the net effect on betting was close to zero. A second study with $1000 bets at a smaller track replicated the result. These markets could not be successfully manipulated, indicating that bettors did not mistakenly infer information from the experimental bets.https://resolver.caltech.edu/CaltechAUTHORS:20170815-154108288Linearity with Multiple Priors
https://resolver.caltech.edu/CaltechAUTHORS:20170815-161726115
DOI: 10.7907/c8ke3-76j96
We characterize the types of functions over which the functional defined as the "min" of integrals with respect to probabilities in a given non-empty closed and convex class is linear. This happens exactly when "integrating" functions which are positive affine transformations of each other (or when one is constant). We show that the result is quite general by restricting the types of classes of probabilities considered. Finally we prove that, with a very peculiar exception, all the results hold more generally for functionals which are linear combinations of the "min" and the "max" functional.https://resolver.caltech.edu/CaltechAUTHORS:20170815-161726115Dynamic Efficiency and Voluntary Implementation in Markets with Repeated Pairwise Bargaining
https://resolver.caltech.edu/CaltechAUTHORS:20170810-154039652
DOI: 10.7907/t1fke-af726
We examine a simple bargaining setting, where heterogeneous buyers and sellers are repeatedly matched with each other. We begin by characterizing efficiency in such a dynamic setting, and discuss how it differs from efficiency in a centralized static setting. We then study the allocations which can result in equilibrium when the matched buyers and sellers bargain through some extensive game form. We take an implementation approach, characterizing the possible allocation rules which result as the extensive game form is varied. We are particularly concerned with the impact of making trade voluntary: imposing individual rationality on and off the equilibrium path. No buyer or seller consummates an agreement which leaves them worse off than the discounted expected value of their future rematching in the market. Finally, we compare and contrast the efficient allocations with those that could ever arise as the equilibria of some voluntary negotiation procedure.https://resolver.caltech.edu/CaltechAUTHORS:20170810-154039652Arbitrage-Based Pricing When Volatility is Stochastic
https://resolver.caltech.edu/CaltechAUTHORS:20170815-164236183
DOI: 10.7907/73szt-nd743
In one of the early attempts to model stochastic volatility, Clark [1973] conjectured that the size of asset price movements is tied to the rate at which transactions occur. To formally analyze the econometric implications, he distinguished between transaction time and calendar time. The present paper exploits Clark's strategy for a different purpose, namely, asset pricing. It studies arbitrage-based pricing in economies where: (i) trade takes place in transaction time, (ii) there is a single state variable whose transaction time price path is binomial, (iii) there are risk-free bonds with calendar-time maturities, and (iv) the relation between transaction time and calendar time is stochastic. The state variable could be interpreted in various ways. E.g., it could be the price of a share of stock, as in Black and Scholes [1973], or a factor that summarizes changes in the investment opportunity set, as in Cox, Ingersoll and Ross [1985] or one that drives changes in the term structure of interest rates (Ho and Lee [1986], Heath, Jarrow and Morton [1992]). Property (iv) generally introduces stochastic volatility in the process of the state variable when recorded in calendar time. The paper investigates the pricing of derivative securities with calendar-time maturities. The restrictions obtained in Merton [1973] using simple buy-and-hold arbitrage portfolio arguments do not necessarily obtain. Conditions are derived for all derivatives to be priced by dynamic arbitrage, i.e., for market completeness in the sense of Harrison and Pliska [1981]. A particular class of stationary economies where markets are indeed complete is characterized.https://resolver.caltech.edu/CaltechAUTHORS:20170815-164236183The Effect of Bid Withdrawal in a Multi-Object Auction
https://resolver.caltech.edu/CaltechAUTHORS:20170815-154953137
DOI: 10.7907/2m71w-g2h86
The Federal Communications Commission currently utilizes a simultaneous multi-round ascending bid auction to allocate Personal Communication Services licenses. In the auction, participants are allowed to withdraw standing bids at a penalty. The penalty is equal to the difference between the price at which the bid was withdrawn and the highest bid after the withdrawal. The withdrawal rule was added to the auction design to assist bidders wishing to assemble combinations of licenses who may find themselves stranded with an assortment of licenses for which their bids sum to more than their value. This paper reports results of experiments that examine the effect of the withdrawal rule in environments in which losses can occur if packages of licenses must be assembled piecemeal. The experiments demonstrate that there is a tradeoff with using the rule: efficiency and revenue increase, but individual losses are larger. Furthermore, the increased efficiency does not outweigh the higher prices paid so that bidder surplus falls in the presence of the withdrawal rule.https://resolver.caltech.edu/CaltechAUTHORS:20170815-154953137Iterated Dominance and Iterated Best-Response in Experimental P-Beauty Contests
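The withdrawal rule in the FCC auction abstract above reduces to a simple formula; a minimal sketch (the zero floor is an assumption, since the abstract leaves implicit what happens when the license later attracts a higher bid, and the dollar figures are illustrative):

```python
def withdrawal_penalty(withdrawn_bid: float, highest_later_bid: float) -> float:
    """Penalty = price at which the bid was withdrawn minus the highest bid
    after the withdrawal. Floored at zero here on the assumption that no
    penalty is owed when the license ultimately sells for more."""
    return max(0.0, withdrawn_bid - highest_later_bid)

# a bidder withdraws a $10M standing bid; the license later fetches only $8M
penalty = withdrawal_penalty(10_000_000, 8_000_000)  # -> 2000000.0
```

The rule makes the withdrawing bidder internalize the revenue shortfall its withdrawal imposes, which is why the experiments can trade off higher efficiency and revenue against larger individual losses.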
https://resolver.caltech.edu/CaltechAUTHORS:20170815-170549846
DOI: 10.7907/2wqmd-n6102
We study a dominance-solvable 'p-beauty contest' game in which a group of players simultaneously choose numbers from a closed interval. The winner is the player whose number is the closest to p times the average, where p ≠ 1. The numbers players choose can be taken as an indication of the number of steps of iterated reasoning about others they do. Choices in the first period show that the median number of steps of iterated reasoning is either one or two. Repeating the game produces reliable convergence to the unique Nash equilibrium. Choices in later periods are consistent with subjects' best-responding to previous choices, or iterating one step and best-responding to best responses. (Choices are not as consistent with 'learning direction theory' which embodies elements of belief-free reinforcement models). Variation in the values of p, the number of players, and whether subjects played a similar game before, all affect choices and learning.https://resolver.caltech.edu/CaltechAUTHORS:20170815-170549846Russian Federalism: Economic Reform and Political Behavior
https://resolver.caltech.edu/CaltechAUTHORS:20170815-171400624
DOI: 10.7907/vdc65-emq44
This paper explores the implications for Russian federalism of the "new" interregional economic inequality produced by the structural distortions inherited from central planning and the immaturity of Russian markets. As a result, regional preferences over federal policies are widely diverse, which makes a nation-wide consensus hard to reach. It is argued in the paper that the high correlation between political and regional divides in Russia renders the federation unstable and that the instability, in its turn, reproduces interregional disparities. These links are also illustrated by a simple equilibrium model presented in the Appendix.
When the federal legislative bodies are "uprooted", captured by extra-territorial narrowly based interest groups, and lose their accountability to the voters, regional administrations emerge as representatives of their jurisdictions on the federal scene, and decision-making at the federal level assumes the form of federal-provincial bargaining. Such bargaining entails vast efficiency losses, since the federal administration is pressed to spend its resources buying the compliance of its regional counterparts, instead of providing mandated public goods. The paper concludes that until markets integrate Russian regions, and hence locally based political interests, the country will fail to develop "market-preserving federalism" (Weingast, 1995), i.e. a federal system which stimulates economic growth and imposes self-enforcing restrictions on the counterproductive discretion of public officials.https://resolver.caltech.edu/CaltechAUTHORS:20170815-171400624The Results of Some Tests of Mechanism Designs for the Allocation and Pricing of Collections of Heterogeneous Items
https://resolver.caltech.edu/CaltechAUTHORS:20170815-163214049
DOI: 10.7907/6ehr5-jww49
During the discussion and evaluation of proposals for the design of the Federal Communications Commission (FCC) mechanism to sell the spectrum, over 130 auctions were run under controlled conditions at Caltech for the National Telecommunications and Information Administration (NTIA), the FCC and others. In this paper we look at these data and try to extract some useful findings for those who may be involved in creating future designs of similar auctions. For those whose experience with experimental economics methodology is limited, we begin with a section on the general framework within which experimental work underlying applied mechanism design is conducted. Next we cover, in section 2, the various technical pieces needed to understand the data: performance measures, economic environments, mechanisms tested, and the major issues considered. The experimental data are presented and our observations are summarized in section 3. We end, in section 4, with some thoughts for future work and with the observation that there is a huge gap between theory, scientific evidence, and practice in the design of complex auctions. Much needed research remains to be done.https://resolver.caltech.edu/CaltechAUTHORS:20170815-163214049Martingale Restrictions on Equilibrium Prices of Arrow-Debreu Securities Under Rational Expectations and Consistent Beliefs
https://resolver.caltech.edu/CaltechAUTHORS:20170816-162131145
DOI: 10.7907/de6dw-zq306
Consider the Rational Expectations price history of an Arrow-Debreu security that matures in the money: p1, p2, …, pT. Past information can be used to predict the return (pt+1 - pt)/pt. Now consider a simple alternative performance measure: (pt+1 - pt)/pt+1. It differs from the return only in that the future price is used as the basis. This variable cannot be forecasted from past information. The result obtains even if investors' beliefs are biased, i.e., prices are not set in a Rational Expectations Equilibrium (REE). It depends only on investors' using the rules of conditional probability to process information. More precisely, the result continues to hold in the Bayesian Equilibrium with Consistent Beliefs (CBE) introduced by Harsanyi [1967]. Many related results are proved in this paper and extensions to the pricing of equity subject to bankruptcy risk are discussed.https://resolver.caltech.edu/CaltechAUTHORS:20170816-154840294Labor Supply of New York City Cab Drivers: One Day at a Time
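The two performance measures in the martingale abstract above differ only in the price used as the basis, and the contrast can be seen in a quick Monte Carlo sketch. The setup below (a single binomial step in the Arrow-Debreu price, treated as the conditional probability of maturing in the money, with illustrative parameter values) is not the paper's model, just a toy instance of the same conditioning logic:

```python
import random

random.seed(0)

def simulate(n_paths=200_000, p0=0.5, delta=0.2):
    # p_t is the Arrow-Debreu price, i.e. the conditional probability that
    # the security matures in the money; one binomial step, then maturity.
    now_basis, next_basis = [], []
    for _ in range(n_paths):
        p1 = p0 + delta if random.random() < 0.5 else p0 - delta
        if random.random() < p1:                 # path matures in the money
            now_basis.append((p1 - p0) / p0)     # ordinary return
            next_basis.append((p1 - p0) / p1)    # future price as the basis
    return (sum(now_basis) / len(now_basis),
            sum(next_basis) / len(next_basis))

r_now, r_next = simulate()
# conditioning on maturing in the money gives the ordinary return a positive
# drift, while the future-price-basis measure averages to zero
```

In this toy case the conditional mean of (p1 - p0)/p1 among in-the-money paths is exactly zero by Bayes' rule, while the ordinary return drifts upward, which is the sense in which only the ordinary return is forecastable.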
https://resolver.caltech.edu/CaltechAUTHORS:20170816-154840294
DOI: 10.7907/9dtj1-nks95
Life-cycle models of labor supply predict a positive relationship between hours supplied and transitory changes in wages because such changes have virtually no effect on life-cycle wealth. Previous attempts to test this hypothesis empirically with time-series data have not been supportive; estimated elasticities are typically negative or nonsignificant. Such analyses, however, are vulnerable to measurement error and other estimation problems. We use data on daily observations of wages and hours for New York City cab drivers to estimate the supply response to transitory fluctuations in wages. Cab drivers decide daily how many hours to supply, and face wages that are positively correlated within days, but largely uncorrelated between days. Using these data, our central finding is that wage elasticities are persistently negative, from -0.5 to -1 in three different samples, even after correcting for measurement error using instrumental variables. These negative wage elasticities challenge the notion that cab drivers trade off labor and leisure at different points in time and question the empirical adequacy of life-cycle formulations of labor supply.https://resolver.caltech.edu/CaltechAUTHORS:20170816-154840294Russia's Party System: Is Russian Federalism Viable?
https://resolver.caltech.edu/CaltechAUTHORS:20170816-153137919
DOI: 10.7907/ts8j6-7gz34
Is Russia likely to develop a stable or efficient federal system that matches the definitions of federalism commonly offered in the literature or the descriptions that characterize intergovernmental relations in Germany, Switzerland, or the United States? Unfortunately, our answer to this question is NO. Unlike other discussions of federal relations in Russia—discussions that focus on current economic circumstances, federal treaties, and relations between political elites—we reach this conclusion by taking the view that the extent to which a federal state integrates the functions of different levels of government is determined largely by its political party system and the incentives for cooperation engendered by electoral politics at all levels. Assuming that Russia will continue on the path of democratic reform, we consider the types of parties that are likely to emerge in the long run as a function of Russia's current constitutional structure, current electoral arrangements for choosing a president and a national legislature, and the arrangements that structure political competition at the regional and local levels. We argue that parties in Russia will be more like those found in, say, Canada than in the United States and Germany. Russia's current electoral arrangements, in combination with the political institutional designs of its regional governments—designs that mirror the command and control systems inherited from the Soviet past and which focus power on regional governors—will continue to encourage the development of a party system that is not only highly fractured at the national level but that also fails to create adequate incentives for cooperation between levels of government. Even if the Russian economy recovers in the next few years or so and even if reformers maintain their position in Moscow, an adversarial relationship will continue to exist between regional and national governments, a relationship that will merely move the state from one crisis to the next.
We conclude with several suggestions for political reform, including simultaneous election of Duma deputies and president, increased use of elections as a method for filling regional and local public offices, and alternative methods for forming the Federation Council. However, we remain pessimistic about the prospects for a well-functioning federal system since most if not all of these suggestions are unlikely to be pursued as political reforms.https://resolver.caltech.edu/CaltechAUTHORS:20170816-153137919When Politics and Models Collide: Estimating Models of Multi-Party Elections
https://resolver.caltech.edu/CaltechAUTHORS:20170816-160342093
DOI: 10.7907/jcges-hyx21
Theory: The spatial model of elections can better be represented by using conditional logit than by multinomial logit. The spatial model, and random utility models in general, suffer from a failure to adequately consider the substitutability of candidates sharing similar or identical issue positions.
Hypotheses: Multinomial logit is not much better than successive applications of binomial logit. Conditional logit allows for considering more interesting political questions than does multinomial logit. The spatial model may not correspond to voter decision-making in multiple-candidate settings. Multinomial probit allows for a relaxation of the IIA condition and this should improve estimates of the effect of adding or removing parties.
Methods: Comparisons of binomial logit, multinomial logit, conditional logit, and multinomial probit on simulated data and survey data from a three-party election.
Results: Multinomial logit offers almost no benefits over binomial logit. Conditional logit is capable of examining movements by parties, whereas multinomial logit is not. Multinomial probit performs better than conditional logit when considering the effects of altering the set of choices available to voters.https://resolver.caltech.edu/CaltechAUTHORS:20170816-160342093Attitudes, Uncertainty and Survey Responses
https://resolver.caltech.edu/CaltechAUTHORS:20170816-140007624
DOI: 10.7907/zyt7v-hz990
Theory: We assume that survey respondents are uncertain about their attitudes, and that their attitudes about political issues can be understood as probability distributions. From this perspective, we derive the "expected value" survey response model. We also derive a dynamic model of attitude change, based on the notion that attitudes are uncertain.
Hypotheses: This perspective on political attitudes leads to two predictions. The first is that uncertain respondents will show less variance in responses than certain respondents, and that the less certain will tend to give responses towards the midpoint of issue placement scales. The second is that uncertain respondents will have less stable opinions about political issues over time.
Methods: These hypotheses are tested using new survey questions we have developed to measure respondent uncertainty. These survey questions have been included in three recent national surveys, two conducted by the Letters and Sciences Survey Center at the University of Wisconsin, Madison, and one by the National Election Studies.
Results: We demonstrate that uncertain respondents are more likely than certain respondents to provide issue placements at the midpoint of the scale, controlling for many factors. Also, we show that uncertain respondents have less stable political attitudes than certain respondents.https://resolver.caltech.edu/CaltechAUTHORS:20170816-140007624Yudin Cones and Inductive Limit Topologies
https://resolver.caltech.edu/CaltechAUTHORS:20170816-145953994
DOI: 10.7907/javtf-mk962
A cone C in a vector space has a Yudin basis {ei}i∈I if every c ∈ C can be written uniquely in the form c = Σi∈Iλiei, where λi ≥ 0 for each i ∈ I and λi = 0 for all but finitely many i. A Yudin cone is a cone with a Yudin basis. Yudin cones arise naturally since the cone generated by an arbitrary family {ei}i∈I of linearly independent vectors
C = {∑i∈Iλiei : λi ≥ 0 for each i ∈ I and λi = 0 for all but finitely many i}
is always a Yudin cone having the family {ei}i∈I as a Yudin basis. The Yudin cones possess several remarkable order and topological properties. Here is a list of some of these properties.
1. A Yudin cone C is a lattice cone in the vector subspace M = C - C that it generates.
2. A closed generating cone in a two-dimensional vector space is always a Yudin cone.
3. If the cone of a Riesz space is a Yudin cone, then the lattice operations of the space are given pointwise relative to the Yudin basis.
4. If a Riesz space has a Yudin cone, then the inductive limit topology generated by the finite dimensional subspaces is a Hausdorff order continuous locally convex-solid topology.
5. In a Riesz space with a Yudin cone the order intervals lie in finite dimensional Riesz subspaces (and so they are all compact with respect to any Hausdorff linear topology on the space).
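Property 3 above, that lattice operations are computed pointwise relative to the Yudin basis, can be illustrated concretely. Representing each element of the cone by its (unique) tuple of basis coefficients, the supremum and infimum are just coordinatewise max and min; the vectors below are illustrative:

```python
def leq(x, y):
    # cone order: x <= y iff y - x has all non-negative basis coefficients
    return all(b >= a for a, b in zip(x, y))

def sup(x, y):
    # supremum taken coefficient-by-coefficient (property 3 above)
    return tuple(max(a, b) for a, b in zip(x, y))

def inf(x, y):
    # infimum, likewise pointwise in the Yudin basis
    return tuple(min(a, b) for a, b in zip(x, y))

# two cone elements, written in Yudin-basis coordinates
x, y = (1.0, 0.0, 2.0), (0.5, 3.0, 1.0)
s, m = sup(x, y), inf(x, y)   # s = (1.0, 3.0, 2.0), m = (0.5, 0.0, 1.0)
```

The pointwise supremum is an upper bound of both elements in the cone order, and any coordinatewise-smaller upper bound would fail `leq`, which is the lattice property the list describes.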
The notion of a Yudin basis originated in studies on the optimality and efficiency of competitive securities markets in the provision of insurance for investors against risk or price uncertainty. It is a natural extension to incomplete markets of Arrow's notion of a basis for complete markets, i.e., markets where full insurance against risk can be purchased. The results obtained have immediate applications to competitive securities markets. In particular, they are sufficient for establishing the efficiency of stock markets as a means for insuring against risk or price uncertainty.https://resolver.caltech.edu/CaltechAUTHORS:20170816-145953994Portfolio Dominance and Optimality in Infinite Securities Markets
https://resolver.caltech.edu/CaltechAUTHORS:20170816-144820990
DOI: 10.7907/bsj3g-6ty02
The most natural way of ordering portfolios is by comparing their payoffs. If a portfolio has a payoff higher than the payoff of another portfolio, then it is greater than the other portfolio. This order is called the portfolio dominance order. An important property that a portfolio dominance order may have is the lattice property. It requires that the supremum and the infimum of any two portfolios are well-defined. The lattice property implies that such portfolio investment strategies as portfolio insurance or hedging an option's payoff are well-defined.
The lattice property of the portfolio dominance order plays an important role in the optimality and equilibrium analysis of markets with infinitely many securities with simple (i.e., arbitrary finite) portfolio holdings. If the portfolio dominance order is a lattice order and has a Yudin basis, then optimal portfolio allocations and equilibria in securities markets do exist. A Yudin basis constitutes a system of mutual funds of securities such that trading the mutual funds provides the same spanning opportunities as trading the individual securities, and such that the restriction of no short sales of mutual funds is equivalent to the restriction of non-negative wealth.https://resolver.caltech.edu/CaltechAUTHORS:20170816-144820990A Statistical Theory of Equilibrium in Games
https://resolver.caltech.edu/CaltechAUTHORS:20170816-165243053
DOI: 10.7907/m8276-pad75
This paper describes a statistical model of equilibrium behavior in games, which we call Quantal Response Equilibrium (QRE). The key feature of the equilibrium is that individuals do not always play best responses to the strategies of their opponents, but play better strategies with higher probability than worse strategies. We illustrate several different applications of this approach, and establish a number of theoretical properties of this equilibrium concept. We also demonstrate an equivalence between this equilibrium notion and Bayesian games derived from games of complete information with perturbed payoffs.https://resolver.caltech.edu/CaltechAUTHORS:20170816-152141772Fraud or Fiction: Who Stole What in Russia's December 1993 Elections
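The key feature named in the QRE abstract above (better strategies played with higher probability, but worse strategies still played) can be sketched with the standard logit specification of quantal response, computed by fixed-point iteration for a 2x2 game. The payoff matrices and precision parameter below are illustrative, not taken from the paper:

```python
import math

def logit_qre(U1, U2, lam, iters=500):
    """Logit quantal response equilibrium of a 2x2 game by fixed-point
    iteration. U1[i][j] is player 1's payoff when player 1 plays i and
    player 2 plays j; likewise U2. lam is the precision: lam = 0 gives
    uniform play, large lam approaches best-response behavior."""
    p = q = 0.5  # probability each player assigns to action 0
    for _ in range(iters):
        u10 = q * U1[0][0] + (1 - q) * U1[0][1]   # player 1's payoff to action 0
        u11 = q * U1[1][0] + (1 - q) * U1[1][1]
        u20 = p * U2[0][0] + (1 - p) * U2[1][0]   # player 2's payoff to action 0
        u21 = p * U2[0][1] + (1 - p) * U2[1][1]
        # logit response: better actions get higher probability,
        # but worse actions are still played with positive probability
        p = 1 / (1 + math.exp(-lam * (u10 - u11)))
        q = 1 / (1 + math.exp(-lam * (u20 - u21)))
    return p, q

# matching pennies: the logit QRE coincides with the mixed Nash (1/2, 1/2)
MP1 = [[1, -1], [-1, 1]]
MP2 = [[-1, 1], [1, -1]]
p_mp, q_mp = logit_qre(MP1, MP2, lam=1.0)

# a game where action 0 is strictly dominant for player 1 (player 2 indifferent):
# the dominated action is still played, just with lower probability
D1 = [[2, 2], [0, 0]]
D2 = [[1, 1], [1, 1]]
p_d, q_d = logit_qre(D1, D2, lam=1.0)   # p_d = 1/(1+e^-2), about 0.88
```

In the second game the noisy response leaves roughly 12% probability on the strictly dominated action at this precision, illustrating how QRE smooths best-response behavior rather than eliminating mistakes.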
https://resolver.caltech.edu/CaltechAUTHORS:20170816-152141772
DOI: 10.7907/jzgay-6aq68
Serious allegations of fraud have been made with respect to Russia's first competitive party-based parliamentary election in December 1993 - the same election in which Russians ostensibly ratified a new constitution for themselves. Although charges of fraud are common in elections, these allegations are especially serious in that the argument here was that over 9 million ballots were fraudulently cast and that the turnout threshold of 50% required to render the constitutional referendum legitimate was in fact not surpassed. These are profoundly important allegations. First, they bring into question the legitimacy of Russia's new constitution and thereby offer its opponents an excuse to suspend its provisions some time in the future. Second, they naturally enough cause us to be suspicious of Russia's December 1995 parliamentary elections. Finally, to the extent that the same methods for detecting fraud are likely to be applied to subsequent elections, if they reveal significant levels of fraud there, they can provide an excuse for canceling those elections or invalidating their results. In this essay, then, we look at the two methodologies employed to detect and measure the extent of fraud in 1993. Without disputing the possibility that fraud was in fact extensive, we conclude that neither methodology as presently developed is adequate to the task at hand. The first, which assumes that we should observe a linear relationship between the log of the rank of parties and the log of their support at the polls, employs a number of ad hoc assumptions and a priori estimates that, in sum, are equivalent to assuming the conclusion.
The second method, which looks at the relationship between turnout and the share of the electorate voting for one party or position versus another, is subject to a number of methodological pitfalls, including aggregation error and the possibility that unobserved variables correlate with both turnout and support so as to render any relationship indeterminate. Nevertheless, of the two methodologies, the second is the most promising for further development and our critique of it is intended to point the way to the requisite developments.https://resolver.caltech.edu/CaltechAUTHORS:20170816-152141772Laboratory Experimental Testbeds: Application to the PCS Auction
https://resolver.caltech.edu/CaltechAUTHORS:20170816-163211539
DOI: 10.7907/49k43-nsm83
The use of laboratory experimental methods in economics has been growing rapidly. With each application, new insights are gained about how the methodology can be used to supplement the more traditional forms of research. Such was the case with the development of the Federal Communications Commission (FCC) policy for the auction of licenses for Personal Communication Systems (PCS). At several different stages the laboratory experimental methods of economics were used. The application differed at each of these stages, representing the different types of relationships that can exist among theory, observation, and policy. This paper is a brief account of the applications.https://resolver.caltech.edu/CaltechAUTHORS:20170816-163211539Interim Efficiency in a Public Goods Problem
https://resolver.caltech.edu/CaltechAUTHORS:20170816-164222869
DOI: 10.7907/y319s-0vv88
We consider a Bayesian public goods environment with independent private valuations, where a public good can be produced at constant returns to scale, up to some capacity. We fully characterize the interim efficient allocation rules and prove that they correspond to decision rules based on a virtual cost-benefit criterion, together with the appropriate incentive taxes. Compared to the classical Lindahl-Samuelson solution there are generally distortions that depend on the welfare weights because the efficient way to reduce the tax burden on low-valuation (resp: high-valuation) types is to reduce (resp: increase) the level of provision of the public good. Second, we explore the implementation of efficient allocations by means of simple, dominant strategy voting rules, called referenda. In a referendum, individuals vote for or against production of the public good. If a sufficiently large fraction vote in favor, the good is provided at maximum capacity and costs are distributed equally across the population. Otherwise the good is not produced. We prove that for each interim efficient allocation rule there exists a referendum that approximately achieves the same total surplus in large populations. Furthermore, if there is common value uncertainty in addition to the private valuations uncertainty, then the approximately optimal referendum is unique.https://resolver.caltech.edu/CaltechAUTHORS:20170816-164222869Stable Networks
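The referendum described in the public goods abstract above reduces to a threshold rule with equal cost-sharing. A minimal sketch, with illustrative capacity, cost, and threshold values (the paper derives the optimal threshold; 0.5 here is just an example):

```python
CAPACITY = 1.0   # maximum level of the public good (illustrative)
COST = 60.0      # cost of producing at full capacity (illustrative)

def referendum(votes, threshold=0.5):
    """votes: list of 0/1 ballots, one per citizen.
    Returns (level of the good provided, tax per person): if the 'yes'
    share reaches the threshold, the good is provided at full capacity
    and the cost is split equally; otherwise nothing is produced."""
    n = len(votes)
    if sum(votes) / n >= threshold:
        return CAPACITY, COST / n
    return 0.0, 0.0

level, tax = referendum([1, 1, 0, 1])   # 3/4 vote yes -> (1.0, 15.0)
```

Because each citizen's ballot only matters when pivotal and the tax does not depend on the reported valuation, voting sincerely is a dominant strategy, which is what makes such referenda attractive as simple implementations.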
https://resolver.caltech.edu/CaltechAUTHORS:20170816-132625443
DOI: 10.7907/hkx61-6hc41
A network is a graph where the nodes represent agents and an arc exists between two nodes if the corresponding agents interact bilaterally. An exogenous value function gives the value of each network, while an allocation rule describes how the value of any graph is distributed amongst the agents. We explore the possibility of constructing allocation rules which will ensure that efficient networks of agents will form when the individual agents decide to form or sever links amongst themselves.https://resolver.caltech.edu/CaltechAUTHORS:20170816-132625443Voters Can Have Strong Incentives To Become Informed, Or To Be Strategically Ignorant
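The link-formation logic in the networks abstract above can be sketched with a simple stability check: no agent wants to sever one of its links, and no pair wants to add a missing one. This is a simplified version of pairwise stability; the paper's exact solution concept, and the payoff numbers below, are assumptions for illustration:

```python
from itertools import combinations

def stable(links, agents, payoff):
    """links: set of frozenset({i, j}) undirected links;
    payoff(links, i): agent i's allocation under the given network.
    A network is stable if no agent gains by severing one of its links and
    no pair of agents jointly gains by adding a missing link."""
    for i, j in combinations(agents, 2):
        e = frozenset((i, j))
        if e in links:
            # either endpoint severs the link if that raises its payoff
            cut = links - {e}
            if payoff(cut, i) > payoff(links, i) or payoff(cut, j) > payoff(links, j):
                return False
        else:
            # the pair adds the link if both weakly gain and one strictly gains
            add = links | {e}
            if (payoff(add, i) >= payoff(links, i) and
                    payoff(add, j) >= payoff(links, j) and
                    (payoff(add, i) > payoff(links, i) or
                     payoff(add, j) > payoff(links, j))):
                return False
    return True

# illustrative allocation rule: each link is worth 1, split equally between
# its endpoints, minus a per-endpoint maintenance cost of 0.4
def payoff(links, i):
    return sum(0.5 - 0.4 for e in links if i in e)

agents = range(3)
complete = {frozenset(p) for p in combinations(agents, 2)}
```

With these numbers each link nets 0.1 per endpoint, so the complete network is stable while the empty network is not, since any pair would gain by linking.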
https://resolver.caltech.edu/CaltechAUTHORS:20170816-141436805
DOI: 10.7907/8zp37-mcw63
Before an election, two candidates choose policies which are lotteries over election-day distributive positions. I find conditions under which there exist mixed-strategy probabilistic-voting equilibria which are independent, treating voter groups independently. When voter efforts determine the quality of their signals regarding candidate positions, voters can have strong incentives regarding their visible efforts made before candidates choose policies.
Also, scale economies in group information production can make voters prefer large groups. Even with zero information costs, however, voters can ex ante prefer ignorance to full information. Optimal ignorance emphasizes negative over positive news, and induces candidates to take stable positions.https://resolver.caltech.edu/CaltechAUTHORS:20170816-141436805Is There a Winner's Curse In The Market For Baseball Players? Evidence From The Field
https://resolver.caltech.edu/CaltechAUTHORS:20170816-143334824
DOI: 10.7907/g0knw-0jx59
A winner's curse exists in common value auctions when bidders fail to fully account for the fact that the winning bidder's valuation of the object is an upward-biased estimate of its true (unknown) value. Previous studies have reported mixed evidence for a winner's curse in oil lease auctions, corporate takeovers, auctions for failed banks, and in many experiments. In this paper we search for a winner's curse in the 1990 negotiations for free agent baseball players. Free agents are overpaid, relative to direct estimates of the revenue impact of their performance, by about 50% on average. A control sample of non-free agent players, in contrast, is overpaid by only 1%. However, proper adjustment for the winner's curse requires bidders to bid less when the variance of an object's value is higher; and salaries are lower for players with more variable previous performances. Taken together, the data show a large winner's curse, but the curse is not due to teams mistakenly paying more for high-variance players.https://resolver.caltech.edu/CaltechAUTHORS:20170816-143334824Information and American Attitudes Toward Bureaucracy
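The mechanism behind the winner's curse can be illustrated with a minimal simulation. This is a sketch under made-up parameters (a common value of 100 and Gaussian signals), not the paper's data or estimation: when bidders naively bid their own unbiased signals, the winning bid is the maximum signal and therefore overestimates the true value, and the bias grows with signal variance.

```python
import random

def simulate_winners_curse(true_value=100.0, noise_sd=20.0,
                           n_bidders=10, n_auctions=20000, seed=7):
    """Each bidder sees an unbiased signal of the common value. If everyone
    naively bids their own signal, the winning bid is the maximum signal,
    which is an upward-biased estimate of the true value."""
    rng = random.Random(seed)
    total_winning_bid = 0.0
    for _ in range(n_auctions):
        signals = [rng.gauss(true_value, noise_sd) for _ in range(n_bidders)]
        total_winning_bid += max(signals)
    return total_winning_bid / n_auctions

avg_win = simulate_winners_curse()
overpayment = avg_win - 100.0  # expected loss from naive bidding
```

Rerunning with a smaller `noise_sd` shrinks the overpayment, which is why a rational adjustment requires shading bids more heavily for high-variance objects.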
https://resolver.caltech.edu/CaltechAUTHORS:20170816-154051390
DOI: 10.7907/z7ape-pqx33
The exploration of American attitudes towards the Internal Revenue Service joins an unusual pair of research domains: public opinion and public administration. Public administration scholars contend that the hostility Americans show towards "bureaucracy" stems from the contradictory expectations Americans have for bureaucratic performance. Drawing upon a survey commissioned by the IRS and conducted in 1987 just after the passage of the Tax Reform Act, we explore attitudes towards the performance of the IRS in eight categories. Using a new heteroskedastic ordinal logit technique, we demonstrate (1) that it is overwhelmingly a single expectation of flexibility that governs attitudes towards the IRS; (2) that these expectations are not in contradiction; and (3) that domain-specific information sharply focuses respondent attitudes towards bureaucracy.https://resolver.caltech.edu/CaltechAUTHORS:20170816-154051390Input Markets Development, Property Rights, and Extra-Market Redistribution
https://resolver.caltech.edu/CaltechAUTHORS:20170816-142253388
DOI: 10.7907/cawxk-k5a80
The paper links the intensity of re-distributional activities within an economy to the availability of input markets. When an individual is faced with a choice between productive and non-productive (re-distributional) activities, the outcome heavily depends on whether this individual can match his/her personal endowment of human resources (labor, entrepreneurial talent, skills, etc.) with commensurable quantities of transferable economic inputs, which are required to complement the human resources in production technologies. If the markets for these inputs are missing or impeded, rational individuals could be forced into re-distribution, where "technologies" do not require matching inputs.
However, the development of input markets alone is not sufficient to suppress re-distributional activities. Another factor to be taken into account is the degree of protection of property rights. An equilibrium model is presented to demonstrate that if property rights are adequately protected, then the opening of input markets undermines the incentive to seek re-distributional gains. On the other hand, if property rights are poorly protected, making input markets available could further stimulate re-distribution, as the society gets richer and the rate of return to re-distributional efforts goes up.
Implications of the above observations for institutional change and economic reform are briefly discussed in conclusion.https://resolver.caltech.edu/CaltechAUTHORS:20170816-142253388Timing and Virtual Observability in Ultimatum Bargaining and Weak Link Coordination Games
https://resolver.caltech.edu/CaltechAUTHORS:20170816-134044325
DOI: 10.7907/gfg3r-ew664
Previous studies have shown that simply knowing some players move first can affect behavior in games, even when the first-movers' moves are unobservable. This observation violates the game-theoretic principle that timing of unobserved moves is irrelevant. We extend this work by varying timing of unobservable moves in ultimatum bargaining games and "weak link" coordination games. Timing without observability affects both bargaining and coordination, but only weakly. The results are consistent with theories that allow "virtual observability" of first-mover choices, rather than theories in which timing matters only because first-mover advantage is used as a principle of equilibrium selection.https://resolver.caltech.edu/CaltechAUTHORS:20170816-134044325Quantal Response Equilibria for Extensive Form Games
https://resolver.caltech.edu/CaltechAUTHORS:20170817-143555993
DOI: 10.7907/prxgj-mg253
This paper investigates the use of standard econometric models for quantal choice to study equilibria of extensive form games. Players make choices based on a quantal choice model, and assume other players do so as well. We define an Agent Quantal Response Equilibrium (AQRE), which applies QRE to the agent normal form of an extensive form game and imposes a statistical version of sequential rationality. We also define a parametric specification, called logit-AQRE, in which quantal choice probabilities are given by logit response functions.
AQRE makes predictions that contradict the invariance principle in systematic ways. We show that these predictions match up with some experimental findings by Schotter, Weigelt and Wilson (1993) about the play of games that differ only with respect to inessential transformations of the extensive form. The logit-AQRE also implies a unique selection from the set of subgame perfect equilibria in generic extensive form games. We examine data from signalling game experiments by Banks, Camerer, and Porter (1994) and Brandts and Holt (1993). We find that the logit-AQRE selection applied to these games succeeds in predicting patterns of behavior observed in these experiments, even when our prediction conflicts with more standard equilibrium refinements, such as the intuitive criterion. We also reexamine data from the McKelvey and Palfrey (1992) centipede experiment.https://resolver.caltech.edu/CaltechAUTHORS:20170817-143555993Outside Options and Social Comparison in 3-Player Ultimatum Game Experiments
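The logit response functions at the heart of the logit specification can be sketched numerically. The paper applies them to the agent normal form of extensive games; the minimal fixed-point iteration below, with made-up bimatrix payoffs, only illustrates how quantal (softmax) responses replace best responses and how the precision parameter lambda interpolates between uniform play and Nash behavior:

```python
import numpy as np

def logit_response(payoffs, lam):
    """Logit choice probabilities: softmax of lambda times expected payoffs."""
    z = np.exp(lam * (payoffs - payoffs.max()))  # subtract max for stability
    return z / z.sum()

def logit_qre(A, B, lam, iters=2000):
    """Damped fixed-point iteration for a logit quantal response equilibrium
    of the bimatrix game (A, B). At lam = 0 choices are uniform; as lam
    grows, behavior approaches a Nash equilibrium."""
    p = np.full(A.shape[0], 1.0 / A.shape[0])  # row player's mixed strategy
    q = np.full(A.shape[1], 1.0 / A.shape[1])  # column player's mixed strategy
    for _ in range(iters):
        p = 0.5 * p + 0.5 * logit_response(A @ q, lam)
        q = 0.5 * q + 0.5 * logit_response(B.T @ p, lam)
    return p, q
```

For matching pennies the iteration stays at the uniform mixture for every lambda, while in a game with a dominant row that row's probability approaches one as lambda grows.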
https://resolver.caltech.edu/CaltechAUTHORS:20170818-144943716
DOI: 10.7907/t6hjv-nrt82
We conducted ultimatum games in which a proposer offers a division of $10 to a respondent, who accepts or rejects it. If an offer is rejected, players receive a known outside option. Our proposers made simultaneous offers to two respondents, with outside options of $2 and $4. The rate of rejected offers was higher than in similar studies, around 50%, and persisted across five trials. Outside options seem to make players "egocentrically" apply different interpretations of the amount being divided, which creates persistent disagreement. And half of respondents demand more when they know other respondents are being offered more.https://resolver.caltech.edu/CaltechAUTHORS:20170818-144943716The New Republic and the New Institutionalism: Hamilton's Plan and Extra-Legislative Organization
https://resolver.caltech.edu/CaltechAUTHORS:20170817-162344965
DOI: 10.7907/k5e1w-bsb18
Recent work under the "new institutionalism" rubric has emphasized the role that institutions play in majority rule legislatures. This paper applies this focus on institutions to examine why rudimentary political parties began to form in the early sessions of the United States Congress. While Constitutional structures could have provided institutional stability to the early Congresses, empirical evidence indicates that a necessary condition underlying the operation of Constitutional stability-enhancing structures was not fulfilled. It will be argued that in order to avoid the uncertainty inherent in the institution-free first two Congresses, political entrepreneurs (especially Hamilton and Madison) began to organize rough legislative factions behind particular political-economic policies. This paper will examine in particular the progress of Hamilton's fiscal plans in the first Congresses and the legislative polarization which provided the foundation upon which the Federalist and Jeffersonian Republican parties were built.https://resolver.caltech.edu/CaltechAUTHORS:20170817-162344965Deficits, Democrats, and Distributive Benefits: Congressional Elections and the Pork Barrel in the 1980s
https://resolver.caltech.edu/CaltechAUTHORS:20170818-132928764
DOI: 10.7907/6rr57-0a981
[No abstract]https://resolver.caltech.edu/CaltechAUTHORS:20170818-132928764Two Measures of Difficulty
https://resolver.caltech.edu/CaltechAUTHORS:20170818-133754197
DOI: 10.7907/yrcx8-84j64
This paper constructs two measures of difficulty for functions defined over binary strings. The first of these measures, cover size, captures the difficulty of solving a problem in parallel. The second measure, ascent size, captures the difficulty of solving a problem sequentially. We show how these measures can help us to better understand the performance of genetic algorithms and simulated annealing, two widely used search algorithms. We also show how disparities in these two measures may shed light on the organizational structure of firms.https://resolver.caltech.edu/CaltechAUTHORS:20170818-133754197Loss Avoidance and Forward Induction in Experimental Coordination Games
https://resolver.caltech.edu/CaltechAUTHORS:20170817-160321150
DOI: 10.7907/1n74w-dk551
We report experiments on how players select among multiple Pareto-ranked equilibria in a coordination game. Subjects initially choose inefficient equilibria. Charging a fee to play (which makes initial equilibria money-losing) creates coordination on better equilibria. When fees are optional, improved coordination is consistent with forward induction. But coordination improves even when subjects must pay the fee (forward induction does not apply). Subjects appear to use a "loss-avoidance" selection principle: they expect others to avoid strategies that always result in losses. Loss-avoidance implies that "mental accounting" of outcomes can affect choices in games.https://resolver.caltech.edu/CaltechAUTHORS:20170817-160321150Are Americans Ambivalent Towards Racial Policies?
https://resolver.caltech.edu/CaltechAUTHORS:20170817-162348562
DOI: 10.7907/01vc1-y8z98
Few debates, political or academic, are as conflictual as those over racial policy. In this paper, we explore the possibility that individual attitudes are internally conflictual through the use of inferential statistical techniques that estimate variability in individual respondents' considerations about racial policy. We consider six separate core beliefs potentially relevant to racial policy choice (modern racism, anti-black stereotyping, authoritarianism, individualism, and anti-Semitism), for four different policy choices. We evaluate two separate models for the source of individual variance: conflicting values and direct effects of values. Our analysis leads us to conclude that modern racism trumps rival explanatory variables in explanations of racial policy choice, and that variability in attitudes toward racial policy is due to uncertainty, and not to ambivalence.https://resolver.caltech.edu/CaltechAUTHORS:20170817-162348562Strategyproof Allocation of a Single Object
https://resolver.caltech.edu/CaltechAUTHORS:20170817-161618129
DOI: 10.7907/3jsj7-n5z76
The problem of allocating a single indivisible object to one of several selfish agents is considered, where monetary payments are not allowed, and the object is not necessarily desirable to each agent. It is shown that ordinality and positive responsiveness together are necessary and sufficient conditions for strategyproofness, which implies that efficient social choice functions are not strategyproof. However, any Pareto-optimal, ordinal social choice function is strategyproof. A Gibbard-Satterthwaite-type impossibility result is established for nonbossy mechanisms. Thus, the best the planner can do without monetary transfers is to give the object to an agent who desires it, but whose valuation of the object may not be the highest among the agents, using a mechanism that is either dictatorial or bossy. It is also shown that all strategyproof, nonbossy, and Pareto-optimal social choice functions are serial dictatorships.https://resolver.caltech.edu/CaltechAUTHORS:20170817-161618129Relationship Banking and corporate governance in the Kaiserreich
https://resolver.caltech.edu/CaltechAUTHORS:20170817-165303443
DOI: 10.7907/cr120-h6772
This paper focuses on the institution of interlocking directorates between universal banks and industrial firms in the Kaiserreich (1871-1914) and demonstrates that such formalized relationships were unusual prior to 1900. The investigation indicates further that there was a marked increase in bank representation at firms - both in the share of firms involved in such relationships and in the number of joint directors - around the turn of the century. Finally, the work suggests a number of explanations for the pattern of bank relationships that emerges.https://resolver.caltech.edu/CaltechAUTHORS:20170817-165303443Preferences Over Solution to the Bargaining Problem
https://resolver.caltech.edu/CaltechAUTHORS:20170818-141404867
DOI: 10.7907/6xdyr-40f88
There are several solutions to the Nash bargaining problem in the literature. Since various authors have expressed preferences for one solution over another, we find it useful to study preferences over solutions in their own right. We identify two sets of appealing axioms on such preferences that lead to unanimity in the choice of solution. Thus bargainers may be able to reach agreement on which solution to employ. Under the first set of axioms, the Nash solution is preferred to any other solution, while under the second set, a new solution, which we call the weighted linear solution, is best.https://resolver.caltech.edu/CaltechAUTHORS:20170818-141404867Constitutions for New Democracies: Reflections of Turmoil or Agents of Stability?
https://resolver.caltech.edu/CaltechAUTHORS:20170818-140301070
DOI: 10.7907/3re5t-xms92
Despite the widely held view in newly emerging democracies that constitutions are mere words on paper or that parchment barriers cannot render a state stable or democratic, those who draft such documents commonly act as if words ARE of consequence. The difficulty, however, is that contemporaneous conflicts too easily intervene so as to corrupt the drafting process and to preclude optimal constitutional design. The specific principle of design most likely to be violated is the proposition that we treat all parts of the constitution as an interconnected whole and that we not try to assess the consequences of one part without appreciating the full meaning of all other parts. This essay illustrates this violation by looking at the new Russian Constitution, ratified by direct popular vote in December 1993, with special attention paid to that document's treatment of federalism. We offer the additional argument, however, that even contemporary research in political institutional design pays insufficient heed to this principle.https://resolver.caltech.edu/CaltechAUTHORS:20170818-140301070Why Did The Incumbency Advantage In U.S. House Elections Grow?
https://resolver.caltech.edu/CaltechAUTHORS:20170817-154732918
DOI: 10.7907/mztr7-cd807
In the last twenty years, scholars have scrutinized the electoral advantages conferred by incumbency - both at the federal and at the state level - more than perhaps any other factor affecting U.S. legislative elections. Much of the literature focuses on explaining why the incumbency advantage in U.S. House elections grew so substantially, starting in the mid-1960s. The dominant contenders in the literature are two: one emphasizing resources of various kinds (Mayhew 1974) and opportunities to perform constituency services (Fiorina 1977; 1989), the other emphasizing partisan dealignment (Erikson 1972; Burnham 1974; Ferejohn 1977). While not incompatible, these explanations do point to significantly different factors as key, and neither has emerged as a clear winner.
In this paper, we suggest a new approach to measuring the incumbency advantage, one that disaggregates the total value of incumbency into three components. By examining the trends over time in these three components we find evidence suggesting that much of the growth in the incumbency advantage at the federal level cannot be accounted for by resource growth; rather, some version of the dealignment story will have to be employed.https://resolver.caltech.edu/CaltechAUTHORS:20170817-154732918Reapportionment Wars: Party, Race, and Redistricting in California, 1971-1992
https://resolver.caltech.edu/CaltechAUTHORS:20170818-130903388
DOI: 10.7907/hwrbt-dr259
Understanding the impact of the redistricting of state legislative and congressional seats in California in the 1990s on ethnic minorities and the two principal political parties requires more than a narrative of events. Without explicit quantitative techniques for estimating the partisan consequences of plans that were unsuccessfully proposed, as well as those that were adopted, we cannot gauge what difference the ultimate choice of plans made or fully evaluate the intentions of the framers and opponents of each plan. Accordingly, in Social Science Working Paper 929, I developed and tested two methods of comparing the likely partisan outcomes under different reapportionment schemes. In this paper, I apply those techniques.
But neither quantitative methods nor a story that begins in 1990 tells us all we need to know. The recent past indelibly imprinted the actors in 1991 and deeply affected their behavior. Therefore, I begin in 1971 and explain how a controversy over electing a Latino to the Assembly wrecked an agreed compromise and led to a reapportionment imposed by the State Supreme Court, employing Special Masters and technicians - an experience that kindled Democratic as well as Republican hopes in 1991 that the judiciary and those whom it appointed to carry out the redistricting would not treat their party unfairly. Even more important for the combatants in the 1990s was the fact that Democratic party control of the redistricting process in the 1980s led to a bitter ten-year partisan struggle that ultimately undermined the state legislature as an institution, brought about a partisan Republican takeover of the State Supreme Court, and encouraged Republicans to torpedo all compromise on redistricting in 1991 and turn reapportionment over to the court.
In the background of these events, and helping to shape them, were actions by the Congress and the federal judiciary. The equal population and minority vote dilution cases and the evolving Voting Rights Act significantly constrained the degree of discrimination against partisan and ethnic minorities in California, as elsewhere in the nation. As Latinos and African-Americans became a more and more important part of the Democratic leadership, as well as of the Democratic voters, partisan and ethnic interests became more and more correlated, and external legal constraints, less necessary to protect minority political power - as long as Democrats controlled redistricting. When Democrats lost control, and when unelected technocrats, particularly a redistricting commission created and appointed by Gov. Pete Wilson in 1991, took over, minority interests were largely ignored. Fortunately, in 1991, the judicial precedents protecting minorities were still strong and the technocrats who drafted the plans for the State Supreme Court were both diligent and, overall, sympathetic to minority concerns. What would happen if those precedents were reversed or weakened, as some have interpreted the U.S. Supreme Court decisions in Shaw v. Reno and Miller v. Johnson as doing, depends on who retains the ultimate power to draw districts.
The three chief findings of the quantitative analysis in this paper are striking: Neither the Masters' Plan of 1973 nor the so-called "Burton Gerrymander" of 1981 was as pro-Democratic as has often been suggested, nor was the Masters' Plan of 1991 so nonpartisan. The lessons for future reapportionments in California suggested by the analysis of those from 1971 on are pessimistic. Term limits will rob the legislature of any expertise in redistricting, turning remapping entirely over to unaccountable technicians, lobbyists, and party leaders. Term limits are also likely to increase partisan strife, because a bipartisan incumbent gerrymander will become impossible. The recent California tendency to replace legislative bargaining over reapportionment with judicial fiat, the courts' habit of intervening in matters previously left to the legislature, the public's cynicism about any legislative activity, and the U.S. Supreme Court's invitation to anyone aggrieved by a remapping to file suit virtually ensure that redistricting in 2001 will be designed by, or at least exhaustively challenged in, the courts. Whether that will be good for democracy and for the rights of ethnic minorities is less certain.https://resolver.caltech.edu/CaltechAUTHORS:20170818-130903388On Independence For Non-Additive Measures, With a Fubini Theorem
https://resolver.caltech.edu/CaltechAUTHORS:20170817-153142732
DOI: 10.7907/jj5s5-fte97
Recent models of decision making represent agents' beliefs by non-additive set-functions. An important technical question which arises in applications to diverse areas of economics is how to define independence of such set-functions. After arguing that the straightforward generalization of independence does not in general yield a unique product, in this work I show that, while Fubini's theorem is in general false if additivity is not granted, it is true when a certain type of function is being integrated. For these functions the iterated integrals coincide with the integral with respect to products which satisfy a certain property, strictly stronger than independence. I show that most of the assumptions made in these results are very close to being necessary. In general the mentioned property is still not strong enough to uniquely define a product. On the other hand I discuss some proposals which have been made in the literature, and I show that uniqueness can, however, be obtained when the product is assumed to be a belief function. Moreover I show that the unique product thus obtained has an intuitive justification when the marginals are distributions induced by random correspondences. Finally I use the results in the paper to discuss the question of randomization in decision models with non-additive beliefs.https://resolver.caltech.edu/CaltechAUTHORS:20170817-153142732The Dynamics of Issue Emphasis: Campaign Strategy and Media Coverage in Statewide Races
https://resolver.caltech.edu/CaltechAUTHORS:20170818-142346383
DOI: 10.7907/g1037-44d71
[No abstract]https://resolver.caltech.edu/CaltechAUTHORS:20170818-142346383The Czechoslovak Privatization Auction: An Empirical Investigation
https://resolver.caltech.edu/CaltechAUTHORS:20170818-143118005
DOI: 10.7907/kc67g-1qp09
The 1992 Czechoslovak mass-privatization program resembled a multiround Walrasian auction with tatonnement in which participants, endowed with points, bid simultaneously for non-uniform products, i.e., shares. The creation of this artificial primary market provides economists with a unique opportunity to investigate empirically (1) the role and aims of the auctioneer in a politically-motivated giveaway scheme, (2) the price-setting mechanism, and (3) the bidding strategies and rationality of the auction's participants. Unlike more conventional auctions, price discovery was only a secondary motive to the auctioneer. The principal aim was to transfer the shares quickly to the investing public in a politically acceptable manner. We show that the price-updating rules adopted after each bidding round did achieve the auctioneer's principal aim, but they also served to inject noise. The results suggest an inherent tradeoff between socially acceptable outcomes in such auctions and efficient price discovery.https://resolver.caltech.edu/CaltechAUTHORS:20170818-143118005Estimating the Partisan Consequences of Redistricting Plans - Simply
https://resolver.caltech.edu/CaltechAUTHORS:20170818-132226552
DOI: 10.7907/s188t-ty278
Although some judges and political scientists have recently doubted that it is possible to predict the partisan consequences of redistricting plans, I demonstrate that it is simple to do so with a pair of OLS equations that regress voting percentages on major party registration percentages. I test this model on data for all California Assembly and Congressional elections from 1970 through 1992, and compare it to logit results and to more complicated equations that contain incumbency and socioeconomic variables. Since information on socioeconomic variables is often not available early in a redistricting cycle, and since incumbency in a district is often difficult to determine precisely after a reapportionment, I rely on the simplest equation, which correctly predicts 90% of the results. I show that analogous equations using registration or votes for minor or even major offices in California, North Carolina, and Texas can predict outcomes with considerable accuracy.
Using the party registration equations, I show that the so-called "Burton Gerrymander" of 1980 had minimal partisan consequences, while the "nonpartisan" plan instituted by the California Supreme Court's Special Masters in 1992 was nearly as biased in favor of the Republicans as the proposal of the Republican party, which would have ensured the GOP a majority of the Congressional seats even if Democrats won a landslide of the votes. I conclude by introducing a new graphical representation of redistricting plans, which strongly implies that in 1991, Republican and Democratic line drawers in California agreed on the registration margins necessary for party control and drew their plans with these in mind.https://resolver.caltech.edu/CaltechAUTHORS:20170818-132226552A Comparison of Political Institutions in a Tiebout Model
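The regression approach described above can be sketched in a few lines. The district data below are made up for illustration (the paper fits such equations to actual California registration and election returns), but the mechanics are the same: fit an OLS line of vote share on registration share, then classify proposed districts by whether their predicted vote share clears 50%.

```python
import numpy as np

# Hypothetical district data: Democratic registration share (%) and the
# Democratic share of the two-party vote (%). Values are synthetic.
rng = np.random.default_rng(0)
dem_reg = rng.uniform(30, 70, size=80)
dem_vote = 5.0 + 0.9 * dem_reg + rng.normal(0, 3, size=80)

# OLS: regress vote share on registration share.
X = np.column_stack([np.ones_like(dem_reg), dem_reg])
beta, *_ = np.linalg.lstsq(X, dem_vote, rcond=None)
intercept, slope = beta

def predict_dem_seat(reg_share):
    """Predict whether a district with the given Democratic registration
    share (%) would elect a Democrat under the fitted equation."""
    return intercept + slope * reg_share > 50.0
```

Applying `predict_dem_seat` to every district of a proposed plan gives the plan's predicted partisan seat split, which is the comparison the paper performs across rival plans.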
https://resolver.caltech.edu/CaltechAUTHORS:20170818-135148172
DOI: 10.7907/mkce4-2tg22
In this paper, we construct a computational model of Tiebout competition. We show that the notion that Tiebout competition, as a result of enforcing efficiency, renders institutional arrangements unimportant does not preclude the possibility that political institutions may differ in their ability to sort citizens. In particular, institutions which perform poorly given a single location may perform better when there are multiple locations because they allow for improved sorting. We demonstrate that insights from simulated annealing, a discrete nonlinear search algorithm, may explain this improvement.https://resolver.caltech.edu/CaltechAUTHORS:20170818-135148172Strategyproof and Nonbossy Assignments
https://resolver.caltech.edu/CaltechAUTHORS:20170817-155507110
DOI: 10.7907/p7bnp-hh210
We consider the assignment of heterogeneous and indivisible objects to agents without using monetary transfers, where each agent may be assigned more than one object, and the valuation of the objects to an agent may depend on what other objects the agent is assigned. The set of strategyproof, nonbossy, and Pareto-optimal social choice functions is characterized as dictatorial sequential choice functions. Thus, the consequences of a Gibbard-Satterthwaite-type result can only be escaped in this context by using bossy social choice functions. It is also established that all strategyproof, strongly nonbossy and Pareto-optimal social choice functions are serial dictatorships, where strong nonbossiness is a stricter version of nonbossiness.https://resolver.caltech.edu/CaltechAUTHORS:20170817-155507110Hamilton's Political Economy and the National Bank
https://resolver.caltech.edu/CaltechAUTHORS:20170817-163728303
DOI: 10.7907/2zxfw-ghf62
Alexander Hamilton was a major protagonist in the struggle to build a strong national government backed by the powers necessary to wield centralized power. Hamilton's political and economic writings are complementary bodies of work since they both have common ends - the development of a strong national government and the establishment of strong political-economic institutions. The complementary nature of Hamilton's political and economic writings is most apparent in his defense of the proposal to found a new national bank. This paper will examine in depth Hamilton's writings, concentrating on the bank proposal. The development of this strong and stable financial institution would play both political and economic roles in the new republic. In particular, the national bank would institutionalize citizen support for the new government and thereby ensure the longer-term stability and strength of the national government.https://resolver.caltech.edu/CaltechAUTHORS:20170817-163728303Comparing Absentee and Precinct Voters: Voting on Direct Legislation
https://resolver.caltech.edu/CaltechAUTHORS:20170817-151019038
DOI: 10.7907/wf98v-0q055
This paper addresses issues related to how the absentee voter actually casts their ballot on propositions. If the liberalization of absentee laws changed either the composition or behavior of the electorate, then the outcome of the election may be affected.
This paper tests whether the electoral behavior of absentee and precinct voters differs with regard to voting on propositions. The analysis is based on a sample of actual absentee and precinct voter ballots drawn from the approximately three million ballots cast in Los Angeles County for the 1992 general election. The analysis uses a nested model of voter participation and is estimated using the weighted exogenous sampling maximum likelihood method.
We find that precinct and absentee voters do differ on both the propositions they cast votes on, and in their propensity to vote "Yes" for a proposition. For example, absentees appear to vote on fewer bonds and initiatives than do precinct voters. They also vote on fewer propositions dealing with state taxes, food taxes, and property taxes. In addition, given that a voter casts a valid vote, the propensity for absentee voters to vote "Yes" is higher on initiatives and propositions related to education, welfare, and health care than it is for precinct voters.https://resolver.caltech.edu/CaltechAUTHORS:20170817-151019038Relationship Banking, Liquidity, and Investment in the German Industrialization
https://resolver.caltech.edu/CaltechAUTHORS:20170818-154821525
DOI: 10.7907/k8q0m-qey65
Because of apparently close ties between banks and industry, German-style universal banking is thought by many to improve investment efficiency. This paper presents new evidence to the contrary. Using a panel of firm data from Germany's heavy industrialization period (1903-1913), the analysis shows that investment is more sensitive to internal liquidity for bank-networked firms than unattached firms, and that this effect is only minimally offset in the long run. Furthermore, the paper presents extensive evidence on the characteristics of bank-attached firms and demonstrates through direct tests that the estimated investment equations are free of selection bias.https://resolver.caltech.edu/CaltechAUTHORS:20170818-154821525Survey Measures of Uncertainty: A Report to the National Election Studies Board on the Use of "Certainty" Questions to Measure Uncertainty About Candidate Traits and Issue Positions
https://resolver.caltech.edu/CaltechAUTHORS:20170817-140539307
DOI: 10.7907/h974a-n4y88
[No abstract]https://resolver.caltech.edu/CaltechAUTHORS:20170817-140539307A Liapunov Function for Nash Equilibria
https://resolver.caltech.edu/CaltechAUTHORS:20170817-134102962
DOI: 10.7907/f61vr-arf47
In this paper, I construct a Liapunov function for Nash equilibria for finite n-person games in normal form. This function is useful for computation of Nash equilibria, since it converts the problem into a standard minimization problem. It provides an alternative to existing computational methods, which are based either on n-person extensions of the algorithm of Lemke and Howson [1961] (e.g., Wilson [1971] and Rosenmüller [1971]), or on methods for finding the fixed point of the best response correspondence, such as simplicial division algorithms (e.g., Todd [1976], and Van der Laan et al. [1987]). This work is also related to that of Brown and von Neumann [1950], and Rosen [1964], who construct differential equation systems for solving certain classes of games.https://resolver.caltech.edu/CaltechAUTHORS:20170817-134102962Are People Bayesian? Uncovering Behavioral Strategies
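The Liapunov-function abstract above notes that Nash computation reduces to a standard minimization problem. As an illustrative sketch only (a regret-based form; the paper's exact construction may differ), the function below is non-negative everywhere and vanishes exactly at Nash equilibria of a two-player game, so any minimizer attaining zero is an equilibrium:

```python
import numpy as np

def liapunov(x, y, A, B):
    """Regret-based Liapunov-style function for the bimatrix game (A, B).

    Non-negative everywhere and zero exactly at Nash equilibria, so finding
    an equilibrium becomes minimizing this function over mixed strategies.
    (Illustrative form, not necessarily the paper's construction.)
    """
    g1 = np.maximum(A @ y - x @ A @ y, 0.0)    # player 1's gains from pure deviations
    g2 = np.maximum(B.T @ x - x @ B @ y, 0.0)  # player 2's gains from pure deviations
    return float(np.sum(g1**2) + np.sum(g2**2))

# Matching pennies: the unique equilibrium mixes 50/50.
A = np.array([[1.0, -1.0], [-1.0, 1.0]])
B = -A
mixed = np.array([0.5, 0.5])
pure = np.array([1.0, 0.0])
print(liapunov(mixed, mixed, A, B))  # 0.0 at the mixed equilibrium
print(liapunov(pure, pure, A, B))    # positive away from it
```

Minimizing such a function with a generic optimizer (e.g., projected gradient over the strategy simplices) then recovers equilibria without Lemke-Howson-style pivoting.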
https://resolver.caltech.edu/CaltechAUTHORS:20170818-145519355
DOI: 10.7907/hvm98-nab92
Economists and psychologists have recently been developing new theories of decision making under uncertainty that can accommodate the observed violations of standard statistical decision theoretic axioms by experimental subjects. We propose a procedure which finds a collection of decision rules that best explain the behavior of experimental subjects. The procedure is a combination of maximum likelihood estimation of the rules together with an implicit classification of subjects to the various rules, and a penalty for having too many rules. We apply our procedure to data on probabilistic updating by subjects in four different universities. We get remarkably robust results which show that the most important rules used by the subjects (in order of importance) are Bayes's rule, a representativeness rule (ignoring the prior), and to a lesser extent, conservatism (over-weighting the prior).https://resolver.caltech.edu/CaltechAUTHORS:20170818-145519355Implementation Theory
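The classification exercise above distinguishes Bayes's rule, representativeness (ignoring the prior), and conservatism (over-weighting the prior). One convenient way to see all three as points in a single family, which is an assumption of this sketch rather than the paper's specification, is to raise the prior to a power w before normalizing:

```python
import numpy as np

def update(prior, likelihood, w):
    """Posterior over hypotheses under a one-parameter family of rules:
    w = 1: Bayes's rule; w = 0: representativeness (prior ignored);
    w > 1: conservatism (prior over-weighted relative to the data).
    (Hypothetical parameterization for illustration.)"""
    weights = prior**w * likelihood
    return weights / weights.sum()

prior = np.array([0.7, 0.3])   # P(urn A), P(urn B)
lik   = np.array([0.2, 0.8])   # P(observed draw | urn)
print(update(prior, lik, 1.0))  # Bayes: [0.368..., 0.631...]
print(update(prior, lik, 0.0))  # representativeness: [0.2, 0.8]
```

A maximum-likelihood classification in this spirit would ask which w best explains each subject's sequence of reported posteriors.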
https://resolver.caltech.edu/CaltechAUTHORS:20170818-161032873
DOI: 10.7907/f4x3p-w1n11
This paper surveys the branch of implementation theory initiated by Maskin (1977). Results for both complete and incomplete information environments are covered.https://resolver.caltech.edu/CaltechAUTHORS:20170818-161032873Economies With Many Commodities
https://resolver.caltech.edu/CaltechAUTHORS:20170817-150419178
DOI: 10.7907/jjhxf-hrj08
We discuss the two fundamental theorems of welfare economics in the context of the Arrow-Debreu-McKenzie model with an infinite dimensional commodity space. As an application, we prove the existence of competitive equilibrium in the standard single agent growth model.https://resolver.caltech.edu/CaltechAUTHORS:20170817-150419178Correlated Disturbances in Discrete Choice Models: A Comparison of Multinomial Probit Models and Logit Models
https://resolver.caltech.edu/CaltechAUTHORS:20170818-153518647
DOI: 10.7907/z0hv5-kw285
In political science, there are many cases where individuals make discrete choices from more than two alternatives. This paper uses Monte Carlo analysis to examine several questions about one class of discrete choice models - those involving both alternative-specific and individual-specific variables on the right-hand side - and demonstrates several findings. First, the use of estimation techniques assuming uncorrelated disturbances across alternatives in discrete choice models can lead to significantly biased parameter estimates. This point is tempered by the observation that probability estimates based on the full choice set generated from such estimates are not likely to be biased enough to lead to incorrect inferences. However, attempts to infer the impact of altering the choice set - such as by removing one of the alternatives - will be less successful. Second, the Generalized Extreme Value (GEV) model is extremely unreliable when the pattern of correlation among the disturbances is not as restricted as the GEV model assumes. GEV estimates may suggest grouping among the choices that is in fact not present in the data. Third, in samples the size of many typical political science applications - 1000 observations - Multinomial Probit (MNP) is capable of recovering precise estimates of the parameters of the systemic component of the model, though MNP is not likely to generate precise estimates of the relationship among the disturbances in samples of this size. Paradoxically, MNP's primary benefit is its ability to uncover relationships among alternatives and to correctly estimate the effect of removing an alternative from the choice set. Thus this paper suggests the increased use of MNP by political scientists examining discrete choice problems when the central question of interest is the effect of removing an alternative from the choice set.
We demonstrate that for other questions, models positing independent disturbances may be 'close enough.'https://resolver.caltech.edu/CaltechAUTHORS:20170818-153518647Studying Congressional and Gubernatorial Campaigns
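The warning above about inferring the effect of removing an alternative under independent disturbances can be seen directly: multinomial logit's IIA property forces the surviving probabilities to rescale proportionally when an alternative is dropped, whatever the true substitution pattern. A minimal sketch (the utility values are hypothetical):

```python
import numpy as np

def logit_probs(v):
    # Multinomial logit choice probabilities (softmax of utilities)
    e = np.exp(v - v.max())
    return e / e.sum()

v = np.array([1.0, 0.5, 0.5])     # utilities: car, red bus, blue bus
full = logit_probs(v)             # probabilities over all three alternatives
reduced = logit_probs(v[:2])      # drop the "blue bus" alternative

# IIA: the surviving alternatives keep their relative odds, so the dropped
# share is spread proportionally instead of flowing to the close substitute.
print(np.allclose(reduced, full[:2] / full[:2].sum()))  # True
```

A correlated-disturbance model such as MNP would instead shift most of the dropped "blue bus" share to the nearly identical "red bus", which is exactly the counterfactual logit gets wrong.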
https://resolver.caltech.edu/CaltechAUTHORS:20170817-133125568
DOI: 10.7907/1mm41-kt439
[No abstract]https://resolver.caltech.edu/CaltechAUTHORS:20170817-133125568Shaw vs. Reno and the World of Redistricting and Representation
https://resolver.caltech.edu/CaltechAUTHORS:20170818-152333397
DOI: 10.7907/70h0s-dtc53
Justice O'Connor's majority opinion in the 1993 U.S. Supreme Court case of Shaw v. Reno has widely been seen as withdrawing judicial protection of minority voting rights -- a welcome development to those who believe as a matter of faith that discriminatory electoral rules, racist appeals in elections, and racially polarized voting are things of the distant past, but less hopeful to close students of redistricting and election campaigns of the last two decades. Deeply ambiguous, the opinion has spawned a wide range of interpretations, from assertions that it bans redistricters from taking the race of voters into account at all, even when they place them in majority-white districts, to contentions that it merely asks for further information about the basis for establishing certain "ugly" districts that have majorities of African Americans or Latinos.
In this paper, which is based on research that I carried out for Shaw v. Hunt, the remand version of Shaw v. Reno, and Vera v. Richards, its Texas counterpart, I try to restore a sense of reality to the often factually incorrect assertions or implications of Justice O'Connor's opinion, not only by a close textual reading of the briefs and opinions in the Supreme Court case, but also by looking in considerable detail at the actual redistricting processes in North Carolina and Texas during the 1970s, 80s, and 90s. Were race, partisanship, and individual politicians' interests taken into account in redrawing districts before 1991, or were all previous reapportionments pristine exercises in civic virtue? Might the states in the 1990s have had compelling interests in redressing past racially discriminatory practices? Were the motives of the 1991-92 redistrictings so uncomplicated that they can be easily and unambiguously determined by a quick glance at a map? For North Carolina, I also examine whether white and black public opinion and the voting records of white and black members of Congress differ systematically from each other. Do black voters need black faces to represent them?
Shaw's vagueness affords the Supreme Court the possibility of gracefully backing away from its separate but unequal standards, standards that allow whites standing to sue without having to prove that the electoral rules at issue have a racially discriminatory effect and without having to show in detail that they were adopted with a racially discriminatory intent. In the final section, I outline five escape routes from Shaw, all of which are based on its factual inadequacies.https://resolver.caltech.edu/CaltechAUTHORS:20170818-152333397The Allocation of a Shared Resource Within an Organization
https://resolver.caltech.edu/CaltechAUTHORS:20170818-150309360
DOI: 10.7907/sb8p2-77f71
Many resources such as supercomputers, legal advisors, and university classrooms are shared by many members of an organization. When the supply of shared resources is limited, conflict usually results between contending demanders. If these conflicts can be adequately resolved, then value is created for the organization. In this paper we use the methodology of applied mechanism design to examine alternative processes for the resolution of such conflicts for a particular class of scheduling problems. We construct a laboratory environment, within which we evaluate the outcomes of various allocation mechanisms. In particular, we are able to measure efficiency, the value attained by the resulting allocations as a percentage of the maximum possible value. Our choice of environment and parameters is guided by a specific application, the allocation of time on NASA's Deep Space Network, but the results also provide insights relevant to other scheduling and allocation applications. We find (1) experienced user committees using decision support algorithms produce reasonably efficient allocations in lower conflict situations but perform badly when there is a high level of conflict between demanders, (2) there is a mechanism, called the Adaptive User Selection Mechanism (AUSM), which charges users for time and yields high efficiencies in high conflict situations, but because of the prices paid, the net surplus available to the users is less than that resulting from the inefficient user committee (a reason why users may not appreciate "market solutions" to organization problems) and (3) there is a modification of AUSM in which tokens, or internal money, replace real money, resulting in highly efficient allocations without extracting any of the users' surplus.
Although the distribution of surplus is still an issue, the significant increase in efficiency provides users with a strong incentive to replace inefficient user committees with the more efficient AUSM.https://resolver.caltech.edu/CaltechAUTHORS:20170818-150309360Merging of Forecasts in Markov Models
https://resolver.caltech.edu/CaltechAUTHORS:20170817-140138224
DOI: 10.7907/js7h7-br259
Blackwell and Dubins (1962) and Kalai and Lehrer (1994) showed that absolute continuity is necessary and sufficient for merging of opinions. This paper suggests the concept of merging of forecasts which is a modification of merging of opinions in Markov models where the underlying state of nature may change over time.
We define merging of forecasts as the requirement that the conditional probabilities of the future state, given past observations of signals drawn conditional on the state, get close to each other for different agents; this allows agents to agree on the future evolution of the states even if they have not agreed in the distant past. For an ergodic Markov chain, any forecasts merge. In particular, we can dispense with absolute continuity for merging of forecasts.https://resolver.caltech.edu/CaltechAUTHORS:20170817-140138224Rules for Experimenting in Psychology and Economics, and Why They Differ
https://resolver.caltech.edu/CaltechAUTHORS:20170817-144503614
DOI: 10.7907/hnsrp-4yw45
This chapter discusses methodological differences in the way economists and psychologists typically conduct experiments. The main argument is that methodological differences spring from basic differences in the way knowledge is created and cumulated in the two fields - especially the important role of simple, formal theory in economics which is largely absent in psychology.https://resolver.caltech.edu/CaltechAUTHORS:20170817-144503614Rational Price Discovery In Experimental And Field Data
https://resolver.caltech.edu/CaltechAUTHORS:20170817-134647024
DOI: 10.7907/c3h0d-2kp48
The methodology of tests for martingale properties in return series is analyzed. Martingale results obtain frequently in finance. One case is focused on here, namely, rational price discovery. Price discovery is the process by which a market moves towards a new equilibrium after a major event. It is rational if price changes cannot be predicted from commonly available information. The price discovery process, however, cannot be assumed stationary. Hence, to avoid false inference in the presence of nonstationarities, event studies of field data have been advocating the use of cross-sectional information in the computation of test statistics. Under the martingale hypothesis, however, this inference strategy is shown to add little except if higher moments of the return series do not exist. On the contrary, the cross-sectional approach may even be invalid if there is cross-sectional heterogeneity in the price discovery process. The time series statistic of Patell (1976), originally suggested in the context of i.i.d. time series with cross-sectional heteroscedasticity, may be preferable. It will not provide valid inference either, if higher serial correlation coincides with higher volatility. Unfortunately, this appears to be the case in the dataset which is used in the paper to illustrate the methodological issues, namely, transaction price changes from experiments on continuous double auctions with stochastic private valuations.https://resolver.caltech.edu/CaltechAUTHORS:20170817-134647024Comparing Absentee and Precinct Voters: A View Over Time
https://resolver.caltech.edu/CaltechAUTHORS:20170817-151921896
DOI: 10.7907/2q6zy-69h08
This paper examines the trend in absentee voting over the last thirty years in California. With the liberalization of absentee voting laws and practices, an increase in the numbers of absentee voters quickly followed. Absentee voters have already demonstrated their ability to influence the outcomes of local elections. An open question is what will become of absentee voters in the future. If they are the model for "voting at home," and if technological advances allow such, then the behavior of current absentee voters may be indicative of the future electorate.
The increasing trend of voters opting for absentee ballots is analyzed by using GLS on a random effects time-series cross-section model with county level data. The focus is on identifying structural factors such as changing voter demographics that have influenced the decision of voters to cast absentee ballots. Thirty-three recent state-wide elections in California are the basis for this analysis, covering the statewide primary and general elections from November 1962, through November 1994.
We find that the impact of demographics and time trends on absentee voting differ between general and primary elections. In addition, we find that a 1977 liberalization law in California had the effect of accelerating the usage of the absentee format. Finally, we conclude that absentee and precinct voting are substitutes in general elections but complements in primary elections.https://resolver.caltech.edu/CaltechAUTHORS:20170817-151921896Issues, Economics and the Dynamics of Multi-Party Elections: The British 1987 General Election
https://resolver.caltech.edu/CaltechAUTHORS:20170817-140948569
DOI: 10.7907/w9f59-jzr85
This paper offers a model of three-party elections which allows voters to combine retrospective economic evaluations with considerations of the positions of the parties in the issue-space as well as the issue-preferences of the voters. We describe a model of British elections which allows voters to consider simultaneously all three parties, rather than limiting voters to choices among pairs of parties as is usually done. Using this model we show that both policy issues and the state of the national economy matter in British elections. We also show how voters framed their decisions. Voters first made a retrospective evaluation of the Conservative party based on economic performance; and those voters that rejected the Conservative party chose between Labour and Alliance based on issue positions. Through simulations of the effects of issues—we move the parties in the issue space and re-estimate vote-shares—and the economy—we hypothesize an alternative distribution of views of the economy for voters—we show that Labour has virtually no chance to win with the Alliance as a viable alternative. Even if the Alliance (or the Liberal Democrats) disappears, Labour will need to significantly moderate its policy positions to have a chance of competing with the Conservative party. We argue that the methodological technique we employ, multinomial probit, is a superior mechanism for studying three-party elections as it allows for a richer formulation of politics than do competing methods.https://resolver.caltech.edu/CaltechAUTHORS:20170817-140948569Fiduciari and Firm Liquidity Constraints: The Italian Experience with German-Style Universal Banking
https://resolver.caltech.edu/CaltechAUTHORS:20170817-142157556
DOI: 10.7907/ys61z-x8x87
This study presents new evidence on the role of bank relationships in attenuating Italian firms' liquidity constraints and interprets the findings in comparison with those from a similar study of the German case. Employing investment models on a panel of 170 firms between 1903 and 1911, the analysis shows that bank relationships had little effect on the liquidity constraints of the general population of investing firms, but that they may have attenuated liquidity constraints for new firms. Analysis of the characteristics associated with bank-attached firms indicates that such firms were significantly different from independent firms, and that German-style banks may, therefore, have been important for ex ante monitoring and signaling to investors. The results for Italy accord well with the German experience, and thus universal banking appears, in general, to have had limited impact on the investment patterns of firms during industrialization.https://resolver.caltech.edu/CaltechAUTHORS:20170817-142157556Mutually Destructive Bidding: The FCC Auction Design Problem
https://resolver.caltech.edu/CaltechAUTHORS:20170818-151840242
DOI: 10.7907/r2944-rm514
Dissatisfaction with previous assignment mechanisms and the desire to raise revenue induced Congress to grant the FCC authority to auction radio licenses. The debate over an appropriate auction design was wide ranging with many imaginative proposals. Many of the arguments and their scientific support are unfortunately not publicly available. Here, we present our side of this debate for the record.
Synergies across license valuations complicate the auction design process. Theory suggests that a "simple" (i.e., non-combinatorial) auction will have difficulty in assigning licenses efficiently in such an environment. This difficulty increases with increases in "fitting complexity." In some environments, bidding may become "mutually destructive." Experiments indicate that a combinatorial auction is superior to a simple auction in terms of economic efficiency and revenue generation in bidding environments with a low amount of fitting complexity. Concerns that a combinatorial auction will cause a "threshold" problem are not borne out when bidders for small packages can communicate.https://resolver.caltech.edu/CaltechAUTHORS:20170818-151840242Coping With Ignorance: Unforeseen Contingencies and Non-Additive Uncertainty
https://resolver.caltech.edu/CaltechAUTHORS:20170817-145148657
DOI: 10.7907/3kttz-cg466
In real-life decision problems, decision makers are never provided with the necessary background structure: the set of states of the world, the outcome space, the set of actions. They have to devise all these by themselves. I model the (static) choice problem of a decision maker (DM) who is aware that her perception of the state space is too coarse, as for instance when there might be unforeseen contingencies. After making assumptions on the way the DM perceives the decision problem, I present a set of axioms on her preferences which imply that they can be represented by a (generalized) expectation with respect to a non-additive measure, called a belief function. As it turns out, the very natural axioms presented have strong implications on the way the DM copes with the type of ignorance described above. I show how some decision rules that have been studied in the literature can be obtained as a special case of the model presented here (though they have to be interpreted differently).
I then show that this formulation of the problem can yield very natural results on the comparative statics of beliefs as the DM's understanding of the decision problem becomes deeper, for instance when unforeseen events become foreseen. I present the implications of these results for a simple asset pricing model. Finally I argue that if we are willing to make assumptions on the faithfulness of the DM's perception with respect to reality, then the DM described here will, in the limit as her perception becomes finer and finer, resemble a Savage DM.https://resolver.caltech.edu/CaltechAUTHORS:20170817-145148657Costly Offers and the Equilibration Properties of the Multiple Unit Double Auction Under Conditions of Unpredictable Shifts of Demand and Supply
https://resolver.caltech.edu/CaltechAUTHORS:20170821-132239324
The paper reports on the behavior of markets in which a transactions cost is imposed in the form of a tax on bids and asks that are tendered in the market. That is, in the markets studied communication with the other side of the market was costly. The markets were nonstationary in the sense that market demand and market supply shifted unpredictably each period and the markets were organized by the computerized Multiple Unit Double Auction. The results are as follows. (1) A market equilibration process is observed across the periods of nonstationary markets. (2) The imposition of the cost on offers did not negate the tendency toward market equilibration but the price discovery process was "incomplete" relative to the free offer case. (3) Price equilibration with the offer cost was slower and efficiencies were reduced.https://resolver.caltech.edu/CaltechAUTHORS:20170821-132239324Exchange Economies and Loss Exposure: Experiments Exploring Prospect Theory and Competitive Equilibria in Market Environments
https://resolver.caltech.edu/CaltechAUTHORS:20170821-133106377
DOI: 10.7907/63rqp-z0260
A natural economic interpretation of Prospect Theory is that people have preferences that are risk seeking in losses and risk averse in gains. Thus, according to this interpretation of the theory, individuals in an exchange economy facing only losses in wealth, would have concave preferences as opposed to the usual convex preferences. That is, if individuals could engage in trade that would reduce the magnitude of expected losses and change the variance associated with losses, they would have a tendency to seek higher variance and perhaps be willing to do so at the cost of a reduction of expected value of wealth. Such individuals would be willing to sell insurance at prices below the expected value. With concave preferences all competitive equilibria have allocations at the boundaries of the Edgeworth Box. Experimental markets were constructed to determine if such behavior could be observed. The results are that risk seeking behavior is observed in many people. Furthermore, the propensity toward risk seeking in markets is consistent with answers given to questionnaires involving hypothetical choices among lotteries. The propensity toward risk seeking appears to be reduced with experience. In one sense the data are strongly supportive of Prospect Theory but in another sense the data are not. The evidence suggests that preferences in the market setting are not labile and that the risk seeking propensities are not a result of delicate framing effects. The preferences revealed in the market seemed to be a property of the people and not simply a property of their decision processes as required by Prospect Theory.https://resolver.caltech.edu/CaltechAUTHORS:20170821-133106377Two-Stage Estimation of Non-Recursive Choice Models
https://resolver.caltech.edu/CaltechAUTHORS:20170818-162556639
DOI: 10.7907/aa41w-fsk46
Questions of causation are important issues in empirical research on political behavior. Most of the discussion of the econometric problems associated with multi-equation models with reciprocal causation has focused on models with continuous dependent variables (e.g. Markus and Converse 1979; Page and Jones 1979). Since many models of political behavior involve discrete or dichotomous dependent variables, this paper turns to two techniques which can be employed to estimate reciprocal relationships between dichotomous and continuous dependent variables. One technique which I call two-stage probit least squares (2SPLS) is very similar to familiar two-stage instrumental variable techniques. The second technique, called two-stage conditional maximum likelihood (2SCML), may overcome problems associated with 2SPLS, but has not been used in the political science literature. First I show the properties of both techniques using Monte Carlo simulations. Then, I apply these techniques to an empirical example which focuses on the relationship between voter preferences in a presidential election and the voter's uncertainty about the policy positions taken by the candidates. This example demonstrates the importance of these techniques for political science research.https://resolver.caltech.edu/CaltechAUTHORS:20170818-162556639The Formation of Multiple Teams
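The 2SPLS procedure described above can be sketched on simulated data: stage one regresses the endogenous continuous variable on the instruments by OLS, and stage two runs a probit of the dichotomous outcome on the fitted values. The data-generating process below is hypothetical, chosen only so that the regressor shares a disturbance with the outcome:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 2000
z = rng.normal(size=n)                          # exogenous instrument
u = rng.normal(size=n)                          # shared disturbance -> endogeneity
x = 0.8 * z + u + rng.normal(size=n)            # endogenous continuous regressor
y = (x - 0.5 + u + rng.normal(size=n) > 0).astype(float)  # dichotomous outcome

# Stage 1: OLS of x on the instrument; keep the fitted values
Z = np.column_stack([np.ones(n), z])
xhat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]

# Stage 2: probit of y on the fitted values, by maximum likelihood
X2 = np.column_stack([np.ones(n), xhat])
def negll(b):
    q = 2 * y - 1                               # +/-1 coding of the outcome
    return -np.sum(norm.logcdf(q * (X2 @ b)))
bhat = minimize(negll, np.zeros(2), method="BFGS").x
print(bhat[1] > 0)   # the positive effect of x is recovered
```

Because `xhat` is a function of the instrument alone, it is uncorrelated with the shared disturbance, which is the source of 2SPLS's consistency; a naive probit of `y` on `x` would absorb the endogeneity into the slope.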
https://resolver.caltech.edu/CaltechAUTHORS:20170821-134538958
DOI: 10.7907/keya4-azx13
Organizational forms such as task-oriented teams have often been proposed as a method to enhance the efficiency of a firm. Under asymmetric information, however, the costs of acquiring the information needed to improve efficiency may outweigh the efficiency gains and lead to lower profits. We illustrate this idea by considering a profit-maximizing principal who needs to allocate a group of agents among a number of projects, given that the principal has incomplete information about the agents' abilities. We study feasible incentive-compatible (truth-revealing) individually rational mechanisms under both the dominant strategy and Bayesian Nash behavioral assumptions. Some attention is also paid to Nash equilibrium mechanisms. The paper covers derivation of optimal mechanisms, efficiency analysis, and analysis of the principal's expected profit as a function of different types of environment and information structures. We find that if the principal has little or no information about the agents' private characteristics and the agents follow dominant strategy behavior, the principal may often run into losses in an attempt to discover the hidden information. Paradoxically, the loss occurs when the efficiency gains from team production are high and the competition among the agents is low. If the hidden information about each agent can be summarized as a one-dimensional type parameter, and if a prior distribution function of the agents' types is common knowledge among the agents and the principal, an expected-profit maximizing Bayesian equilibrium mechanism exists and is of the optimal auction form (Myerson, 1981). Moreover, the mechanism can be equivalently implemented in dominant strategies with no expected profit loss for the principal. Yet, the principal's profit often decreases with an increase in the number of projects.
These findings suggest that, in profit-maximizing firms with low competition among the employees, efficient organizational forms may often be foregone in favor of profits.https://resolver.caltech.edu/CaltechAUTHORS:20170821-134538958Voter Choice in 1992: Economics, Issues, and Anger
https://resolver.caltech.edu/CaltechAUTHORS:20170821-141721076
DOI: 10.7907/r4hfn-zvy61
This paper examines the voting behavior of individuals in the 1992 presidential election. Employing a multinomial probit model we disprove several commonly held beliefs regarding the uniqueness of the election and the mood of the voters. We show emphatically the dominance of the economy as an issue, and that Clinton, not Perot, was the beneficiary of economic discontent. We show the limited influence of the candidates' efforts at choosing the optimal ideological position. We also demonstrate, via simulations of the outcome under hypothetical distributions of preferences, that the effect of the economy, while large, cannot by itself explain the magnitude of Bush's defeat. We also prove the surprisingly powerful impact of the candidates' positions on abortion on voters' choices. And we disprove the stylized fact that the 1992 election was characterized by "angry voters." Finally, we show that Perot took more votes from Bush than he did from Clinton.https://resolver.caltech.edu/CaltechAUTHORS:20170821-141721076Landscape Formation in a Spatial Voting Model
https://resolver.caltech.edu/CaltechAUTHORS:20170821-135523809
DOI: 10.7907/046nc-0c893
[No abstract]https://resolver.caltech.edu/CaltechAUTHORS:20170821-135523809Rational Voters and Strategic Voting: Evidence from the 1968, 1980, and 1992 Elections
https://resolver.caltech.edu/CaltechAUTHORS:20170822-132645397
DOI: 10.7907/1vp12-5fe44
Is the rational choice paradigm more than a mere tautology when applied to the study of voting or can it generate refutable propositions that cannot be deduced or inferred from other approaches? This is the question we address empirically in the context of three-candidate presidential elections. Although we reconfirm the conclusion that the decision to vote is largely a consumptive one, we also establish that once in the voting booth, voters act strategically in precisely the ways predicted by a Downsian model of voting. That is, although expected utility calculations and the like add little to our understanding of the decision to vote, those same calculations have a significant influence on the decision for whom to vote, over and above such things as partisanship.https://resolver.caltech.edu/CaltechAUTHORS:20170822-132645397Subcomponent Innovation and Moral Hazard: Where Technological Progress Meets the Division of Labor
https://resolver.caltech.edu/CaltechAUTHORS:20170822-140429442
DOI: 10.7907/ra1cw-psk31
I model the technical innovation of a final good as a process of incremental enhancement due to Research and Development (R&D) efforts undertaken on subcomponents to the final good. R&D contracting is analyzed within various principal/agent structures. I identify a principal who jointly values the performance capabilities of the subcomponent undergoing R&D and the funds available for other subcomponents; thus, he does not have a transferable utility function. I justify and characterize a performance-seeking agent in addition to the conventional profit-seeking agent. The information environment and the motivational properties of the principal and agent significantly affect the form and existence of optimal R&D contracts. I draw insights for private and public sector industrial organization.https://resolver.caltech.edu/CaltechAUTHORS:20170822-140429442An Experimental Analysis of the Two-Armed Bandit Program
https://resolver.caltech.edu/CaltechAUTHORS:20170822-142127351
DOI: 10.7907/cnh3w-1hp55
We investigate, in an experimental setting, the behavior of single decision makers who at discrete time intervals over an "infinite" horizon may choose one action from a set of possible actions where this set is constant over time, i.e. a bandit problem. Two bandit environments are examined, one in which the predicted behavior should always be myopic (the two-armed bandit) and the other in which the predicted behavior should never be myopic (the one-armed bandit). We also investigate the comparative static predictions as the underlying parameters of the bandit environments are changed. The aggregate results show that the cutpoint behavior in the two bandit environments is quantitatively different and in the direction of the theoretical predictions. Furthermore, while a significant number of individual cutpoints exhibit nonstationarity (contrary to the theory), the maximum-likelihood collection of decision rules that best explains overall behavior consists of rules consistent with the underlying theory.https://resolver.caltech.edu/CaltechAUTHORS:20170822-142127351Equilibria with Unrestricted Entry in Multi Member District Plurality (SNTV) Elections
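The one-armed case above, in which myopic behavior should never be optimal, can be illustrated with a small finite-horizon dynamic program (the parameters are hypothetical, not the experimental design): a safe arm pays S for sure, the risky arm is Bernoulli with a Beta belief, and the information value of experimenting makes the risky arm optimal even when its posterior mean lies below S.

```python
from functools import lru_cache

S, DISCOUNT = 0.55, 0.95   # safe payoff and discount factor (hypothetical)

@lru_cache(maxsize=None)
def value(a, b, horizon):
    # Optimal discounted value with a Beta(a, b) belief about the risky arm
    if horizon == 0:
        return 0.0
    return max(arm_values(a, b, horizon))

def arm_values(a, b, horizon):
    p = a / (a + b)                                        # posterior mean
    safe = S + DISCOUNT * value(a, b, horizon - 1)         # belief unchanged
    risky = p * (1.0 + DISCOUNT * value(a + 1, b, horizon - 1)) \
        + (1 - p) * DISCOUNT * value(a, b + 1, horizon - 1)
    return safe, risky

safe, risky = arm_values(1, 1, 2)      # flat prior: posterior mean 0.5 < S
print(risky > safe)   # True: experimenting beats the myopic (safe) choice
```

Note that `value` is memoized against the module-level `S` and `DISCOUNT`, so clear the cache before changing either; the myopic rule, by contrast, would compare only the one-period payoffs 0.5 and 0.55 and always pick the safe arm.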
https://resolver.caltech.edu/CaltechAUTHORS:20170822-161622244
DOI: 10.7907/yzjqx-gsa77
The hypothesis that Duverger's Law can be extended to plurality multi-member district elections received some empirical support in Steven Reed's (1990) study of Japanese elections. Here we return to electoral data from Japan and Taiwan in order to find evidence consistent with the theoretical result offered in Part I of this essay, namely, that cohesive electorates should witness either competition among as many "serious" candidates as there are seats or, at most, one "extra" candidate per race. We also compare the consistency of the data with two alternative sets of predictions: one derived from our candidate-based model (strategic candidates, sincere voters), the other proposed by Gary Cox (1993), in which strategic voter behavior is analyzed. The indications are that the strategic role of the candidates should be viewed as the leading one.https://resolver.caltech.edu/CaltechAUTHORS:20170822-161622244First Best Bayesian Privatization Mechanisms
https://resolver.caltech.edu/CaltechAUTHORS:20170822-135920548
DOI: 10.7907/rgkfq-awr86
A planner is interested in designing an ex-post efficient, individually rational, Bayesian mechanism for allocating a single indivisible object to one of the agents who knows his own valuation and only the distribution of other agents' valuations of the object. In this paper, we show that it is impossible to design such a mechanism without any transfers among agents and the planner. However, we discover and describe an ex-post efficient, ex-post individually rational, Bayesian mechanism which balances transfers among agents without any payment to (or from) the planner.
Our result that an ex-post efficient, ex-post individually rational, transfer balanced, Bayesian mechanism exists is in stark contrast to two well-known impossibility results in the literature: the nonexistence of a Bayesian public good mechanism satisfying ex-post efficiency, individual rationality and budget balance (Laffont and Maskin (1979)), and the impossibility of an ex-post efficient, individually rational, Bayesian bilateral trading mechanism between a seller and a buyer without an outside subsidy (Myerson and Satterthwaite (1983)).https://resolver.caltech.edu/CaltechAUTHORS:20170822-135920548Congressional Committees and the Political Economy of Federal Outlays
https://resolver.caltech.edu/CaltechAUTHORS:20170822-133838152
DOI: 10.7907/4s991-1x478
The literature on the organization of the United States Congress has been dominated by "distributive" and "informational" theories. One important source of disagreement between these two theories is their characterization of whether individual legislators can engage in pork-barrel activities. Here we provide evidence which indicates that the pork-barrel is alive and well in the contemporary United States Congress. We focus on whether members of power and constituency committees can direct disproportionate federal expenditures to their districts. Finding strong and systematic evidence of pork-barrel activities by committee members provides empirical support for distributive theories of legislative organization.https://resolver.caltech.edu/CaltechAUTHORS:20170822-133838152Economical Experiments: Bayesian Efficient Experimental Design
https://resolver.caltech.edu/CaltechAUTHORS:20170822-160511103
DOI: 10.7907/gkc2n-v7q38
We propose and implement a Bayesian optimal design procedure. Our procedure takes as its primitives a class of models, a class of experimental designs, and priors on the nuisance parameters of those models. We select the experimental design that maximizes the information (in the sense of Kullback-Leibler) from the experiment. We sequentially sample with the given design and models until only one of the models retains viable posterior odds. A model which has low posterior odds in a small collection of models will have even lower posterior odds when compared to a larger class, and hence we can dismiss it. The procedure can be used sequentially by introducing new models and comparing them to the models that survived earlier rounds of experiments. The emphasis is not on running as many experiments as possible, but rather on choosing experimental designs to distinguish between models in the shortest possible time period. The first stage of optimal design is illustrated with a simple experimental game with one-sided incomplete information.https://resolver.caltech.edu/CaltechAUTHORS:20170822-160511103Equilibria with Unrestricted Entry in Multi Member District Plurality (SNTV) Elections
https://resolver.caltech.edu/CaltechAUTHORS:20170822-163104441
DOI: 10.7907/fpm96-56b91
Extending Duverger's Law to electoral districts of arbitrary district magnitude would imply just one "extra" candidate running in each race. In this paper we analyze equilibrium properties (possible equilibrium configurations and their existence) of a plurality electoral system returning more than one legislator per district. We look at sincere Downsian voters and strategically behaving candidates (who can change their policy platforms at no cost, while new candidates can enter the race). In Part II we find empirical evidence in favor of the implications of this analysis in the performance of actual SNTV electoral systems, such as those in Japan and Taiwan.https://resolver.caltech.edu/CaltechAUTHORS:20170822-163104441A Bottom-Up Efficient Algorithm for Allocating Public Projects with Positive Complementarities
https://resolver.caltech.edu/CaltechAUTHORS:20170822-154731169
DOI: 10.7907/64gk6-vk958
In this paper, we consider the problem of locating an optimal package of public projects from a set of potential projects when the public projects have positive complementarities. We formulate the problem as a discrete nonlinear optimization problem whose domain equals the power set of a finite collection of projects. The main contribution of this paper is the construction of an efficient algorithm, among the set of bottom-up algorithms, for projects with positive and positive uniform complementarities. The restriction to bottom-up algorithms stems from practical considerations discussed in the paper. We also discuss shortcomings of three natural approaches to addressing the problem: exhaustive search over packages, simultaneous evaluation of projects, and sequential evaluation of projects.https://resolver.caltech.edu/CaltechAUTHORS:20170822-154731169Alliances Versus Federations: An Analysis with Military and Economic Capabilities Distinguished
https://resolver.caltech.edu/CaltechAUTHORS:20170822-141207023
DOI: 10.7907/6wz3e-c2738
This essay explores the distinction between federations and alliances and asks the question: When will states choose to federate rather than ally? William Riker (1964) argues that a necessary condition for a federal state's formation is that those offering the federal bargain must seek to "expand their territorial control, usually either to meet an external military or diplomatic threat or to prepare for military or diplomatic aggression and aggrandizement." This argument, though, fails to ask why states sometimes respond to threats by forming federations and at other times by forming alliances. Here, after assuming that states have initial endowments of military and economic resources, where economic resources enter utility functions directly and are what states maximize and where military capability influences preference only insofar as it determines a state's ability to counter threats, we offer a multi-stage game-theoretic model in which states may be compelled to divert economic resources to military spending. Alliances, in turn, are self-enforcing coalitions designed to augment a state's offensive or defensive capabilities. Federations, which serve the same ends as alliances, are coalitions that need to be enforced by the "higher authority" established when the federation is formed. Our operating assumption is that states seek to form a federation in lieu of an alliance if and only if (1) a stable alliance partition does not exist or, if one exists, it is dominated by an unstable partition and (2) the cost of the loss of sovereignty to each state in the federation is offset by the gains from joining it, relative to what that state secures as its security value.https://resolver.caltech.edu/CaltechAUTHORS:20170822-141207023A Core-Theoretic Solution for the Design of Cooperative Agreements on Transfrontier Pollution
https://resolver.caltech.edu/CaltechAUTHORS:20170822-134728746
DOI: 10.7907/2e6r1-9na73
The paper highlights the relevance of the game theoretic concept of the core of a cooperative game for the design of international treaties on transfrontier pollution. Specifically, a formula is offered for allocating abatement costs between the countries involved for which the justification is of core-theoretic nature. The analysis emphasizes the strategic role of monetary transfers among the countries.https://resolver.caltech.edu/CaltechAUTHORS:20170822-134728746The Core of an Economy With Multilateral Environmental Externalities
https://resolver.caltech.edu/CaltechAUTHORS:20170817-165201984
DOI: 10.7907/tz6kj-f7g59
When environmental externalities are international—i.e. transfrontier—they most often are multilateral and embody public good characteristics. Improving upon inefficient laissez-faire equilibria requires voluntary cooperation for which the game-theoretic core concept provides optimal outcomes that have interesting properties against free riding.
To define the core, however, the characteristic function of the game associated with the economy (which specifies the payoff achievable by each possible coalition of players—here, the countries) must also specify in each case the behavior of the players which are not members of the coalition. This has been for a long time a major unsolved problem in the theory of the core of economies with many producers of a public good.
Among the several assumptions that can be made in this respect, a plausible one is defined in this paper, for which it is then shown that the core is nonempty. The proof is constructive in the sense that it exhibits a solution (i.e., an explicit coordinated abatement policy) that has the desired property of nondomination by any proper coalition of countries, given the assumed behavior of the other countries.https://resolver.caltech.edu/CaltechAUTHORS:20170817-165201984Welfare Magnets, The Labor-Leisure Decision and Economic Efficiency
https://resolver.caltech.edu/CaltechAUTHORS:20170822-143609973
DOI: 10.7907/yfzr8-edd17
This paper develops a model designed to capture the fiscal externalities associated with redistributive policy in a system of jurisdictions. Policy changes in one jurisdiction affect other jurisdictions through both migration and work-disincentive effects. Previous work ignores work-disincentive effects and concludes that centralization is sufficient to eliminate fiscal externalities. Inclusion of work-disincentive effects unambiguously worsens fiscal externalities under both centralized and decentralized redistribution. Sufficiently severe work-disincentive effects guarantee that an increase in redistribution will harm the poor.https://resolver.caltech.edu/CaltechAUTHORS:20170822-143609973If Hamilton and Madison Were Merely Lucky, What Hope Is There for Russian Federalism?
https://resolver.caltech.edu/CaltechAUTHORS:20170822-145454303
DOI: 10.7907/yv98z-edr97
Just as the two-headed eagle of imperial and contemporary Russia looks in two different directions, this essay has two objectives: to evaluate, on the basis of the American experience, the prospects for stable democratic federalism in Russia, and to reconsider the insights into federalism offered by Madison and Hamilton in The Federalist. The swirl of events in Russia makes it difficult if not impossible to confidently render conclusions about the future direction of events and the prospects for meaningful federal domestic relations. However, some theoretical perspective can be gained by looking at the theory of federalism offered in The Federalist Papers, with special attention to Madison and Hamilton's failure to appreciate fully the role political parties would play in the eventual integration of American political institutions so as to establish, in Madison's words, a "properly structured" federation. Looking as well at the early history of parties in the U.S., we see, in addition to the usual constitutional provisions associated with federalism, the importance of those things that structure political competition within states. Properly designed, these things encourage the development of political parties that mirror federal relations, and integrate regional and national political elites so as to avert center-periphery conflict. Unfortunately, a review of the provisions currently in place for Russia reveals that electoral practices and regional and republic constitutions and proposals are unlikely to encourage parties of the sort that facilitate a stable federal system. 
This fact, in conjunction with several other trends (notably, corruption and the political instincts of political elites in Moscow), leads to the conclusion that a "federation" of the type currently observed in, say, Mexico is a better scenario of the future for Russia than is a federation that imitates the U.S., Australia, Germany, or Switzerland.https://resolver.caltech.edu/CaltechAUTHORS:20170822-145454303Human Capital and Legislative Outcomes
https://resolver.caltech.edu/CaltechAUTHORS:20170822-144456865
DOI: 10.7907/486rs-pky55
This paper presents a model in which legislators use informational advantages to engage in rent-seeking activities. Previous work that postulated either informational asymmetries or rent-seeking behavior could not explain deviations from the median preference without reference to "committee power." Integration of these forces demonstrates that legislative outcomes need not correspond to the median preference regardless of the extent to which "committee power" is present in a legislature. In general, both procedural and informational asymmetries can induce deviations in legislative outcomes.https://resolver.caltech.edu/CaltechAUTHORS:20170822-144456865Institutions and Incentives: The Prospects for Russian Democracy
https://resolver.caltech.edu/CaltechAUTHORS:20170822-141704504
DOI: 10.7907/a6qwf-33e44
The lament that Russia is at the mercy of powerful personalities contesting for the reins of power may be accurate. But here we want to find a way out of this condition. We begin by noting that more than mere lip-service needs to be paid to the idea that the two dimensions of reform - economic and political - are fused and that one cannot be attacked without attacking the other. Just as economic policies are manipulated in accordance with the principle that socially desirable outcomes cannot be willed or wished into existence - they derive, if at all, from the ways in which government action and the structure of economic institutions channel individual self-interest - the same must be true of political reform. Tracing the interests established by Russia's current constitutional order with respect to representation and elections, though, we conclude that that order and those interests almost certainly preordain executive-legislative conflict. Focusing, then, on those things that can be changed without constitutional amendment, we suggest a set of electoral reforms that promise to alleviate at least this problem and that allow for presidential leadership rather than the mere administration of authority and power.https://resolver.caltech.edu/CaltechAUTHORS:20170822-141704504Relational-Functional Voting Operations
https://resolver.caltech.edu/CaltechAUTHORS:20170828-144816682
DOI: 10.7907/hket7-e2774
We study operators that transform the binary relations of voters into a collective choice function. Conditions on these operators are given, the main one being a locality condition. An explicit form is obtained for operators in a special subclass of local operators. The entire study is carried out in terms of a special language: the list-form representation of operators.https://resolver.caltech.edu/CaltechAUTHORS:20170828-144816682Notes on Constitutional Change in the ROC: Presidential versus Parliamentary Government
https://resolver.caltech.edu/CaltechAUTHORS:20170823-144216742
DOI: 10.7907/dqqde-vaa45
The debate over constitutional reform has moved to center stage in Taiwan, with a focus on two issues: the choice of presidential versus parliamentary government and a determination of the ultimate role of the National Assembly. These two issues, in turn, are linked by a third -- whether the president ought to be elected indirectly by the National Assembly or directly in a mass popular vote. Of these issues, though, the choice between a presidential and a parliamentary system is central, because it requires that we consider the methods whereby chief executives and legislators are elected and, correspondingly, the role of the National Assembly. Beginning, then, with the issue of presidential versus parliamentary government, this essay argues that the most commonly cited arguments over the advisability of choosing one or the other of these two forms are, for the most part, theoretically meaningless and are largely rhetorical devices for rationalizing prejudices about preferred governmental structures and the state's role. Consequently, we attempt here to provide a more useful set of criteria with which to evaluate reform in general and the choice between presidential and parliamentary government in particular. We conclude that although the choice between presidential and parliamentary forms is important, equal attention should be given to the methods whereby a president and the legislature are elected. It is these institutional parameters that determine the character of political parties in Taiwan, their ability to accommodate any mainlander-native Taiwanese conflict, and the likelihood that executive and legislative branches will formulate coherent domestic and international policy.https://resolver.caltech.edu/CaltechAUTHORS:20170823-144216742The Preference Reversal Phenomenon: Response Mode, Markets and Incentives
https://resolver.caltech.edu/CaltechAUTHORS:20170828-162350199
DOI: 10.7907/7qqxt-mz131
This paper addresses the apparent conflict between the results of experiments on individual choice and judgment and the results of market experiments. Data are reported for experiments designed to analyze the effects of (a) economic incentives, repetition, feedback and information and (b) choice and valuation response modes on (c) subjects' decisions in paired market and nonmarket environments. Causes of divergent market and nonmarket behavior are identified in the context of the preference reversal phenomenon (PRP). Study of the PRP is extended to two types of market environments. The PRP is observed on the first repetition in a market setting (second price auction) with immediate feedback, both with and without financial incentives. However, after five repetitions of the auction, the subjects' bids are generally consistent with their choices and the asymmetry between the rates of predicted and unpredicted reversals disappears. An individual pricing task using the BDM mechanism yields similar results on the first repetition but results which differ from the second price auction on the fifth repetition. Choice tasks produce lower rates of reversals than do pricing tasks in both market and individual decision making settings.https://resolver.caltech.edu/CaltechAUTHORS:20170828-162350199An Experimental Study of Constant-sum Centipede Games
https://resolver.caltech.edu/CaltechAUTHORS:20170823-133811016
DOI: 10.7907/g1kfx-nex14
In this paper, we report the results of a series of experiments on a version of the centipede game in which the total payoff to the two players is constant. Standard backward-induction arguments lead to a unique Nash equilibrium outcome prediction, which is the same as the prediction made by theories of "fair" or "focal" outcomes.
We find that subjects frequently fail to select the unique Nash outcome prediction. While this behavior was also observed in McKelvey and Palfrey (1992) in the "growing pie" version of the game they studied, the Nash outcome was not "fair", and there was the possibility of Pareto improvement by deviating from Nash play. Their findings could therefore be explained by small amounts of altruistic behavior. There are no Pareto improvements available in the constant-sum games we examine, hence explanations based on altruism cannot account for these new data.
We examine and compare two classes of models to explain these data. The first class consists of non-equilibrium modifications of the standard "Always Take" model. The other class we investigate, the Quantal Response Equilibrium model, describes an equilibrium in which subjects make mistakes in implementing their best replies and assume other players do so as well. One specification of this model fits the experimental data best, among the models we test, and is able to account for all the main features we observe in the data.https://resolver.caltech.edu/CaltechAUTHORS:20170823-133811016Investment and Insider Trading
https://resolver.caltech.edu/CaltechAUTHORS:20170825-161136189
DOI: 10.7907/xtkt5-qzm74
Within a dynamic environment, this paper introduces an inside trader to an economy where rational, but uninformed, traders choose between investment projects with different levels of insider trading. When inside information has little value in future investment decisions, insider trading distorts investment towards assets with less private information, imposing net welfare costs on the economy. When an insider's private information is valuable in making future investment decisions, the net social benefit of insider trading can be positive; the resulting increase in investment efficiency due to more informative prices is enough to compensate for the distortion induced by the inside trader.
When insiders receive private information more than once, insiders may trade to reveal their private information at the beginning of their relationship with the firm. This has two effects: (i) more information is revealed in equilibrium and (ii) there is less chance that an uninformed agent will have to trade with the insider. Both these effects reduce the investment inefficiencies associated with insider trading. As a consequence, uninformed liquidity traders prefer to trade in a market with a long-term insider. This improvement in investment efficiency leads to a Pareto improvement - both the uninformed traders and the insider are made better off if the insider receives information more than once.https://resolver.caltech.edu/CaltechAUTHORS:20170825-161136189A Note on Sequential Auctions
https://resolver.caltech.edu/CaltechAUTHORS:20170825-162510308
DOI: 10.7907/mq1ss-6rm28
This note provides an explanation for the 'declining-price anomaly' in sequential second price auctions. We illustrate how the average winning bids of risk neutral agents bidding for objects with valuations drawn from independent, identical distributions are lower in later auctions than in earlier auctions. When the objects are not identical we determine the optimal order in which they should be auctioned.https://resolver.caltech.edu/CaltechAUTHORS:20170825-162510308Choosing a Tax Treatment for New Financial Products
https://resolver.caltech.edu/CaltechAUTHORS:20170828-144012134
DOI: 10.7907/0f5gs-x4152
Two desirable properties for a tax system that must specify tax treatments for new financial instruments are consistency and universality. A tax system is universal if the system can designate a tax treatment for any cash flow pattern. Consistency requires that the tax treatment for each cash flow pattern be unique. A third property, linearity, holds if dividing the cash flow into different combinations of securities will not affect the tax treatment.
One way to achieve consistency and universality is to construct a tax system with a single systematic pattern of taxation, such as cash flow taxation or accretion taxation. But this extreme degree of homogeneity is not necessary. Consistent and universal tax systems can harbor radically different treatments for different types of transactions. "Bifurcation approaches" divide a new financial instrument into certain prototype transactions with known tax treatments. The tax treatment for the new instrument is the sum of the tax treatments of the prototypes that sum up to the instrument. "Integration approaches" use rules that tax aggregates of instruments within the taxpayer's portfolio.
Bifurcation methods have a natural connection to linearity. These methods will not achieve consistency and universality in a nonlinear setting unless they are accompanied by elements of an integration approach. All universal and linear tax systems can be generated by "the spanning method," a specific kind of bifurcation.
Spanning method approaches are only a subclass of a broader set of integration approaches that achieve consistency and universality. In evaluating integration approaches, a key property is continuity, the requirement that tax treatments do not jump in response to small changes in any given portfolio. Continuity is a generalization of consistency. The existence of jumps leads to the possibility of serious tax manipulation of the same sort that would arise from inconsistencies.
The current U.S. tax system includes some direct inconsistencies. That is, the same transaction can be packaged different ways to achieve different tax results. These inconsistencies can only be eliminated by fundamental reform. Even the most powerful integration approaches cannot address the problem of direct inconsistencies.
This fact raises difficulties for authorities such as the Treasury Department and the courts who have only low level reform at their disposal. In promulgating regulations or deciding cases that involve new financial instruments, these authorities must choose rules using a second best approach. Loose ends in the form of inconsistencies or lack of universality are inevitable.https://resolver.caltech.edu/CaltechAUTHORS:20170828-144012134Ethnic Heterogeneity, District Magnitude, and the Number of Parties
https://resolver.caltech.edu/CaltechAUTHORS:20170828-163253963
DOI: 10.7907/w0tap-b7q51
Recent events leading to the importation of democratic ideas and ideals by previously totalitarian states increase our interest in the ways in which electoral institutions influence party systems. However, even if we restrict our attention to Eastern Europe or the successor states of the Soviet empire, we encounter a range of social diversity - ethnic heterogeneity - that is as great as that in the set of countries examined in earlier studies that seek to identify the influence of electoral laws (cf. Rae, Lijphart, and Taagepera and Shugart). Curiously, though, these earlier studies fail to ascertain whether and to what extent electoral laws mediate the influence of this heterogeneity. Hence, to develop a more pragmatic understanding of electoral institutions, we adopt the view of electoral laws as intervening structures and, using the data of these earlier analyses, we reconsider the role of one institutional parameter - district magnitude - that some researchers regard as the most important characteristic of an electoral system. Aside from the usual caveats about the limitations of our data, our primary conclusion is that district magnitude is not merely an important determinant of the number of parties that compete in a political system, but that it can offset the tendency of parties to multiply in heterogeneous societies.https://resolver.caltech.edu/CaltechAUTHORS:20170828-163253963Issues and The Presidential Primary Voter
https://resolver.caltech.edu/CaltechAUTHORS:20170824-161119322
DOI: 10.7907/2dzqg-j2c55
Most agree that voting in presidential general elections is largely contingent on the evaluations of the candidates, issues, and parties. Yet in presidential primary elections the determinants of voter choices are less clear. Partisanship is inconsequential, information about candidate personalities and policy positions is scarce, and a fourth factor, expectations, may influence voters. In this paper, we reconsider the influence of political issues in presidential primaries. We argue that past work has not adequately considered how "issues matter" in primary elections. Primaries are intra-party affairs, and the political issues which typically divide the parties are not very relevant in primaries. Instead, we focus on the policy issues each candidate chooses to emphasize in their quest for the nomination, which we call policy priorities. With data gathered about media coverage of the presidential contenders in the 1988 primaries, and using exit poll data from the 1988 Super Tuesday primaries, we show that issues, as policy priorities, do matter in presidential primary elections. This research also implies that primary campaigns matter, since information concerning the policy priorities of the candidates reaches the intended audience.https://resolver.caltech.edu/CaltechAUTHORS:20170824-161119322Fads, Upward Sloping Demand and the Stability of Equilibria in an Experimental Market
https://resolver.caltech.edu/CaltechAUTHORS:20170828-151300217
DOI: 10.7907/8sxef-4m416
The objective of the paper is to study markets in which the value of the activity to any one person increases with the level at which the activity is undertaken by others. The general interpretation could be fads, mimicking behavior or some sort of belief formation process in which the beliefs or expectations of agents about some underlying state of nature are influenced by the buying behavior of other agents. The result is to create a market that can be modeled as having an upward sloping market demand curve. The questions posed are: (1) in the fad-like environment does the classical concept of equilibrium (as an equating of market demand and market supply) accurately predict market behavior; (2) can both stable and unstable equilibria be observed; and (3) which of the two classical concepts of stability best describes the conditions under which instability is observed?
The results of the paper confirm some of the major findings of Plott and George, who studied a similar environment with a downward sloping supply. In a market organized by the multiple unit double auction (MUDA), equilibration at a demand-equals-supply equilibrium is observed under the conditions of a fad. Disequilibrium follows the dynamics of the Marshallian model as opposed to the Walrasian model.https://resolver.caltech.edu/CaltechAUTHORS:20170828-151300217General Equilibrium, Markets, Macroeconomics and Money in a Laboratory Experimental Environment
https://resolver.caltech.edu/CaltechAUTHORS:20170825-135222487
DOI: 10.7907/hbe2k-nn561
This paper reports on the use of laboratory experimental techniques to create relatively complete economic systems. The creation of these market systems reflects a first attempt to explore the nature of inherently interdependent environments and to assess the ability of simultaneous-equations equilibrium models, like the classical static general competitive equilibrium model, to predict aspects of system behavior. In addition, the impact of the quantity of a fiat money was studied. The economies were successfully created. Classical models capture much of what was observed.https://resolver.caltech.edu/CaltechAUTHORS:20170825-135222487Less Filling, Tastes Great: The Realist-Neoliberal Debate
https://resolver.caltech.edu/CaltechAUTHORS:20170828-153031000
DOI: 10.7907/zr4b6-39b34
This essay examines and reformulates the realist-neoliberal debate. Focusing initially on the issue of the attribution of goals to states, we argue not only that goals are merely the epiphenomena of other things but also that their specification constitutes but a re-description of strategic environments. That is, although an attribution of goals may contribute to our characterization of outcomes, a discussion of them is not central to the development of a theory that explains and predicts the outbreak of conflict and the patterns of cooperation. Instead we argue that the realist-neoliberal debate should be recast so that our central research agenda is the development of substantively specific models that allow us to ascertain how the equilibrium to a game in which states structure international affairs influences the types of issue-specific subgames states play, how countries coordinate to equilibria of different types, how we can characterize the coordination problems associated with different equilibria, how states can enhance the attractiveness of an equilibrium, and how states can signal commitments to the strategies that are part of that equilibrium.https://resolver.caltech.edu/CaltechAUTHORS:20170828-153031000Uncertainty and Political Perceptions
https://resolver.caltech.edu/CaltechAUTHORS:20170825-133724656
DOI: 10.7907/xbn07-tae09
The world of politics is uncertain. Citizens are only imperfectly informed about current governmental actions and about the promises of politicians regarding future courses of public policy. Politicians and candidates, moreover, often have incentives to disseminate ambiguous and perhaps inconsistent information. Previous work, both theoretical and empirical, has largely failed to incorporate this uncertainty into the analysis of public opinion and electoral behavior. In this paper we discuss measures designed to elicit the uncertainty survey respondents feel about their political perceptions. These measures demonstrate response patterns which are interpretable and substantively interesting. Also, the response patterns are consistent with a model relating uncertainty to citizen information costs. And last, these measures allow us to understand the stated perceptions of respondents in novel and important ways.https://resolver.caltech.edu/CaltechAUTHORS:20170825-133724656Price Expectations in Experimental Asset Markets with Futures Contracting
https://resolver.caltech.edu/CaltechAUTHORS:20170825-165046753
DOI: 10.7907/vtvad-h9c04
A financial bubble is defined as a condition in which the trading price of an asset is above (and increasing relative to) its discounted present value of earnings, i.e., its fundamental value. Consider the data in Figure 1, where an asset was traded over 15 consecutive trading periods. In the figure, we plot the deviation in mean contract price from the net asset value (NAV) of the security for each of the trading periods. Notice that the price grows steadily, peaks, and then "crashes" to its NAV. There is a puzzle concerning the level of premiums and discounts from NAV for closed-end funds, which provides a challenge to a rational expectations theory of asset pricing, since the data in Figure 1 were generated from an experiment in which the fundamental value of the asset was controlled. There were no external market factors to justify the deviation from fundamental value other than the capital gains expectation of the participants (see Smith, Suchanek and Williams [1988], hereafter referred to as SSW). The phenomenon described in Figure 1 is a very standard outcome of a specific experimental asset market that has been replicated over 70 times with diverse subject pools.https://resolver.caltech.edu/CaltechAUTHORS:20170825-165046753A Draft Constitution for the Russian Federation
https://resolver.caltech.edu/CaltechAUTHORS:20170824-164317036
DOI: 10.7907/z0t1v-6vq54
The draft constitution presented here was prepared to assist an ad hoc committee of Russian academics, politicians, and businessmen in their attempt to write a constitution for the Russian Federation that would overcome the deficiencies of the current constitution and existing official proposals for a new constitution. Russia's current constitution, ratified in 1978, amended over two hundred times in the last two years, and designed almost as if it were intended to generate constitutional crises, requires immediate replacement. A year and a half after it was formed, Russia's Constitutional Reform Commission has not crafted an acceptable alternative. Its proposals include a bill of rights clouded with qualifications that render those rights unenforceable or non-existent. Rights are combined with socialist guarantees that obscure the distinction between limits on the state and social aspirations, and articles are prepared under the premise that constitutions constrain not only the state but citizens as well, including their behavior in such areas as child-rearing and property ownership.https://resolver.caltech.edu/CaltechAUTHORS:20170824-164317036Designing Organizations for Trading Pollution Rights
https://resolver.caltech.edu/CaltechAUTHORS:20170825-143301240
DOI: 10.7907/zvw0j-4sc93
Regulators and academicians have recently become interested in using a marketable permits program as a new way to control aggregate pollution emissions. Our research focuses on choosing a permit trading mechanism that is both economically efficient and politically viable. We consider an organized trading process and a revenue neutral auction, both of which involve an initial allocation of permits based on past history. Each is tested in a competitive and in a non-competitive environment to determine which mechanism performs best. The results of our research suggest that, overall, the organized trading process outperforms the revenue neutral auction.https://resolver.caltech.edu/CaltechAUTHORS:20170825-143301240Policy Moderation or Conflicting Expectations? Testing The Intentional Models of Split-Ticket Voting
https://resolver.caltech.edu/CaltechAUTHORS:20170825-132819904
DOI: 10.7907/ykx1p-w3965
In this paper we examine two models of the electoral origins of divided government. One model is the policy-moderation model, advocated originally by Fiorina (1988, 1992). The other model focuses on the different expectations held by the electorate of the branches of government, as well as the different electoral contexts (congressional and presidential) in which voter decision making occurs (Jacobson 1990a, 1990b). Utilizing individual-level survey data, we test various hypotheses derived from each model. Our empirical results give little support to the policy-moderation model. However, the second model has strong empirical support. We conclude with a discussion of the implications of our results for empirical and normative studies of divided control of government.https://resolver.caltech.edu/CaltechAUTHORS:20170825-132819904Veto Games: Spatial Committees Under Unanimity Rule
https://resolver.caltech.edu/CaltechAUTHORS:20170824-163710512
DOI: 10.7907/bqcs9-c3k77
There exists a large literature on two-person bargaining games and distribution games (or divide-the-dollar games) under simple majority rule, where in equilibrium a minimal winning coalition takes full advantage of everyone else. Here we extend the study to an n-person veto game where players take turns proposing policies in an n-dimensional policy space and everybody has a veto over changes in the status quo. Briefly, we find a Nash equilibrium where the initial proposer offers a policy in the intersection of the Pareto optimal set and the Pareto superior set that gives everyone their continuation values, and punishments are never implemented. Comparing the equilibrium outcomes under two different agendas—sequential recognition and random recognition—we find that there are advantages generated by the order of proposal under the sequential recognition rule. We also provide some conditions under which the players will prefer to rotate proposals rather than allow any specific policy to prevail indefinitely.https://resolver.caltech.edu/CaltechAUTHORS:20170824-163710512Trading in a Pure Exchange Economy without an Auctioneer: An Experimental Approach
https://resolver.caltech.edu/CaltechAUTHORS:20170828-140533521
DOI: 10.7907/fy9vd-3v732
This paper explores alternatives to the Walrasian Auctioneer for the allocation of resources in a pure exchange economy lacking a bilateral coincidence of wants. We have created three different computerized trading processes called BARTER, NUMERAIRE, and CARE (acronym for Computer Assisted Resource Exchange). CARE is a "smart market" in the sense that it contains computer algorithms that assist users in finding a coincidence of wants. The experimental results show that CARE outperforms BARTER and NUMERAIRE by extracting most of the gains from exchange with fewer contracts, lower volume, smaller utility swings, and lower variances in final utility positions.https://resolver.caltech.edu/CaltechAUTHORS:20170828-140533521Perception and Misperception: Constituent Knowledge of Their Representative's Persian Gulf War Vote
https://resolver.caltech.edu/CaltechAUTHORS:20170825-135911186
DOI: 10.7907/43gk8-wxh23
Many assert that constituents know little of their elected official's behavior, especially how their representatives have voted on specific legislative bills. When legislation concerns foreign affairs, the prospects for constituent knowledge are usually asserted to be even bleaker. We challenge these assertions. Our challenge is based on an intensive analysis of one highly salient roll call vote: the House and Senate votes on the January 14, 1991 "Use of Force Resolution." Using data from the 1990-1991 Panel Study of the Political Consequences of War we examine constituent perceptions of House and Senate member "Use of Force Resolution" votes. We find that aggregate perceptions of senator votes vary according to the senator's party, his or her tenure and past electoral competitiveness. At the individual level, we find that, while more informed constituents had more accurate perceptions, many less-informed citizens were able to use readily-understandable cues in developing their perceptions.https://resolver.caltech.edu/CaltechAUTHORS:20170825-135911186Competitive Solutions and Uniform Competitive Solutions for Cooperative Games
https://resolver.caltech.edu/CaltechAUTHORS:20170823-145603278
DOI: 10.7907/9x7c6-vdf58
The competitive solution and the uniform competitive solution are different solution concepts for cooperative games. The first was introduced by McKelvey, Ordeshook and Winer (1978); the second is proposed in the present paper, though it derives from the same original ideas.
A (uniform) competitive solution is a finite configuration of effective payoff vectors, each of them associated with a coalition. This configuration satisfies two fundamental requirements: internal stability and external stability. Despite the existing differences, the competitive solutions and the uniform competitive solutions have some common properties and are also related to some other classical solution concepts.
The uniform competitive solution is a modified version of the competitive solution. Some disadvantages of the earlier concept are removed, and existence theorems for very large classes of both transferable utility and non-transferable utility games are provided.https://resolver.caltech.edu/CaltechAUTHORS:20170823-145603278Strategic Manipulability is Inescapable: Gibbard-Satterthwaite without Resoluteness
https://resolver.caltech.edu/CaltechAUTHORS:20170828-150133697
DOI: 10.7907/mq19n-p5k14
The Gibbard-Satterthwaite Theorem on the manipulability of collective-choice procedures treats only resolute procedures. Few real or reasonable procedures are resolute. We prove a generalization of Gibbard-Satterthwaite that covers the nonresolute case. It opens harder questions than it answers about the prediction of behavior and outcomes and the design of institutions.https://resolver.caltech.edu/CaltechAUTHORS:20170828-150133697A Dynamic Migration Model with Uncertainty
https://resolver.caltech.edu/CaltechAUTHORS:20170825-134514585
DOI: 10.7907/h6gfj-xhx72
We study a dynamic version of the Harris-Todaro migration model where a finite population of infinitely lived Bayesian agents choose consumption and migration decision rules as a function of their histories. The agents do not know the production functions in the two sectors and learn about them through wage draws that they receive from the stochastic production functions. The government knows the true production functions but is uninformed about the agents' beliefs and the actual wage draws they observe. The government maximizes its welfare function using wage subsidies in the two sectors and a migration tax. We solve the agents' dynamic programming problem, and then use the solution to solve the government's dynamic programming problem. We study the effects of government policies on the population distribution, and illustrate the model by numerically solving a particular parametric example.https://resolver.caltech.edu/CaltechAUTHORS:20170825-134514585The Design of Coordination Mechanisms and Organizational Computing
https://resolver.caltech.edu/CaltechAUTHORS:20170825-145611518
DOI: 10.7907/y75p1-g1790
We provide an introduction to a theory of coordination mechanism design and show how to apply it to an assignment problem. The purpose is to introduce those familiar with organizational computing, but unfamiliar with game theory and economics, to the subject. We also describe briefly how we can test new mechanisms before taking them into the field. Finally, we raise some unresolved research questions.https://resolver.caltech.edu/CaltechAUTHORS:20170825-145611518When Do Campaigns Matter? Informed Votes, the Heteroscedastic Logit and the Responsiveness of Electoral Outcomes
https://resolver.caltech.edu/CaltechAUTHORS:20170828-154104341
DOI: 10.7907/jqtw3-sf596
Previous research suggests that voters in mass elections tend to be badly informed. If these voters do not know enough about the relationship between the policy consequences of electoral outcomes and their own interests, then electoral outcomes may not provide meaningful expressions of voter interests. Can campaign activity affect the relationship between voter interests and electoral outcomes? To answer this question, we use survey data from 35 comparable elections and a new empirical methodology (Dubin and Zeng's [1991] heteroscedastic logit). The new methodology allows us to estimate the joint effect of voter information and interests on voting behavior in a way that is both theoretically justifiable and better at explaining the available data than traditional methods. We find that campaign activity increases the likelihood that electoral outcomes are responsive to (perhaps, otherwise badly informed) voter interests, when campaigners are able to exert costly and observable effort, are able to make credible statements and have the opportunity to engage in a vigorous and competitive campaign.https://resolver.caltech.edu/CaltechAUTHORS:20170828-154104341Discrete Pricing and Institutional Design of Dealership Markets
https://resolver.caltech.edu/CaltechAUTHORS:20170825-151830431
DOI: 10.7907/ng8gs-j2119
This paper models trade in dealership markets when the price grid is in discrete units. Strategic interaction among market makers is complex: Because prices are no longer determined by a zero expected profits condition, priority rules and the timing of offers—do market makers submit price schedules first, or do traders first submit their orders and then market makers set prices—have significant effects on equilibrium outcomes. Discreteness effectively limits competition and permits market makers to offer profitable quotes. In order-driven institutions where traders first submit orders, absolute time priority leads to the "best" price schedule, one which is "better" than that obtained from quote-driven institutions where brokers submit schedules first. This may explain the institutional structure of the NYSE.https://resolver.caltech.edu/CaltechAUTHORS:20170825-151830431In or Out?: Centralization by Majority Vote
https://resolver.caltech.edu/CaltechAUTHORS:20170822-164842884
DOI: 10.7907/kqm1m-g9v70
We present a positive theory of centralization of political decisions. Voters choose centralization or decentralization depending on their forecast of the political organization that will favor the policies they prefer. We study the induced preferences for centralization as well as the results of different forms of referenda.https://resolver.caltech.edu/CaltechAUTHORS:20170822-164842884Intraday Trade in Dealership Markets
https://resolver.caltech.edu/CaltechAUTHORS:20170824-151719863
DOI: 10.7907/ejv10-v2x89
We develop and test a structural asymmetric information transaction model to characterize the price impact of information when markets are thin. Since orders are accepted individually, the model allows for transaction costs and brokerage fees. Equilibrium demands mixed entry strategies on the part of potentially-informed traders. Estimation of the structural parameters is performed using a maximum likelihood procedure on NYSE data. The price impact of information is found to be positive and significant, but economically small. This is because while the amount of private information is substantial, the quality of the information signals is quite poor. Insiders do not trade small quantities, which suggests that their ability to divide orders is limited by transaction costs.https://resolver.caltech.edu/CaltechAUTHORS:20170824-151719863Testing Dividend Signalling Models
https://resolver.caltech.edu/CaltechAUTHORS:20170825-163157890
DOI: 10.7907/ffbr8-10564
This paper derives a key monotonicity property common to all dividend signaling models: the greater the rate that dividend income is taxed relative to capital gains income, the greater the value of information revealed by a given dividend, and hence the greater the associated excess return. This monotonicity condition is tested with robust non-parametric techniques. No evidence is found to support dividend signaling models. The same results are inconsistent with tax-based CAPM arguments.https://resolver.caltech.edu/CaltechAUTHORS:20170825-163157890Competitive Campaigns and the Responsiveness of Collective Choice
https://resolver.caltech.edu/CaltechAUTHORS:20170828-155317851
DOI: 10.7907/38v6t-m8153
We analyze a model of direct legislation to identify conditions under which competition in the provision of campaign information can affect the responsiveness of electoral outcomes to the preferences that a voter (or set of voters) would express if she (they) knew everything there was to know about the consequences associated with her electoral alternatives. The basic intuition underlying the model is that a voter's ability to use campaign information to form accurate inferences about the consequences of competing electoral alternatives can be affected by information provider attributes. We show that competition in the provision of campaign information increases the responsiveness of electoral outcomes only if competition produces these attributes.https://resolver.caltech.edu/CaltechAUTHORS:20170828-155317851Binary Relations, Numerical Comparisons with Errors and Rationality Conditions for Choice
https://resolver.caltech.edu/CaltechAUTHORS:20170828-160548168
DOI: 10.7907/136db-xpg65
[No abstract]https://resolver.caltech.edu/CaltechAUTHORS:20170828-160548168Russia's Transition to Democracy: Essays 11-18
https://resolver.caltech.edu/CaltechAUTHORS:20170825-144512752
DOI: 10.7907/1nes8-bby40
The eight essays contained herein are the second in a series prepared for translation into Russian and publication in Moscow's Independent Gazette. Their translation into Russian will incorporate a number of details particular to Russia that are not included in the current English versions. In any event, these essays are predicated on the assumption that Russians know democracy only in superficial and sometimes inaccurate ways - that they fail to appreciate the interrelationships of constitutional institutions, of extraconstitutional structures, and of the give-and-take of democratic process that sometimes seems chaotic to those unaccustomed to the interplay of these institutions and structures. Support for this project was provided by the University of Maryland's project on Institutional Reform and The Informal Sector (IRIS). We also would like to thank Olga Shvetsova for her suggestions on topics that require coverage and her comments on the essays' specific content.https://resolver.caltech.edu/CaltechAUTHORS:20170825-144512752Firing in Non-Repeated Incentive Contracts
https://resolver.caltech.edu/CaltechAUTHORS:20170828-161619896
DOI: 10.7907/f6kh2-37503
When the presence of limited liability restricts a principal from imposing monetary fines on an agent in case of poor performance, the principal might use other kinds of punishment threats to deter the agent from shirking. If firing is costly to the agent it can be used by the principal even in non-repeated contracts. This paper considers the conditions under which a profit-sharing arrangement combined with a certain firing rule improves the principal's position compared to the situation in which firing is not an option. The optimal firing rule is established, and its effectiveness is considered as a function of exogenous economic variables. Possible applications of the proposed contract are suggested.https://resolver.caltech.edu/CaltechAUTHORS:20170828-161619896Are Legislators Afraid of Initiatives? Anticipation and Reaction in the Policy Process
https://resolver.caltech.edu/CaltechAUTHORS:20170824-151040280
DOI: 10.7907/qx9s4-4e470
This research considers how and when the popular initiative constrains legislative behavior and policy. I develop a spatial model of the policy process in which legislators anticipate the possibility that an initiative may be proposed in response to laws they pass. I use the model to identify conditions under which the initiative forces legislators to respond to citizen preferences. I conclude that features of the initiative process, especially electoral laws that affect the costs of proposing initiatives, as well as the preferences of political actors, largely determine whether legislators will be constrained.https://resolver.caltech.edu/CaltechAUTHORS:20170824-151040280An Experimental Examination of the Walrasian Tatonnement Mechanism
https://resolver.caltech.edu/CaltechAUTHORS:20170828-135118421
DOI: 10.7907/mwfh9-3n604
Joyce (1984) reports results of experiments on a Walrasian tatonnement auction that show that the mechanism is stable, exhibits strong convergence properties, and generates efficiency averaging better than 97%. He also found that when subjects could see part of the order flow (excess demand), price tended to be lower (favorable to buyers). His experiments consisted of a stationary environment where subjects were provided with single-unit supply and demand functions. This paper assesses the robustness of his results in a more complex setting and systematically investigates the effect of various order flow information and message restriction rules on the performance of the Walrasian mechanism. In particular, our subjects were provided with multi-unit demands and supplies where equilibrium price and subject values or costs were changed each trading period.https://resolver.caltech.edu/CaltechAUTHORS:20170828-135118421Asset Prices and Volume in a Beauty Contest
https://resolver.caltech.edu/CaltechAUTHORS:20170825-155217920
DOI: 10.7907/ycpe4-67t20
The dynamics of prices and volume are investigated in a market where agents disagree about the fundamental value of the asset. The distribution of beliefs is not taken to be common knowledge. The resulting infinite hierarchy of beliefs is solved by making the assumption that, prior to the first trading round, agents consider themselves to be average. Speculation is shown to generate substantial volatility and volume, bid and transaction price predictability, rich patterns of volume, and an inverse relationship between changes in transaction prices and the number of trading rounds without volume.https://resolver.caltech.edu/CaltechAUTHORS:20170825-155217920The Rational Expectations-n(ϵ)ϵ-Equilibrium
https://resolver.caltech.edu/CaltechAUTHORS:20170828-135832080
DOI: 10.7907/8hfva-ej356
In the rational expectations paradigm, one solves models of a large number of agents who optimize subject to a stochastic law of motion by assuming that all agents know that law of motion. If the agents do not know that law of motion perfectly, one needs a learning model. This paper follows the optimal learning literature by assuming that each agent constructs priors about the unknowns of the problem, and then updates those priors using the Bayes updating rule. The agents need to construct priors on the distribution of other agents' priors, and then on the distribution of priors on the distribution of priors, and so on, leading to an infinite hierarchy of beliefs. The existence of an optimal response given the current state vector and hierarchy of beliefs is proved. It is then shown that the resulting equilibrium, labeled the Rational Expectations-∞ equilibrium, can be approximated by an ϵ-equilibrium where the infinite hierarchy is truncated at some level n(ϵ), and each agent believes that all of his higher level beliefs are concentrated at their true values.https://resolver.caltech.edu/CaltechAUTHORS:20170828-135832080The Groves-Ledyard Mechanism: An Experimental Study of Institutional Design
https://resolver.caltech.edu/CaltechAUTHORS:20170823-150747577
DOI: 10.7907/ammfx-20z29
The Groves-Ledyard mechanism theoretically can solve the 'free-rider' problem in public good provision in certain environments. Two questions are of overriding importance in implementing the mechanism. The first is related to the actual performance of the mechanism in general. The second is the choice of a 'punishment parameter', γ, which is the only parameter that is available for those who may want to actually use the mechanism. Thus the determination of the role of this variable on mechanism performance is fundamental for any advances along the lines of actual implementation. In studying the Groves-Ledyard mechanism, we show that the punishment parameter, γ, plays a crucial role in the performance of the mechanism. By using γ = 1 and 100, we show that under the higher punishment parameter, the Groves-Ledyard equilibrium is chosen much more frequently; a higher level of the public good is provided and efficiency is higher. By examining two behavioral models, we show that a higher γ leads to an increase in the probability of an individual choosing a best response predicted by the model. The parameter γ alone explains nearly 70% of the data in both the Cournot and the Carlson-Auster behavioral model. We also found that convergence to Cournot behavior is faster and more stable under a high γ than under a low γ.https://resolver.caltech.edu/CaltechAUTHORS:20170823-150747577Russia's Transition to Democracy: Essays 1-10
https://resolver.caltech.edu/CaltechAUTHORS:20170828-132542878
DOI: 10.7907/7wem4-mrn38
The ten essays contained herein are the first in a series of twenty prepared for translation into Russian and publication in Moscow's Independent Gazette. Their translation into Russian will incorporate a number of details particular to Russia that are not included in the current English versions. In any event, these essays are predicated on the assumption that Russians know democracy only in superficial and sometimes inaccurate ways - that they fail to appreciate the interrelationships of constitutional institutions, of extra-constitutional structures, and of the give-and-take of democratic process that sometimes seems chaotic to those unaccustomed to the interplay of these institutions and structures. Support for this project was provided by the University of Maryland's project on Institutional Reform and The Informal Sector (IRIS). We also would like to thank Olga Shvetsova for her suggestions on topics that require coverage and her comments on the essays' specific content.https://resolver.caltech.edu/CaltechAUTHORS:20170828-132542878Uncovering Behavioral Strategies: Likelihood-Based Experimental Data Mining
https://resolver.caltech.edu/CaltechAUTHORS:20170824-160040708
DOI: 10.7907/evcyp-qez40
Economists and psychologists have recently been developing new theories of decision making under uncertainty that can accommodate the observed violations of standard statistical decision theoretic axioms by experimental subjects. We propose a procedure which finds a collection of decision rules that best explain the behavior of experimental subjects. The procedure is a combination of maximum likelihood estimation of the rules together with an implicit classification of subjects to the various rules, and a penalty for having too many rules. We prove that our procedure yields consistent estimates (as the number of tasks per subject and the number of subjects go to infinity) of the number of rules being used, the rules themselves, and the proportion of our population using each of the rules. We apply our procedure to data on probabilistic updating by subjects in four different universities. We get remarkably robust results which show that the most important rules used by the subjects are Bayes rule, representativeness rule (ignoring the prior), and conservatism (over-weighting the prior). In our procedure, the subjects are allowed to make errors, and our estimated error rate is typically 20%.https://resolver.caltech.edu/CaltechAUTHORS:20170824-160040708Government Partisanship, Labor Organization and Macroeconomic Performance: A Corrigendum
https://resolver.caltech.edu/CaltechAUTHORS:20170824-162753007
DOI: 10.7907/5f7wb-bjt75
Alvarez, Garrett and Lange (1991) used cross-national panel data on the OECD nations to show that countries with left governments and encompassing labor movements enjoyed superior economic performance. Here we show that the standard errors reported in that article are incorrect. Re-estimation of the model using ordinary least squares and robust standard errors shows that the major finding of Alvarez, Garrett and Lange, regarding the political and institutional causes of economic growth, is upheld but the findings for unemployment and inflation are open to question. We show that the model used by Alvarez, Garrett and Lange, feasible generalized least squares, cannot produce standard errors when the number of countries analyzed exceeds the length of the time period under analysis. Also, we argue that ordinary least squares with robust standard errors is superior to feasible generalized least squares for typical cross-national panel studies.https://resolver.caltech.edu/CaltechAUTHORS:20170824-162753007Dynamic Consistency Implies Approximately Expected Utility Preferences
https://resolver.caltech.edu/CaltechAUTHORS:20170828-143106635
DOI: 10.7907/yxwaw-esd54
Machina has proposed a definition of dynamic consistency which admits non-expected utility functionals. We show that even under this new definition, a dynamically consistent preference relation that is differentiable becomes arbitrarily close to an expected utility preference after the realization of a low probability event.https://resolver.caltech.edu/CaltechAUTHORS:20170828-143106635Covers
https://resolver.caltech.edu/CaltechAUTHORS:20170823-141457013
DOI: 10.7907/5hffk-6gp53
This paper introduces the theory of covers for functions defined over binary variables. Covers formalize the notion of decomposability. Large complex problems are decomposed into subproblems each containing fewer variables, which can then be solved in parallel. Practical applications of the benefits from decomposition include the parallel architecture of supercomputers, the divisionalization of firms, and the decentralization of economic activity. In this introductory paper, we show how covers also shed light on the choice among public projects with complementarities. Further, covers provide a measure of complexity/decomposability with respect to contour sets, allowing for nonlinear effects which occur near the optimum to receive more weight than nonlinear effects arbitrarily located in the domain. Finally, as we demonstrate, covers can be used to analyze and to calibrate search algorithms.https://resolver.caltech.edu/CaltechAUTHORS:20170823-141457013The Extraction of Information From Multiple Point Estimates
https://resolver.caltech.edu/CaltechAUTHORS:20170825-142324974
DOI: 10.7907/rm31z-kcz44
The use of a number of point estimation experiments to construct a least informative prior subject to the information in the estimation experiments is studied. The form of the resulting prior is established. The prior depends on parameters that result from solving a calculus of variations problem. It is shown that simple Gibbs sampler algorithms converge to the desired solution, and simulated annealing algorithms yield the mode of the prior. The Gibbs sampler algorithm is ergodic, and hence Bayes risks can be directly computed using time averages of a single series of draws from the sampler.https://resolver.caltech.edu/CaltechAUTHORS:20170825-142324974Transaction Prices When Insiders Trade Portfolios
https://resolver.caltech.edu/CaltechAUTHORS:20170825-151049012
DOI: 10.7907/nz7b5-k7z07
Statistical properties of transaction prices are investigated in the context of a multi-asset extension of Kyle [1985]. Under the restriction that market makers cannot condition prices on volume in other markets, Kyle's model is shown to be consistent with the well-documented lack of predictability of individual asset prices, positive autocorrelation of index returns, and low cross-sectional covariance. The covariance estimator of Cohen et al. [1983] provides the right estimates of the "true" covariance. However, Kyle's model cannot explain the asymmetry and rank deficiency of the matrix of first-order autocovariances. Asymmetry obtains when the insider limits his strategies to trading a set of pre-determined portfolios. If these portfolios are well-diversified, the matrix of first-order autocovariances is asymptotically rank-deficient. If the insider uses only one portfolio (as when "timing the market"), its asymptotic rank equals one, conforming to the empirical results in Gibbons and Ferson [1985].https://resolver.caltech.edu/CaltechAUTHORS:20170825-151049012Arbitrage and Existence of Equilibrium in Infinite Asset Markets
https://resolver.caltech.edu/CaltechAUTHORS:20170828-132928163
DOI: 10.7907/w548r-cpv88
This paper develops a framework for a general equilibrium analysis of asset markets when the number of assets is infinite. Such markets have been studied in financial economics in the context of asset pricing theories. A distinctive feature of an equilibrium model of asset markets is that investors' portfolio-choice sets are typically not bounded below. We prove that an equilibrium exists under a condition that markets are arbitrage-free. The markets are arbitrage-free if there is a price system under which no investor has an arbitrage opportunity. The concept of an arbitrage opportunity used in this paper differs from the standard concept of an arbitrage portfolio in financial markets, which is a portfolio that guarantees a non-negative payoff in every event, a positive payoff in some event, and has zero price. We provide an extensive discussion of concepts of an arbitrage opportunity.https://resolver.caltech.edu/CaltechAUTHORS:20170828-132928163The Spending Game: Money, Votes, and Incumbency in Congressional Elections
https://resolver.caltech.edu/CaltechAUTHORS:20170824-153709478
DOI: 10.7907/a9wsd-5xx59
This paper takes a game-theoretic approach to the analysis of the spending-votes relationship in Congressional elections to reinvestigate the surprisingly weak effects of incumbent spending measured in previous studies. Rather than focusing narrowly on the impact of spending on electoral outcomes, we attempt to take account of the reciprocal effect of (anticipated) closeness on spending using several statistical approaches. We also offer improvements in the specification and measurement of the vote equation, by using a better measure of district party strength adjusted for year-effects, and by including a variable that measures the heat of the campaign in terms of total spending by the incumbents and challengers. The latter measure partially corrects for the simultaneously determined (and highly positively correlated) levels of incumbent and challenger spending. A more rigorous multiequation simultaneous equations model, identified by uncorrelated errors, provides even more leverage for sorting out the effects of incumbent and challenger spending on votes. That analysis indicates (in a complete turnaround from findings reported elsewhere) that incumbent spending effects are highly significant and of a magnitude that is, if anything, greater than challenger spending effects. The paper concludes by using a game theoretic model to estimate the effect of anticipated closeness on spending and to estimate differences in campaign financing costs between incumbents and challengers.https://resolver.caltech.edu/CaltechAUTHORS:20170824-153709478Subjective Games and Equilibria
https://resolver.caltech.edu/CaltechAUTHORS:20170823-134424901
DOI: 10.7907/wgnmq-mgm72
Applying the concepts of Nash, Bayesian, or correlated equilibrium to the analysis of strategic interaction requires that players possess objective knowledge of the game and opponents' strategies. Such knowledge is often not available.
The proposed notions of subjective games, and subjective Nash and correlated equilibria, replace unavailable objective knowledge with subjective assessments. When playing such a game repeatedly, subjective optimizers will converge to a subjective equilibrium. We apply this approach to some well-known examples, including a single multi-arm bandit player, multi-person multi-arm bandit games, and repeated Cournot oligopoly games.https://resolver.caltech.edu/CaltechAUTHORS:20170823-134424901Judging the Rationality of Decisions in the Presence of Vague Alternatives
https://resolver.caltech.edu/CaltechAUTHORS:20170825-140609961
DOI: 10.7907/cjf91-8xj89
The standard framework of decision theory is subjected to partial revision with regard to the use of the notion of an alternative. An approach to judging the rationality of a decision-maker's behavior is suggested for various cases of incomplete observability and/or controllability of alternatives. The approach stems from the conventional axiomatic treatment of rationality in general choice theory and proceeds by modifying the description of alternative modes of behavior into a generalized model that requires no explicit consideration of alternatives. Criteria of rationality in the generalized decision model are proposed. For the conventional model in choice theory, these criteria reduce to the well-known criteria of the regularity (binariness) of choice functions. Game and economic examples are considered.https://resolver.caltech.edu/CaltechAUTHORS:20170825-140609961The Holdout Game: An Experimental Study of an Infinitely Repeated Game with Two-Sided Incomplete Information
https://resolver.caltech.edu/CaltechAUTHORS:20170829-134803360
DOI: 10.7907/h5sgq-1q935
We investigate experimentally a two-person infinitely repeated game of incomplete information. In the stage game, each player chooses to give in or hold out. Players have privately known costs of giving in, and each player receives a fixed benefit whenever at least one player gives in. High-cost players have a dominant strategy in the stage game to hold out, and low-cost players' best response depends on what the opponent does. Equilibrium play in the infinitely repeated game conveys information about the players' types.
We investigate two questions: whether there is any evidence that subject behavior approximates belief-stationary equilibria, and whether there is evidence that subjects converge to an equilibrium of the correct state. We conclude that subjects do not adopt symmetric belief-stationary strategies for the holdout game. However, we cannot reject the hypothesis that subjects converge towards eventually playing an equilibrium of the correct state (even though they do not always learn the correct state). Behavior of experienced subjects is closer to the predictions of symmetric belief-stationary equilibrium.https://resolver.caltech.edu/CaltechAUTHORS:20170829-134803360The Use and Misuse of Surveys in Economic Analysis: Natural Resource Damage Assessment Under CERCLA
https://resolver.caltech.edu/CaltechAUTHORS:20170830-153743263
DOI: 10.7907/t68nv-jj920
This paper examines problems with the admissibility of contingent use methodology surveys in natural resource damage assessment cases under the Comprehensive Environmental Response, Compensation and Liability Act of 1980 (CERCLA), as well as the propriety of their use in formulating public policy. Using a contingent use survey conducted in conjunction with the New Bedford Harbor Superfund case and two follow-up surveys, a number of errors and biases associated with contingent use methodology surveys are isolated and analyzed.https://resolver.caltech.edu/CaltechAUTHORS:20170830-153743263Constitutional Secession Clauses
https://resolver.caltech.edu/CaltechAUTHORS:20170822-154623781
DOI: 10.7907/0ck4t-s7q27
Taking the view that constitutions are devices whereby people coordinate to specific equilibria in circumstances that allow multiple equilibria, we show that a constitutional secession clause can serve as such a device and, therefore, that such a clause is more than an empty promise or an ineffectual threat. Employing a simple three-person recursive game, we establish that under certain conditions, this game possesses two equilibria - one in which a disadvantaged federal unit secedes and is not punished by the other units in the federation, and a second equilibrium in which this unit does not secede but is punished if it chooses to do so.https://resolver.caltech.edu/CaltechAUTHORS:20170822-154623781Noisy Signalling in Financial Markets
https://resolver.caltech.edu/CaltechAUTHORS:20170830-163107142
DOI: 10.7907/31151-11489
Separating signaling equilibria of financial markets with anonymous insiders are investigated. Definitions of separating signaling equilibria are extended to allow for the noise that provides anonymity. The role of noise in equilibrium existence results is clarified. In particular, the result of Glosten and Madhavan, that noise is necessary for dealer markets to remain open, is qualified. The separating signaling equilibrium is written as the solution to a central planner's problem. Besides facilitating computation, this formulation highlights (i) the critical nature of incentive compatibility constraints and (ii) the welfare aspects. The former causes many equilibrium price-quantity schedules to be non-linear and non-differentiable. An analysis of the latter leads to the conclusion that Pareto-efficient outcomes can be approximated by a repeated version of an insider game.https://resolver.caltech.edu/CaltechAUTHORS:20170830-163107142Experimental Estimates of the Impact of Wage Subsidies
https://resolver.caltech.edu/CaltechAUTHORS:20170830-141305574
DOI: 10.7907/wpeyq-2kx14
The effects of a wage subsidy program on the duration of insured unemployment are investigated using experimental data. Participation in the experiment was voluntary and about one third of the subjects refused to take the subsidy voucher offered to them. Because subsidies appear to have stigmatic effects which tend to lower participation rates by high-skilled workers, experimental participants have longer average durations of unemployment than non-participants. However, correcting for self-selection, we find that wage subsidies can substantially increase a participant's probability of reemployment. Subsidies are also compared to a search bonus proposal which is also cost effective, but, due to differences in participation patterns, has rather different effects.https://resolver.caltech.edu/CaltechAUTHORS:20170830-141305574The Measure Representation: A Correction
https://resolver.caltech.edu/CaltechAUTHORS:20170830-135419858
DOI: 10.7907/pgv0f-kcm93
Wakker [9] and Puppe [2] point out a mistake in Theorem 1 in Segal [6]. This theorem deals with representing preference relations over lotteries by the measure of their epigraphs. The error is that the theorem gives the wrong conditions concerning the continuity of the measure. This paper corrects the error. Another problem is that the axioms do not imply that the measure is bounded; therefore, the measure representation applies only to subsets of the space of lotteries, although these subsets can become arbitrarily close to the whole space of lotteries. Some additional axioms (Segal [6, 7]), implying that the measure is a product measure (and hence anticipated utility), also guarantee that the measure is bounded.https://resolver.caltech.edu/CaltechAUTHORS:20170830-135419858Economies of Scale, Natural Monopoly and Imperfect Competition in an Experimental Market
https://resolver.caltech.edu/CaltechAUTHORS:20170830-143409554
DOI: 10.7907/s644e-hvw69
This paper reports on the behavior of markets in which all agents have identical costs with economies of scale over the entire range of demand. Each firm, by choosing a larger scale of plant and a larger volume, can experience lower average cost. Thus the markets are characterized by the fundamental technological property that has motivated decades of theorizing about natural monopoly and imperfect competition. The primary question posed by the research is whether or not a natural monopoly emerges and sets prices at monopoly levels or whether the data are more closely approximated by some alternative model of imperfect competition such as monopolistic competition, Cournot oligopoly or contestable market theory. The results are that monopoly emerges and charges prices closely approximated by contestable market theory. No support is found for Cournot forms of oligopoly or for other types of monopolistic competition.https://resolver.caltech.edu/CaltechAUTHORS:20170830-143409554Altruism, Reputation, and Noise in Linear Public Goods Experiments
https://resolver.caltech.edu/CaltechAUTHORS:20170823-154709009
DOI: 10.7907/1rdj0-77t87
We report the results of a public goods experiment using a design that enables us to directly measure individual response functions in voluntary contributions games and estimate error rates. In addition, following Andreoni (1988), we employ two treatments in order to measure the extent to which voluntary contribution is due to reputation effects. The partners treatment involves a fixed group of subjects playing a repeated game. The strangers treatment approximates a one-shot game by randomly changing group assignments after each play. Our data show that essentially the only difference between the two treatments is the amount of noise in the data, with the strangers treatment being the noisier of the two. This noise manifests itself in two distinct ways. First, there is more variation of decision rules across subjects in the strangers treatment. Second, individual behavior is, on average, less consistent with a cutpoint decision rule in the strangers treatment, which produces higher estimates of individual error rates. The differences between the strangers and partners data are virtually the same as differences between data from experienced and inexperienced subjects. This suggests an explanation for the finding in Andreoni (1988) that there was greater contribution under the strangers treatment in the standard homogeneous environment in which one direction of errors (under-contribution) is censored. Our results also support his conclusion that reputation effects do not appear to play a prominent role in repeated linear public goods voluntary contribution games. Many findings from past public goods experiments are consistent with our model of random variation in a population of subjects who are, on average, neither altruistic nor spiteful.https://resolver.caltech.edu/CaltechAUTHORS:20170823-154709009Constitutional Stability
https://resolver.caltech.edu/CaltechAUTHORS:20170830-140538063
DOI: 10.7907/dg30h-t7q67
Despite attempts to paper over the dispute, political scientists in the pluralist tradition disagree sharply with public and social choice theorists about the importance of institutions, and with William Riker in particular, who argues in Liberalism against Populism that the liberal institutions of indirect democracy ought to be preferred to those of populist democracy. This essay reconsiders this dispute in light of two ideas unavailable to Riker at the time. The first, offered by Russell Hardin, is that constitutions can more usefully be conceptualized as coordinating devices as opposed to social contracts. The importance of this idea is that it allows for a more theoretically satisfying view of the way that constitutions become self-enforcing. The second idea, which derives from the various applications of concepts such as the uncovered set, argues that although institutions such as the direct election of the president are subject to the usual instabilities that concern social choice theorists, those instabilities do not imply that "anything can happen" - instead, final outcomes will be constrained, where the severity of those constraints depends on institutional details. We maintain that these ideas strengthen Riker's argument about the importance of such constitutional devices as the separation of powers, bicameralism, the executive veto, and scheduled elections, as well as the view that federalism is an important component of the institutions that stabilize the American political system. We conclude with the proposition that the American Civil War should not be regarded as a constitutional failure, but rather as a success.https://resolver.caltech.edu/CaltechAUTHORS:20170830-140538063Initial Versus Continuing Proposal Power in Legislative Seniority
https://resolver.caltech.edu/CaltechAUTHORS:20170830-153142114
DOI: 10.7907/w9sq4-wzr65
We compare two different seniority systems in a legislature whose sole task is to decide on distributive issues, and which operates under a Baron-Ferejohn recognition rule, where recognition probability is based on seniority. In the first system, called "initial proposal power," recognition probability for the initial proposal is based on seniority, but once the proposal is voted on by the legislature, all members have equal recognition probabilities for any reconsideration. Under the second system, called "continuing proposal power," seniority is used to determine proposal power both in the initial consideration and in any reconsideration. We find that in the case of seniority systems embodying continuing proposal power, there does not exist an equilibrium in which incumbents are reelected, and in which legislators would endogenously choose to impose such a seniority system on themselves. This contrasts with previous results in which we have shown that there does exist such an equilibrium for the case of initial proposal power. The reason for this result is that continuing proposal power lowers the value of senior members, since it makes them less desirable as coalition partners.https://resolver.caltech.edu/CaltechAUTHORS:20170830-151242331Dominant and Nash Strategy Mechanisms for the Assignment Problem
https://resolver.caltech.edu/CaltechAUTHORS:20170830-151242331
DOI: 10.7907/sf9j0-vp733
In this paper, I examine the problem of matching or assigning a fixed set of goods or services to a fixed set of agents. I characterize the social choice correspondences that can be implemented in dominant and Nash strategies when transfers are not allowed. This is an extension of the literature that was begun by Gibbard (1973) and Satterthwaite (1975), who independently proved that if a mechanism is nonmanipulable it is dictatorial. For the classes of mechanisms that are described in the paper, the results imply that the only mechanisms that are implementable in dominant and Nash strategies are choice mechanisms that rely only on ordinal rankings. I also describe a subclass of mechanisms that are Pareto optimal. In addition, the results explain the modeling conventions found in the literature - that when nontransfer mechanisms are studied individuals are endowed with ordinal preferences, and when transfer mechanisms are studied individuals are endowed with cardinal preferences.https://resolver.caltech.edu/CaltechAUTHORS:20170830-151242331Implementation of the Voting Rights Act in North Carolina
https://resolver.caltech.edu/CaltechAUTHORS:20170829-155341565
DOI: 10.7907/y4qqa-hcs79
[No abstract]https://resolver.caltech.edu/CaltechAUTHORS:20170829-155341565A Consistent Test of Stationary Ergodicity
https://resolver.caltech.edu/CaltechAUTHORS:20170829-151422958
DOI: 10.7907/3pgd8-65s84
A formal statistical test of stationary-ergodicity is developed for known Markovian processes on R^d. This makes it applicable to testing models and algorithms, as well as estimated time-series processes when estimation error is ignored. The analysis is conducted by examining the asymptotic properties of the Markov operator on density space generated by the transition in the state space. The test is developed under the null of stationary-ergodicity, and it is shown to be consistent against the alternative of nonstationary-ergodicity. The test can be easily performed using any of a number of standard statistical and mathematical computer packages.https://resolver.caltech.edu/CaltechAUTHORS:20170829-151422958Rules, Discretion, and Accountability in Macroeconomic Policymaking
https://resolver.caltech.edu/CaltechAUTHORS:20170829-161329587
DOI: 10.7907/aqre4-ae727
Arguments for rules rather than discretion in macroeconomic policymaking facilitate the understanding of some fundamental issues of democratic theory. This article reviews four such arguments, and relates them to issues of delegation and accountability in representative government.https://resolver.caltech.edu/CaltechAUTHORS:20170829-161329587Repeated Play, Cooperation and Coordination: An Experimental Study
https://resolver.caltech.edu/CaltechAUTHORS:20170830-132920230
DOI: 10.7907/d4sd9-y8j89
An experiment was conducted to test whether discounted repeated play leads to greater cooperation and coordination than one-shot play, in a public good environment with incomplete information. The experiment was designed so that, theoretically, repeated play can sustain equilibria with higher group earnings than result in the one-shot Bayesian Nash equilibrium. The design varied a number of environmental parameters, including the size of the group, the marginal rate of transformation between the public and private good, and the statistical distribution of marginal rates of substitution between the public and private good. Marginal rates of substitution were private information but the statistical distribution was common knowledge. The results indicate that repetition leads to greater cooperation, and that the magnitude of these gains depends both on the ability of players to monitor each other's strategy and on the underlying environmental parameters.https://resolver.caltech.edu/CaltechAUTHORS:20170830-132920230Lower Bounds on Asset Return Comovement
https://resolver.caltech.edu/CaltechAUTHORS:20170829-144805279
DOI: 10.7907/jek4e-m2m07
Under standard assumptions from dynamic asset pricing theory (value additivity, complete markets, rational expectations, and strict stationarity and ergodicity) and absence of arbitrage, lower bounds on the conditional and unconditional cross-moments of the returns on two assets are derived. They are expressed in terms of the second moment of a linear combination of option premia. The restrictions are probed with data from the foreign exchange market covering the period 1983-1991. Assuming that the value of the economy's benchmark payoff never exceeds one, and substituting linear projection for conditional expectation, several violations of the conditional lower bounds are discovered. The violations are attributed to unit roots in the data.https://resolver.caltech.edu/CaltechAUTHORS:20170829-144805279Public Goods: A Survey of Experimental Research
https://resolver.caltech.edu/CaltechAUTHORS:20170823-160736011
DOI: 10.7907/f3s6c-3pw50
Environments with public goods are a wonderful playground for those interested in delicate experimental problems, serious theoretical challenges, and difficult mechanism design issues. A review is made of various public goods experiments. It is found that the public goods environment is a very sensitive one, with much that can affect outcomes but is difficult to control. The many factors interact with each other in unknown ways. Nothing is known for sure. Environments with public goods present a serious challenge even to skilled experimentalists and many opportunities for imaginative work.https://resolver.caltech.edu/CaltechAUTHORS:20170823-160736011Stationarity and Chaos in Infinitely Repeated Games of Incomplete Information
https://resolver.caltech.edu/CaltechAUTHORS:20170829-135834135
DOI: 10.7907/kz8hk-4de75
Consider an incomplete information game in which the players first learn their own types, and then infinitely often play the same normal form game with the same opponents. After each play, the players observe their own payoff and the action of their opponents. The payoff for a strategy n-tuple in the infinitely repeated game is the discounted present value of the infinite stream of payoffs generated by the strategy. This paper studies Bayesian learning in such a setting. Kalai and Lehrer [1991] and Jordan [1991] have shown that Bayesian equilibria of such games exist and eventually look like Nash equilibria of the infinitely repeated full information game with the correct types. However, due to folk theorems for complete information games, this still leaves the class of equilibria for such games quite large.
In order to refine the set of equilibria, we impose a restriction on the equilibrium strategies of the players which requires stationarity with respect to the profile of current beliefs: if the same profile of beliefs is reached at two different points in time, the players must choose the same behavioral strategy at both points in time. This set, called the belief stationary equilibria, is a subset of the Bayesian Nash equilibria. We compute a belief stationary equilibrium in an example. The equilibria that result can have elements of chaotic behavior. The equilibrium path of beliefs when types are not revealed can be chaotic, and small changes in initial beliefs can result in large changes in equilibrium actions.https://resolver.caltech.edu/CaltechAUTHORS:20170829-135834135Nonlinear Behavior in Sealed Bid First Price Auctions
https://resolver.caltech.edu/CaltechAUTHORS:20170830-142425505
DOI: 10.7907/y5qc6-vnx85
The extent to which the behavior of people is consistent with game theoretic principles is investigated in a first price sealed bid auction environment. Three linear rules of thumb of increasing complexity are used as benchmarks to gauge the accuracy of the Constant Relative Risk Aversion Model (CRRAM). In addition, the CRRAM is tested against a relaxation of the rational expectations hypothesis.
Existing competitive bidding experiments cannot clearly distinguish between game theoretic models and linear markdown rules on an individual level. Within the parametric environments studied and reported in the experimental literature, game theoretic solutions are linear over the range of private values in which bid functions are estimated. In this study, agents drew values from non-uniform distributions. As a result, the game theoretic bidding behavior is nonlinear.
Due to the nonlinearity, special econometric and numerical techniques are applied to solve the model and obtain the estimates. The CRRAM exhibits a good fit to the data: the pseudo-R^2 is greater than 0.8 for 90 percent of the subjects. The CRRAM is more accurate than the Markdown Model (MM) and the Simple Ad hoc Model (SIMAM) but not as accurate as the Sophisticated Ad hoc Model (SOPAM). The data also support the relaxation of the rational expectations hypothesis and suggest that substantial increases in the predictive power of game theoretic models can be gained from improvements in the theory of belief formation.https://resolver.caltech.edu/CaltechAUTHORS:20170830-142425505Was Memphis's Electoral Structure Adopted or Maintained for a Racially Discriminatory Purpose?
https://resolver.caltech.edu/CaltechAUTHORS:20170828-165240217
DOI: 10.7907/vxj2z-wcb60
Two themes run throughout this chronologically organized, extensively documented paper: The first is the pervasiveness of racial issues and racial conflicts in Memphis from 1955 to 1971, the period on which the paper centers. The second is the interconnection between electoral politics and electoral rules. That these are central and tragic themes in southern history is precisely the point: Memphis has never entirely outgrown the worst parts of its southern heritage.
The two themes imply one conclusion: The changes in Memphis's electoral rules in this period came about for racially discriminatory reasons. More particularly, the designated post and majority vote requirements were adopted primarily for the purpose of preventing African-Americans from enjoying a fair opportunity to elect candidates of their choice, and at-large seats on the Memphis City Council and School Board were maintained because of the same racially discriminatory motives. Key elements of the redrawing of election district boundaries in 1971 were probably influenced by a desire to keep as many council seats as possible in white hands.
In the mid- and late-nineteenth century, as in the mid- and late-twentieth, white political and business leaders in Memphis put down political threats from the lower social orders, especially African-Americans, by changing electoral laws. Events of more than a century ago are apposite because they show that similar "solutions" have repeatedly been applied to similar "problems" in Memphis, as elsewhere in the United States. If racially discriminatory purposes moved men of the 1870s and 80s to rewrite electoral laws, as historians have found, then we should at the very least be alert to the possibility of racial motivation behind later electoral laws.
Particularly important were moves, first proposed by wealthy white Memphians a few months after the enfranchisement of blacks in Tennessee, to replace the ward-based local government with a five-man commission appointed by the governor. In 1879, after a terrible yellow fever epidemic, similar forces managed to convince the state to replace the local government with a "taxing district," whose officials were eventually to be elected at large. Thereafter, lower status groups held proportionately many fewer elective offices, although they were not excluded altogether until the passage of registration, secret ballot, and poll tax laws in 1889-90. There were no district elections for local government in Memphis from 1879 until 1967.
Blacks were allowed to cast ballots, or, often, to have their nominal ballots counted, under the regime of E.H. Crump. After 1927, Crump's control was so complete that he did not need black support to survive, and he effectively ran the leading black politician out of town. Whatever corruption, violence, and degradation of the democratic process existed under Crump was not the fault of blacks. Indeed, African-Americans were much more often the victims than the beneficiaries of Crump's dictatorship, for the boss was a virulent racist who had those blacks whom he could not browbeat physically beaten, and who enthusiastically embraced the Dixiecrat party in 1948. By 1951, fewer blacks were registered to vote in Memphis than there had been in 1914. It was in this environment that those whites who framed the electoral laws of the post-1954 period, and those blacks who opposed them, grew up.
From 1909 through 1955, Memphis City Commissioners, except for the mayor, were elected without running for specific posts. The top four in a "free-for-all" race were chosen, and they then decided among themselves what departments each would head. Four members of the Board of Education were chosen by the same process. In 1955, however, black Baptist minister Roy Love finished fifth out of sixteen candidates for the school board, missing election by less than 6,000 votes out of 260,000 cast. Before the next municipal election, two local private acts, backed unanimously by the Shelby County delegation and passed as a matter of courtesy by the state legislature, required that each candidate for the Commission and Board of Education run for a specific post. This prevented black "single-shotting," a widely understood and often discussed tactic by which a politically cohesive minority can elect candidates of its choice, even in at-large elections. Statements made during this period, as well as the sequence of events, make clear the racial motivation of this change in electoral laws.
The framers, however, forgot to include a majority vote requirement. When a strong black candidate, Russell Sugarmon, and four serious white candidates announced for an open seat on the Commission in 1959, Commissioner Henry Loeb, then and thereafter the leader of the segregationist forces in Memphis government, asked Gov. Buford Ellington to call a special session of the state legislature purely in order to pass a runoff law for Memphis. After Ellington declined, and an attempt to set up a local white primary was abandoned because of doubts about its legality, the vestiges of retiring Mayor Edmund Orgill's organization, both daily newspapers, Loeb, Commissioner Claude Armour, and several white civic organizations orchestrated a bandwagon for one of the white candidates, Bill Farris, who was elected.
Sugarmon's threat to the white monopoly on political offices provided an unmistakable lesson on the necessity of including a runoff in the at-large, designated post scheme, and led to renewed attempts to pass such a law for Memphis. Even though Loeb and a unanimous city commission twice backed a runoff amendment, it failed to pass the legislature.
In 1961-62, the Chamber of Commerce spearheaded an attempt to consolidate Memphis and Shelby County into one metropolitan government. Although the initial draft provided for district elections, the Charter Commission eventually proposed an all at-large plan. The one black on the ten-man Charter Commission, Lt. George W. Lee, protested, because he said that no black could win an at-large election in Memphis, and he was joined by virtually every black community leader. Black and AFL-CIO opposition, conjoined with that of the county political organization and some middle-class whites who were afraid of higher taxes, defeated metro government soundly.
Frustrated at their inability to pass such regulations as the runoff law, Memphis's civic leaders put a "home rule" amendment on the November, 1963 ballot. Although black leaders opposed it, fearing that it would facilitate the passage of discriminatory provisions, it was too difficult to rally the community against such an abstract, seemingly harmless proposal, and it passed.
Dispirited by their defeat in 1959, blacks in 1963 ran no candidates for mayor or commissioners, but only for less visible at-large posts on the board of education and for a vacant city judgeship. A formalized bar association primary, publicized for weeks in the newspapers, conducted with city voting machines, and continuing for several rounds until one white candidate received a majority, kept the judgeship safely in white hands and once more demonstrated for all to see the usefulness of the runoff in preserving a white monopoly on offices. In the mayoral contest, a majority of blacks rejected the endorsee of most black leaders, Bill Farris, and instead voted for a shrewd and contentious "populist" candidate, William B. Ingram. Black candidates for the Board of Education once again proved that, no matter how non-controversial and seemingly well qualified, a black could not be elected in an at-large election in Memphis, even if he were endorsed by one of the daily newspapers.
In 1965-66, leaders from the Chamber of Commerce and other governmental reformers renewed their effort to replace the commission form of government with a council-manager or mayor-council system. Having learned several lessons from the 1962 metro loss, they involved black, labor, and Republican leaders from the beginning. The most controversial issues, which badly split the "Program of Progress" ("POP") committee of 25 members, were whether to replace at-large elections wholly or in part with district contests and whether to include a runoff provision in the new charter. Throughout the discussion, the issues were treated primarily as racial controversies, and everyone agreed that it would be much more difficult for blacks to be elected under at-large systems and with runoffs. Black leaders, who preferred all or the vast majority of seats to be elected by districts, almost unanimously opposed the 7 district, 6 at-large final plan, and accepted it only reluctantly, as at least better than the current all at-large system.
Had the POP charter included a runoff requirement, they would almost certainly have opposed the whole charter in the referendum, so the POP committee finessed the issue. The City Commission put the runoff question on the August primary ballot and the POP charter on the November general election ballot, partly for fear that black opposition to the former would spill over to the latter, and partly because they thought that whites would be more likely to accept some districts in the POP plan if they knew that the runoff would keep the vast majority of council seats white. The voters accepted (although they did not frame) both the runoff and the POP charter.
In the 1967 elections, the new system worked almost exactly as designed. The widespread view that no black could possibly win in an at-large runoff destroyed the first campaign for mayor by a black in Memphis's history and discouraged any serious African-American candidacy for an at-large city council seat. Black candidates won two solidly black council districts, placed third in another that was 38% black, and eked out a victory in a 47% black district against a weak white opponent. Whites won 10 of 13 council seats, and their favored candidate in the runoff, Henry Loeb, beat Mayor Ingram by gaining the "white backlash" vote in a very racially polarized election.
White incumbents also swept the school board seats, which were all still elected at large. The board, all-white from the 1880s until 1970, despite the growing proportion of black students in the school system, had thrown nearly every possible impediment in the way of integration. By 1969, in the wake of the garbage strike and the assassination of Martin Luther King, Jr., blacks, in a militant mood, launched a massive school boycott. To split the black leadership and end the boycott, a majority of the school board promised to appoint two blacks as interim "advisers" and to press the legislature to allow at least some school board members to be elected by districts, which everyone agreed was the only way to elect a black to the board.
After some jockeying in the 1970 legislature, it was agreed that the board would use the same seven districts as the City Council, and that two other members would be elected at large. Winners for every seat would have to be elected by a majority. Everyone assumed, of course, that the at-large members would be white. Again, black leaders accepted the compromise as at least better than the status quo.
In 1971, there was another, rather half-hearted attempt to consolidate Memphis with Shelby County. Because of continual annexations, 91% of Shelby's people lived in Memphis as it was. Again, blacks opposed metro, this time because they did not want to add white flight suburbanites to a city in which their proportion and influence were increasing. Metro failed again. After that vote, the City Council reapportioned itself, significantly decreasing the black proportion in the district with the highest proportion black that was currently represented by a white councilman. Inter-district transfers put more blacks in an overwhelmingly black district and more whites in an overwhelmingly white district.
To sum up in one sentence: Memphis politics has been racial politics, and Memphis's election laws were the most effective weapons in maintaining white political power for so long.https://resolver.caltech.edu/CaltechAUTHORS:20170828-165240217Contracting Theory with Coincidence of Interest
https://resolver.caltech.edu/CaltechAUTHORS:20170829-132815681
DOI: 10.7907/5529z-m5r22
In the standard models of principal agent theory, the relationship between the principal and agent is adversarial in the sense that the objective of the agent is to maximize monetary income and minimize effort without regard for the objectives of the principal. In the real world, however, agents often have a personal stake in their contractual responsibilities. Furthermore, standard approaches to optimal contracting involve explicit monetary transfers, thereby excluding from analysis the case of non-profit agents. This paper presents a contracting model--the cooperative model--distinguished from standard models by the following three properties: first, the principal contracts with a non-profit agent for the provision of some commodity; second, the interests of the principal and agent coincide to the extent that their utilities are both increasing in the quality of the commodity provided; third, using a suitable reformulation of the standard moral hazard variable, the optimal contract for the case of quasi-linear preferences has an extremely simple form.
After the cooperative model is formalized, cost-plus and fixed price contracting are defined and compared, the form of the optimal contract is determined for the case of quasi-linear preferences, the suboptimality of cost-plus and fixed price contracting is demonstrated, and the possibility of decentralizing the optimal contract through a menu of linear contracts is explored. Finally, a standard model--the adversarial model--is presented for the purposes of comparison, along with a general model which subsumes both models as special cases. Starting with the adversarial model and altering it in each of the three ways outlined above, it is possible to trace the ramifications of the assumptions underlying the cooperative model.https://resolver.caltech.edu/CaltechAUTHORS:20170829-132815681Testing The Mean-Variance Efficiency of Well-Diversified Portfolios in Very Large Cross-Sections
https://resolver.caltech.edu/CaltechAUTHORS:20170824-150038586
DOI: 10.7907/ptdpn-hf013
We propose a new way of testing the mean-variance efficiency of well-diversified portfolios that exploits the cross-sectional size of typical financial datasets. The methodology consists of a sequence of simple tests, the results of which are aggregated in a statistic. This statistic is shown to be asymptotically standard normally distributed, despite dependence, in cross-section and over time, of the idiosyncratic risk. We investigate theoretically the asymptotic power of our test against the alternative that the well-diversified portfolio is not mean-variance efficient. By construction, our procedure is more powerful than standard tests of mean-variance efficiency that combine the assets in the cross-section into a limited set of (arguably) arbitrary portfolios. Even in cases where the latter have zero power, our test can have unit asymptotic power. The incremental power is evidenced in tests of the mean-variance efficiency of the value weighted portfolio of common stock listed on the NYSE and AMEX. Contrary to what was previously thought, however, the selection bias caused by including only continuously traded securities in the test is found to be important. By running the test in a case where it is known to have zero power, we are able to empirically confirm the correctness of the theoretical asymptotic properties of our statistic.https://resolver.caltech.edu/CaltechAUTHORS:20170824-150038586Political Competition in a Model of Economic Growth; Some Theoretical Results
https://resolver.caltech.edu/CaltechAUTHORS:20170830-135904085
DOI: 10.7907/dja9h-egh71
We study a one-sector model of economic growth in which decisions about capital accumulation and consumption are made through a political process of two candidate competition. Each voter's utility for a consumption stream is the discounted value of that voter's utility of consumption in each period. We consider the case when voters' one period utility functions for consumption are identical but discount factors are different. We are particularly interested in the conditions under which neoclassical optimal growth paths occur, and conditions in which political business cycles occur.
The answer depends on the ability or inability of the candidates to commit to multi-period investment strategies. If candidates can commit indefinitely into the future, then a political (majority rule) equilibrium path will not exist if all discount factors are different. For any feasible consumption path, there is a perturbation which is majority preferred to it. For any neoclassical optimal path there exists a perturbed path that is preferred to it either unanimously or by all but one voter. These results are true even if the perturbations differ from the original path in no more than three consecutive periods.
If candidates are unable to commit to multi-period plans, we show there is a unique subgame perfect, stationary, symmetric equilibrium to the infinite horizon two candidate competition game; namely the optimal consumption path for the median voter. The equilibrium is unique in the following sense: It is the unique limit of subgame perfect equilibria to the finite horizon electoral game.
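The no-commitment result can be illustrated in a textbook special case (log utility, Cobb-Douglas production with full depreciation), in which the optimal plan for a given discount factor is to save a constant fraction of output each period. The sketch below is purely illustrative of the result that the equilibrium path is the median voter's optimal path; the function name and parameter values are invented, and the paper's general model is not reproduced here.

```python
def median_voter_growth_path(betas, alpha, k0, periods):
    """Equilibrium capital path when candidates cannot commit: the
    outcome is the optimal plan for the median voter's discount factor.
    In the textbook special case assumed here (log utility, output
    k**alpha, full depreciation), the optimal plan saves the fraction
    alpha * beta of output each period, so k' = alpha * beta * k**alpha."""
    beta_med = sorted(betas)[len(betas) // 2]  # median discount factor
    path = [k0]
    for _ in range(periods):
        k = path[-1]
        path.append(alpha * beta_med * k ** alpha)
    return path

# Three voters with different discount factors; the path follows the
# median voter (beta = 0.95) and converges to that voter's steady state
# k* = (alpha * beta_med) ** (1 / (1 - alpha)).
path = median_voter_growth_path(betas=[0.90, 0.95, 0.99],
                                alpha=0.3, k0=0.1, periods=50)
```

The steady state is attracting because the policy map k -> alpha*beta*k**alpha has slope alpha < 1 at the fixed point, so the simulated path settles quickly regardless of k0.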
In the case when candidates can commit for a finite time into the future, we show that a stationary minmax path (a path which minimizes the maximum vote that can be obtained against it) yields a political business cycle.https://resolver.caltech.edu/CaltechAUTHORS:20170830-135904085Political Parties and Electoral Landscapes
https://resolver.caltech.edu/CaltechAUTHORS:20170823-141812198
DOI: 10.7907/ycdfc-bct60
We study the relationship between voters' preferences and the emergence of party platforms in two-party democratic elections with adaptive parties. In the model, preferences of voters and the opposition party's platform determine an electoral landscape on which the challenging party must adaptively search for votes. We show that changes in the underlying distribution of voters' preferences result in different electoral landscapes which can be characterized by a measure of ruggedness. We find that locally adapting parties converge to moderate platforms regardless of the landscape's ruggedness. Greater ruggedness, however, tempers a party's ability to find such platforms. Thus, we are able to establish a link between the distribution of voters' preferences and the responsiveness of adaptive parties.https://resolver.caltech.edu/CaltechAUTHORS:20170823-141812198Endogeneity of Alternating Offers in a Bargaining Game
https://resolver.caltech.edu/CaltechAUTHORS:20170823-134601561
DOI: 10.7907/46qat-kfm09
We investigate an infinite horizon two-person simultaneous offer bargaining game of incomplete information with discounted payoffs. In each period, each player chooses to give in or hold out. The game continues until at least one of the players chooses to give in, at which point agreement has been reached and the game terminates, with an agreement benefit accruing to each player, and a cost to the player (or players) that give in. Players have privately known agreement benefits. Low-benefit players have a weakly dominant strategy to hold out forever; high-benefit players would be better off giving in if they knew their opponent was planning to hold out forever.
First, for any discount factor there is a unique Nash equilibrium in which the two players alternate in their willingness to give in, if the players' priors about each other's type are sufficiently asymmetric. Second, for almost all priors, this is the unique equilibrium if the discount factor is close enough to one.https://resolver.caltech.edu/CaltechAUTHORS:20170823-134601561The Optimal Choice of Privatizing State-Owned Enterprises: A Political Economic Model
https://resolver.caltech.edu/CaltechAUTHORS:20170828-164419100
DOI: 10.7907/wxqwx-ydz37
We study the choice of a maximizing Bureaucrat over privatization policies and their effects on consumer welfare in a transition economy. We study a Bureaucrat whose objective function is maximizing a surplus budget subject to the constraint of staying in office, and a Bureaucrat who maximizes popularity/consumer welfare subject to the constraint of a balanced budget. Other things being equal, both types of Bureaucrat will privatize the sector (firms) with the least market power and the most subsidy first. This is the "cheapest" way to privatize state-owned enterprises. Also, it is shown that it is relatively easier and faster to privatize in a less democratic society.https://resolver.caltech.edu/CaltechAUTHORS:20170828-164419100Rational Individual Behavior in Markets and Social Choice Processes
https://resolver.caltech.edu/CaltechAUTHORS:20170823-155731351
DOI: 10.7907/w32ec-wq985
This paper reviews a series of paradoxes that exist in the experimental economics literature. These paradoxes are instances in which otherwise accurate models of markets and social choice processes fail to capture the data of experiments. A loosely developed theory called The Discovered Preference Hypothesis is advanced in the paper as an explanation. Behavior seems to go through stages of rationality that begin with a type of myopia when faced with unfamiliar tasks. With incentives and practice, which might take the form of repeated decisions in the experimental work (but might also include play, banter, discussions with others, stages of commitment, etc.), the myopia gives way to what appears to be a stage of more considered choices that reflect stable attitudes or preferences (as opposed to the labile attitudes identified by psychologists). Social institutions are seen as playing a role in the attainment of a third stage of rationality in which individual decisions incorporate the rationality of others, or the lack of it, in their own decisions.https://resolver.caltech.edu/CaltechAUTHORS:20170823-155731351A Micro-Econometric Analysis of Risk-Aversion and the Decision to Self-Insure
https://resolver.caltech.edu/CaltechAUTHORS:20170830-133904499
DOI: 10.7907/zqnfd-x3139
This study estimates a von Neumann-Morgenstern utility function using market data and micro-econometric methods. We investigate the decision whether to purchase insurance against the risk of telephone line trouble in the home. Using the choices of approximately 10,000 residential customers, we determine the shape of the utility function and the degree of risk-aversion. We find that risk-aversion varies systematically in the population with the level of income, and that the observed choice behavior is consistent with expected utility maximization. We are unable to detect the presence of ambiguity effects or over-weighting of low-probability events.https://resolver.caltech.edu/CaltechAUTHORS:20170830-133904499The Maximal Number of Regular Totally Mixed Nash Equilibria
https://resolver.caltech.edu/CaltechAUTHORS:20170823-152433647
DOI: 10.7907/w4xpy-8z371
Let S = ∏_(i=1)^n S_i be the strategy space for a finite n-person game. Let (s_1^0, …, s_n^0) ∈ S be any strategy n-tuple, and let T_i = S_i − {s_i^0}, i = 1, …, n. We show that the maximum number of regular totally mixed Nash equilibria of a game with strategy sets S_i is the number of partitions P = {P_1, …, P_n} of ∪_i T_i such that, for each i, #P_i = #T_i and P_i ∩ T_i = ∅. The bound is tight, as we give a method for constructing a game with the maximum number of equilibria.https://resolver.caltech.edu/CaltechAUTHORS:20170823-152433647An Experimental Examination of the Assignment Problem
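For small games, the partition count stated in the abstract can be checked by brute-force enumeration. The sketch below is illustrative only: the function name and the encoding of the sets T_i are my own, and it implements the stated characterization, not the authors' construction of games attaining the bound.

```python
from collections import Counter
from itertools import product

def max_totally_mixed_equilibria(sizes):
    """Brute-force count of the partitions described in the abstract:
    partitions P = {P_1, ..., P_n} of the union of the T_i such that
    #P_i = #T_i and P_i is disjoint from T_i, where #T_i = #S_i - 1."""
    n = len(sizes)
    # Label the elements of T_i as pairs (i, k), k = 0 .. sizes[i] - 2.
    elems = [(i, k) for i, s in enumerate(sizes) for k in range(s - 1)]
    count = 0
    # Assign each element of the union to one of the n labeled blocks.
    for assign in product(range(n), repeat=len(elems)):
        # Disjointness: element (i, k) of T_i may not land in block P_i.
        if any(block == elems[j][0] for j, block in enumerate(assign)):
            continue
        # Size constraint: block P_i must receive exactly #T_i elements.
        block_sizes = Counter(assign)
        if all(block_sizes[i] == sizes[i] - 1 for i in range(n)):
            count += 1
    return count

print(max_totally_mixed_equilibria([2, 2]))     # -> 1
print(max_totally_mixed_equilibria([2, 2, 2]))  # -> 2
```

When every player has exactly two strategies, each T_i is a singleton and the count reduces to the number of derangements of n items (1 for n = 2, 2 for n = 3, 9 for n = 4).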
https://resolver.caltech.edu/CaltechAUTHORS:20170830-141829391
DOI: 10.7907/wffr3-4zg50
The problem of optimally assigning individuals to heterogeneous objects so that each individual is allocated at most one object (the assignment problem) has a long history. Algorithms based on ordinal preferences have been developed and several auctions using monetary transfers have been proposed. The performance of two auction mechanisms to solve the assignment problem is examined in an experimental setting. One of the auctions is a sealed-bid variant of the Vickrey auction for homogeneous objects and the other auction is an extension of the English auction. The auctions are tested in two diverse competitive environments (high and low contention). The experimental results show that the English auction generates higher revenues and efficiencies than its sealed-bid counterpart, especially if there is a high level of contention. However, the efficiency gains of the English auction come at the expense of consumers' surplus. Indeed, a random assignment creates greater consumers' surplus relative to either auction's outcome in the high contention environment.https://resolver.caltech.edu/CaltechAUTHORS:20170830-141829391Some Properties of Hare Voting with Strategic Voters
https://resolver.caltech.edu/CaltechAUTHORS:20170830-145105291
DOI: 10.7907/ev76a-f2977
This essay examines some properties of the Single Transferable Vote (Hare Voting) procedure for electing candidates in multi-member districts under the assumption that all voters are strategic. From the perspective of the most common criterion for evaluating voting procedures - the extent to which they ensure the eventual selection of Condorcet winning candidates - the results we offer in this essay can be interpreted as indictments of STV. Even if we restrict preferences by imposing conditions on attitudes towards risk and assume a strong form of separability, STV is not necessarily incentive compatible and strategic voting does not ensure the selection of Condorcet winning candidates or of Condorcet outcomes. This fact, moreover, is not dependent on the existence of "bogus" equilibria - outcomes that exclude Condorcet candidates cannot be avoided under all circumstances even if we limit our analysis to strong or to individually stable equilibria.https://resolver.caltech.edu/CaltechAUTHORS:20170830-145105291Permits or Taxes? How to Regulate Cournot Duopoly with Polluting Firms
https://resolver.caltech.edu/CaltechAUTHORS:20170829-153257189
DOI: 10.7907/fymv6-6pp57
The paper investigates pollution control of firms engaging in imperfect competition. We consider an asymmetric Cournot duopoly where firms have linear technologies. Welfare is assumed to be separable in consumers' surplus and social damage, which is given by a convex function of the aggregate pollution level. After deriving the social optimum, we give a complete characterization of the optimal linear tax as well as of the optimal number of permits taking into account the firms' strategic behavior, and then compare the two policies with respect to welfare. Neither turns out to implement the social optimum in general. Also, neither policy can be said to be superior for all parameters. However, for a considerable range of parameters, giving out permits yields a higher welfare than taxes. Finally, we consider double taxation of output and pollutants. In this case the social optimum can always be achieved if there are only two firms.https://resolver.caltech.edu/CaltechAUTHORS:20170829-153257189Market Barriers to Conservation: Are Implicit Discount Rates Too High?
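The strategic behavior a regulator must account for can be made concrete with a textbook Cournot computation under an emission tax. The snippet below is a generic illustration: the function name, the linear-demand setup, and all parameter values are hypothetical, and the paper's welfare comparison between taxes and permits is not reproduced.

```python
def cournot_with_tax(a, b, c, e, t):
    """Nash equilibrium outputs of an asymmetric Cournot duopoly with
    linear inverse demand P = a - b*(q1 + q2), constant marginal costs
    c = (c1, c2), emission coefficients e = (e1, e2), and a linear
    emission tax t. Firm i's effective marginal cost is m_i = c_i + t*e_i;
    solving the best-response conditions q_i = (a - m_i - b*q_j) / (2b)
    simultaneously gives q_i = (a - 2*m_i + m_j) / (3b)."""
    m1, m2 = c[0] + t * e[0], c[1] + t * e[1]
    q1 = (a - 2 * m1 + m2) / (3 * b)
    q2 = (a - 2 * m2 + m1) / (3 * b)
    return q1, q2

# Hypothetical parameters: raising t lowers both outputs and hence
# aggregate emissions, which is the channel the tax operates through.
q1, q2 = cournot_with_tax(a=100, b=1, c=(10, 20), e=(1, 1), t=6)
```

Because the tax enters only through the effective marginal costs, it distorts the output margin as well as the pollution margin, which is one intuition for why a single linear instrument cannot implement the social optimum under imperfect competition.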
https://resolver.caltech.edu/CaltechAUTHORS:20170829-140258761
DOI: 10.7907/ht47e-mha45
This paper reconsiders whether implicit discount rates, generally cited as a market barrier to conservation, are really too high, and demonstrates that probabilistic choice studies of consumer durable purchases and hedonic housing price regression studies measure similar but non-identical discount factors. Four hedonic regression studies are reviewed which attempt to ascertain whether and to what extent the housing market capitalizes energy conservation investments. A theoretical model is presented which links the probabilistic choice and hedonic regression methods and shows how using results from both studies allows measurement of individual discount rates without bias. The paper identifies several factors which cause the degree of capitalization to differ from unity, resulting in consumer decisions which are rational from the individual perspective, but which can lead to low levels of social conservation.https://resolver.caltech.edu/CaltechAUTHORS:20170829-140258761The Bayesian Voter: The Dynamics of Information and Learning in a Presidential Election Campaign
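The notion of an implicit discount rate can be illustrated with a small internal-rate-of-return calculation: a consumer who declines an efficiency investment behaves as if discounting at a rate above the break-even rate computed below. The function name and the numbers are invented for illustration and are not drawn from the studies the paper reviews.

```python
def implicit_discount_rate(extra_cost, annual_saving, years, tol=1e-8):
    """Break-even (internal) rate of return implied by paying
    `extra_cost` up front for a durable that saves `annual_saving`
    per year for `years` years: solves
        extra_cost = sum_{t=1..years} annual_saving / (1 + r)**t
    by bisection on r."""
    def npv(r):
        return sum(annual_saving / (1 + r) ** t
                   for t in range(1, years + 1)) - extra_cost
    lo, hi = 0.0, 10.0  # NPV is positive at lo and negative at hi
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# A consumer who forgoes $100 of extra cost despite $40/year of savings
# over 5 years reveals a discount rate above this break-even value
# (roughly 29% here), far above typical market interest rates.
r = implicit_discount_rate(100, 40, 5)
```

This is the sense in which implicit discount rates inferred from durable purchases can appear "too high"; the paper's point is that such probabilistic-choice estimates and hedonic capitalization estimates measure related but non-identical objects.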
https://resolver.caltech.edu/CaltechAUTHORS:20170823-163841669
DOI: 10.7907/v4epr-1bp02
[No abstract]https://resolver.caltech.edu/CaltechAUTHORS:20170823-163841669Functional Voting Operators: The Non-Monotonic Case
https://resolver.caltech.edu/CaltechAUTHORS:20170823-163016725
DOI: 10.7907/yb239-z0950
We extend the non-binary framework of social choice introduced by Aizerman and Aleskerov (1986), in which individual choice functions are aggregated into a social choice function, by considering non-monotonic operators. We characterize the class of "local" operators and provide the explicit forms of local operators satisfying various combinations of normative and rationality conditions in the absence of monotonicity. Surprisingly, the restriction of monotonicity is not binding for operators satisfying the usual rationality conditions. We identify two rationality restrictions which do admit non-monotonic operators. One restriction admits every sovereign and neutral operator, and the other admits only dictatorship and anti-dictatorship operators. This last result is a direct non-binary counterpart to Wilson's (1972) theorem.https://resolver.caltech.edu/CaltechAUTHORS:20170823-163016725How To Gerrymander: A Formal Analysis
https://resolver.caltech.edu/CaltechAUTHORS:20170824-145613292
DOI: 10.7907/4ncwz-88r70
The paper presents an effort to incorporate geographic and other possible exogenous constraints that might be imposed on districting into an optimal partisan gerrymandering scheme. We consider an optimal districting scheme for a party which maximizes the number of districts that it will, in expectation, win, given arbitrary distributions of voters and party supporters over the electoral territory. We show that such a scheme exists if an equal size requirement is the only constraint imposed on districting. If, further, the requirement of territorial connectedness is imposed, the optimal districting scheme still exists when arbitrarily small deviations from the equal size requirement are admissible. Additional constraints imposed on districting make gerrymandering more difficult and sometimes impossible. Although the party is assumed to ignore the risk associated with possible shifts in electoral votes and thus takes the expected share of votes as a perfect predictor of electoral outcomes, the presented approach is valid for a party with any attitude towards risk and for any kind of majority rule used in elections. The results are consistent with earlier findings on unconstrained optimal partisan gerrymandering.https://resolver.caltech.edu/CaltechAUTHORS:20170824-145613292Politics, Economics, and Politics Again
https://resolver.caltech.edu/CaltechAUTHORS:20170830-154645866
DOI: 10.7907/ywgdt-ad109
This essay reconsiders the meaning of politics. It argues that economics offers theory and language that can contribute to the understanding and fulfillment of political life by facilitating analysis of the public interest. However, economics does not provide an escape from political disagreement, whether based on inevitable differences of interest or of belief, or on self-serving efforts to advance one cause at the expense of another. As a language of discourse, economics is shown to be compatible with a broader conception of human nature than is sometimes claimed by its practitioners or acknowledged by its critics.https://resolver.caltech.edu/CaltechAUTHORS:20170830-154645866Testing Minority Preferences in Broadcasting
https://resolver.caltech.edu/CaltechAUTHORS:20170824-143954159
DOI: 10.7907/503f9-xh666
The United States government has several policies and programs designed to increase the number of broadcasting stations owned by racial minorities. Increasing the number of minority-owned broadcasting stations, the government claims, will diversify the content of broadcast programs by increasing the amount of minority-oriented programming. Minority owners, the argument goes, will program their stations differently than white owners. In this paper we present the first econometric test of these propositions about minority ownership of broadcasting stations as well as a number of other related propositions. We conclude that increasing the number of minority-owned broadcasting stations increases the amount of minority-oriented programming. We also conclude that increasing the number of female-owned stations - a policy that has been ruled unconstitutional - would be just as effective at increasing minority-oriented programming.https://resolver.caltech.edu/CaltechAUTHORS:20170824-143954159Patterns of Voting on Ballot Propositions: A Mixture Model of Voter Types
https://resolver.caltech.edu/CaltechAUTHORS:20170829-150410925
DOI: 10.7907/xprd7-7jt08
In this paper we analyze the patterns of behavior voters exhibit over a set of votes. We explore a set of structural estimation problems that involve analyzing several votes at one time and develop estimation techniques for identifying and analyzing patterns. Using the information in these patterns, we introduce a method for studying voter heterogeneity based on a finite mixture model. Finally, we employ data containing actual micro-level vote returns to estimate the mixture model parameters.https://resolver.caltech.edu/CaltechAUTHORS:20170829-150410925Asset Prices in a Speculative Market
https://resolver.caltech.edu/CaltechAUTHORS:20170829-145747905
DOI: 10.7907/g9j6y-67681
The stochastic properties of prices in a speculative market are investigated. Agents in the market start with different priors, but update in a rational (i.e., Bayesian) way from realizations of payoffs on the risky asset. Convergence of the equilibrium price to the rational expectations price is investigated, as well as the asymptotic properties of two standard tests of rational expectations. The results are contrasted with stylized facts from forward markets.https://resolver.caltech.edu/CaltechAUTHORS:20170829-145747905Why Were There Black Schools in the Segregated South? The Exit Explanation Reconsidered
https://resolver.caltech.edu/CaltechAUTHORS:20170829-141036655
DOI: 10.7907/55y30-0zz26
African-American geographic mobility plays a central and somewhat contradictory role in Robert Margo's Race and Schooling in the South, 1880-1950 [1990]. On the one hand, it is the solution to what Margo calls "Myrdal's Paradox." Blacks, in Margo's view, forced white school boards to spend at least some money on black schools after disfranchisement by threatening to deprive white planters of a labor force if black schools were too terrible. On the other hand, geographic mobility was the result of that solution to Myrdal's Paradox. Blacks who migrated north, Margo showed, were likely to be relatively well educated. In an article that accompanied his book, Margo elaborated a model of school board action in the legally segregated, post-disfranchisement South and briefly examined a small amount of data that he claimed was "broadly consistent with the model" [1991, p. 67]. In this paper, I consider extensive evidence, largely from the period before 1910, that bears on the first part of Margo's argument. Since almost none of that evidence corroborates his thesis, I conclude that explanations other than black geographic mobility must account for the pattern of support for black schools in the South during the era of legalized segregation.https://resolver.caltech.edu/CaltechAUTHORS:20170829-141036655Notes About Theory of Pseudo-Criteria and Binary Pseudo-Relations and Their Application to the Theory of Choice and Voting
https://resolver.caltech.edu/CaltechAUTHORS:20170830-155321490
DOI: 10.7907/7qzgp-vy226
The pages of this working paper are copies of transparencies used in a lecture on the general theory of choice given at the California Institute of Technology in June 1991.https://resolver.caltech.edu/CaltechAUTHORS:20170830-155321490The Political Economy of Government Debt in England (1693-1800): War, Liquidity, and Institutional Innovation
https://resolver.caltech.edu/CaltechAUTHORS:20170829-152307489
DOI: 10.7907/5jj4s-0nb49
Economists and historians have offered several explanations for the significant fluctuations in 18th-century British government interest rates. This article discusses their shortcomings and supports the thesis that the fluctuations in interest rates were largely a function of England's participation in large-scale wars. The risks facing the government (being deposed in the event the war was lost, or defaulting on loans because of a shortfall in revenues) hurt its "creditworthiness" in the eyes of investors. Thus, lenders demanded high interest rates to compensate themselves for investing in risky government loans.https://resolver.caltech.edu/CaltechAUTHORS:20170829-152307489When Core Beliefs Collide: Conflict, Complexity, or Just Plain Confusion?
https://resolver.caltech.edu/CaltechAUTHORS:20170823-161705914
DOI: 10.7907/6p349-99205
In this paper, we argue that on many important public policy questions, people may be unsure of their preferences because their underlying principles or values are in conflict. We construct a simple model of conflicting core beliefs, building on the work of Heider (1958). Using abortion policies as our test case, we develop a test for our theory using a heteroskedastic probit model with data from the 1988 General Social Survey. The heteroskedastic probit results confirm our model, and in the last section of the paper, we trace the implications of this model for some of the larger questions in public opinion research.https://resolver.caltech.edu/CaltechAUTHORS:20170823-161705914Tax Depreciation and Risk
https://resolver.caltech.edu/CaltechAUTHORS:20170830-160444410
DOI: 10.7907/vcgma-32563
The theoretically ideal tax depreciation rule under an accretion tax is economic depreciation, a stream of deductions that replicates the decline in value of an asset over time. When the future value path of an asset is known in advance, the tax depreciation schedule should be based on the age-price profile for surviving assets. When the future value path of an asset is uncertain, this approach fails. A taxpayer can accelerate the statutory schedule by "strategic loss-taking." A series of special disposition rules (where each rule is combined with an adjustment in the ex ante depreciation schedule) address this problem, but each such rule has particular disadvantages. Finally, strategic loss-taking and rules designed to address it are particularly important in formulating a policy toward group accounting methods of depreciation.https://resolver.caltech.edu/CaltechAUTHORS:20170830-160444410The Spatial Analysis of Elections and Committees: Four Decades of Research
https://resolver.caltech.edu/CaltechAUTHORS:20170823-143030828
DOI: 10.7907/03y83-zga07
It has been more than thirty-five years since the publication of Downs's (1957) seminal volume on elections and spatial theory, and more than forty since Black and Newing (1951) offered their analysis of majority rule and committees. Thus, in response to the question "What have we accomplished since then?" it is not unreasonable to suppose that the appropriate answer would be "a great deal." Unfortunately, reality admits of only a more ambiguous response.https://resolver.caltech.edu/CaltechAUTHORS:20170823-143030828Legislatures, Initiatives, and Representation: Comparing the Effects of Institutions on Policy Outcomes
https://resolver.caltech.edu/CaltechAUTHORS:20170829-143152851
DOI: 10.7907/6mxyw-37q78
This research compares policy outcomes resulting from the legislative process and the direct ballot process to estimate the effect of political institutions on preference aggregation and policy outcomes. Using data from California statewide elections, we analyze policies which were considered in both processes and for which the two processes led to different outcomes. We conclude that features of the legislature, especially party, may lead legislators to vote against their district majority preference, and therefore lead legislative and direct ballot outcomes to diverge.https://resolver.caltech.edu/CaltechAUTHORS:20170829-143152851An Experimental Analysis of Two-Person Reciprocity Games
https://resolver.caltech.edu/CaltechAUTHORS:20170829-162254700
DOI: 10.7907/fsc33-sed43
This paper presents experimental evidence concerned with behavior in one-shot, finitely repeated, and infinitely repeated two-person Reciprocity Games. Both symmetric and asymmetric games as well as games with explicit punishment actions are studied and compared. Along with classifying the group outcomes to the games, individual strategies are classified. The importance of alternation or turn-taking, group welfare, and equality as focal solutions is examined. Also considered is whether or not outcomes are unique, Pareto Optimal, or individually rational, and whether or not finite repetition treatments are subject to end-game effects.https://resolver.caltech.edu/CaltechAUTHORS:20170829-162254700A General Characterization of Optimal Income Taxation and Enforcement
https://resolver.caltech.edu/CaltechAUTHORS:20170829-154428133
DOI: 10.7907/2p5wg-hsm49
This paper develops a general approach to characterizing optimal income tax enforcement. Our analysis clarifies the nature of the interplay between tax rates, audit probabilities, and penalties for misreporting. In particular, it is shown that for a variety of objective functions for the principal, the optimal tax schedule is in general (at least weakly) concave and monotonic; the marginal tax rates determine the audit probabilities; and less harsh penalties lead to higher enforcement costs. Our results imply that there exists a tradeoff between equity and efficiency considerations in the enforcement context which is similar to that in the moral hazard context for tax policy.https://resolver.caltech.edu/CaltechAUTHORS:20170829-154428133Laws of Large Numbers for Dynamical Systems with Randomly Matched Individuals
https://resolver.caltech.edu/CaltechAUTHORS:20170831-145159481
DOI: 10.7907/2np5h-81951
Biologists and economists have analyzed populations where each individual interacts with randomly selected individuals. The random matching generates a very complicated stochastic system. Consequently, biologists and economists have approximated such a system with a deterministic system. The justification for such an approximation is that the population is assumed to be very large and thus some law of large numbers must hold. This paper gives a characterization of random matching schemes for countably infinite populations. In particular this paper shows that there exists a random matching scheme such that the stochastic system and the deterministic system are the same. Finally, we show that if the process lasts finitely many periods and if the population is large enough then the deterministic model offers a good approximation of the stochastic model. In doing so we make precise what we mean by population, matching process, and evolution of the population.https://resolver.caltech.edu/CaltechAUTHORS:20170831-145159481Implementation in Bayesian Equilibrium: The Multiple Equilibrium Problem in Mechanism Design
https://resolver.caltech.edu/CaltechAUTHORS:20170831-135222800
DOI: 10.7907/hvv49-r3f64
This paper surveys the literature on implementation in Bayesian Equilibrium.https://resolver.caltech.edu/CaltechAUTHORS:20170831-135222800The Heterogenous Logit Model
https://resolver.caltech.edu/CaltechAUTHORS:20170831-135519750
DOI: 10.7907/3x3n7-9fx82
Probabilistic choice systems in the generalized extreme value (GEV) family embody two restrictions not shared by the covariance probit model. First, the unobserved components of random utility are homoscedastic across individuals and alternatives. Second, the degree of similarity among alternatives is also assumed to be constant across individuals. This paper considers extensions to models in the GEV class which relax these two restrictions. An empirical application concerning the demand for cameras is developed to demonstrate the potential significance of the heterogenous logit model.https://resolver.caltech.edu/CaltechAUTHORS:20170831-135519750Criminal Choice, Nonmonetary Sanctions, and Marginal Deterrence: A Normative Analysis
https://resolver.caltech.edu/CaltechAUTHORS:20170831-153642931
DOI: 10.7907/fdh2h-vhe62
This paper develops a normative model of optimal sanctions in the Becker tradition which emphasizes the role of marginal deterrence. The paper complements Shavell's 1987 American Economic Review paper, the essential difference being that Shavell's model concentrates on variations in the sanction imposed within a single category of acts (a specific crime) while the model in this paper concentrates on variations in the sanction imposed across categories of acts (different crimes). In their most general formulations, neither Shavell's model nor the model developed in this paper yields the result that acts with greater social harm should receive greater sanctions. But special cases, which readers may or may not find reasonable, do yield that result, within crimes for both models and across crimes in the model developed in this paper. This paper also identifies the necessary condition of jointness in the cost of law enforcement in the case of comparisons across crimes.https://resolver.caltech.edu/CaltechAUTHORS:20170831-153642931Equilibrium Enforcement and Compliance in the Presence of Tax Practitioners
https://resolver.caltech.edu/CaltechAUTHORS:20170831-152440504
DOI: 10.7907/1mdtp-vx909
We develop a game-theoretic model in which taxpayers, tax practitioners and a tax agency all interact to determine the extent of tax compliance. The model focuses exclusively on the service aspects of third-party assistance. We characterize four types of equilibria, depending on whether taxpayers prefer to use tax practitioners and whether the tax agency prefers them to use tax practitioners. In the empirically relevant case, which occurs when tax practitioner penalties for noncompliance are sufficiently low and the efficiency gains from using practitioners are sufficiently high, the tax agency prefers taxpayers to prepare their own returns, but taxpayers prefer to use a tax practitioner. In this case, the use of a tax practitioner is associated with lower compliance and higher audit rates.https://resolver.caltech.edu/CaltechAUTHORS:20170831-152440504The French Rural Communist Electorate in the 1920s and 1930s
https://resolver.caltech.edu/CaltechAUTHORS:20170831-144705458
DOI: 10.7907/647cm-yx728
One of the original characteristics of French communism has been its durable strength in some rural areas ever since its foundation at the December 1920 Congrès de Tours. Over the past six decades, the Communists have found strong support not only in certain urban, industrial areas but in some of the more rural and backward areas of the country as well. The Communist party's implantation in the countryside, both in terms of militants and voters, has been concentrated among a number of departments along the northern and western edge of the Massif Central, and along the Mediterranean littoral. The first time the Communists stood for national office, in 1924, they scored best not in an urban department but in the overwhelmingly rural Lot-et-Garonne (south-east of Bordeaux), where they gathered over 30% of the valid votes cast. Eight of the eighteen departments where they did best in that year were predominantly rural. By 1936 the Party's strength in rural areas had increased notably. The Communists received over 20% of the vote in sixteen departments, nine of which can hardly be considered industrial, and in three of these over 60% of the active population was engaged in agriculture. During the interwar years it is estimated that 15% of the Party's members belonged to the agricultural professions. The question which springs to mind, then, is why the Communists did so well (and continued to do so in the 1980s, during a period of vertiginous electoral decline) in rural departments which hardly correspond, from a sociological perspective, to the image one has of the parti de la classe ouvrière.https://resolver.caltech.edu/CaltechAUTHORS:20170831-144705458Will Economics Become an Experimental Science?
https://resolver.caltech.edu/CaltechAUTHORS:20170831-140221374
DOI: 10.7907/enx5b-pdg22
Economics is becoming a science that is supported both by field research and by laboratory experimental research. The paper explores six events that form the foundation for the growth of modern laboratory experimental methodology.https://resolver.caltech.edu/CaltechAUTHORS:20170831-140221374Allocating Priority with Auctions: An Experimental Analysis
https://resolver.caltech.edu/CaltechAUTHORS:20170831-150636801
DOI: 10.7907/epp4j-4e623
There are many examples of markets where resources that were allocated ex-ante must be rationed ex-post. Two alternative methods of rationing are considered in this paper: priority service (see Chao and Wilson [1987] and Wilson [1989]) and proportional rationing (see Spulber [1989]). An experimental environment is developed in which the two rules are implemented within two well-known institutions, the English and the uniform-price sealed-bid auctions, under two different information conditions. We find that priority service generates more efficient allocations than proportional rationing, that the sealed-bid auction performed better than the English auction, and that both mechanisms and rationing rules were more efficient when there was a lack of common information.https://resolver.caltech.edu/CaltechAUTHORS:20170831-150636801Dutch Book Arguments and Subjective Probability
https://resolver.caltech.edu/CaltechAUTHORS:20170831-151749064
DOI: 10.7907/0z0pw-6pj51
[No abstract]https://resolver.caltech.edu/CaltechAUTHORS:20170831-151749064Welfare Economics for Tobit Models
https://resolver.caltech.edu/CaltechAUTHORS:20170831-142356533
DOI: 10.7907/72gj0-4f919
In this paper we demonstrate the correct calculation of consumer surplus in censored and truncated regression models, focusing on Tobit models. We review a variety of examples from the literature and isolate the nature of the bias associated with the incorrect calculation of consumer surplus in several of them.https://resolver.caltech.edu/CaltechAUTHORS:20170831-142356533Tax-Induced Intertemporal Restrictions on Security Returns
https://resolver.caltech.edu/CaltechAUTHORS:20170831-132709679
DOI: 10.7907/8y525-hrj13
This paper derives testable restrictions on equilibrium prices when capital gains and losses are taxed only when realized. We use the Generalized Method of Moments (GMM) procedure to estimate and test the restrictions. The empirical results show evidence of capital gains tax effects on the pricing of common stock. The restrictions are not rejected by the data and estimates of the coefficient of risk aversion and the dividend tax rate are precise and economically plausible. Estimates of the capital gains tax rate, however, are often imprecise and economically implausible. Further results indicate that this can be attributed to the fact that our model does not accommodate differential long and short-term tax rates. The data appear to favor the martingale hypothesis for after-tax asset returns over a before-tax consumption-based asset pricing model.https://resolver.caltech.edu/CaltechAUTHORS:20170831-132709679Arbitrage Restrictions Across Financial Markets: Theory, Methodology and Tests
https://resolver.caltech.edu/CaltechAUTHORS:20170831-143858811
DOI: 10.7907/psgxd-nyf58
The Cox, Ingersoll and Ross [1985a] general equilibrium model is extended by allowing the representative investor to trade in a batch call option market with execution price uncertainty. Necessary restrictions on the execution price uncertainty for the original equilibrium to remain intact are derived. They take the form of moment conditions in the pricing error (defined as the difference between the observed call price and the theoretical call price that would obtain in the absence of execution price uncertainty). The moment conditions can easily be estimated and tested using a version of the Method of Simulation Moments (MSM). In it, simulation estimates, obtained by discretely approximating the risk-neutral processes of the underlying stock price and the interest rate, are substituted for analytically unknown call prices. The asymptotics and other aspects of the MSM estimator are discussed. The model is tested on transaction prices from the Berkeley Options Data Base.https://resolver.caltech.edu/CaltechAUTHORS:20170831-143858811A Bayesian Sequential Experimental Study of Learning in Games
https://resolver.caltech.edu/CaltechAUTHORS:20170831-141309234
DOI: 10.7907/hgs8z-hwd27
We apply a sequential Bayesian sampling procedure to study two models of learning in repeated games. The first model is that individuals learn only about an opponent when they play her/him repeatedly, but do not update from their experience with that opponent when they move on to play the same game with other opponents. We label this the non-sequential model. The second model is that individuals use Bayesian updating to learn about population parameters from each of their opponents, as well as learning about the idiosyncrasies of that particular opponent. We call that the sequential model.
We sequentially sample observations on the behavior of experimental subjects in the so-called 'centipede game'. This game has the property of allowing for a trade-off between competition and cooperation, which is of interest in many economic situations. At each point in time, the 'state' of our dynamic problem consists of our beliefs about the two models, and beliefs about the nuisance parameters of the two models. Our 'choice' set is to sample or not to sample one more data point, and if we should not sample, which of the models to select. After 19 matches (4 subjects per match), we stop and reject the non-sequential model in favor of the sequential model.https://resolver.caltech.edu/CaltechAUTHORS:20170831-141309234The Divergence Between Willingness-To-Pay and Willingness-To-Accept Measures of Value
https://resolver.caltech.edu/CaltechAUTHORS:20170831-142012841
DOI: 10.7907/54gj7-f0788
Do people value commodities more when they own the commodities than when they do not? Although economic models generally presume that economic agents evaluate commodities independently of whether the agents own those commodities or not, an assumption that we term the "basic independence" assumption, researchers in economics and law are starting to doubt that this is true. These doubts about the soundness of the basic independence assumption challenge accepted economic doctrine. Most theoretical and applied models in economics use the basic independence assumption both to predict and assess the operation of markets. And in the relatively new discipline of law and economics, the basic independence assumption produces the Coase Theorem, which is the starting point for much economic analysis of legal rules.
In this paper we present, organize, and critique the modern evidence on the basic independence assumption so as to draw together the learning of the economists and the lawyers. We will first investigate the evidence on the divergence between willingness-to-accept and willingness-to-pay measures of value, and then ask about possible explanations for the evidence. Next, we will explore the implications of the divergence for analysis in law and economics. Last, we will show that although the divergence between willingness-to-accept and willingness-to-pay measures of value may entail substantially limiting the role of cost-benefit analysis, we cannot precisely map those limits without answering some difficult questions about the source of the disparity between willingness-to-accept and willingness-to-pay.https://resolver.caltech.edu/CaltechAUTHORS:20170831-142012841Alliances in Anarchic International Systems
https://resolver.caltech.edu/CaltechAUTHORS:20170831-134520974
DOI: 10.7907/y706q-8c051
Alliances play a central role in international relations theory. However, aside from applications of traditional cooperative game theory that ignore the issue of enforcement in anarchic systems, or interpretations of the repeated Prisoners' Dilemma in the attempt to understand the source of cooperation in such systems, we have little theory on which to base predictions about alliance formation. This essay, then, builds on an n-country, non-cooperative, game-theoretic model of conflict in anarchic systems in order to furnish a theoretical basis for such predictions. Defining an alliance as a collection of countries that jointly abide by "collective security strategies" with respect to each other but not with respect to members outside of the alliance, we establish the necessary and sufficient conditions for an alliance system to be stable. In addition, we show that not all winning or minimal winning coalitions can form alliances, that alliances among smaller states can be stable, that bipolar alliance structures do not exhaust the set of stable structures, and that only specific countries can play the role of balancer.https://resolver.caltech.edu/CaltechAUTHORS:20170831-134520974The Development of Contemporary Political Theory
https://resolver.caltech.edu/CaltechAUTHORS:20170831-133619242
DOI: 10.7907/dnm93-64s47
Bold claims are made about contemporary political theory's accomplishments since that theory was inaugurated by the writings of Arrow, Downs, Black, Riker, Buchanan and Tullock, and Olson. Others argue, however, that this theory suffers from too great a concern with mathematical notation and too little concern with substantive relevance. This essay argues that both views are essentially correct: although our understanding of politics (and of economics) remains in such a primitive state that we cannot ignore the practical necessity for considering other, less formal modes of inquiry, something has been accomplished. In particular, we argue that the current paths of development of political theory, the discovery of First Principles, are identical to those set forth by arguably the most successful set of "political engineers" in history: the founders of the American republic. However, we also argue that many of the shortcomings of contemporary theory derive from a failure to appreciate something that was apparent to those Founders; namely, the distinction between science and engineering and the proper role of empirical and experimental analysis.https://resolver.caltech.edu/CaltechAUTHORS:20170831-133619242A Game-Theoretic Interpretation of Sun Tzu's the Art of War
https://resolver.caltech.edu/CaltechAUTHORS:20170831-162255007
DOI: 10.7907/j2sxd-zjs13
Over twenty-five hundred years ago, the Chinese scholar Sun Tzu, in The Art of War, attempted to codify the general strategic character of conflict and, in the process, offer practical advice about how to win military conflicts. His advice is credited with having greatly influenced Japanese military and business practices, as well as Mao Tse-Tung's approach to conflict and revolution. The question, however, is whether or to what extent Sun Tzu anticipated the implications of the contemporary theory of conflict: game theory. The thesis of this essay is that he can be credited with having anticipated the concepts of dominant, minmax, and mixed strategies, but that he failed to intuit the full implications of the notion of equilibrium strategies. Thus, while he offers a partial resolution of "he-thinks-that-I-think" regresses, his advice remains vulnerable to a more complete strategic analysis.https://resolver.caltech.edu/CaltechAUTHORS:20170831-162255007How to Determine Intent: Lessons from L.A.
https://resolver.caltech.edu/CaltechAUTHORS:20170831-154408354
DOI: 10.7907/72eyf-6z708
In a series of decisions from 1970 to 1980, the United States Supreme Court shifted from an "effect" standard to an "intent" standard in racial and sex discrimination cases. Every step along the road from the Jackson, Mississippi, swimming pool closing case to the Mobile, Alabama, city commission case was criticized by professors and policymakers. Indeed, the last step was so controversial that Congress overturned the Court's City of Mobile v. Bolden decision in the major 1982 amendment to the Voting Rights Act, allowing plaintiffs in voting rights cases to prevail by proving either a racially discriminatory intent or a racially discriminatory effect.
How much difference has the shift made? Do Supreme Court decisions during the century following the passage of the Fourteenth Amendment in 1868 throw any light on the Court's more recent course? How, exactly, have courts said intent must be proven, and how do their standards comport with logic and normal professional conduct among historians? What sorts of evidence are relevant to proving intent, and why are they relevant? How, in practice, should one go about evaluating hypotheses about the motives of the framers of governmental rules?
The best path to answers to such questions begins, it seems to me, with a detailed examination of one case. In August, 1988, the Mexican American Legal Defense and Educational Fund and the American Civil Liberties Union filed Yolanda Garza et al. v. County of Los Angeles, California in the United States District Court for the Central District of California. Shortly thereafter, the United States Department of Justice instituted a similar suit against the County.
Having testified as a historian and political scientist in six previous federal voting rights cases, I served as an expert witness for MALDEF and the ACLU on the question of whether, in recent years, the supervisorial district lines in Los Angeles county had been drawn with an intent to discriminate against Hispanics. This paper is a revised and much expanded version of the one I presented to the court.
As in other cases, Garza was complex enough to require the attention of a professional historian because no one incident or piece of evidence was, by itself, conclusive. Decided in favor of the plaintiffs by Judge David V. Kenyon on June 4, 1990, Garza deserves detailed attention not only for the light it throws on the general theoretical questions noted above, but also because the record on motivation is extraordinarily rich, because Los Angeles county has one of the largest populations of any jurisdiction ever sued in a voting rights case, and because it is the most important voting rights suit involving Hispanics ever filed.https://resolver.caltech.edu/CaltechAUTHORS:20170831-154408354An Experimental Analysis of Nash Refinements in Signaling Games
https://resolver.caltech.edu/CaltechAUTHORS:20170831-160114059
DOI: 10.7907/p6hm5-2gb95
This paper investigates the refinements of Nash equilibrium in two-person signaling game experiments. The experimental games cover the watershed of the nested refinements: Bayes-Nash, Sequential, Intuitive, Divine, Universally Divine, NWBR, and Stable. In each game an equilibrium selection problem is defined in which adjacent refinements are considered.
The pattern of outcomes suggests that individuals select the more refined equilibria up to the divinity concept. However, an anomaly occurs in the game in which the stable equilibrium is a clear preference among the subjects. Since the concepts are nested, this suggests that the outcomes are game-specific. Sender behavior does not seem to follow any specific decision rule (e.g., Nash, minmax, PIR, etc.) while receiver actions tend to correspond to the Nash equilibrium outcomes.https://resolver.caltech.edu/CaltechAUTHORS:20170831-160114059Bayesian Economists...Bayesian Agents II: Evolution of Beliefs in the Single Sector Growth Model
https://resolver.caltech.edu/CaltechAUTHORS:20170831-164230983
DOI: 10.7907/p1mgg-f1g11
In "Bayesian Economists ... Bayesian Agents I" (BBI), we generalized the results on Bayesian learning based on the martingale convergence theorem from the repeated to the sequential framework. In BBI, we showed that the variability introduced by the sequential framework is sufficient under very mild identifiability conditions to circumvent the incomplete learning results that characterize the literature. In this paper, we demonstrate that result in the neo-classical single sector growth model under even weaker identifiability conditions. We study the evolution of agent-beliefs in that model and show that, under reasonable conditions, the dependence of the current capital stock on the previous capital stock induces enough variability for our complete learning results to become relevant. Not only does complete learning take place from the subjective point of view of the agents' priors, but it also takes place from the point of view of an objective observer (modeling economist) who knows the true structure.https://resolver.caltech.edu/CaltechAUTHORS:20170831-164230983Industrial Blackmail of Local Governments
https://resolver.caltech.edu/CaltechAUTHORS:20170831-161354547
DOI: 10.7907/cr111-m9a40
A dynamic model of inter-governmental competition for investment is presented, where the investment represents a potentially large source of tax revenue for the local governments, and the local productivity of investment is uncertain. A single firm decides where to locate its new plant in each period by conducting an auction, soliciting bids from the local governments. Equilibrium subsidies from the local governments are derived, as well as conditions under which the firm will switch locations between periods.
A second issue addressed in this paper is local government strategic investment in infrastructure. We consider a two-stage game in which local governments first choose a level of infrastructure (which is costly to build), then participate in the sequential auction described above. It is shown that, even if the costs of building the infrastructure are the same in each location, in equilibrium the local governments will choose different levels of infrastructure and the region which chooses the highest level will be better off. Moreover, when the level of infrastructure is endogenous in the manner described, federally administered programs designed to increase the level of infrastructure in the less attractive region will make the firm strictly better off, without necessarily increasing the payoffs to either of the two local governments.https://resolver.caltech.edu/CaltechAUTHORS:20170831-161354547Fictitious Play: A Statistical Study of Multiple Economic Experiments
https://resolver.caltech.edu/CaltechAUTHORS:20170831-163228230
DOI: 10.7907/hfz3a-12r02
This paper illustrates the use of a full Bayesian procedure to update an experimenter's belief over various economic behavioral hypotheses using data from a variety of (potentially very different) experiments. Our example uses experimental data to update our belief as to whether individuals select strategies according to fictitious play. We endow the experimenter with priors over the events that players act according to fictitious play and according to the Cournot process. We then numerically compute the likelihood function for each experiment by replicating the experimental design and running the experiment with robots that behave according to each of our hypotheses. Updating experiment by experiment shows that some of the experiments favor Cournot, but most of them favor fictitious play as the more likely hypothesis. This illustrates the limitations of a classical procedure that can take only one experiment into consideration since some of the experiments may be misleading. Indeed, when we did the overall updating using 9 experiments, we found that, for any priors, the overall posterior put probability very close to one on the individuals acting according to fictitious play. Given the heterogeneity in the payoffs and design of the experiments that we combine for that overall posterior, it is clear that there is no classical procedure that would offer the same type of information.https://resolver.caltech.edu/CaltechAUTHORS:20170831-163228230Entry and R & D Costs in Competitive Procurements and Contracting
https://resolver.caltech.edu/CaltechAUTHORS:20170901-155513164
DOI: 10.7907/k3a4d-b3018
A model of competitive procurements and contracting is presented. The key features of the model include pre-contract R&D, an endogenous number of symmetric firms, and a first-price sealed-bid procurement auction. The unique symmetric perfect free-entry equilibrium is characterized. If the R&D technology is variable scale with constant marginal returns, it is socially optimal for one firm to do all of the R&D and production. However, since the buyer considers only his own cost of procurement, the buyer will prefer to allow free entry, and the number of firms will usually be larger than is socially optimal. If the R&D technology is fixed-scale, the buyer's choices will be socially optimal if the buyer's opportunity cost of an alternative procurement is high. On the other hand, if the opportunity cost is low the buyer will choose a reservation price lower than the socially optimal value and a number of firms no larger than the socially optimal number. Certainly, the type of R&D technology plays an important role in determining optimal R&D and procurement policies for the buyer and for society.https://resolver.caltech.edu/CaltechAUTHORS:20170901-155513164An Alternative Statistical Measure for Racially Polarized Voting
https://resolver.caltech.edu/CaltechAUTHORS:20170905-144549736
DOI: 10.7907/qx3wk-kzt57
Measurements of the existence and extent of racially polarized voting are often at the forefront of the evidence presented in vote dilution litigation. Previous models used in the measurement of racially polarized voting have been inadequate for a broad range of cases. The source of this inadequacy is that these models were not based upon assumptions about individual behavior.
A new model, which is based on reasonable assumptions about individual behavior and can be approximated by a varying parameters model, is derived. By contrasting the new model with the other models, it is shown that both the correlation coefficient and linear regression can lead to inaccurate, misleading, or incorrect conclusions about the state of racial polarization in the electorate. After providing a straightforward statistical test for identifying misspecification and showing how to obtain estimates for this model by the method of maximum likelihood, the model is compared with an individual-level data set of an election where racially polarized voting occurred (1988 California presidential primary). A new measure for the estimation of racially polarized voting is then proposed.https://resolver.caltech.edu/CaltechAUTHORS:20170905-144549736Alternative Policies for Unemployment Insurance
https://resolver.caltech.edu/CaltechAUTHORS:20170905-140540957
DOI: 10.7907/hqrza-xzj92
The effects of a wage subsidy program on the duration of insured unemployment are investigated using data from a demonstration project conducted by the Illinois Department of Employment Security. UI claimants were offered a voucher that could be presented to potential employers as an inducement for their hire. Participation in the subsidy program was voluntary and eligibility was limited to a ten week period following the initial UI claim. In principle, the subsidy should increase the demand for the unemployed worker's services by reducing an employer's net wage costs. It may also have supply effects if the expiration of eligibility for the subsidy causes an increase in search effort, though it is also possible that the subsidy causes workers to adjust their reservation wage levels upward. In practice, subsidies have stigmatic effects that tend to lower participation rates by high-skilled workers. As a result, participants in a subsidy program have longer average durations of unemployment than non-participants. However, correcting for self-selection, we find that wage subsidies can substantially increase a worker's probability of reemployment and that the net benefits of such a program exceed its cost. In addition, wage subsidies are compared to a search bonus proposal which is also cost effective, but, due to differences in participation patterns, has rather different effects.https://resolver.caltech.edu/CaltechAUTHORS:20170905-140540957Razorbacks, Ticky Cows, and the Closing of the Georgia Open Range: The Dynamics of Institutional Change Uncovered
https://resolver.caltech.edu/CaltechAUTHORS:20170901-142302529
DOI: 10.7907/njx8z-e1p29
While a redistribution of property rights might enable society to capture potential efficiency gains, the inevitable distributional conflicts make the transformation far from automatic. As individuals who would be adversely affected by the change seek a priori contracts for compensation, those who anticipate net benefits must decide how much to pay, who should pay, and who should receive their payments. As free riding, strategic bargaining, or the expensive monitoring and enforcement of contracts ultimately blocks voluntary agreements, society becomes unable to adopt the new institutional structure that promised to increase social wealth. If distributional conflicts become severe enough to threaten the implementation of an income-enhancing property rights arrangement, what type of mechanisms will evolve to work these conflicts out? When voluntary negotiations break down, the government is usually called in to implement publicly what individuals could not accomplish in the private sector. How, therefore, does the political process influence the path of institutional and economic development? This paper explores the dynamics of institutional change in an attempt to explain better why the adoption of potentially productive institutions is delayed and why inefficient ones persist.
The paper provides a micro-analysis of the transformation from an open range to a closed range policy in postbellum Georgia. The traditional agricultural practice in Georgia from colonial times until after the Civil War allowed animals to roam the countryside freely and forced farmers to erect fences around their growing crops. All unfenced land, therefore, was considered common pasture that could be used by anyone. After the Civil War there was a concerted effort to eradicate the open range policy and to force all livestock owners to fence in their animals instead of forcing farmers to fence them out of the growing crops. According to estimates provided in the paper, switching to the closed range would have generated net benefits for specific regions of Georgia, but distributional conflicts, coupled with high transaction costs, made a voluntary agreement to close the range unattainable.
The empirical evidence shows that the Georgia legislature's role in facilitating the closing of the range was crucial. First, the legislature allowed countywide referenda on what became known as the fence question. Upon seeing that majority rule generally failed as a mechanism to facilitate the adoption of a relatively profitable institution, the legislative body manipulated the voting mechanism so as to guarantee compensation for a subset of the expected losers. By forcing the transfer of income from expected winners to expected losers, the state legislature was able to facilitate the adoption of the closed range policy that promoted more rapid agricultural development in postbellum Georgia.https://resolver.caltech.edu/CaltechAUTHORS:20170901-142302529The Changing Face of Tax Enforcement, 1978-1988
https://resolver.caltech.edu/CaltechAUTHORS:20170901-150225143
DOI: 10.7907/amfy6-5qy64
This article examines three aspects of tax administration that are widely thought to play a particularly critical role in tax enforcement: the examination (or audit) function, information reporting, and the criminal enforcement process. A careful look at the IRS budget devoted to the first two of these demonstrates that a major shift in tax enforcement policy has occurred during the last decade---fewer people have been audited, but those who have been were punished more severely. This shift in enforcement policy raises important questions about both the efficacy and the fairness of the tax administrative process. Although not apparent through budget allocations, a similar pattern has occurred with respect to criminal enforcement. In addition, the shift of IRS resources toward enforcement of non-tax crimes and the increased use of grand juries also raise important questions about the IRS criminal enforcement process.https://resolver.caltech.edu/CaltechAUTHORS:20170901-150225143Legislative Districting
https://resolver.caltech.edu/CaltechAUTHORS:20170905-131623600
DOI: 10.7907/079sf-jnf80
America never knew the rotten boroughs that John Locke called "gross absurdities" and condemned as being incompatible with the right of equal representation (Locke, 1812, at 433). Rotten boroughs were towns "of which there remain[ed] not so much as the ruins, where scarce so much housing as a sheepcote, or more inhabitants than a shepherd [were] to be found, [but that sent] as many representatives to the grand assembly of law-makers, as a whole county numerous in people, and powerful in riches" (id., at 432).
The United States did inherit from Britain the so-called Westminster system, in which legislators are elected, usually one apiece, from geographically defined districts, with the candidate receiving the most votes declared the winner. Perhaps the system was inevitable in a time with neither full-fledged political parties nor modern devices of transportation and communication. In any event, the system has been permanently embedded in American political thought and practice.https://resolver.caltech.edu/CaltechAUTHORS:20170905-131623600Incentive Procurement Contracts with Costly R&D
https://resolver.caltech.edu/CaltechAUTHORS:20170905-130852791
DOI: 10.7907/f1z2g-0we39
This paper provides a model of both R&D and production in procurement processes where firms invest in R&D and compete for a government procurement contract. The optimal incentive procurement contract is characterized to maximize the government's expected welfare. Explicit consideration of the R&D process changes the standard results in several ways. If the traditional Baron-Myerson (1982) type contract is used where there is costly R&D, the government buys too little from the contractor and pays too little. Raising the price paid encourages private R&D and raises the government's welfare. The form of the optimal procurement contract depends on the number of firms. With R&D and optimal procurement the government prefers more than one firm to invest in R&D and to bid for the production contract. But too much competition may discourage private R&D investment and leave the government worse off. Other features of optimal procurement and R&D expenditures are also discussed.https://resolver.caltech.edu/CaltechAUTHORS:20170905-130852791Efficient Trading Mechanisms with Pre-Play Communication
https://resolver.caltech.edu/CaltechAUTHORS:20170905-143503727
DOI: 10.7907/69xq2-h8e74
This paper studies the problem of designing efficient trading mechanisms when players may engage in pre-play communication. It is well known that equilibrium behavior can be affected, sometimes drastically, if players have the opportunity to exchange messages prior to playing some particular game. We investigate the relationship between efficiency, pre-play communication, and unique implementation. We identify a class of simple mechanisms which are immune to pre-play communication and show that any incentive efficient allocation can be uniquely implemented by such a mechanism.https://resolver.caltech.edu/CaltechAUTHORS:20170905-143503727Bidding Rings
https://resolver.caltech.edu/CaltechAUTHORS:20170901-140002664
DOI: 10.7907/t02sf-rt279
We characterize coordinated bidding strategies in two cases: a weak cartel, in which the bidders cannot make side-payments; and a strong cartel, in which the cartel members can exclude new entrants and can make transfer payments. The weak cartel can do no better than have its members submit identical bids. The strong cartel in effect reauctions the good among the cartel members.https://resolver.caltech.edu/CaltechAUTHORS:20170901-140002664Justifying Minority Preferences in Broadcasting
https://resolver.caltech.edu/CaltechAUTHORS:20170901-145119372
DOI: 10.7907/se9t3-enj88
[No abstract]https://resolver.caltech.edu/CaltechAUTHORS:20170901-145119372Marshallian vs. Walrasian Stability in an Experimental Market
https://resolver.caltech.edu/CaltechAUTHORS:20170905-154411211
DOI: 10.7907/nwvn3-30h47
Twelve markets were studied. All markets had downward sloping supply functions created by Marshallian-type external economies. The conditions were such that the Marshallian theory of dynamics gave predictions opposite to the Walrasian theory of dynamics. The market organizations studied were double auction, sealed bid/offer and (secant) tâtonnement. In all cases the Marshallian theory of dynamics was the better model.https://resolver.caltech.edu/CaltechAUTHORS:20170905-154411211The Report of the United States to the International Fiscal Association on the Costs of Tax Administration and Compliance
https://resolver.caltech.edu/CaltechAUTHORS:20170905-150053563
DOI: 10.7907/6hqn3-9h683
This is a report prepared for the International Fiscal Association on the costs of tax administration and compliance in the United States. At the federal level, we present comprehensive data on administrative costs and review recent estimates of compliance costs. At the state level, we present new data on the administrative costs of state income taxes and general sales taxes, and review the very limited data on state level compliance costs. We also discuss the growing role of tax preparers, including new empirical results of our own. Finally, we review the recently enacted "Taxpayer Bill of Rights."https://resolver.caltech.edu/CaltechAUTHORS:20170905-150053563An Experiment With Space-Station Pricing Policies
https://resolver.caltech.edu/CaltechAUTHORS:20170901-162947996
DOI: 10.7907/7kjc8-bdc71
In the late 1990s the National Aeronautics and Space Administration (NASA) plans to operate an earth orbiting space station. For decades into the future the station is expected to play a dominant role in U.S. space research and in the commercialization of space. If the expectations are correct, then the station will be a complex of potentially valuable resources and services. Much sentiment exists within the government that the allocation of those resources should be based on some sort of market-oriented policy (a more "business-like" approach). This paper is part of a larger project that is intended to ascertain what that policy might be.https://resolver.caltech.edu/CaltechAUTHORS:20170901-162947996Campaign Finance and the Constitution
https://resolver.caltech.edu/CaltechAUTHORS:20170905-153200563
DOI: 10.7907/a7et3-pak73
[No abstract]https://resolver.caltech.edu/CaltechAUTHORS:20170905-153200563The Demand for Tax Return Preparation Services
https://resolver.caltech.edu/CaltechAUTHORS:20170901-150955679
DOI: 10.7907/a6m3p-j6q49
In this paper we focus on taxpayer choices of return preparation services. Using aggregate nested logit techniques, we find that the demand for third party preparation responds to many factors: age, education, employment status, federal auditing, and tax return characteristics. Perhaps most important among these is federal auditing. Higher federal audit rates increase the demand for tax practitioner services, but do not affect the demand for other modes of third party assistance. More generally, as the tax burden increases, or as uncertainty about true tax liability increases, the demand for all modes of third party assistance increases.https://resolver.caltech.edu/CaltechAUTHORS:20170901-150955679Bargaining Costs and Failures in the Sealed-Bid Double Auction
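The nested logit structure referenced above groups the preparation alternatives into nests so that substitution is stronger within a nest (e.g., among third-party modes) than across nests. The following sketch shows standard two-level nested logit choice probabilities; the alternative names, nesting, utilities, and dissimilarity parameters are all illustrative assumptions, not the paper's specification.

```python
import math

def nested_logit_probs(utilities, nests, lam):
    """Two-level nested logit choice probabilities.
    `utilities`: alternative -> systematic utility.
    `nests`: nest -> list of alternatives.
    `lam`: nest -> dissimilarity parameter in (0, 1]."""
    # Inclusive value (log-sum) of each nest.
    inclusive = {
        g: math.log(sum(math.exp(utilities[a] / lam[g]) for a in alts))
        for g, alts in nests.items()
    }
    denom = sum(math.exp(lam[g] * inclusive[g]) for g in nests)
    probs = {}
    for g, alts in nests.items():
        p_nest = math.exp(lam[g] * inclusive[g]) / denom
        within = sum(math.exp(utilities[a] / lam[g]) for a in alts)
        for a in alts:
            # P(alternative) = P(nest) * P(alternative | nest).
            probs[a] = p_nest * math.exp(utilities[a] / lam[g]) / within
    return probs
```

With the dissimilarity parameter equal to one for every nest, the model collapses to the plain multinomial logit, which is one common specification check.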
https://resolver.caltech.edu/CaltechAUTHORS:20170901-132458629
DOI: 10.7907/d0qjt-6a395
This paper analyzes bilateral bargaining in the sealed-bid double auction with bargaining costs. There exists a multiplicity of equilibria to this game, all of which have unsatisfactory properties. Since anything seems possible, we focus on the completely mixed strategy equilibria (C.M.S.) but find that such equilibria require that the negotiator with the higher bargaining cost receive higher profits. Allowing the bargaining process to be dynamic does not entirely solve the problem because the offers in the dynamic game can demonstrate chaotic behavior. Moreover, when failure costs are low, there exist many infinite horizon C.M.S. equilibria. One feature of the C.M.S. equilibrium is the existence of a significant probability of delay, which is consistent with empirical reality. Finally, if there is asymmetric information over bargaining costs, the negotiator with the higher bargaining costs obtains lower profits. Thus, asymmetric cost information leads to more plausible properties for most bargaining equilibria.https://resolver.caltech.edu/CaltechAUTHORS:20170901-132458629Seniority in Legislatures
https://resolver.caltech.edu/CaltechAUTHORS:20170901-140827674
DOI: 10.7907/af878-br061
We construct a stochastic game model of a legislature with an endogenously determined seniority system. We model the behavior of the legislators as well as their constituents in an infinitely repeated divide-the-dollar game. Each legislative session must make a decision on redistributional issues, modeled as a divide-the-dollar game. However, each session begins with a vote in which the legislators decide, by majority rule, whether or not to impose on themselves a seniority system. Legislative decisions on the redistributional issues are made by the Baron-Ferejohn rule: an agenda setter is selected by a random recognition rule (which in our model is a function of the seniority system selected), the agenda setter makes a proposal on redistributional issues, and the legislature then votes whether to accept or reject the agenda setter's proposal. If the legislature rejects the proposal, another agenda setter is randomly selected, and the process is repeated. If the legislature accepts the proposal, the legislative session ends, and the voters in each legislative district vote whether to retain their legislator or throw them out of office. The voters' verdict determines the seniority structure of the next period's legislature. We find a stationary equilibrium to the game having the property that the legislature imposes on itself a nontrivial seniority system, and that legislators are always reelected.https://resolver.caltech.edu/CaltechAUTHORS:20170901-140827674Worldwide Persistence, Business Cycles, and Economic Growth
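The Baron-Ferejohn recognition-and-proposal rule described above can be sketched as a simulation step. This is an illustrative simplification, not the paper's model: the acceptance rule is the textbook stationary one (vote yes if and only if offered at least the discounted continuation value), and the recognition weights, discount factor, and minimal-coalition logic are assumptions for the sketch.

```python
import random

def legislative_session(n, recognition_weights, discount=0.95, rng=None):
    """One divide-the-dollar session under a Baron-Ferejohn-style rule:
    a proposer is drawn with probability proportional to the
    (seniority-based) recognition weights, offers a split, and the
    proposal passes if a majority accepts."""
    rng = rng or random.Random(0)
    continuation = discount / n  # symmetric stationary continuation value
    while True:
        proposer = rng.choices(range(n), weights=recognition_weights)[0]
        # Proposer buys the cheapest majority: offers `continuation`
        # to n // 2 other members and keeps the remainder.
        coalition = [i for i in range(n) if i != proposer][: n // 2]
        shares = [0.0] * n
        for i in coalition:
            shares[i] = continuation
        shares[proposer] = 1.0 - continuation * len(coalition)
        yes = 1 + len(coalition)  # proposer plus the bought coalition
        if yes > n // 2:
            # Under this stationary acceptance rule the first proposal
            # passes; on rejection, the loop would redraw a proposer.
            return proposer, shares
```

The seniority system in the paper would enter through unequal recognition weights, giving senior members a larger chance of capturing the proposer's surplus.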
https://resolver.caltech.edu/CaltechAUTHORS:20170901-144231733
DOI: 10.7907/y4vc1-khn90
We study the time series properties of aggregate data drawn from the Penn World Tables using numerical Bayesian procedures which facilitate inference with small samples. We find substantial persistence in world aggregates, and some evidence for a world business cycle. Across economies, there is great dispersion in our measure of persistence of shocks to real gross domestic product. That we also find no evidence of a relationship between growth and persistence sheds light on which of two competing models of endogenous growth is likely to be able to explain the PWT data.https://resolver.caltech.edu/CaltechAUTHORS:20170901-144231733An Overlapping Generations Model Core Equivalence Theorem
https://resolver.caltech.edu/CaltechAUTHORS:20170901-161157147
DOI: 10.7907/vakga-p8980
The classical Debreu-Scarf core equivalence theorem asserts that in an exchange economy with a finite number of agents an allocation (under certain conditions) is a Walrasian equilibrium if and only if it belongs to the core of every replica of the exchange economy. The pioneering work of P. Samuelson has shown that such a result fails to be true in exchange economies with a countable number of agents.
This paper presents a Debreu-Scarf type core equivalence theorem for the overlapping generations (OLG) model. Specifically, the notion of a short-term core allocation for the overlapping generations model is introduced and it is shown that (under some appropriate conditions) an OLG model allocation is a Walrasian equilibrium if and only if it belongs to the short-term core of every replica of the OLG economy.https://resolver.caltech.edu/CaltechAUTHORS:20170901-161157147A Mathematical Proof of Duverger's Law
https://resolver.caltech.edu/CaltechAUTHORS:20170905-145247491
DOI: 10.7907/n9fr9-jvy72
A famous stylized fact in comparative politics, Duverger's Law, is that electoral systems based on single ballot winner-take-all plurality voting will produce bipartisan competition. This paper presents an equilibrium model of elections in which this stylized fact emerges as a logical implication of rational strategic voting behavior by individuals in a large heterogeneous electorate.https://resolver.caltech.edu/CaltechAUTHORS:20170905-145247491Testing Bayes Rule and the Representativeness Heuristic: Some Experimental Evidence
https://resolver.caltech.edu/CaltechAUTHORS:20170901-141503191
DOI: 10.7907/f1eyg-pg070
The psychological literature has identified a number of heuristics which individuals may use in making judgments or choices under uncertainty. Mathematically equivalent problems may be treated differently depending upon details of the decision setting (Gigerenzer et al. (1988), Hinz et al. (1988), Birnbaum and Mellers (1983), Ginossar and Trope (1987)) or upon how the decisions are framed (Tversky and Kahneman (1986)). The results presented in this paper are consistent with those findings and are unsettling. In equivalent problems subjects appear to adopt different strategies in response to observing different data. All problems were inference problems about populations represented by bingo cages, and all randomization was operational and observed by the subjects. Thus one cannot explain the change of decision strategy by appeal to changing reference points, nor should differences between surface and deep structure of problems apply (Wagenaar et al. (1988)). A striking observation from the experiments is the result of employing financial incentives. Some experiments included financial incentives for accuracy and some did not. In the latter experiments the number of nonsense or incoherent responses increased by a factor of three. The majority of subjects in both treatments behaved reasonably, but of those lacking financial incentives a larger proportion gave obviously absurd responses. This suggests that data from decision experiments in which no financial incentives are provided should be treated as possibly contaminated, and that statistical methods robust against outliers should be employed.https://resolver.caltech.edu/CaltechAUTHORS:20170901-141503191Voter Preference for Trade Policy Instruments
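As a point of comparison for the heuristic responses discussed above, the normatively correct Bayes-rule answer for a bingo-cage inference problem is easy to compute exactly. The cage compositions and draw counts below are hypothetical, not the parameters used in the experiments.

```python
from fractions import Fraction

def posterior_cage_a(prior_a, p_red_given_a, p_red_given_b, red_draws):
    """Posterior probability that the observed red draws came from cage A,
    by Bayes rule; draws are i.i.d. within a cage."""
    lik_a = p_red_given_a ** red_draws
    lik_b = p_red_given_b ** red_draws
    return prior_a * lik_a / (prior_a * lik_a + (1 - prior_a) * lik_b)

# Illustrative cages: A is 2/3 red, B is 1/3 red; two red balls drawn.
p = posterior_cage_a(Fraction(1, 2), Fraction(2, 3), Fraction(1, 3), 2)
# Bayes rule gives 4/5; representativeness-style reasoning typically
# over- or under-shoots this normative value.
```

Exact rational arithmetic (via `Fraction`) keeps the normative benchmark free of rounding, which matters when scoring small deviations in subjects' reported probabilities.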
https://resolver.caltech.edu/CaltechAUTHORS:20170901-142822107
DOI: 10.7907/dy2mc-b3x65
We analyze voter preferences for tariffs and production subsidies. The distribution of tax revenues argument shows that voters with high direct tax burdens prefer tariffs to subsidies. The uncertainty argument demonstrates that if actual tariff and subsidy rates are chosen from the set of individually optimal rates then the range of tariff rates is smaller than the range of subsidy rates. Thus, tariffs might be preferred even though they are less efficient. Finally, the large country argument shows that if a country is large then voters whose income shares decline with more protection prefer tariffs to subsidies.https://resolver.caltech.edu/CaltechAUTHORS:20170901-142822107The Effect of Audit Rates on Federal Income Tax Filings and Collections, 1977-1986
https://resolver.caltech.edu/CaltechAUTHORS:20170901-152721831
DOI: 10.7907/xpncq-xzb82
This paper analyzes the effects of audit rates and certain other factors on federal income tax filings and collections. Using data drawn primarily from the Annual Reports of the Commissioner of Internal Revenue for the years 1977-1986, we investigate the overall performance of the federal revenue collection process and estimate that total IRS collections in 1986 would have risen by approximately forty billion dollars had the federal audit rate remained constant at its 1977 level during the intervening period.https://resolver.caltech.edu/CaltechAUTHORS:20170901-152721831Public and Private Information: An Experimental Study of Information Pooling
https://resolver.caltech.edu/CaltechAUTHORS:20170905-141058732
DOI: 10.7907/t4jca-fps61
This paper reports on an experimental study of the way in which individuals make inferences from publicly available information. We compare the predictions of a theoretical model of a common knowledge inference process with actual behavior. In the theoretical model, "perfect Bayesians," starting with private information, take actions; an aggregate statistic is made publicly available; the individuals do optimal Bayesian updating and take new actions; and the process continues until there is a common knowledge equilibrium with complete information pooling. We find that the theoretical model roughly predicts the observed behavior, but the actual inference process is clearly less efficient than the standard of the theoretical model, and while there is some pooling, it is incomplete.https://resolver.caltech.edu/CaltechAUTHORS:20170905-141058732Bayesian Economist ... Bayesian Agents I: An Alternative Approach to Optimal Learning
https://resolver.caltech.edu/CaltechAUTHORS:20170901-161946053
DOI: 10.7907/40s5h-as292
We study the framework of optimal decision making under uncertainty where the agents do not know the full structure of the model and try to learn it optimally. We generalize the results on Bayesian learning based on the martingale convergence theorem to the sequential framework instead of the repeated framework for which results are currently available. We also show that the variability introduced by the sequential framework is sufficient under very mild identifiability conditions to circumvent the incomplete learning results that characterize the literature. We then question the type of convergence so achieved, and give an alternative Bayesian approach whereby we let the economist himself be a Bayesian with a prior on the priors that his agents may have. We prove that such an economist cannot justify endowing all his agents with the same (much less the true) prior on the basis that the model has been running long enough that we can almost surely approximate any agent's beliefs by any other's. We then examine a possibly weaker justification based on the convergence of the economist's measure on beliefs, and fully characterize it by the Harris ergodicity of the relevant Markov kernel. By means of very simple examples, we then show that learning, partial learning, and non-learning may all occur under the weak conditions that we impose. For complicated models where the Harris ergodicity of the Markov kernel in question can neither be proved nor disproved, the mathematical/statistical test of Domowitz and El-Gamal (1989) can be utilized.https://resolver.caltech.edu/CaltechAUTHORS:20170901-161946053Conflict and Stability in Anarchic International Systems
https://resolver.caltech.edu/CaltechAUTHORS:20170905-133457593
DOI: 10.7907/mrr19-r2h04
A considerable part of theory in international relations concerns the issue of whether cooperation and stability can emerge from the competition and self-interest of sovereign powers existing in a state of anarchy. Does anarchy, if ever, imply stability in the form of a balance-of-power, or does stability require restraints which arise from the complex nexus of interdependencies characterizing the contemporary world economy and its associated institutions? The analysis in this essay supposes that nation-states are each endowed with some infinitely divisible resource, which those states maximize and which also measures their ability to overcome adversaries in the event of conflict. In this context we reexamine and reformulate the realist view, by offering a noncooperative, extensive-form model of international conflict without exogenous mechanisms for the enforcement of agreements in order to uncover the conditions under which a balance-of-power as construed by our model ensures the sovereignty of all states in anarchic systems. Our primary conclusion is simple: there exists at least one world, albeit abstract and reminiscent of the frictionless planes with which we introduce the perspectives of physics, in which a balance-of-power ensures sovereignty.https://resolver.caltech.edu/CaltechAUTHORS:20170905-133457593Common Sense or Commonwealth? The Fence Law and Institutional Change in the Postbellum South
https://resolver.caltech.edu/CaltechAUTHORS:20170901-162948024
DOI: 10.7907/e3364-5y288
What causes individuals to change age-old economic, political, and social institutions? "Radical" historians claim that economic elites use their political power to impose institutions that enable them to extract the "labor surplus" more easily. This sharply conflicts with many economists' belief that economic growth comes about as society adopts a new regime of rules so as to capture potential efficiency gains. Whereas previous economists and historians have not addressed each other's concerns, this paper tests these contending hypotheses using an example common to both literatures - fence laws.
As demographic and economic changes permeated the postbellum South, many progressive farmers called on their state legislatures to adopt stock laws which would prohibit grazing animals on unfenced land. Focusing our attention on the same Georgia counties as previous historians have studied, we provide a more comprehensive analysis of the empirical data than has heretofore been given. Previous research on what contemporaries called the fence question has portrayed the conflict as one between the "haves" and the "have nots" - wealthy landowners against yeoman farmers, tenants, and laborers - or between contending "cultures" - believers in a pre-capitalistic "household mode of production" against partisans of national and international capitalistic market relations. Our investigation of the qualitative and quantitative evidence shows that the two-class interpretation is wrongly simple and the cultural gloss is simply wrong. The stock law created potential benefits which crossed class lines and there is little evidence that its opponents rejected the crass cash nexus. The debate, therefore, was not rooted in class conflict, but stemmed from the materialistic goals of individuals concerned about the equitable distribution of costs and benefits of fencing crops and animals.https://resolver.caltech.edu/CaltechAUTHORS:20170901-162948024Realism Versus Neoliberalism: A Formulation
https://resolver.caltech.edu/CaltechAUTHORS:20170901-155001026
DOI: 10.7907/g60kx-23t81
Although the debate between realism and neoliberalism offers deep insights into, and raises fundamental questions about, the nature of international systems, it also offers the confusion that accompanies imprecisely formulated concepts and an imperfect application of subsidiary ideas. Using a noncooperative extensive-form game to model anarchic international systems, this essay seeks to resolve that debate by restating it in a more explicit and deductive context. Arguing that collective security corresponds to the system envisioned by neoliberals, we begin by differentiating between balance of power and collective security in terms of the strategies that characterize the foreign policies of countries. Next, we establish that both balance of power and collective security can correspond to equilibria in our game. Arguments about goals and institutions are then recast in terms of the different properties of these equilibria. In particular, a balance of power equilibrium does not guarantee every country's security, so in it countries must be vigilant about their relative share of resources. A collective security equilibrium, on the other hand, ensures everyone's sovereignty, and thereby allows absolute resource maximization. Unlike a balance of power equilibrium, however, a collective security equilibrium is not strong and it is not necessarily perfect, so the institutional structures facilitating the realization of mutual gains from the variety of cooperative "subgames" characterizing the world economy play a critical role in establishing the stability of that equilibrium.https://resolver.caltech.edu/CaltechAUTHORS:20170901-155001026Equilibria Resistant to Mutation
https://resolver.caltech.edu/CaltechAUTHORS:20170901-133545488
DOI: 10.7907/38p02-apd45
The paper requires that equilibrium behavior for two-person symmetric games be resistant to genetic evolution. In particular, the paper assumes that the evolution of genotypes selecting a behavioral rule can be described according to some generalization of the replicator model. The paper defines an equilibrium concept, 'evolutionary equilibrium', as the limit of stationary points of the evolutionary process as the proportion of the population that mutates goes to zero. The set of evolutionary equilibria, as defined in the paper, is then a nonempty subset of the set of perfect equilibria (and thus of the set of Nash equilibria) and a superset of the set of regular equilibria and the set of ESS.https://resolver.caltech.edu/CaltechAUTHORS:20170901-133545488The Evolution of Partisanship Among Immigrants
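The replicator model invoked in 'Equilibria Resistant to Mutation' above can be sketched for a two-person symmetric game. The Hawk-Dove payoffs and all parameters below are illustrative assumptions of mine, not taken from the paper:

```python
# Minimal sketch of a discrete-time replicator dynamic for a symmetric
# 2x2 game. Payoffs and parameters are illustrative, not from the paper.
def replicator_share(payoff, p0, steps=10_000, dt=0.01):
    """Evolve the population share p of strategy 0 under the replicator rule."""
    p = p0
    for _ in range(steps):
        f0 = payoff[0][0] * p + payoff[0][1] * (1 - p)  # fitness of strategy 0
        f1 = payoff[1][0] * p + payoff[1][1] * (1 - p)  # fitness of strategy 1
        p += dt * p * (1 - p) * (f0 - f1)               # replicator update
    return p

# Hawk-Dove with V=2, C=4: the unique ESS plays Hawk with probability
# V/C = 0.5, and the dynamic converges to it from interior starting points.
hawk_dove = [[-1.0, 2.0], [0.0, 1.0]]
p_star = replicator_share(hawk_dove, p0=0.2)
```

Stationary points of such a process that survive small mutation rates are the candidates for the paper's 'evolutionary equilibria'; ESS, as the abstract notes, are among them.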
https://resolver.caltech.edu/CaltechAUTHORS:20170905-151355697
DOI: 10.7907/z56vg-dm021
This paper examines evidence on the partisanship of immigrants and second generation citizens as a first step in assessing the impact of recent compositional changes in the electorate. Drawing upon a sample of 574 Latino- and 308 Asian-Americans, we find that the longer Latino immigrants have been in the United States, the more likely they are to identify as Democrats and the more intensely they hold their partisan attachments. Asian immigrants, in contrast, exhibited no such trends in the direction of their party choice or in their partisan identity. We also find strong age-related gains in Democratic support and in partisan intensity among subsequent generations of Latinos. We strongly suspect that these too are experience-related, but we are not able to discount equally plausible cohort-based scenarios.https://resolver.caltech.edu/CaltechAUTHORS:20170905-151355697Selection Bias in Linear Regression, Logit and Probit Models
https://resolver.caltech.edu/CaltechAUTHORS:20170905-134432262
DOI: 10.7907/1nz72-m8808
Missing data are common in observational studies due to self-selection of subjects. Missing data can bias estimates of linear regression and related models. The nature of selection bias and econometric methods for correcting it are described. The econometric approach relies upon a specification of the selection mechanism. We extend this approach to binary logit and probit models and provide a simple test for selection bias in these models. An analysis of candidate preference in the 1984 U.S. presidential election illustrates the technique.https://resolver.caltech.edu/CaltechAUTHORS:20170905-134432262Information Aggregation in Two-Candidate Elections
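The bias discussed in 'Selection Bias in Linear Regression, Logit and Probit Models' above can be seen concretely in a self-contained simulation. The distributional assumptions are mine for illustration, not the paper's data: when the sample is self-selected on the outcome, the least-squares slope on the observed subsample is attenuated even though the full-sample regression recovers the true coefficient.

```python
import random

# Illustrative simulation (assumptions mine, not from the paper):
# self-selection on the outcome biases OLS toward zero.
def ols_slope(xs, ys):
    """Least-squares slope of ys on xs."""
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sxx = sum((a - mx) ** 2 for a in xs)
    return sxy / sxx

random.seed(0)
n = 100_000
x = [random.gauss(0, 1) for _ in range(n)]
y = [xi + random.gauss(0, 1) for xi in x]      # true slope is 1

slope_full = ols_slope(x, y)                   # consistent for 1

sel = [(a, b) for a, b in zip(x, y) if b > 0]  # observe only when y > 0
slope_sel = ols_slope([a for a, _ in sel], [b for _, b in sel])
# slope_sel is attenuated relative to slope_full
```

The econometric corrections the abstract describes model exactly this selection mechanism so that the attenuation can be undone.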
https://resolver.caltech.edu/CaltechAUTHORS:20170905-153759955
DOI: 10.7907/96gx6-bw273
Many interesting political institutions, such as campaigning, polls, and sequences of elections, cannot be understood in the context of standard spatial competition models of elections with fully informed candidates and voters. To fill this void, we introduce a model of elections in which candidates are privately and asymmetrically informed about the electorate. This model differs from other incomplete information models, such as the rational expectations model, in that a full range of sequential strategic behavior is considered. We begin with a model in which candidates can constantly revise their positions before the election. In this case, one might expect each to "invert" the other's strategies and infer the other's private information, as is done in equilibrium with rational expectations. However, we find that each candidate, knowing the other will try to make such inferences, will follow a strategy which is not invertible. No information will leak from one candidate to the other. The outcome will be identical to a single-move election with incomplete information, and no information aggregation will occur.
The introduction of a public poll changes the results in an interesting way. Candidates still use pooling strategies (strategies that are constant on their private information) to avoid leaking anything to the opponent but, contrary to the case without the poll, candidates learn about the electorate before the election. In equilibrium, candidates use mixed strategies (pure strategy equilibria do not exist) and the better informed player cannot prevent the less informed from learning from the poll. No private information is leaked but information aggregation occurs. We conclude with an examination of the effect on information aggregation of a sequence of elections. In the previous results, candidates' moves were "free" in the sense that revisions were costless as in a "cheap talk" model. Now moves are not free, and hiding information today in order to improve one's chances of winning tomorrow may lower one's chances today. We show that information aggregation may occur both through the results of the election (as with the poll) and through the leakage of private information. We also provide an example in which the strategic choices of the candidates are skewed away from the rational expectations equilibrium. Because of the asymmetric information and the strategic issues surrounding information leakage, behavior is different than would be observed in simple one-shot elections.https://resolver.caltech.edu/CaltechAUTHORS:20170905-153759955When is the Core Equivalence Theorem Valid?
https://resolver.caltech.edu/CaltechAUTHORS:20170901-160350264
DOI: 10.7907/24902-y0m54
In 1983 L. E. Jones exhibited a surprising example of a weakly Pareto optimal allocation in a two consumer pure exchange economy that failed to be supported by prices. In this example the price space is not a vector lattice (Riesz space). Inspired by Jones' example, A. Mas-Colell and S. F. Richard proved that this pathological phenomenon cannot happen when the price space is a vector lattice. In particular, they established that (under certain conditions) in a pure exchange economy the lattice structure of the price space is sufficient to guarantee the supportability of weakly Pareto optimal allocations by prices; i.e., they showed that the second welfare theorem holds true in an exchange economy whose price space is a vector lattice. In addition, C. D. Aliprantis, D. J. Brown and O. Burkinshaw have shown that when the price space of an exchange economy is a certain vector lattice, the Debreu-Scarf core equivalence theorem holds true, i.e., the sets of Walrasian equilibria and Edgeworth equilibria coincide. (An Edgeworth equilibrium is an allocation that belongs to the core of every replica economy of the original economy.) In other words, the lattice structure of the price space is a sufficient condition for avoiding the pathological situation occurring in Jones' example.
This work shows that the lattice structure of the price space is also a necessary condition. That is, "optimum" allocations in an exchange economy are supported by prices (if and) only if the price space is a vector lattice. Specifically, the following converse-type result of the Debreu-Scarf core equivalence theorem is established: If in a pure exchange economy every Edgeworth equilibrium is supported by prices, then the price space is necessarily a vector lattice.https://resolver.caltech.edu/CaltechAUTHORS:20170901-160350264Dynamic Tariffs with Asymmetric Information
https://resolver.caltech.edu/CaltechAUTHORS:20170901-143535221
DOI: 10.7907/ev9rf-aev27
Recent work in game theory has demonstrated how cooperative outcomes can be sustained when the game is played repeatedly and defectors are punished, even though agents play non-cooperatively. This methodology is applied here to determine when two countries can sustain freer trade given that they determine trade policies non-cooperatively.
We focus on the role of asymmetric information. Each country has private information about the extent of its own protection, so the overall level of protection can be thought of as private information. Therefore, any agreement to eliminate or reduce tariffs is limited by the fact that countries can cheat on the agreement by using non-observable forms of protection.
Using import trigger strategies, cooperation (in the form of low tariffs) can be supported. There are periodic reversionary (high-tariff) episodes which necessarily occur. They are not the result of mistakes, attempted manipulation, or misperception. Neither country cheats on the low-tariff agreement, but reversions to high tariffs are triggered by random fluctuations in imports.
In section V we examine a slightly different trigger strategy. Countries' strategies are based on their observations of the terms of trade. This alteration changes the results, and in this case cooperation does not occur.https://resolver.caltech.edu/CaltechAUTHORS:20170901-143535221Expert Opinions and Taxpayer Compliance: A Strategic Analysis
https://resolver.caltech.edu/CaltechAUTHORS:20170901-153557344
DOI: 10.7907/qwqsw-02p94
In this paper we examine the incentives for taxpayers to claim risky deductions and to solicit expert opinions to support their positions, and for the tax agency to distinguish among individuals who do and do not solicit expert opinions for the purposes of auditing. We also consider the implications of an ex ante constraint on the tax agency which requires it to treat all taxpayers who take the deduction alike in terms of audit rates, whether or not they solicit an expert opinion. Finally, we examine the effects of regulations which limit the degree of riskiness for which a supporting opinion can be justified as well as the effects of changes in various penalty rates.https://resolver.caltech.edu/CaltechAUTHORS:20170901-153557344Strategy and Choice in the 1988 Presidential Primaries
https://resolver.caltech.edu/CaltechAUTHORS:20170905-152412682
DOI: 10.7907/qv2r8-5d988
In recent years, thinking about the American Presidential primaries has been dominated by the image of Carter's victory in 1976. Conventional wisdom in the eighties has advised Presidential candidates to focus on the early contests in Iowa and New Hampshire, and to at least match, or better yet, exceed the expectations that the press, pollsters, and pundits have for them in those states. The successful campaign, it was thought, had to force the competition out by the end of March in order to lock up the nomination before the convention. This common wisdom - the so-called "momentum theory" - will now have to be revised as a result of what happened in the 1988 primaries. While one candidate from each party did eventually emerge victorious in 1988, no one followed the Carter script as closely as expected. The Democratic race was not clearly resolved until Dukakis managed consecutive victories over Jackson in Wisconsin (April 5), New York (April 19) and Pennsylvania (April 26). On the Republican side, even though the race was over after Super Tuesday, the conventional "momentum" story was still marred by the odd (and, in the end, meaningless) outcome in the Iowa Republican caucuses. Bush exceeded expectations in Iowa, but in a negative direction, and both Dole and Robertson were unable to convert their successes into any advantage in New Hampshire and the South.
In this paper, we use data from a series of 12 exit polls conducted by the Los Angeles Times to explain the course of the 1988 Democratic and Republican presidential primary campaigns. The Los Angeles Times sample of primaries includes the critical early Democratic and Republican contests in Iowa and New Hampshire, six Super Tuesday states, and the Democratic primaries in Illinois, New York, Pennsylvania and California. The story we tell is quite simple. Momentum in both races was slowed by regionalism, and in the Democratic contest specifically, by the non-strategic support that blacks and affluent liberal whites gave Jesse Jackson. Momentum accelerated in the New York and Pennsylvania primaries as moderate and Jewish voters strategically switched to Dukakis in order to block Jesse Jackson's nomination. Both Bush and Dukakis staked out positions for themselves near the center of the ideological spectrum in their respective parties. As it turned out, the distribution of voter preferences, combined with strategic complications typical of multicandidate races, served to make the middle an advantageous spot for the victorious candidates.https://resolver.caltech.edu/CaltechAUTHORS:20170905-152412682A Stone-Weierstrass Theorem without Closure under Suprema
https://resolver.caltech.edu/CaltechAUTHORS:20170901-135500931
DOI: 10.7907/nscgn-97p45
For a compact metric space X, consider a linear subspace A of C(X) containing the constant functions. One version of the Stone-Weierstrass theorem states that, if A separates points, then the closure of A under both minima and maxima is dense in C(X). Similarly, by the Hahn-Banach theorem, if A separates probability measures, A is dense in C(X). We show that if A separates points from probability measures, then the closure of A under minima is dense in C(X). This theorem has applications in Economic Theory.https://resolver.caltech.edu/CaltechAUTHORS:20170901-135500931State Income Tax Amnesties 1: Causes
https://resolver.caltech.edu/CaltechAUTHORS:20170901-151939526
DOI: 10.7907/xghen-f3m05
The purpose of this paper is to analyze empirically, for the years 1980-88, the factors which led states with state income taxes to run tax amnesty programs. We find a principal factor to be the level of IRS auditing; in particular, we find that states have tended to "free-ride" on the IRS: if the IRS is active in a state, then that state is less likely to run a tax amnesty program. Indeed, our estimates indicate that had the IRS audit rate remained constant during the 1980-88 period (instead of falling by almost one-half), then the cumulative probability that an average state would have a tax amnesty by 1988 would have fallen by almost one-half compared to its actual level.https://resolver.caltech.edu/CaltechAUTHORS:20170901-151939526Organizational Diseconomies of Scale
https://resolver.caltech.edu/CaltechAUTHORS:20170901-134249115
DOI: 10.7907/0kpxm-31v87
This paper models strategic behavior within firms. The principal (e.g., the firm's owner) is handicapped by not knowing as much about the firm's capabilities as the agent(s) (e.g., the manager). The agent can extract some rents from his private information. The principal can retrieve some of these rents at the expense of introducing a distortion, paying the agent less than the full value of his marginal product. As a result the firm operates inefficiently. The degree of this inefficiency varies with demand elasticity and with the length of the firm's managerial hierarchy. The costs of operating the hierarchy create a limit to the size of the firm.https://resolver.caltech.edu/CaltechAUTHORS:20170901-134249115The Distributive Effects of the Federal Energy Tax Act
https://resolver.caltech.edu/CaltechAUTHORS:20170906-143432906
DOI: 10.7907/5sc8m-gcb63
This paper examines the distributional consequences of the tax credits implemented by the Federal Energy Tax Act of 1978. The distributional effects are of interest both for their own sake, and because they have implications for the cost-effectiveness of the credits. If rates of return to conservation are higher for individuals who consume less housing, as earlier evidence suggests, then conservation incentive programs can achieve larger benefits for a given cost if they are distributionally more progressive.
We explain the amount of credit claimed by taxpayers using a tobit model, in which credits claimed are a function of variables that affect the net benefit of weatherization. We estimate the model using data from the 1979 Taxpayer Compliance Measurement Program conducted by the Internal Revenue Service. We find that credits claimed are significantly higher where winters are more severe, where energy prices are high or rising rapidly, and where individuals have higher incomes and spend more on housing.
Progressivity indices based on Lorenz-Gini measures of inequality reveal that the tax credits were somewhat regressive, even holding climate and energy prices constant. This suggests that the credits may have been ineffectively targeted. In addition, we find no evidence that the credits had a measurable incentive effect, suggesting that they have largely provided windfall gains to households who would have insulated anyway.https://resolver.caltech.edu/CaltechAUTHORS:20170906-143432906Monopoly Provision of Product Warranties
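The Lorenz-Gini inequality measures behind the progressivity indices in 'The Distributive Effects of the Federal Energy Tax Act' above can be sketched with a minimal Gini coefficient helper. The function name and the particular rank-based formula are my choices for illustration, not the paper's:

```python
# Minimal Gini coefficient sketch (formula choice is illustrative,
# not taken from the paper): 0 for perfect equality, approaching 1
# as one unit holds everything.
def gini(values):
    """Gini coefficient via the standard rank-weighted formula on sorted data."""
    v = sorted(values)
    n = len(v)
    total = sum(v)
    weighted = sum((i + 1) * vi for i, vi in enumerate(v))  # rank * value
    return 2.0 * weighted / (n * total) - (n + 1.0) / n
```

Progressivity indices of the sort the abstract mentions compare such inequality measures for the credits claimed against those for income.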
https://resolver.caltech.edu/CaltechAUTHORS:20170906-133509554
DOI: 10.7907/y7mrn-83m58
This article considers the problem of monopoly provision of product warranties when consumers are heterogeneous and when the probability of product malfunction depends on both the quality of the product and on the consumers' care. The optimal warranty contract is characterized to maximize the expected profit for the monopolistic seller. The properties of the optimal contract depend on the nature of the product. If the quality of the product is more important as a determinant of reliability than consumer care, then standard results are obtained; that is, positive correlations between warranties and reliability and between price and reliability are observed, and higher-type buyers buy more expensive versions of the product with higher warranties. On the other hand, if consumer care is more important in increasing reliability, the results are exactly opposite; for example, there is a negative correlation between warranty coverage and reliability. Also, when consumer care is important, higher-type buyers buy versions of the product with lower warranty and lower quality. Other features of the optimal warranty contract are also characterized in this paper.https://resolver.caltech.edu/CaltechAUTHORS:20170906-133509554Before Plessy, Before Brown: The Development of the Law of Racial Integration in Louisiana and Kansas
https://resolver.caltech.edu/CaltechAUTHORS:20170906-133514445
DOI: 10.7907/mgy48-f7k47
In the face of the Nixon-Reagan counterrevolution against liberal decisions of the Warren Court, some liberal judges and legal commentators have called for an increased reliance on state courts for the protection of civil rights and civil liberties. To gauge how well state courts and legislatures protected civil rights in the nineteenth century, I examined twenty school integration cases and numerous legislative and state constitutional convention actions in Louisiana and Kansas from 1868 through 1903.
Contrary to what Raoul Berger and others have asserted, black integrationists had many allies in the mainstream of the Republican party in the late 19th century. Not only did they pass laws prohibiting the exclusion of children from any school because of race, color, or previous condition of servitude, but they represented black plaintiffs in numerous school integration cases, most of which have previously been unknown to or at least little noticed by scholars. At least one judge ruled segregation contrary to the Fourteenth Amendment, while another came close to doing so. The arguments of lawyers, legislators, and black petitioners to legislative bodies were all similar and often quite sophisticated. In particular, the unpublished briefs in three Louisiana cases made clear how intermixed contentions based on state and national constitutions were. If the state constitution and laws created a right and the national constitution and laws prohibited unequal enjoyment of state-created rights, then legal inequities violated rights on both governmental levels simultaneously.
From 1877 on in Louisiana, and from 1903 on in Kansas, blacks lost the strong protection against unequal schools that they had enjoyed, at least de jure, earlier. Whether the reversals reflected shifts in white public opinion is unclear, for it was not the white populace that made the changes, but a new, younger set of white racist judges. Their ability to reverse or bypass earlier liberal judicial decisions or legal provisions demonstrates how fragile rights can be in the several states and undermines the empirical foundations of what might be called "the new states' rights."https://resolver.caltech.edu/CaltechAUTHORS:20170906-133514445Incentive Compatibility
https://resolver.caltech.edu/CaltechAUTHORS:20170906-145751798
DOI: 10.7907/k0pn8-m1r25
Incentive compatibility is described and discussed. A summary of the current state of understanding is provided. Key words are: incentive compatibility, game theory, implementation, mechanism, Bayes, Nash, and revelation.https://resolver.caltech.edu/CaltechAUTHORS:20170906-145751798The State of Social Science History in the Late 1980's
https://resolver.caltech.edu/CaltechAUTHORS:20170906-135824096
DOI: 10.7907/8sj1x-7mm18
Is social science history a dated fad, or has it been so fully accepted as to have become uncontroversial? Is it more or less popular with professors and graduate students today than in the recent past? Is its status higher at the most prestigious universities, or among their graduates, than at less highly-ranked colleges? What do historians and other social scientists see as the strengths and weaknesses, the achievements and deficiencies of social science history (hereafter referred to as "ssh")? To what degree do more traditional historians agree or disagree with social scientific historians and historically-oriented social scientists about these matters? How widespread is the teaching of statistics and theory in history departments, and how sophisticated is it, compared to the offerings in social science departments? Has the field become truly interdisciplinary?https://resolver.caltech.edu/CaltechAUTHORS:20170906-135824096Repeated Auctions of Incentive Contracts, Investment and Bidding Parity with an Application to Takeovers
https://resolver.caltech.edu/CaltechAUTHORS:20170906-142233328
DOI: 10.7907/344ch-yeg95
This paper considers a two-period model of natural monopoly and second sourcing. The incumbent supplier invests in the first period. After observing the incumbent's first-period performance, the buyer may break out in the second. The investment may or may not be transferable to the second source, and may be monetary or in human capital.
The paper determines whether the incumbent should be favored at the reprocurement stage, and how the slope of his incentive scheme should evolve over time. Our analysis implies that the gains from second sourcing are not as high as one might hope.
Last, the paper reinterprets the second source as a raider, and the breakout as a takeover. It discusses the desirability of defensive tactics, and obtains a rich set of testable implications concerning the size of managerial stock options, the amount of defensive tactics, the firm's performance, and the probability of a takeover.https://resolver.caltech.edu/CaltechAUTHORS:20170906-142233328Allocating Uncertain and Unresponsive Resources
https://resolver.caltech.edu/CaltechAUTHORS:20170906-135234942
DOI: 10.7907/mhs8g-89z20
We identify an important class of economic problems that arise naturally in several applications: the allocation of multiple resources when there are uncertainties in demand or supply, unresponsive supplies (no inventories and fixed capacities), and significant demand indivisibilities (rigidities). Examples of such problems include scheduling job shops, airports or super-computers, zero-inventory planning, and the allocation and pricing of NASA's planned Space Station. We show that the two most common organizations used to deal with this problem, markets and administrative procedures, can perform at very low efficiencies (60-65 percent efficiency in a seemingly robust example). Thus, there is a need to design new mechanisms that more efficiently allocate resources in these environments. We develop and analyze two that arise naturally from auctions used in the allocation of single-dimensional goods. These new mechanisms involve computer-assisted coordination made possible by the existence of networked computers. Both mechanisms significantly improve on the performance of both administrative and market procedures.https://resolver.caltech.edu/CaltechAUTHORS:20170906-135234942Incentives and a Process Converging to the Core of a Public Goods Economy
https://resolver.caltech.edu/CaltechAUTHORS:20170906-140940957
DOI: 10.7907/yacdq-8jg46
The paper considers economies involving one public good, one private good, and constant returns to scale. It is shown that the process proposed earlier in Chander (1983, 1987a, and 1987b) always converges to an allocation which is in the core of the economy. This is then interpreted as an incentive property of the process and it is shown that there exists no process which always converges to the core and in which truth-telling constitutes a dominance equilibrium of the 'local incentive game'.https://resolver.caltech.edu/CaltechAUTHORS:20170906-140940957The Optimal Product-Mix for a Monopolist in the Presence of Congestion Effect: A Model and Some Results
https://resolver.caltech.edu/CaltechAUTHORS:20170906-144753357
DOI: 10.7907/dcyf5-6qx35
The paper develops a model of product differentiation in which the quality of a product may be negatively affected by the number of consumers buying it, as is the case for any good affected by congestion. It is shown that for any positive degree of heterogeneity among the consumers, a monopolist will always find it more profitable to differentiate, i.e., to sell more than one quality of the product at different prices.https://resolver.caltech.edu/CaltechAUTHORS:20170906-144753357Renegotiation and the Form of Efficient Contracts
https://resolver.caltech.edu/CaltechAUTHORS:20170906-151944637
DOI: 10.7907/8x4wv-7kr95
Two parties may agree to a mutually binding contract that will govern their behavior after an uncertain event becomes known. As there is no agent who can both observe this uncertain outcome and enforce the contract, contingent agreements are precluded. However, the parties recognize that the uncertain event will be common knowledge for them, and that they will be able to renegotiate the contract voluntarily, provided that they both gain in doing so. When structuring the original contract they can foresee this renegotiation phase. Efficient contracts are those that perform best when taking this into account. This paper studies the form of such efficient contracts. It is shown that it is always better to have a contract than it is to have none, no matter which party has the preponderance of bargaining strength in the renegotiation phase. We also study whether renegotiation can substitute completely for the absence of contingent contracts. We characterize a family of cases where it can, and we present some "second-best" results in others, where it cannot.https://resolver.caltech.edu/CaltechAUTHORS:20170906-151944637The Geographical Imperatives of the Balance of Power in 3-Country Systems
https://resolver.caltech.edu/CaltechAUTHORS:20170907-151136008
DOI: 10.7907/yreww-09686
This essay extends a cooperative game-theoretic model of balance of power in anarchic international systems to include considerations of the asymmetry which geography occasions in the offensive and defensive capabilities of countries. The two substantive ideas which concern us are a formalization of the notion of a "balancer" and that of a "central power." What we show is that in stable systems, only specific countries (such as Britain in the 18th and the 19th centuries) can play the role of balancer, and that the strategic imperatives of a central country (e.g., Germany in the period 1871-1945) differ in important ways from those of "peripheral" countries.https://resolver.caltech.edu/CaltechAUTHORS:20170907-151136008Private Incentives in Social Dilemmas: The Effect of Incomplete Information and Altruism
https://resolver.caltech.edu/CaltechAUTHORS:20170907-151800608
DOI: 10.7907/b0q0b-b0t12
This paper analyzes the provision of discrete public goods when individuals have altruistic preferences which others do not precisely know. The problem is formulated and solved as a Bayesian game. In contrast to standard social psychological approaches, based on such natural language terms as greed, fear, and trust, the Bayesian approach provides a rigorous mathematical treatment of social participation. This theory is shown to make strong testable predictions that can integrate data collected across a wide variety of natural and experimental settings. The altruism model is shown to be supported by existing experimental data on binary voluntary contribution games.https://resolver.caltech.edu/CaltechAUTHORS:20170907-151800608Presidential Management of the Bureaucracy: The Reform of Motor Carrier Regulation at the ICC
https://resolver.caltech.edu/CaltechAUTHORS:20170907-155902844
DOI: 10.7907/ea1bt-tqw98
Deregulation of the motor freight industry poses a serious challenge to social scientists' understanding of how regulatory systems operate. In this research, it is argued that the dismantling of the regulatory regime at the Interstate Commerce Commission was a product of presidential management of the bureaucracy through guidance of the status quo. Reform was a function of neither group influence nor a transition from one structure-induced equilibrium to another. None of the variety of alternative explanations fares any better. It was the chief executive's ability to capitalize on presidential resources through strategic utilization of political rules and processes that was fundamental.
In a world characterized by imperfect information and bounded rationality, presidents have opportunities for manipulating the status quo and circumventing the policy preferences of regulatees and legislators. Motor carrier reform is simply an extreme example of a general phenomenon. Besides simply highlighting the need to incorporate additional actors into political economy models of bureaucratic behavior, this research shows that careful attention must be paid to detailing how these rules and processes affect bureaucratic performance.https://resolver.caltech.edu/CaltechAUTHORS:20170907-155902844Testing Theories of Legislative Voting: Dimensions, Ideology and Structure
https://resolver.caltech.edu/CaltechAUTHORS:20170908-135657742
DOI: 10.7907/63fpw-k3846
While dimensional studies of legislative voting find a single ideological dimension (Schneider 1979, Poole and Rosenthal 1985b), regression estimates find constituency and party dominant (Kau and Rubin 1979, Peltzman 1984), and ideology secondary (Kalt and Zupan 1984). This paper rescales the dimensional findings to show their improved classification success over the null hypothesis that votes are not unidimensional. With the rescaling, most votes are not explained by one dimension, and several dimensions are important.
Nevertheless, fewer dimensions are found than constituents' preferences suggest. Thus a model is developed where transactions costs of building coalitions reduce the number of dimensions. When legislative parties build internal coalitions to pass and defeat bills, voting on randomly drawn bills has a single party-oriented dimension. And natural ideological dimensions are reinforced if parties write bills and logroll along natural lines of cohesion.https://resolver.caltech.edu/CaltechAUTHORS:20170908-135657742The Determinants of Interest Group Membership: Estimating Choice-Based Probability Models of Contribution Decisions
https://resolver.caltech.edu/CaltechAUTHORS:20170907-143516544
DOI: 10.7907/b8yhv-k0251
There has been much theorizing about why individuals join interest groups. However, little has been done to test the resulting propositions because of the difficulties associated with empirically analyzing the joining decision. This deficiency is especially great when it comes to public or symbolic interest groups. In this analysis, choice-based probability methods are employed that permit the combination of data from the 1980 National Election Study with comparable information about Common Cause members and the estimation of models of the participation calculus. Besides demonstrating the applicability of the choice-based methodology, this analysis shows the primary importance of political interest and policy preferences for the membership choice. Citizens who are politically interested and have preferences that roughly match an organization's reputation find that associational membership has both greater benefits and lower costs for them than it does for others. An ability to pay is irrelevant, regardless of educational attainment and despite members' high incomes. Organizational leaders deliberately keep the costs of membership low relative to most citizens' ability to pay; this encourages potential contributors to join in order to learn about the organization.https://resolver.caltech.edu/CaltechAUTHORS:20170907-143516544Competition on Many Fronts: A Stackelberg Signalling Equilibrium
https://resolver.caltech.edu/CaltechAUTHORS:20170907-142614953
DOI: 10.7907/s6fqa-87614
A single economic agent controls a variety of activities. Each activity is associated with a privately observed piece of information. The information is relevant to the actions he will take in this activity, and to the vulnerability of this activity to attack by another agent. Actions should be chosen so as partially to hide the private information, as well as to be efficient in the productive sense. This paper gives a characterization of the optimal association of actions to activities based on the private information available. Some applications are discussed.https://resolver.caltech.edu/CaltechAUTHORS:20170907-142614953A Decade of Experimental Research on Spatial Models of Elections and Committees
https://resolver.caltech.edu/CaltechAUTHORS:20170907-153928970
DOI: 10.7907/bc0gn-qhp30
The Euclidean representation of political issues and alternative outcomes, and
the associated representation of preferences as quasi-concave utility functions, are by
now a staple of formal models of committees and elections. This theoretical
development, moreover, is accompanied by a considerable body of experimental research.
We can view that research in two ways: as a test of the basic propositions about
equilibria in specific institutional settings, and as an attempt to gain insights into
those aspects of political processes that are poorly understood or imperfectly
modeled, such as the robustness of theoretical results with respect to procedural
details and bargaining environments. This essay reviews that research so that we can
gain some sense of its overall import.https://resolver.caltech.edu/CaltechAUTHORS:20170907-153928970Three Problems of the Theory of Choice on Random Sets
https://resolver.caltech.edu/CaltechAUTHORS:20170907-145019226
DOI: 10.7907/x0ev8-14e12
This paper discusses three problems which are united not only by the common topic of
research stated in the title, but also by a somewhat surprising interlacing of the methods and
techniques used.
In the first problem, an attempt is made to resolve a very unpleasant metaproblem arising in
general choice theory: why the conditions of rationality are not really necessary or, in other words,
why in everyday life we are quite satisfied with choice methods that are far from ideal. The
answer, substantiated by a number of results, is as follows: situations in which the choice function
"misbehaves" are very seldom met in large presentations.
In the second problem, an overview is given of our studies of the statistical
properties of choice. One of the most astonishing phenomena found when we deviate from scalar-extremal
choice functions is the stable multiplicity of choice. If our presentation is random, then a
random number of alternatives is chosen in it. But how many? The answer is not trivial, and may be
sought in many different directions. As we shall see below, a bottleneck case was usually considered
in seeking the answer. It is interesting to note that statistical information strongly affects the properties of the
problem.
The third problem is devoted to a model of a real life choice process. This process is
typically spread in time, and we gradually (up to the time of making a final decision) accumulate
experience, but once a decision is made we are not free to reject it. In the classical statement (i.e.
when "optimality" is measured by some number) this model is referred to as a "secretary problem",
and a great deal of literature is devoted to its study. We consider the case when the notions of
optimality are most general. As will be seen below, the best strategy is practically determined by
only the statistical properties of the corresponding choice function rather than its specific form.https://resolver.caltech.edu/CaltechAUTHORS:20170907-145019226Adverse Selection and Renegotiation in Procurement
https://resolver.caltech.edu/CaltechAUTHORS:20170907-141300917
DOI: 10.7907/gzx4m-b7s90
As was shown by Dewatripont, optimal long-term contracts are generally not sequentially optimal. The parties renegotiate them ex post to their mutual advantage. This paper fully characterizes the equilibrium of a simple two-period procurement situation and studies the extent to which renegotiation reduces ex-ante welfare: i) A central result is that, as in the non-commitment case, the second-period allocation is optimal for the principal conditional on his posterior beliefs about the agent. ii) The first-period allocation exhibits an increasing amount of pooling as the discount factor grows. iii) With a continuum of types, it is never optimal to induce full separation. The paper also analyzes whether renegotiated long-term contracts yield outcomes resembling those under either unnegotiated long-term contracts or a sequence of short-term contracts, and it links the analysis with the multiple-unit durable-good monopoly problem.https://resolver.caltech.edu/CaltechAUTHORS:20170907-141300917The Reintegration of Political Science and Economics and the Presumed Imperialism of Economic Theory
https://resolver.caltech.edu/CaltechAUTHORS:20170908-134843776
DOI: 10.7907/m88qf-94d81
No discipline can claim a greater impact on contemporary political theorizing
than that of economics, whether that theorizing concerns the study of legislatures,
elections, international affairs, or judicial processes. This essay questions,
however, whether this impact is a form of "economic imperialism," or the logical
development of two disciplines whose artificial separation in the first part of this
century merely allowed the development and refinement of the rational choice paradigm,
unencumbered by the necessity for considering all of reality. Indeed, applications to
specific substantive political matters -- most notably collective and cooperative
processes where game theory proves most relevant -- reveal the paradigm's
incompleteness. These applications, however, illuminate the necessary theoretical
extensions, which are no longer the sole domain of the economist.https://resolver.caltech.edu/CaltechAUTHORS:20170908-134843776Testing the Democratic Hypothesis in the Provision of Local Public Goods
https://resolver.caltech.edu/CaltechAUTHORS:20170906-153557950
DOI: 10.7907/jyth7-tgm96
The financing of local public goods in French communities can be
viewed, until 1980, as a one-dimensional choice. We propose a model to
formalize this choice, which results in the best choice of the "median" agent
in a population in which two types of citizens have been distinguished:
those who pay and those who do not pay the "taxe professionnelle". A
translog specification of the model is estimated using data about 36
communities near the city of Toulouse, France. The democratic hypothesis
according to which both types of agents mentioned above have the same
weight in the decision process is rejected. Moreover, we do not reject the
hypothesis that this non-democratic bias decreases with the size of the city.https://resolver.caltech.edu/CaltechAUTHORS:20170906-153557950Mechanism Design with Incomplete Information: A Solution to the Implementation Problem
https://resolver.caltech.edu/CaltechAUTHORS:20170907-152638547
DOI: 10.7907/jh4qb-j1857
The main result of this paper is that the multiple equilibrium problem in mechanism design can be avoided in private value models if agents do not use weakly dominated strategies in equilibrium. We show that in such settings, any incentive compatible allocation can be made the unique equilibrium outcome to a mechanism. We derive a general necessary condition for unique implementation which implies that the positive result for private value models applies with considerably less generality to common value settings and to situations in which an agent's information does not index the agent's preferences.https://resolver.caltech.edu/CaltechAUTHORS:20170907-152638547The Rationally Uninformed Electorate: Some Experimental Evidence
https://resolver.caltech.edu/CaltechAUTHORS:20170906-162649254
DOI: 10.7907/2qm0h-kp137
This essay reports on a series of twenty-four election experiments in which voters are allowed to decide between voting retrospectively and purchasing contemporaneous information about the candidate challenging the incumbent. Each experiment consists of a series of election periods in which dummy candidates choose spatial positions which represent either their policy while in office or a promise about policy if elected. Subjects (voters) are told the value to them of the incumbent's policy, but they must decide, prior to voting, whether or not to purchase information about the value of the challenger's promise. In general, our data conform to reasonable expectations: voters purchase less information and rely more on retrospective knowledge when the candidates' strategies are stable, and their likelihood of purchasing information during periods of instability is tempered by the likelihood that their votes matter, by the reliability of the information available for purchase, and by the degree of instability as measured by changes in each voter's welfare.https://resolver.caltech.edu/CaltechAUTHORS:20170906-162649254Cramer-Rao Bounds for Misspecified Models
https://resolver.caltech.edu/CaltechAUTHORS:20170823-162930200
DOI: 10.7907/7j7ke-zja80
In this paper, we derive some lower bounds of the Cramer-Rao type for the covariance
matrix of any unbiased estimator of the pseudo-true parameters in a parametric model that may be
misspecified. We obtain some lower bounds when the true distribution belongs either to a
parametric model that may differ from the specified parametric model or to the class of all distributions
with respect to which the model is regular. As an illustration, we apply our results to the normal
linear regression model. In particular, we extend the Gauss-Markov Theorem by showing that
the OLS estimator has minimum variance in the entire class of unbiased estimators of the pseudo-true
parameters when the mean and the distribution of the errors are both misspecified.https://resolver.caltech.edu/CaltechAUTHORS:20170823-162930200The Route to Activism is through Experience: Contributor Mobilization in Interest Groups
https://resolver.caltech.edu/CaltechAUTHORS:20170907-144204258
DOI: 10.7907/j8ghv-ys872
Why members are organizational activists has received little attention, despite its obvious
importance for many associations. In this analysis, a theory of experiential search is applied to the
activist decision calculus of Common Cause members. Most such volunteers join an organization as
members of the rank and file, learn about the group and its operations, and then decide to become
activists. They are largely motivated by what they learn about those benefits that accrue exclusively
to activists. Their actions also may suggest a tendency toward organizational oligarchy, but one
that is strongly tempered by the presence of other factors shaping the conditional decision calculus.https://resolver.caltech.edu/CaltechAUTHORS:20170907-144204258Choosing Among Public Interest Groups: Membership, Activism, and Retention in Political Organizations
https://resolver.caltech.edu/CaltechAUTHORS:20170906-160437459
DOI: 10.7907/wntkc-pm808
Contemporary scholars who have explored why citizens join organizations have employed
assumptions that are untenable for understanding other member choices. An analysis of data on four
contrasting public interest groups demonstrates that it is possible to develop a general perspective for
explaining member decision-making in organizations. Decisions about which association to join,
whether or not to stay, and whether to be an activist or to remain in the rank and file can all be
understood as reflections of a process in which imperfectly informed citizens join a group, learn
more about it, and subsequently make more knowledgeable choices. The experiential search
perspective provides a coherent explanation for a host of interrelated citizen decisions.https://resolver.caltech.edu/CaltechAUTHORS:20170906-160437459Theories and Tests of Blind Bidding in Sealed-bid Auctions
https://resolver.caltech.edu/CaltechAUTHORS:20170906-154413142
DOI: 10.7907/x766n-4cd70
[No abstract]https://resolver.caltech.edu/CaltechAUTHORS:20170906-154413142Moral Hazard, Financial Constraints and Sharecropping in El Oulja
https://resolver.caltech.edu/CaltechAUTHORS:20170907-140631840
DOI: 10.7907/4n008-pd755
This paper develops a theory of sharecropping which emphasizes the dual role of moral hazard in the provision of effort and financial constraints. The model is compatible with a large variety of contracts as observed in the region of El Oulja in Tunisia.
Using an original data set including financial data, various tests of the theory are carried out. The role of financial constraints in explaining which type of contract is selected (as well as the implication that financial constraints affect effort and therefore output) is strongly supported by the data.https://resolver.caltech.edu/CaltechAUTHORS:20170907-140631840Elections, Coalitions, and Legislative Outcomes
https://resolver.caltech.edu/CaltechAUTHORS:20170908-170013615
DOI: 10.7907/rp24m-ekx15
This paper develops a multi-stage game-theoretic model of three-party competition under proportional representation. The final policy outcome of the game is generated by a non-cooperative bargaining game between the parties in the elected legislature. This game is essentially defined by the vote shares each party receives in the general election, and the parties' electoral policy positions. At the electoral stage parties and voters are strategic in that they take account of the legislative implications of any electoral outcome. We solve for equilibrium electoral positions by the parties and final policy outcomes.https://resolver.caltech.edu/CaltechAUTHORS:20170908-170013615Madison's Theory of Representation
https://resolver.caltech.edu/CaltechAUTHORS:20170908-160651624
DOI: 10.7907/sa2vm-gqj50
There is renewed interest among political scientists in institutional design issues. Madison was a careful student of political institutions, and his ideas on electoral issues are of interest to contemporary scholars. We consider first the contrast between Madisonian and Public Choice approaches, and then some of Madison's theories about specific problems of electoral institutional design. Finally, we relate Madisonian concepts to some of the present controversies about the terms of office for elected officials, the Voting Rights Act and apportionment.https://resolver.caltech.edu/CaltechAUTHORS:20170908-160651624Inference in Econometric Models with Structural Change
https://resolver.caltech.edu/CaltechAUTHORS:20170911-143306646
DOI: 10.7907/ej3f4-ywh67
This paper extends the classical Chow (1960) test for structural change in linear regression models to a wide variety of nonlinear models, estimated by a variety of different procedures. Wald, Lagrange multiplier-like, and likelihood ratio-like test statistics are introduced. The results allow for heterogeneity and temporal dependence of the observations.
In the process of developing the above tests, the paper also provides a compact presentation of general unifying results for estimation and testing in nonlinear parametric econometric models.https://resolver.caltech.edu/CaltechAUTHORS:20170911-143306646Collective Decision-Making and Standing Committees: An Informational Rationale for Restrictive Amendment Procedures
https://resolver.caltech.edu/CaltechAUTHORS:20170911-151433490
DOI: 10.7907/mzs93-2xa36
[No abstract]https://resolver.caltech.edu/CaltechAUTHORS:20170911-151433490Organizational Maintenance and the Retention Decision: A Theory of Experiential Search
https://resolver.caltech.edu/CaltechAUTHORS:20170908-145218330
DOI: 10.7907/dg6yw-gck30
Understanding why members leave or remain in groups has received little attention despite
its fundamental importance for organizational maintenance. In this analysis, a theory of experiential
search is proposed and applied to Common Cause. Group participation is conceptualized as a
process by which imperfectly informed decision-makers learn about the organizations they join.
This framework makes quitting understandable and provides a link between the initial membership
choice and follow-up decisions.https://resolver.caltech.edu/CaltechAUTHORS:20170908-145218330Sincere Voting in Models of Legislative Elections
https://resolver.caltech.edu/CaltechAUTHORS:20170911-142742977
DOI: 10.7907/ft6pf-eeh44
An assumption of sincere voting for one's most preferred candidate is frequently invoked in models of electoral competition in which the elected legislature consists of more than a single candidate or party. Voters, however, have preferences over policy outcomes--which are determined by the ex post elected legislature--and not over candidates per se. This observation provokes the following question. For what methods of translating election results into legislative policy outcomes is sincere voting rational in the legislative election? This paper provides the answer. One of the principal implications is that for sincerity to be rational, there necessarily exists a candidate for office whose electoral platform is the final legislative outcome, whether or not that candidate is elected to the legislature.https://resolver.caltech.edu/CaltechAUTHORS:20170911-142742977The Natural Rate in a Share Economy
https://resolver.caltech.edu/CaltechAUTHORS:20170911-153623191
DOI: 10.7907/xe8y2-36r91
Will the natural rate of unemployment be lower in the share economy described by Martin Weitzman than in a wage economy? We examine this question for a search economy with an equilibrium unemployment rate, a version of Salop's (1979) quits model.
Equilibrium unemployment is the same in both economies. We also examine firms' short-run adjustment to shocks. Share-economy firms adjust output less than wage-economy firms for both demand shocks and labor-supply shocks. Depending on whether rapid output adjustment is stabilizing, a share economy may be more or less stable than a wage economy.https://resolver.caltech.edu/CaltechAUTHORS:20170911-153623191Nash Implementation Using Undominated Strategies
https://resolver.caltech.edu/CaltechAUTHORS:20170908-151520400
DOI: 10.7907/3fyfp-rks37
This paper provides a characterization of fully implementable outcomes using
undominated Nash equilibrium, i.e. a Nash equilibrium in which no one uses a weakly
dominated strategy. The analysis is conducted in general domains in which agents
have complete information. Our main result is that with at least three agents any
social choice function or correspondence obeying the usual no veto power condition
is implementable unless some players are completely indifferent over all possible
outcomes. This result is contrasted with the more restrictive implementation
findings with either (unrefined) Nash equilibrium or subgame perfect equilibrium.https://resolver.caltech.edu/CaltechAUTHORS:20170908-151520400Inflation and Expectations in Experimental Markets
https://resolver.caltech.edu/CaltechAUTHORS:20170911-150800321
DOI: 10.7907/yb2y7-krm33
A total of nine experimental markets were studied. Seven of these involved eleven to twelve periods of inflation at a constant percentage and then two or three periods of no inflation. Two experiments involved no inflation for twelve periods and then inflation at a constant rate for three periods. In all but three markets, participants were asked to guess the mean price of the upcoming market period before they had any information about the parameters for that period. The subject with the best guess was given a financial reward in addition to any profit earned in the market.
Convergence properties are compared. Rational expectations models are tested and the structure of forecasts is studied. In general the rational expectations models capture much of what is observed, but paradoxes exist in the data and in the application of the models.https://resolver.caltech.edu/CaltechAUTHORS:20170911-150800321Expert Witnesses and Intent
https://resolver.caltech.edu/CaltechAUTHORS:20170911-145616658
DOI: 10.7907/wpg3c-c1e71
How should a historian or a judicial scholar try to determine the intent of defendants in a racial or sex discrimination lawsuit, or the framers of a law or constitutional provision? What can we learn by examining paradigm cases from the employment and voting rights areas, and the classic case of the intentions of the framers of the Fourteenth Amendment?
Making use of models drawn from statistics and from rational choice theory, I examine the general contours of the Sears sex discrimination case, a voting rights suit from Selma, Alabama, and Raoul Berger's attempt to nullify Fourteenth Amendment jurisprudence in his Government by Judiciary. Sears' and Berger's methods and evidentiary conventions are shown to lead to biased results.https://resolver.caltech.edu/CaltechAUTHORS:20170911-145616658Preparing for the Improbable: Safety Incentives and the Price-Anderson Act
https://resolver.caltech.edu/CaltechAUTHORS:20170911-133209283
DOI: 10.7907/74ekw-6ph89
The Price-Anderson Act requires commercial nuclear power plants to maintain (approximately) $660 million in off-site accident coverage through two forms of insurance: market-provided private insurance and self-insurance in the form of retrospective assessments of reactor owners. We examine how changes in retrospective assessments influence the safety incentives of nuclear reactor owners. As one would expect, increases in self-insurance premiums increase the incentive to install safety systems more quickly. However, a more important conclusion is that self-insurance premiums as a function of reactor riskiness, rather than equal payments by reactor owners, yield a higher level of safety than under the current law.https://resolver.caltech.edu/CaltechAUTHORS:20170911-133209283Laws of Large Numbers for Dependent Non-Identically Distributed Random Variables
https://resolver.caltech.edu/CaltechAUTHORS:20170908-163911965
DOI: 10.7907/3gv6t-s2w13
This paper provides L1 and weak laws of large numbers for uniformly integrable
L1-mixingales. The L1-mixingale condition is a condition of asymptotic weak temporal dependence
that is weaker than most conditions considered in the literature. Processes covered by the laws of
large numbers include martingale difference, φ(·), ρ(·) and α(·) mixing, autoregressive moving
average, infinite order moving average, near epoch dependent, L1-near epoch dependent, and
mixingale sequences and triangular arrays. The random variables need not possess more than one
moment finite and the L1-mixingale numbers need not decay to zero at any particular rate. The proof
of the results is remarkably simple and completely self-contained.https://resolver.caltech.edu/CaltechAUTHORS:20170908-163911965Sophisticated Sincerity: Voting Over Endogenous Agendas
https://resolver.caltech.edu/CaltechAUTHORS:20170911-154858256
DOI: 10.7907/rxbfr-6xp42
The empirical findings on whether or not legislators vote strategically are mixed. This is at least partly due to the fact that to establish any hypothesis on strategic voting, legislators' preferences need to be known; and these are typically private data. In this note it is shown that, under complete information, if decision-making is by the amendment procedure and if the agenda is set endogenously, then sophisticated (strategic) voting over the resulting agenda is observationally equivalent to sincere voting. The voting strategies, however, are sophisticated. This fact has direct implications for empirical work on sophisticated voting.https://resolver.caltech.edu/CaltechAUTHORS:20170911-154858256Regulation and the Theory of Legislative Choice: The Interstate Commerce Act of 1887
https://resolver.caltech.edu/CaltechAUTHORS:20170911-162446178
DOI: 10.7907/qcbqw-vy854
This article concerns the economic incidence of the Interstate Commerce Act of 1887 (ICA). Our focus is the short-haul pricing constraint, a provision of the ICA that prohibited railroads from charging higher rates to isolated, primarily agrarian shippers than they charged to intercity shippers of similar commodities. Utilizing the event study methodology, we find that the impending passage of the ICA generated a distribution of abnormal returns to railroads and shipping firms that is consistent with the theoretical implications of our analysis of the short-haul pricing constraint (SHPC). However, early interpretations of the SHPC by the Interstate Commerce Commission reduced some of the abnormal returns to railroads in a manner that is inconsistent with the hypothesis that the short-haul pricing constraint was an important mechanism of early railroad regulation. The analysis does support a multiple-interest interpretation of the Interstate Commerce Act and has implications for the positive theory of regulation.https://resolver.caltech.edu/CaltechAUTHORS:20170911-162446178Political Participation of Ethnic Minorities in the 1980s
https://resolver.caltech.edu/CaltechAUTHORS:20170908-154322007
DOI: 10.7907/txf9e-ekq40
Currently, political participation, especially voter registration and
turnout, varies substantially with ethnicity. Blacks and non-Hispanic whites
participate at roughly equal rates, while Latinos and Asian-Americans are
substantially less active. This variation may reflect cultural factors, or
it may be the spurious product of differences in the distribution of
non-ethnic determinants of participation, including socioeconomic and
demographic characteristics, variables reflecting immigration history,
including citizenship, and measures of group identification. Using data
collected in 1984 on samples of California's black, Latino, Asian-American,
and non-Hispanic white populations, we conclude that these other variables
fully account for lower Latino participation rates. Even with such controls,
however, Asian-Americans remain less likely to vote. Although non-citizens
participate less than citizens, they do engage in non-electoral activities.
Finally, we speculate on the future political impact of Latinos and
Asian-Americans, by projecting participation rates under several scenarios.https://resolver.caltech.edu/CaltechAUTHORS:20170908-154322007The Economic Incidence of the Interstate Commerce Act of 1887: A Theoretical and Empirical Analysis of the Short-haul Pricing Constraint
https://resolver.caltech.edu/CaltechAUTHORS:20170911-160156080
DOI: 10.7907/0mb4d-xwh10
The public and private interest hypotheses permeate contemporary regulatory analyses. Both theories are used to explain the inception of the first major federal regulatory agency, the Interstate Commerce Commission (ICC). According to the public and private interest hypotheses, the regulations promulgated by the ICC benefited either railroads or shippers. This paper presents an alternative view consistent with the multiple interest theory of regulation. It is demonstrated that the major regulatory instrument of the ICC, the short-haul pricing constraint (SHPC), altered the equilibria of railroad markets in a way which benefitted the class of shippers (short-haul shippers) facing monopolistic railroad markets. The SHPC also benefitted some railroads by increasing the correspondence between unregulated, cooperative and regulated, noncooperative levels of long-haul shipments. The proposition that the ICC benefited short-haul shippers and railroads is supported by an empirical analysis of the effects of the inception of federal regulation and implementation of the SHPC on stock prices. The results of the paper indicate that the public and private interest interpretations of the ICC are neither contradictory nor complete, but instead are complementary. A theoretical and empirical analysis of the chief regulatory mechanism of the ICC provides this synthesis.https://resolver.caltech.edu/CaltechAUTHORS:20170911-160156080The Effect of Tax and Audit Rates on Compliance with the Federal Income Tax, 1977-85
https://resolver.caltech.edu/CaltechAUTHORS:20170911-141344635
DOI: 10.7907/gk55y-d2k84
This paper develops a game-theoretic model of the effects of state and federal income tax rates and audit rates on compliance with the federal income tax. Using data drawn primarily from the Annual Reports of the Commissioner of Internal Revenue for the years 1977-85, we find empirical confirmation of the model's prediction that increases in the tax rates increase compliance. We also investigate the overall performance of the federal revenue collection process and find that the entire IRS estimate of the increase in individual noncompliance during 1977-85 is more than accounted for by the decrease in auditing over the same period.https://resolver.caltech.edu/CaltechAUTHORS:20170911-141344635Electing Legislatures
https://resolver.caltech.edu/CaltechAUTHORS:20170908-163132276
DOI: 10.7907/w7pch-hc458
A "legislature" is defined to be an assembly of at least two elected officials which selects final
policy outcomes. Legislative elections therefore concern the electoral choice of such an assembly.
The classical two-candidate, single-district, model of electoral competition is not a legislative election
in the sense of this essay. In the classical model the legislature comprises the winning candidate:
this agent has monopolistic control of the legislative decision-making machinery, and implements his
winning policy. With this system, voters have a straightforward "best" voting rule for any pair of
candidate positions offered in the election: vote sincerely. In the multi-stage legislative electoral
system, final outcomes depend on the entire composition of the legislature and the specifics of
legislative decision-making. With such a system, voters' decisions are considerably less
straightforward, which in turn complicates candidates' strategic choices. This paper presents a fairly
technical review of the spatial-theoretic literature on legislative elections.
The paper was commissioned by Norman Schofield for the conference on Coalition Theory and
Public Choice (Fiesole, Italy: May 1987). On the one hand, the task was easy: the literature is small
and much of it involves my own work. On the other hand, the task was difficult: the literature is
small and much of it involves my own work. In any event, I am grateful to Professor Schofield for
giving me the opportunity and incentive to raise some issues with which I have long been
concerned. He is in no way responsible for any errors or omissions the paper might contain. I feel
perfectly free, however, to blame him for the appearance of self-indulgence that the essay surely
has.https://resolver.caltech.edu/CaltechAUTHORS:20170908-163132276The Design of Mechanisms to Allocate Space Station Resources
https://resolver.caltech.edu/CaltechAUTHORS:20170908-151906378
DOI: 10.7907/68k2y-nze20
This paper demonstrates the use of applied organizational design to investigate possible
mechanisms to allocate the resources of Space Station. First, a specific laboratory experimental
environment (testbed) and baseline policy are developed using the salient technical features of the
Space Station and past Space Shuttle experiences. The use of priority contracts to assist in
contingent rescheduling of resources due to supply curtailments is established. Next, generalized
versions of an English auction and a Vickrey-Groves-type sealed-bid auction are designed and
developed to allocate scheduled resource use and priority. Finally, these mechanisms are tested and
evaluated in the testbed. The data demonstrate that expected efficiency increases significantly
when the auction mechanisms are used rather than first-come-first-served allocation processes.
However, the auction mechanisms do not produce outcomes near the 100% level of efficiency.
Several results are dedicated to the revenue generating properties of the mechanisms and individual
bidding behavior.https://resolver.caltech.edu/CaltechAUTHORS:20170908-151906378Different Preferences, Different Politics: A Demand-and-Structure Explanation
https://resolver.caltech.edu/CaltechAUTHORS:20170911-135520458
DOI: 10.7907/bxeyz-3pb57
Different types of legislative politics are explained in this paper by the distribution of legislators' demands. Demands are legislators' willingness to pay for victory on a bill, with votes on other issues, effort, or work. Different demand distributions require different institutions and "politics" for the legislators to obtain the results they want.
The types of politics can be largely identified with Lowi's typology of interest-group interaction. Distributive politics combines many individual projects, each with a small intensely favorable minority and a large, slightly opposed majority. Since no one project could pass on its own, compound bills are created that benefit all legislators (Weingast 1979).
Redistributive issues have two large intensely opposed groups. Their politics are conflict, mobilizations of one's partisans, and efforts to obtain the votes of the few indifferents (Schneider 1979).
Regulative politics have two forms. Simple regulative issues have small intense groups for and against the bill, and a vast majority of indifferents. Each side appeals to the indifferents, creating a natural arena for vote-trading. Complex regulative issues allow the distribution of demand to change as the bill proposal is modified. They often involve novel legislation, whose consequences are not clear. Those dominating the agenda control the nature of the bill to maximize their gains and assure a majority for passage (Shepsle and Weingast 1984). Vote-trading also occurs, since most legislators are indifferent.https://resolver.caltech.edu/CaltechAUTHORS:20170911-135520458Predicting Partisan Redistricting Disputes
https://resolver.caltech.edu/CaltechAUTHORS:20170911-140445445
DOI: 10.7907/a1hrw-qn813
Partisan redistricting disputes are relatively rare occurrences. This paper explores the factors that lead to partisan disputes over congressional redistricting plans. In previous work, single-party control of both houses of the state legislature and the governorship emerged as a key correlate of partisan redistricting in the 1980s. This paper presents an interactive statistical model of partisan redistricting plans. The basic conclusion is that, in addition to single-party control, the nature of the voting rule and the political competitiveness of the states affect the likelihood of partisan redistricting.https://resolver.caltech.edu/CaltechAUTHORS:20170911-140445445The Multiple Unit Double Auction
https://resolver.caltech.edu/CaltechAUTHORS:20170912-131831695
DOI: 10.7907/10wx6-yvn55
The note outlines institutional features of an open outcry market that permit multiple unit or "block" trades. The purpose of the note is to introduce the process as a tool to be used in economics experiments. The detailed rules governing the multiple unit double auction (MUDA) are stated for the case where offers are tendered by voice as opposed to through a computer. Results from markets run under these rules are reported to demonstrate that the convergence to equilibrium prices traditionally observed in experimental markets persists under the MUDA process.https://resolver.caltech.edu/CaltechAUTHORS:20170912-131831695Are we a Nation of Tax Cheaters? New Econometric Evidence on Tax Compliance
https://resolver.caltech.edu/CaltechAUTHORS:20170911-170537748
DOI: 10.7907/yfjwd-43e95
The theoretical basis for the economic approach to tax compliance has, at least until recently, been inadequate, and the limited empirical work based on it is seriously flawed. In this paper we briefly review both, as well as new theoretical and, especially, empirical work on the tax compliance problem. With respect to the latter we present preliminary results based on a state-level, time-series, cross-section data set drawn in part from the annual reports of the Commissioner of Internal Revenue.https://resolver.caltech.edu/CaltechAUTHORS:20170911-170537748Invisible-Hand Explanations Reconsidered
https://resolver.caltech.edu/CaltechAUTHORS:20170912-135330205
DOI: 10.7907/fb2d6-rdy06
Edna Ullmann-Margalit introduced the notion of an invisible hand explanation (I-H explanation) to the philosophical literature in 1978, and made a distinction between "aggregate" and "functional-evolutionary" (F-E) forms of I-H explanations. The present paper produces a substantially refined analysis of the forms and functions of I-H explanations. Sections (1) and (2) introduce the ideas of I-H and aggregate I-H explanation, respectively.
Section (3) argues that no one form of explanation can serve the explanatory functions Ullmann-Margalit attributes to aggregate explanations, and divides those explanatory functions between genetic and "systematic-dispositional" explanations. Section (4) identifies difficulties with the idea of F-E explanation in the social realm, and shows that any I-H explanations fitting the F-E mold would constitute simply a special class of "aggregate" explanation.https://resolver.caltech.edu/CaltechAUTHORS:20170912-135330205The Analysis of Committee Power: An Application to Senate Voting on the Minimum Wage
https://resolver.caltech.edu/CaltechAUTHORS:20170911-133744911
DOI: 10.7907/gtmj4-26n11
A widely noted empirical regularity of congressional behavior is that standing committees exert disproportionate influence on congressional choices. The observed phenomenon has a number of labels (committee influence, committee power, and, from the parent chamber's perspective, deference to committees), and a large body of theoretical and empirical research has sought to determine when and why it exists. This paper takes as given only the weakest form of the observation, namely, that committee power exists sometimes. It does not directly address the questions of when and why committee power exists, although much of the relevant literature is reviewed. Rather, it focuses on a prerequisite to the resolution of disputes about committee power. How can committee power be assessed empirically? Section I reviews three classes of explanations and identifies obstacles to convincing empirical tests of the accounts. Section II introduces an econometric approach for analyzing committee power. Section III applies the technique to a sequence of votes on minimum wage legislation in the Senate in 1977. Section IV extends the technique to multi-dimensional choice spaces. Section V is a discussion and summary.https://resolver.caltech.edu/CaltechAUTHORS:20170911-133744911Steam-Electric Scale Economies and Construction Lead Times
https://resolver.caltech.edu/CaltechAUTHORS:20170911-164339811
DOI: 10.7907/wfqs9-cr410
There was a widespread belief in the 1970s that the construction of coal and nuclear generating units exhibited positive economies of scale. Recent empirical literature has confirmed this belief for coal plants. But these studies have not considered the relationships among cost, plant size, and the building period. This paper derives and estimates a model in which construction cost and lead time are jointly determined. Constant returns to scale are not rejected for nuclear units. While coal units may exhibit positive returns to scale, because larger plants take longer to build, these returns are lower than previously estimated.https://resolver.caltech.edu/CaltechAUTHORS:20170911-164339811Liability Rules and Pretrial Settlement
https://resolver.caltech.edu/CaltechAUTHORS:20170913-141634119
DOI: 10.7907/ghjz6-1sf90
The effect of different liability rules on the pretrial behavior of litigants to a civil suit is analyzed. The interaction is modeled as a game of incomplete information, where both the plaintiff and the defendant know whether or not they were negligent in actions leading to the accident. Selection criteria are used to refine the set of sequential equilibria of the game.https://resolver.caltech.edu/CaltechAUTHORS:20170913-141634119A Rationale for Restrictive Rules
https://resolver.caltech.edu/CaltechAUTHORS:20170914-153400400
DOI: 10.7907/t5g8n-t5w69
Congressmen often claim to dislike restrictions on their opportunities to offer amendments to legislation in the Committee of the Whole. Yet restrictive rules of various forms not only are quite common but often are voted into existence explicitly or implicitly. Whenever a modified closed rule from the Rules Committee receives a majority vote, members explicitly accept the restrictions that such rules place on amendments. Whenever a bill is passed under suspension of the rules, the requisite two-thirds vote is an implicit acceptance of restrictions, because the vote has the effect of not only passing the legislation, but passing it unamended. The frequency with which such procedures are used in the House of Representatives suggests that restrictions on the ability to amend are not abhorred after all. Thus the question: why do members of a democratic and decentralized legislature tolerate, indeed choose, restrictive rules?
This paper addresses the question with a simple theoretical model based on a large class of empirical situations. The central argument is that restrictive rules are effective institutional devices for congressmen to initiate and maintain Pareto-optimal outcomes in areas of policy where, in the absence of such rules, initiation and maintenance of policies would be difficult.https://resolver.caltech.edu/CaltechAUTHORS:20170914-153400400Generalized Inverses and Asymptotic Properties of Wald Tests
https://resolver.caltech.edu/CaltechAUTHORS:20170913-143641248
DOI: 10.7907/ctkf9-4ew54
We consider Wald tests based on consistent estimators of g-inverses of the asymptotic covariance matrix Σ of a statistic that is n^{1/2}-asymptotically normally distributed under the null hypothesis. Under the null hypothesis and under any sequence of local alternatives in the column space of Σ, these tests are asymptotically equivalent for any choice of g-inverses. For sequences of local alternatives not in the column space of Σ and for a suitable choice of g-inverse, the asymptotic power of the corresponding Wald test can be made equal to zero or arbitrarily large. In particular, the test based on a consistent estimator of the Moore-Penrose inverse of Σ has zero asymptotic power against sequences of local alternatives in the orthogonal complement to the column space of Σ.https://resolver.caltech.edu/CaltechAUTHORS:20170913-143641248Slave Breeding
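The zero-power result for the Moore-Penrose Wald test summarized above can be seen in a small numerical sketch (my own construction, not from the paper): with a rank-deficient Σ, the statistic built on the Moore-Penrose inverse is blind to any deviation lying in the orthogonal complement of the column space of Σ.

```python
import numpy as np

# Rank-deficient asymptotic covariance: column space is span{e1}.
Sigma = np.diag([1.0, 0.0])
Sigma_pinv = np.linalg.pinv(Sigma)  # Moore-Penrose g-inverse of Sigma

def wald(theta_hat, n=100):
    # Wald statistic n * theta' Sigma^+ theta
    return n * theta_hat @ Sigma_pinv @ theta_hat

# Deviation inside the column space of Sigma: the statistic grows with the shift.
print(wald(np.array([0.5, 0.0])))   # positive

# Deviation in the orthogonal complement of col(Sigma): the Moore-Penrose
# Wald statistic is identically zero, so the test has zero power there.
print(wald(np.array([0.0, 5.0])))   # 0.0
```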
https://resolver.caltech.edu/CaltechAUTHORS:20170914-141634237
DOI: 10.7907/2v7yj-z4998
This paper reviews the historical work on slave breeding in the ante-bellum United States. Slave breeding consisted of interference in the sexual life of slaves by their owners with the intent and result of increasing the number of slave children born. The weight of evidence suggests that slave breeding occurred in sufficient force to raise the rate of growth of the American slave population despite evidence that only a minority of slave-owners engaged in such practices.https://resolver.caltech.edu/CaltechAUTHORS:20170914-141634237Rational Choice in Experimental Markets
https://resolver.caltech.edu/CaltechAUTHORS:20170913-163001013
DOI: 10.7907/hhzr1-8rs56
The theory of rational behavior has several different uses. (i) It is used at the most fundamental level of experimental methodology to induce preferences used as parameters in models. (ii) It appears repeatedly in experimentally successful mathematical models of complex phenomena such as speculation, bidding and signaling. (iii) It is used as a tool to generate ex-post models of results that are otherwise inexplicable. (iv) It has been used as a tool to successfully design new institutions to solve specific problems. When tested directly the theory can be rejected. It is retained because neither an alternative theory nor an alternative general principle accomplishes so much.https://resolver.caltech.edu/CaltechAUTHORS:20170913-163001013Endogenous Agenda Formation in Three-Person Committees
https://resolver.caltech.edu/CaltechAUTHORS:20170913-153417196
DOI: 10.7907/vdv5r-3s803
This paper analyzes a 3-person voting game in which two or three players have the ability to choose alternatives to be considered. Once the set of possible alternatives and the structure of the voting procedure are known, the players can solve for the outcome. Thus, the actual choice over outcomes takes place in the choice of alternatives to be voted on, i.e., the agenda. An equilibrium to this agenda-formation game is shown to exist under different assumptions about the information relative to the order of the players in the voting game. Further, this equilibrium is computed and found to possess certain features which are attractive from a normative point of view.https://resolver.caltech.edu/CaltechAUTHORS:20170913-153417196Laboratory Experiments in Economics: The Implications of Posted-Price Institutions
https://resolver.caltech.edu/CaltechAUTHORS:20170913-164929717
DOI: 10.7907/bgpp5-g9665
A laboratory experimental methodology has been developing in economics in recent years. The nature of the methodology is to integrate clearly motivated but largely subjectively determined human decisions with the organizational features of markets. The article summarizes the nature of the incentive system and how market organization can be used as an independent variable. Initial basic research results that involved the assessment of the effects of posted-price organization demonstrated that the effect of the institution is to raise prices and lower market efficiency. The existence of such effects and the close similarity between the laboratory posted-price institution and the rate-posting institutions required by the government in several industries have led to a series of policy-related experiments. The results have also led to more basic research efforts on seemingly unrelated topics.https://resolver.caltech.edu/CaltechAUTHORS:20170913-164929717A Note on Allowed and Realized Rates of Return in the Electric Utility Industry
https://resolver.caltech.edu/CaltechAUTHORS:20170913-140820995
DOI: 10.7907/hp10m-3q074
Most empirical investigations of electric utility behavior use the realized rate of return as a proxy for the allowed rate of return. We examine the validity of this assumption and investigate the relationship of the allowed and realized rates to the cost of capital between 1973 and 1982. We use two measures of the cost of capital: one based on returns to book equity, the other derived from a market price of equity. While realized and allowed rates were generally higher than the book measure throughout the period, both of the rates of return were less than the market price of capital after 1979. We also find firms did not earn their allowed rate of return after 1974. Therefore, the use of the realized rate as a proxy for the allowed rate in empirical models will lead to biased parameter estimates. To help correct this bias, we give data for allowed rates.https://resolver.caltech.edu/CaltechAUTHORS:20170913-140820995A Note on the Independence of Irrelevant Alternatives in Probabilistic Choice Models
https://resolver.caltech.edu/CaltechAUTHORS:20170915-141556593
DOI: 10.7907/s1j8v-enq27
The purpose of this note is to show that there is no necessary relationship between the independence of irrelevant alternatives (IIA) property and stochastic independence of the errors in probabilistic choice models.https://resolver.caltech.edu/CaltechAUTHORS:20170915-141556593Selecting the Best Linear Regression Model: A Classical Approach
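For reference, the IIA property at issue in the note above can be seen concretely in the multinomial logit model, where the odds between two alternatives are unaffected by removing a third. This sketch is my own illustration of the property itself, not of the note's argument about error independence.

```python
import math
import numpy as np

def logit_probs(utilities):
    # Multinomial logit choice probabilities (softmax of utilities)
    e = np.exp(np.asarray(utilities, dtype=float))
    return e / e.sum()

# Three alternatives a, b, c
p3 = logit_probs([1.0, 0.5, 0.2])
# Choice set with alternative c removed
p2 = logit_probs([1.0, 0.5])

# IIA: the odds of a over b do not depend on the presence of c;
# both ratios equal exp(1.0 - 0.5).
print(p3[0] / p3[1], p2[0] / p2[1])
```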
https://resolver.caltech.edu/CaltechAUTHORS:20170913-142645530
DOI: 10.7907/5nn18-dvb07
In this paper, we apply the model selection approach based on Likelihood Ratio (LR) tests developed in Vuong (1985) to the problem of choosing between two normal linear regression models which are not nested in each other. First we compare our model selection procedure to other model selection criteria. Then we explicitly derive the procedure when the competing linear models are non-nested and neither one is correctly specified. Some simplifications are seen to arise when both models are contained in a larger correctly specified linear regression model, or when at least one competing linear model is correctly specified. A comparison of our model selection tests and previous non-nested hypothesis tests concludes the paper.https://resolver.caltech.edu/CaltechAUTHORS:20170913-142645530The Economics of Space Station
https://resolver.caltech.edu/CaltechAUTHORS:20170912-150213282
DOI: 10.7907/65nfx-ytv91
In this paper I will examine some of the economic and management issues which must be addressed if Space Station is to effectively and efficiently pursue the myriad goals that have been chosen for it. I will characterize and evaluate in a somewhat stylized fashion three possible policies: an "engineering" approach, an "economics" approach, and a systematic custom design approach. I will use Space Station as an example to highlight some of the major economic issues facing large scale multipurpose research and development efforts, the analytical capabilities we now have to address these issues, and the (non-engineering) research that needs to be done to advance the successful long-term development of space.https://resolver.caltech.edu/CaltechAUTHORS:20170912-150213282Cartel Enforcement with Uncertainty About Costs
https://resolver.caltech.edu/CaltechAUTHORS:20170912-141652274
DOI: 10.7907/keft3-2tk29
What cartel agreements are possible when firms have private information about production costs? In order for a cartel agreement to work it must take into account the incentives for firms to misrepresent their cost information and it must provide sufficient reward so that no firm wishes to defect. For private cost uncertainty we characterize the set of cartel agreements with side payments that can be supported as Bayesian Nash equilibria. We show that if defection results in either Cournot or Bertrand competition the incentive problems in large cartels are severe enough to prevent the cartel from achieving the monopoly outcome. In contrast, with common cost uncertainty, the incentive problems become less severe in large cartels, allowing perfect collusion.https://resolver.caltech.edu/CaltechAUTHORS:20170912-141652274Econometric Modeling of a Stackelberg Game with an Application to Labor Force Participation
https://resolver.caltech.edu/CaltechAUTHORS:20170915-135937957
DOI: 10.7907/wkyqn-xhh62
Following Bjorn and Vuong (1984), a model for dummy endogenous variables is derived from a game theoretic framework where the equilibrium concept used is that of Stackelberg. A distinctive feature of our model is that it contains as a special case the usual recursive model for discrete endogenous variables [see e.g., Maddala and Lee (1976)]. A structural interpretation of this latter model can then be given in terms of a Stackelberg game in which the leader is indifferent to the follower's action. Finally, the model is applied to a study of husband/wife labor force participation.https://resolver.caltech.edu/CaltechAUTHORS:20170915-135937957The Returns to Insulation Upgrades: Results from a Mixed Engineering Econometric Model
https://resolver.caltech.edu/CaltechAUTHORS:20170914-162617345
DOI: 10.7907/372a6-nhk98
This paper estimates a new model of residential electricity demand. It differs from previous work in two ways. First, we utilize individual monthly billing data in a pooled time-series/cross-section framework. Second, we use an engineering/thermal load technique to model the household space-heating technology. This allows more precise separation of the effects of economic variables from those of weather, and permits simulation of the effects of various conservation policies.
We estimate the model using data from the Pacific Northwest, and use the results to analyze three conservation measures: a price increase, a reduction in thermostat settings, and an improvement of insulation levels. We find average rates of return for insulation upgrades of 4.9 percent for ceilings and 8.3 percent for walls.https://resolver.caltech.edu/CaltechAUTHORS:20170914-162617345A Theory of Auditing and Plunder
https://resolver.caltech.edu/CaltechAUTHORS:20170915-144256949
DOI: 10.7907/hq4vr-2ev86
Taxpayers know their income but the IRS does not. The IRS can audit taxpayers to discover their true income, but auditing is costly. We characterize optimal policies for the IRS when it is free to choose tax levies, audit probabilities and penalties. The main results are that optimal policies involve taxes which are monotonically increasing in reported incomes and audit probabilities are monotonically decreasing in reported income. In general optimal schemes involve stochastic auditing of reports and rebates for telling the truth. A theory of optimal plundering is described.https://resolver.caltech.edu/CaltechAUTHORS:20170915-144256949Comparing Models of Electric Utility Behavior
https://resolver.caltech.edu/CaltechAUTHORS:20170914-150200908
DOI: 10.7907/qwvge-2nt80
Several models of electric utility behavior have been suggested and tested. Among them are profit and revenue maximization and cost and revenue minimization, the latter being the stated objective of many public utilities. These four models are compared empirically by examining power plant choice from 1970 to 1977. The net present value (profit) model yields the highest estimated likelihood and its parameters are consistent with a priori theory. Firms were attempting to maximize their return, while minimizing fixed and variable costs. Also, I find no evidence that the difference between the allowed rate of return and the cost of capital influenced technology choice.https://resolver.caltech.edu/CaltechAUTHORS:20170914-150200908Theories of Price Formation and Exchange in Double Oral Auctions
https://resolver.caltech.edu/CaltechAUTHORS:20170913-133503410
DOI: 10.7907/hwega-5cv60
We provide a theory to explain the data generated by Double Oral Auctions. The primary conclusion suggested by Double Oral Auction experiments is that the quantities exchanged and the prices at which transactions take place converge to, or near to, the values predicted by the competitive equilibrium model. Our theory predicts convergence to the competitive equilibrium and provides an explanation of disequilibrium behavior. The predictions of our theory fit the data better than do the predictions of Walrasian, Marshallian or game theoretic models.https://resolver.caltech.edu/CaltechAUTHORS:20170913-133503410The Economics of Tax Compliance: Fact and Fantasy
https://resolver.caltech.edu/CaltechAUTHORS:20170915-143627931
DOI: 10.7907/ad6ms-wqh65
This paper reviews the current state of theoretical and empirical knowledge regarding compliance with the federal income tax laws. We focus on the validity of certain myths that have come to dominate tax compliance discussions. Toward that end, we discuss three general categories--empirical work, theoretical methodology and fiscal policy recommendations--that seem to require more careful assessment and formulation.https://resolver.caltech.edu/CaltechAUTHORS:20170915-143627931An Empirical Analysis of Federal Income Tax Auditing and Compliance
https://resolver.caltech.edu/CaltechAUTHORS:20170912-150755319
DOI: 10.7907/fbxdh-p1n34
This paper provides empirical evidence on the relationship between compliance with the Federal Income Tax and auditing by the Internal Revenue Service. It combines a cross-section data set related to 1969 individual returns assembled by the IRS with data taken from the Annual Report of the Commissioner of Internal Revenue. We find strong support for an economic approach to tax compliance, but one that incorporates the IRS as a strategic actor. In particular, while audits may have a deterrent effect on noncompliance, we find that they are themselves, in the majority of cases, responsive to the pattern of compliance.https://resolver.caltech.edu/CaltechAUTHORS:20170912-150755319Equilibrium Search Models as Simultaneous Move Games
https://resolver.caltech.edu/CaltechAUTHORS:20170914-160405405
DOI: 10.7907/07g4h-06450
In many of the existing equilibrium search models, sellers as a group are leaders and buyers as a group are followers to the extent that the latter are assumed to know the distribution of prices but not the price-seller correspondence before they make their information acquisition decisions. A natural way to weaken this strong version of "rational expectations" is to treat the problem as a simultaneous move game in which buyers must make their information acquisition decisions before they see the actual distribution of prices. This paper explores the implications of this modification of the existing literature in the context of Salop and Stiglitz's well-known model of monopolistically competitive price dispersion (1977) and the model of equilibrium comparison shopping due to Wilde and Schwartz (1979). It considers both finitely many consumers and arbitrarily large numbers of consumers in both cases, and characterizes necessary and sufficient conditions for the existence of various mixed and pure strategy equilibria in each case. This yields a coherent integration of many of the known results as well as the derivation of a number of new results.https://resolver.caltech.edu/CaltechAUTHORS:20170914-160405405Man in the Public Sector
https://resolver.caltech.edu/CaltechAUTHORS:20170912-153630936
DOI: 10.7907/9sjnj-chw06
This paper addresses the question of how to model government behavior. The central thought is that in principle the same behavioral model should apply to the behavior of individuals in the private sector as well as the public sector. The paper starts, therefore, with an outline of the contours of a general model of individual behavior. Use is thereby made of the so-called interest function approach that I developed in On the Interaction Between State and Private Sector (Amsterdam: North-Holland, 1983) and which is somewhat further elaborated in this paper. The model is subsequently applied to the behavior of the individuals that make up the government organization, bureaucrats and politicians. The potential importance of the approach is indicated by a short survey of the theoretical and empirical results obtained with it so far.https://resolver.caltech.edu/CaltechAUTHORS:20170912-153630936Parameterization and Two-Stage Conditional Maximum Likelihood Estimation
https://resolver.caltech.edu/CaltechAUTHORS:20170915-134433552
DOI: 10.7907/ccwfw-y7228
This paper considers the case where, after appropriate reparameterization, the probability density function can be factorized into a marginal density function and a conditional density function such that one of them involves fewer parameters. Then, two types of two-stage conditional maximum-likelihood estimators, 2SCMLEI and 2SCMLEII, can be considered according to whether the marginal or the conditional density has fewer parameters. Our first result indicates that, under some identification assumptions, there is a connection between the number of parameters in the marginal (or conditional) density functions under the two reparameterizations. Moreover, conditions for asymptotic equivalence and numerical equivalence between these two-stage estimators and the FIML estimator are obtained. Finally, examples are provided to illustrate our results.https://resolver.caltech.edu/CaltechAUTHORS:20170915-134433552Political Power in the International Coffee Organization: A Research Note
https://resolver.caltech.edu/CaltechAUTHORS:20170914-152422470
DOI: 10.7907/t2q0c-0rt35
In recent decades, international commodity agreements have been proposed as a way of promoting "development." Behrman, McNicol and others have analyzed them from a purely economic point of view. Fisher, Krasner, and others have adopted a more political perspective. In this article, we seek to advance the political analysis of such agreements. We do so by studying the allocation of export entitlements in the International Coffee Organization (ICO). In the ICO, as in other international organizations, political processes replace markets in the allocation of scarce resources. In the case of the ICO, allocational decisions are made by majority rule. Given the possibility of strategic behavior in such political environments, game theory should provide a useful set of tools for the analysis of such institutions. A particular interest of this article is the appropriateness of a specific solution concept--the Shapley value--to the analysis of politically contrived allocations under the ICO.https://resolver.caltech.edu/CaltechAUTHORS:20170914-152422470Congressional Roll Call Voting Strategies: Application of a New Test to Minimum Wage Legislation
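The Shapley value discussed in the ICO article above can be computed by brute force for small weighted majority games: a player's value is the fraction of orderings in which that player is pivotal. The weights and quota below are a toy example of my own, not ICO data.

```python
from itertools import permutations
from math import factorial

def shapley(weights, quota):
    # Shapley-Shubik index for a weighted majority game:
    # count, for each player, the orderings in which it tips the
    # running coalition total over the quota.
    n = len(weights)
    counts = [0] * n
    for order in permutations(range(n)):
        total = 0
        for player in order:
            total += weights[player]
            if total >= quota:       # this player is pivotal
                counts[player] += 1
                break
    return [c / factorial(n) for c in counts]

# Toy example: members with 50, 30, and 20 votes under a quota of 67
print(shapley([50, 30, 20], 67))  # [2/3, 1/6, 1/6]
```

Note how the index diverges from raw vote shares: the 50-vote member is pivotal in four of the six orderings, while the 30- and 20-vote members are interchangeable despite unequal weights.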
https://resolver.caltech.edu/CaltechAUTHORS:20170914-155520482
DOI: 10.7907/b4t6s-vdz16
[No abstract]https://resolver.caltech.edu/CaltechAUTHORS:20170914-155520482On Bayesian Implementable Allocations
https://resolver.caltech.edu/CaltechAUTHORS:20170912-132538243
DOI: 10.7907/0fhs2-c5n57
This paper identifies several social choice correspondences which are and are not fully implementable in economic environments when agents are incompletely informed about the environment. We show that in contrast to results in the case of complete information, neither efficient allocations nor core allocations define implementable social choice correspondences. We also identify conditions under which the Rational Expectations Equilibrium correspondence is implementable. We extend the concepts of fair allocations and Lindahl allocations to economies with incomplete information, and show that envy-free allocations and Lindahl allocations are implementable under some conditions while fair allocations are not.https://resolver.caltech.edu/CaltechAUTHORS:20170912-132538243Likelihood Ratio Tests for Model Selection and Non-Nested Hypotheses
https://resolver.caltech.edu/CaltechAUTHORS:20170913-150620147
DOI: 10.7907/gx67y-we971
In this paper, we propose a classical approach to model selection. Using the Kullback-Leibler Information measure, we propose simple and directional likelihood-ratio tests for discriminating and choosing between two competing models whether the models are nonnested, overlapping or nested and whether both, one, or neither is misspecified. As a prerequisite, we fully characterize the asymptotic distribution of the likelihood ratio statistic under the most general conditions.https://resolver.caltech.edu/CaltechAUTHORS:20170913-150620147Evidence of Block Switching in Demand Subject to Declining Block Rates - A New Approach
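The directional likelihood-ratio idea summarized above can be sketched numerically for two non-nested linear regressions. The following toy illustration (my own construction on simulated data, not the paper's procedure) forms a Vuong-style statistic from the scaled mean of pointwise log-likelihood differences.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + rng.normal(size=n)   # x1 is the relevant regressor

def ols_loglik(y, x):
    # Per-observation Gaussian log-likelihoods at the OLS/ML estimates
    X = np.column_stack([np.ones_like(y), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / len(y)        # ML variance estimate
    return -0.5 * (np.log(2 * np.pi * sigma2) + resid**2 / sigma2)

# Directional statistic: scaled mean of log-likelihood differences,
# normalized by their sample standard deviation.
d = ols_loglik(y, x1) - ols_loglik(y, x2)
z = np.sqrt(n) * d.mean() / d.std(ddof=1)
print(z)   # large and positive: the model using x1 is favored
```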
https://resolver.caltech.edu/CaltechAUTHORS:20170914-161640436
DOI: 10.7907/9319h-2fz70
This paper considers the problem of forecasting demand subject to a non-linear rate schedule. We develop an empirical model of electricity demand subject to a quantity determined rate schedule and suggest a new procedure to estimate population taste variation. Using micro-level data from the 1975 Washington Center for Metropolitan Studies (WCMS) survey, we provide evidence on the prevalence and extent of block switching.https://resolver.caltech.edu/CaltechAUTHORS:20170914-161640436Reconciliation and the Size of the Budget
https://resolver.caltech.edu/CaltechAUTHORS:20170915-145457916
DOI: 10.7907/jea61-qj648
Reconciliation has become a regular feature of the congressional budget process. We address the question of whether or under what conditions the budget process with reconciliation (modeled as selection of the size of the budget first and its division second) produces smaller budgets than a piecemeal appropriations process in which the size of the budget is determined residually. The theoretical result is that reconciliation sometimes results in relatively large budgets. A testable implication of the theory is that given a choice of how stringently reconciliation is to be employed, congressmen will jointly consider preferences and the expected outcomes under the available institutional arrangements and select the arrangement (usually a rule) that yields the most favorable outcome. Empirical results from the budget process in the House from 1980-83 are generally supportive of the hypothesis of rational choice of institutional arrangements which is derived from the theory.https://resolver.caltech.edu/CaltechAUTHORS:20170915-145457916Market Failure
https://resolver.caltech.edu/CaltechAUTHORS:20170912-134741993
DOI: 10.7907/rjxey-cy967
Market failure is described and discussed. A summary of the current state of understanding is provided. Key words are: market failure, public good, externalities, rational expectations, information, monopoly, and competitive equilibrium.https://resolver.caltech.edu/CaltechAUTHORS:20170912-134741993Pivot Mechanisms in Probability Revelation
https://resolver.caltech.edu/CaltechAUTHORS:20170913-170459300
DOI: 10.7907/3eqpm-6km75
The Groves mechanism and k^th price auctions are well-known examples of pivot mechanisms. In this paper an analogous pivot mechanism is defined for probability revelation and then the Bayesian equilibria are characterized for the three pivot mechanisms. The main result is that in Bayesian games with these pivot mechanisms, equilibria must satisfy a simple fixed point condition. The result does not require signal ordering properties and thus generalizes and simplifies results by Milgrom and others. When the fixed point is unique there is "no regret." The result also holds for games less structured than Bayesian games (where the common knowledge and consistency assumptions are relaxed).
The pivot mechanism in probability revelation is shown to generalize and characterize proper scoring rules. The characterization yields an optimization of research incentives for proper scoring rules and suggests that under some conditions the new mechanisms, which are pivot mechanisms but not proper scoring rules, outperform proper scoring rules.https://resolver.caltech.edu/CaltechAUTHORS:20170913-170459300Game Forms for Nash Implementation of General Social Choice Correspondences
https://resolver.caltech.edu/CaltechAUTHORS:20170915-133024179
DOI: 10.7907/a7ntz-bnj28
Several game forms are given for implementing general social choice correspondences (SCCs) that satisfy Maskin's conditions of monotonicity and No Veto Power. The game forms have smaller strategy spaces than those used in previously discovered mechanisms: the strategy for an individual consists of an alternative, two subsets (of alternatives), and a player number. For certain types of economic and political SCCs, including α-majority rule and the Walrasian and Lindahl correspondences, the strategy space reduces to an alternative and a vector, where the number of components of the vector is at most twice the dimension of the alternative space.https://resolver.caltech.edu/CaltechAUTHORS:20170915-133024179Chapter I: The Supremacy of Equal Rights
https://resolver.caltech.edu/CaltechAUTHORS:20170912-140126509
DOI: 10.7907/smz30-khx98
The black and white abolitionist agitation of the school integration issue in Massachusetts from 1840 to 1855 gave us the first school integration case filed in America, the first state Supreme Court decision reported on the issue, and the first state-wide law banning racial discrimination in admission to educational institutions. Who favored and who opposed school integration, and what arguments did each side make? Were the types of arguments that they offered different in different forums? Were they different from 20th century arguments? Why did the movement triumph, and why did it take so long to do so? What light does the struggle throw on views on race relations held by members of the antebellum black and white communities, on the character of the abolitionist movement, and on the development of legal doctrines about racial equality? Perhaps more generally, how should historians go about assessing the weight of different reasons that policymakers adduced for their actions, and how flawed is a legal history that confines itself to strictly legal materials? How can social scientific theory and statistical techniques be profitably applied to politico-legal history? Part of a larger project on the history of court cases and state and local provisions on racial discrimination in schools, this paper introduces many of the main themes, issues, and methods to be employed in the rest of the book.https://resolver.caltech.edu/CaltechAUTHORS:20170912-140126509An Experimental Analysis of Public Goods Provision Mechanisms with and without Unanimity
https://resolver.caltech.edu/CaltechAUTHORS:20170914-133142840
DOI: 10.7907/msckd-0m869
The paper reports on an experimental investigation of four methods of allocating public goods. The two basic processes studied are direct contribution and a public goods auction process. Both of these processes are studied with and without an additional unanimity feature. The results suggest that the auction process outperforms direct contribution. The effect of unanimity is to decrease the efficiency of both processes. Much of the paper is focused on an analysis of these results.https://resolver.caltech.edu/CaltechAUTHORS:20170914-133142840Plea Bargaining and Prosecutorial Discretion
https://resolver.caltech.edu/CaltechAUTHORS:20170912-150759711
DOI: 10.7907/2gpw5-rp238
A model of plea bargaining with asymmetric information is presented. The prosecutor's private information consists of the strength of the case, while the defendant's private information is his or her own guilt or innocence. A sequential equilibrium is computed, in which a fraction of cases are dismissed because they are too likely to involve an innocent defendant; in the remaining cases, the prosecutor's offer of a sentence in exchange for a plea of guilty signals the strength of the case. I then ask whether the prosecutor (and society) might be better off if constrained to make the same offer to all defendants, regardless of the strength of the case. It is shown that, depending upon other features of the criminal justice system and upon the preferences of society, either of these regimes may be preferred to the other. In particular, it is possible that unlimited discretion is disadvantageous for the prosecution (since it carries with it the requirement of sequential rationality).https://resolver.caltech.edu/CaltechAUTHORS:20170912-150759711Speculation or Specification? A Note on Flanigan and Zingale
https://resolver.caltech.edu/CaltechAUTHORS:20170915-142001159
DOI: 10.7907/bygxx-n4f38
Flanigan and Zingale's "Alchemists' Gold" reviewed the classic problem of inferring individual-level relationships from aggregate data, attacked the specification error approach to it, and sought to replace Goodman's ecological regression with an informal and largely untestable procedure first proposed by Shively in 1974. They illustrated the Shively strategy by comparing evidence from the national CPS/SRC survey to state-level aggregate data from the 1968 and 1972 elections.
In this note, I seek to show that their criticisms are misconceived and that ecological regression can, in many typical circumstances, lead to acceptably precise judgments about the individual relationships, much less vague than those yielded by the method that Flanigan and Zingale promoted. I spell out, apparently for the first time in the cliometric literature, the mechanisms for using data on other variables to improve the estimates of the parameters of principal interest, and employ these techniques to recalculate the way 1968 major party voters behaved in the 1972 U.S. presidential election. The multivariate ecological regression figures come pleasingly close to reproducing the "true" percentages of loyalists and defectors drawn from the survey. Making use of all the available data and of well-developed and powerful statistical tests, ecological regression is far preferable to the Shively tactic.https://resolver.caltech.edu/CaltechAUTHORS:20170915-142001159The Tax Compliance Game: Toward an Interactive Theory of Law Enforcement
https://resolver.caltech.edu/CaltechAUTHORS:20170914-143759958
DOI: 10.7907/fnn49-d6737
The existing paradigm for the economic analysis of tax compliance provides an inadequate theory of the revenue collection process. Even as a purely economic model, its exclusive focus on individual taxpayers' decision-making promotes an unduly restrictive vision of the compliance problem and potential responses to it. In this paper we outline a more comprehensive theoretical basis for analyzing tax compliance, and illustrate it with a simple model. We believe our approach to be a significant improvement in the economic theory of law enforcement because it views the noncompliance problem as an interactive system. In our theoretical construct, individual decision-making not only depends upon and responds to the detection and punishment structure, but, unlike prior models, we also explicitly include the law enforcement agency--in this case the Internal Revenue Service--as an important interactive element. Initially we outline our general approach and its differences from the existing economic law enforcement paradigm. We then detail a simple model and its results and compare these results both to the prior literature and to some of our ongoing research in an effort to illustrate how our theoretical construct may affect predictions. Finally, we describe potential extensions of the model, examine its robustness with respect to various underlying assumptions and offer suggestions for further research, including possible applications to other law enforcement contexts.https://resolver.caltech.edu/CaltechAUTHORS:20170914-143759958Did Rising Out-Migration Cause Fertility to Decline in Antebellum New England? A Life-Cycle Perspective on Old-Age Security Motives, Child Default, and Farm-Family Fertility
https://resolver.caltech.edu/CaltechAUTHORS:20170913-134919588
DOI: 10.7907/6sgw5-8et09
A model of fertility based on the life-cycle model of intertemporal optimization is presented in which fertility, children's schooling, saving, and bequest planning are simultaneously determined. The paper hypothesizes that sometime shortly after the beginning of the nineteenth century, Americans began to adopt this life-cycle strategy and abandon the older, traditional family-based system of providing for old age. As a consequence, the overall fertility rate began to fall. The change in attitudes was, it is argued, triggered by the high incidence of 'child default' as young adults left the seaboard states for land in the west. The change to life-cycle strategies was gradual and proceeded at different rates in different parts of the country. This differential timing of the 'life-cycle transition' allows empirical tests to be based on cross-sections of state data drawn from the 1840 U.S. Census. The model is shown to predict well. An alternative hypothesis, Richard Easterlin's 'target-bequest model,' is rejected by these tests.https://resolver.caltech.edu/CaltechAUTHORS:20170913-134919588Agendas, Strategic Voting, and Signaling with Incomplete Information
https://resolver.caltech.edu/CaltechAUTHORS:20170912-143216300
DOI: 10.7907/2h7ha-fxg91
The literature on agendas with sincere and strategic voting represents an important contribution to our understanding of committees, of institutions, and of the opportunities to manipulate outcomes by the manipulation of institutions. That literature, though, imposes an assumption that may be unrealistic in many situations; namely, that everyone knows the preferences of everyone else. In this essay we apply Bayesian equilibrium analysis to show that the properties of agendas that others derive assuming complete information do not necessarily hold under incomplete information. First, a Condorcet winner need not be selected, even if nearly everyone on the committee most prefers it. Second, the "2 step theorem," that any outcome reachable in n voting stages via some amendment agenda is reachable in two stages under sophisticated voting, need not hold. Third, nonbinding votes, such as straw polls, can critically affect final outcomes.https://resolver.caltech.edu/CaltechAUTHORS:20170912-143216300Must Historians Regress? An Answer to Lee Benson
https://resolver.caltech.edu/CaltechAUTHORS:20170914-165402310
DOI: 10.7907/g1wms-6dh93
In a recent symposium published in Historical Methods, Lee Benson, once a forceful proponent of more systematic methods and models in history, announced that it had all been a mistake. Applied to human affairs, Occam's Razor merely results in useless carnage. Humans are too complex and contradictory to fit into any simple theories, much less laws, and any attempt to formulate or test such statements merely misleads. To redeem social science and save the world, we should abandon the attempt to postulate abstract, value-free, and often mathematicized hypotheses and to validate them through the use of sophisticated statistical techniques. Instead, we should return to a Marxism cleansed of the notion of class conflict, and to a combination of easy arithmetic methods that everyone can understand and careful analysis of qualitative data.
In his 1961 Concept of Jacksonian Democracy, Benson now claims, he was wrong to adopt the idea of voting cycles from economics, because the polity is more complicated than the economy, and wrong to pretend that he had arrived at his conclusions on the basis of "hard" data, when he had used a mixture of qualitative and quantitative information. He attempts to demonstrate that historians have been wrong to adopt regression analysis to relate aggregate voting to socioeconomic indices by commenting on one table from my Shaping of Southern Politics.
None of Benson's contentions except his mea culpa will stand up to rigorous examination. His conception of science is distorted, the economy is more complex than he imagines, and an extensive analysis of his scattershot critique of my table and of regression analysis reveals that he is wrong in every particular. His charges of scholarly irresponsibility collapse when set against his own practice. Indeed, it seems possible that he found no evidence of a class basis for political divisions in Concept because he used a poor measure of it and performed no systematic multivariate tests.
Benson's general anti-scientific stance would force historians and social scientists to abjure the use of powerful methods and theories, would make generalizations and the replication of results impossible, and would cause history needlessly to regress to a pre-scientific stage. The fallacy has not been in our mistransference of scientific modes of thought and analysis, but in his own misunderstanding, misconceptions, and mistaken arguments.https://resolver.caltech.edu/CaltechAUTHORS:20170914-165402310Rules, Subjurisdictional Choice, and Congressional Outcomes: An Event Study of Energy Taxation Legislation
https://resolver.caltech.edu/CaltechAUTHORS:20170914-140001083
DOI: 10.7907/r1mhj-x1n18
Formal models in political science are increasingly attentive to institutional features that ostensibly play a crucial part in shaping political outcomes. Propositions yielded by these models have proven difficult to test, however. This study has two aims. Its substantive objective is to extend the spatial model of legislatures to illuminate the mechanisms of influence by committees on congressional outcomes. A broader methodological purpose is to introduce to political science a new and promising technique for testing formal models. Event studies are based on the belief that many political outcomes affect the economic welfare of nongovernmental actors and that, accordingly, actors with a vested interest in public policies respond rationally to changing political expectations. The technique is illustrated by testing formally derived propositions about the effects of rules and of subjurisdictional choice (the Ways and Means Committee's decision about the dimensions of its jurisdiction in which to propose legislation) on Congress's 1974 decision regarding taxation of oil and gas firms. The strong empirical results not only support the theory but also offer promising implications for continued development and testing of formal models of politics and political economy.https://resolver.caltech.edu/CaltechAUTHORS:20170914-140001083Responsibility, Liability, and Incentive Compatibility
https://resolver.caltech.edu/CaltechAUTHORS:20170914-142446945
DOI: 10.7907/49ny5-azy38
In this paper I ask how liability should be assigned for the risks of toxic chemicals, and for risks more generally. I develop a theory of liability based on two principles. The first is responsibility as own-cost-bearing, justified on grounds of fairness. The second is efficiency, justified on grounds of welfare. Together these two principles provide a foundation for the theory of incentive compatibility, which is an important consideration in the design of liability systems.https://resolver.caltech.edu/CaltechAUTHORS:20170914-142446945Presidential Influence on Congressional Appropriations Decisions
https://resolver.caltech.edu/CaltechAUTHORS:20170913-161141071
DOI: 10.7907/swsdw-7qr45
We investigate the extent to which possession of the veto allows the president to influence congressional decisions regarding regular annual appropriations legislation. The most important implication of our analysis is that the influence the veto conveys is asymmetrical: it allows the president to restrain Congress when he prefers to appropriate less to an agency than they do; it does not provide him an effective means of extracting higher appropriations from Congress when he prefers to spend more than they do. This asymmetry derives from Constitutional limitations on the veto, the sequencing of the appropriations process provided by the Budget and Accounting Act of 1920, and the presence of a de facto reversionary expenditure level contained in continuing resolutions (Fenno, 1966). We find strong support for this proposition in a regression of presidential requests upon congressional appropriations decisions.https://resolver.caltech.edu/CaltechAUTHORS:20170913-161141071The Bankruptcy of Conventional Tax Timing Wisdom is Deeper than Semantics: A Rejoinder to Professors Kaplow and Warren
https://resolver.caltech.edu/CaltechAUTHORS:20170913-161741394
DOI: 10.7907/r087s-qzt38
The Haig-Simons ideal is an important normative concept. But using it requires that one specify a method of measuring the value of changes in wealth. I use market value and present value, the concepts of value employed in modern finance theory. Professors Kaplow and Warren disagree with a result that I show follows from those concepts of value: That the CFIT implements the Haig-Simons ideal in a non-general-equilibrium setting. But their critique is ineffective because they do not present an alternative concept of value and give reasons for using it in the definition of the Haig-Simons ideal instead of market value or present value. It is questionable whether such an alternative concept can be constructed that is also consistent with the idea of value contained in modern finance theory.
Professors Kaplow and Warren generally agree with my position that it is important to take general equilibrium effects into account in assessing alternative tax policies. But their attempt to make a general equilibrium argument for the equivalence of the CFIT and yield exemption fails. In fact, using their approach reinforces the conclusion in my original article that the equivalence holds in a non-general-equilibrium setting only for breakeven transactions.https://resolver.caltech.edu/CaltechAUTHORS:20170913-161741394Toward "Total Political History"
https://resolver.caltech.edu/CaltechAUTHORS:20170914-164531615
DOI: 10.7907/03fab-n7693
How is the segmented and disoriented world of contemporary historical scholarship, in particular, that of American political history, to be reintegrated and revived? Instead of imposing a substantive synthesis, which would narrow the discipline's focus by excluding many interesting topics, I propose that historians adopt a common approach--rational choice theory--that has proven useful in economics and political science.
Using notions drawn from rational choice and examples primarily from the American Civil War and Reconstruction period, I examine the assumptions behind and arguments for three theories in intellectual/cultural history--republicanism, "political culture," and positive/negative liberalism. I then try to spell out some of the implications of rational choice models for the study of electoral, legislative, judicial, and administrative behavior.https://resolver.caltech.edu/CaltechAUTHORS:20170914-164531615The Efficacy of Registration Drives
https://resolver.caltech.edu/CaltechAUTHORS:20170919-150907040
DOI: 10.7907/87wpy-6qt34
This paper compares the voting rates of those who are enrolled by registration drives with the rates of those who register themselves. The central question is whether the return from registration drives, in terms of the number of voters they yield, is worth the effort. In addition, the paper looks at the demographic profile of the group-registered voters to discover how they differ from self-registered individuals. The data set consists of 108,653 individuals in Los Angeles County who registered between the 1980 and 1982 elections. The results indicate that 41% of those registered by registration drives actually voted as compared to 57% of the self-registered. It also appears that the group-registered voters are younger and more frequently minority.https://resolver.caltech.edu/CaltechAUTHORS:20170919-150907040The Competitive Effects of Resale Price Maintenance
https://resolver.caltech.edu/CaltechAUTHORS:20170915-165557217
DOI: 10.7907/xny20-9tt35
The competitive effects of resale price maintenance (RPM) are theoretically diverse. RPM can cause allocation distortions or promote productive efficiencies in the distribution process. Moreover, extant cross-sectional empirical evidence is incapable of distinguishing among the potentially disparate effects of RPM. This paper conducts hypothesis tests of the alternative theories of RPM. The empirical framework relates estimates of the effects of RPM for a cross-section of observations to necessary conditions of the alternative models. This analysis indicates that RPM is used both to foster cartels and promote efficiencies in the distribution process. This result is consistent with the growing body of case study analysis that suggests that RPM is used for a variety of reasons. This result also questions the current per se illegal status of RPM in the antitrust laws. Evidence is also provided concerning the strategic interaction between manufacturers and dealers in the distribution process and the use of financial data in analyzing propositions in industrial organization.https://resolver.caltech.edu/CaltechAUTHORS:20170915-165557217Linearity of the Optimal Income Tax: A Generalization
https://resolver.caltech.edu/CaltechAUTHORS:20170919-144337163
DOI: 10.7907/a6jm0-kaz10
In an earlier paper, we examined the nature of individual and collective preferences over alternative income tax schedules in the context of a simple model in which individuals respond to high tax rates by working in an untaxed "sheltered" sector of the economy. There we established the social optimality of a linear income tax among the set of tax schedules that are continuous, nondecreasing convex functions of income. Here we relax the restrictions on tax schedules, most importantly allowing schedules to have concave (decreasing marginal tax rate) as well as convex (increasing marginal tax rate) regions. In fact, we prove that a linear income tax is socially preferred to any nonlinear lower semi-continuous tax schedule.https://resolver.caltech.edu/CaltechAUTHORS:20170919-144337163Sophistication, Myopia, and the Theory of Legislatures: An Experimental Study
https://resolver.caltech.edu/CaltechAUTHORS:20170918-150426158
DOI: 10.7907/wybar-75s67
Legislatures typically make decisions in stages: for example, first by subsets of members (in committees) and then by the full membership (on the floor). But different theories of two-stage decision-making employ different assumptions about the degree of foresight committee members exercise during the first stage. This paper reviews the relevant theories and reports on several experiments that test whether committees acting in a larger legislature make decisions consistent with the hypotheses of sophisticated or myopic behavior. Under diverse conditions--including open and closed rules, and homogeneous and heterogeneous preferences--the predictions of sophisticated behavior are superior not only to those of myopic behavior, but also to several other competing hypotheses. Implications of the findings for future theoretical developments are discussed, as are reservations regarding generalizing about real-world legislatures on the basis of laboratory observations.https://resolver.caltech.edu/CaltechAUTHORS:20170918-150426158The Charitable Contribution Deduction: A Politico-Economic Analysis
https://resolver.caltech.edu/CaltechAUTHORS:20170918-141010449
DOI: 10.7907/7sz81-1fe89
Policy analysis of the charitable contribution deduction has focused on two aspects. First, the deduction gives a larger subsidy to high-income individuals. Second, the activities subsidized are often public goods or create positive externalities. The focus on those two traits has led some economists to test the deduction using traditional cost allocation criteria for public goods such as Lindahl equilibrium. A leading paper finds that a tax credit better approximates the Lindahl criteria than a deduction. This paper shows that the opposite may be true if the taxes raised to fund the revenue loss from the deduction are even slightly progressive.
This finding suggests that the deduction may be a political bargain outcome that benefits a wide range of groups. The second part of this paper discusses qualitatively how a political bargain theory can explain the role chosen for the deduction in conjunction with other methods of subsidy and direct government provision.https://resolver.caltech.edu/CaltechAUTHORS:20170918-141010449The Taxation of Risky Investments: An Asset Pricing Approach
https://resolver.caltech.edu/CaltechAUTHORS:20170918-161518700
DOI: 10.7907/a5er4-52w73
Some economists have argued that offsetting effects on risk and return may make capital income taxes nondistorting. This paper performs three tasks. First, the conditions under which the argument is true are studied in an asset pricing model that, unlike earlier models, allows the timing of depreciation deductions to vary and incorporates the effectiveness and distribution of government expenditures. One result is that it is plausible that the nondistortion result holds regardless of that timing or of the distribution and effectiveness of expenditures if the pre-tax riskless rate is zero.
A second task concerns the cases where the pre-tax riskless rate is not negligible and the nondistortion result does not hold. Then the degree and pattern of distortion depend on the general equilibrium impact of taxes and expenditures on average risk aversion and on the pre-tax riskless rate. An interesting result emerges concerning the impact of the timing of depreciation allowances. When average risk aversion stays constant, the conventionally expected effect that faster write-offs result in more investment will occur if and only if the pre-tax riskless rate falls when timing is accelerated. This is true because in the absence of any change in the pre-tax riskless rate, changes in depreciation timing cause changes in risk and expected return that exactly offset each other.
Finally, the paper shows that the failure to add a premium for "capital risk" to the standard economic depreciation allowance based on expected decline in asset value does not change that result unless the income tax system has the pathology of allowing used asset sales to be tax free. The current U.S. tax system seems to be free of that pathology.https://resolver.caltech.edu/CaltechAUTHORS:20170918-161518700Limited Information Estimators and Exogeneity Tests for Simultaneous Probit Models
https://resolver.caltech.edu/CaltechAUTHORS:20170919-133926717
DOI: 10.7907/6e6gv-k1725
A two-step maximum likelihood procedure is proposed for estimating simultaneous probit models and is compared to alternative limited information estimators. Conditions under which these estimators attain the Cramer-Rao lower bound are stated. Simple tests of exogeneity are proposed and are shown to be asymptotically equivalent to one another and to have the same local asymptotic power as classical tests based on the limited information maximum likelihood estimator.https://resolver.caltech.edu/CaltechAUTHORS:20170919-133926717Speculation and Price Stability Under Uncertainty: A Generalization
https://resolver.caltech.edu/CaltechAUTHORS:20170919-141552441
DOI: 10.7907/yn7hk-dzm61
Since Friedman maintained that profitable speculation necessarily stabilizes prices, necessary and sufficient conditions for his conjecture to hold have been derived in subsequent ex post analyses. Within these frameworks, however, no uncertainty is involved.
In this paper we assume the nonspeculative excess demand functions are always linear but with random slopes and intercepts (i.i.d. across time). Employing dynamic programming approaches, the optimal complete speculation sequence for a monopolistic speculator (which maximizes his long-run expected profits) can be characterized. Furthermore, Friedman's conjecture holds under this sequence.
As for competitive speculation, we consider three variants arising from deviations from the monopolistic case. Of these, two models establish the property that Friedman's conjecture holds for optimal speculation sequences. However, since this conjecture might be falsified for the other model, a necessary condition is derived. Also, an example is given which shows that, if uncertainties are involved, a destabilizing optimal speculation sequence exists even with linear nonspeculative excess demand functions.https://resolver.caltech.edu/CaltechAUTHORS:20170919-141552441Structural Instability of the Core
https://resolver.caltech.edu/CaltechAUTHORS:20170908-145130668
DOI: 10.7907/e2cny-p7059
Let σ be a q-rule, under which any coalition of size q from a society of size n is decisive. Let w(n,q) = 2q - n + 1 and let W be a smooth 'policy space' of dimension w. Let U(W)^N be the space of all smooth profiles on W, endowed with the Whitney topology. It is shown that there exists an 'instability dimension' w*(σ) with 2 ≤ w*(σ) ≤ w(n,q) such that:
(i) if w ≥ w*(σ), and W has no boundary, then the core of σ is empty for a dense set of profiles in U(W)^N (i.e., almost always);
(ii) if w ≥ w*(σ)+1, and W has a boundary, then the core of σ is empty, almost always;
(iii) if w ≥ w*(σ)+1, then the cycle set is dense in W, almost always;
(iv) if w ≥ w*(σ)+2, then the cycle set is also path-connected, almost always.
The method of proof is first of all to show that if a point belongs to the core, then certain generalized symmetry conditions in terms of 'pivotal' coalitions of size 2q-n must be satisfied. Secondly, it is shown that these symmetry conditions can almost never be satisfied when either W has empty boundary and is of dimension w(n,q) or when W has non-empty boundary and is of dimension w(n,q)+1.https://resolver.caltech.edu/CaltechAUTHORS:20170908-145130668Passing the President's Program: Public Opinion and Presidential Influence in Congress
https://resolver.caltech.edu/CaltechAUTHORS:20170918-154335090
DOI: 10.7907/pqn5t-wf122
Correlations between legislative support scores and presidential popularity do not accurately reflect the relationship between public opinion and presidential influence in Congress. Presidents make strategic choices to expend their public prestige to obtain congressional approval of programmatic initiatives. Previous studies have ignored such choices as well as other features of the strategic environment which tend to lower the apparent legislative success rates of popular presidents. A model of presidential and congressional behavior is proposed and it is estimated that a one percent increase in a president's public support level increases the president's legislative approval rate by approximately one percent (holding program size fixed).https://resolver.caltech.edu/CaltechAUTHORS:20170918-154335090A Note on the International Coffee Agreement
https://resolver.caltech.edu/CaltechAUTHORS:20170918-143601562
DOI: 10.7907/h6gw7-jak79
This research note develops a model of the institutional features of the international coffee agreement and analyzes the allocation of export quotas under the terms of the agreement in 1982. It suggests that the agreement can be viewed as a weighted majority voting game. It employs the assumption of rationality to predict how allocations should be made given the rules of the agreement and tests the model by determining whether the allocations which passed (failed) fell within (outside) of the solution of the game.https://resolver.caltech.edu/CaltechAUTHORS:20170918-143601562Simultaneous Equations Models for Dummy Endogenous Variables: A Game Theoretic Formulation with an Application to Labor Force Participation
https://resolver.caltech.edu/CaltechAUTHORS:20170919-140310752
DOI: 10.7907/f4sq0-rtn82
A game theoretic approach for formulating simultaneous equations models for dummy endogenous variables is proposed and applied to a study of husband/wife labor force participation. A distinctive feature of our approach is that the simultaneous model is derived from optimizing behavior as the outcome of a game between two players. The equilibrium concept used is that of Nash. In addition, we show that the logical consistency conditions implied by the usual simultaneous equation models with structural shift actually rule out simultaneity for the problem we consider; in our model, no logical consistency conditions are imposed on the parameters.https://resolver.caltech.edu/CaltechAUTHORS:20170919-140310752Speculative Holdings under Linear Expectation Processes---A Mean-Variance Approach
https://resolver.caltech.edu/CaltechAUTHORS:20170919-144858256
DOI: 10.7907/zedhe-z7874
In this paper, we consider a discrete-time abstract market model in which the associated commodity is storable. Also, instead of assuming expected-profit-maximizing speculators, we assume they employ a mean-variance approach.
Within this framework, given a non-degenerate quadratic inventory cost function and a linear expectation process, the optimal speculative carryover may be decomposed into four components of which two are special features arising from mean-variance considerations.
Furthermore, assuming a linear non-speculative excess demand function, Friedman's conjecture (i.e., profitable speculation necessarily stabilizes prices) holds from an ex ante point of view.https://resolver.caltech.edu/CaltechAUTHORS:20170919-144858256Stochastic Simulation of Labor Demand Under Wage Subsidization
https://resolver.caltech.edu/CaltechAUTHORS:20170918-152625166
DOI: 10.7907/grmee-8x614
The impact of a system of wage subsidies, funded by unemployment insurance vouchers, is evaluated by combining a set of disaggregated industry labor demand models with an input/output model. The program is shown to increase employment initially by lowering the cost of labor to firms. The disposable income of workers is increased, which acts as a macroeconomic stimulus. The success of the subsidy program depends to some extent on the degree to which demand induced by greater consumer spending is able to sustain higher employment levels. Overall, it is estimated that a four-quarter wage subsidy equivalent to 30 percent of prevailing industry wages results in a long-run decline in unemployment in excess of 1 percent.https://resolver.caltech.edu/CaltechAUTHORS:20170918-152625166Sophisticated Committees and Structure-Induced Equilibria in Congress
https://resolver.caltech.edu/CaltechAUTHORS:20170919-132731264
DOI: 10.7907/wzfct-vx803
This article addresses the somewhat narrower topic of whether a theory of legislatures a la Shepsle can usefully and intelligibly accommodate the diversity in real-world legislatures, and whether in doing so it can retain its ability to predict political outcomes. I argue that Shepsle's theory is indeed useful for understanding Congress, in spite of its various limitations and simplifications, some of which are defended and others of which are corrected. Ultimately, I show how a proposed theoretical extension, while abstract, nevertheless says something concrete about how institutionalization in Congress can stabilize congressional outcomes. The essay begins with a nontechnical review of Shepsle's theory and its main result, and proceeds to extend the theory to situations in which committees are more attentive and responsive to the preferences of noncommittee members. New theoretical results are presented for "simple institutional arrangements" (SIAs) with sophisticated committees, and Fenno's House committees and Polsby's comments on institutionalization are reconsidered in light of the revised theory.https://resolver.caltech.edu/CaltechAUTHORS:20170919-132731264Products Liability, Corporate Structure and Bankruptcy: Toxic Substances and the Remote Risk Relationship
https://resolver.caltech.edu/CaltechAUTHORS:20170918-164344186
DOI: 10.7907/qz2gs-xn296
This paper first develops criteria by which courts can distinguish between product related risks that profit maximizing firms can and cannot be expected to discover. It then argues that imposing the former--"knowable"--set of risks on firms reduces accident costs and creates no problems that corporate and bankruptcy law cannot adequately solve. In contrast, imposing the latter--"remote"--set of risks has no effect on reducing accident costs and tends to produce unsolvable problems of the kind that characterize the current asbestos cases. The paper concludes by arguing that courts are wrong to create these problems because the victims of remote risks lack a tenable distributional or moral claim to have private firms reimburse them.https://resolver.caltech.edu/CaltechAUTHORS:20170918-164344186Imperfect Information and the Tender Offer Auction
https://resolver.caltech.edu/CaltechAUTHORS:20170915-155631380
DOI: 10.7907/taqrg-6bz64
Managers of companies for which takeover bids have been or are likely to be made--"targets"--engage in a variety of tactics designed to minimize the likelihood of a takeover or increase the price an acquirer must ultimately pay. The welfare effects of these tactics are in dispute. This paper considers one such tactic, the running of "auctions" by managers of a target after an initial bid has been made; an auction is held if, as is often the case, the target's managers can interest other companies in bidding. This paper argues that auctions reduce welfare because they dampen search for suboptimally run firms, and do not have a comparative advantage over unregulated markets in moving corporate assets to their highest valued uses. Further, the shareholders of targets do not have property rights to the gains from takeovers that auctions could be viewed as protecting. Hence, the law that now permits auctions to occur should be changed.https://resolver.caltech.edu/CaltechAUTHORS:20170915-155631380Sequential Elections with Limited Information
https://resolver.caltech.edu/CaltechAUTHORS:20170919-152059788
DOI: 10.7907/v23kd-qbn10
We develop theoretically and test experimentally a one-dimensional model of two-candidate competition with incomplete information. We consider a sequence of elections in which the same general issue predominates from election to election, but where the voters have no contemporaneous information about the policy positions adopted by the candidates, and where the candidates have no contemporaneous information about the preferences of the voters. Instead, participants have access only to contemporaneous endorsement data of an interest group, and to historical policy positions of the previous winning candidates. We define a stationary rational expectations equilibrium to the resulting (repeated) game of incomplete information, and show that in equilibrium, all participants, voters and candidates alike, end up acting as if they had complete information: voters end up voting for the correct candidate, and candidates end up converging to the median voter.https://resolver.caltech.edu/CaltechAUTHORS:20170919-152059788Two-Stage Conditional Maximum Likelihood Estimation of Econometric Models
https://resolver.caltech.edu/CaltechAUTHORS:20170919-135009248
DOI: 10.7907/36n4e-6tb16
Recent works on Maximum Likelihood (ML) estimation have focused on the behavior of the ML estimator when the model is possibly misspecified [Gourieroux, Monfort and Trognon (1984), Vuong (1983), White (1982, 1983a, b)]. This paper studies a general method, called two-stage conditional maximum likelihood (2SCML) estimation, for generating consistent estimates. In particular, asymptotic properties of 2SCML estimators are derived under correct and incorrect specification of the statistical model. Necessary and sufficient conditions for asymptotic efficiency of 2SCML estimators for all or some of the parameters are obtained. It is also argued that 2SCML estimators can readily be used to construct tests for exogeneity and model misspecification of the Hausman (1978) and White (1982) type. Examples are given to illustrate the applicability of the method. These include the linear simultaneous equation model, the simultaneous probit model and the simple Tobit model.https://resolver.caltech.edu/CaltechAUTHORS:20170919-135009248Equilibrium Selection in Signaling Games
https://resolver.caltech.edu/CaltechAUTHORS:20170915-161826033
DOI: 10.7907/yf3x1-6g428
We present a refinement of the set of sequential equilibria [Kreps & Wilson (1982)] for generic signaling games based on rationality postulates for off-the-equilibrium-path beliefs. This refinement concept eliminates equilibria which Kreps (1985) and others dismiss on intuitive grounds. In addition, we derive a characterization of the set of stable equilibria [Kohlberg and Mertens (1982)] for generic signaling games in terms of equilibrium strategies and restrictions on beliefs. Examples are given which differentiate the predictions of these equilibrium concepts.https://resolver.caltech.edu/CaltechAUTHORS:20170915-161826033Generalized Symmetry Conditions at a Core Point
https://resolver.caltech.edu/CaltechAUTHORS:20170918-145109056
DOI: 10.7907/scsnd-3xh89
Previous analyses have shown that if a point x is to be a core of a majority rule voting game in Euclidean space, when preferences are smooth, then the utility gradients must satisfy certain restrictive symmetry conditions. In this paper these results are generalized to the case of an arbitrary voting rule, and necessary and sufficient conditions, expressed in terms of "pivotal" coalitions, are obtained.https://resolver.caltech.edu/CaltechAUTHORS:20170918-145109056Price Effects of Energy-Efficient Technologies: A Study of Residential Demand for Heating and Cooling
https://resolver.caltech.edu/CaltechAUTHORS:20170918-155414017
DOI: 10.7907/xjhp1-47t60
This paper applies a mixed engineering and econometric model to empirically analyze behavioral interaction with new energy-efficient appliances and thermal improvements. The hypothesis is that energy efficient technologies lower the effective price of the services they provide and consequently reduce electricity consumption by smaller amounts than would be anticipated in engineering estimates. The approach incorporates prior engineering knowledge about the interactive effects of weather, appliance efficiencies, and thermal integrity of dwellings to explore treatment groups in an experiment conducted in Florida.https://resolver.caltech.edu/CaltechAUTHORS:20170918-155414017Pay and Performance in Baseball: Modeling Regulars, Reserves and Expansion
https://resolver.caltech.edu/CaltechAUTHORS:20170919-160534802
DOI: 10.7907/cvsge-4y440
Although the relationship between pay and performance in baseball has been convincingly demonstrated by Scully, a number of unresolved questions remain. Using a large sample of player salaries from contracts on file at the American League office, new estimates of this relationship are reported. The primary findings are as follows. First, while Scully's basic results are qualitatively robust, the salary elasticities for various performance and experience variables are substantially lower for our sample and specification. Second, for most variables, recent performance, as well as career average, contributes to the explanation of salary differences. Third, expansion has a significant effect on salary structure, and, in our model, makes it statistically invalid to estimate a single salary equation from pooled time-series data that includes an expansion year.https://resolver.caltech.edu/CaltechAUTHORS:20170919-160534802A Technique for Estimating Legislators' Ideal Points on Concrete Policy Dimensions
https://resolver.caltech.edu/CaltechAUTHORS:20170918-162928744
DOI: 10.7907/dp4sk-33x28
[No abstract]https://resolver.caltech.edu/CaltechAUTHORS:20170918-162928744Hedging as 'Speculation on the Basis'
https://resolver.caltech.edu/CaltechAUTHORS:20170918-144245730
DOI: 10.7907/mxj4v-t7r51
Holbrook Working has described hedging as "speculation on the basis" and has argued that traders engage in hedging because they believe they can do better from hedging than from not hedging, in contrast to the Keynes-Hicks-Kaldor view that hedging is an activity undertaken to shift risk to other traders.
In this paper, we derive necessary and sufficient conditions for hedging ("speculation on the basis") to be preferred by all risk averse traders to speculating on the cash market. When the futures market is in fact a forward market, short hedging is preferred to not hedging by all risk averse traders if and only if there is a contango on the market. In the case of a "true" futures market, it is shown that short hedging is preferred to an unhedged position by all risk averse traders if and only if the "Houthakker effect" is sufficiently strong. These results are derived for the "all or nothing" case of comparisons between an unhedged and a completely hedged position.https://resolver.caltech.edu/CaltechAUTHORS:20170918-144245730The Economic Value of Resource Flexibility
https://resolver.caltech.edu/CaltechAUTHORS:20170915-151842832
DOI: 10.7907/a4tfn-kkr55
The economic value of resource flexibility is discussed in the context of a model of a two-stage production process with stochastic variability. A unique optimal barrier policy is characterized with any level of capacity flexibility. The more flexible the capacities, the lower the optimal inventory limit and the higher the profit of the producer. A diffusion approximation of the model is also discussed.https://resolver.caltech.edu/CaltechAUTHORS:20170915-151842832A Note on Taxation, Development and Representative Government
https://resolver.caltech.edu/CaltechAUTHORS:20170915-154757538
DOI: 10.7907/cadhs-w5m12
This article proposes a model representing the relationship between economic actors and revenue seeking governments. Given a need for revenues, the model predicts the allocation of the new tax burden and patterns of control over public policy.
The model is motivated by the history of the rise of parliaments in Western Europe. It is extended to urban and developmental politics. It is designed to employ the techniques of "neo-classical" economics to explore themes which have been developed most clearly in Marxist writings.
Most importantly, the model suggests the way in which, given a need for revenues, specific fractions of the private sector can gain control over public policy. And it characterizes precisely the factors which yield differences in the ability of economic agents to employ the market to defect from the tax-levying state. The analysis thus gives insight into both the origins and the limitations of political democracy.https://resolver.caltech.edu/CaltechAUTHORS:20170915-154757538Cournot Oligopoly with Information Sharing
https://resolver.caltech.edu/CaltechAUTHORS:20170918-132505537
DOI: 10.7907/zb522-k0839
This paper studies the incentives for information sharing among firms in a Cournot oligopoly facing linear uncertain demand and an affine conditional-expectation information structure. No information sharing is found to be the unique equilibrium in two cases, in which signals of equal precision are assumed to be indivisible and infinitely divisible, respectively. However, the nonpooling equilibrium converges to the pooling outcome as the amount of information increases. Hence efficiency is achieved in the competitive equilibrium as the number of firms becomes large.https://resolver.caltech.edu/CaltechAUTHORS:20170918-132505537Tournament Methods in Choice Theory
https://resolver.caltech.edu/CaltechAUTHORS:20170918-135038477
DOI: 10.7907/fbbtk-sx278
Choice procedures using the notion of a tournament matrix are investigated in the framework of general choice theory. Tournament procedures of multicriterial choice are introduced and studied. New characteristic conditions describing some tournament and other essentially nonclassical choice functions are obtained. Tournament and graph-dominant choice mechanisms are also compared.https://resolver.caltech.edu/CaltechAUTHORS:20170918-135038477More on Harsanyi's Utilitarian Cardinal Welfare Theorem
https://resolver.caltech.edu/CaltechAUTHORS:20170918-163836570
DOI: 10.7907/q0pxs-qmw64
If individuals and society both obey the expected utility hypothesis and social alternatives are uncertain, then the social utility must be a linear combination of the individual utilities, provided the society is indifferent when all its members are. This result was first proven by Harsanyi [4] who made implicit assumptions in the proof not actually needed for the result (see [5]). This note presents a straightforward proof of Harsanyi's theorem based on a separating hyperplane argument.https://resolver.caltech.edu/CaltechAUTHORS:20170918-163836570An Efficient Market Mechanism with Endogenous Information Acquisition: Cournot Oligopoly Case
https://resolver.caltech.edu/CaltechAUTHORS:20170915-150746484
DOI: 10.7907/hpk6d-g3j38
The question of how an efficient competitive equilibrium can be reached through a pricing mechanism in which information acquisition is endogenously determined is addressed. The traditional oligopoly market is extended to include an ex ante information market when there is uncertainty in either the cost function or the demand function. Equilibrium behavior is characterized in a two-stage noncooperative game involving n production firms and m research firms in the industry. As the environment becomes more competitive, meaning that both the information market and the tangible-good market become large, the equilibrium random price of the product converges almost surely to the competitive price level, and consequently total social welfare (consumer plus producer surplus) is maximized.https://resolver.caltech.edu/CaltechAUTHORS:20170915-150746484The Scope of the Hypothesis of Bayesian Equilibrium
https://resolver.caltech.edu/CaltechAUTHORS:20170919-145551174
DOI: 10.7907/nz1n2-wkn56
What behavior can be explained as the Bayes equilibrium of some game? The main finding is that almost any behavior can. Given any Bayesian (coordination) game with positive priors and any vector of nondominated strategies, there is an increasing transformation of each utility function such that the given vector of strategies is a Bayes (Nash) equilibrium of the transformed game. Thus any nondominated behavior can be rationalized as Bayes equilibrium behavior. Some comments on the implications of these results for game theory are included.https://resolver.caltech.edu/CaltechAUTHORS:20170919-145551174Elections with Limited Information: A Multidimensional Model
https://resolver.caltech.edu/CaltechAUTHORS:20170919-154054641
DOI: 10.7907/ebpx6-ssr61
We develop a game-theoretic model of two-candidate competition over a multidimensional policy space, where the participants have incomplete information about the preferences and strategy choices of other participants. The players consist of the voters and the candidates. Voters are partitioned into two classes, depending on the information they observe: informed voters observe candidate strategy choices, while uninformed voters do not. All players (voters and candidates alike) observe contemporaneous poll data broken down by various subgroups of the population.
The main results of the paper give conditions on the number and distribution of the informed and uninformed voters which are sufficient to guarantee that any equilibrium (or voter equilibrium) extracts all information.https://resolver.caltech.edu/CaltechAUTHORS:20170919-154054641Unanimous Consent Agreements: Going along in the Senate
https://resolver.caltech.edu/CaltechAUTHORS:20170915-153916911
DOI: 10.7907/wqbwh-0yq59
In recent decades the U. S. Senate has made increasing use of complex unanimous consent agreements (UCAs) to set a time for a final vote on legislation (thereby precluding filibusters) and to specify, for example, who may offer what amendments. Because of the numerous dilatory tactics permitted in the absence of a UCA, controversial legislation is typically doomed unless a prior agreement has been reached. Thus the norm of consent to unanimous consent requests (UCRs) is puzzling. This paper addresses the puzzle with a decision-theoretic model of consent which yields what appears to be a rather stringent condition for objection to a UCR. Two actual cases of objection are analyzed and seem quite consistent with comparative statics results derived from the model. A concluding discussion considers UCAs as instances of endogenously chosen institutions which provide Senate leaders with opportunities to induce cooperative behavior.https://resolver.caltech.edu/CaltechAUTHORS:20170915-153916911A Retrospective on Retrospective Voting
https://resolver.caltech.edu/CaltechAUTHORS:20170919-155239224
DOI: 10.7907/ka0z7-36045
This paper critically reviews the extensive literature on retrospective voting in response to economic conditions. Each of the major types of analyses which have been performed--time-series analyses of national vote totals, presidential popularity, and cross-sectional analyses of individual survey responses--has raised several interesting and important questions. The answers which have been obtained, however, are only partial and limited, as each of these approaches entails serious problems of estimation and interpretation. Further progress in this area, we argue, requires explicit treatment of conceptual and statistical issues which have hindered previous research: the dynamic formulation of expectations and preferences, the incidence of policy (and nonpolicy) effects across the population, and notions of incumbency and political responsibility.https://resolver.caltech.edu/CaltechAUTHORS:20170919-155239224Economic Analysis of Brain Drain
https://resolver.caltech.edu/CaltechAUTHORS:20170918-141006507
DOI: 10.7907/saz7r-2d590
In this paper, we consider the possibility of signaling and an individual's two-stage decision procedure within an asymmetric-information framework to investigate the brain drain phenomenon. The results indicate the relationship between an individual's ability and the signal he will choose at a rational expectations equilibrium. Also, the persons who will remain abroad are identified. Where the ranking of universities provides the signal to domestic employers, these results can be interpreted as an association between students of a particular quality and the corresponding quality of the universities they choose to attend to obtain Ph.D.'s. Moreover, we can predict whether these graduating Ph.D.'s choose to return home or remain abroad.https://resolver.caltech.edu/CaltechAUTHORS:20170918-141006507Existence of Permutation Cycles and Manipulation of Choice Functions
https://resolver.caltech.edu/CaltechAUTHORS:20170918-142711751
DOI: 10.7907/zcrwa-ba378
Let σ be a social preference function, and let v(σ) be the Nakamura number of σ. If W is a finite set of cardinality at least v(σ), then it is shown that there exists an acyclic profile P on W such that σ(P) is cyclic. Any choice function which is compatible with σ can then be manipulated. A similar result holds if W is a manifold (or a subset of Euclidean space) of dimension at least v(σ) - 1.https://resolver.caltech.edu/CaltechAUTHORS:20170918-142711751Optimal Research for Cournot Oligopolists
https://resolver.caltech.edu/CaltechAUTHORS:20170915-164128100
DOI: 10.7907/jdn82-4bs44
We extend the classical Cournot model to take account of uncertainty in either the cost function or the demand function. By undertaking research, firms can acquire private (asymmetric) information to help resolve their uncertainty and make a more informed production decision. The model is a two stage game: in the first stage research levels are chosen, and in the second stage, conditional on private research outcomes, production decisions are made.
We find that for a linear, continuous information structure there is a unique Nash equilibrium to the game. In the equilibrium there may be an inefficient amount of aggregate research and there may be incomplete pooling as well.
The model specializes to the classical case when the cost of research is zero (and each firm gains essentially the same information by doing an infinite amount of research) or when the cost of research is so high no firm undertakes research.https://resolver.caltech.edu/CaltechAUTHORS:20170915-164128100Voting Operators in the Space of Choice Functions
https://resolver.caltech.edu/CaltechAUTHORS:20170918-132850289
DOI: 10.7907/k79pf-zr643
Assuming the individual and collective opinions are given as choice functions, a new formalization of the voting problem is considered. The notions of a local functional operator and the closedness of domains in choice-functional space relative to local operators are introduced. The problem of voting is reduced to an analysis of three kinds of operator classes and their mutual relations.
Functional analogues of results well known in the theory of Arrow's paradox are established.https://resolver.caltech.edu/CaltechAUTHORS:20170918-132850289Settlement and Litigation Under Alternative Legal Systems
https://resolver.caltech.edu/CaltechAUTHORS:20170915-163111340
DOI: 10.7907/81733-hx598
We consider a situation in which one party (the plaintiff) has a legally admissible claim for damages from another party (the defendant). The level of damage is known to the plaintiff; the defendant knows only its distribution, which is assumed to be continuous on some range. Before a trial takes place, the plaintiff makes a settlement demand. If the defendant rejects the demand, the court settles the dispute. We characterize the plaintiff's settlement demand policy and the defendant's probability of rejection policy for both separating and pooling equilibria. In the separating equilibrium, the defendant correctly infers the level of true damage from the settlement demand made by the plaintiff. In this case we show that, under risk neutrality, the equilibrium probability of a trial (as a function of true damages) is independent of the allocation of litigation costs. We also analyze the comparative statics of the equilibrium policies and compare them for specific litigation cost allocation systems.https://resolver.caltech.edu/CaltechAUTHORS:20170915-163111340Asymmetric Arbitrage and the Pattern of Futures Prices
https://resolver.caltech.edu/CaltechAUTHORS:20170918-162413469
DOI: 10.7907/rb1ct-0g817
Since Keynes first argued that backwardation was the normal state of affairs on futures markets, there have been several theoretical explanations for its existence. In particular, Fort and Quirk have shown that the "Houthakker effect" can lead to a backwardation equilibrium. In this paper, we consider another argument for backwardation suggested by Houthakker, namely, asymmetric arbitrage. Our conclusions are generally negative, despite its intuitive appeal. Specifically, in a world with an equal number of short and long hedgers, with identical utility functions and densities over cash and futures prices, if the futures market is a forward market, then in a rational expectations framework, asymmetric arbitrage has no effect on the pattern of futures (or cash) prices. If we are dealing with a true futures market, under the above assumptions, arbitrage will act to encourage short hedging and to discourage long hedging only under some restrictive conditions. Moreover, further quantitative restrictions must be imposed in order to derive a backwardation equilibrium under asymmetric arbitrage.https://resolver.caltech.edu/CaltechAUTHORS:20170918-162413469Agrarian Politics and Development
https://resolver.caltech.edu/CaltechAUTHORS:20170920-142841518
DOI: 10.7907/ps49v-a0w02
Rural insurrections in Third World nations transformed the study of agrarian politics into a recognized subfield of political development. They also discredited prevailing development theories while rendering development studies a subfield of political economy.
This essay reviews the major approaches to the study of agrarian politics. It emphasizes two major weaknesses: the assumption that development implies the demise of the rural sector and the inability of most "economic" approaches to incorporate institutional features of peasant societies, thereby creating a wasteful disjuncture between political economy and anthropology in the study of rural societies.
The collective choice approach, it is argued, rectifies these weaknesses and generates a fruitful agenda for new research into the political economy of Third World nations.https://resolver.caltech.edu/CaltechAUTHORS:20170920-142841518Heterogeneity in Models of Electoral Choice
https://resolver.caltech.edu/CaltechAUTHORS:20170920-155944617
DOI: 10.7907/0pd8p-9va06
Heterogeneity or the presence of a variety of decision rules in a population has usually been ignored in voting research. A method for handling heterogeneous preferences using rank order data is developed and applied to a simple issue-voting model. The estimated average effect of partisanship is substantially higher when the assumption of homogeneity is relaxed, though many self-identified partisans also use ideological criteria to evaluate candidates and many independents rely on partisan criteria.https://resolver.caltech.edu/CaltechAUTHORS:20170920-155944617The Winner's Curse and Cost Estimation Bias in Pioneer Projects
https://resolver.caltech.edu/CaltechAUTHORS:20170920-144446307
DOI: 10.7907/336bx-96046
Cost overruns are almost as massive and almost as pervasive on private "first of a kind" projects as on defense contracts. This paper examines the cost overrun problem in terms of the methodology of cost estimation, abstracting from moral hazard problems. The main results of the paper are these: first, if cost estimators use an unbiased estimation methodology, then under certain monotonicity conditions, this will produce an observed cost underestimation bias, because cost estimates are used as a guide to project decision making; second, the more uncertainty there is with respect to the costs of a project, the larger will be the observed cost underestimation bias, assuming the estimator uses an unbiased estimation methodology; third, in bottom-up estimation, the most accurate of cost estimation methodologies, there is a built-in cost underestimation bias because the value of information is not incorporated into the cost estimate; fourth, the size of the underestimation bias in bottom-up estimation definitely increases with uncertainty only under rather stringent conditions on the construction production function.https://resolver.caltech.edu/CaltechAUTHORS:20170920-144446307Profitable Speculation and Linear Excess Demand
https://resolver.caltech.edu/CaltechAUTHORS:20170919-164532686
DOI: 10.7907/sq49a-56c51
Since Friedman maintained that profitable speculation necessarily stabilizes prices, there have been many debates. Farrell concluded these debates by showing that (i) for a two-period model, any continuous negatively sloped non-speculative excess demand function would validate Friedman's conjecture if there is no lag structure, and (ii) for a T-period model with T≥3, negatively sloped linear non-speculative excess demand is necessary and sufficient for Friedman's conjecture to be true if there is no lag structure. Later, Schimmler generalized Farrell's results to lag-responsive non-speculative excess demand cases.
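Farrell's two-period linear case can be illustrated with a small numerical sketch. The demand intercepts, slope, and speculative position below are illustrative assumptions, not values from the paper:

```python
# Two-period market with linear non-speculative excess demand
# f_t(P) = a_t - b*P and a speculator who buys q units in the
# low-price period and sells them in the high-price period.
def prices(a1, a2, b, q):
    # Market clearing f_t(P_t) + s_t = 0, with s_1 = +q (buy), s_2 = -q (sell)
    p1 = (a1 + q) / b
    p2 = (a2 - q) / b
    return p1, p2

a1, a2, b = 10.0, 20.0, 2.0
p1_no, p2_no = prices(a1, a2, b, 0.0)  # no speculation
p1_sp, p2_sp = prices(a1, a2, b, 2.0)  # speculator carries 2 units forward

profit = 2.0 * (p2_sp - p1_sp)
assert profit > 0                               # speculation is profitable
assert abs(p2_sp - p1_sp) < abs(p2_no - p1_no)  # and it narrows the price gap
```

With a negatively sloped linear excess demand, profitable carrying (buy low, sell high) necessarily moves the two prices toward each other, which is the stabilization claimed by Friedman's conjecture in this case.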
However, there are some problems in Farrell's and Schimmler's approaches which invalidate their proofs. In this paper, we will point out these problems and show that, after correcting these slips, Farrell's two results are in fact correct. Also, we will redo Schimmler's problem for time-independent non-speculative excess demand functions. The conclusions derived are (i) for two-period models, any continuously differentiable non-speculative excess demand f(P_t, P_(t-1)) with f_1(P_t, P_(t-1)) < 0, f_2(P_t, P_(t-1)) ≤ 0 [where f_(t-s+1)(P_t, P_(t-1)) = ∂f(P_t, P_(t-1))/∂P_s, s = t-1, t] will validate Friedman's conjecture; (ii) for T-period models (T≥3), within the class of twice-continuously differentiable functions, linear non-speculative excess demand functions f(P_t, P_(t-1), ⋯, P_(t-T+1)) satisfying f_1 < 0, f_2 = f_3 = ⋯ = f_T = 0 represent necessary and sufficient conditions for Friedman's conjecture to be true.https://resolver.caltech.edu/CaltechAUTHORS:20170919-164532686Price-Conveyed Information vs. Observed Insider Behavior: A Note on Rational Expectations Convergence
https://resolver.caltech.edu/CaltechAUTHORS:20170919-163219147
DOI: 10.7907/swecv-syw29
The recent experimental results of Plott and Sunder (1982) and Friedman, Harrison and Salmon (1983) on the ability of single commodity markets to "reveal" the underlying state to initially uninformed traders were potentially influenced by a design in which the set of informed traders was held constant throughout the life of the market. Hence the performance of uninformed traders in the market might have been predicated on their knowledge of, and the observed behavior of, the informed traders. The experiment discussed below is a replication of one market in Plott and Sunder (1982), with the added feature that the traders who were to be informed of the state differed from period to period. The results are equivalent to those of Plott and Sunder (1982) in the price dynamics, while less conclusive regarding the acquisition and use of the state-price correspondence by uninformed traders.https://resolver.caltech.edu/CaltechAUTHORS:20170919-163219147On the Existence of Cournot Equilibrium
https://resolver.caltech.edu/CaltechAUTHORS:20170920-133247687
DOI: 10.7907/fwc3g-bf816
This paper examines the existence of n-firm Cournot equilibrium in a market for a single homogeneous commodity. It proves that if each firm's marginal revenue declines as the aggregate output of other firms increases (which is implied by concave inverse demand) then a Cournot equilibrium exists, without assuming that firms have nondecreasing marginal cost or identical technologies. Also, if the marginal revenue condition fails at a "potential optimal point," there is a set of firms such that no Cournot equilibrium exists. The paper also contains an example of nonexistence with two nonidentical firms, each with constant returns to scale production.https://resolver.caltech.edu/CaltechAUTHORS:20170920-133247687An Equilibrium Model of Tax Compliance with a Bayesian Auditor and some 'Honest' Taxpayers
https://resolver.caltech.edu/CaltechAUTHORS:20170920-155049868
DOI: 10.7907/jsd2p-mrw27
Empirical work on tax compliance has yielded conservative estimates of unreported taxable income in the U.S. that average 10 to 15 percent of total taxable income for recent years. Moreover, it is held by many that the rate of noncompliance has been growing dramatically. This problem is widely perceived as one of eroding ethics—more and more people are ceasing to comply voluntarily and are instead acting "strategically" in response to the structure of the U.S. income tax laws. We propose a simple model of tax compliance in which an exogenously given fraction of taxpayers comply voluntarily, while the remainder behave strategically. We distinguish between a general decision to act strategically and a specific decision not to report honestly. This is done in an equilibrium setting where the IRS is allowed to adjust its audit policy in response to taxpayer behavior. Because the audit policy of the IRS is endogenous and thus co-determined with the reporting behavior of potential noncompliers, several non-intuitive results emerge. In particular, we find that an increase in the fraction of strategic taxpayers decreases the likelihood that a given strategic taxpayer fails to comply. In fact, the decrease in the likelihood of underreporting exactly offsets the increase in the fraction of strategic taxpayers, so that aggregate compliance (and net tax revenues) are unaffected.https://resolver.caltech.edu/CaltechAUTHORS:20170920-155049868Sequential Equilibrium Detection and Reporting Policies in a Model of Tax Evasion
https://resolver.caltech.edu/CaltechAUTHORS:20170919-161356687
DOI: 10.7907/zncgz-j6d73
Noncompliance with tax laws and other forms of criminal activity have typically been treated as equivalent; both have been modeled in decision-theoretic terms, with the same probability of detection applying to all agents. However, noncompliance with tax laws is different from other criminal activities because taxpayers are required to submit a preliminary accounting of their behavior, while potential criminals obviously are not. This preliminary round of information transmission differentiates individuals, and raises the possibility that it may not be optimal to apply the same probability of detection to all taxpayers.
We develop a game-theoretic model of income tax compliance in which the taxpayer possesses private information about his own income, while the IRS knows only the probability distribution according to which the taxpayer's income is realized. By investing effort, the IRS can (stochastically) verify a taxpayer's income. We characterize the sequential equilibrium for this game, which consists of a reporting rule for the individual taxpayer, and a verification policy for the IRS.
Our equilibrium has the feature that taxpayers with greater true income under-report less than those with lower true income, and efforts at verification are lower the greater is reported income. If individuals can be classified on the basis of some observable characteristic which is related to opportunities for income, we find that classes of taxpayers who enjoy greater opportunities for high income under-report to a greater extent; accordingly, more effort is devoted to their investigation. This is to be distinguished from the former result, which applies to different types of taxpayers within the same class.https://resolver.caltech.edu/CaltechAUTHORS:20170919-161356687On the Meaning of the Preponderance Test in Judicial Regulation of Chemical Hazard
https://resolver.caltech.edu/CaltechAUTHORS:20170920-135918033
DOI: 10.7907/d0gcb-rkv58
As usually defined, the preponderance test is a standard of proof which directs the jury to accept the plaintiff's version of the disputed facts if they are more probably true than not. But what happens when the most important disputed "facts" are judgments about probability? This paper offers an interpretation of the preponderance test which can be applied to this situation.
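In the paper's own notation (B for the benefit of the drug, C for the health cost if it is a teratogen, and p for the probability of teratogenicity), the two decision rules can be stated compactly:

```latex
% Preponderance rule when the disputed "fact" is the probability p of harm:
% find for the plaintiff iff it is more likely than not that p exceeds B/C,
\Pr\!\left(p > \frac{B}{C}\right) > \frac{1}{2},
% whereas expected-cost minimization finds for the plaintiff iff
E[p]\cdot C > B \;\Longleftrightarrow\; E[p] > \frac{B}{C}.
```

The two rules coincide exactly when the mean and the median of the jury's second-order distribution over p agree.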
In the example of the paper, B is the benefit of a drug, C is the health cost if it is a teratogen, and p is the probability of teratogenicity. The contested "fact" is the magnitude of p, the probability of harm. In the interpretation considered by the paper, the jury finds in favor of the plaintiff if the jury decides that it is more likely than not that p is greater than B/C. This definition of the preponderance test does not quite minimize expected costs, and compared with expected cost minimization it is likely to be biased toward under-protection when the health costs are high compared with the benefits. But when the mean and median of the second order probability of p are the same, the definition coincides with expected cost minimization. It is also shown that under a criterion of expected cost minimization, contrary to Posner, judicial error costs are not in general the same and the number of erroneous judgments favoring undeserving plaintiffs is not likely to be the same as the number of erroneous judgments favoring undeserving defendants.https://resolver.caltech.edu/CaltechAUTHORS:20170920-135918033Classification Theorem for Smooth Social Choice
https://resolver.caltech.edu/CaltechAUTHORS:20170920-141017060
DOI: 10.7907/4xz6g-t8k54
A classification theorem for voting rules on a smooth choice space W of dimension w is presented. It is shown that, for any non-collegial voting rule, σ, there exist integers v*(σ), w*(σ) (with v*(σ)https://resolver.caltech.edu/CaltechAUTHORS:20170920-141017060Impossible Choices
https://resolver.caltech.edu/CaltechAUTHORS:20170919-163158691
DOI: 10.7907/7n3dk-sfc33
This paper was written for, and (in part) presented at, a symposium at McKenna College in which Mr. Calabresi took part. The paper begins with a discussion of a number of ambiguities in the treatment of choice in Calabresi and Bobbitt's Tragic Choices and then proceeds to develop in two different, but I think complementary, directions. On the one hand, I use their shifts of position as an occasion, or opportunity, to work out what seems to me a more realistic account of how decision-makers choose among the alternatives they encounter. On the other hand, I suggest that the shifts of position that are visible at the surface of the argument are traceable to deeper tensions among the unstated, and perhaps not fully recognized, metaphysical presuppositions on which the argument rests.https://resolver.caltech.edu/CaltechAUTHORS:20170919-163158691Obstruction, Germaneness and Representativeness in Legislatures
https://resolver.caltech.edu/CaltechAUTHORS:20170920-151055384
DOI: 10.7907/1hts8-yr249
Obstruction has often been regarded as an abhorrent feature of American legislatures, but few attempts have been made to specify the conditions under which it occurs or the precise nature and degree of its putative evil. This paper presents a theory of decentralized decision-making that specifies the necessary and sufficient conditions for sophisticated obstruction by committees. The assumptions of the theory are embedded in a simulation model which generates preferences and status quo points, identifies outcomes under competing behavioral assumptions, and estimates the representativeness of outcomes as a function of legislators' ideal points. The results call for rejection of the hypothesis that obstruction leads to unrepresentative outcomes. A discussion of the House's discharge petition examines the findings in a richer congressional context.https://resolver.caltech.edu/CaltechAUTHORS:20170920-151055384Practical Implications of Game Theoretic Models of R and D
https://resolver.caltech.edu/CaltechAUTHORS:20170920-160742050
DOI: 10.7907/pmnaq-p1m95
The purpose of this paper is to survey recent game theoretic models of research and development, and to ask whether they yield practical implications or testable hypotheses. Although individual models have unambiguous implications, the array of existing models still generates considerable controversy. This heightens the interest in and need for empirical tests of these theories.https://resolver.caltech.edu/CaltechAUTHORS:20170920-160742050Turbulence, Cost Escalation, and Capital Intensity Bias in Defense Contracting
https://resolver.caltech.edu/CaltechAUTHORS:20170920-153210299
DOI: 10.7907/d6m91-dh281
[No abstract]https://resolver.caltech.edu/CaltechAUTHORS:20170920-153210299A Model of Tax Compliance Under Budget-Constrained Auditors
https://resolver.caltech.edu/CaltechAUTHORS:20170919-170653419
DOI: 10.7907/m19yv-3z206
In the midst of various taxpayer "revolts" and federal budget deficits of unprecedented magnitude, noncompliance with federal and state income tax laws has become an issue of significant policy concern. If the IRS' budget is limited, the probability that any individual taxpayer will be audited depends on the behavior of other taxpayers. Thus the problem of compliance involves a "congestion" effect, which generates strategic interaction among taxpayers as well as between taxpayers and the IRS. This paper reflects an initial attempt to explore how the combination of a strategic IRS and asymmetric information affects the traditional theoretical results on tax compliance behavior.https://resolver.caltech.edu/CaltechAUTHORS:20170919-170653419Voters, Absent and Present: A Review Essay
https://resolver.caltech.edu/CaltechAUTHORS:20170920-132151472
DOI: 10.7907/e9rps-t4827
A review of two recent books on the history of voting participation in America displays some of the conceptual and methodological advances as well as some of the frailties which are characteristic of the "new political history." After summarizing the explanations which Burnham and Kleppner offer for the collapse of northern white turnout in the early part of the twentieth century, its partial revival during the 1930s, and its decline since 1960, I evaluate the theories and methods they use in order to determine how well-founded their conclusions are. Adopting a rational choice-inspired standpoint rather than their sociological approaches suggests interpretations of the early twentieth century and 1960-1980 changes which are somewhat at variance with theirs.https://resolver.caltech.edu/CaltechAUTHORS:20170920-132151472A Groves-Like Mechanism in Risk Assessment
https://resolver.caltech.edu/CaltechAUTHORS:20170920-154439911
DOI: 10.7907/h567m-6v306
This paper links two research areas that have developed independently—incentive compatibility for public goods and elicitation of subjective probabilities. An analogy between incentives for reporting information in the two areas leads to the discovery of a new mechanism, based on the Groves mechanism, for eliciting subjective probabilities. In the public goods area, the analogy provides an extension of the basic theorem of truthful response to the more general case when one's true valuation of the public good is state dependent. In the risk assessment area, the analogy provides a generalization of the traditional reporting mechanisms, proper scoring rules, and in doing so establishes a representation theorem for them.
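A concrete instance of the traditional proper scoring rules mentioned above is the quadratic (Brier) rule. This is a standard textbook mechanism, not the paper's new Groves-like mechanism, and the numbers are illustrative:

```python
# Quadratic (Brier) scoring rule for a binary event: reward for reported
# probability r when the event outcome is x in {0, 1} is S(r, x) = 1 - (x - r)**2.
def expected_score(report, true_prob):
    # The agent's expected reward under its own subjective probability.
    return true_prob * (1 - (1 - report) ** 2) + (1 - true_prob) * (1 - report ** 2)

p = 0.3  # the agent's true subjective probability
# Search a fine grid of possible reports for the one with highest expected score.
best = max((r / 1000 for r in range(1001)), key=lambda r: expected_score(r, p))
assert abs(best - p) < 1e-3  # truthful reporting maximizes expected score
```

"Properness" is exactly this truth-telling property: the expected score is uniquely maximized by reporting one's actual subjective probability, which is the incentive feature the Groves-like mechanism generalizes.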
The paper considers three goals which a principal might have while choosing a transfer mechanism. These goals are: information pooling, strong research incentives for the agents, and identifiability of the agent with the best information. For two structures of information and the specific cases considered, the new mechanism performs well, compared with four traditional mechanisms, in achieving these goals.https://resolver.caltech.edu/CaltechAUTHORS:20170920-154439911Misspecification and Conditional Maximum Likelihood Estimation
https://resolver.caltech.edu/CaltechAUTHORS:20170920-161718102
DOI: 10.7907/30jn1-p6y19
Recently, White (1982) studied the properties of Maximum Likelihood estimation of possibly misspecified models. The present paper extends Andersen's (1970) results on Conditional Maximum Likelihood estimators (CMLE) to such a situation. In particular, the asymptotic properties of CMLE's are derived under correct and incorrect specification of the conditional model. Robustness of conditional inferences and estimation with respect to misspecification of the model for the conditioning variables is emphasized. Conditions for asymptotic efficiency of CMLE's are obtained, and specification tests à la Hausman (1978) and White (1982) are derived. Examples are also given to illustrate the use of CMLE's properties. These examples include the simple linear model, the multinomial logit model, the simple Tobit model, and the multivariate logit model.https://resolver.caltech.edu/CaltechAUTHORS:20170920-161718102Litigation of Settlement Demands Questioned by Bayesian Defendants
https://resolver.caltech.edu/CaltechAUTHORS:20170920-135310836
DOI: 10.7907/bgkb4-ykf40
This paper analyzes a stylized model of pretrial settlement negotiations in a personal-injury case. It is assumed that the prospective plaintiff knows the severity of his injury but that the prospective defendant has incomplete information. As a result of this information asymmetry a proportion of slightly-injured plaintiffs are tempted to inflate their settlement demands and a proportion of such demands are rejected by suspicious defendants. By analogy with other models of adverse selection (e.g., Rothschild-Stiglitz (1976)), the presence of slightly-injured plaintiffs imposes a negative externality on plaintiffs with genuine severe injuries since defendants cannot identify the severely-injured and sometimes reject their reasonable demands, forcing them into costly litigation. A filing fee imposed on minor claims is shown to displace the equilibrium but, paradoxically, to cause an increase in the frequency of litigation.
This model differs from recent contributions to the literature on pretrial negotiations under incomplete information. Unlike P'ng (1983) and Bebchuk (1983), the uninformed litigant in this model learns from the observed equilibrium behavior of the informed litigant. Unlike Ordover-Rubinstein (1983) and Salant-Rest (1982), settlement demands are endogenous.https://resolver.caltech.edu/CaltechAUTHORS:20170920-135310836Uncertainty and Shopping Behavior: An Experimental Analysis
https://resolver.caltech.edu/CaltechAUTHORS:20170920-150120256
DOI: 10.7907/bmg9m-8hh08
This paper reports experimental tests of three search equilibrium models. These models, which differ only in the search strategies available to the buyers, have qualitatively different equilibrium predictions: price distributions, single-price equilibria at the competitive price and at the monopoly price, and two-price equilibria. The experimental outcomes generally were consistent with the models' predictions. This suggests that debate on the utility of this class of models should shift to the realism of the models' assumptions rather than focus on their ability to characterize market outcomes. Also, since the basic models have been validated, the project of analyzing experimentally the results of relaxing some of their assumptions seems worthwhile.https://resolver.caltech.edu/CaltechAUTHORS:20170920-150120256Bargaining Theory for Games with Transferable Value
https://resolver.caltech.edu/CaltechAUTHORS:20170920-152358814
DOI: 10.7907/8t24j-yfh12
This paper presents an existence proof of a bargaining equilibrium set B*, in the case of games with transferable value, by making use of the Knaster-Kuratowski-Mazurkiewicz (KKM) Theorem. As a corollary, a proof of existence of the usual bargaining set B1 is obtained. Whereas previous proofs of B1 existence have made use of fixed point arguments, use of the KKM theorem provides an insight into possible extensions of the existence proof to the nontransferable value case.https://resolver.caltech.edu/CaltechAUTHORS:20170920-152358814Full Nash Implementation of Neutral Social Functions
https://resolver.caltech.edu/CaltechAUTHORS:20170921-132224090
DOI: 10.7907/g0bq8-ewe77
This paper characterizes neutral social functions that are fully implementable. A necessary condition for full implementation under either the Nash equilibrium concept or the strong Nash equilibrium concept is that the neutral social function being implemented be monotonic and simple. If a neutral monotonic social function is simple and the set of winning coalitions is nondictatorial then the social function is fully implementable by a set of Nash equilibria. For finite alternative sets a neutral monotonic social function will be fully implementable by a set of strong Nash equilibria if and only if it is simple and dictatorial.https://resolver.caltech.edu/CaltechAUTHORS:20170921-132224090Ethnicity and Electoral Choice: Mexican-American Voting Behavior in the California 30th Congressional District
https://resolver.caltech.edu/CaltechAUTHORS:20170921-150226745
DOI: 10.7907/qcwxe-tfn79
The 1982 election in California offers a unique natural experiment in ethnic and racial block voting. The race in the 30th Congressional District matched a well-financed Anglo Republican, John Rousselot, against an incumbent Hispanic, Marty Martinez, in a predominantly Hispanic seat. On the ballot with Martinez and Rousselot were the successful Republican candidates for Governor and the U. S. Senate, George Deukmejian and Pete Wilson, and the losing Democratic candidates, Tom Bradley (who is Black) and Jerry Brown. These variations in the race and ethnicity of the candidates on the ballot in 1982 can be used to estimate the impact of ethnic and racial consideration in voting decisions. The data for this study were gathered in two surveys of the 30th Congressional District of California. The first was a telephone survey of 455 respondents administered during the third week of October, 1982. The second was a poll of 409 voters as they left the voting booth on election day.https://resolver.caltech.edu/CaltechAUTHORS:20170921-150226745Fairness, Self-Interest, and the Politics of the Progressive Income Tax
https://resolver.caltech.edu/CaltechAUTHORS:20170921-135438907
DOI: 10.7907/q4svm-57n48
All advanced democracies have adopted income taxes with considerable progression in marginal tax rates. To explain this we examine the nature of individual and collective preferences over alternative tax schedules, in the context of a simple two-sector model. We first consider the case of altruistic or "sociotropic" citizens who view the income tax as a means of achieving a fairer or more egalitarian distribution of income. We show that greater marginal-rate progressivity may well be less fair; that a "fairest" tax, however defined, is always a linear or "flat-rate" schedule in which all incomes are taxed at the same marginal rate; and that with a purely sociotropic electorate there exists a flat-rate schedule which is a majority equilibrium. We then show that with "self-interested" voters who seek to minimize their own tax burdens, greater marginal-rate progression may well be preferred by middle- and upper-income voters; that for middle-income citizens the optimal schedule is a sharply progressive one; and that within the set of individually optimal schedules there exists a majority equilibrium, which is a progressive schedule which minimizes the burden on median-income or middle class citizens, at the expense of lower- and upper-income taxpayers.https://resolver.caltech.edu/CaltechAUTHORS:20170921-135438907Assessing the Partisan Effects of Redistricting
https://resolver.caltech.edu/CaltechAUTHORS:20170921-150957562
DOI: 10.7907/233hz-8nq06
The purpose of this paper is to assess the reality behind the politician's perception that redistricting matters. There are, of course, many dimensions to that perception since redistricting has many effects. This paper will focus on the impact that boundary changes have on the partisan composition of seats. In order to do this, it will be necessary to specify what the expected partisan effects of redistricting are and how they can be measured. Thus, the paper first explains how the impact of redistricting will vary with the strategy of particular plans. Following this, there is an exploration of some techniques for measuring the partisan impact of boundary changes, and then a detailed analysis of the most important Congressional redistricting in 1982—the Burton plan in California.https://resolver.caltech.edu/CaltechAUTHORS:20170921-150957562Covering, Dominance, and Institution Free Properties of Social Change
https://resolver.caltech.edu/CaltechAUTHORS:20170921-143236008
DOI: 10.7907/ktzhs-d3673
This paper shows that different institutional structures for aggregation of preferences under majority rule may generate social choices that are quite similar, so that the actual social choice may be rather insensitive to the choice of institutional rules. Specifically, in a multidimensional setting, where all voters have strictly quasi-concave preferences, it is shown that the "uncovered set" contains the outcomes that would arise from equilibrium behavior under three different institutional settings. The three institutional settings are two candidate competition in a large electorate, cooperative behavior in small committees, and sophisticated voting behavior in a legislative environment where the agenda is determined endogenously. Because of its apparent institution-free properties, the uncovered set may provide a useful generalization of the core when a core does not exist. A general existence theorem for the uncovered set is proven, and for the Downsian case, bounds for the uncovered set are computed. These bounds show that the uncovered set is centered around a generalized median set whose size is a measure of the degree of symmetry of the voter ideal points.https://resolver.caltech.edu/CaltechAUTHORS:20170921-143236008Common Knowledge and Consensus with Aggregate Statistics
https://resolver.caltech.edu/CaltechAUTHORS:20170921-144910869