Abstract: We consider cosmological evolution from the perspective of quantum information. We present a quantum circuit model for the expansion of a comoving region of space, in which initially unentangled ancilla qubits become entangled as expansion proceeds. We apply this model to the comoving region that now coincides with our Hubble volume, taking the number of entangled degrees of freedom in this region to be proportional to the de Sitter entropy. The quantum circuit model is applicable for at most 140 e-folds of inflationary and post-inflationary expansion: we argue that no geometric description was possible before the time t_1 when our comoving region was one Planck length across and contained one pair of entangled degrees of freedom. This approach could provide a framework for modeling the initial state of inflationary perturbations.

Publication: arXiv
ID: CaltechAUTHORS:20170228-191741490

Abstract: Some modern cosmological models predict the appearance of Boltzmann Brains: observers who randomly fluctuate out of a thermal bath rather than naturally evolving from a low-entropy Big Bang. A theory in which most observers are of the Boltzmann Brain type is generally thought to be unacceptable, although opinions differ. I argue that such theories are indeed unacceptable: the real problem is with fluctuations into observers who are locally identical to ordinary observers, and their existence cannot be swept under the rug by a choice of probability distributions over observers. The issue is not that the existence of such observers is ruled out by data, but that the theories that predict them are cognitively unstable: they cannot simultaneously be true and justifiably believed.

Publication: arXiv
ID: CaltechAUTHORS:20170209-151733308

Abstract: ATLAS and CMS have each reported a modest diphoton excess consistent with the decay of a broad resonance at ~750 GeV. We show how this signal can arise in a weakly coupled theory composed solely of narrow-width particles. In particular, if the decaying particle is produced off-shell, then the associated diphoton resonance will have a broad, adjustable width. We present simplified models which explain the diphoton excess through the three-body decay of a scalar or fermion. Our minimal ultraviolet completion is a weakly coupled and renormalizable theory of a singlet scalar plus a heavy vector-like quark and lepton. The smoking gun of this mechanism is an asymmetric diphoton peak recoiling against missing transverse energy, jets, or leptons.

Publication: arXiv
ID: CaltechAUTHORS:20160113-132530546

Abstract: We study dark matter (DM) which is cosmologically long-lived because of standard model (SM) symmetries. In these models an approximate stabilizing symmetry emerges accidentally, in analogy with baryon and lepton number in the renormalizable SM. Adopting an effective theory approach, we classify DM models according to representations of $SU(3)_C\times SU(2)_L\times U(1)_Y \times U(1)_B\times U(1)_L$, allowing for all operators permitted by symmetry, with weak scale DM and a cutoff at or below the Planck scale. We identify representations containing a neutral long-lived state, thus excluding dimension four and five operators that mediate dangerously prompt DM decay into SM particles. The DM relic abundance is obtained via thermal freeze-out or, since effectively stable DM often carries baryon or lepton number, asymmetry sharing through the very operators that induce eventual DM decay. We also incorporate baryon and lepton number violation with a spurion that parameterizes hard breaking by arbitrary units. However, since proton stability precludes certain spurions, a residual symmetry persists, maintaining the cosmological stability of certain DM representations. Finally, we survey the phenomenology of effectively stable DM as manifested in probes of direct detection, indirect detection, and proton decay.

Publication: arXiv
ID: CaltechAUTHORS:20150731-184249348

Abstract: Many modern cosmological scenarios feature large volumes of spacetime in a de Sitter vacuum phase. Such models are said to be faced with a "Boltzmann Brain problem" - the overwhelming majority of observers with fixed local conditions are random fluctuations in the de Sitter vacuum, rather than arising via thermodynamically sensible evolution from a low-entropy past. We argue that this worry can be straightforwardly avoided in the Many-Worlds (Everett) approach to quantum mechanics, as long as the underlying Hilbert space is infinite-dimensional. In that case, de Sitter settles into a truly stationary quantum vacuum state. While there would be a nonzero probability for observing Boltzmann-Brain-like fluctuations in such a state, "observation" refers to a specific kind of dynamical process that does not occur in the vacuum (which is, after all, time-independent). Observers are necessarily out-of-equilibrium physical systems, which are absent in the vacuum. Hence, the fact that projection operators corresponding to states with observers in them do not annihilate the vacuum does not imply that such observers actually come into existence. The Boltzmann Brain problem is therefore much less generic than has been supposed.

Publication: arXiv
ID: CaltechAUTHORS:20150814-091340697

Abstract: It is commonplace in discussions of modern cosmology to assert that the early universe began in a special state. Conventionally, cosmologists characterize this fine-tuning in terms of the horizon and flatness problems. I argue that the fine-tuning is real, but these problems aren't the best way to think about it: causal disconnection of separated regions isn't the real problem, and flatness isn't a problem at all. Fine-tuning is better understood in terms of a measure on the space of trajectories: given reasonable conditions in the late universe, the fraction of cosmological histories that were smooth at early times is incredibly tiny. This discussion helps clarify what is required by a complete theory of cosmological initial conditions.

Publication: arXiv
ID: CaltechAUTHORS:20140716-105154654

Abstract: The standard ΛCDM model provides an excellent fit to current cosmological observations but suffers from a potentially serious Boltzmann Brain problem. If the universe enters a de Sitter vacuum phase that is truly eternal, there will be a finite temperature in empty space and corresponding thermal fluctuations. Among these fluctuations will be intelligent observers, as well as configurations that reproduce any local region of the current universe to arbitrary precision. We discuss the possibility that the escape from this unacceptable situation may be found in known physics: vacuum instability induced by the Higgs field. Avoiding Boltzmann Brains in a measure-independent way requires a decay timescale of order the current age of the universe, which can be achieved if the top quark pole mass is approximately 178 GeV. Otherwise we must invoke new physics or a particular cosmological measure before we can consider ΛCDM to be an empirical success.

ID: CaltechAUTHORS:20130925-113239870

Abstract: We briefly review the motivations and features of focus point supersymmetry, and in particular the focus point region of the CMSSM. Applying the constraint that the neutralino is a thermal relic, we examine current and projected collider and dark matter constraints on the focus point region. We demonstrate that the focus point region is currently constrained by multiple dark matter experiments, and that future sensitivity on multiple fronts will probe large portions of the parameter space.

ID: CaltechAUTHORS:20130708-081244977
