Abstracts

 

The preparation problem


Eric Cavalcanti (Centre for Quantum Dynamics, Griffith University) and Nick Menicucci (Perimeter Institute)


The effects of closed timelike curves (CTCs) on quantum dynamics, and their consequences for information processing, have recently become the subject of a heated debate. Deutsch introduced a formalism for treating CTCs in a quantum computational framework. He postulated a consistency condition on the chronology-violating systems, which leads to a nonlinear evolution of the systems that come to interact with the CTC. This has been shown to allow tasks which are impossible under ordinary linear quantum evolution, such as computational speed-ups over (linear) quantum computers and perfect discrimination of non-orthogonal quantum states.
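Deutsch's consistency condition can be checked numerically: the state rho of the chronology-violating (CTC) system must be a fixed point of the map induced by its interaction with the chronology-respecting (CR) system. A minimal sketch follows; the unitary (a CNOT) and the input state are illustrative choices, not taken from the talk.

```python
import numpy as np

# Deutsch's consistency condition: the CTC state rho must satisfy
#     rho = Tr_CR[ U (rho_in (x) rho) U^dagger ]
# Illustrative choice: U = CNOT with the CR qubit as control, CR input |+>.

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def induced_map(rho, rho_in, U):
    joint = U @ np.kron(rho_in, rho) @ U.conj().T
    # partial trace over the chronology-respecting (first) qubit
    return joint[:2, :2] + joint[2:, 2:]

def deutsch_fixed_point(rho_in, U, guess, iters=100):
    rho = guess
    for _ in range(iters):
        rho = induced_map(rho, rho_in, U)
    return rho

plus = 0.5 * np.ones((2, 2), dtype=complex)   # |+><+| on the CR qubit
ket0 = np.diag([1.0, 0.0]).astype(complex)    # initial guess |0><0|
rho_ctc = deutsch_fixed_point(plus, CNOT, ket0)

print(np.allclose(rho_ctc, induced_map(rho_ctc, plus, CNOT)))  # True
print(np.round(rho_ctc.real, 3))  # here the fixed point is maximally mixed
```

For this choice the induced map is rho -> (rho + X rho X)/2, so the iteration reaches the maximally mixed fixed point in a single step; richer unitaries give less trivial fixed points.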


Bennett and co-authors have argued, on the other hand, that nonlinear evolution allows no such exotic effects. They argued that all proofs of exotic effects due to nonlinear evolutions suffer from a fallacy they called the "linearity trap". Here we review the argument of Bennett and co-authors and show that there is no inconsistency in assuming linearity at the level of a classical ensemble, even in the presence of nonlinear quantum evolution. In fact, this is required for the very existence of empirically verifiable nonlinear evolution. The arguments for exotic quantum effects are thus seen to rest on the necessity of a fundamental distinction between proper and improper mixtures in the presence of nonlinear evolutions. We show how this leads to an operationally well-defined version of the measurement problem that we call the "preparation problem".

The zig-zag road to reality


Samuel Colin (Centre for Quantum Dynamics, Griffith University)


The de Broglie-Bohm pilot-wave program is an attempt to formulate quantum theory (including quantum field theory) as a theory without observers, by assuming that the wave-function is not the complete description of a system, but must be supplemented by additional variables (beables). Although much progress has been made towards extending the pilot-wave theory to quantum field theory, a compelling ontology for quantum field theory is still lacking, and the choice of beable is likely to be relevant for the study of quantum non-equilibrium systems and their relaxation properties (Valentini).

The present work is rooted in the fact that in the standard model of particle physics, all fermions are fundamentally massless and acquire their bare mass when the Higgs field condenses. In our attempt to build a pilot-wave model for quantum field theory in which beables are attributed to massless fermions, we are naturally led to Weyl spinors and to Penrose's zig-zag picture of the electron.

In my talk, I will sketch this attempt and highlight some of its remarkable properties: namely, that a positive-energy massive Dirac electron can be thought of as a superposition of positive- and negative-energy Weyl spinors of the same helicity, and that the massive Dirac electron can in principle move luminally at all times.

Based on a joint work with H. Wiseman.

Cosmological insight into fundamental physics


Tamara Davis (School of Maths and Physics, University of Queensland, and Dark Cosmology Centre, University of Copenhagen)


The last decade of astrophysics has shown more than ever before that cosmology can teach us about the nuts and bolts of basic physics. This has been driven by the discovery of the accelerating universe (dark energy) --- the theories proposed to explain dark energy often invoke new physics, such as brane-worlds arising from fledgling models of quantum gravity. It has become evident that the large timescales and spatial scales probed by cosmology allow us to learn about fundamental physics in a way inaccessible to any earth-bound experiment.


This talk will review my work as part of the ESSENCE and SDSS supernova surveys, and the WiggleZ Baryon Acoustic Oscillation survey, to test new fundamental physics. I'll present the latest data and discuss how the cosmological constraints will be improved in the future with more data, different types of data, and improved analysis techniques.

Protective Measurement and the Interpretation of the Wave Function


Shan Gao (Centre for Time, University of Sydney)


We investigate the validity of the field explanation of the wave function by analyzing the mass and charge density distributions of a quantum system. According to protective measurement, a charged quantum system has effective mass and charge density distributed in space, proportional to the square of the absolute value of its wave function. If the wave function is a description of a physical field, then the mass and charge density will be distributed in space simultaneously for a charged quantum system, and thus there will exist a remarkable electrostatic self-interaction of its wave function, though the gravitational self-interaction is too weak to be detected presently. This not only violates the superposition principle of quantum mechanics but also contradicts experimental observations.

The complementary contributions of free will, indeterminism and signalling to models of quantum correlations


Michael Hall (ANU)


To model statistical correlations that violate Bell inequalities (such as singlet-state correlations), one must relax at least one of three physically plausible postulates: measurement independence (experimenters can freely choose measurement settings independently of any underlying variables describing the system); no-signalling (underlying marginal distributions for one observer cannot depend on the measurement setting of a distant observer); and determinism (all outcomes are fully determined by the values of underlying variables).

It will be shown that, for any given model, one may quantify the degrees of measurement dependence, signalling and indeterminism, by three numbers M, S and I. It will further be shown how the Bell-CHSH inequality may be generalised to a "relaxed" Bell inequality, of the form

<XY> + <XY'> + <X'Y> - <X'Y'> <= B(I,S,M),

where the upper bound is tight and ranges between 2 and 4. The usual Bell-CHSH inequality corresponds to I=S=M=0. More generally, the bound B(I,S,M) quantifies the necessary mutual tradeoff between I, S and M that is required to model a given violation of the Bell-CHSH inequality.
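For reference, the quantum violation that the bound B(I,S,M) must accommodate can be checked numerically using the singlet correlator E(a,b) = -cos(a-b); the angle choices below are the standard maximising ones and are illustrative.

```python
import numpy as np

# CHSH value for singlet-state correlations E(a, b) = -cos(a - b),
# in the combination <XY> + <XY'> + <X'Y> - <X'Y'> used above.

def E(a, b):
    return -np.cos(a - b)

X, Xp = 0.0, np.pi / 2
Y, Yp = 5 * np.pi / 4, 3 * np.pi / 4

S = E(X, Y) + E(X, Yp) + E(Xp, Y) - E(Xp, Yp)
print(S)  # 2*sqrt(2) ~ 2.828, above the I = S = M = 0 bound of 2
```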

Some information-theoretic implications will be briefly described, as well as a no-signalling deterministic model of the singlet state that allows up to 86% experimental free will.

Local scale invariance as an alternative to Lorentz invariance


Sean Gryb (Perimeter Institute)


I will present a recent result showing that general relativity admits a dual description in terms of a 3D scale invariant theory. The dual theory was discovered by starting with the basic observation that, fundamentally, all observations can be broken down into local comparisons of spatial configurations. Thus, absolute local spatial size is unobservable. Inspired by this principle of "relativity of size", I will motivate a procedure that allows the refoliation invariance of general relativity to be traded for 3D local scale invariance. This trade does away with "many fingered time" and offers a new possibility for dealing with the many technical and conceptual difficulties associated with the Wheeler-DeWitt equation.

Entanglement and mixture in algebraic quantum field theory


Vincent Lam (University of Queensland)


The aim of this talk is to review and discuss some aspects of quantum entanglement in the quantum field theoretic (QFT) domain. The discussion takes place in the algebraic approach to QFT, the motivation for which is briefly discussed. We consider in what sense this approach is sometimes called 'local quantum theory'. We discuss a possible 'realist' understanding of quantum entanglement within this framework, addressing some conceptual and methodological worries raised by Einstein (among others).

$\rho$-ontism: thermodynamics and the interpretation of quantum theory


Owen Maroney (Centre for Time, University of Sydney)


Can a density matrix be regarded as a description of the physically real properties of an individual system? If so, it may be possible to attribute the same objective significance to statistical mechanical properties, such as entropy or temperature, as to properties such as mass or energy. Non-linear modifications to the evolution of a density matrix can be proposed, based upon this idea, to account for thermodynamic irreversibility. Traditional approaches to interpreting quantum phenomena assume that an individual system is described by a pure state, with density matrices arising only through a statistical mixture or through tracing out entangled degrees of freedom. Treating the density matrix as fundamental can affect the viability of some of these interpretations, and introducing thermodynamically motivated non-linearities will not, in themselves, help in solving the quantum measurement problem.

From timeless physical theory to timelessness


Peter Evans (Centre for Time, University of Sydney)


The extent to which Julian Barbour's Machian formulation of general relativity and his interpretation of canonical quantum gravity can be called timeless is addressed. We differentiate two types of timelessness in Barbour's work (1994a, 1994b and 1999) and attempt to refine Barbour's metaphysical claim by providing an account of the essential features of time through considerations of the representation of time in physical theory. We argue that Barbour's claim of timelessness is dubious with respect to his Machian formulation of general relativity but warranted with respect to his interpretation of canonical quantum gravity. We conclude by discussing some of the implications of Barbour's view.

Change of quantum reference frames; towards a quantum relativity principle?


Matt Palmer (School of Physics, University of Sydney)


An explicit description of a physical system is necessarily written with respect to a particular reference frame. It is important to know how to adapt the description when a different, equally valid, reference frame is chosen. In the case of classical frames there is a well-defined covariance of the description. The question we want to address is: How can we extend this description of change of reference frame to the case where the frames are quantum objects?

We study this problem within specific toy models, and approach it operationally. We define a procedure that will change the quantum reference frame with which a quantum system is described. We find this procedure induces decoherence in the system and is described by a non-unitary CP map, in interesting contrast to the reversible nature of the classical change-of-frame procedures.

Quantum information theory in curved spacetime


Maki Takahashi (School of Physics, University of Sydney)


We present a formalism describing the transport of the quantum spin state of massive fermions in curved space-time, for the purpose of studying relativistic quantum information phenomena such as entanglement and teleportation. We are concerned with answering the elementary question of how the state of a qubit transforms as it moves through a curved space-time manifold. The transport equation takes the form of Fermi-Walker transport of a two-component spinor, which will be shown to be unitary in the spinor's rest frame. The talk will summarise key results and highlight foundational issues, such as the absence of global parallelism, and conceptual difficulties regarding entanglement and teleportation.

Quantum control in foundational experiments: what can we say?


Daniel Terno (Macquarie University)


Wheeler's delayed choice (WDC) is one of the "standard experiments in foundations". It aims at the puzzle of a photon simultaneously behaving as wave and particle. The Bohr-Einstein debate on wave-particle duality prompted the introduction of Bohr's principle of complementarity --- "the study of complementary phenomena demands mutually exclusive experimental arrangements". In the WDC experiment the mutually exclusive setups correspond to the presence or absence of a second beamsplitter in a Mach-Zehnder interferometer (MZI). A choice of the setup determines the observed behaviour. The delay ensures that the behaviour cannot be adapted before the photon enters the MZI. Using WDC as an example, we show how the replacement of classical selectors by quantum gates streamlines experiments and impacts on foundational questions. We demonstrate measurements of complementary phenomena with a single setup, where the observed behaviour of the photon is chosen after it has already been detected. Spacelike separation of the setup components becomes redundant. The complementarity principle has to be reformulated --- instead of complementarity of experimental setups we now have complementarity of measurement results. Finally, we present a quantum-controlled scheme for Bell-type experiments.

A quantization quandary on the canonical road to quantum gravity


Karim Thebault (Centre for Time, University of Sydney)


Canonical quantization techniques are generally considered to provide one of the most rigorous methodologies for passing from a classical to a quantum description of reality. For classical Hamiltonian systems with constraints a number of such techniques are available (i.e. gauge fixing, Dirac constraint quantization, BRST quantization and geometric quantization), but all are arguably equivalent to the quantization of an underlying reduced phase space that parameterizes the "true degrees of freedom" and displays a symplectic geometric structure. The philosophical coherence of making any ontological investment in such a space for the case of canonical general relativity will be questioned here. Further to this, the particular example of Dirac quantization will be critically examined. Under the Dirac scheme the classical constraint functions are interpreted as quantum constraint operators restricting the allowed state vectors. For canonical general relativity this leads to the Wheeler-DeWitt equation and the infamous problem of time but, prima facie, seems to rely on our interpretation of the classical Poisson bracket algebra of constraints as the phase space realization of the theory's local symmetries (i.e. the group of space-time diffeomorphisms). As with the construction of an interpretively viable symplectic reduced phase space, this straightforward connection between constraints and local symmetry will be questioned for the case of GR. These issues cast doubt on the basis behind the derivation of the so-called wave function of the universe and give us grounds for re-examining the entire canonical quantum gravity program as currently constituted.

Algorithm for the shortest path through time


Joan Vaccaro (Centre for Quantum Dynamics, Griffith University)


Feynman showed that the path of least action is determined by quantum interference. The interference may be viewed as part of a quantum algorithm for minimising the action. In fact, Lloyd describes the Universe as a giant quantum computer whose purpose is to calculate its own state. Could the direction of time that the universe is apparently following be determined by a quantum algorithm? The answer lies in the violation of time reversal (T) invariance that is being observed in an increasing number of particle accelerator experiments. The violation signifies a fundamental asymmetry between the past and future and calls for a major shift in the way we think about time. Here we show that processes which violate T invariance induce destructive interference between different paths that the universe can take through time. The interference eliminates all paths except for two that represent continuously forwards and continuously backwards time evolution. This suggests that quantum interference from T-violating processes gives rise to the phenomenological unidirectional nature of time. A path consisting exclusively of forward steps is the shortest path to a point that lies in the forwards direction. The quantum interference, therefore, underlies a quantum algorithm that determines the shortest path through time.

Scale invariance, Weyl gravity, and Einstein's three objections


Hans Westman (School of Physics, University of Sydney)


Basic epistemological considerations suggest that the laws of nature should be scale invariant and that no fundamental length scale should exist in nature. Indeed, the standard model action contains only two terms that break scale invariance: the Einstein-Hilbert term and the Higgs mass term. We give a simple introduction to Weyl's 1918 scale invariant gravity based on basic epistemology and discuss the three main objections put forth by Einstein: 1) the hydrogen spectrum depends on the previous history of the atom (something which is empirically ruled out to high precision); 2) there is no account of proper time in Weyl's theory; and 3) the field equations are of 4th order, leading to Ostrogradsky-type instabilities. We show that the first two objections can readily be answered. In particular, the second objection is answered by developing a physical model of an ideal clock from which proper time is identified as the reading of the clock. We then outline an attempt to tackle the third objection by breaking foliation invariance and so introducing a preferred simultaneity. We show that Lorentz invariance can still be maintained if only the gravitational sector is sensitive to the preferred foliation. We impose the restrictions that I) the new theory should contain general relativity in the limit of zero scale curvature, II) no fundamental length scales should appear, and III) the field equations should be of second order.

Can ANY Description of Physical Reality Be Considered Complete? --- Bell retolled


Howard Wiseman (Centre for Quantum Dynamics, Griffith University)


Although the EPR paper is famous for arguing that quantum mechanics is incomplete, their detailed criterion for completeness has been largely ignored. Here I formalize this criterion, and show that their argument can be made absolutely rigorous, relying on no additional assumptions of a metaphysical nature (e.g. locality). If it is reasonable to similarly formalize Bohr's defence of the completeness of quantum mechanics, it would seem to rely upon a different concept of *disturbance*. Next, I propose a more general criterion for completeness, based on EPR's criterion. Using this, I derive a new formulation of Bell's theorem: any theory that predicts violation of a Bell inequality cannot be both complete and free of space-like disturbances. Crucially, this theorem holds for both EPR's and Bohr's concepts of disturbance.

Preparation Noncontextuality and Continuous Transformations of Quantum Systems


Joel Wallman (School of Physics, University of Sydney)


Traditionally, the focus in determining characteristic properties of quantum mechanics has been on properties such as entanglement. However, entanglement is a property of multiple systems. Another interesting question is to ask which properties are characteristic of single quantum systems. Two answers to this question are:

1. There is a continuous path of pure quantum states connecting any two quantum states [1], and

2. Quantum mechanics is preparation noncontextual [2].


In this talk, I will discuss a link between these two answers to this question. In particular, I will establish some strict upper bounds on the maximum size of the set of quantum states that can be modelled in a preparation noncontextual, nonnegative theory and show that this set contains pure states that cannot be connected to any other pure state in the set. I will also discuss a common example of a preparation noncontextual model that allows negative values, namely, a discrete Wigner function, and establish necessary and sufficient conditions for bases of an arbitrary dimensional Hilbert space to have nonnegative Wigner functions, i.e., to admit a classical model. I will conclude with a discussion of some open problems.


[1] L. Hardy, quant-ph/0101012v4 (2001).

[2] R. W. Spekkens, Phys. Rev. A, 71, 052108 (2005)
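The single-qubit case of such a Wigner-function test can be sketched numerically. This uses the standard Wootters phase-point operators, with illustrative states; it is not the arbitrary-dimension construction discussed in the talk.

```python
import numpy as np

# Single-qubit discrete Wigner function (Wootters construction):
#   W(u1, u2) = (1/2) Tr[rho A(u1, u2)],
#   A(u1, u2) = (1/2) [I + (-1)^u1 Z + (-1)^u2 X + (-1)^(u1+u2) Y].

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1.0, -1.0]).astype(complex)

def wigner(rho):
    W = np.zeros((2, 2))
    for u1 in range(2):
        for u2 in range(2):
            A = 0.5 * (I2 + (-1)**u1 * Z + (-1)**u2 * X + (-1)**(u1 + u2) * Y)
            W[u1, u2] = 0.5 * np.trace(rho @ A).real
    return W

# A stabilizer state such as |0><0| has a nonnegative Wigner function...
print(wigner(np.diag([1.0, 0.0]).astype(complex)))  # all entries >= 0

# ...while a pure state off the stabilizer axes goes negative.
r = -np.ones(3) / np.sqrt(3)   # Bloch vector -(1,1,1)/sqrt(3)
rho_neg = 0.5 * (I2 + r[0] * X + r[1] * Y + r[2] * Z)
print(wigner(rho_neg).min())   # (1 - sqrt(3))/4, about -0.183
```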

The status of determinism in noncontextual models of quantum theory


Rob Spekkens (Perimeter Institute)


In an ontological model of quantum theory that is Bell-local, one can assume without loss of generality that the outcomes of measurements are determined deterministically by the ontic states (i.e. the values of the local hidden variables). The question I address in this talk is whether such determinism can always be assumed in a noncontextual ontological model of quantum theory, in particular whether it can be assumed for nonprojective measurements. While it is true that one can always represent a measurement by a deterministic response function by incorporating ancillary degrees of freedom into one's description (for instance those of the apparatus), I show that in moving to such a representation, one typically loses the warrant to apply the assumption of measurement noncontextuality. The implications for experimental tests of measurement noncontextuality will be discussed.

Quantum Mechanics in the Presence of Time Machines


T. C. Ralph (Centre for Quantum Computer Technology, Department of Physics, University of Queensland)


We consider quantum mechanical particles that traverse general relativistic wormholes in such a way that they can interact with their own past, thus forming closed timelike curves. Using a simple geometric argument, we reproduce the solutions proposed by Deutsch for such systems. Deutsch's solutions have attracted considerable interest because they do not contain paradoxes; however, as originally posed, they do contain ambiguities. We show that these ambiguities are removed by following our geometric derivation.

A generalization of Noether's theorem and the information-theoretic approach to the study of symmetric dynamics


Iman Marvian (Perimeter Institute)


Information theory provides a novel approach to the study of the consequences of symmetry of dynamics, one which goes far beyond the traditional conservation laws and Noether's theorem. Conservation laws are not applicable to dissipative and open systems. In fact, as we will show, even in the case of closed-system dynamics, if the state of the system is not pure the conservation laws do not capture all the consequences of symmetry. Using an information-theoretic approach to this problem, we introduce new quantities, called asymmetry monotones, which are constants of the motion if the system is closed and are non-increasing if the system is open. We also explain how different results in quantum information theory can have non-trivial consequences for the symmetric dynamics of quantum systems.

Why is quantum theory complex?


Philip Goyal (University at Albany)


Complex numbers are an intrinsic part of the mathematical formalism of quantum theory, and are perhaps its most mysterious feature. But what is their physical origin? In this talk, I show how it is possible to trace the complex nature of the quantum formalism directly to the symmetries associated with the basic operations which allow elementary experiments to be combined into more elaborate ones. In particular, I show that, by harnessing these symmetries, the Feynman rules of quantum theory can be derived from the assumption that a pair of real numbers is associated with each sequence of measurement outcomes, and that the probability of this sequence is a real-valued function of this number pair.

The derivation has numerous intriguing implications, such as pointing to a deep connection between the foundations of quantum theory and the foundations of number systems. It also demonstrates that, contrary to the rather prevalent working hypothesis that the structure of the quantum formalism has something essentially to do with nonlocality, the core of the quantum formalism in fact does not depend in any essential way on the properties of space.
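The Feynman rules in pair-of-reals form can be stated concretely: amplitudes compose by complex-style multiplication in series, add for parallel alternatives, and yield probabilities via the squared modulus. A minimal sketch follows; the specific two-path amplitudes are illustrative.

```python
# Feynman rules with amplitudes as pairs of reals (x, y):
#   series composition    = complex-style multiplication,
#   parallel alternatives = pairwise addition,
#   probability           = x^2 + y^2 (squared modulus).

def series(u, v):
    # (x1, y1) * (x2, y2) = (x1*x2 - y1*y2, x1*y2 + y1*x2)
    return (u[0] * v[0] - u[1] * v[1], u[0] * v[1] + u[1] * v[0])

def parallel(u, v):
    return (u[0] + v[0], u[1] + v[1])

def prob(u):
    return u[0] ** 2 + u[1] ** 2

# Two-path interference: equal magnitudes, opposite phases cancel.
path1 = series((2 ** -0.5, 0.0), (0.0, 2 ** -0.5))    # amplitude (0,  1/2)
path2 = series((2 ** -0.5, 0.0), (0.0, -2 ** -0.5))   # amplitude (0, -1/2)

print(prob(parallel(path1, path2)))  # 0.0 (destructive interference)
print(prob(parallel(path1, path1)))  # ~1.0 (constructive interference)
```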

Reference: "Origin of Complex Quantum Amplitudes and Feynman's Rules", Phys. Rev. A 81, 022109 (2010). Full text available at www.philipgoyal.org

Some thoughts about the wave function of the universe


Latham Boyle (Perimeter Institute)

Quantum non-locality: how much does it take to simulate quantum correlations?


Cyril Branciard (School of Physics, University of Queensland)


Quantum correlations cannot be given any classical explanation that would satisfy Bell's local causality assumption. This quite intriguing feature of quantum theory, known as quantum non-locality, has fascinated physicists for years, and has more recently been proven to have interesting applications in quantum information processing.

To properly understand the power of quantum non-locality, it is important to be able to quantify it. One way to do so is to compare it to other "non-local resources", such as classical communication or "non-local Popescu-Rohrlich (PR) boxes", and to try to use these alternative resources to reproduce the quantum correlations. I will review known results on this subject, and present new simulations of multipartite non-local correlations.
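As a toy illustration of how a PR box compares to quantum correlations (this sketch is illustrative, not from the talk): its outputs are locally random but their parity always equals the product of the inputs, so it reaches the algebraic maximum of 4 for the CHSH expression, beyond the quantum maximum of 2*sqrt(2).

```python
import random

# A Popescu-Rohrlich (PR) box: on inputs x, y in {0, 1} it returns outputs
# a, b that are locally random but always satisfy a XOR b = x AND y.
def pr_box(x, y):
    a = random.randint(0, 1)
    b = a ^ (x & y)
    return a, b

def correlator(x, y, shots=2000):
    # E(x, y) = <(-1)^(a+b)>; for a PR box this is exactly (-1)^(x*y)
    total = sum((-1) ** (a ^ b) for a, b in (pr_box(x, y) for _ in range(shots)))
    return total / shots

# CHSH combination E(0,0) + E(0,1) + E(1,0) - E(1,1)
S = correlator(0, 0) + correlator(0, 1) + correlator(1, 0) - correlator(1, 1)
print(S)  # 4.0: the no-signalling (algebraic) maximum
```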

Action Duality: A Constructive Principle for Quantum Foundations


David Miller (Centre for Time, University of Sydney), Huw Price (Centre for Time, University of Sydney) and Ken Wharton (San Jose State University)


An analysis of the path-integral approach to quantum theory motivates the hypothesis that two experiments with the same classical action should have dual ontological descriptions.  If correct, this hypothesis would not only constrain realistic interpretations of quantum theory, but would also act as a constructive principle, allowing any realistic model of one experiment to generate a corresponding model for its action-dual. Two pairs of action-dual experiments will be presented, including one experiment that violates the Bell inequality and yet is action-dual to a single particle. Demanding a consistent, realistic ontology leads to a highly restricted parameter space of possible interpretations.

Quantum limits for measurement of the metric tensor


Tony Downes (School of Physics, University of Queensland)


The geometry of space-time can only be determined by making measurements on physical systems. The ultimate accuracy achievable is then determined by quantum mechanics which fundamentally governs these systems. In this talk I will describe uncertainty principles constraining how well we can estimate the components of a metric tensor describing a gravitational field. I shall outline a number of examples which can be easily constructed with a minimum of mathematical complexity. I will also attempt to derive a general bound on the uncertainty in any attempt to determine the metric tensor which is expected to hold in an arbitrary globally hyperbolic space-time. I shall use tools developed within the algebraic approach to quantum field theory on a classical space-time background. I shall not consider limits on estimating space-time metrics that might arise from a quantisation of gravity itself.