Geometric quantum computation using nuclear magnetic resonance

Nature 403:6772 (2000) 869-871

Authors:

JA Jones, V Vedral, A Ekert, G Castagnoli

Abstract:

A significant development in computing has been the discovery that the computational power of quantum computers exceeds that of Turing machines. Central to the experimental realization of quantum information processing is the construction of fault-tolerant quantum logic gates. Their operation requires conditional quantum dynamics, in which one sub-system undergoes a coherent evolution that depends on the quantum state of another sub-system; in particular, the evolving sub-system may acquire a conditional phase shift. Although conventionally dynamic in origin, phase shifts can also be geometric. Conditional geometric (or 'Berry') phases depend only on the geometry of the path executed, and are therefore resilient to certain types of errors; this suggests the possibility of an intrinsically fault-tolerant way of performing quantum gate operations. Nuclear magnetic resonance techniques have already been used to demonstrate both simple quantum information processing and geometric phase shifts. Here we combine these ideas by performing a nuclear magnetic resonance experiment in which a conditional Berry phase is implemented, demonstrating a controlled phase shift gate.
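The controlled phase shift gate demonstrated in this paper leaves the two-qubit basis states |00>, |01> and |10> unchanged and multiplies |11> by a phase factor e^{i phi}. As a minimal numerical sketch of that gate's action (pure Python, illustrating only the abstract unitary, not the NMR implementation or the geometric origin of the phase):

```python
import cmath

def controlled_phase(phi):
    """4x4 unitary of the controlled phase gate: only |11> picks up e^{i phi}."""
    gate = [[1 if i == j else 0 for j in range(4)] for i in range(4)]
    gate[3][3] = cmath.exp(1j * phi)
    return gate

def apply(gate, state):
    """Matrix-vector product on a 4-amplitude state |00>,|01>,|10>,|11>."""
    return [sum(gate[i][j] * state[j] for j in range(4)) for i in range(4)]

# Control qubit in |1>, target in (|0> + |1>)/sqrt(2): only the |11>
# amplitude acquires the conditional phase.
state = [0, 0, 1 / 2**0.5, 1 / 2**0.5]
out = apply(controlled_phase(cmath.pi / 2), state)
```

Here the |10> amplitude is unchanged while the |11> amplitude is rotated by i, which is exactly the conditional phase shift the experiment realizes geometrically.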

Entangling atoms and ions in dissipative environments

Journal of Modern Optics 47:14-15 (2000) 2583-2598

Authors:

A Beige, S Bose, D Braun, SF Huelga, PL Knight, MB Plenio, V Vedral

Abstract:

Quantum information processing rests on our ability to manipulate quantum superpositions through coherent unitary transformations, and to establish entanglement between constituent quantum components of the processor. The quantum information processor (a linear ion trap, or a cavity confining the radiation field for example) exists in a dissipative environment. We discuss ways in which entanglement can be established within such dissipative environments. We can even make use of a strong interaction of the system with its environment to produce entanglement in a controlled way. © 2000 Taylor & Francis Group, LLC.

Landauer's erasure, error correction and entanglement

Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences 456:1996 (2000) 969-984

Abstract:

Classical and quantum error correction are presented in the form of Maxwell's demon and their efficiency analysed from the thermodynamic point of view. We explain how Landauer's principle of information erasure applies to both cases. By then extending this principle to entanglement manipulations we rederive upper bounds on purification procedures, thereby linking the 'no local increase of entanglement' principle to the second law of thermodynamics. © 2000 The Royal Society.
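Landauer's principle, central to the thermodynamic analysis above, states that erasing one bit of information dissipates at least k_B T ln 2 of heat. A quick illustrative calculation of that minimum cost (the 300 K room temperature is an assumed example value, not taken from the paper):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # assumed room temperature, K

# Minimum heat dissipated on erasing a single bit (Landauer's principle).
erasure_cost = k_B * T * math.log(2)
print(f"{erasure_cost:.3e} J")  # roughly 2.9e-21 J per bit
```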

Mixed state dense coding and its relation to entanglement measures

Journal of Modern Optics 47:2-3 (2000) 291-310

Authors:

S Bose, MB Plenio, V Vedral

Abstract:

Ideal dense coding protocols allow one to use prior maximal entanglement to send two bits of classical information by the physical transfer of a single encoded qubit. We investigate the case when the prior entanglement is not maximal and the initial state of the entangled pair of qubits being used for the dense coding is a mixed state. We find upper and lower bounds on the dense coding capacity in terms of various measures of entanglement. Our results can also be reinterpreted as giving bounds on purification procedures in terms of dense coding capacities. © 2000 Taylor & Francis Group, LLC.
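In the ideal (maximally entangled) protocol referred to above, Alice holds one qubit of a Bell pair and applies one of the four Pauli operations I, X, Z, XZ to encode two classical bits; the four resulting states are orthogonal Bell states, so Bob can distinguish them perfectly after receiving her qubit. A pure-Python sketch verifying that orthogonality for the ideal case (the mixed-state analysis of the paper is not reproduced here):

```python
import itertools

# Bell pair (|00> + |11>)/sqrt(2); amplitudes ordered |00>,|01>,|10>,|11>.
s = 1 / 2**0.5
bell = [s, 0, 0, s]

def apply_first(single, state):
    """Apply a 2x2 operation to the first (Alice's) qubit of a 2-qubit state."""
    out = [0j] * 4
    for a in range(2):
        for b in range(2):
            for ap in range(2):
                out[2 * a + b] += single[a][ap] * state[2 * ap + b]
    return out

I = [[1, 0], [0, 1]]
X = [[0, 1], [1, 0]]
Z = [[1, 0], [0, -1]]
XZ = [[0, -1], [1, 0]]  # X applied after Z

# The four encodings of the two classical bits.
encoded = [apply_first(P, bell) for P in (I, X, Z, XZ)]

def inner(u, v):
    return sum(x.conjugate() * y for x, y in zip(u, v))

# All four encoded states are mutually orthogonal, so a Bell-basis
# measurement on both qubits recovers the two encoded bits exactly.
orthogonal = all(abs(inner(u, v)) < 1e-12
                 for u, v in itertools.combinations(encoded, 2))
```

With a non-maximally entangled or mixed initial state the encoded states are no longer perfectly distinguishable, which is what the entanglement-measure bounds of the paper quantify.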

Mixedness and teleportation

Physical Review A - Atomic, Molecular, and Optical Physics 61:4 (2000) 040101

Authors:

S Bose, V Vedral

Abstract:

We show that on exceeding a certain degree of mixedness (as quantified by the von Neumann entropy), entangled states become useless for teleportation. By increasing the dimension of the entangled systems, this entropy threshold can be made arbitrarily close to maximal. This entropy is found to exceed the entropy threshold sufficient to ensure the failure of dense coding.
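The degree of mixedness invoked here is the von Neumann entropy S(rho) = -Tr(rho log2 rho). As a small numerical sketch, the entropy can be computed in closed form for the two-qubit Werner family rho = p |Phi+><Phi+| + (1 - p) I/4, chosen purely as an illustrative one-parameter family interpolating between a pure Bell state and the maximally mixed state (the paper's threshold statements concern general entangled states):

```python
import math

def werner_entropy(p):
    """von Neumann entropy (in bits) of rho = p |Phi+><Phi+| + (1 - p) I/4.
    Its eigenvalues are p + (1 - p)/4 (once) and (1 - p)/4 (three times)."""
    evals = [p + (1 - p) / 4] + [(1 - p) / 4] * 3
    return -sum(x * math.log2(x) for x in evals if x > 0)

# Entropy rises from 0 (pure Bell state, p = 1) to the maximum of 2 bits
# (maximally mixed state, p = 0) as the state becomes more mixed.
for p in (1.0, 0.5, 0.0):
    print(p, werner_entropy(p))
```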