Weakly Abstract: A tale of ambition and tragedy

Once upon a time there was a weekly blog series here at Brissie to Brizzle entitled “Weakly Abstract”. In this series I ranted about whatever paper on the arXiv caught my attention in a given week. Unfortunately, it died.

(Who would have thought that keeping a regular schedule while jetting around the world would be hard?)

Well, I want to know if people would like me to revive this series. I’m the first to acknowledge that the Weakly Abstract had problems. I mean, to begin with, who wants to hear my mostly ill-informed rants? What’s more, coming up with those rants was quite an effort.

So, why bother? Well, a few folk around the place have said that they appreciated the rants. Also, in the last few weeks the arXiv has been particularly HOT!

Incidentally, and I’m fairly certain that these are unrelated events (ha!), the QIP submission date is only a couple of weeks away now…

So people, should the Weakly Abstract make a comeback or should it be relegated to the great Google cache in the sky?


Weakly Abstract: March 10 – March 14

Sorry everyone, I dropped the ball and didn’t manage to get out the Weakly Abstract on time this week. My excuse is kinda lame: yesterday I was giving a talk at the Quantum Information Processing Spring School and, with one thing and another, I didn’t get time to crank out a blog post. What’s doubly lame is that I’m still at the Spring School so this post is gonna be a quick-un.

There’s a whole host of papers that caught my eye in the last week and unfortunately they can’t all be the Weakly Abstract. So, in the finest tradition of the interwebs I’m gonna simply choose a paper by a friend of mine in order to give it a plug. Well, that and this paper fits into the whole Spring Schoolish vibe that I’ve got going on. This week’s Weakly Abstract is Optical Quantum Computing by Jeremy O’Brien:

Optical Quantum Computing
Jeremy L. O’Brien

In 2001 all-optical quantum computing became feasible with the discovery that scalable quantum computing is possible using only single photon sources, linear optical elements, and single photon detectors. Although it was in principle scalable, the massive resource overhead made the scheme practically daunting. However, several simplifications were followed by proof-of-principle demonstrations, and recent approaches based on cluster states or error encoding have dramatically reduced this worrying resource overhead, making an all-optical architecture a serious contender for the ultimate goal of a large-scale quantum computer. Key challenges will be the realization of high-efficiency sources of indistinguishable single photons, low-loss, scalable optical circuits, high efficiency single photon detectors, and low-loss interfacing of these components.

I gather that this is the arXivish version of a review paper that Jeremy published in Science late last year. I don’t have time to get into the details of the paper, but basically it’s a bit of a snapshot of where the field of optics-based quantum computation is at right now. It’s easy to read so, um, get to it I guess.

I’d just like to go on the record and say that I’m very pro this whole quick review paper thing and I wish that people would do it more. So, people, get out there and start summarizing stuff!!

Weakly Abstract: March 3 – March 7

This week’s Weakly Abstract is going to be highly controversial. You see, over the last month or so that I’ve been doing this I’ve followed a pretty tried-and-true pattern of picking either the most, or second most, scited paper on SciRate from any given week.

This week I’m going to dip waaayyyy down into the list (I think it’s currently the 4th most scited this week) to declare "Identifying phases of matter that are universal for quantum computation" by Andrew ("Google Mouth") Doherty and Steve ("Old Man") Bartlett this week’s Weakly Abstract:

Identifying phases of matter that are universal for quantum computation
Andrew C. Doherty, Stephen D. Bartlett

A recent breakthrough in quantum computing has been the realization that quantum computation can proceed solely through single-qubit measurements on an appropriate quantum state – for example, the ground state of an interacting many-body system. It would be unfortunate, however, if the usefulness of a ground state for quantum computation was critically dependent on the details of the system’s Hamiltonian; a much more powerful result would be the existence of a robust ordered phase which is characterized by the ability to perform measurement-based quantum computation (MBQC). To identify such phases, we propose to use nonlocal correlation functions that quantify the fidelity of quantum gates performed between distant qubits. We investigate a simple spin-lattice system based on the cluster-state model for MBQC, and demonstrate that it possesses a zero temperature phase transition between a disordered phase and an ordered "cluster phase" in which it is possible to perform a universal set of quantum gates.

Now why is it that I "hate freedom" so much that I ignore the will of the intertubes and choose a paper with a measly 4 scites over papers like these:

  1. Austin G. Fowler, Ashley M. Stephens, Peter Groszkowski, High threshold universal quantum computation on the surface code. (7 scites)
  2. B. Dierckx, M. Fannes, C. Vandenplas, Additivity of the 2-Renyi entropy for PPT inducing channels. (6 scites)
  3. Grigori G. Amosov, Stefano Mancini, The decreasing property of relative entropy and the strong superadditivity of quantum channels. (5 scites)
  4. M. Cramer, A. Serafini, J. Eisert, Locality of dynamics in general harmonic quantum systems.

Is it because I support Barack Obama and think that Ron Paul is a loon?


Is it because I’m an old drinking buddy of both of the authors?


Is it because I’m just trying to shake things up a bit?

Oddly, no.

It’s because I’ve worked on the problem that Doherty and Bartlett are trying to solve and it is a thoroughly hard problem! You see, there’s this very odd little fact in our field that despite the measurement-based model of quantum computing being roughly 6 (or 7?) years old this year we don’t really have our heads around the problem of which families of states are universal for measurement-based quantum computing.

It sounds like it shouldn’t be that hard a problem, at least before thinking about it for about 30 seconds. In a more Hamiltonian controlish view of quantum computing we have a pretty good idea about which Hamiltonian evolutions, together with local control, are universal for quantum computation. Even when we don’t have complete local control we know how to map the problem to the theory of Lie algebras in order to solve it (modulo some extra conditions).
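
For concreteness, here’s the sort of Lie-algebraic check I have in mind, as a quick numpy sketch. The control set below (local X and Z on two qubits plus an Ising ZZ coupling) is an illustrative choice of mine, not taken from any particular paper: close the generators under commutators and read off the dimension of the algebra you end up with.

```python
import numpy as np
from itertools import combinations

# Pauli matrices
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# A hypothetical control set: local X and Z on each of two qubits,
# plus a fixed Ising ZZ coupling between them.
gens = [np.kron(X, I2), np.kron(Z, I2),
        np.kron(I2, X), np.kron(I2, Z),
        np.kron(Z, Z)]

def rank(mats):
    """Dimension of the linear span of a list of matrices."""
    return np.linalg.matrix_rank(np.array([m.ravel() for m in mats]))

# Close the generators under commutators: the span at the fixed point
# is the dynamical Lie algebra generated by the control Hamiltonians.
alg = [1j * g for g in gens]  # skew-Hermitian generators i*H
grew = True
while grew:
    grew = False
    for a, b in combinations(list(alg), 2):
        c = a @ b - b @ a
        if rank(alg + [c]) > rank(alg):
            alg.append(c)
            grew = True

dim = rank(alg)
print(dim)  # 15, the dimension of su(4)
```

Getting 15, the dimension of su(4), means this particular control set can generate any two-qubit unitary (up to a global phase); a deficient control set would plateau at a smaller subalgebra.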

I guess it could be argued that we also don’t understand which sets of unitary operations are universal for quantum computation either. That’s kinda true; it’s just that in Nature we are rarely given some fixed set of unitary operations without any sort of control over the amount of time for which they are applied. So while it’s an interesting mathematical problem which I would dearly like to solve, it doesn’t really have a whole lot of connection to physics.

Where was I? Oh right, measurement-based quantum computing (MBQC) and identifying states which are universal for quantum computation. So, as it stands we know that 2D cluster states are universal for quantum computing, and we also know that graph states on all regular lattices are universal for measurement-based quantum computing because there is a nice construction that shows which measurements we need to perform to turn a regular-lattice graph state into a cluster state.
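
Incidentally, the elementary step underlying those constructions is easy to check numerically: entangle an unknown qubit with a fresh |+> via a CZ, measure the first qubit in the X basis, and the second qubit is left holding X^m H applied to the input, where m is the measurement outcome. Here’s a quick sketch of that one-bond teleportation (my own toy check, not code from any of the papers):

```python
import numpy as np

# one bond of a cluster state: |psi> (x) |+>, entangled by a CZ
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
CZ = np.diag([1, 1, 1, -1]).astype(complex)

rng = np.random.default_rng(42)
psi = rng.normal(size=2) + 1j * rng.normal(size=2)  # random input state
psi /= np.linalg.norm(psi)

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
minus = np.array([1, -1], dtype=complex) / np.sqrt(2)
joint = CZ @ np.kron(psi, plus)

overlaps = []
for m, proj in enumerate([plus, minus]):
    # measure qubit 1 in the X basis; qubit 2 should be left in X^m H |psi>
    out = proj.conj() @ joint.reshape(2, 2)
    out /= np.linalg.norm(out)
    expected = np.linalg.matrix_power(X, m) @ (H @ psi)
    overlaps.append(abs(np.vdot(expected, out)))  # 1 up to global phase
print([round(o, 6) for o in overlaps])  # [1.0, 1.0]
```

Chaining this step along a line, with the measurement angle rotated away from X, is what gives you arbitrary single-qubit gates in the cluster-state model.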

We also know a fair bit about how the multiparty entanglement in a family of pure states should scale with the number of qubits in the state in order for that family to be used for measurement-based quantum computing (and given that we can’t change what our logical qubits are).

Unfortunately, it seems that when we are given a shiny new family of states with all the right entanglement characteristics we don’t have a good way of identifying how to make logical gates via measurement and local unitary operations. Gross and Eisert had a really good idea (the link is to a longer paper with Schuch and Perez-Garcia as co-authors) a few years back when they realized that you can use a state’s description in terms of Matrix Product States (geez, those things keep cropping up, don’t they?) to work out how to implement quantum gates. The problem with that is actually getting a description of your state in terms of MPS, which is a pretty non-trivial thing to do.

Doherty and Bartlett have in this week’s Weakly Abstract demonstrated a whole class of states that can be used for measurement-based quantum computing while still using the same measurements that allow for MBQC on cluster states. The way that they do this is far from obvious. They define a Hamiltonian that they dub the transverse-field cluster model, which is basically the Hamiltonian that has the cluster state as its unique ground state, with a transverse field thrown in. They demonstrate that as the transverse field strength is varied, the ground state of the system transitions from something that is universal for MBQC to something that isn’t. In short, they demonstrate that such a Hamiltonian undergoes a quantum phase transition between being universal for quantum computation and being something else.

Now, they don’t do this by actually finding out what the ground state is; rather, they show that by studying the properties of certain long-range correlation functions you can work out the fidelity of a gate teleportation through the system. Essentially they are saying, among other things, that it is how these gate fidelities scale that determines your ability to do quantum computation. At least that’s my take on it anyway.

I should point out that I haven’t even begun to scratch the surface of the issues raised in this paper. I don’t fully understand a few parts of the paper and a lot of my confusion is a result of my lack of expertise in condensed matter physics. Yet it seems to me that this paper contains an abundance of compelling techniques. For instance, the way they determine which correlation functions are important for gate teleportation and the mapping between the transverse-field cluster model and the anisotropic quantum orbital compass model are both really interesting things to think about.

Weakly Abstract: February 25 – February 29

Checking out SciRate there’s been a fair bit of good quant-ph action over the last week. For instance you might want to check out the following papers:

  1. Guillaume Aubrun, A remark on the paper “Randomizing quantum states: Constructions and applications”. (10 scites)
  2. Huw Price, Toy Models for Retrocausality. (7 scites)
  3. Keisuke Fujii, Katsuji Yamamoto, Fault-tolerant quantum computation in concatenation of verified cluster states. (5 scites)
  4. Shesha Raghunathan, Todd Brun, Continuous monitoring can improve indistinguishability of a single-photon source. (5 scites)
  5. Paulo E. M. F. Mendonca, Alexei Gilchrist, Andrew C. Doherty, Optimal tracking for single qubit states. (5 scites)

Unfortunately the world is cruel and the law of the land clearly states that there can only ever be one Weakly Abstract (that is until I decide to do things differently of course). The paper that caught my eye this week was The computational difficulty of finding MPS ground states by Schuch, Cirac, and Verstraete:

The computational difficulty of finding MPS ground states
Authors: Norbert Schuch, Ignacio Cirac, Frank Verstraete

We determine the computational difficulty of finding ground states of one-dimensional (1D) Hamiltonians which are known to be Matrix Product States (MPS). Therefore, we construct a class of 1D frustration free Hamiltonians with unique MPS ground states and a polynomial gap above, for which finding the ground state is at least as hard as factoring. By lifting the requirement of a unique ground state, we obtain a class for which finding the ground state solves an NP-complete problem. Therefore, for these Hamiltonians it is not even possible to certify that the ground state has been found. Our results thus imply that in order to prove convergence of variational methods over MPS, as the Density Matrix Renormalization Group, one has to put more requirements than just MPS ground states and a polynomial spectral gap.


Weakly Abstract: February 18 – February 22

Another Monday, another Weakly Abstract. This week’s Weakly Abstract is "Creation, manipulation, and detection of anyons in optical lattices" by Aguado et al:

Creation, manipulation, and detection of anyons in optical lattices
Authors: M. Aguado, G. K. Brennen, F. Verstraete, J. I. Cirac

Anyons are particle-like excitations of strongly correlated phases of matter with very exotic physical properties. Unlike bosons and fermions, they have so-called fractional statistics, which are characterized by non-trivial changes in the quantum wavefunction when two of them interchange their positions. Those changes can, in turn, be used to perform quantum computations [A. Yu. Kitaev, Annals Phys. 303, 2 (2003), arXiv:quant-ph/9707021v1], something which has renewed the interest in the investigation of physical systems where anyons may be present, manipulated and detected. In this work we show how this can be accomplished in the context of optical lattices. Our proposal just requires one (or several) ancilla particle(s) which can: (i) undergo single particle gates; (ii) be moved close to each constituents of the lattice and undergo a simple quantum gate; (iii) be detected. Recent experimental progress with atoms in optical lattices makes our proposal feasible with present technology.

Aguado et al propose a method for generating and controlling anyonic excitations in optical lattice systems via coupling atoms in the lattice to well-controlled ancilla particles. The amount of stuff that I don’t know about topological quantum computing could fill a few textbooks, so if anyone would like to elaborate further on the results of this paper they should feel free to do so in the comments section.

I think that Gavin Brennen (the second author) talked about these results at QEC07 but I can’t verify this as his talk doesn’t seem to be available on the QEC website (btw, there are some fantastic talks available there). Does anyone else remember Gavin’s talk?

Weakly Abstract: February 11 – February 15

It’s time for the second installment of the Weakly Abstract and I can already tell that this is going to be a hell of a job to get through every week. The last week saw some fantastic papers hit the quant-ph arXiv, here’s the Weakly Abstract shortlist:

  1. Stephen Jordan and Edward Farhi, Perturbative Gadgets at Arbitrary Orders.
  2. Julia Kempe, Oded Regev, Falk Unger, and Ronald de Wolf, Upper Bounds on the Noise Threshold for Fault-tolerant Quantum Computing.
  3. Ronald de Wolf, A note on quantum algorithms and the minimal degree of ε-error polynomials for symmetric functions.

The world however is a harsh and cruel place so there can only be one Weakly Abstract per week, and because nepotism is such a powerful force (and also because this paper got the most scites on SciRate this week) I’ve chosen "Random Quantum Circuits are Approximate 2-designs" by Richard Low and Aram Harrow:

Random Quantum Circuits are Approximate 2-designs
Authors: Aram Harrow, Richard Low

Given a universal gate set on two qubits, it is well known that applying random gates from the set to random pairs of qubits will eventually yield an approximately Haar-distributed unitary. However, this requires exponential time. We show that random circuits of only polynomial length will approximate the first and second moments of the Haar distribution, thus forming approximate 1- and 2-designs. Previous constructions required longer circuits and worked only for specific gate sets. As a corollary of our main result, we also improve previous bounds on the convergence rate of random walks on the Clifford group.

This paper is Richard (Rich) Low’s arXiv debut, which is a massive achievement seeing as he has the misfortune of sharing an office with me. Now, because of this I get to try something that I’ve never really done before: I’m going to do an interview with Rich about the paper…

mick: Hi Rich.

Rich: Hello.

mick: Your paper is really long. Let’s say, hypothetically, that I couldn’t be assed reading the whole thing, and possibly couldn’t even be bothered to read my way through the abstract. What would you say is the take home message of your paper?

Rich: The title. Presumably you’ve read that.

mick: OK then, what is a quantum 2-design?

Rich: A state 2-design is an ensemble of states that looks uniformly random when you are only given two copies of the state.

mick: What is a unitary 2-design?

Rich: It’s an ensemble of unitaries that looks uniformly random given two copies of a unitary from the ensemble.

mick: OK, so what did you do again?

Rich: We showed that random circuits constructed from a 2-qubit universal gate set give approximate unitary 2-designs efficiently.

mick: So that really took 36 pages? I guess that at least this paper is shorter than Toby’s paper last week. So should we end this now and just drink our beer?

Rich: Yes.
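
For anyone who’d like to see an exact 2-design in the wild: the single-qubit Clifford group is the standard example, and you can check the property numerically with the frame potential, the average of |Tr(U†V)|^4 over pairs drawn from the ensemble, which equals 2 for a unitary 2-design in dimension 2. The sketch below is just my illustration of the definitions from the interview, not anything from Rich and Aram’s paper:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
S = np.array([[1, 0], [0, 1j]], dtype=complex)
I2 = np.eye(2, dtype=complex)

# multiply out the group generated by H and S (global phases included)
def key(U):
    return tuple(np.round(U, 6).ravel())

group = {key(I2): I2}
frontier = [I2]
while frontier:
    nxt = []
    for U in frontier:
        for g in (H, S):
            V = g @ U
            if key(V) not in group:
                group[key(V)] = V
                nxt.append(V)
    frontier = nxt

mats = np.array(list(group.values()))             # shape (192, 2, 2)
# frame potential F_2 = average |Tr(U^dag V)|^4 over all ordered pairs;
# it equals 2 exactly for a unitary 2-design in dimension 2
tr = np.einsum('aij,bij->ab', mats.conj(), mats)  # Tr(U^dag V)
F2 = np.mean(np.abs(tr) ** 4)
print(len(mats), round(float(F2), 6))  # 192 2.0
```

The 192 elements are the 24 single-qubit Cliffords times 8 global phases; the phases don’t affect the frame potential. A generic finite set of unitaries gives a value strictly above 2, which is what makes the polynomial-length random-circuit result non-trivial.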

Weakly Abstract: February 4 – February 8

An idea that I’ve been toying with for a while now is to write a weekly post on my favourite quant-ph arXiv paper which has been released in the last week. I’ve decided after literally minutes of consultation with my office mates that this new regular post should be called "Weakly Abstract".

I expect that the Weakly Abstract will appear on Mondays, though this first week is going to be an exception because I traveled a couple of thousand kilometers yesterday.

I’m not really sure what format the "Weakly Abstract" is going to take. I plan to experiment with a few different styles over the coming weeks. Ideally I’d just like this post to be a bit of an open forum for people to discuss the offerings from the previous week or so on the quant-ph arXiv. I’ll probably get the discussion started by pointing to my favourite paper, or the most scited paper on SciRate each week and then people can take it from there.

I doubt that I’ll always take this super-seriously and I’d like it if people keep the discussion as light and humorous as possible. Be polite, but don’t worry too much about keeping the tone professional; I want this thread to be relatively accessible to all but, of course, if you want to throw down some serious math, be my guest. In short, a chatty and jokey atmosphere is highly encouraged.

Oh, and anyone who has suggestions for papers they’d like discussed should feel free to email me or to let me know in the comments.

So, here goes…

For the first Weakly Abstract I’d like you all to point your browsers towards "Hamiltonian Quantum Cellular Automata in 1D" by Daniel Nagaj and Pawel Wocjan (arXiv:0802.0886):

Hamiltonian Quantum Cellular Automata in 1D
Authors: Daniel Nagaj, Pawel Wocjan
(Submitted on 6 Feb 2008)

We construct a simple translationally invariant, nearest-neighbor Hamiltonian on a chain of 10-dimensional qudits that makes it possible to realize universal quantum computing without any external control during the computational process. We only require the ability to prepare an initial computational basis state which encodes both the quantum circuit and its input. The computational process is then carried out by the autonomous Hamiltonian time evolution. After a time polynomially long in the size of the quantum circuit has passed, the result of the computation is obtained with high probability by measuring a few qudits in the computational basis.

This result also implies that there cannot exist efficient classical simulation methods for generic translationally invariant nearest-neighbor Hamiltonians on qudit chains, unless quantum computers can be efficiently simulated by classical computers (or, put in complexity theoretic terms, unless BPP=BQP).

I really like this paper. It is well written and, with 11 scites, Nagaj and Wocjan’s paper is currently the most scited paper of the last week on SciRate.

The key point of this paper is that the authors constructively prove that it is possible to perform efficient quantum computations by time evolving relatively simple, separable, low-dimensional initial states by a fixed time-independent 1D nearest-neighbour translationally invariant Hamiltonian and measuring in a fixed basis. As is pointed out in the second paragraph of the abstract, this result sets up a compelling dichotomy – either BPP=BQP (that is, classical computers can simulate quantum computers) or the folklore belief in condensed matter physics that the time evolution of all low-dimensional 1D translationally invariant nearest-neighbour systems can be simulated classically is wrong.

A similar, yet more complex, construction that proves basically the same thing was presented in an earlier paper by Vollbrecht and Cirac (arXiv:0704.3432v2), which I also highly recommend.

In Nagaj and Wocjan’s constructions all of the control required for the computation is programmed into the initial state of the computer. In a sense, all of the description complexity of any "circuit" that is implemented is pushed into the initial state of the system. The compelling point is that this can always be done for efficient quantum algorithms with a separable input that is in a computational basis state. All of the superposition and entanglement that is normally associated with the "power" of a quantum computation is performed via the time evolution by the one, simple, time-independent entangling Hamiltonian (and for efficient algorithms the amount of time you have to wait for an answer isn’t very long).
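
Nagaj and Wocjan’s actual construction is far more elaborate (translationally invariant, 10-dimensional qudits), but the flavour of computation by autonomous time evolution already shows up in the simplest Feynman-style clock Hamiltonian. Here’s a toy version of my own, a one-gate "program", meant only to illustrate the idea:

```python
import numpy as np

rng = np.random.default_rng(7)
# a random single-qubit gate U (unitary via QR decomposition)
A = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
U, _ = np.linalg.qr(A)

# Feynman-style clock Hamiltonian for a one-gate program:
# H = U (x) |1><0|_clock + U^dag (x) |0><1|_clock
raise_c = np.array([[0, 0], [1, 0]], dtype=complex)  # |1><0|
Ham = np.kron(U, raise_c) + np.kron(U.conj().T, raise_c.conj().T)

psi = rng.normal(size=2) + 1j * rng.normal(size=2)   # input state
psi /= np.linalg.norm(psi)
init = np.kron(psi, np.array([1, 0], dtype=complex))  # data (x) clock |0>

# evolve autonomously for t = pi/2 under the fixed Hamiltonian
w, V = np.linalg.eigh(Ham)
evo = V @ np.diag(np.exp(-1j * w * np.pi / 2)) @ V.conj().T
final = evo @ init

data = final.reshape(2, 2)[:, 1]       # component with the clock in |1>
overlap = abs(np.vdot(U @ psi, data))  # data register holds U|psi> (phase aside)
print(round(float(overlap), 6))  # 1.0
```

Because this particular H squares to the identity, the evolution is a simple rotation: after a fixed time the clock has ticked and the data register holds U|psi>, with no external control applied along the way. The hard part, which is the content of the paper, is making such a Hamiltonian translationally invariant and circuit-independent.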

This result seemed really surprising to me at first, but I guess in a lot of ways it isn’t really that dissimilar to the 1-way model of quantum computing. In the 1-way model all of the quantumish part of a computation is thrown into the initial state (normally thought of as a 2D cluster state). The circuit complexity is pushed into the measurement stage of the computation (albeit with a bit of feed-forward control).

In Nagaj and Wocjan’s model, by contrast, the complexity of the circuit sits mostly in the initial state. There is, of course, the tricky point of having to wait long enough for the computation to give the correct result, and this waiting is essentially something that can’t be sped up.

In summary, I think this paper is tops because it gives us a real target for resolving the problem of quantum versus classical computation. If we can demonstrate a method for simulating the time evolution of the Hamiltonians involved in Nagaj and Wocjan’s constructions, then BQP=BPP (and I’m out of a job). Though it should be noted that a recent paper by Schuch et al (arXiv:0801.2078v1) suggests that, given current simulation methods in computational physics, this is very unlikely.

These results, to me, give one of the more compelling arguments for pushing ahead with the development of quantum computers. While we now know that simulating the static properties of generic quantum systems (like finding the ground state) is really hard both classically and quantumly, all of this recent work suggests that when it comes to simulating dynamics we may be able to use quantum computers to do a lot more than we ever could with classical devices.