2016 SQuInT Workshop

A Review


Been a while, hasn’t it? These past couple of months have found me busy with research and whatnot, hence a lack of writing.

Last week, I had the opportunity to attend the “Southwest Quantum Information and Technology” (SQuInT) workshop, hosted by UNM’s Center for Quantum Information and Control. This year, #squint2016 was held in downtown Albuquerque, so no “travel”, per se, for me, though the opportunity to go home each night and recharge was nice.

SQuInT is a three-day event, bringing together experimentalists and theorists in quantum information. The workshop organizers lined up a lot of great presentations and posters this year. Below, I discuss some of the talks I found most helpful.

Day One

Loophole-Free Bell Tests

The first two sessions of SQuInT focussed on so-called “loophole-free Bell tests”. In 1964, John Bell showed that measurements on entangled quantum states can exhibit correlations which exceed anything achievable classically. (More specifically, for a particular CHSH-type correlation \(C\), the classical value is bounded by \(2\), while quantum mechanics allows values up to \(2\sqrt{2}\).) As such, there has been a lot of interest in performing experiments to measure this correlation and see whether it exceeds \(2\), and if so, by how much.
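To make that gap concrete, here is a small numerical sketch (my own illustration, not from any of the talks) that computes the CHSH correlation for the two-qubit singlet state at the standard optimal measurement angles:

```python
import numpy as np

# Pauli operators; a spin measurement along angle theta in the X-Z plane
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def meas(theta):
    return np.cos(theta) * Z + np.sin(theta) * X

# The singlet state (|01> - |10>)/sqrt(2)
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def corr(a, b):
    """Correlation <psi| A(a) (x) B(b) |psi> for measurement angles a, b."""
    return np.real(psi.conj() @ np.kron(meas(a), meas(b)) @ psi)

# CHSH combination at angles that maximize the quantum value
a0, a1 = 0.0, np.pi / 2
b0, b1 = np.pi / 4, -np.pi / 4
C = corr(a0, b0) + corr(a0, b1) + corr(a1, b0) - corr(a1, b1)

print(abs(C))  # 2.828... = 2*sqrt(2), above the classical bound of 2
```

Any assignment of pre-existing ±1 outcomes to the four measurement settings caps the same combination at 2, which is exactly what the experiments test.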

One problem, though, is that poorly-designed experiments might fool us into thinking we had observed a genuinely quantum effect when, in fact, the effect was purely classical; hence the need for “loophole-free Bell tests”. Several researchers, including Ronald Hanson (QuTech, Delft), Morgan Mitchell (ICFO), Marissa Giustina (Vienna), and Lynden Shalm (NIST, Boulder), discussed recent experiments in which loophole-free Bell tests were performed, and showed that these experiments observed a correlation which exceeds the classical bound. As such, we now have experimental confirmation that one particular view of quantum physics - the “local realist” view - has been ruled out. (For more information regarding local realism, see here.)

One really helpful talk during these sessions came from Scott Glancy, who discussed the data analysis needed to show that the loophole-free Bell tests were statistically rigorous. (Because the experimentalists report \(p\)-values for their measured correlations — asking “What is the probability of seeing this correlation simply by random chance if the local realist model were correct?” — it is very important to compute those \(p\)-values correctly, and to avoid \(p\)-value hacking.)
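As a toy illustration of the statistics involved (my own sketch, not Glancy’s actual analysis, which is considerably more careful): the CHSH inequality can be recast as a game that any local-realist strategy wins with probability at most 0.75, while the best quantum strategy wins with probability \(\cos^2(\pi/8) \approx 0.85\). Hoeffding’s inequality then gives a simple, conservative \(p\)-value against the local-realist null hypothesis:

```python
import math
import random

def hoeffding_p_value(wins, trials, null_win_prob=0.75):
    """Upper bound on P(seeing >= wins) if the local-realist null were true."""
    p_hat = wins / trials
    if p_hat <= null_win_prob:
        return 1.0  # no evidence against the null at all
    return math.exp(-2 * trials * (p_hat - null_win_prob) ** 2)

# Simulate n trials at the quantum win probability cos^2(pi/8)
random.seed(0)
q_win = math.cos(math.pi / 8) ** 2  # ~0.8536
n = 10_000
k = sum(random.random() < q_win for _ in range(n))

print(hoeffding_p_value(k, n))  # vanishingly small: null strongly rejected
```

The real analyses must also handle trial-to-trial memory effects and imperfect random setting choices, which is precisely why the talk was needed.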

Semiconductor QIP and Tomography

The second part of the day focussed on semiconducting systems and their tomography. Seth Merkel (HRL) gave a talk on how currently-used characterization techniques – such as randomized benchmarking (RB) – may fail rather dramatically if the assumptions behind the protocol are violated. In particular, RB assumes that non-Markovian (i.e., “memoryful”) noise is not present in the system. If non-Markovian noise is present, then the estimates RB produces for quantities such as gate fidelity might be very far off from the true values. This would necessitate moving away from single-number characterizations of quantum gates, and suggests a need for more advanced characterization techniques.
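For context, here is a minimal sketch of the standard RB fit (my own toy example, with made-up numbers): average sequence fidelity is modeled as \(Ap^{m} + B\) in the sequence length \(m\), and the decay parameter \(p\) is what gets converted into a gate fidelity. It is this single number that, as Seth pointed out, can be misleading when the Markovian assumption fails.

```python
import numpy as np

# Generate ideal Markovian RB decay data, then recover p with a log-linear
# least-squares fit.  For simplicity we assume the asymptote B = 0.5 is
# known, as for single-qubit RB with depolarizing noise.
rng = np.random.default_rng(1)
p_true, A, B = 0.99, 0.5, 0.5
lengths = np.arange(1, 201, 10)
fidelity = A * p_true ** lengths + B + rng.normal(0, 1e-4, lengths.size)

# log(F - B) = log(A) + m * log(p), so a degree-1 polyfit recovers log(p)
slope, intercept = np.polyfit(lengths, np.log(fidelity - B), 1)
p_est = np.exp(slope)

# For single-qubit depolarizing noise, average gate fidelity = p + (1 - p)/2
print(p_est, p_est + (1 - p_est) / 2)  # p_est ~ 0.99, fidelity ~ 0.995
```

With non-Markovian noise the decay is no longer a single exponential, and fitting this model anyway can produce a confidently wrong \(p\).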

CQuIC’s own Charlie Baldwin discussed work he has been doing on quantum state tomography. In particular, Charlie showed how it is possible to define a certain positive operator-valued measure (POVM), called a “rank-\(r\) strictly complete POVM”, such that noiseless measurement data from that POVM uniquely identifies a rank-\(r\) state within the set of all positive states. (For a paper on the subject, see here.) Such POVMs are useful because they provide this strong guarantee of uniqueness. (For other POVMs, there may be several states consistent with the noiseless data, which makes it hard to estimate the state uniquely.)
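To illustrate the flavor of a uniqueness guarantee (this is not Charlie’s construction, just the textbook single-qubit case): measuring the three Pauli expectation values is informationally complete, so noiseless data picks out the state uniquely via the Bloch decomposition.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def reconstruct(rho):
    """Recover rho from its (noiseless) Pauli expectation values, using
    rho = (I + <X> X + <Y> Y + <Z> Z) / 2."""
    exps = [np.real(np.trace(rho @ P)) for P in (X, Y, Z)]
    return (I2 + sum(e * P for e, P in zip(exps, (X, Y, Z)))) / 2

# Try it on the pure state |+i> = (|0> + i|1>)/sqrt(2)
psi = np.array([1, 1j], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

print(np.allclose(reconstruct(rho), rho))  # True
```

Strictly complete POVMs aim for a similar uniqueness guarantee with far fewer measurements, by exploiting the assumption that the state has low rank.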

Poster Session

Due to the limited time available for oral presentations, most conferences have some time set aside for poster presentations. SQuInT this year had several interesting posters; though sadly, I did not get a chance to look at them all!

Day Two

Neutral Atoms

We started off Day Two with a session on neutral atoms. There were two talks on the subject. The first, from Cindy Regal (JILA), covered how one can interfere and entangle neutral atoms. (Since the atoms are neutral, it’s hard to make them interact with each other using, say, the Coulomb force, which is how trapped ions are made to interact.) Another challenge with neutral atoms is simply manipulating them - having no net charge, they do not respond to applied electric fields the way ions do. Several researchers, including Cindy, have shown how so-called “optical tweezers” - tightly focused laser beams that attract an atom through its induced dipole moment - can be used to move neutral atoms around. In addition, by building “optical lattices”, researchers can trap neutral atoms in a lattice-type structure and, thanks to the tunable nature of the lattice, simulate different interactions and Hamiltonians.

Adam Kaufman (Harvard) discussed how we could use neutral atoms to study the thermalization of isolated systems. Recall from thermodynamics that we tend to think of systems as being coupled to external baths; thermalization is the process by which some properties of the system equilibrate to those of the bath. However, this raises a pesky problem - how does the bath itself thermalize? If we continue our earlier line of reasoning, we conclude the bath must be in contact with yet another bath. Extrapolating this logic, we immediately hit an issue - the largest possible bath is our entire universe, so how does that thermalize? Is there some external bath outside it? We don’t know. Hence the need to study thermalization, and to investigate whether we can make sense of it within quantum mechanics itself.

Adam showed how some experiments with neutral atoms can measure certain quantum properties of the system (such as the entanglement entropy), and suggested that these quantum quantities may be related to a (macroscopically) thermodynamic counterpart.
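For the curious, the entanglement entropy in question is just the von Neumann entropy of a subsystem’s reduced state. A small sketch (my own, not from the talk) for two qubits:

```python
import numpy as np

def entanglement_entropy(psi, dim_a, dim_b):
    """Von Neumann entropy of subsystem A for a pure state psi on A x B."""
    # Schmidt coefficients come from the SVD of the coefficient matrix
    coeffs = psi.reshape(dim_a, dim_b)
    schmidt = np.linalg.svd(coeffs, compute_uv=False)
    probs = schmidt[schmidt > 1e-12] ** 2
    return -np.sum(probs * np.log(probs))

# Product state |00>: no entanglement
product = np.array([1, 0, 0, 0], dtype=complex)
# Bell state (|00> + |11>)/sqrt(2): maximal entanglement
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

print(entanglement_entropy(product, 2, 2))  # zero
print(entanglement_entropy(bell, 2, 2))     # ln 2 ~ 0.693
```

The suggestion is that, for an isolated quantum system, this entropy of a small subsystem can play the role the thermal entropy plays in ordinary thermodynamics.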

Computer Science

There have been many advances in quantum algorithms in recent years, and at SQuInT, we were fortunate to hear about some of them. Ashley Montanaro (University of Bristol) discussed quantum algorithms for two problems - solving optimization problems via “backtracking”, and speeding up Monte Carlo methods. In both cases, the quantum algorithm offers a speedup over its classical counterpart.

The backtracking algorithm is an optimization technique which quasi-exhaustively explores the set of possible solutions, but does so in a smart way. In particular, backtracking follows a particular choice of values for the optimization variables until it reaches a partial assignment that cannot be extended to a solution. (For instance, in satisfiability problems, backtracking walks down the tree of variable assignments until some clause can no longer be satisfied.) Once it hits such a dead end, it backtracks up the tree and chooses a different path to follow.
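A minimal classical backtracking solver for satisfiability (my own sketch of the kind of algorithm Ashley’s quantum walk accelerates, not code from the talk) looks something like this:

```python
# A clause is a list of literals: positive/negative 1-indexed variables.

def backtrack(clauses, n_vars, assignment=()):
    """Depth-first search over partial assignments, pruning dead branches."""
    # Prune: if any clause has every literal assigned and false, backtrack.
    for clause in clauses:
        if all(abs(lit) <= len(assignment) and
               assignment[abs(lit) - 1] != (lit > 0) for lit in clause):
            return None
    if len(assignment) == n_vars:   # all variables set, nothing falsified
        return assignment
    for value in (True, False):     # branch on the next unset variable
        result = backtrack(clauses, n_vars, assignment + (value,))
        if result is not None:
            return result
    return None                     # both branches failed: back up the tree

# (x1 or not x2) and (x2 or x3) and (not x1 or not x3)
clauses = [[1, -2], [2, 3], [-1, -3]]
print(backtrack(clauses, 3))  # (True, True, False) satisfies all clauses
```

Montanaro’s result gives a quadratic quantum speedup over the size of the tree this search actually explores, not over brute-force enumeration.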

Monte Carlo techniques are used when we wish to approximate quantities by numerical averages. (For instance, Monte Carlo integration computes an integral as a discrete sum, or computes the expected value of a random variable by drawing samples and returning their numerical average.) Ashley showed that it is possible to construct a quantum algorithm which speeds up Monte Carlo by a quadratic factor (i.e., the runtime goes from \(X\) to \(\sqrt{X}\)).
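Concretely, a classical Monte Carlo estimate converges like \(1/\sqrt{n}\), so reaching precision \(\epsilon\) costs roughly \(1/\epsilon^{2}\) samples, while the quantum algorithm (built on amplitude estimation) needs only about \(1/\epsilon\). Here is the classical baseline as a sketch (my own example):

```python
import math
import random

def mc_estimate(f, n_samples, seed=0):
    """Estimate the integral of f over [0, 1] by uniform sampling."""
    rng = random.Random(seed)
    return sum(f(rng.random()) for _ in range(n_samples)) / n_samples

# True value: integral of sin(pi x) over [0, 1] is 2/pi ~ 0.6366
estimate = mc_estimate(lambda x: math.sin(math.pi * x), 100_000)
print(estimate)
```

With \(10^{5}\) samples the error here is around \(10^{-3}\); the quantum version would reach comparable accuracy with roughly the square root as many uses of \(f\).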

Parallel Sessions

Day Two brought with it the parallel sessions, where talks are presented in different rooms simultaneously. Parallel sessions increase the number of available speaking slots, at the expense of forcing the audience to choose which talks to attend. (The APS March Meeting uses parallel sessions extensively - after all, almost 10,000 people attend that conference!) I was fortunate enough to speak during one of these sessions; you can check out my talk on YouTube, or visit this page for the slides.

Day Three

The last day of SQuInT! Even though Day Three was on a Saturday, talks still took place from 0830 until 1815. (There’s always science to be done/talk about!)

Quantum Computation

The day got started with a talk from Seth Lloyd (MIT), who discussed a quantum algorithm for computing the topology of datasets. Seth’s research interests include quantum machine learning and algorithms, and he reviewed some of the relevant research which has been done on those subjects.

One potentially useful thing to do with very large datasets is to compute their topology. For instance, on a social network graph, it can be useful to identify clusters of people, or to see whether there are holes in the network. For very high-dimensional datasets, computing the topology can be difficult. Thankfully, there are classical algorithms to do so, which use techniques from algebraic topology. Essentially, these algorithms work by formalizing the notion of “shapes that persist” in the data across various length scales. (Formally speaking, they look for “persistent homology”.) The best classical algorithms require a runtime \(\mathcal{O}(2^{2N})\); Seth showed how a quantum algorithm could bring that down to \(\mathcal{O}(N^{3})\).

Experimental Superconducting Qubits

As mentioned, SQuInT brings together both theorists and experimentalists. One really neat session was on progress in building superconducting qubit devices.

Robert Schoelkopf (Yale) showed us how it is possible to use superconducting qubits and microwave cavities to store quantum information for a “long time”. (“Long” in this context can be a microsecond!) The key idea was to encode the quantum information in the state of the field in the cavity (using a Schrödinger-cat coherent state), and to use the superconducting qubit as a way to couple to that information and perform error correction on it. (A paper on the subject.)

Ryan Babbush (Google) discussed how variational quantum eigensolvers (VQEs) could be used to efficiently compute energy surfaces of molecules. One of the original applications for quantum computers was to simulate quantum systems; a VQE could be used to compute the ground state energy of a molecule, for instance. (See this paper for a discussion of VQEs).
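The VQE idea in miniature (my own toy example: a single qubit with an arbitrary Hamiltonian, not a molecular one): a classical outer loop minimizes the energy of a parameterized trial state, where on real hardware the energy would be measured on the quantum device rather than computed directly.

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=float)
Z = np.array([[1, 0], [0, -1]], dtype=float)
H = Z + 0.5 * X  # an arbitrary toy Hamiltonian

def ansatz(theta):
    """One-parameter trial state cos(theta/2)|0> + sin(theta/2)|1>."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta):
    psi = ansatz(theta)
    return psi @ H @ psi  # on hardware: estimated from measurements

# Crude classical outer loop: scan the parameter, keep the lowest energy
thetas = np.linspace(0, 2 * np.pi, 2001)
best = min(energy(t) for t in thetas)
exact = np.linalg.eigvalsh(H)[0]

print(best, exact)  # both ~ -1.118
```

The variational principle guarantees the estimate can only ever sit at or above the true ground-state energy, so a better ansatz or optimizer can only help.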


The keynote talk this year was given by John Preskill. His talk, “Our Quantum Future”, touched upon how we might use quantum information to advance a whole set of fields in physics.


Overall, SQuInT 2016 was a great workshop. Compared to the first SQuInT I went to four years ago, I was much more knowledgeable about what people are doing, and felt I could contribute to technical conversations.

This year, the workshop was organized by Professor Akimasa Miyake; it was his first time doing so, and he did a great job keeping everything going smoothly.