Grappling with the Quantum
Trying to Understand the Fundamental Rules Governing Our World
The 36th LaFollette Lecture
October 30, 2015
by Dennis E. Krause
I’m deeply honored to be speaking this afternoon. I was surprised and excited when Dwight Watson came to my office at the beginning of the year with the invitation. Naturally there was trepidation—what was I going to talk about? However, I was excited because it would give me a chance to talk about the things that have been perplexing me over the years and which have been driving much of my research and teaching. It would also give me the opportunity to talk more about what I do and how I do it. Finally, I felt that by looking at my work from a humanities perspective, I would learn something new, and that is exactly what happened.
While I’m honored to be the first physicist to give the LaFollette Lecture, I actually believe I’m the second physical scientist to speak, following Paul McKinney, who gave the thirteenth LaFollette Lecture in 1992. It shouldn’t be surprising to note that I’ll be speaking on some of the same things he discussed over 20 years ago since the problems of the quantum remain as fascinating now as then.
The charge of this talk is to address how my research touches the humanities broadly speaking, which sent me searching for a definition of the “humanities.” One that I found seems to describe well how the term is used in academia:
The humanities can be described as the study of how people process and document the human experience. 
But I’m a theoretical physicist. I’m trying to find and understand the rules that underlie the operation of the physical universe, not the people within it (however interesting that may be). Then I found another definition that seemed to work better for me:
Since the nineteenth century the humanities have generally been defined as the disciplines that investigate the expressions of the human mind. 
While the painter, photographer, and poet are each trying to capture some aspect of the physical world on canvas, digital media, or paper, I’m trying to capture some element of the universe with mathematical equations. These are all the result of human minds trying to express thoughts and feelings about our world.
There is, however, a big difference between what I do and what these artists do. As a professional physicist, I am not free to do whatever I want. Richard Feynman referred to it as “imagination in a terrible strait-jacket.” Since I’m trying to capture some essence of the real universe, the results of my efforts must be consistent with what is already known about the universe. I’m not free to invent new spatial dimensions unless I can devise a way to show they really may exist. One of my hobbies is writing fiction (although I don’t have nearly the time I wish to devote to it). There I’m free to imagine a small town in northern Minnesota where the townspeople must drape the body of a freshly killed victim over an ancient oak tree to ward off an unspeakable horror. But as a physicist, I can’t tell any story I want. I can’t invent some new law of physics unless I can also show that it is consistent with everything we know now and how it can be tested by experiment. If it cannot be readily tested, it is just speculation and is of no interest to me. And if my theory is tested and disagrees with the results of experiment, it is wrong. It doesn’t matter how beautiful and elegant the equations may be, if they fail to describe the world, they must be discarded. In physics, theories must work.
This may make it seem that science is antithetical to the humanities. Don’t we have our cold, impersonal “scientific method”? In my long experience in science, the only time I hear mention of the “scientific method” is possibly in an introductory physics class or in an oral comprehensive exam when I’m quizzing a humanities major. Like apprentices in any skilled craft, we learn the method through practice. As for being “cold, impersonal,” nothing could be further from the truth. Some recent quotes by Nobel laureate physicist Steven Weinberg accurately describe what we actually do:
Descartes and Bacon are only two of the philosophers who over the centuries have tried to prescribe rules for scientific research. It never works. We learn how to do science, not by making rules about how to do science, but from the experience of doing science, driven by desire for the pleasure we get when our methods succeed in explaining something. 
So the world acts on us like a teaching machine, reinforcing our good ideas with moments of satisfaction. After centuries we learn what kinds of understanding are possible, and how to find them. We learn not to worry about purpose, because such worries never lead to the sort of delight we seek. We learn to abandon the search for certainty, because the explanations that make us happy never are certain. We learn to do experiments, not worrying about the artificiality of our arrangements. We develop an aesthetic sense that gives us clues to what theories will work, and that adds to our pleasure when they do work. Our understandings accumulate. It is all unplanned and unpredictable, but it leads to reliable knowledge, and gives us joy along the way. 
I hope you noticed words that aren’t usually used to describe what scientists do: “pleasure,” “satisfaction,” “delight,” “happy,” “aesthetic,” and “joy.” I do what I do not just because it (hopefully) provides insight about the world—I also do it because it is fun!
So I’m a theoretical physicist, but what do I actually do? Most folks in academia specialize and so can readily answer the question: What do you do? It is a little harder for me, because I don’t see myself as a particle physicist or a nuclear physicist or a… Rather, I view myself as a general theoretical physicist who is on the lookout for very simple problems that can reveal something new and interesting about our world. Most theorists investigate problems that are vastly more complicated than the ones I study. Fortunately I’ve been lucky to find simple problems that turn out to be very interesting, and I will share one of these with you later. However, before I get to it, we need to take a very brief tour of physics.
When people learn that I’m a physicist, the usual response is “Oh, that’s hard,” or “Oh, I failed that class in college,” or simply: “Oh…” However, we’re all physicists to some extent since we all need to have some sense of the rules obeyed by our universe. When someone tosses a ball to you, you can probably catch it without difficulty because you’re familiar with how gravity and air resistance affect the motion of objects flying through the air. You know that larger objects are harder to accelerate than smaller ones. You’re familiar with the effects of acceleration as you go around a bend, and know that freeways have wider turns than slower roads since these effects depend on speed. You know that the rate at which time passes is the same for everyone even though at times it may not seem like it.
The rules governing the objects of our everyday world are called “classical” and were developed by Galileo, Newton, and others through the 19th century. The classical world has certain features that most of us would consider “commonsensical” (although they need to be learned by newborn infants). These include:
- Definite Locations: Physical objects exist at definite positions in space and instants in time. Two objects can’t occupy the same location at the same time, nor can a single object exist in two places simultaneously.
- Objective System State: The state of an object is determined by its properties which have definite values at all times. The state is an objective property of the system independent of the observer. Observation of the system should not affect its behavior (or can be taken into account).
- Determinism: The world is deterministic since the present state of a system completely determines its future states.
- Continuity: Systems evolve continuously through space and time. (Babies who haven’t learned this love playing peek-a-boo. For them, something that is out of sight doesn’t exist.)
In essence, classical physics assumes there exists an objective reality, an external world that is independent of us.
This classical world has an interesting property: it allows us to tell stories about how things happen. To see this, suppose we are at a baseball game (the World Series has started), and we wish to know where a ball struck by the batter (say, Mets post-season hero Daniel Murphy) will land. Will it be a home run? Knowing the initial position and velocity of the ball, we can calculate its trajectory using the laws of classical physics. This is a story, albeit boring unless we’re rooting for the Mets. It has a beginning (ball leaves the bat), a continuous middle with the suspense of wondering what will happen (ball flies through the air), and an ending (perhaps it just barely clears the right-field fence). We take this ability to create a narrative of how things behave in the universe for granted, including in physics. Often when physicists feel they understand some phenomenon, they have a mental story they can tell of what is happening along with the equations.
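As a sketch of how that classical narrative is actually computed, here is a minimal Python example. The launch speed and angle are invented for illustration, and air resistance is ignored (which is why the distance comes out unrealistically long):

```python
import math

# Classical physics: the ball's initial state determines its whole trajectory.
# The numbers below are made up for illustration; air resistance is ignored.
g = 9.8                      # gravitational acceleration (m/s^2)
speed = 45.0                 # exit speed off the bat (m/s)
angle = math.radians(30.0)   # launch angle above the horizontal

vx = speed * math.cos(angle)         # horizontal velocity component
vy = speed * math.sin(angle)         # vertical velocity component

flight_time = 2 * vy / g             # time to return to the launch height
landing_distance = vx * flight_time  # where the ball comes down
```

Given the beginning of the story, the middle and the end follow with certainty; that is the hallmark of the classical worldview.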
It is important to recognize that while these features of classical physics are reasonable and are obeyed in our everyday lives, they may be only approximately true. Like all laws of physics, they have a limited range of validity. If one tests them in new realms, they may fail. Furthermore, neuroscientists have shown that our perception of the world is heavily processed by our brain which seamlessly combines sense data and memories to create a model which we use to interact with the world. We never notice this is happening unless probed by the right experiment or if something goes terribly wrong.
By the end of the 19th century, after the development of the theories of electromagnetism and thermodynamics, much of the behavior we observe in our everyday life could be adequately explained by classical physics. But all was not well…
Stop reading and look around.
We can use classical physics to describe the motion of things around us, and we know that the light reaching our eyes is actually electromagnetic waves. But why do the objects that you see have the properties you observe? What determines their colors? What determines whether they are hard or soft? Why is glass transparent, but a wall is opaque? Why does copper conduct electricity, but plastic doesn’t? All of these everyday properties require an understanding of the microscopic nature of matter. By microscopic, I don’t mean on the size of things that can be seen with an ordinary (light) microscope. By microscopic, I mean on an even smaller scale, on the level of atoms.
By the end of the 19th century, probably most chemists and physicists (but certainly not all) believed in the existence of atoms, but there was little understanding of what they were or how they worked. It wasn’t until 1897 that the electron, a crucial component of an atom, was discovered. This finally gave physicists a clue to work with. Electrons carry a negative electric charge and, since ordinary matter is electrically neutral, there must be something in the atom which carried the opposite, positive charge.
The first decade of the 20th century was a fertile period of experiments probing the nature of the atom. The biggest breakthrough occurred when Ernest Rutherford discovered the atomic nucleus in 1911. He found that most atoms (and most of us) are composed of empty space. The bulk of the mass and all of the positive electric charge was concentrated in a tiny region—the nucleus—while the electrons somehow moved around it attracted by the opposite charge of the nucleus. Now physicists had a good enough picture to develop a theory to explain the atom and the nature of matter. Or so they may have thought…
From 1911 through the early 1920s, Albert Einstein, Niels Bohr, Arnold Sommerfeld, and many others worked furiously on various models of the atom, but however hard they tried, they achieved only partial success. Their attempts to apply the rules of classical physics to develop an accurate model of the atom failed. Then, in the 1920s, completely new ideas emerged which solved the problem. But not in a way that anyone would have expected. Werner Heisenberg, Louis de Broglie, Erwin Schrödinger, Paul Dirac, Max Born and others needed to develop an entirely new mechanics to explain atoms—quantum mechanics.
Let’s pause briefly for an analogy. Why does the College spend so much effort to send students abroad for a semester or on immersion trips? Our students need to understand that the farther one travels from Wabash College, the more different people become. While folks in Lafayette or in Indianapolis behave virtually the same way as residents of Crawfordsville, people in Asia, Africa, or Europe may view the world in a very different way. It is likely that many of our problems arise from the fact that we think everyone should behave like we do, but they don’t. And there is nothing wrong with this—we just need to understand and respect these other perspectives and act accordingly.
The same thinking applies when we attempt to apply our classical worldview gained from our experiences to realms far removed from our everyday life. The atom is as far removed from our everyday scale as a speck of dust is from the entire Earth. Should we be any more surprised that the atomic world is so different from our own than we would be that the Chinese, with their much older culture, think differently than we do? As Richard Feynman put it:
The behavior of things on the small scale is so fantastic, it’s so wonderfully different … than anything on the large scale! 
It is so different that the best way we have to understand the atomic world viewed through the quantum lens is through mathematics.
What is quantum mechanics in a nutshell? It is a set of mathematical rules that tell us two things:
- Measurement outcomes. It tells us what the possible results of an experiment are. Many of these results come in discrete quantities (e.g., chunks of energy); they are “quantized,” which is where quantum mechanics gets its name. The energies of an atom are quantized in the same way that the discrete musical notes of a guitar string or a flute arise.
- Probabilities of measurement outcomes. Quantum mechanics is a theory of probabilities. Unlike classical mechanics, it doesn’t tell us what will happen. When we do the experiment, quantum mechanics determines the probabilities of getting each possible measurement outcome, but it doesn’t tell us which one will actually occur.
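The musical-note analogy in the first item can be made concrete: a string fixed at both ends supports only the standing-wave frequencies f_n = n·v/(2L), where v is the wave speed and L the string length. Here is a small Python sketch, with invented numbers for a guitar-like string:

```python
# Discreteness by analogy: a string fixed at both ends allows only the
# standing-wave frequencies f_n = n * v / (2 L). Values below are made up.
speed = 330.0    # wave speed on the string (m/s, illustrative)
length = 0.65    # string length (m, roughly a guitar scale)

# the first four allowed frequencies (the fundamental and three overtones)
allowed_frequencies = [n * speed / (2 * length) for n in range(1, 5)]
# everything between consecutive values is forbidden, just as an atom's
# energies come only in discrete amounts
```

Only the discrete ladder of values is allowed; nothing in between can occur, which is the essence of quantization.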
The key quantity that is used to make these calculations is the quantum state (or wave function), which is most often denoted by the Greek letter Ψ.
Here is where the situation gets interesting. While nearly all physicists agree on the mathematical formalism of quantum mechanics and how to use it, there is no universal interpretation of what is happening or even what the quantum state vector Ψ really means. While this is a situation familiar to anyone in the humanities, where the meaning of an ancient text can be disputed by scholars, it is extremely unusual in physics. There are four physicists in the Wabash Physics Department, and it is very likely we all have somewhat different views on the meaning of quantum mechanics. I cannot think of a similar disagreement about any other topic that we teach in our curriculum. And this situation is the same throughout the field.
To get a better understanding of this situation, let’s now consider a simple problem that I’ll call the Which-Path Problem.
Here’s the setup:
Figure 1: Setup for Which-Path Problem
A particle source sprays particles, each traveling with the same velocity, at a wall with two slits. The particles that pass through the slits travel on until they hit a screen. A moveable detector will determine where they strike. If the particles were Ping-Pong balls, we would expect the balls to strike the screen mostly in two areas behind each slit:
Figure 2: Detection probability (green curve) for Ping-Pong balls.
The balls don’t strike at exactly the same spots due to slight variations in the directions in which the balls travel. However, after shooting many balls, we get a sense of where it is most likely for the next ball to hit the screen. If you knew exactly the position and velocity of the ball when it left the source, classical physics would allow you to determine exactly its trajectory and where it would impact the screen.
Now let’s see what would happen if we replace the Ping-Pong balls with atoms, which must be described by quantum mechanics. If we tried a classical approach, we would be stopped in our tracks before we even started. Remember that we need the exact initial position and velocity of the particle to determine its trajectory? The famous Heisenberg Uncertainty Principle does not allow such a state to exist! A quantum particle can never simultaneously have a definite position and velocity! Instead, the rules of quantum mechanics assign a wave function Ψ to each possible trajectory of the particle. In our case, to reach a particular point on the screen, there are two possible paths, one through slit #1 and one through slit #2, so there are two different wave functions, one for each:
Figure 3: Two possible trajectories that the quantum atom can take to reach the detector with their corresponding wave functions.
According to the rules of quantum mechanics, the probability that the atom will strike the detector is obtained by adding the wave functions and squaring the result:
P = |Ψ1 + Ψ2|² = |Ψ1|² + |Ψ2|² + 2Ψ1Ψ2,
where Ψ1 is the wave function for the trajectory passing through slit #1 and Ψ2 is the wave function for the trajectory passing through slit #2. (Readers familiar with the mathematics of waves will recognize this formula as the way one finds the intensity of a wave pattern.) We see that the detection probability is the sum of three parts:
- |Ψ1|² is the probability that the atom will reach the detector passing through slit #1 if slit #2 is blocked.
- |Ψ2|² is the probability that the atom will reach the detector passing through slit #2 if slit #1 is blocked.
- 2Ψ1Ψ2 is the truly novel quantum piece: it characterizes the interference of the two trajectories since it depends on both. The trajectory of one path seems to be affecting the other.
Here is what the result of all this looks like:
Figure 4: The detection probability (blue curve) when the particles are quantum atoms. The blue pattern is taken from a video showing the actual observed pattern near the axis (horizontal dashed line) from an experiment using electrons.
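The banded pattern in Figure 4 can be sketched numerically by superposing one wave per path, with the phase of each wave set by its path length. The wavelength and geometry below are invented illustrative values, not those of any actual experiment:

```python
import cmath
import math

# Illustrative made-up values: de Broglie wavelength and slit geometry
wavelength = 1.0        # wavelength of the atom (arbitrary units)
slit_separation = 5.0   # distance between the two slits
screen_distance = 100.0 # distance from the slits to the screen

def path_length(x, slit_y):
    # straight-line distance from a slit at height slit_y to screen position x
    return math.hypot(screen_distance, x - slit_y)

def detection_probability(x):
    # one wave function per path; its phase is (2*pi/wavelength) * path length
    k = 2 * math.pi / wavelength
    psi1 = cmath.exp(1j * k * path_length(x, +slit_separation / 2))
    psi2 = cmath.exp(1j * k * path_length(x, -slit_separation / 2))
    # superpose the two (with the 1/sqrt(2) normalization) and square
    return abs((psi1 + psi2) / math.sqrt(2)) ** 2

# sweeping the detector across the screen reveals bright and dark bands
pattern = [detection_probability(0.5 * i) for i in range(-40, 41)]
```

On the axis the two paths have equal length and the waves add constructively; off axis the pattern oscillates between bright bands and near-zero dark bands, unlike the two Ping-Pong-ball bumps.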
We see that the atoms strike the screen in bands instead of just two spots as in the Ping-Pong ball case. How do we interpret this?
Quantum mechanics determines the detection probability for where the atom will strike the screen, but unlike classical mechanics, it does not tell us what the atom actually does between when it leaves the particle source and when it strikes the screen. Unlike classical physics, which allows us to tell a narrative of the trajectory of a baseball or the Ping-Pong balls in the which-path experiment, quantum mechanics is completely silent on the middle portion of the story, between the beginning and the end.
Why don’t we just look?
Let’s set up some detectors near the slits and see which path the atoms take. Beautiful experiments of exactly this kind have been performed. What happens?
Figure 5: If detectors are set up to see which slit the atoms pass through, the detection probability at the screen near the axis (horizontal dashed line) is a flat line (red). All the interference occurring without the detectors has vanished.
We find that the more information we get about which path the atom took, the less quantum interference arises. If we know exactly which slit the atom passed through, no interference occurs!
While all physicists agree on how to use quantum mechanics to calculate what will be observed in an experiment, they do not agree on the interpretation of what goes on, possibly because the formalism does not permit a narrative description. There have been many attempts to explain this weirdness. Some do this by introducing alternate universes: one universe in which the atom goes through slit #1 and another in which the atom passes through slit #2. Others try to envision the atom taking both paths simultaneously. In other interpretations, the atom is actually just like the Ping-Pong ball, taking one path or the other, but it is guided by a mysterious quantum potential which is determined by the experimental configuration. My own feeling is that the situation is more complicated than this. The picture of atoms as particles is encouraged by the way they are detected, but all of these quantum systems are probably better described by quantum fields which combine wave and particle aspects. Even then, it is not clear to me that this will allow one to complete the narrative of what is actually going on. Quantum mechanics describes what may happen, not what will happen.
According to quantum mechanics, the detection probability is determined by the total wave function, which is the sum, or superposition, of the wave functions for each path:
Ψ = (Ψ1 + Ψ2)/√2.
There is a tendency to view this to mean that the particle simultaneously takes both paths, but this is reading between the lines. It is a mathematical expression incorporating the experimental setup and the possible motions. When we look to see which path the particle takes, we find it takes path #1 or #2, not both.
There is an optical illusion that provides a helpful analogy to this situation. In quantum mechanics, the total wave function describing the two paths the particle can take is like the Necker Cube:
Figure 6: The Necker Cube analogy of the Which-Path Problem. The Necker Cube is the line drawing while the blue cubes represent the two possible configurations of the Necker Cube seen by the eye, which are analogous to the two possible paths the atom can take.
Here the Necker Cube represents the wave function Ψ as the superposition of the two wave functions for each path. When we look at the cube (i.e., do an experiment to see which path the atom takes), it is impossible for our brain to perceive the two simultaneously—we see the cube in only one of two possible configurations (i.e., the detectors by the slits find the atom takes either one or the other path) with roughly equal probability (50% for path #1, 50% for path #2), but the interference is destroyed.
It appears as if Nature is conspiring to prevent us from seeing what is happening! When we have no information about the atom’s trajectory, we get beautiful interference patterns. But if we look to see what is going on, we get no interference.
All of this is well-known. Now I’m going to make things even stranger. What happens if we replace the ordinary atom in our which-path experiment with an unstable particle, like a radioactive nucleus or an atom in an excited state? Such a particle only lives for a short time, before decaying into other particles. We don’t know when an individual unstable particle will decay—the decay process is completely random (and described by quantum mechanics), but if we average over many lifetimes, we can find an average lifetime. The average lifetime is related to the particle’s half-life, which is the time it takes for half of a large collection of identical unstable particles to decay. My colleagues Zach Rohrbach (’12), Ephraim Fischbach at Purdue, and I set out to see how the which-path experiment would change using these particles.
If we use QuUPs (Quantum Unstable Particles), as Zach Rohrbach dubbed them, in our which-path experiment, we have two situations to consider: (1) the QuUP decays while passing from the source to the screen, and (2) the QuUP survives and reaches the screen without decaying.
The first case has been investigated by other researchers, and a diagram will help us figure out what will happen:
Figure 7: The QuUP decays while traveling one of the two paths, emitting light that (possibly) reveals which path the QuUP took. The detection probability is measured close to the axis (horizontal dashed line) and depends on the wavelength of the light emitted by the decaying QuUP. The interference occurs (dashed blue curve) when the wavelength of the emitted light is too long to give away which-path information, but disappears (red line) when the wavelength is short enough to determine the path.
When an atomic QuUP decays, it will emit light. When this happens, which-path information is revealed so one would think that the interference would disappear at the screen. However, the real situation is more interesting than that. To actually reveal the path, the light emitted must be of a wavelength sufficiently small to resolve the two different paths. If the wavelength is longer than the path separation, one can’t tell from which path it was emitted, in which case we should still observe the interference as the decayed QuUP hits the screen. If the wavelength is shorter than the path separation clearly revealing the path taken, the interference disappears. This is exactly what was observed in a beautiful set of experiments by Anton Zeilinger’s group using heated C70 molecules!
That was the first scenario. What happens in the second case, when the QuUP does not decay during its travel and reaches the screen? This is the case that Rohrbach, Fischbach, and I set out to investigate. According to what we have already discussed, it would appear that nothing new should occur. If the QuUP travels undecayed, shouldn’t the interference be the same as for the ordinary atom discussed earlier? Since it didn’t decay, no which-path information could be revealed, so we might think we would observe the ordinary interference pattern. However, that is not what we found. The interference pattern of an undecayed QuUP is not the same as that for an ordinary atom:
Figure 8: The calculated interference pattern of the undecayed QuUP (red curve) differs from the pattern due to an ordinary atom (dashed blue curve), especially far from the experiment axis (dashed line).
This means that there still must be some information available to determine the path taken. But where is it?
It turns out that the information is within you. Look carefully at the setup shown in Figure 8. The QuUP has two paths to reach the screen at the point shown. It lives for only a short time: which path will give it the best chance of surviving to reach the screen? (Remember it travels the same speed over each path.) Since the path through slit #1 is shorter (and hence takes a shorter time), the QuUP has a higher likelihood of reaching the indicated point via that path than via the path through slit #2. This is which-path information! It is called a priori which-path information since this information is known before the QuUP leaves the source. You will also notice from Figure 8 that the further from the experiment axis (horizontal dashed line) one is on the screen, the more the QuUP probability pattern deviates from the ordinary atom pattern. This is because the path lengths become more different as one moves away from the axis, so the QuUP taking the longer path is less likely to survive. The two patterns are exactly the same at the center, where the paths are of equal length, so the QuUP is equally likely to survive each path. In fact, there is a nice mathematical formula that quantifies this effect, which can be written schematically as
(Which-Path Information)² + (Quantum Interference)² = 1.
The more which-path information you have, the less quantum interference arises (and vice versa).
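This trade-off can be illustrated schematically in Python using the survival amplitudes of the two paths: a QuUP of lifetime τ that travels for a time t survives with amplitude e^(−t/2τ). The lifetime and travel times below are invented, and the distinguishability and visibility used are the standard textbook definitions for a two-path superposition:

```python
import math

# Made-up illustration: a QuUP of lifetime tau traveling for time t survives
# with amplitude exp(-t / (2 tau)).
tau = 1.0
t1, t2 = 0.8, 1.3   # travel times along the shorter and longer paths (invented)

a1 = math.exp(-t1 / (2 * tau))   # survival amplitude along path #1
a2 = math.exp(-t2 / (2 * tau))   # survival amplitude along path #2

norm = a1**2 + a2**2
distinguishability = abs(a1**2 - a2**2) / norm   # a priori which-path information
visibility = 2 * a1 * a2 / norm                  # contrast of the interference fringes

# the duality relation: information^2 + interference^2 = 1
total = distinguishability**2 + visibility**2
```

Whatever invented times one picks, the two terms always sum to one: making the paths more distinguishable (longer lifetime difference between them) necessarily washes out the interference, and equal path times restore full contrast.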
The discovery of this which-path effect with undecayed QuUPs has convinced me that information plays an important role in understanding our universe. Quantum mechanics seems to be telling us that the information available about a system is related to the behavior of that system, and it also helps to explain why these effects don’t appear in our everyday lives. Light is constantly reflecting off everyday objects, carrying information about them into the environment. This will naturally cause the destruction of the quantum mechanical effects as the information leaks out. This is a big problem for the folks trying to create a quantum computer. If such a device can be made, calculations that would take millions of years using the fastest classical computer could be done in seconds. But the challenge is to prevent the environment from measuring the quantum computer, destroying the very quantum properties needed to make it work.
In fact, recent work indicates that even if we could isolate a large object (e.g., the fabled Schrödinger’s cat) from light and other environmental effects, there is still a way for the information about what it is doing to escape. Einstein’s theory of relativity couples an object’s internal and external motions in a way that would be almost impossible to eliminate, so the object’s motion would be encoded with its internal state, destroying quantum interference. My student Inbum Lee (’16) and I are exploring these ideas, which are actually related to the work I’ve described with QuUPs, and we have already found some new effects. There are many interesting things yet to be found! But that is a different story.
And that’s the real point. The underlying theme of this talk is that there is a natural desire of humans to tell stories. This is true in science as well as in the humanities. What makes quantum mechanics hard to understand is that it is not allowing us to tell a story.
I’ve also been telling you a story with this talk, about how quantum mechanics came into being, how the which-path experiment works, and how the idea of using QuUPs led us to a new and interesting quantum effect. However, this really is just a story. It is a narrative that allows you to follow what is going on, but it does not really match how things actually happened. The development of quantum mechanics was extremely tortuous. To tell the true story would take far more than my allotted time. Instead, as in editing a short story, I smoothed out the wrinkles, tightened up the plot, and eliminated unessential characters and storylines so as to focus on the key things I want to express. I also didn’t tell you how I actually stumbled upon the effect with QuUPs.
My colleagues and I were working on a completely different problem when I noticed something strange in our results. After a few weeks of hard thinking I finally figured out what was happening. Fortunately, I have a large storehouse of interesting physics ideas, so I was able to piece it together. This is how science is really done. After the fact, when we have figured everything out, we polish the story so everything appears logical and almost inevitable, but that isn’t really the case. This is why it is so important to support student research and creative work so students can actually see how the things we teach were discovered or created. They can then realize that they, too, can do this.
We have seen that the mysterious quantum effects are also impacted by stories. In fact, Nature seems to be preventing us from telling a story of what is happening in the quantum realm. When information of what is going on leaks out, the very thing we’re trying to understand disappears. The funny thing is that these strange goings-on, behind a cloak of secrecy, are vital to the world around us and are becoming part of our everyday technology like LEDs, lasers, and computers. We are living in a quantum world, but the machinery (if there even is machinery!) is hidden away.
I have one final story to tell. I didn’t discover that stories were the theme of this talk until I was nearly finished. As in writing a lecture, a short story, or doing physics research, I often don’t really know what I’m doing until I’m nearly finished. I set out in some promising direction knowing that, wherever it will lead, the journey and resulting story will usually be interesting and fun. I hope this is the way you found my story of grappling with the quantum.
For complete lecture with figures: LaFollette Lecture
I have to acknowledge the support of all my colleagues at Wabash College. I’m so thankful to have landed here in 1998—it is hard to imagine a more collegial place to work. I especially enjoy the debates about quantum mechanics I have had with my physics colleagues Martin Madsen and Jim Brown. My ideas are constantly evolving as a result of these discussions. I’m also thankful to have had incredibly bright students like Zach Rohrbach and Inbum Lee who have helped me work through the ideas presented here. I also need to thank all my research colleagues, especially Ricardo Decca at IUPUI, and my colleagues in Mexico, Daniel Sudarsky, Yuri Bonder, and Hector Hernandez-Coronado, whose project led to the QuUP which-path effect. Finally, I can’t sufficiently express my deep appreciation for my Purdue mentor, colleague, and friend, Ephraim Fischbach. We’ve been carrying out very fruitful investigations for over 25 years, and we continue to generate questions and problems faster than we and our students can solve them.
 R. Bod, A New History of the Humanities (Oxford University Press, Oxford, 2013), p.1.
 R. P. Feynman, The Character of Physical Law (MIT Press, Cambridge, MA, 1965), p. 171.
 S. Weinberg, To Explain the World (Harper, New York, 2015), p. 214.
 Ibid., p. 255.
 Stanford Encyclopedia of Philosophy, Copenhagen Interpretation of Quantum Mechanics, http://plato.stanford.edu/entries/qm-copenhagen/#ClaPhy.
 See, for example, David Eagleman’s The Brain: The Story of You (Pantheon, 2015).
 For a good history of the development of quantum mechanics, see The Quantum Story: A History in 40 Moments by Jim Baggott (Oxford University Press, Oxford, 2011).
 The treatment I use here is similar to Richard Feynman’s approach found in R. P. Feynman, R. B. Leighton, M. Sands, The Feynman Lectures on Physics, Vol. 3 (Addison-Wesley, Reading, MA, 1965), pp. 1-1—1-11.
 The wave functions are actually complex numbers, but for our purposes I’m going to ignore this complication. The factor of 1/√2 is included so that the probability of reaching the screen is 100% (no atoms are lost along the way).
 E.g., J. Kommeier and M. Bach, Frontiers in Human Neuroscience 6, 51 (2012).
 T. Sleator, et al., in: Quantum Measurements in Optics, edited by P. Tombesi and D. F. Walls (Plenum, New York, 1992), pp. 27–40; T. Sleator, et al., in: Laser Spectroscopy X, edited by M. Ducloy, E. Giacobino, and G. Camy (World Scientific, Singapore, 1991), pp. 264–271; P. Facchi, Journal of Modern Optics 51 (2004) 1049; P. Facchi, A. Mariano, and S. Pascazio, arXiv:quant-ph/0105110; S. Takagi, in: Fundamental Aspects of Quantum Physics, ed. by L. Accardi and S. Tasaki (World Scientific, Singapore, 2003), 188.
 L. Hackermüller, et al., Nature 427, 711 (2004).
 I. Pikovski, et al., Nature Physics 11, 668 (2015).
 Y. Bonder, E. Fischbach, H. Hernandez-Coronado, D. E. Krause, Z. Rohrbach, and D. Sudarsky, Phys. Rev. D 87 (2013) 125021.