Glas Whiskey

Digressions on Quantum Computing:
And Why Quantum (and Algorithms) Matter 


Meg Tufano, M.A. 


    If you haven't read David Amerland's great piece in this issue on, among other things, algorithms taking over the world, you might want to do that first so as to put in context why I'm writing about algorithms, and why it is probably more important than the title implies. 

    If algorithms are taking over the world, then I, for one, want to know what the hell they are! And, of course, I've studied math so I sort of remember what they are, but what I did not realize (things have changed since I went to college (computers were not available to regular people back then)) is that computers are using such complicated new analysis techniques that even this nice Apple MacBook Pro that I'm typing on is old-fashioned in the scheme of things. (Don't tell Guy Kawasaki!)

    What has happened to computers is something I want to understand because, after all, I am "on" a computer every day (even when I'm "off" a computer, I have my iPhone with me which is also a computer) so, as I said, I am "on" a computer––or at least at its beck and call––almost all day every day. I need to understand what these new computers are doing!

    When I decided to write this, I called the most Googley person I know who in a few minutes had my head spinning (he works at Google and does not want his name mentioned (mainly because others might call him, not because he's doing secret stuff. But, hey, you never know because if it's secret . . . but I digress)). He gave me directions as to what to study first (Heisenberg and Schroedinger) and left me with the feeling that this was not going to be an easy article to write. 

    But fools rush in . . . 

Never Be Afraid of Algorithms or Quantum Computing Again!  (No, there will not be a test; and you do not have to know (or like) math to understand this . . . )

Quantum Computing  

    OK, just to show you that even though I prefer studying psychology and philosophy, I am not trying to understand something as complicated as Quantum Computing without any background in the "hard sciences" (always have loved the double entendre of that phrase): I have studied science (including college physics and chemistry), so I sort of already know what quantum means, and I THINK I understand Heisenberg's Uncertainty Principle, but I'm missing something big time in trying to understand quantum computing and Schroedinger. I figured I wasn't alone, so I thought I'd do what I do when I'm teaching something with which I'm not that familiar. I'd write about it. Writing forces you to think! (Or it should force you to think!)

    I like the Heisenberg story so I'll tell that one first from memory. Heisenberg discovered that if you use light to see a tiny object (like a part of an atom, or light itself), you're going to change that object. If the object is moving (all atoms are always moving because they are actually not objects––well, a teeny part of them has some "stuff"––but they are almost entirely energy), then you're going to change the velocity of the components of the atom (the particles of the atoms, the names of which I forget). But the thing that I like about Heisenberg is that the closer you get to "seeing" the particle, whatever it's called, the less the information will mean because you are changing the momentum of the particle so much with the light you're using to see it! 

    I cannot tell you how many times I have experienced something in life, tried to understand it by getting up really close to it, and realized that I very well might be changing everything by merely trying to get close to understanding it! I am awakened to this kind of insight, for example, when my husband says, "Would you please leave me alone?" It is at those times that I think fondly of Heisenberg. 

    Important to know if you don't already: light has a wave/particle duality. Light acts like a wave AND a particle. I love that. It's the kind of knowledge that gives me a sense of the greatness of existence. It is SO improbable and paradoxical and almost intentionally curious! We are constantly thinking things are either this or that, but, by nature, we are immersed in a 'both and' cosmos! No wonder we get things so wrong in our either-or human endeavors.

    I love using Heisenberg's principle in metaphors because so much in life is so paradoxical. So many things seem to be getting more confusing the closer you get to them! And so many "truths" seem opposite; yet both seem to be true at the same time . . . like absence makes the heart grow fonder, but also––at the very same time––out of sight out of mind! Turns out that paradox is the stuff of life! 

    Enough digression, let's get back to quantum computing.   

Ho Hum, Zeroes and Ones: From Hole Punches to a Grain of Sand

    Computers interpret data to get useful information. Classic computers work first by translating data into binary numbers. Zeroes and ones. Sometimes a whole lot of zeroes and ones, but, eventually? That gets to be a "bit" onerous and time consuming.  
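    To make that "translating data into binary" business concrete, here is a tiny sketch in Python (my choice of language, not the computer's) showing a number and a letter written out as the zeroes and ones a classic computer actually stores:

```python
# A classical computer stores everything as patterns of zeroes and ones.
# Here is the number 42 and the letter "A" written out as bits:
print(bin(42))                   # -> 0b101010
print(format(ord("A"), "08b"))   # -> 01000001
```

    Six bits for a smallish number, eight for a single letter — now imagine a whole book, or a whole census.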

    How to best explain this zero and one business might be to go back to the origin of computers because that's actually a fascinating digression––talk about time-consuming (weaving is time consuming)––the idea of computing came from people making cloth! When you are weaving with a loom, you have to get all the strands of thread lined up and then count where to put the weft (the threads that go in and out of the first set of threads (the warp)). Someone figured out a way to have the threads "count themselves," as it were, by using cards with holes punched in them in a certain pattern so the thread would get the information, "Go in here NOW!" without anyone having to count! Then you could make ever more interesting patterns because you didn't have to hire a zillion people to do all that counting!
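    If you like seeing things spelled out, here is a toy Python sketch (the thread numbers and the pattern are entirely made up by me) of what one punched row "tells" the loom: hole means "lift this thread NOW," no hole means "stay put."

```python
# One row of a loom card: 1 means "this thread lifts NOW", 0 means it stays put.
card_row = [1, 0, 1, 1, 0, 0, 1, 0]
for thread, hole in enumerate(card_row):
    print(f"thread {thread}: {'lift' if hole else 'stay'}")
```

    That row of holes IS a row of zeroes and ones. The loom card and the computer bit are the same idea wearing different clothes.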

    Then, a whole bunch of associations came to people about how useful these cards could be. For one example, you could use these cards to count PEOPLE (see the U.S. 1890 Census). Isn't it amazing that counting things and numbers are so important to so many things that seem totally human and non-numerical: a warm blanket and how many representatives are going to go to Washington, DC (determined by a census)? 

    Programming moved away from cards to long sheets of punched paper (I am just old enough to have seen the original IBM punch cards and the punched paper in use in an office, mostly for bookkeeping (adding up numbers)). Then someone came up with the idea of using sand. Isn't it ironic that grains of sand (silicon) end up being able to "count" how many grains of sand there are in the world? Well, I could use the word "transistor" but that just confuses me. All the grain of silicon is doing is giving the same information as the card––but super tiny––"Turn on NOW!" and, of course, "Turn OFF NOW!"––just zillions of times faster.

    What's very cool about silicon is that it can send that information to the next grain of sand and the next and the next (using silicon's inherent electrical properties) and, well, you get the picture. Silicon can connect. And it can move really fast (not as fast as some other semiconductors), but fast and CHEAP! Cheaper even than paper cards (and the cards could not send their data from one card to the next the way silicon can, because paper has no innate conductivity; it is an insulator, the opposite of a conductor). Did you know you could make 10,000 transistors (off/on switches) on silicon for a PENNY? No wonder computer inventors made billions of dollars!

    OK, I think we've gotten off to enough tangents to the point of understanding that this computer I'm typing on is using tiny on/off switches in zillions of patterns (programs) that are pretty effective for getting stuff done (I just paid all my bills without having to lick a single stamp!)

    So, we're pretty efficient already, why do we need quantum?  And what's the big deal?  


    Well, going back to those paradoxical particles that seem to be in two places at the same time (didn't really explain that yet) and to particles that seem to have properties that change when they are looked at (Heisenberg): turns out that if you mix all that up, you get a principle that the nature of things is such that even if you think you have things pinned down, you have to take into consideration that maybe you don't! You have to actually leave room for other things that are going on that you are either changing because you're looking at them (similar to Heisenberg's problem), or that are changing because they can't stop changing (nature itself)! Believe it or not, someone actually can describe that crazy-sounding reality in NUMBERS! My dear departed mother would have LOVED to have known that! (She would have called Schroedinger's equation, "The All Mixed Up Like a Dog's Breakfast Equation.") But the point is that you really can do this! You can use the equation in such a way that you can describe all this uncertainty that is happening all around us without actually pinning anything down, just by using approximations. Isn't that amazing?

    In case you do not already know it, that's actually what Calculus does too (you add up the areas of teeny tiny slices under a curve and that's how you can measure the area under the curve super exactly, or, another way of saying that, by using approximations you can get a damned good useful measurement). I think Pythagoras would have been so happy to know that his theorem for figuring out the hypotenuses of triangles and his school's discovery of irrational numbers (like the square root of two) ended up being used for something so complicated . . . which is another digression . . . I'm having trouble staying on track! I'm DEFINITELY not a computer! Back to quantum computing!
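    For the curious, here is that Calculus trick in miniature (a Python sketch; the curve and the slice counts are just my examples): add up skinny slices under the curve y = x² between 0 and 1, and the more slices you use, the closer the "guess" creeps to the exact answer, one third.

```python
def area_under_curve(slices):
    """Approximate the area under y = x**2 from 0 to 1 with skinny rectangles."""
    width = 1.0 / slices
    return sum((i * width) ** 2 * width for i in range(slices))

print(area_under_curve(10))       # a rough guess
print(area_under_curve(100000))   # very, very close to 1/3
```

    Approximation after approximation, and yet out comes something you can trust.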


   OK, so you can't pin things down, but you might want to argue that, logically, surely things ARE either one thing or the other! Right? It would be crazy to think that I was here in Tennessee AND in, say, China, at the very same time. That would be impossible. Even light, which we cannot measure exactly because it is made up of particles AND waves at the same time, surely can't be in two places at the same time, can it? It would be crazy illogical if things could be in two places at the same time. Even if we can't know if they are in one place or the other, we know logically they must be in one place or the other, right?

    This is where the paradoxical nature of the quantum idea and the qubit blows me away. In fact, that's what the quantum idea is all about. You add up all the possible positions the system can be in! And, fascinatingly similar to the usefulness of Calculus, Quantum Mechanics describes states by adding together all the approximations within those states. So what if you can't tie things down? You can guess, right? And that "guess" can get closer and closer and closer to being a useful measurement (like Calculus can). 
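    Here is the smallest possible version of that "adding up possibilities" idea (a Python sketch, assuming the textbook picture of a qubit; all the variable names are mine): the state is a weighted sum of "zero" and "one," and the weights, squared, are probabilities that always total one.

```python
import math

# An equal mix of "zero" and "one": each possibility gets weight 1/sqrt(2).
weight_zero = weight_one = 1 / math.sqrt(2)

# Squaring the weights gives the chance of seeing each answer.
prob_zero = weight_zero ** 2
prob_one = weight_one ** 2
print(round(prob_zero, 6), round(prob_one, 6))  # half and half
assert abs(prob_zero + prob_one - 1) < 1e-12    # the possibilities add up to one
```

    Nothing is pinned down, yet the bookkeeping is exact. My mother's Dog's Breakfast, balanced to the penny.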

    Most important to the quantum computer is the qubit (the spelling of which apparently hasn't made it yet into the Evernote dictionary). Qubits, according to Wikipedia, are the "fundamental building blocks of the quantum computer."

    So, that's where I'm going to next. I generally understand the cards and the transistors (the silicon). I now have some idea of the statistical uncertainty logic of quantum computer programming; next I need to understand how understanding data changes when we use qubits!

Qubits or Quantum Bits

    First thing to "get" about qubits is that they can be in two states at the same time––surprise! Normal logic doesn't work with qubits. And the numbers they handle are big, as in SUPER LARGE! My computer right here on my lap has 500 gigabytes of storage. Sounds like a lot, doesn't it? Well, that's only 4,294,967,296,000 zeroes and ones (classical bits)!
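    If you want to check my arithmetic, here is the sum in Python (taking 500 gigabytes to mean 500 × 1024³ bytes, eight bits to the byte):

```python
bytes_of_storage = 500 * 1024 ** 3   # 500 gigabytes of laptop storage
bits = bytes_of_storage * 8          # each byte is eight zeroes-and-ones
print(f"{bits:,}")                   # -> 4,294,967,296,000
```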


    But a qubit will blow your mind because it can be a zero AND a one at the same time, and nothing but an algorithm can pin it down enough to understand it numerically; and, through the magic of quantum entanglement, qubits not only can be a zero and a one at the same time, but they can be in multiple states at the same time and, the best part, because of all this uncertainty and legerdemain, they can deal with ENORMOUS numbers and process them faster than anything like cards, paper or sand. And we need them because, if you haven't figured this out by now, we are getting to the point of needing to process REALLY big numbers (call those numbers Big Data if you want). We're not doing anything so simple as counting how many people there are in a district for a census, or how many particles there are in the observable universe (which is about ninety-three billion light-years across); we are representing the complexity of reality itself. The sad truth is we each only have X number of years to live (sorry about that depressing thought), so we can't wait around for even silicon transistors; we need to calculate these numbers faster than silicon can calculate them so we can use them now while we're still here to remember what the hell we were measuring those huge things for! 


    Oh, I forgot Schroedinger. Well, I'll come back to him.

Schroedinger Again. Whew!

    So, essentially, the whole binary system of zeroes and ones just cannot process the QUANTITY of numbers in huge data. And, it turns out, the "Dog's Breakfast Equation" (Schroedinger's equation) is using ENORMOUS numbers! And so, to represent reality (one good reason to use quantum computing), one needs computers that can handle the kind of equations that actually describe real things like, for example, the state of an atom. For that you need––not to add up a bunch of tiny slices (as you do in measuring a geometrical curve with Calculus)––but to add up a whole bunch of equations that use approximations, which themselves have a whole bunch of equations that use approximations, which themselves have a whole bunch of equations––I could go on but you get the point––all of which include huge numbers of probabilities in order to represent the uncertainty with enough certainty to approximate reality.
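    Why so ENORMOUS? Because (as the standard textbook accounting goes) describing n qubits on a classical machine takes 2ⁿ of those approximation-weights, and that number doubles with every qubit you add. A quick Python sketch of the doubling:

```python
# The number of weights a classical computer must track doubles per qubit.
# By around 300 qubits it exceeds the estimated number of atoms in the
# observable universe (roughly 10**80).
for qubits in (1, 10, 50, 300):
    print(f"{qubits} qubits -> {2 ** qubits} weights to keep track of")
```

    No amount of sand keeps up with that. Hence quantum.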

    No wonder my Googler friend warned me this was going to take me a while to "get!"

Quantum Logic and Quantum Algorithms

    Good ol' Wikipedia says, "A quantum computer operates by setting the qubits in a controlled initial state that represents the problem at hand and by manipulating those qubits with a fixed sequence of quantum logic gates. The sequence of gates to be applied is called a quantum algorithm. The calculation ends with measurement of all the states, collapsing each qubit into one of the two pure states, so the outcome can be at most n classical bits of information."

    For those who are wondering, "n" is just the number of qubits you started with, like counting the number of threads in the warp and weft of a cloth! "N classical bits." In other words, an answer we can understand and discuss using my trusty Mac.
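    To see the Wikipedia recipe end to end, here is a one-qubit toy run in Python (purely a sketch: real quantum computers are hardware, not loops, and every name here is my invention): set the qubit in an initial state, apply one quantum logic gate (a Hadamard, the one that makes the equal zero-and-one mix), then "measure," collapsing it into a plain classical bit.

```python
import math
import random

# Start in the state "definitely zero".
weight_zero, weight_one = 1.0, 0.0

# One quantum logic gate: the Hadamard mixes zero and one equally.
h = 1 / math.sqrt(2)
weight_zero, weight_one = h * (weight_zero + weight_one), h * (weight_zero - weight_one)

# Measurement collapses the mix into one classical bit, with the squared
# weights as the probabilities. Repeat the whole run many times and tally.
counts = {0: 0, 1: 0}
for _ in range(10_000):
    bit = 0 if random.random() < weight_zero ** 2 else 1
    counts[bit] += 1
print(counts)  # roughly half zeroes, half ones
```

    A real algorithm's "fixed sequence of gates" is just more steps like that Hadamard line, and the tally at the end is the "at most n classical bits" the quote promises.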

    I think that's where I'll stop. What Quantum Computing is doing is analyzing reality and so Quantum Algorithms are going to make it possible not just to understand our world better, but, because these equations can handle huge numbers that we cannot do by "hand," it is going to make it possible for information about reality to be analyzed fast enough to be useful to us in our lifetime!

    Send all questions about this article to Google. ;')



Is There Going to be a Test?

    No. My head hurts.

    But here's a take-home pop quiz: Who already has these computers? 

    You may guess your answer. Use approximations.

    (Hint:  Not your Mom and Pop local store; and not The Apple Store.)

Author Bio:  
Meg Tufano, M.A.

Meg is originally from Washington, DC, has lived all over North America and just recently returned from four years of living in The Hague, The Netherlands.  She is married to Dr. Daniel R. Tufano, Sr., a scientist, and has two sons, Julian and Danny, Jr.  She completed her undergraduate degree at St. John's College and The University of Toronto; received her Master's from Antioch University, Midwest.  (Meg describes her education in Part III of her Critical History of the University in the spring 2013 issue of The Journal.)  Her favorite city (so far) is Florence, Italy, the background to her picture at left. She owns the publishing imprint S+, and is honored and excited to have just published a book of science fiction stories by Laston Kirkland, Copy Me. She loves new writers and encourages submissions on social era topics for The Journal. She is always happy to discuss publication by authors of both fiction and non-fiction books for S+™.  Contact:

About the Artist, Ria Nieswaag:

Ria Nieswaag (born 1950) lives and works in Delft, The Netherlands.

From 1984-1988 she was trained as a Creative Therapist in Zeist. She combined her activities as creative therapist and artist until 1994, when she definitively chose the career of an artist. This decision resulted in an assignment by Rijkswaterstaat (1998), the Dutch Ministry of Infrastructure and Water, on the occasion of its 200th anniversary. Queen Beatrix was present at the opening of the exhibition. A book consisting of the paintings Ria made for the Ministry was published: ‘17 schilderijen, Nederland als kunstwerk’ (17 paintings, the Netherlands as a piece of art).

From 1995 onwards Ria has been exhibiting in distinguished galleries in the Netherlands as well as abroad (among others Argentina, Brazil, and MASC for the organisation ‘Paint a Future’).

In 2006 a small book of Ria’s paintings was published: Ria Nieswaag, Paintings 2005-2006 (a choice); ISBN:13 978-90-810765-1-7

From December 2008 to February 2009 an overview exhibition of Ria’s work was held in Museum ‘Het Prinsenhof’, the municipal museum of Delft. On this occasion the book ‘Spiegels van de ziel’ (Mirrors of the soul) was published. This book contains an overview of paintings by Ria Nieswaag from 1994-2008; ISBN: 978-90-74063-38-8