The Art of Replicating Nature Itself

Evan Lin
7 min read · Oct 10, 2020

Taking Quantum Computing with a grain of salt… or better yet, a bean of coffee…

Figure 1: A pile of coffee beans

Ah yes, the classic Monday morning where we all perform our weekly ritual of groggily shuffling into the kitchen to make a cup of coffee. If you're as familiar with this little event as I am, you'd then likely commute to your 9–5 or to school, and wait for that 250 mL of coffee to kick in before you start typing away on a computer. In my case, this cup of coffee is helping me with writer's block.

We're not all that different from the conventional computer. In fact, we're basically all computers! Wait… what? What do you mean?

Hold on just a second! Not from a biological perspective, silly, but we made computers to be more efficient, and we use computers to complete our daily tasks. And my biggest point of all… we're all searching for speed! Yes, that's right. We all want to be more efficient!

Figure 2: Productivity

So we drink our cups of coffee in an effort to boost our productivity. But at the end of the day, we collapse onto our beds, relax, and maybe watch a few shows. There are limits to how much work we can take on.

Similarly, conventional computers are reaching a limit of computational power, just like how we reach a limit and need to sleep no matter how much coffee we drink to stay awake.

So here lies the question: what is the "what" in "we are to computers as computers are to… what?"

Enter the realm of Quantum Computing

Alright now take this idea with a grain of salt. This example is going to sound completely arbitrary but bear with me. I promise this will all wrap up in the end.

Figure 3: Image of Google’s quantum computer, Sycamore

The Limits of Conventional Computing

Here's a pretty farfetched example, but it gets the message across. Let's rewind to the Monday we started off with. Now, let's suppose you magically wake up as a research scientist, and one morning, while making coffee, you think to yourself: "Man, this caffeine ain't doing it for me. It's just not enough." So you take that bean to a laboratory and, using the available equipment and computers, you try to model and simulate the molecular data of the caffeine on a supercomputer. To your surprise, the computer crashes, because even the supercomputer can't handle this data. But why?

Well, the amount of information needed to describe a single TINY molecule of caffeine is about 10^48 bits. Compare that to the number of atoms in the Earth, which is roughly 10^49 to 10^50. Describing one caffeine molecule takes as many bits as about 1–10% of the number of atoms in our EARTH! So now you, the scientist with immeasurable disappointment levels, could still be wondering how you could accurately model the caffeine molecule.

The solution to this immeasurable disappointment? Quantum Computing.

What is a Quantum Computer?

Ok, you may be thinking, "Woah, quantum computers must be extremely fast, therefore being infinitely faster than the computer I have at home!" Well, not exactly. Quantum computers only excel at a few specific tasks, such as factoring gigantic numbers and simulating molecules, just like our coffee bean dilemma from the beginning.

So how does Quantum Computing work?

Alright sport, you might be getting a lil' ahead of yourself if you're not familiar with the prerequisites. Let's first rewind to our basic understanding of classical computers.

Figure 4: Definition of a Bit
  • Classical computers run on a language called "machine language" that consists only of combinations of "0s" and "1s"
  • One bit is either a “0” or a “1”
  • A bit is really like a light switch: “0” meaning off, and “1” meaning on

These combinations of bits make up basic mathematical operations, and when they are put together, they create bigger switches called Boolean logic gates. A Boolean logic gate takes in multiple binary values and returns a single digit, either a "0" or a "1". In other words, the logic gate spits out an on or an off, or better yet, because we are talking about logic, a TRUE or a FALSE. So how does this relate to quantum computing?
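To make this concrete, here's a tiny sketch in Python of how Boolean logic gates turn binary inputs into a single TRUE/FALSE output (the function names are mine, purely for illustration):

```python
# Toy Boolean logic gates: each takes binary inputs (0 or 1)
# and returns a single 0 (FALSE) or 1 (TRUE).

def AND(a, b):
    # "On" only when both switches are on
    return a & b

def OR(a, b):
    # "On" when at least one switch is on
    return a | b

def NOT(a):
    # Flips the switch: 0 becomes 1, 1 becomes 0
    return 1 - a

# Truth table for AND: only "1 AND 1" is TRUE
for a in (0, 1):
    for b in (0, 1):
        print(f"{a} AND {b} -> {AND(a, b)}")
```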

Quantum Computers run on Qubits

As you just saw, bits are either a "0" or a "1", and they represent a state of activity: on or off, true or false. Qubits are a bit different (pun intended). A qubit can be in a third kind of state, unlike the classical bit that only has 2. A qubit can be "0" or "1", or in a third state that is "0" AND "1" at the SAME time. This phenomenon of the third state is called superposition. Here's a little diagram I made below that'll help:

Figure 5: Abstract definition of a Qubit
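If you like seeing it in code, here's a simplified sketch (my own toy model, not a real quantum library) of a qubit's state: two amplitudes, one for "0" and one for "1", whose squared magnitudes must sum to 1:

```python
import math

# Simplified qubit state: amplitude "alpha" for measuring "0",
# amplitude "beta" for measuring "1". The measurement
# probabilities |alpha|^2 and |beta|^2 must sum to 1.

def is_valid_qubit(alpha, beta, tol=1e-9):
    return abs(abs(alpha) ** 2 + abs(beta) ** 2 - 1) < tol

# An equal superposition: "0" AND "1" at the same time,
# each measured with probability 1/2.
alpha = beta = 1 / math.sqrt(2)
print(is_valid_qubit(alpha, beta))  # True
```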

How does Superposition make Quantum computing “faster” than conventional computers?

Ok, let's say that you have a computer with 2 bits. (Of course, you're not going to have a computer with only 2 bits, but consider it for the simplicity of the example.) In a conventional computer with 2 bits, there can only be 4 combinations of "0s" and "1s" strung together.

The combinations you can get with 2 bits are “00”, “01”, “10” and “11”

But keep in mind, the computer can only be in ONE state at a time, meaning that if you ever wanted all these values returned to you, they would have to run ONE by ONE, at four separate times.

Now see, these very same four values "00", "01", "10" and "11" can also be represented by 2 qubits, with superposition! With just two qubits, a 2-qubit computer can be in ALL 4 of these combinations at the same time. If you got a bit lost there, don't worry, I'll explain it just a bit further. Let's break down the example into just one bit and one qubit.

Let's say now that you have a computer with one bit. That means there can only be 2 combinations of "0s" and "1s". Let's also assume this computer is SUUUUUPER slow and takes one second per operation…

The combinations you can get with one bit are just “0” and “1”

Now with qubits, remember that one qubit can be in BOTH the state of 0 and the state of 1 at the same time. Therefore, to represent "0" and "1" with qubits, we will only need ONE.

So let's now break down the difference between 1 qubit and 2 qubits, and 1 bit and 2 bits. Each new numbered line represents an individual computational operation. This is key to understand.

Combinations for 1 Qubit:

  1. 1 Qubit = [“0” and “1”]= 1 digit that is either 0 or 1

(Total number of operations = 1 | Time taken to calculate = 1 second)

Combinations for 2 Qubits:

  1. 2 Qubits = [ “0” and “1”, “0” and “1” ] = 2 digits that can be “00” or “01” or “10” or “11”

(Total number of operations = 1 | Time taken to calculate = 1 second)

Combinations for 1 Bit:

  1. 1 Bit = [0]
  2. 1 Bit = [1]

(Total number of operations = 2 | Time taken to calculate = 2 seconds)

Combinations for 2 Bits:

  1. 2 Bits = [00]
  2. 2 Bits = [01]
  3. 2 Bits = [10]
  4. 2 Bits = [11]

(Total number of operations = 4 | Time taken to calculate = 4 seconds)
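The classical side of the breakdown above can be sketched in a few lines of Python: an n-bit register has to step through its states one operation at a time (a toy illustration, of course, not real hardware timing):

```python
from itertools import product

# Enumerate every state an n-bit register must step through,
# one operation per state, as in the numbered lists above.

def classical_states(n):
    return ["".join(bits) for bits in product("01", repeat=n)]

for n in (1, 2):
    states = classical_states(n)
    print(f"{n} bit(s): {states} -> {len(states)} operations")
```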

Notice how many more operations bits take compared to qubits! The point is, bits can only be in one state at a time, and the more bits you add, the more processes there will be. With qubits, notice how the number of operations stays the same. That means the more qubits you have, the greater the exponential power of your quantum computer.

So what's the formula for calculating the number of combinations, or shall we say states?

Glad you asked. The formula is simply 2^n states, with 'n' representing the number of qubits. The formula is the same for bits too! Except that a classical computer has to step through each and every state individually… Going back to the coffee bean question, remember that describing the single caffeine molecule takes about 10^48 bits. If we were to convert this GIANT number into qubits, we would only need about 160 qubits, giving 2^160 states.
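You can check that conversion yourself in Python (the helper names are my own): 2^n grows so fast that 160 qubits already cover 10^48 states.

```python
import math

def states(n):
    # Number of states n qubits (or n bits) can represent
    return 2 ** n

def qubits_needed(num_states):
    # Smallest n such that 2^n >= num_states
    return math.ceil(math.log2(num_states))

print(states(2))                # 4
print(qubits_needed(10 ** 48))  # 160
```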

Figure 6: Number of states

A regular PC running at 2 GHz (2 gigahertz) executes about 2 billion instructions per second. At that rate, it would take MANY MANY MANY years for a regular computer to hypothetically step through all of those states.
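Here's the back-of-the-envelope version in Python, assuming (my assumption, to match the numbers above) one state per instruction at 2 billion instructions per second:

```python
# Stepping through 2^160 states, one per instruction,
# on a 2 GHz machine (2e9 instructions per second).

SECONDS_PER_YEAR = 60 * 60 * 24 * 365
ops = 2 ** 160
ops_per_second = 2 * 10 ** 9

years = ops / ops_per_second / SECONDS_PER_YEAR
print(f"about {years:.1e} years")  # on the order of 10^31 years
```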

So now you, as the research scientist can finally rest knowing how and what you can use to model that caffeine molecule.

Additional Resources!

These resources helped me train my intuition in understanding how Quantum computing works!

https://www.youtube.com/watch?v=IrbJYsep45E&feature=emb_title&ab_channel=PBSInfiniteSeries

Thank you for reading my article! If you enjoyed it, I’d really appreciate a *clap* and a share :)


Evan Lin

Innovator at The Knowledge Society (TKS). Interested in Machine Learning and Quantum Computing.