Now that Techtober is out of the way, I can transition away from tech products and talk more about technology concepts.  This week, I’ll be giving a brief overview of quantum computing: what it is and why it’s important.


Graphical Representation of Moore’s Law | Source: Wikipedia

For decades, computers have been getting steadily faster in line with Moore’s Law, which observes that the number of transistors that can be packed into a processor doubles roughly every two years.  This is made possible because the manufacturing process for processors keeps improving, letting transistors and other chip features get smaller.  Transistors are the fundamental devices that allow a computer to perform calculations.  They serve as electrical switches: a control terminal (the gate, in modern chips) determines whether current can or cannot pass through the transistor.  These transistors are then combined into logic gates, which in turn are wired together into circuits that perform useful calculations.  I know it can be hard to imagine that this blog is coming to you as a result of billions of tiny electrical switches working together.  Luckily, there’s no need to have a doctorate in electrical or computer engineering to appreciate how cool computers are.  If, however, you are interested in the technicals, check out this article, or watch these videos from Crash Course Computer Science, which give a more elaborate and visual idea of how this works.
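If you’d like to play with the idea, here’s a toy Python sketch of my own (not anything from the linked resources) that treats a pair of switches as a NAND gate and builds the other basic gates from it:

```python
# Toy model: a transistor is roughly a voltage-controlled switch.
# This only illustrates the logic-gate idea; it is not a circuit simulator.

def nand(a: bool, b: bool) -> bool:
    """Two switches in series pull the output low only when both are on."""
    return not (a and b)

# Every other gate can be composed from NAND alone:
def not_(a: bool) -> bool:
    return nand(a, a)

def and_(a: bool, b: bool) -> bool:
    return not_(nand(a, b))

def or_(a: bool, b: bool) -> bool:
    return nand(not_(a), not_(b))

for a in (False, True):
    for b in (False, True):
        print(f"{int(a)} AND {int(b)} = {int(and_(a, b))}   "
              f"{int(a)} OR {int(b)} = {int(or_(a, b))}")
```

Chain enough of these gates together and you get adders, memory, and eventually a whole processor.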


Transistor Schematics | Source: Hyperphysics Concepts

This year has seen chips built on a 7nm process, which is really, really small; you can read my blog on the iPhone XS to get a sense for how small that is.  Unfortunately, this is dangerously close to the limit of Moore’s Law: transistors are approaching the minimum size at which they can still work reliably, which means we can no longer keep doubling the number of transistors in our processors.  The exact limit is not perfectly agreed on, but around 5nm is a good estimate for the smallest we can make conventional transistors before they stop behaving.  What prevents transistors from getting smaller is a quantum effect known as tunneling, in which quantum particles, in this case electrons, can pass straight through thin barriers that should classically stop them.  When transistors get too small, electrons can tunnel across the transistor and prevent it from maintaining a stable on or off state.
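To get a feel for why a few nanometers matter so much, here’s a back-of-the-envelope Python estimate using the textbook rectangular-barrier approximation for tunneling; the 1 eV effective barrier height is an assumption I’ve picked purely for illustration, not a real device parameter:

```python
import math

# Rough tunneling estimate: for a rectangular barrier, the transmission
# probability falls off as T ~ exp(-2 * kappa * L), where
# kappa = sqrt(2 * m * (V0 - E)) / hbar.

M_E = 9.109e-31    # electron mass (kg)
HBAR = 1.055e-34   # reduced Planck constant (J*s)
EV = 1.602e-19     # one electronvolt (J)

barrier = 1.0 * EV                    # V0 - E, assumed for illustration
kappa = math.sqrt(2 * M_E * barrier) / HBAR

for width_nm in (1, 3, 5, 7):
    L = width_nm * 1e-9               # barrier width in meters
    T = math.exp(-2 * kappa * L)
    print(f"{width_nm} nm barrier: tunneling probability ~ {T:.1e}")
```

The probability drops off exponentially with barrier width, which is why tunneling is negligible at larger scales but becomes a real leakage problem as features shrink toward a few nanometers.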


I’m sure you’ve also heard of quantum computing, but you probably have only a hazy sense of what it is.  Quantum computing is fundamentally different from traditional computing: it uses qubits instead of traditional bits.  Bits are formed by the on/off states of transistors and store information in binary.  Due to a strange property of quantum physics called superposition, a qubit can exist in a combination of both states at once, only settling on a 1 or a 0 when it is measured.  Combine qubits and the effect compounds: n qubits can hold a superposition of 2^n states at the same time.  I can’t even say I understand quantum logic; perhaps you’ll fare better reading this monster.  You can take my word for it, though: quantum computers are really, really good at solving certain problems.  By steering all of those overlapping states with interference, quantum algorithms can tackle tasks like searching through an unsorted database, or sifting the possibilities in problems like the travelling salesperson problem, faster than classical brute force.  Unfortunately, practical quantum computers are far from a reality.  Scientists have been able to generate qubits, but not enough of them to make their processing any better than traditional computers.  Further, quantum states are very delicate, so scientists have to cool the machines to temperatures just above absolute zero and insulate them from all forms of interference.  Plenty of theoretical research has gone into quantum computers, though, including developing algorithms for them and designing quantum logic gates to allow them to actually perform calculations.
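For the curious, here’s a minimal Python sketch (using numpy, and very much my own toy illustration rather than how real quantum hardware is programmed) of a single qubit in superposition collapsing to 0 or 1 only when measured:

```python
import numpy as np

# A classical bit is 0 or 1; a qubit is a 2-component vector of amplitudes.
ket0 = np.array([1.0, 0.0])                    # the |0> state
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

state = H @ ket0                # equal superposition of |0> and |1>
probs = np.abs(state) ** 2      # Born rule: measurement probabilities
print("P(0), P(1):", probs)     # -> [0.5, 0.5]

# "Observation" forces a definite outcome:
rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=1000, p=probs)
print("1000 measurements:", np.bincount(samples))  # roughly 500 of each

# n qubits need 2**n amplitudes; this exponential growth is what makes
# quantum states so expensive to simulate on classical machines.
two_qubits = np.kron(state, state)  # 4 amplitudes for 2 qubits
print("2-qubit state:", two_qubits)
```

Note that this simulation runs on a classical computer; a real quantum machine gets the 2^n bookkeeping for free from physics.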

Quantum Computer | Source: Cosmos Magazine

Quantum computers would make certain very particular, large-scale computations dramatically faster and more efficient, which could lead to breakthroughs in science and the development of unimaginable new technologies; however, I don’t expect to see practical quantum computers until maybe 2030 at the earliest.


If you want to see this very general intro to what quantum computing is and why it is important in a different medium, check out this video from my favorite YouTube channel, Kurzgesagt.  Their video was the main inspiration for much of the information in this post.