Michael Trotter-Lawson

The Future of Computers

Let's explore the future of computing. In the last 50 years, computers have evolved so rapidly that many of the most prolific and capable computer engineers of the 60s and 70s would find modern devices totally foreign. This pace of advancement follows a well-known pattern called Moore’s Law: the observation that the number of transistors in an integrated circuit doubles about every two years. Moore’s Law is not actually a law; rather, it is an observation and projection of a historical trend. In layman’s terms, it means that processing power doubles roughly every two years, a trend that has held remarkably consistently since 1975. However, Moore’s Law is about to face its greatest challenge: the actual laws of physics.
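To make that doubling concrete, here is a minimal sketch of Moore’s Law as a compounding curve. The starting point (roughly 6,000 transistors for a mid-70s microprocessor) is a rough historical figure used purely for illustration:

```python
# A toy projection of Moore's Law as a compounding doubling curve.
# The base count (~6,000 transistors, mid-70s microprocessor) is a
# rough historical figure, used here purely for illustration.
def transistors(year, base_year=1974, base_count=6_000, doubling_years=2):
    """Project transistor counts, assuming a doubling every two years."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1974, 1984, 1994, 2004, 2014, 2024):
    print(f"{year}: ~{transistors(year):,.0f} transistors")
```

Fifty years of doubling every two years works out to a factor of about 33 million, which is why chips went from thousands of transistors to tens of billions.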


Technically, microprocessor architects report that semiconductor advancement has slowed industry-wide since around 2010, falling slightly below the pace Moore’s Law predicts, and in September 2022, Nvidia CEO Jensen Huang declared Moore’s Law dead. Many industry professionals, however, still view the law as alive and well. The greater issue at play is the way Moore’s Law functions. If you remember the wording from before, it is the “number of transistors in an integrated circuit” that doubles every two years. This has been possible because we have gotten exponentially better at shrinking transistors over the past few decades, but we are approaching a point where transistors are nearing the size of individual atoms. At that scale, reliably manufacturing anything is nearly impossible. This means that we need to find new ways to make computing more efficient.


This is where quantum computing and analog computing enter the mix. Quantum computing is a new, still-developing technology that has been largely theoretical for decades, while analog computing is actually an older technology being revisited as digital computers start to reach their limits. Both fields are likely to be vital to the growth and development of computing for coming generations, and each has a very different role to play. Quantum computing is the more natural progression of computing technology, using quantum mechanics to unlock a new tier of computing power. Analog computing, however, takes a step back in time to take advantage of physical mechanisms that can circumvent specific limitations of digital computers.


So, what is an analog computer? Basically, where digital computers use ones and zeros to simulate scenarios and phenomena, analog computers use aspects of physical phenomena to model the problem they are designed to solve. If you wanted an analog computer that could accurately predict tides, for instance, you wouldn’t write computer code; you would instead build a machine that was perfectly analogous to the tides, so that whatever the tides did, the machine did too. That’s exactly what William Ferrel did back in the late 19th century, and tide-predicting machines of this kind were how Allied forces knew the tides for the famous Normandy landings some sixty years later. Analog computers like Ferrel’s fell out of favor once digital computers became significantly more powerful in the 60s and 70s. Digital computers offer far more functionality and flexibility than their analog counterparts; you’ll never find an analog computer that can play Pong, edit a Word document, and open a PDF, for instance.
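For the curious, here is a toy digital rendering of what a mechanical tide predictor did physically: it summed a handful of cosine waves, one per tidal constituent, using gears and pulleys. The constituent speeds below are real astronomical values, but the amplitudes and phases are invented for illustration, not the harmonic constants of any actual port:

```python
import math

# Toy tide prediction by harmonic synthesis: each tidal constituent is
# a cosine wave (amplitude, speed in degrees/hour, phase in degrees).
# Speeds are real astronomical constants; amplitudes and phases here
# are made up for illustration.
CONSTITUENTS = [
    (1.2, 28.984, 0.0),   # M2: principal lunar semidiurnal
    (0.5, 30.000, 45.0),  # S2: principal solar semidiurnal
    (0.3, 15.041, 90.0),  # K1: lunisolar diurnal
]

def tide_height(hours, mean_level=2.0):
    """Predicted tide height: mean sea level plus summed cosine waves."""
    return mean_level + sum(
        amp * math.cos(math.radians(speed * hours + phase))
        for amp, speed, phase in CONSTITUENTS
    )

for h in range(0, 25, 6):
    print(f"t = {h:2d} h: {tide_height(h):.2f} m")
```

A mechanical predictor computed exactly this sum, except the cosines came from rotating cranks and the addition came from a wire running over pulleys.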


If that is the case, why on Earth would we ever go back to analog computers for anything? Well, despite their limitations, analog computers are very good at their specific, respective jobs. An analog computer can be faster, more powerful, and significantly more energy efficient than a digital computer, but only for a specific task. We use digital computers almost exclusively today because they are multi-purpose and exact, areas where analog computers cannot compete. However, we now have a new purpose for analog computers: artificial intelligence. I've written about AI before here, so all I'll say for this story is that AI programs primarily work off neural networks; the larger the network, the smarter the AI. These networks require a ridiculous amount of math to function at the scale we want them to, and that is where analog computers come in. As mentioned, analog computers are faster, more powerful, and more energy efficient at performing a single task, so if you need billions of multiplication operations solved, as these neural networks do, analog may be the way to go.
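To see just how much multiplication is involved, here is a minimal sketch of the arithmetic a single neural-network layer performs. The layer sizes are arbitrary, and the comment about analog hardware describes the general idea rather than any specific chip:

```python
import numpy as np

# The core arithmetic of one neural-network layer: a matrix-vector
# multiply followed by a nonlinearity. An analog accelerator would do
# the multiply-accumulate step in the physical domain (e.g., currents
# summing on a wire) instead of digit by digit.
rng = np.random.default_rng(0)
weights = rng.normal(size=(512, 784))   # 512 neurons, 784 inputs
inputs = rng.normal(size=784)

# 512 x 784 = ~400,000 multiplications for this one small layer;
# large models chain thousands of far bigger layers like it.
activations = np.maximum(weights @ inputs, 0.0)  # ReLU activation
print(activations.shape, f"{weights.size:,} multiplications")
```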


Analog computers may be the missing piece for improving the capabilities of artificial intelligence, but what about the many other functions we need computers to fulfill? This takes us to the extremely complex and confusing world of quantum mechanics. Digital computers, no matter how powerful or complicated, boil down to a series of trillions of ones and zeros. It is a binary system: on or off, yes or no. Quantum computing seeks to change that by adding additional states of being to the basic, bedrock code of a computer, theoretically increasing the computational capability of a device exponentially. However, it’s not that simple. Quantum computing is built on the qubit, which, unlike the traditional bit, can exist in a superposition of its two "basis" states, which loosely means that it is in both states simultaneously. What does that mean for quantum computers? Well, two qubits can theoretically represent the equivalent of four bits’ worth of states, and that capacity grows exponentially: just 25 qubits correspond to 2^25 = 33,554,432 combinations, and 300 qubits correspond to more states than there are particles in the known universe.
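The arithmetic in that last sentence is easy to check yourself; note that the 10^80 figure for particles in the observable universe is a common rough estimate, not an exact count:

```python
# n qubits span 2**n basis states, so describing their joint state
# classically takes 2**n amplitudes.
for n in (2, 25, 300):
    print(f"{n:3d} qubits -> 2**{n} = {2**n:.3e} basis states")

# ~10**80 is a common rough estimate for the number of particles in
# the observable universe; 2**300 dwarfs it.
print(2**300 > 10**80)  # True
```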


There’s always a catch, though. Qubits can exist in multiple states simultaneously, but eventually, for the data to be usable by any computer, even a quantum one, each qubit must be measured in one of its two possible states. Without delving too deep into the intricacies of quantum mechanics, this limitation means that quantum computing is not a proper replacement for traditional binary computing. That does not, however, mean that quantum computing is without its uses. A properly functioning quantum computer could shatter the current state of cryptography, ruining the security and privacy of web pages, encrypted email, and many other types of data. This naturally has massive ramifications for electronic privacy and security in a post-quantum reality. The good (and bad) news is that quantum computing is still highly theoretical; the field faces many challenges before we have viable quantum computers, and engineers and security experts are actively, preemptively researching post-quantum cryptography.
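As a toy illustration (emphatically not real cryptography), here is why that threat is so direct: in RSA-style encryption, the private key falls out immediately once the public modulus n = p × q is factored, and Shor’s algorithm, run on a large enough quantum computer, factors efficiently:

```python
from math import gcd

# Toy RSA with tiny primes (real keys use ~1024-bit primes). Security
# rests entirely on n = p * q being hard to factor; Shor's algorithm
# on a sufficiently large quantum computer removes that hardness.
p, q = 61, 53
n, e = p * q, 17           # public key (n, e)

phi = (p - 1) * (q - 1)    # knowing p and q is knowing phi(n)
assert gcd(e, phi) == 1
d = pow(e, -1, phi)        # private exponent (Python 3.8+ modular inverse)

message = 42
cipher = pow(message, e, n)
print(pow(cipher, d, n) == message)  # True: decryption round-trips
```

An attacker who can factor n recomputes phi and d exactly as above, which is why post-quantum cryptography is built on different mathematical problems entirely.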


The future of computing is simultaneously bright and full of possibilities, while also being filled with mystery and theory. There is the resurgence of analog computing and its application to new areas of research, specifically artificial intelligence and machine learning, as well as the exciting potential that lies in the field of quantum computing. As technology continues to advance toward the eventual end of Moore’s Law, we can only imagine what new frontiers will have to be unlocked in the coming years. However, as we can already see with the rise of AI like ChatGPT, future computing carries serious ethical implications. If analog computing can take us to the next tier of machine learning, and quantum computing will eventually break all currently existing forms of encryption, then the future of computing presents a lot of unknowns at best, and looks extremely bleak at worst.
