
The ONE thing stopping you from becoming an AI Supercomputer


AI has taken our lives by storm.

My earliest brush with AI was watching the movie I, Robot with my parents. It was a remarkable film, especially for its time, but burdened with living up to the expectations of a summer blockbuster, it leaned heavily on the familiar trope of something built for the good of mankind turning against it. Still, it raised the possibility of a world with AI that felt achievable and not too far off.

I, Robot raised questions that were extremely pertinent for the time of its release.

Today, numerous AI models are being built to make this I, Robot dream a reality. Google and Samsung are among the heavyweights commercialising them through Machine Learning, Deep Learning and the newest kid on the block, Generative AI. Thanks to the latter, we can now Photoshop a cat onto a pizza in the blink of an eye!

Humans! The blueprint

Have you ever stopped to think: where are these giants deriving their next big idea from?

The answer: Humans!

As living, breathing, average Homo sapiens, we are also, coincidentally, the pinnacle of AI evolution. We can analyse, interpret, process and generate information from any prompt given to us. In addition, we come with in-built reasoning and opinion-forming skills, and our ability to adapt only grows with time. Until yesterday, Google could look at a photo and identify the Eiffel Tower. Today, Google can generate a convincing Eiffel Tower next to you in a photo.

We learn more from practice than from books and theory. What truly sets humans apart, though, is our ability to learn from our mistakes. That is something that has not yet been programmed into AI: though it may be better at identifying mistakes, it cannot yet teach itself from them.

The ONE thing

Could you guess it? The answer is memory. Humans have limited random-access memory (RAM) and limited read-only memory (ROM), so to speak, and the amount we have gradually declines with age. In computers, by contrast, memory is the one thing that keeps growing exponentially.

Ten years ago, a single gigabyte was considered unnaturally humongous. Today, a single video can run to multiple gigabytes, and with cloud-based storage the potential is practically infinite. An AI supercomputer can load the entire recorded history of the universe in seconds and analyse it for relevant data in a few more. The limits of our memory, both in capacity and in how we can utilise it, are where these machines have a distinct advantage over us. When our RAM is clogged, we usually need a good night’s sleep to clear it; a supercomputer needs a few seconds and a reboot button.

Unfortunately, we cannot simply add more neurons. Our RAM and ROM are fixed, and some of it is always occupied, with no way to hard-reset it and free up space.

So what’s the takeaway?

Our limited memory may keep us from beating an AI supercomputer in a 100-metre sprint of raw recall. However, everything the computer knows is built on our knowledge and our understanding of the universe as we know it.

Our insight into this one limitation gives us the means to work around it: with digital calendars, digital notes and reminder systems, we have harnessed computers to bridge the gap. Our ability to tell right from wrong and good from bad is not based on an algorithm, and though we will happily use AI for creature comforts, when in despair we will still seek the help of another human.


Enjoyed the read? Consider subscribing to my newsletter, ❄️ Freezethawed, to get a weekly update on thoughts I’ve been pondering, insights I’ve been exposed to and maybe some interesting facts I’ve encountered, delivered straight to your inbox. Click here: ❄️ Freezethawed: The Newsletter

Have you followed me on LinkedIn? Read my insights on cataract surgery and its potential risks and considerations for the retina.