The Tech Revolution: 10 Predictions about AI, the Internet, and Possibly Our World in 3 Years
Here's What the Tech Ecosystem Might Look Like When You Blink Twice
In the wake of engaging discussions with my good friends Blixt and Luis, I felt inspired to gaze into the crystal ball of technology. Their insights sparked my curiosity, leading me to the daunting yet fascinating task of forecasting the near future.
As someone who has witnessed first-hand the sweeping transformations brought about by the internet rollout, the rise of Linux, the birth of cloud computing, the advent of smartphones, and the explosion of social networks, I now venture into the realm of forecasts for the next big leap - the evolution of Large Language Models (LLMs) and Generative AI.
But remember, when it comes to my predictions, I'd advise taking them with the proverbial grain of salt. I'm the same guy who, back in 1999, boldly declared that a company with a quirky name like Google would never make it. On the other hand, in 2010, I did prophesy that the future of deployments and packaging would be a platformized chroot (Hello, Docker!).
With that in mind, I present my 10 predictions for the next 3 years, outlining how the AI landscape might transform in the not-so-distant future. This isn't just wild guessing - it's extrapolation from current trends and an attempt to peek into a future that's just around the corner. So, strap in and join me as we delve into the exciting possibilities that lie ahead.
1. LLMs will become as essential and omnipresent as your morning coffee
Picture this - a world where Large Language Models (LLMs) are as widespread as smartphones are today. These AI systems will become the primary interface for interacting with knowledge, rendering conventional search engines almost obsolete. Instead of scouring countless articles, abstracts, and papers to quench your thirst for knowledge, you'll hold intuitive conversations with an LLM, leveraging its ability to provide elaborate, nuanced, and context-appropriate responses. Every query, whether it's about the culinary culture of Vietnam or the latest advancements in cancer research, will be a chat away, making human-computer interaction feel less like operating a machine and more like talking to an expert.
2. Prepare to see LLMs breaking free from the shackles of the (online) internet
As technology evolves, our reliance on the internet for non-real-time knowledge will diminish. Before long, most LLMs will function offline, freeing us from the need for constant internet connectivity. Just as we download software updates to our devices, these LLMs will receive periodic updates, allowing them to stay abreast of an ever-evolving world. Imagine having the collective knowledge of humanity accessible from your device, anytime, anywhere, even in the remotest corners of the Earth - a veritable Library of Alexandria in your pocket.
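To make this a little more concrete, here is a minimal sketch of what that "library in your pocket" could look like right now, assuming you've already downloaded a quantized open-weights model file to disk. The model path and prompt are purely illustrative, and the llama-cpp-python bindings are just one of several ways to run a model with no network connection at all.

```python
# A minimal sketch of the offline-LLM idea. Assumes a quantized open-weights
# model (a .gguf file) has already been downloaded to local storage; the
# path and prompt below are illustrative, not a recommendation.
from llama_cpp import Llama

# Load the model entirely from disk; no internet connection required.
llm = Llama(model_path="./models/local-model.gguf", n_ctx=4096)

# Ask a question the way you'd query a search engine today.
response = llm(
    "Give me a short overview of Vietnamese culinary culture.",
    max_tokens=256,
)
print(response["choices"][0]["text"])
```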
3. Specialized Generative AI will eclipse the era of one-size-fits-all AI models
Generative AI models will no longer be Jacks-of-all-trades. Instead, they will develop into specialists, proficient in providing in-depth insights into specific domains. From arts to sciences, business to sports, these AIs will cater to every niche. Just like selecting a channel on your TV, you'll choose the AI best suited for your current need. The future isn't far off when you'll summon a culinary AI to perfect your béchamel sauce, or a history AI to explore the socio-political undercurrents of the French Revolution.
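If you want a feel for what "choosing the AI best suited for your current need" might look like under the hood, here's a toy sketch of a specialist registry. Everything in it, the domains, the model names, the fallback, is hypothetical; it only illustrates the routing idea, not any real product.

```python
# A toy sketch of routing requests to specialist models.
# All domain keys and model names are hypothetical.
SPECIALISTS = {
    "cooking": "culinary-llm-7b",
    "history": "history-llm-13b",
    "medicine": "medical-llm-70b",
}

def pick_specialist(domain: str) -> str:
    """Return the specialized model registered for a domain,
    falling back to a generalist when no specialist exists."""
    return SPECIALISTS.get(domain, "generalist-llm")

print(pick_specialist("cooking"))  # culinary-llm-7b, for that bechamel sauce
print(pick_specialist("sports"))   # generalist-llm
```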
4. Wave goodbye to Client-Server Computing as powerful personal devices steal the show
The traditional client-server computing model, the backbone of the digital world, will recede into the shadows. As chipmakers like NVIDIA and Qualcomm keep packing more compute into personal devices and LLMs become more efficient, the need to reach out to distant servers to fetch knowledge will vanish. With AI powerhouses residing on our devices, data processing and knowledge generation will happen right where we are, turning each device into a personal knowledge hub.
5. Get ready for a new wave of monetization as advertisers take on the offline shift (or suffer)
In a world where interactions predominantly happen offline, the traditional revenue channels of online advertising will face a drastic overhaul. The billion-dollar online advertising industry will be challenged to either find a way to transition offline or risk obsolescence. As LLMs shift from being data consumers to data producers, the onus of revenue generation will fall on monetization strategies that fit this new AI paradigm, and ad giants like Facebook and Google will need to devise them.
6. In the age of AI partnerships and enhanced APIs, browsers will no longer be your go-to
The future is ripe for a seismic shift in how we interact with service providers. Complex tasks like hotel bookings, restaurant reservations, or organizing shipments will no longer involve browsing multiple websites. Instead, users will interact with enhanced APIs that collaborate with their locally running AI models. This AI-powered approach will streamline these tasks and make the overall experience more personalized and efficient.
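As a rough illustration of what "enhanced APIs that collaborate with local AI models" could mean in practice, here is a hedged sketch in the style of tool calling. The booking endpoint, its parameters, and the provider are all hypothetical; a real integration would follow whatever schema the service provider actually publishes.

```python
# A speculative sketch of a local AI invoking a provider's API directly.
# The endpoint URL and request schema are hypothetical placeholders.
import json
import urllib.request

def book_hotel(city: str, check_in: str, nights: int) -> dict:
    """Call a (hypothetical) hotel-booking API on the user's behalf."""
    payload = json.dumps(
        {"city": city, "check_in": check_in, "nights": nights}
    ).encode("utf-8")
    req = urllib.request.Request(
        "https://api.example-hotels.test/v1/bookings",  # placeholder URL
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# In this picture, the locally running LLM turns "book me two nights in Hanoi"
# into structured arguments and calls the tool itself; no browser involved.
# booking = book_hotel("Hanoi", "2026-05-10", nights=2)
```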
7. Expect a new marketplace for AI integrations - a one-stop shop for all generative AI needs
We'll see the emergence of an extensive marketplace for AI integrations. This marketplace will serve as a hub for offline core services, specialized model downloads, and real-time service interactions. Just as app stores revolutionized software distribution, this marketplace will democratize access to AI capabilities, enabling users to enrich their personal AI models with specialized skills and functionalities.
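Purely as a thought experiment, installing something from such a marketplace might feel a lot like using a package manager. The sketch below is speculative from top to bottom: the catalog, the integration kinds, and the URLs are all made up to illustrate the shape of the idea.

```python
# A speculative sketch of a marketplace client for AI integrations,
# modeled on package managers. Every name and URL here is hypothetical.
from dataclasses import dataclass

@dataclass
class Integration:
    name: str
    kind: str          # e.g. "specialist-model", "offline-service", "live-api"
    version: str
    download_url: str

# A tiny in-memory stand-in for the marketplace catalog.
CATALOG = [
    Integration("culinary-pack", "specialist-model", "1.2.0",
                "https://marketplace.example/culinary-pack-1.2.0.bin"),
    Integration("hotel-booking", "live-api", "0.9.1",
                "https://marketplace.example/hotel-booking-0.9.1.json"),
]

def install(name: str) -> Integration:
    """Find an integration in the catalog; actually downloading it and
    attaching it to the local model is left out of this sketch."""
    for item in CATALOG:
        if item.name == name:
            print(f"Installing {item.name} {item.version} ({item.kind})")
            return item
    raise KeyError(f"No integration named {name!r} in the catalog")

install("culinary-pack")
```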
8. The internet is due for a makeover: less corporate monotony, more personal expression
With the fading significance of corporate websites, the internet will transform into a more personal space for expression. As AI-powered interfaces gain prominence, the internet will revert to a more user-centric model, reminiscent of its early days. But this time, it will be bolstered by advanced AI capabilities that provide users with a more personalized and enriching experience.
9. The race for edge computing will become the next frontier for tech dominance
As AI continues its march towards the edges of networks, device manufacturers and operating system builders will find themselves on the frontlines of this new battleground. Companies like Microsoft and Apple that understand the value of the 'edge' and hold computational advantages may find themselves in a favorable position in this new landscape. The race for dominance in the new age of computing will not just be about who has the most data, but also about who controls the edge.
10. Content creators are set to become the new knowledge trailblazers in the AI landscape
Content creators will find new avenues in the AI era, enriching LLMs with their specialized knowledge. Traditional content like research papers, novels, and technical documentation will be transformed into downloadable modules that feed directly into LLMs. This evolution will empower content creators to reach a wider audience, allowing their expertise to permeate the global knowledge ecosystem via AI interfaces.
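One plausible route, sketched below with no claim that this is how it will actually work, is for creators to publish their writing as a small "knowledge pack" that a local LLM can search before answering. The file names are hypothetical, and the word-overlap scoring is a crude stand-in for a real embedding index.

```python
# A rough sketch of packaging a creator's documents for a local LLM.
# File names are hypothetical; word-overlap scoring stands in for a
# proper embedding-based retrieval index.
from pathlib import Path

def chunk_text(text: str, size: int = 500) -> list[str]:
    """Split a document into fixed-size character chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def build_pack(paths: list[str]) -> list[str]:
    """Read the creator's documents and return a flat list of chunks."""
    chunks: list[str] = []
    for p in paths:
        chunks.extend(chunk_text(Path(p).read_text(encoding="utf-8")))
    return chunks

def retrieve(pack: list[str], query: str, k: int = 3) -> list[str]:
    """Return the k chunks sharing the most words with the query;
    the local LLM would ground its answer in these passages."""
    q = set(query.lower().split())
    return sorted(pack, key=lambda c: len(q & set(c.lower().split())),
                  reverse=True)[:k]

# pack = build_pack(["my_paper.txt", "my_field_notes.txt"])  # hypothetical files
# print(retrieve(pack, "socio-political undercurrents of the French Revolution"))
```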
As I began penning this article, I was convinced that the transformation I was about to predict would unfold over the next decade - a vision of the future that would materialize slowly but surely. However, the deeper I delved into the nuances of these technological advancements, the faster my estimated timeline contracted. Midway through, I found myself looking at a mere 5-year horizon. And by the time I reached the end, I had shortened the time frame to just 3 years. Yet even as I write this conclusion, I can't help but question whether I've still been too conservative. Could it be that this breathtaking transformation, reshaping AI, the internet, our world, and possibly our reality, is poised to occur even faster? Only time will tell. But one thing is clear - change is coming, and it's coming quicker than we think.