Gen AI hype will compress in 2024 says DataStax
Wednesday, December 20, 2023
Richard Harris
Patrick McFadin from DataStax shares his 2024 predictions for AI, including why the GenAI hype circle will compress, the rise of dark LLMs, the race to be the best LLM, the privacy concerns from GenAI, and that the next big thing in generative AI won't be a single product, plus much more.
There will be a next-level renaissance around open data and open source when it comes to generative AI.
We have gone from open source being software you can download for free to wanting to know what's in your food. People are severely freaked out about the privacy aspect of GenAI. They worry they have lost their privacy, and they are unsure where the copyrighted training data comes from. There will be a revolt against the hyperscalers, OpenAI, AWS, and Google, because of this.
The more you grip, the more they'll slip away.
Regulating open source will be almost impossible. Once things are out there and open, you can't regulate them. The open-source LLMs exist, they're being trained, and they're starting to get better than GPT-4, which makes them even harder to regulate.
This is the next Cold War
There is a race among the world's biggest governments and data centers over who will have the biggest LLM, and the worry is that they will try to weaponize private data in some way against their citizens. This is already happening at some scale - governments regulate information constantly, sometimes for very good reasons. For example, you cannot download plans to build an atomic bomb or the formula for sarin gas.
Heads up, consumers - just like the dark web, there are dark LLMs
Users have to be more careful when using LLMs. Smart consumers aren’t just going to use an LLM without asking the proper questions.
Before the 1990s, why did dog poop turn white?
It had to do with the insane amount of calcium filler that they put into dog food. And there was this whole shift in consumer behavior, and consumers became educated about what they were feeding their dogs, and then white dog poop disappeared. This will happen with LLMs as well. When consumers become educated and aware, they change their habits, and we will see this as consumers choose the right LLM for their generative AI stack and avoid the “dark LLMs” that are on the market.
The next big thing in generative AI won't be a single product
It will be something integrated into a product we are already using, and that is what everyone will be working on in the coming year. A good example of this would be what Adobe is doing right now. In Photoshop, they have integrated AI as a new menu item. You're still using Photoshop, but now with Generative Fill, you can highlight something and say, "remove the mountain and add a castle," and you will have a medieval landscape. But not every person uses Photoshop. It is a good example of what could be, but I don't think it is in killer app territory yet.
We are going to see the Instagrams of Gen AI apps emerge in 2024
When the iPhone came out, everybody went crazy for apps and thousands came out at once, but many weren't successful. Eventually, we had the real winners pop up, apps like Instagram. In 2024, we are going to see the Instagrams of gen AI apps start to emerge as well. Out of the thousands of GenAI apps available now, we will see leaders come into view. The field will start washing out quickly.
The Generative AI hype cycle will compress
We've already seen adoption along the hype cycle compress. We're already in the trough of disillusionment, and we're going to reach the plateau of productivity very fast. What might have taken two years is now taking a year.
We're headed more towards a generative AI monopoly
Hyperscalers will continue to innovate and own generative AI because they are absorbing GenAI startups' business plans and ideas by incorporating them into their own products. And by consolidating quickly, they will render those AI startups obsolete. We are heading towards limited choices when it comes to building out our AI stack. As a result, there's going to be quick regulatory pressure from the EU, US Congress, and so on, but it will be too late.
There is going to be a rapid shift from infrastructure-based Gen AI to local Gen AI
Right now, locally run generative AI is largely out of reach. The average startup doesn't have thousands of dollars to throw at a cloud provider, and it is nearly impossible to run an LLM by yourself, but that is changing quickly with innovations around local generative AI. With a local model, you will have a complete RAG stack under your control. That way, you won't have to expose your proprietary data in any way. When we go from centralized, API-based LLMs to local LLMs, adoption will happen quickly - the ones that work will spread like wildfire. Just be mindful of the downside, as decentralized LLMs can introduce the concept of bad actors in the loop.
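The "complete RAG stack under your control" idea can be sketched in a few lines. This is a toy illustration, not DataStax's implementation: the word-overlap retriever stands in for a local vector index, and `local_llm` is a placeholder for a locally hosted open model (served, for example, via a runtime like llama.cpp). The point is that both retrieval and generation run on your own machine, so proprietary data never crosses a third-party API.

```python
def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Rank documents by naive word overlap with the query.
    (Toy retriever; a real local stack would use vector embeddings.)"""
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str, context: list[str]) -> str:
    """Assemble retrieved context and the question into one prompt.
    The proprietary documents stay on the local machine."""
    ctx = "\n".join(f"- {c}" for c in context)
    return f"Answer using only this context:\n{ctx}\n\nQuestion: {query}"

def local_llm(prompt: str) -> str:
    """Placeholder for a call to a locally hosted model."""
    return f"[local model would answer here; prompt was {len(prompt)} chars]"

docs = [
    "Our Q3 revenue grew 12 percent year over year.",
    "The office coffee machine is on the third floor.",
]
query = "How did Q3 revenue change?"
answer = local_llm(build_prompt(query, retrieve(query, docs)))
print(answer)
```

Swapping the placeholder for a real local model is the only change needed to make this a private, end-to-end stack; the retrieval and prompt-assembly steps are already fully under your control.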