AI hallucinations could be problematic in 2024, says Couchbase

Posted on Thursday, December 7, 2023 by RICHARD HARRIS, Executive Editor

Rahul Pradhan, VP of Product & Strategy at Couchbase, shares his predictions for 2024: how real-time data will become the standard for businesses to power generative experiences with AI, how data layers should support both transactional and real-time analytics, why multimodal LLMs and databases will enable a new frontier of AI apps across industries, what paradigm shift to expect from model-centric to data-centric AI, and how retrieval-augmented generation will be paramount for grounded, contextual outputs when leveraging AI in 2024.

Real-time data will become the standard for businesses to power generative experiences with AI; Data layers should support both transactional and real-time analytics

  • The explosive growth of generative AI in 2023 will continue into 2024. Increasingly, enterprises will integrate generative AI to power real-time data applications and create dynamic, adaptive AI-powered solutions. As AI becomes business-critical, organizations need to ensure the data underpinning AI models is grounded in truth and reality by leveraging data that is as fresh as possible.
  • Just like food, gift cards, and medicine, data has an expiration date. For generative AI to be truly effective, accurate, and contextually relevant, it needs to be built on real-time, continually updated data. The growing appetite for real-time insights will drive the adoption of technologies that enable real-time data processing and analytics. In 2024 and beyond, businesses will increasingly leverage a data layer that supports both transactional and real-time analytics to make timely decisions and respond to market dynamics instantaneously (see the sketch after this list).
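As a minimal illustration of the "data expiration" idea above, the Python sketch below filters out records that fall outside a freshness window before they are used to ground a generative request. The MAX_AGE value, the record shape, and the fresh_only helper are assumptions for illustration only, not a Couchbase API.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical freshness window; the right value depends on the business domain.
MAX_AGE = timedelta(minutes=15)

def fresh_only(records: list[dict], now: datetime | None = None) -> list[dict]:
    """Keep only records updated within the freshness window, so generative
    output is grounded in current state rather than expired data."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["updated_at"] <= MAX_AGE]

# Example: only the recently updated inventory record survives the filter.
inventory = [
    {"sku": "A1", "qty": 4, "updated_at": datetime.now(timezone.utc)},
    {"sku": "B2", "qty": 0, "updated_at": datetime.now(timezone.utc) - timedelta(hours=6)},
]
print(fresh_only(inventory))
```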

Multimodal LLMs and databases will enable a new frontier of AI apps across industries

  • One of the most exciting trends for 2024 will be the rise of multimodal LLMs. With this emergence, the need for multimodal databases that can store, manage, and allow efficient querying across diverse data types has grown. However, the size and complexity of multimodal datasets pose a challenge for traditional databases, which are typically designed to store and query a single type of data, such as text or images. 
  • Multimodal databases, on the other hand, are much more versatile and powerful. They represent a natural progression in the evolution of LLMs to incorporate the different aspects of processing and understanding information across multiple modalities such as text, images, audio, and video. Several use cases and industries will benefit directly from the multimodal approach, including healthcare, robotics, e-commerce, education, retail, and gaming. Multimodal databases will see significant growth and investment in 2024 and beyond, enabling businesses to continue driving AI-powered applications (a toy sketch follows below).
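To make the idea of a multimodal database concrete, here is a toy, in-memory sketch of heterogeneous records (text plus image or audio blobs referenced by URI) indexed in a shared vector space and queried by similarity. The MultimodalRecord and MultimodalIndex names, the 8-dimensional embedding space, and the example data are hypothetical; a production system would rely on a real multimodal embedding model and a purpose-built database rather than a Python list.

```python
import numpy as np
from dataclasses import dataclass, field

# Hypothetical unified embedding space; a real multimodal model would map
# text, images, and audio into comparable vectors.
DIM = 8

@dataclass
class MultimodalRecord:
    doc_id: str
    modality: str          # "text", "image", "audio", ...
    payload: str           # raw text, or a URI to the image/audio blob
    vector: np.ndarray = field(default_factory=lambda: np.zeros(DIM))

class MultimodalIndex:
    """Toy store that keeps heterogeneous records queryable by vector similarity."""
    def __init__(self) -> None:
        self.records: list[MultimodalRecord] = []

    def add(self, record: MultimodalRecord) -> None:
        self.records.append(record)

    def search(self, query_vector: np.ndarray, k: int = 3) -> list[MultimodalRecord]:
        # Rank records by dot-product similarity to the query vector.
        scored = sorted(self.records, key=lambda r: -float(np.dot(query_vector, r.vector)))
        return scored[:k]

index = MultimodalIndex()
index.add(MultimodalRecord("p1", "text", "Red running shoes, size 10", np.ones(DIM)))
index.add(MultimodalRecord("p1-img", "image", "s3://catalog/p1.jpg", np.ones(DIM) * 0.9))
print([r.doc_id for r in index.search(np.ones(DIM))])
```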

Expect a paradigm shift from model-centric to data-centric AI

  • Data is key in modern machine learning, but it is often poorly curated and managed in AI projects. Because today's AI takes a model-centric approach, hundreds of hours are wasted tuning models built on low-quality data.
  • As AI models mature, evolve, and multiply, the focus will shift to bringing models closer to the data rather than the other way around. Data-centric AI will enable organizations to deliver both generative and predictive experiences that are grounded in the freshest data. This will significantly improve the output of the models while reducing hallucinations.

Retrieval-augmented generation will be paramount for grounded, contextual outputs when leveraging AI

  • The excitement around large language models and their generative capabilities will continue to bring with it a problematic phenomenon of model hallucinations. These are instances when models produce outputs that, though coherent, might be detached from factual reality or the input’s context.
  • As modern enterprises move forward, it will be important to demystify AI hallucinations and implement an emerging technique called Retrieval-Augmented Generation (RAG) that, when coupled with real-time contextual data, can reduce these hallucinations and improve the accuracy and value of the model. RAG brings in context about the business or the user, reducing hallucinations and increasing truthfulness and usefulness (a minimal sketch of the pattern follows below).
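A minimal sketch of the RAG pattern described above, assuming a small in-memory document set: retrieve the documents most similar to the user's query, then inject them as context ahead of the question so the model answers from grounded, business-specific data. The embed function is a stand-in for a real embedding model, and the retrieve and build_prompt helpers are hypothetical names, not part of any specific product's API.

```python
import numpy as np

# Stand-in for a real embedding model: deterministic pseudo-random vectors.
def embed(text: str) -> np.ndarray:
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(8)

# A small "knowledge base" of fresh, business-specific documents.
documents = [
    "Q4 return policy: customers may return items within 30 days.",
    "The Springfield warehouse ships orders within 24 hours.",
    "Support hours are 8am-6pm CST, Monday through Friday.",
]
doc_vectors = [embed(d) for d in documents]

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the query (cosine similarity)."""
    q = embed(query)
    scores = [float(np.dot(q, v) / (np.linalg.norm(q) * np.linalg.norm(v)))
              for v in doc_vectors]
    ranked = sorted(zip(scores, documents), reverse=True)
    return [doc for _, doc in ranked[:k]]

def build_prompt(query: str) -> str:
    """Ground the model by injecting retrieved context ahead of the question."""
    context = "\n".join(retrieve(query))
    return (
        "Answer using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {query}"
    )

print(build_prompt("How long do customers have to return an item?"))
```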

About Rahul Pradhan

Rahul Pradhan is VP of Product and Strategy at Couchbase. He has over 16 years of experience leading and managing both Engineering and Product teams across Storage, Networking, and Security domains. Most recently, he led the Product Management and Business Strategy team for Dell EMC's Emerging Technologies and Midrange Storage Divisions to bring all-flash NVMe, Cloud, and SDS products to market. Before that, he was a Principal Software Engineer at Nortel Networks.
