
Dataiku Introduces LLM Mesh with Key Partners 

As more companies experiment with Generative AI technology, a crucial question emerges: Can applications that use Generative AI be safe and scalable for an enterprise? With the LLM Mesh, the answer is yes. Organizations can use the LLM Mesh to build enterprise-level applications while mitigating concerns about cost management, technological dependencies, and compliance.

Dataiku, the artificial intelligence and machine learning company behind Everyday AI, unveiled the LLM Mesh at its Everyday AI Conference in New York. The platform addresses a critical need for the scalable, secure, and effective integration of Large Language Models (LLMs) in the enterprise. Dataiku also announced its LLM Mesh Launch Partners: Snowflake, Pinecone, and AI21 Labs.

Clément Stenac, Chief Technology Officer and co-founder at Dataiku said, “The LLM Mesh represents a pivotal step in AI. At Dataiku, we’re bridging the gap between the promise and reality of using Generative AI in the enterprise. We believe the LLM Mesh provides the structure and control many have sought, paving the way for safer, faster GenAI deployments that deliver real value.”  

Everyday AI by Dataiku is a daily livestream, newsletter, and podcast on the latest AI trends and tips for everyday people. The Everyday AI conferences are held worldwide to connect cutting-edge technology with enterprise applications.

While Generative AI offers many benefits and opportunities for enterprises, it also poses significant challenges. One of the key challenges is the absence of central administration: most generative AI models operate without centralized oversight or governance, which can lead to the spread of misinformation, regulatory issues, and ethical concerns.

There is also a lack of cost-monitoring mechanisms, minimal safeguards against toxic content, inadequate permission controls, and unchecked use of personally identifiable information. In addition, best practices must be established to fully harness the potential of generative AI. With the launch of the LLM Mesh, Dataiku aims to overcome some of these key challenges.

The LLM Mesh is designed to provide the components required to safely build and efficiently scale LLM-powered applications. These components include safety provisions for response moderation and private data screening, universal AI service routing, and performance and cost tracking. Standard components for application development are also included to ensure quality and consistency while maintaining control over cost and performance.

With the LLM Mesh sitting between end-user applications and LLM service providers, companies have the flexibility to choose the most cost-effective models for their needs and to adapt easily to future changes. They can also reuse components for scalable application development.
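To make that architectural idea concrete, the sketch below shows, in rough terms, what a gateway layer of this kind might look like: an application calls the gateway rather than any single provider, and the gateway applies shared data screening, moderation, and cost tracking before and after each call. This is a minimal, hypothetical illustration of the pattern, not Dataiku's implementation; all class, provider, and function names are assumptions made for the example.

```python
# Illustrative sketch of a "mesh"-style gateway sitting between an application
# and multiple LLM providers. Names are hypothetical, not Dataiku's API.
import time
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class LLMResponse:
    text: str
    provider: str
    cost_usd: float
    latency_s: float


class LLMGateway:
    """Routes prompts to a chosen provider while applying shared controls:
    private data screening, response moderation, and cost/performance tracking."""

    def __init__(self) -> None:
        self.providers: Dict[str, Callable[[str], str]] = {}
        self.cost_per_1k_tokens: Dict[str, float] = {}
        self.usage_log: List[LLMResponse] = []

    def register(self, name: str, call_fn: Callable[[str], str], cost_per_1k: float) -> None:
        # Each provider is just a callable plus a price; swapping providers
        # later does not require changes to application code.
        self.providers[name] = call_fn
        self.cost_per_1k_tokens[name] = cost_per_1k

    def _screen_private_data(self, prompt: str) -> str:
        # Placeholder screen; a real deployment would use a proper PII detector.
        for term in ("ssn:", "credit card"):
            if term in prompt.lower():
                raise ValueError("Prompt rejected: possible personal data detected")
        return prompt

    def _moderate(self, text: str) -> str:
        # Placeholder moderation hook; swap in a real toxicity classifier.
        return text

    def complete(self, provider: str, prompt: str) -> LLMResponse:
        prompt = self._screen_private_data(prompt)
        start = time.time()
        raw = self.providers[provider](prompt)
        latency = time.time() - start
        tokens = len(raw.split())  # crude token estimate for the sketch
        cost = tokens / 1000 * self.cost_per_1k_tokens[provider]
        response = LLMResponse(self._moderate(raw), provider, cost, latency)
        self.usage_log.append(response)  # central record for cost tracking
        return response


# Usage: the application targets the gateway, so the underlying model
# can be changed for cost or performance reasons without rework.
gateway = LLMGateway()
gateway.register("provider_a", lambda p: f"[provider_a answer to] {p}", cost_per_1k=0.002)
gateway.register("provider_b", lambda p: f"[provider_b answer to] {p}", cost_per_1k=0.010)
print(gateway.complete("provider_a", "Summarize our Q3 sales report.").cost_usd)
```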

With the announcement of its LLM Mesh Launch Partners, Dataiku continues its philosophy of enhancing, rather than duplicating, existing capabilities. The partnerships with Snowflake, Pinecone, and AI21 Labs cover several key components of the LLM Mesh, including vector databases, LLM builders, and containerized data and compute capabilities.

In May 2023, Teradata, a multi-cloud data giant, announced its integration with Dataiku to enable users to import and operate their Dataiku-trained AI models on Teradata's Vantage platform. While Teradata provides comprehensive predictive and prescriptive analytics, Dataiku provides a central working environment for training, developing, and managing applications.


Torsten Grabs, Senior Director of Product Management at Snowflake, shared, “We are enthusiastic about the vision of the LLM Mesh because we understand that the real value lies not only in deploying LLM-powered applications but also in democratizing AI in a secure and reliable manner. With Dataiku, we empower our mutual customers to deploy LLMs on their Snowflake data using containerized compute from Snowpark Container Services within the security confines of their Snowflake accounts. Dataiku orchestrates this process to reduce friction and complexity, accelerating business value.”

Chuck Fontana, VP of Business Development at Pinecone, said, “The LLM Mesh is not just an architectural concept; it represents a path forward. Vector databases set new standards, fueling AI applications through innovations like Retrieval Augmented Generation. Together, Dataiku and Pinecone are establishing a new benchmark, providing a framework for others in the industry to follow. This collaboration helps address the challenges faced in building enterprise-grade GenAI applications at scale, and Pinecone eagerly anticipates its role as an LLM Mesh Launch Partner.”

Pankaj Dugar, SVP and GM, North America at AI21 Labs, added, “In today’s ever-evolving technological landscape, fostering a diverse and tightly integrated ecosystem within the Generative AI stack is of paramount importance for the benefit of our customers. Our collaboration with Dataiku and the LLM Mesh underscores our commitment to diversity, ensuring enterprises can access a wide array of top-tier, flexible, and dependable LLMs. We firmly believe that diversity fuels innovation, and with Dataiku’s LLM Mesh, we are stepping into a future filled with limitless AI possibilities.”

Related Items 

Dataiku and Databricks Survey Highlights Booming ROI of AI and Future Challenges 

Dataiku Captures $400 Million Series E Round Funding to Grow Its Data Science and ML Platform
