
Pathway Launches Free LLM-App for Building Privacy-Preserving LLM Applications 

SAN FRANCISCO, Sept. 22, 2023 -- Pathway, the real-time intelligence technology specialist, today announces LLM-App, a free framework in Python for creating real-time AI applications that continuously learn from proprietary data sources. Critically, LLM-App also overcomes the data governance challenges that have stalled enterprise adoption of LLMs by building responsive AI applications from data that remains secure and undisturbed in its original storage location.

AI is better when it learns from fresh data

Enterprise adoption of Large Language Models (LLMs) for real-time decision-making on proprietary data has struggled to take off despite the boom in Generative AI. The challenge has been two-fold.

Firstly, there have been concerns over sharing intellectual property and sensitive information with open systems like ChatGPT and Bard. Secondly, the complexity of designing efficient systems that combine both batch and streaming workflows means AI applications are unable to perform incremental updates to revise preliminary results. This freezes their knowledge to a moment in time, making them unsuitable for decisions that need to be made on accurate, real-time data in industries like manufacturing, financial services and logistics.

Pathway’s LLM-App overcomes these challenges by allowing organizations to build privacy-preserving, responsive AI applications based on live data sources. It can leverage an organization’s own private LLM or public APIs to provide human-like responses to user queries based on proprietary data sources, with the data remaining secure and undisturbed in its original storage location. This means the application owner retains complete control over the input data and the application’s outputs, making it suitable even for use cases that draw on sensitive data and intellectual property.

Some of the use cases of interest relate to indexing and searching enterprise data with LLMs, for instance retrieving up-to-date information on rules, regulations or restrictions while maintaining data privacy. Another powerful use case comes from the ability to combine streaming data with information held in unstructured sources, such as PDF files, helping enterprise users unlock the full value of their data.

LLM-App also supports enterprise requirements for putting LLMs into production, such as LLM monitoring, cost monitoring of token use, and the ability to A/B test models based on business needs.

Cut complexity to build in under 30 lines of code

LLM-App significantly reduces the complexity associated with building LLM applications. Applications can be built in under 30 lines of code, without a separate vector database, and do not require the complex and fragmented stacks typical of LLM projects (such as Pinecone/Weaviate + LangChain + Redis + FastAPI, etc.).

Instead, LLM-App processes and organizes documents, which can be stored in the cloud or on-premises, to build a 'vector index'. User queries arrive as HTTP REST requests; the application uses the index to find the relevant documents and responds in natural language using the OpenAI API or Hugging Face models.
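
To make that pattern concrete, below is a minimal, framework-agnostic sketch in Python of such a retrieval-augmented loop: documents are embedded into an in-memory vector index, an incoming query is matched against the index by cosine similarity, and the best matches are handed to an LLM to phrase the answer. The embed() and generate_answer() functions are hypothetical stand-ins for a real embedding model and for a call to the OpenAI API or a Hugging Face model; none of the names below are Pathway's actual API.

    import hashlib
    import numpy as np

    def embed(text: str) -> np.ndarray:
        # Hypothetical stand-in for a real embedding model
        # (e.g. an OpenAI or Hugging Face embedding endpoint).
        seed = int.from_bytes(hashlib.sha256(text.encode()).digest()[:8], "big")
        v = np.random.default_rng(seed).standard_normal(64)
        return v / np.linalg.norm(v)

    class VectorIndex:
        """A tiny in-memory vector index over a document collection."""
        def __init__(self):
            self.docs, self.vectors = [], []

        def add(self, doc: str):
            self.docs.append(doc)
            self.vectors.append(embed(doc))

        def search(self, query: str, k: int = 3):
            # Cosine similarity; vectors are unit-normalized above.
            scores = np.array(self.vectors) @ embed(query)
            return [self.docs[i] for i in np.argsort(scores)[::-1][:k]]

    def generate_answer(query: str, context: list) -> str:
        # Hypothetical stand-in for the LLM call (OpenAI API or a
        # Hugging Face model) that answers the query from the context.
        return f"Answer to {query!r} based on: {context}"

    index = VectorIndex()
    for doc in ["Policy A: exports require a permit.",
                "Policy B: permits expire after one year."]:
        index.add(doc)

    # In LLM-App the query would arrive as an HTTP REST request.
    question = "Do exports require a permit?"
    print(generate_answer(question, index.search(question)))

A real deployment would replace embed() and generate_answer() with actual model calls and expose the query path over HTTP.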

When new pieces of information are added, LLM-App updates its index in real time and uses this new knowledge to answer future questions, ensuring the LLM provides accurate answers based on the most up-to-date knowledge. This is made possible by Pathway’s Python application layer for batch and streaming data processing, which supports use cases such as receiving an alert when the answer to a previously asked question changes. This unlocks live learning with LLMs, as alerts are triggered by fresh, relevant data from documents or streams.
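
Continuing the hypothetical sketch above (and reusing its VectorIndex and generate_answer), the alerting behavior can be pictured as re-evaluating watched questions whenever a new document arrives and firing an alert when the generated answer changes. In LLM-App this incremental recomputation is handled by Pathway's streaming engine rather than by an explicit loop; the sketch only illustrates the idea.

    class AnswerMonitor:
        """Re-answers watched questions when new data arrives and alerts
        when an answer changes (illustrative only)."""
        def __init__(self, index):
            self.index = index
            self.last_answers = {}  # question -> last generated answer

        def watch(self, question: str):
            self.last_answers[question] = generate_answer(
                question, self.index.search(question))

        def on_new_document(self, doc: str):
            self.index.add(doc)  # incremental index update
            for question, previous in self.last_answers.items():
                current = generate_answer(question, self.index.search(question))
                if current != previous:  # the answer changed: trigger an alert
                    print(f"ALERT: answer to {question!r} has changed")
                    self.last_answers[question] = current

    monitor = AnswerMonitor(index)
    monitor.watch("Do exports require a permit?")
    monitor.on_new_document("Policy C: permits are no longer required for exports.")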

Zuzanna Stamirowska, CEO & Co-Founder of Pathway, commented: “While many enterprises have been eager to adopt LLMs, there have been a number of risks involved which have stalled their adoption. From potentially exposing IP or sensitive data, to making decisions based on out-of-date knowledge, concerns around the accuracy and privacy of LLM applications have been difficult to overcome. We hope that with our free framework in Python to build AI apps, more organizations will be able to start building use cases with proprietary data and advance the use of LLMs in the enterprise.”

About Pathway

Pathway develops real-time intelligence technology. Real-time learning is made possible by an effective and scalable engine, which powers LLMs and machine learning models. These models are automatically updated thanks to a framework that combines streaming and batch data and is user-friendly and flexible for developers, data engineers and data scientists. The team, headed by Zuzanna Stamirowska, is made up of leading experts in the field of artificial intelligence. They include CTO Jan Chorowski, a co-author with Geoff Hinton and Yoshua Bengio, as well as business angel Lukasz Kaiser, who co-authored TensorFlow and is also known as the "T" in ChatGPT.


Source: Pathway

About the author: Alex Woodie

Alex Woodie has written about IT as a technology journalist for more than a decade. He brings extensive experience from the IBM midrange marketplace, including topics such as servers, ERP applications, programming, databases, security, high availability, storage, business intelligence, cloud, and mobile enablement. He resides in the San Diego area.
