Cerebras CS-2 System to Accelerate NLP for Biomedical R&D
SUNNYVALE, Calif., March 14, 2022 — Cerebras Systems, a pioneer in high performance artificial intelligence (AI) compute, and nference, an AI-driven health technology company, today announced a collaboration to accelerate natural language processing (NLP) for biomedical research and development by orders of magnitude with a Cerebras CS-2 system installed at the nference headquarters in Cambridge, Mass.
The vast amounts of health data that lie within patient records, scientific papers, medical imagery, and genomic databases could be critical to advancing health outcomes. Unfortunately, this information is nearly impossible for data scientists and machine learning (ML) researchers to access, as it exists in unstructured, siloed, and incompatible forms, forcing researchers to sift through it manually. While data accessibility is a fundamental challenge in healthcare today, newer AI architectures such as transformer models can assist by processing data from various sources, de-identifying it, and converting it into structured, usable intelligence.
nference uses transformer AI models to employ self-supervised learning from large volumes of unstructured data without labels, translating vast amounts of health data into information that can be used to discover insights and drive research. However, training large models is complex, computationally intensive, and time-consuming, often requiring large clusters of conventional processors. A key feature of the Cerebras CS-2 architecture is the ability for data scientists and ML researchers to use data of longer sequence lengths than is practical using smaller, conventional processors – this is particularly relevant to research being conducted at nference.
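The relevance of sequence length follows from how transformer self-attention works: its score matrix grows quadratically with sequence length, so long sequences quickly exhaust memory on conventional processors. The sketch below is purely illustrative (the function name and head count are hypothetical, not drawn from nference's or Cerebras's software) and simply counts attention-matrix elements to show that scaling.

```python
# Illustrative sketch only: why longer sequence lengths are costly for
# transformers on conventional hardware. The function and its parameters
# are hypothetical, not part of any Cerebras or nference API.

def attention_matrix_elements(seq_len: int, num_heads: int = 16) -> int:
    """Elements in the (seq_len x seq_len) attention score matrix,
    summed across all heads; memory grows with the square of seq_len."""
    return num_heads * seq_len * seq_len

# Quadrupling the sequence length multiplies the attention footprint by 16.
short = attention_matrix_elements(512)
long = attention_matrix_elements(2048)
print(long // short)  # 16
```

This quadratic growth is why training on long biomedical documents, such as full patient records or scientific papers, is impractical on processors with limited on-chip memory.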
“nference was founded to help solve complex medical problems and improve health outcomes by unlocking insights contained within biomedical data while protecting individual patient privacy,” said Ajit Rajasekharan, Chief Technology Officer, nference. “Our solution uses transformer models to help researchers and clinicians make sense of siloed and inaccessible health data, leading to new discoveries and findings that can impact patient outcomes. With Cerebras’s powerful CS-2 system, we can train transformer models with much longer sequence lengths than we could before, enabling us to iterate more rapidly and build better, more insightful models.”
“AI is driving an exponential increase in demand for compute,” said Andy Hock, Vice President of Product, Cerebras Systems. “As we have recently demonstrated across multiple customers and published work, the Cerebras CS-2 is orders of magnitude faster than legacy alternatives. This performance advantage comes from the Cerebras Wafer Scale Engine (WSE-2), the world’s largest and most powerful AI processor. The WSE-2 is purpose-built with 850,000 AI-optimized cores to accelerate the models of today and unlock future models not practical or possible on legacy infrastructure. Our work with nference is a great example of this: their team, equipped with a CS-2, is pushing the boundaries of AI to accelerate biomedical research and discovery and improve health outcomes.”
The Cerebras CS-2 system delivers the deep learning compute performance of tens to hundreds of graphics processing units in a cluster, with the programming ease and efficiency of a single device. Powered by the largest and fastest processor ever built, the 2.6 trillion transistor second-generation Cerebras Wafer-Scale Engine (WSE-2), the CS-2 delivers more AI-optimized compute cores, fast memory, and fabric bandwidth than any other deep learning processor in existence.
About Cerebras Systems
Cerebras Systems is a team of pioneering computer architects, computer scientists, deep learning researchers, and engineers of all types. We have come together to build a new class of computer system, designed for the singular purpose of accelerating AI and changing the future of AI work forever. Our flagship product, the CS-2 system, is powered by the world’s largest processor, the 850,000-core Cerebras WSE-2, and enables customers to accelerate their deep learning work by orders of magnitude over graphics processing units.
With customers and partners in North America, Asia, Europe, and the Middle East, Cerebras is delivering industry-leading AI solutions to a growing roster of customers including GlaxoSmithKline, AstraZeneca, TotalEnergies, Tokyo Electron Devices, Argonne National Laboratory, Lawrence Livermore National Laboratory, Pittsburgh Supercomputing Center, and the Edinburgh Parallel Computing Centre (EPCC).
For more information about the Cerebras CS-2 system and its applications in health and pharma, please visit https://cerebras.net/industries/health-and-pharma.
About nference
Through its powerful augmented intelligence software, nference is transforming health care by making biomedical knowledge computable. Our partnership with Mayo Clinic has given us the opportunity to synthesize more than 100 years of institutional knowledge, producing real-world evidence in real time by converting large amounts of data into deep insights to advance discovery and development of diagnostics and therapeutics. nference is headquartered in Cambridge, Mass. Visit us at nference.ai.
Source: Cerebras Systems