
7 Companies Building Foundation Models in Biology and Chemical Synthesis

In the past three years, AI foundation models have surged in prominence, rapidly spreading across industries. The most prominent examples are GPT-3 and GPT-4, the models underpinning ChatGPT.

These are very large models trained on enormous volumes of data, typically in a self-supervised or unsupervised manner (without the need for labeled data). Thanks to design choices such as the transformer architecture and its attention mechanism, foundation models generalize well, allowing adaptation to a diverse array of downstream tasks, unlike traditional AI models that excel at a single task, such as predicting molecule-target interactions.
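To make that adaptation pattern concrete, here is a minimal sketch of fine-tuning a pretrained foundation model for a downstream task, using the open-source Hugging Face transformers library. The model name and the binary classification task are illustrative assumptions, not any specific company's pipeline.

```python
# A minimal sketch of adapting a pretrained foundation model to a
# downstream task with the Hugging Face `transformers` library.
# The model name and the two-class task are illustrative assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load a general-purpose model pretrained with self-supervision
# (masked-language modeling) on a large unlabeled corpus.
model_name = "distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Reuse the pretrained weights, attaching a small task-specific head
# (here: two output classes) that gets fine-tuned on labeled examples.
model = AutoModelForSequenceClassification.from_pretrained(
    model_name, num_labels=2
)

# A single labeled example for the hypothetical downstream task.
inputs = tokenizer("This compound binds the target.", return_tensors="pt")
labels = torch.tensor([1])

# One gradient step of fine-tuning: the generic representations learned
# during pretraining are adapted to the specific task.
outputs = model(**inputs, labels=labels)
outputs.loss.backward()
```

The key point is that the expensive pretraining happens once on unlabeled data, while each downstream task only requires fine-tuning a lightweight head on a comparatively small labeled set.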

For instance, Deep Genomics unveiled BigRNA, a pioneering AI foundation model for uncovering RNA biology and therapeutics. According to Deep Genomics, it is the first transformer neural network engineered for transcriptomics. BigRNA has nearly two billion trainable parameters and was trained on thousands of datasets totaling over a trillion genomic signals.

A month earlier, Ginkgo Bioworks, Inc. and Google Cloud announced a five-year partnership under which Ginkgo would develop new, state-of-the-art large language models (LLMs). The AI foundation model would focus on genomics, protein function, and synthetic biology and would run on Google Cloud’s Vertex AI platform. It is intended to help Ginkgo’s customers accelerate innovation and discovery in fields as diverse as drug discovery, agriculture, industrial manufacturing, and biosecurity.

Read our case study: Foundation Models in Biology

