A genomic foundation model broadly enables sequence modeling, prediction, and design
Science
14 Nov 2024
Vol 386, Issue 6723
pp. 729-730
Abstract
With a vocabulary of just four nucleotides, the language of DNA encodes the fundamental information needed to orchestrate all layers of regulation in a cell, from DNA to RNA and proteins. These instructions direct the function of each cell and transmit information between generations. Changes in the genomic sequence drive evolution, enabling organisms to adapt to their environments through natural selection of advantageous DNA sequences. Therefore, comparing DNA sequences across evolutionarily diverse genomes could enable a large language model to learn the grammar of DNA, which has eluded models trained on single genomes (1). On page 746 of this issue, Nguyen et al. (2) present Evo, a foundation model trained on 2.7 million evolutionarily diverse prokaryotic and phage genomes. Having learned genomic logic, Evo can decode natural genomes; enable predictions and design tasks across DNA, RNA, and proteins; and generate DNA at the whole-genome scale.
References and Notes
2. E. Nguyen et al., Science 386, eado9336 (2024).
3. V. Fishman et al., bioRxiv 10.1101/2023.06.12.544594 (2024).
4. Y. Hwang et al., Nat. Commun. 15, 2880 (2024).
5. Y. Ji et al., Bioinformatics 37, 2112 (2021).
6. H. Dalla-Torre et al., bioRxiv 10.1101/2023.01.11.523679 (2023).
13. J. A. Doudna, E. Charpentier, Science 346, 1258096 (2014).