Invited talk

Foundation Models for Biomedical Discovery and Reasoning

Abstract

Foundation models trained on large-scale biological data are reshaping the landscape of computational biology, offering unified frameworks for diverse tasks across molecular and clinical domains. In this talk, we present recent advances in applying multi-modal foundation models—specifically MAMMAL—to biologics discovery and biomedical entity reasoning. We demonstrate how MAMMAL, a versatile architecture integrating proteins, small molecules, and omics data, enables accurate prediction of antibody binding and activity against influenza A virus, and supports vaccine selection through antigenicity modeling. These applications highlight the utility of foundation models for specific, limited-size biomedical tasks, offering a powerful tool for accelerating therapeutic development. Complementing this, we present a modality alignment strategy that bridges biological data and language models, enhancing LLMs' cross-modal question-answering ability. Together, these two applications illustrate the transformative potential of foundation models in unifying the analysis of biological modalities and enhancing predictive and generative capabilities in biomedical research.