MeMDLM: De Novo Membrane Protein Design with Masked Discrete Diffusion Language Models
Masked Diffusion Language Models (MDLMs), introduced by Sahoo et al., bring strong generative capabilities to BERT-style models. In this work, we pre-train and fine-tune the ESM-2-150M protein language model (pLM) with the MDLM objective to scaffold functional motifs and to unconditionally generate realistic, high-quality membrane protein sequences.
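Below is a minimal sketch of how unconditional generation could look, assuming the checkpoint at ChatterjeeLab/MeMDLM loads with the standard transformers masked-LM classes. The iterative-unmasking loop and its hyperparameters (sequence length, number of steps) are illustrative of MDLM-style sampling, not the authors' exact inference code.

```python
# Illustrative MDLM-style sampling sketch (assumptions: the checkpoint loads
# via AutoTokenizer/AutoModelForMaskedLM; loop hyperparameters are arbitrary).
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("ChatterjeeLab/MeMDLM")
model = AutoModelForMaskedLM.from_pretrained("ChatterjeeLab/MeMDLM").eval()

seq_len = 100    # length of the protein to generate (illustrative)
num_steps = 50   # number of denoising steps (illustrative)

# Start from a fully masked sequence and iteratively unmask positions.
ids = torch.full((1, seq_len), tokenizer.mask_token_id, dtype=torch.long)

with torch.no_grad():
    for step in range(num_steps):
        logits = model(input_ids=ids).logits          # (1, seq_len, vocab)
        probs = torch.softmax(logits, dim=-1)
        sampled = torch.multinomial(probs[0], num_samples=1).squeeze(-1)

        # Reveal a fraction of the still-masked positions at each step.
        masked = (ids[0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
        if masked.numel() == 0:
            break
        k = max(1, masked.numel() // (num_steps - step))
        chosen = masked[torch.randperm(masked.numel())[:k]]
        ids[0, chosen] = sampled[chosen]

print(tokenizer.decode(ids[0]))
```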
Repository Authors
Shrey Goel, Undergraduate Student at Duke University
Vishrut Thoutam, Student at High Technology High School
Pranam Chatterjee, Assistant Professor at Duke University
Reach out to us with any questions!
Base model: facebook/esm2_t30_150M_UR50D