AmelieSchreiber committed
Commit 0eabafd · 1 Parent(s): 0927b55

Update README.md

Files changed (1): README.md (+1 -1)
README.md CHANGED
@@ -34,7 +34,7 @@ and that single sequence masked language models like ESMFold can be used in atom
  AlphaFold2. In our approach we show a positive correlation between scaling the model size and data
  in a 1-to-1 fashion provides competative and possibly even comparable to SOTA performance, although our comparison to the SOTA models is not as fair and
  comprehensive as it could be (see [this report for more details](https://api.wandb.ai/links/amelie-schreiber-math/0asqd3hs) and also
- [this repost](https://wandb.ai/amelie-schreiber-math/huggingface/reports/ESM-2-Binding-Sites-Predictor-Part-3-Scaling-Results--Vmlldzo1NDA3Nzcy?accessToken=npsm0tatgumcidfwxubzjyuhal512xu8sjmpnf11sebktjm9mheg69ja397q57ok)).
+ [this report](https://wandb.ai/amelie-schreiber-math/huggingface/reports/ESM-2-Binding-Sites-Predictor-Part-3-Scaling-Results--Vmlldzo1NDA3Nzcy?accessToken=npsm0tatgumcidfwxubzjyuhal512xu8sjmpnf11sebktjm9mheg69ja397q57ok)).


  This model is a finetuned version of the 35M parameter `esm2_t12_35M_UR50D` ([see here](https://huggingface.co/facebook/esm2_t12_35M_UR50D)
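
For reference, a minimal sketch of loading the base checkpoint named in the diff with Hugging Face `transformers` for per-residue (token-level) classification, the usual setup for binding-site prediction. The `num_labels=2` head and the example sequence are illustrative assumptions, not details taken from this commit:

```python
# Sketch: load the base ESM-2 checkpoint named in the README and attach a
# token-classification head, as one typically would for per-residue
# binding-site labels. num_labels=2 (binding / non-binding) is an assumption.
from transformers import AutoTokenizer, AutoModelForTokenClassification

model_name = "facebook/esm2_t12_35M_UR50D"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(model_name, num_labels=2)

# Tokenize an example protein sequence and get per-residue logits.
sequence = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"
inputs = tokenizer(sequence, return_tensors="pt")
logits = model(**inputs).logits  # shape: (1, seq_len_with_special_tokens, 2)
```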