How to Use

Load the model for inference:

import torch
from transformers import AutoModel

# trust_remote_code is required because the checkpoint ships its own modeling code
model = AutoModel.from_pretrained("genbio-ai/dummy-ckpt-hf", trust_remote_code=True)

# Collate the raw sequences into a model-ready batch
collated_batch = model.genbio_model.collate({"sequences": ["ACGT", "AGCT"]})

# Run the forward pass to get logits; argmax over the last dimension gives the predicted indices
logits = model(collated_batch)
print(logits)
print(torch.argmax(logits, dim=-1))
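
For inference-only use it is common to switch the model to eval mode and disable gradient tracking. The sketch below is a minimal, non-authoritative example that reuses the model and collated_batch objects created above; how the resulting indices map back to tokens depends on the checkpoint's custom code.

model.eval()  # turn off dropout and other training-time behavior

with torch.no_grad():  # no gradients are needed for inference
    logits = model(collated_batch)

# Predicted indices per position; interpreting them requires the checkpoint's vocabulary
predictions = torch.argmax(logits, dim=-1)
print(predictions)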