Badllama-3.1-405B

This repository holds the weights for Palisade Research's demonstration that the guardrails of open-weight models can be stripped in minutes of GPU time. See the Badllama 3 paper for additional background.

Access

Email the authors to request research access. We do not review access requests made on HuggingFace.

Branches

  • main holds the LoRA adapter alone, convenient for swapping LoRAs
  • merged_fp8 holds the FBGEMM-FP8-quantized model with the LoRA merged in, convenient for long-context inference with vLLM