This repo releases the trained LLaMA-adapter weights from the paper "Large Language Models are Efficient Learners of Noise-Robust Speech Recognition" (RobustGER).

GitHub: https://github.com/YUCHEN005/RobustGER

Data: https://huggingface.co/datasets/PeacefulData/Robust-HyPoradise

Model: This repo
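Both the adapter weights here and the Robust-HyPoradise data can be fetched with the standard Hugging Face tooling. Below is a minimal sketch, assuming huggingface_hub is installed; the local directory names are arbitrary choices, and the actual checkpoint filenames are whatever this repo ships. Training and inference themselves follow the GitHub repo linked above.

```python
# Minimal sketch: download the released files with huggingface_hub.
# Assumptions: local_dir names below are placeholders, not part of the release.
from huggingface_hub import snapshot_download

# Trained LLaMA-adapter weights released in this model repo.
adapter_dir = snapshot_download(
    repo_id="PeacefulData/RobustGER",
    local_dir="robustger_adapter",    # assumed local path
)

# Robust-HyPoradise data used to train the adapter.
data_dir = snapshot_download(
    repo_id="PeacefulData/Robust-HyPoradise",
    repo_type="dataset",
    local_dir="robust_hyporadise",    # assumed local path
)

print("adapter weights:", adapter_dir)
print("dataset files:", data_dir)
```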

If you find this work related or useful for your research, please kindly cite our ICLR 2024 paper. Thank you.

@inproceedings{hu2024large,
  title={Large Language Models are Efficient Learners of Noise-Robust Speech Recognition},
  author={Hu, Yuchen and Chen, Chen and Yang, Chao-Han Huck and Li, Ruizhe and Zhang, Chao and Chen, Pin-Yu and Chng, Eng Siong},
  booktitle={International Conference on Learning Representations},
  year={2024}
}