# Models Converted to fp16

- LLama2-chat-hf-fp16
- LLama3-7b-Instruct Model with fp16
- LLama3-70B-Instruct Model with fp16

# Quantized models:
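The listings name the converted artifacts but not the commands used to produce them. As a rough illustration only, the sketch below shows one common way to produce an fp16 copy of a checkpoint (as in the fp16 models listed above) and to load a 4-bit quantized variant with Hugging Face `transformers` and `bitsandbytes`; the model ID, output paths, and quantization settings are assumptions, not the exact recipe behind these releases.

```python
# Illustrative sketch: fp16 conversion and a 4-bit quantized load.
# Model ID, paths, and quantization settings are placeholders, not the
# exact commands used for the models listed in this README.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "meta-llama/Llama-2-7b-chat-hf"  # placeholder source checkpoint

# 1) fp16 conversion: load the weights in half precision and re-save them.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model_fp16 = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)
model_fp16.save_pretrained("./llama2-chat-hf-fp16")
tokenizer.save_pretrained("./llama2-chat-hf-fp16")

# 2) Quantized load: one common option is 4-bit quantization via bitsandbytes.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)
model_4bit = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # requires a CUDA GPU and the bitsandbytes package
)
```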