Rename inference-cache-config/llama.json to inference-cache-config/llama2.json (f06a55a) by dacorvo, committed on Apr 19