Please add a Neuron cache for llama-3-70B at tp_degree of 32 (current support is for 24 cores)

#79
by ak-org - opened

Please add a Neuron cache for llama-3-70B at tp_degree of 32 to support fine-tuning on Trn1 (the current cached artifacts cover 24 cores, targeting Inf2).
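
For context, a minimal sketch of the kind of fine-tuning run this cache entry would serve, assuming the optimum-neuron training API (`NeuronTrainer` / `NeuronTrainingArguments` with `tensor_parallel_size`). The model id, toy dataset, and hyperparameters below are illustrative assumptions, not values taken from this request:

```python
# Minimal sketch, assuming the optimum-neuron training entry points
# (NeuronTrainer / NeuronTrainingArguments). Model id, dataset, and
# hyperparameters are illustrative, not taken from this request.
from datasets import Dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from optimum.neuron import NeuronTrainer, NeuronTrainingArguments

model_id = "meta-llama/Meta-Llama-3-70B"  # assumed model id

tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token  # Llama tokenizers ship without a pad token

def tokenize(batch):
    # Toy causal-LM preprocessing: labels mirror input_ids.
    enc = tokenizer(batch["text"], max_length=512, padding="max_length", truncation=True)
    enc["labels"] = [ids.copy() for ids in enc["input_ids"]]
    return enc

# Illustrative toy dataset; a real fine-tuning corpus would go here.
train_dataset = Dataset.from_dict({"text": ["hello neuron"] * 32}).map(tokenize, batched=True)

training_args = NeuronTrainingArguments(
    output_dir="llama3-70b-trn1",
    tensor_parallel_size=32,        # tp_degree of 32 -> all cores of a trn1.32xlarge
    bf16=True,
    per_device_train_batch_size=1,
    gradient_accumulation_steps=8,
)

model = AutoModelForCausalLM.from_pretrained(model_id)

trainer = NeuronTrainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,
    tokenizer=tokenizer,
)
# The graphs compiled for this tp_degree=32 configuration are what would need
# to be present in the public neuron cache for the run to skip compilation.
trainer.train()
```

With those graphs cached, a run like this on trn1.32xlarge could start without the lengthy ahead-of-time compilation step.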
