Context is limited to 1024 tokens, can you make it 4k?
#12
by yuuhan · opened
Thanks for your great work letting us quickly play with llama2-70b without having to deploy it ourselves. However, Meta AI has said that the context length of Llama 2 was extended to 4k, so the context limit in the Gradio demo could be raised to 4k as well.
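For reference, a minimal sketch of what the change might look like. The names here (`MAX_CONTEXT_TOKENS`, `truncate_to_context`) are hypothetical, since the Space's actual source isn't shown in this thread; the point is just that the hard-coded 1024 limit could become 4096:

```python
# Hypothetical sketch: raising the demo's context window from 1024 to 4096 tokens.
# These names are illustrative, not taken from the Space's code.
MAX_CONTEXT_TOKENS = 4096  # was 1024; Llama 2 supports a 4k context

def truncate_to_context(token_ids, max_tokens=MAX_CONTEXT_TOKENS):
    """Keep only the most recent tokens that fit in the context window."""
    return token_ids[-max_tokens:]

history = list(range(5000))  # stand-in for a long conversation's token ids
print(len(truncate_to_context(history)))  # 4096 tokens now fit instead of 1024
```

With the old 1024 limit, most of a long prompt would be silently dropped; after the change, up to 4096 tokens of history survive truncation.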