Missing processor chat template
#53 opened by AIchberger
Hello,
I noticed that the processor for Llama-3.2-11B-Vision-Instruct does not include a default chat_template, while the processor for Llama-3.2-90B-Vision-Instruct does. This leads to errors when calling processor.apply_chat_template() without explicitly providing a template.
Is this behavior intentional, or could a default chat_template be added?
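A minimal sketch of the behaviour described above, plus one possible workaround, assuming the two model IDs mentioned in this discussion and a transformers version whose processors support apply_chat_template; copying the template from the 90B processor is my own assumption, not an official fix:

```python
from transformers import AutoProcessor

# Processor reported to ship without a chat_template (gated repo; requires access).
processor = AutoProcessor.from_pretrained("meta-llama/Llama-3.2-11B-Vision-Instruct")

messages = [
    {
        "role": "user",
        "content": [
            {"type": "image"},
            {"type": "text", "text": "What is shown in this image?"},
        ],
    },
]

# With no chat_template in the processor config, this call fails instead of
# returning a formatted prompt string.
try:
    prompt = processor.apply_chat_template(messages, add_generation_prompt=True)
except ValueError as err:
    print(f"apply_chat_template failed: {err}")

# Possible workaround (assumption): borrow the chat_template from the 90B
# processor, which does include one, and assign it before formatting.
donor = AutoProcessor.from_pretrained("meta-llama/Llama-3.2-90B-Vision-Instruct")
processor.chat_template = donor.chat_template
prompt = processor.apply_chat_template(messages, add_generation_prompt=True)
print(prompt)
```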
AIchberger changed discussion status to closed