joaogante
posted an update Apr 29
Adding a long prompt can help you fight LLM hallucinations. However, if you know exactly how you want your LLM output constrained, there are much better strategies! πŸ’ͺ

Did you know you can force your LLM to ALWAYS generate a valid JSON file? Or to follow a well-defined answer template? You can do that and more with the πŸ€— transformers-compatible outlines library.

Not only does it let you master your LLM -- your text generation application will also become faster! 🔥 The more constrained your text generation is, the bigger speedups you'll see!
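The intuition behind both claims can be shown with a toy sketch of constrained decoding (this is not the outlines API, and the tiny vocabulary and template are made up for illustration): at each step, the vocabulary is masked down to the tokens that keep the output a valid prefix of the required format, and whenever only one token is possible it can be emitted without asking the model at all -- which is where the speedups come from.

```python
# Toy illustration of constrained decoding (hypothetical vocabulary and
# template, NOT the outlines API): mask the vocabulary to tokens that
# keep the output a valid prefix of the template.
VOCAB = ['{', '}', '"answer"', ':', ' ', '"yes"', '"no"', 'hello']
TEMPLATE = '{"answer": "yes"}'

def allowed_tokens(prefix: str) -> list[str]:
    """Tokens that extend `prefix` while staying consistent with TEMPLATE."""
    remaining = TEMPLATE[len(prefix):]
    return [tok for tok in VOCAB if tok and remaining.startswith(tok)]

def constrained_decode() -> str:
    out = ""
    while out != TEMPLATE:
        candidates = allowed_tokens(out)
        # A real decoder would have the LLM score `candidates` and sample
        # from them; here every step is fully constrained, so we can take
        # the first (and only) valid option without a forward pass.
        out += candidates[0]
    return out

print(constrained_decode())  # {"answer": "yes"}
```

Real constrained-generation libraries apply the same masking idea to the model's logits, with the valid prefixes tracked by a regex or JSON-schema automaton instead of a fixed template.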

Follow @remi and other outlines folks to stay on top of the constrained generation game 🧠