Is this free?
I'm using the model and I was wondering if it's free to use. Thank you.
https://huggingface.co/docs/transformers/en/model_doc/mixtral
Today, the team is proud to release Mixtral 8x7B, a high-quality sparse mixture-of-experts model (SMoE) with open weights, licensed under Apache 2.0. Mixtral outperforms Llama 2 70B on most benchmarks with 6x faster inference. It is the strongest open-weight model with a permissive license and the best model overall regarding cost/performance trade-offs. In particular, it matches or outperforms GPT-3.5 on most standard benchmarks.
So? I still can't tell whether it's free, pay-to-use, or pay-per-usage.
Hi @mikeraz, the Apache 2.0 license allows most uses without restriction. I'd still recommend taking a look at the license itself, since it's a very popular one that's handy to know about!
Basically, the weights of models released under this license are available for you to do with as you wish!
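For reference, once the license question is settled, using the open weights is just the standard `transformers` workflow from the docs linked above. A minimal sketch, assuming the `mistralai/Mixtral-8x7B-Instruct-v0.1` checkpoint and enough memory to hold it (the model ID and prompt here are just illustrative):

```python
# Minimal sketch: load the openly licensed Mixtral weights with transformers.
# Assumes `transformers`, `torch`, and `accelerate` are installed and that you
# have the hardware/disk to hold the ~47B-parameter checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"  # Apache 2.0 licensed weights

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer("Is Mixtral free to use?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

There is no usage fee for the weights themselves; you only pay for whatever compute you run them on (or for a hosted API, if you choose one).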
If you don't understand what the Apache 2.0 license is in 2024... perhaps you need to find another hobby. Like Frisbee.
If a person is entering this world and learning for the first time, there is no need to appear so frustrated... peace and love, brother.
It's not unreasonable to ask people to do some foundational research on their own rather than expect it to be done for them. Take some initiative and don't burden others with questions about the basics. What's next, "what is Python"?
Hope you find happiness in your life. Enjoy your day, next Elon Musk.