---
license: apache-2.0
---

# Model Card for slim-sentiment-tool

**slim-sentiment-tool** is part of the SLIM ("Structured Language Instruction Model") model series, providing a set of small, specialized decoder-based LLMs fine-tuned for function calling.

slim-sentiment-tool is a 4_K_M quantized GGUF version of slim-sentiment, providing a fast, small inference implementation.

Load in your favorite GGUF inference engine, or try with llmware as follows:

```python
from llmware.models import ModelCatalog

# load the quantized sentiment tool and run a classify function call on a text sample
sentiment_tool = ModelCatalog().load_model("llmware/slim-sentiment-tool")
response = sentiment_tool.function_call(text_sample, params=["sentiment"], function="classify")
```

SLIM models can also be loaded even more simply as part of LLMfx calls:

```python
from llmware.agents import LLMfx

# load the sentiment tool as an agent capability and call it directly
llm_fx = LLMfx()
llm_fx.load_tool("sentiment")
response = llm_fx.sentiment(text)
```

### Model Description

- **Developed by:** llmware
- **Model type:** GGUF
- **Language(s) (NLP):** English
- **License:** Apache 2.0
- **Quantized from model:** llmware/slim-sentiment (fine-tuned TinyLlama)

## Uses

The intended use of SLIM models is to re-imagine traditional 'hard-coded' classifiers through the use of function calls.

Example:

```
text = "The stock market declined yesterday as investors worried increasingly about the slowing economy."

model generation - {"sentiment": ["negative"]}

keys = "sentiment"
```

All of the SLIM models use a novel prompt instruction structured as follows:

```
"<human> " + text + "<classify> " + keys + "</classify>" + "\n<bot>: "
```

## Model Card Contact

Darren Oberst & llmware team
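
As an illustration of the prompt template described in the Uses section above, the sketch below assembles the prompt by hand and parses the dictionary-style output that the model generates. It is a minimal, self-contained example: the helper names are illustrative only, and the llmware `function_call` API shown above performs this wrapping and parsing automatically.

```python
import ast

def build_prompt(text: str, keys: str = "sentiment") -> str:
    # wrap the text sample and classification keys in the SLIM instruction template shown above
    return "<human> " + text + "<classify> " + keys + "</classify>" + "\n<bot>: "

def parse_response(raw_output: str) -> dict:
    # SLIM models emit a python dictionary-style string, e.g. {"sentiment": ["negative"]}
    return ast.literal_eval(raw_output.strip())

text = "The stock market declined yesterday as investors worried increasingly about the slowing economy."
prompt = build_prompt(text, keys="sentiment")

# parsing a generation like the example shown in the Uses section
print(parse_response('{"sentiment": ["negative"]}'))   # {'sentiment': ['negative']}
```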