---
library_name: peft
base_model: Locutusque/TinyMistral-248M-Instruct
datasets:
- Locutusque/InstructMix
language:
- en
pipeline_tag: text-generation
widget:
- text: >-
    <|ASSISTANT|> Here is a possible solution to transform high haircare
    styling and trimming based on decision making for professionals,
    incorporating `if`/`else` statements to handle different consent scenarios:

    1. Define nodes and relationships for the graph database:

      ```cypher
      CREATE (client:Client)
      CREATE (stylist:Stylist)-[:HAS_CLIENT {start_date: date() }]->(client)
      // Relationship types used in this query
      MATCH (s:Service), (c:Client)
      WHERE s.name = 'Haircut' AND c IN [client]
      MERGE (s)<-[r:CONFIRMS_SERVICE]-(c);
      // Other relationship types could also be added here as needed
      ```

    2. Query to determine whether a client has given their explicit consent to
    receive specific services:

      ```cypher
      // Get all services provided by stylists to clients
      MATCH (s:Stylist)-[r:PROVIDES_SERVICE*0..5]-(:Service)
      WITH collect(distinct s) AS stylists, r
      UNWIND stylists AS s
      OPTIONAL MATCH (c:Client)-[:HAS_CLIENT]->(sc:ServiceConsent{service:r})
      RETURN s, count(*), sum(CASE WHEN sc IS NOT NULL THEN 1 ELSE 0 END) AS num_consents
      ORDER BY num_consents DESC;
      ```

    3. Example of how to use the above query to check which services a
    particular client has already agreed to:

      ```cypher
      // Check if client has previously granted consent to any services
      MATCH (s:Stylist)-[r:PROVIDES_SERVICE*0..5]-(:Service)
      WITH collect(distinct s) AS stylists, r
      UNWIND stylists AS s
      OPTIONAL MATCH (c:Client)-[:HAS_CLIENT]->(sc:ServiceConsent{service:r})
      WHERE id(c) =
      RETURN s, count(*), sum(CASE WHEN sc IS NOT NULL THEN 1 ELSE 0 END) AS num_consents;
      ```

    4. Code to add new consent for a new service:

      ```cypher
      // Add new consent for a new service
      MERGE (c:Client {id: ''})
      ON CREATE SET c.created_at=timestamp(), c.updated_at=timestamp()
      MERGE (s:Service {name: ''})
      ON CREATE SET s.created_at=timestamp(), s.updated_at=timestamp()
      MERGE (c)-[:GIVEN_SERVICE_CONSENT {consent_given: true}]->(sc:ServiceConsent {service: s});
      ```

    5. Code to update existing consent for an existing service:

      ```cypher
      // Update existing consent for an existing service
      MATCH (c:Client {id: ''}), (s:Service {name: ''})
      MERGE (c)-[:GIVEN_SERVICE_CONSENT {consent_given: false}]->(oldSc:ServiceConsent)
      MERGE (c)-[:GIVEN_SERVICE_CONSENT {consent_given: true}]->(newSc:ServiceConsent {service: s});
      DELETE oldSc;
      ```

    6. Code to delete consent for a service:

      ```cypher
      // Delete consent for a service
      MATCH (c:Client {id: ''}), (s:Service {name: ''})
      REMOVE (c)-[:GIVEN_SERVICE_CONSENT {consent_given: true}]->(sc:ServiceConsent {service: s});
      ```

    This approach uses Neo4j's native Cypher language to define the database
    schema and perform queries and mutations on the graph. <|USER|>
inference:
  parameters:
    temperature: 0.8
    do_sample: True
    top_p: 0.14
    top_k: 41
    max_new_tokens: 250
    repetition_penalty: 1.176
---

## Uses

This model is intended for building instruction-following datasets: given an answer, it predicts a plausible question for that answer.

### Out-of-Scope Use

[More Information Needed]

## Bias, Risks, and Limitations

[More Information Needed]

### Recommendations

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
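
## How to Get Started with the Model

The widget above feeds the model an answer after the `<|ASSISTANT|>` tag and samples a question after the trailing `<|USER|>` tag, using the inference parameters listed in the metadata. The snippet below is a minimal sketch of that flow with `transformers` and `peft`, assuming the prompt format shown in the widget example; the adapter repository id is a placeholder, not this repository's actual id.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "Locutusque/TinyMistral-248M-Instruct"
adapter_id = "path/to/this-adapter"  # placeholder: replace with this repository's id

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.float16)
model = PeftModel.from_pretrained(base_model, adapter_id)  # attach the PEFT adapter

# Assumed prompt format, taken from the widget example above:
# the answer goes after <|ASSISTANT|>, and the model is expected to
# generate a matching question after <|USER|>.
answer = "Here is a possible solution ..."
prompt = f"<|ASSISTANT|> {answer} <|USER|>"

# Sampling parameters copied from the `inference` block in the metadata.
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    do_sample=True,
    temperature=0.8,
    top_p=0.14,
    top_k=41,
    max_new_tokens=250,
    repetition_penalty=1.176,
)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```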
## Training Details

### Training Data

[More Information Needed]

### Training Procedure

#### Preprocessing [optional]

[More Information Needed]

#### Training Hyperparameters

- **Training regime:** [More Information Needed]

#### Speeds, Sizes, Times [optional]

[More Information Needed]

## Evaluation

### Testing Data, Factors & Metrics

#### Testing Data

[More Information Needed]

#### Factors

[More Information Needed]

#### Metrics

[More Information Needed]

### Results

[More Information Needed]

#### Summary

## Model Examination [optional]

[More Information Needed]

## Environmental Impact

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective

[More Information Needed]

### Compute Infrastructure

[More Information Needed]

#### Hardware

[More Information Needed]

#### Software

[More Information Needed]

## Citation [optional]

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Model Card Authors [optional]

[More Information Needed]

## Model Card Contact

[More Information Needed]

## Training procedure

The following `bitsandbytes` quantization config was used during training:

- quant_method: QuantizationMethod.BITS_AND_BYTES
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: float16

### Framework versions

- PEFT 0.6.2
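
The quantization settings listed above correspond roughly to the `transformers` `BitsAndBytesConfig` sketched below. This is an assumption about how such a config would be reconstructed when loading the base model for training or inference, not a copy of the original training script.

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# Sketch of the 4-bit quantization config listed above (assumed construction).
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    load_in_8bit=False,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=False,
    bnb_4bit_compute_dtype=torch.float16,
    llm_int8_threshold=6.0,
    llm_int8_skip_modules=None,
    llm_int8_enable_fp32_cpu_offload=False,
    llm_int8_has_fp16_weight=False,
)

# Loading the base model with this config reproduces the quantized setup
# the adapter was trained against (base model name taken from the metadata above).
model = AutoModelForCausalLM.from_pretrained(
    "Locutusque/TinyMistral-248M-Instruct",
    quantization_config=bnb_config,
    device_map="auto",  # requires `accelerate`
)
```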