## Usage of this model
I'm glad to share my journey of fine-tuning Llama 2 for
Named Entity Recognition (NER), particularly on a customer service dataset.
NER is a natural language processing task that involves identifying
and classifying entities such as names of people, organizations, locations,
and other important terms within a given text.

The customer service dataset I used was carefully curated and annotated with
a wide range of service-related entities, such as specific types of services,
service providers, service locations, and other related terms. The data was diverse and
representative of the actual domain it aims to address.
(I will re-upload the dataset with more samples to zaursamedov1/customer-service-ner.)

## For a closer look at the model, see this Colab notebook
(Coming soon...)
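Until the notebook is up, here is a minimal inference sketch. The base model id, adapter repo id, prompt format, and JSON output schema are all assumptions for illustration, not confirmed details of this model; only the `parse_entities` helper below is self-contained and runnable.

```python
import json

def parse_entities(response: str):
    """Extract a JSON list of entity records from the model's raw output.

    Assumes (hypothetically) the model was tuned to answer with a JSON list
    like [{"entity": "...", "label": "..."}]; returns [] if no valid JSON
    list is found in the response.
    """
    start = response.find("[")
    end = response.rfind("]")
    if start == -1 or end == -1 or end < start:
        return []
    try:
        parsed = json.loads(response[start:end + 1])
    except json.JSONDecodeError:
        return []
    return parsed if isinstance(parsed, list) else []

# --- Model loading and generation (sketch only; needs a GPU and the weights) ---
# from transformers import AutoModelForCausalLM, AutoTokenizer
# from peft import PeftModel
#
# base = AutoModelForCausalLM.from_pretrained(
#     "meta-llama/Llama-2-7b-hf",  # assumed base model
#     load_in_4bit=True,
#     device_map="auto",
# )
# model = PeftModel.from_pretrained(base, "zaursamedov1/<adapter-repo>")  # hypothetical repo id
# tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")
#
# prompt = "Extract all service-related entities from the text as JSON: ..."
# inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
# output = tokenizer.decode(model.generate(**inputs, max_new_tokens=256)[0])
# entities = parse_entities(output)

demo = 'The order: [{"entity": "FixIt Co.", "label": "SERVICE_PROVIDER"}]'
entities = parse_entities(demo)
# entities → [{"entity": "FixIt Co.", "label": "SERVICE_PROVIDER"}]
```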


---
library_name: peft
---
## Training procedure


The following `bitsandbytes` quantization config was used during training:
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: float16
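For reference, the same settings can be reconstructed as a `BitsAndBytesConfig` when reloading the base model (a sketch, assuming `transformers` with `bitsandbytes` support is installed):

```python
import torch
from transformers import BitsAndBytesConfig

# Mirrors the training-time quantization config listed above.
bnb_config = BitsAndBytesConfig(
    load_in_8bit=False,
    load_in_4bit=True,
    llm_int8_threshold=6.0,
    llm_int8_skip_modules=None,
    llm_int8_enable_fp32_cpu_offload=False,
    llm_int8_has_fp16_weight=False,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=False,
    bnb_4bit_compute_dtype=torch.float16,
)

# Then pass it when loading the base model:
# model = AutoModelForCausalLM.from_pretrained(base_id, quantization_config=bnb_config)
```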
### Framework versions


- PEFT 0.5.0.dev0