---
library_name: transformers
license: other
base_model: NousResearch/Hermes-3-Llama-3.1-8B
tags:
- llama-factory
- full
- unsloth
- generated_from_trainer
model-index:
- name: kimhyeongjun/Hermes-3-Llama-3.1-8B-Kor-Finance-Advisor
  results: []
---


# kimhyeongjun/Hermes-3-Llama-3.1-8B-Kor-Finance-Advisor

This is my personal toy project for Chuseok (Korean Thanksgiving Day).

This model is a fine-tuned version of [NousResearch/Hermes-3-Llama-3.1-8B](https://huggingface.co/NousResearch/Hermes-3-Llama-3.1-8B) on the Korean_synthetic_financial_dataset_21K.
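
To try the model quickly, here is a minimal inference sketch (not part of the original card). It assumes the tokenizer ships the chat template inherited from Hermes-3-Llama-3.1-8B; the Korean prompt and the generation settings are invented examples.

```python
# Minimal inference sketch; assumes the tokenizer carries over the
# Hermes-3 chat template. The prompt below is an invented example.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "kimhyeongjun/Hermes-3-Llama-3.1-8B-Kor-Finance-Advisor"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{"role": "user", "content": "연금저축과 IRP의 차이를 설명해 주세요."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=512, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```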


## Model description

Everything happened automatically without any user intervention. 

Based on finance PDF data collected directly from the web, we refined the raw text with the 'meta-llama/Meta-Llama-3.1-70B-Instruct-FP8' model (the FP8 variant was a budget-driven choice); a sketch of this stage follows.
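
The refinement code itself is not published; the sketch below shows one plausible shape for this stage, assuming the FP8 model is served with vLLM behind an OpenAI-compatible endpoint. The endpoint URL, system prompt, and `refine_chunk` helper are all illustrative assumptions.

```python
# Hypothetical refinement stage: clean noisy PDF-extracted text with the
# 70B FP8 model served locally (e.g. via vLLM's OpenAI-compatible server).
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

def refine_chunk(raw_text: str) -> str:
    """Ask the model to clean one chunk of PDF-extracted finance text."""
    response = client.chat.completions.create(
        model="meta-llama/Meta-Llama-3.1-70B-Instruct-FP8",
        messages=[
            {"role": "system",
             "content": "Clean the following noisy PDF extraction: remove "
                        "headers, footers, and broken hyphenation; keep all "
                        "financial content and numbers unchanged."},
            {"role": "user", "content": raw_text},
        ],
        temperature=0.0,
    )
    return response.choices[0].message.content
```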
After generating synthetic data from the cleaned text, we screened its quality with the 'meta-llama/Llama-Guard-3-8B' (safety) and 'RLHFlow/ArmoRM-Llama3-8B-v0.1' (reward) models, as sketched below.
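
As an illustration only, the snippet below scores one synthetic QA pair with ArmoRM, following the usage shown on the RLHFlow/ArmoRM-Llama3-8B-v0.1 model card (its custom head exposes a scalar `score`); Llama-Guard-3-8B would be applied separately as a safety classifier, and any keep-threshold on the score is a placeholder.

```python
# Sketch of the quality gate using the ArmoRM reward model; follows the
# usage on its model card (trust_remote_code head returning `.score`).
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

rm_id = "RLHFlow/ArmoRM-Llama3-8B-v0.1"
rm_tokenizer = AutoTokenizer.from_pretrained(rm_id)
reward_model = AutoModelForSequenceClassification.from_pretrained(
    rm_id, trust_remote_code=True, torch_dtype=torch.bfloat16, device_map="auto"
)

def quality_score(question: str, answer: str) -> float:
    """Return the reward-model score for one synthetic QA pair."""
    messages = [
        {"role": "user", "content": question},
        {"role": "assistant", "content": answer},
    ]
    input_ids = rm_tokenizer.apply_chat_template(
        messages, return_tensors="pt"
    ).to(reward_model.device)
    with torch.no_grad():
        return reward_model(input_ids).score.float().item()
```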
We then used 'Alibaba-NLP/gte-large-en-v1.5' to extract embeddings and applied Faiss for Jaccard-distance nearest-neighbor analysis, constructing a diverse and sophisticated final dataset of 21k examples; one possible realization is sketched below.
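
The deduplication code is likewise unpublished. One plausible realization, sketched below under stated assumptions, embeds each sample with gte-large-en-v1.5, shortlists nearest neighbors with a flat Faiss index, and scores candidate pairs with a weighted Jaccard similarity over the non-negative embedding components; the 0.9 threshold and the helper names are invented.

```python
# Illustrative dedup stage: Faiss shortlists nearest neighbors, then a
# weighted Jaccard similarity on non-negative components decides drops.
import faiss
import numpy as np
from sentence_transformers import SentenceTransformer

encoder = SentenceTransformer("Alibaba-NLP/gte-large-en-v1.5", trust_remote_code=True)

def jaccard_sim(a: np.ndarray, b: np.ndarray) -> float:
    """Weighted Jaccard similarity over the non-negative components."""
    a, b = np.maximum(a, 0), np.maximum(b, 0)
    return float(np.minimum(a, b).sum() / np.maximum(a, b).sum())

def deduplicate(texts: list[str], threshold: float = 0.9) -> list[str]:
    emb = encoder.encode(texts, normalize_embeddings=True).astype("float32")
    index = faiss.IndexFlatIP(emb.shape[1])  # cosine on normalized vectors
    index.add(emb)
    _, neighbors = index.search(emb, 2)  # columns: [self, nearest other]
    keep = []
    for i, nbrs in enumerate(neighbors):
        j = int(nbrs[1])
        # Drop the later of two near-duplicate samples.
        if j < i and jaccard_sim(emb[i], emb[j]) > threshold:
            continue
        keep.append(texts[i])
    return keep
```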


## Task duration
3 days (2024-09-14 to 2024-09-16)

## Evaluation
None (I had to take the Thanksgiving holiday off.)

## Sample

![image/png](https://cdn-uploads.huggingface.co/production/uploads/619d8e31c21bf5feb310bd82/gJ6hnvAV2Qx9774AFFwQe.png)

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1