avi-pipable committed
Commit 7fb8c25
1 Parent(s): 2d58f3f

Update README.md

Files changed (1):
README.md (+4, -2)

README.md CHANGED
@@ -27,6 +27,8 @@ widget:
 
 [colab_notebook](https://colab.research.google.com/drive/17PyMU_3QN9LROy7x-jmaema0cuLRzBvc?usp=sharing)
 
+[pip library_etl](https://github.com/PipableAI/pip-library-etl.git)
+
 ## What have we built?
 
 A 1.3 bn code documentation model that outperforms most models on documenting codes and making your in-house libs ready for LLM and RAG pipelines.
@@ -72,8 +74,8 @@ prompt = f"""<example_response>{example of some --question: , --query}</example
 ```python
 from transformers import AutoModelForCausalLM, AutoTokenizer
 device = "cuda"
-model = AutoModelForCausalLM.from_pretrained("PipableAI/pip-library-etl-1.3b ").to(device)
-tokenizer = AutoTokenizer.from_pretrained("PipableAI/pip-library-etl-1.3b b")
+model = AutoModelForCausalLM.from_pretrained("PipableAI/pip-library-etl-1.3b").to(device)
+tokenizer = AutoTokenizer.from_pretrained("PipableAI/pip-library-etl-1.3b")
 prompt = f"""<example_response>
 --code:def function_2(x): return x / 2
 --question:Document the python code above giving function description ,parameters and return type and example how to call the function.
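The second hunk fixes stray whitespace in the Hugging Face repo ids passed to `from_pretrained` (a trailing space and an embedded `" b"`), which makes the repo fail to resolve. A minimal sketch of catching this class of typo before loading — the validator below is a hypothetical helper, not part of the project or the `transformers` API:

```python
import re

# Hugging Face repo ids have the form "namespace/name"; letters, digits,
# underscores, dots, and hyphens are fine, but any whitespace (as in the
# pre-fix README snippet) breaks resolution of the repository.
REPO_ID_RE = re.compile(r"^[\w.-]+/[\w.-]+$")

def is_valid_repo_id(repo_id: str) -> bool:
    # fullmatch rejects trailing/embedded whitespace that a loose
    # search would let through.
    return bool(REPO_ID_RE.fullmatch(repo_id))

print(is_valid_repo_id("PipableAI/pip-library-etl-1.3b"))    # True
print(is_valid_repo_id("PipableAI/pip-library-etl-1.3b "))   # False: trailing space
print(is_valid_repo_id("PipableAI/pip-library-etl-1.3b b"))  # False: embedded space
```

Checking the id up front gives a clear error instead of an opaque "repo not found" from the hub download.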