vikp committed
Commit
520d09a
1 Parent(s): 3df6f96

Update README.md

Files changed (1): README.md (+8 -3)
@@ -7,9 +7,14 @@ This model will generate instructions given some text. It is useful for labelli
 
 It was trained on the [reverse-instruct](https://huggingface.co/vikp/reverse_instruct) dataset for 2 epochs. Final validation loss was .72, with a ROUGE-L of .66.
 
-Here is an inference example, with some random text from falcon-refinedweb:
+Here is an inference example, with some random text from `falcon-refinedweb`:
 
 ```
+from transformers import AutoModelForCausalLM, AutoTokenizer
+
+model = AutoModelForCausalLM.from_pretrained("vikp/reverse_instruct")
+tokenizer = AutoTokenizer.from_pretrained("vikp/reverse_instruct")
+
 template = """
 Output
 
@@ -19,7 +24,7 @@ Instruction
 
 """.lstrip()
 
-text = "Many of the programmers, engineers and developers we talk to have a secret that they don't reveal until they know people pretty well. No, I'm not talking about the complete set of Star Wars playing cards they have stashed in the basement or the Rush LPs they haven't gotten around to trading in yet. I'm talking about Legos. You remember Legos, those infinitely malleable blocks that children around the world use to construct everything from tiny towers to life-size towers. Perhaps because these toys leave so much to the imagination, they've captured the imagination of a generation of tech workers. The appearance of Lego in Douglas Copeland's novel Microserfs, set on the Microsoft corporate campus, is one example of how pervasive it is."
+text = """SE3 Condenser Microphone from SE Electronics Sonic Distribution is now handling the SE Electronics line of imported studio condensers. The SE3 caught my eye at the Summer NAMM Show in Nashville and is their flagship "pencil" microphone with a fixed cardioid pattern and 48V phantom powering. This mic uses Class A FET amplifier electronics and has both low cut filter and -10dB pad switches. I had the opportunity to try this mic out on several sources while recording a band and was impressed by its natural sound and all around usefulness. I used it for acoustic guitar overdubs where the low cut filter helped to tame a jumbo bodied guitar's boomy sound. The gentle presence lift added a sparkle without using EQ. I also tried it on drums and cymbals and it (using the pad) didn't fold up (overload) at all. I even tried it on vocals with good results although it does 'pop' easily and required a couple of pop screens. Housed in an elegantly finished new body design, it comes with a sturdy shock mount and packaged in a deluxe wooden travel case. Significant specifications are: frequency response rated at 20Hz-20khz; sensitivity is 10mV/Pa +/- 2dB; noise level is 17dB (A weighted); and Max SPL for 0.5% THD @ 1kHz is 135dB. I certainly found a 'Swiss army knife' of a condenser with the SE3 and I completely recommend it for any studio task especially acoustic instruments such as guitar, violin, cello or string bass."""
 prompt = template.format(output=text)
 
 inputs = tokenizer(prompt, return_tensors="pt")
@@ -29,6 +34,6 @@ texts = [t.replace(template, "") for t in texts]
 print(texts)
 ```
 
-And the output instruction for the above example would be `What is a secret that many programmers, engineers and developers don't reveal until they know people pretty well?`
+And the output instruction for the above example would be: `Write a product review for the SE3 Condenser Microphone from SE Electronics Sonic Distribution.`
 
 It works with code, too, although llama-7b is undertrained on code.
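
The post-processing step in the README's snippet (`t.replace(template, "")`) strips the prompt scaffolding from the decoded generations, leaving only the newly generated instruction. Here is a minimal, model-free sketch of that prompt-build/prompt-strip pattern; note the template body below is illustrative (hypothetical), since part of the real template is elided in the diff:

```python
# Sketch of the prompt-build / prompt-strip pattern from the inference
# example. The template body is a stand-in, not the model's actual prompt.
template = """
Output
{output}

Instruction
""".lstrip()

text = "Some passage to generate an instruction for."
prompt = template.format(output=text)

# A causal LM's decoded output begins with the prompt it was fed;
# we fake a generation here just to show the stripping step.
decoded = prompt + "Summarize the passage in one sentence."
instruction = decoded.replace(prompt, "")
print(instruction)  # -> Summarize the passage in one sentence.
```

In the real snippet the strip is applied per element over the batch of decoded strings, which is why it appears inside a list comprehension.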