Update README.md
README.md CHANGED
@@ -41,20 +41,6 @@ This is the model card of a 🤗 transformers model that has been pushed on the
 
 <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
 
-```
-from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline
-model = AutoModelForTokenClassification.from_pretrained('kalexa2/fabner-ner')
-tokenizer = AutoTokenizer.from_pretrained('kalexa2/fabner-ner')
-token_classifier = pipeline('ner',
-                            model=model,
-                            tokenizer=tokenizer,
-                            aggregation_strategy="simple")
-r = token_classifier("Here, we report in-situ characterization of melt-flow dynamics in every location of the entire melt pool in laser metal additive manufacturing by populous and uniformly dispersed micro-tracers through in-situ high-resolution synchrotron x-ray imaging.")
-
-for entity in r:
-    print(entity)
-```
-
 [More Information Needed]
 
@@ -86,6 +72,22 @@ Users (both direct and downstream) should be made aware of the risks, biases and
 
 Use the code below to get started with the model.
 
+
+```
+from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline
+model = AutoModelForTokenClassification.from_pretrained('kalexa2/fabner-ner')
+tokenizer = AutoTokenizer.from_pretrained('kalexa2/fabner-ner')
+token_classifier = pipeline('ner',
+                            model=model,
+                            tokenizer=tokenizer,
+                            aggregation_strategy="simple")
+r = token_classifier("Here, we report in-situ characterization of melt-flow dynamics in every location of the entire melt pool in laser metal additive manufacturing by populous and uniformly dispersed micro-tracers through in-situ high-resolution synchrotron x-ray imaging.")
+
+for entity in r:
+    print(entity)
+```
+
+
 [More Information Needed]
 
 ## Training Details
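
As context for the snippet this commit moves: with `aggregation_strategy="simple"`, the token-classification pipeline merges adjacent sub-word/token predictions that share an entity label into single entity spans. The sketch below illustrates that grouping idea on made-up token data, without loading the model; it is a simplified approximation, not the actual `transformers` implementation, and the `aggregate_simple` helper, labels, and scores are hypothetical.

```python
# Simplified sketch of "simple"-style aggregation: merge a run of
# B-/I- tagged tokens with the same entity type into one span,
# averaging their scores. Token data below is invented for illustration.

def aggregate_simple(tokens):
    groups = []
    for tok in tokens:
        label = tok["entity"].split("-")[-1]  # strip B-/I- prefix
        if groups and groups[-1]["entity_group"] == label and tok["entity"].startswith("I-"):
            # Continuation token: extend the current span.
            groups[-1]["word"] += " " + tok["word"]
            groups[-1]["scores"].append(tok["score"])
        else:
            # New span starts here.
            groups.append({"entity_group": label,
                           "word": tok["word"],
                           "scores": [tok["score"]]})
    return [{"entity_group": g["entity_group"],
             "score": sum(g["scores"]) / len(g["scores"]),
             "word": g["word"]}
            for g in groups]

tokens = [
    {"word": "laser", "entity": "B-PROCESS", "score": 0.98},
    {"word": "metal", "entity": "I-PROCESS", "score": 0.96},
    {"word": "melt",  "entity": "B-PHENOMENON", "score": 0.91},
]
for entity in aggregate_simple(tokens):
    print(entity)
```

The real pipeline works on model logits and character offsets, but the span-merging behavior it exposes to the caller looks like this: one dict per entity span rather than one per token.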