ek-id committed (verified)
Commit 987fcda · Parent: 1508b83

Add Transformers.js and WebNN example to README.md

Files changed (1): README.md (+25, -1)

README.md CHANGED
@@ -10,6 +10,7 @@ license: apache-2.0
 pipeline_tag: text-classification
 tags:
 - Intel
+- transformers.js
 model-index:
 - name: polite-guard
   results:
@@ -92,10 +93,33 @@ You can use this model directly with a pipeline for categorizing text into class
 ```python
 from transformers import pipeline

-classifier = pipeline("text-classification", model="Intel/polite-guard")
+classifier = pipeline("text-classification", "Intel/polite-guard")
 text = "Your input text"
 print(classifier(text))
 ```
+
+The next example demonstrates how to run this model in the browser using Hugging Face's `transformers.js` library with `webnn-gpu` for hardware acceleration.
+
+```html
+<!DOCTYPE html>
+<html>
+<body>
+  <h1>WebNN Transformers.js Intel/polite-guard</h1>
+  <script type="module">
+    import { pipeline } from "https://cdn.jsdelivr.net/npm/@huggingface/transformers";
+
+    const classifier = await pipeline("text-classification", "Intel/polite-guard", {
+      dtype: "fp32",
+      device: "webnn-gpu", // You can also try: "webgpu", "webnn", "webnn-npu", "webnn-cpu", "wasm"
+    });
+
+    const text = "Your input text";
+    const output = await classifier(text);
+    console.log(`${text}: ${output[0].label}`);
+  </script>
+</body>
+</html>
+```
 ## Articles

 To learn more about the implementation of the data generator and fine-tuner packages, refer to
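One practical note on the browser example above: because it uses an ES module import, most browsers will refuse to load it from a `file://` URL, so the page has to be served over HTTP. A minimal sketch, assuming the snippet is saved as `index.html` in the current directory (the port and filename here are illustrative, not from the commit):

```shell
# Start a throwaway HTTP server in the background, confirm it responds, then stop it.
# In practice you would leave the server running and open the page in a
# WebNN/WebGPU-capable browser at http://localhost:8000/index.html
python3 -m http.server 8000 &
SERVER_PID=$!
sleep 1
curl -s -o /dev/null http://localhost:8000/ && echo "server is up"
kill "$SERVER_PID"
```

Any static file server works equally well; `python3 -m http.server` is used here only because it ships with Python and needs no installation.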