---
library_name: transformers.js
pipeline_tag: object-detection
license: agpl-3.0
---
# YOLOv10: Real-Time End-to-End Object Detection
ONNX weights for https://github.com/THU-MIG/yolov10.
*Figures: latency-accuracy trade-offs and size-accuracy trade-offs (plots omitted).*
## Usage (Transformers.js)
If you haven't already, you can install the [Transformers.js](https://huggingface.co/docs/transformers.js) JavaScript library from [NPM](https://www.npmjs.com/package/@xenova/transformers) using:
```bash
npm i @xenova/transformers
```
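Alternatively, if you're running directly in the browser without a bundler, the library can be imported from a CDN using ES modules (a minimal sketch; the version pin below is illustrative, so substitute the release you want):
```js
// Inside a <script type="module"> tag, import directly from a CDN:
import { AutoModel, AutoProcessor, RawImage } from 'https://cdn.jsdelivr.net/npm/@xenova/transformers@2.17.2';
```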
**Example:** Perform object detection.
```js
import { AutoModel, AutoProcessor, RawImage } from '@xenova/transformers';

// Load model
const model = await AutoModel.from_pretrained('onnx-community/yolov10l', {
    // quantized: false, // (Optional) Use unquantized version.
});

// Load processor
const processor = await AutoProcessor.from_pretrained('onnx-community/yolov10l');

// Read image and run processor
const url = 'https://huggingface.co/datasets/Xenova/transformers.js-docs/resolve/main/city-streets.jpg';
const image = await RawImage.read(url);
const { pixel_values } = await processor(image);

// Run object detection
const { output0 } = await model({ images: pixel_values });
const predictions = output0.tolist()[0];

// Keep detections above a confidence threshold
const threshold = 0.5;
for (const [xmin, ymin, xmax, ymax, score, id] of predictions) {
    if (score < threshold) continue;
    const bbox = [xmin, ymin, xmax, ymax].map(x => x.toFixed(2)).join(', ');
    console.log(`Found "${model.config.id2label[id]}" at [${bbox}] with score ${score.toFixed(2)}.`);
}
// Found "person" at [473.05, 430.35, 533.53, 532.43] with score 0.92.
// Found "car" at [447.48, 378.60, 639.69, 478.38] with score 0.92.
// Found "person" at [549.94, 260.96, 591.81, 331.22] with score 0.91.
// Found "person" at [33.50, 469.62, 78.99, 571.88] with score 0.90.
// Found "car" at [177.90, 337.14, 399.34, 418.01] with score 0.90.
// Found "traffic light" at [208.80, 55.90, 233.13, 101.39] with score 0.90.
// Found "bicycle" at [449.02, 477.23, 555.98, 537.56] with score 0.89.
// Found "bicycle" at [352.45, 527.27, 463.67, 588.07] with score 0.89.
// ...
```
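If the image you feed in does not match the model's input resolution, you may want to rescale the boxes before drawing them. The sketch below continues from the example above; it assumes `pixel_values` is laid out as NCHW and that the raw boxes are expressed in the resized (model-input) coordinate space, which you should verify for your setup:
```js
// A minimal sketch (assumptions noted above): map boxes from the model's
// input resolution back to the original image dimensions.
const [, , modelHeight, modelWidth] = pixel_values.dims;
const scaleX = image.width / modelWidth;
const scaleY = image.height / modelHeight;

for (const [xmin, ymin, xmax, ymax, score, id] of predictions) {
    if (score < threshold) continue;
    const box = [xmin * scaleX, ymin * scaleY, xmax * scaleX, ymax * scaleY]
        .map(x => x.toFixed(2)).join(', ');
    console.log(`"${model.config.id2label[id]}" at [${box}] in original-image coordinates.`);
}
```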