Update README.md
README.md
CHANGED
@@ -280,3 +280,478 @@ lm_eval \
</tr>
</tbody>
</table>

## Inference Performance

This model achieves up to 1.6x speedup in single-stream deployment and up to 1.4x speedup in multi-stream asynchronous deployment, depending on hardware and use-case scenario.
The following performance benchmarks were conducted with [vLLM](https://docs.vllm.ai/en/latest/) version 0.7.2 and [GuideLLM](https://github.com/neuralmagic/guidellm).

<details>
<summary>Benchmarking Command</summary>

```
guidellm --model neuralmagic/DeepSeek-R1-Distill-Llama-8B-quantized.w8a8 --target "http://localhost:8000/v1" --data-type emulated --data "prompt_tokens=<prompt_tokens>,generated_tokens=<generated_tokens>" --max-seconds 360 --backend aiohttp_server
```

</details>

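GuideLLM drives an OpenAI-compatible endpoint at `http://localhost:8000/v1`, so a vLLM server must be running before the command above is issued. The sketch below shows one minimal way such a server might be started; the port and `--max-model-len` values are illustrative assumptions rather than the exact configuration used for these benchmarks.

```
# Sketch only: serve the quantized checkpoint behind an OpenAI-compatible API on port 8000.
# --max-model-len 16384 is an assumed value, chosen to cover the largest profile (10240 prompt + 1536 generated tokens).
vllm serve neuralmagic/DeepSeek-R1-Distill-Llama-8B-quantized.w8a8 \
  --port 8000 \
  --max-model-len 16384
```

To benchmark a specific use-case profile from the tables below, replace the placeholders in the GuideLLM command with that profile's token counts, e.g. `prompt_tokens=256,generated_tokens=128` for Instruction Following.
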
### Single-stream performance (measured with vLLM version 0.7.2)
<table>
<thead>
<tr>
<th></th>
<th></th>
<th></th>
<th style="text-align: center;" colspan="2" >Instruction Following<br>256 / 128</th>
<th style="text-align: center;" colspan="2" >Multi-turn Chat<br>512 / 256</th>
<th style="text-align: center;" colspan="2" >Docstring Generation<br>768 / 128</th>
<th style="text-align: center;" colspan="2" >RAG<br>1024 / 128</th>
<th style="text-align: center;" colspan="2" >Code Completion<br>256 / 1024</th>
<th style="text-align: center;" colspan="2" >Code Fixing<br>1024 / 1024</th>
<th style="text-align: center;" colspan="2" >Large Summarization<br>4096 / 512</th>
<th style="text-align: center;" colspan="2" >Large RAG<br>10240 / 1536</th>
</tr>
<tr>
<th>Hardware</th>
<th>Model</th>
<th>Average cost reduction</th>
<th>Latency (s)</th>
<th>QPD</th>
<th>Latency (s)</th>
<th>QPD</th>
<th>Latency (s)</th>
<th>QPD</th>
<th>Latency (s)</th>
<th>QPD</th>
<th>Latency (s)</th>
<th>QPD</th>
<th>Latency (s)</th>
<th>QPD</th>
<th>Latency (s)</th>
<th>QPD</th>
<th>Latency (s)</th>
<th>QPD</th>
</tr>
</thead>
<tbody style="text-align: center" >
<tr>
<th rowspan="3" valign="top">A6000x1</th>
<th>deepseek-ai/DeepSeek-R1-Distill-Llama-8B</th>
<td>---</td>
<td>3.0</td>
<td>1511</td>
<td>6.0</td>
<td>755</td>
<td>3.0</td>
<td>1483</td>
<td>3.1</td>
<td>1462</td>
<td>23.6</td>
<td>191</td>
<td>24.0</td>
<td>188</td>
<td>12.7</td>
<td>353</td>
<td>41.1</td>
<td>110</td>
</tr>
<tr>
<th>neuralmagic/DeepSeek-R1-Distill-Llama-8B-quantized.w8a8</th>
<td>1.53</td>
<td>1.9</td>
<td>2356</td>
<td>3.8</td>
<td>1175</td>
<td>2.0</td>
<td>2291</td>
<td>2.0</td>
<td>2207</td>
<td>15.2</td>
<td>297</td>
<td>15.5</td>
<td>290</td>
<td>8.5</td>
<td>531</td>
<td>28.6</td>
<td>157</td>
</tr>
<tr>
<th>neuralmagic/DeepSeek-R1-Distill-Llama-8B-quantized.w4a16</th>
<td>2.35</td>
<td>1.2</td>
<td>3870</td>
<td>2.3</td>
<td>1918</td>
<td>1.3</td>
<td>3492</td>
<td>1.3</td>
<td>3335</td>
<td>9.1</td>
<td>492</td>
<td>9.5</td>
<td>472</td>
<td>5.8</td>
<td>771</td>
<td>22.7</td>
<td>198</td>
</tr>
<tr>
<th rowspan="3" valign="top">A100x1</th>
<th>deepseek-ai/DeepSeek-R1-Distill-Llama-8B</th>
<td>---</td>
<td>1.5</td>
<td>1308</td>
<td>3.1</td>
<td>657</td>
<td>1.6</td>
<td>1274</td>
<td>1.6</td>
<td>1263</td>
<td>12.1</td>
<td>166</td>
<td>12.4</td>
<td>162</td>
<td>6.5</td>
<td>308</td>
<td>25.6</td>
<td>78</td>
</tr>
<tr>
<th>neuralmagic/DeepSeek-R1-Distill-Llama-8B-quantized.w8a8</th>
<td>1.30</td>
<td>1.1</td>
<td>1763</td>
<td>2.3</td>
<td>882</td>
<td>1.2</td>
<td>1716</td>
<td>1.2</td>
<td>1698</td>
<td>9.0</td>
<td>223</td>
<td>9.2</td>
<td>218</td>
<td>4.9</td>
<td>409</td>
<td>25.7</td>
<td>78</td>
</tr>
<tr>
<th>neuralmagic/DeepSeek-R1-Distill-Llama-8B-quantized.w4a16</th>
<td>1.76</td>
<td>0.8</td>
<td>2501</td>
<td>1.6</td>
<td>1236</td>
<td>0.9</td>
<td>2350</td>
<td>0.9</td>
<td>2287</td>
<td>6.4</td>
<td>316</td>
<td>6.6</td>
<td>306</td>
<td>3.7</td>
<td>544</td>
<td>24.7</td>
<td>82</td>
</tr>
<tr>
<th rowspan="3" valign="top">H100x1</th>
<th>deepseek-ai/DeepSeek-R1-Distill-Llama-8B</th>
<td>---</td>
<td>1.0</td>
<td>1146</td>
<td>1.9</td>
<td>574</td>
<td>1.0</td>
<td>1128</td>
<td>1.0</td>
<td>1111</td>
<td>7.6</td>
<td>144</td>
<td>7.7</td>
<td>142</td>
<td>4.1</td>
<td>266</td>
<td>16.3</td>
<td>67</td>
</tr>
<tr>
<th>neuralmagic/DeepSeek-R1-Distill-Llama-8B-FP8-dynamic</th>
<td>1.25</td>
<td>0.7</td>
<td>1567</td>
<td>1.4</td>
<td>758</td>
<td>0.7</td>
<td>1484</td>
<td>0.7</td>
<td>1462</td>
<td>5.7</td>
<td>191</td>
<td>5.8</td>
<td>189</td>
<td>3.2</td>
<td>347</td>
<td>22.5</td>
<td>49</td>
</tr>
<tr>
<th>neuralmagic/DeepSeek-R1-Distill-Llama-8B-quantized.w4a16</th>
<td>1.30</td>
<td>0.7</td>
<td>1527</td>
<td>1.4</td>
<td>768</td>
<td>0.7</td>
<td>1495</td>
<td>0.7</td>
<td>1463</td>
<td>5.6</td>
<td>194</td>
<td>5.7</td>
<td>190</td>
<td>3.1</td>
<td>350</td>
<td>14.7</td>
<td>74</td>
</tr>
</tbody>
</table>

**Use case profiles:** prompt tokens / generation tokens

**QPD:** Queries per dollar, based on on-demand cost at [Lambda Labs](https://lambdalabs.com/service/gpu-cloud) (observed on 2/18/2025).

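As a sanity check on how the latency and QPD columns relate: for single-stream serving, queries per dollar is roughly (3600 seconds per hour ÷ latency in seconds) ÷ the GPU's hourly on-demand rate. The ~$0.80/hr A6000 rate used below is an assumed, illustrative figure, not a quoted price.

```
# Illustrative only: assumed ~$0.80/hr A6000 rate, 3.0 s latency (Instruction Following, baseline model)
echo "scale=0; (3600 / 3.0) / 0.80" | bc   # ~1500 queries per dollar, consistent with the 1511 reported above
```
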
### Multi-stream asynchronous performance (measured with vLLM version 0.7.2)
<table>
<thead>
<tr>
<th></th>
<th></th>
<th></th>
<th style="text-align: center;" colspan="2" >Instruction Following<br>256 / 128</th>
<th style="text-align: center;" colspan="2" >Multi-turn Chat<br>512 / 256</th>
<th style="text-align: center;" colspan="2" >Docstring Generation<br>768 / 128</th>
<th style="text-align: center;" colspan="2" >RAG<br>1024 / 128</th>
<th style="text-align: center;" colspan="2" >Code Completion<br>256 / 1024</th>
<th style="text-align: center;" colspan="2" >Code Fixing<br>1024 / 1024</th>
<th style="text-align: center;" colspan="2" >Large Summarization<br>4096 / 512</th>
<th style="text-align: center;" colspan="2" >Large RAG<br>10240 / 1536</th>
</tr>
<tr>
<th>Hardware</th>
<th>Model</th>
<th>Average cost reduction</th>
<th>Maximum throughput (QPS)</th>
<th>QPD</th>
<th>Maximum throughput (QPS)</th>
<th>QPD</th>
<th>Maximum throughput (QPS)</th>
<th>QPD</th>
<th>Maximum throughput (QPS)</th>
<th>QPD</th>
<th>Maximum throughput (QPS)</th>
<th>QPD</th>
<th>Maximum throughput (QPS)</th>
<th>QPD</th>
<th>Maximum throughput (QPS)</th>
<th>QPD</th>
<th>Maximum throughput (QPS)</th>
<th>QPD</th>
</tr>
</thead>
<tbody style="text-align: center" >
<tr>
<th rowspan="3" valign="top">A6000x1</th>
<th>deepseek-ai/DeepSeek-R1-Distill-Llama-8B</th>
<td>---</td>
<td>12.6</td>
<td>56742</td>
<td>5.7</td>
<td>25687</td>
<td>6.5</td>
<td>29349</td>
<td>5.2</td>
<td>23259</td>
<td>1.6</td>
<td>7250</td>
<td>1.2</td>
<td>5181</td>
<td>0.8</td>
<td>3445</td>
<td>0.1</td>
<td>616</td>
</tr>
<tr>
<th>neuralmagic/DeepSeek-R1-Distill-Llama-8B-quantized.w8a8</th>
<td>1.34</td>
<td>17.4</td>
<td>78101</td>
<td>7.6</td>
<td>34351</td>
<td>8.8</td>
<td>39790</td>
<td>7.0</td>
<td>31532</td>
<td>2.3</td>
<td>10405</td>
<td>1.5</td>
<td>6960</td>
<td>1.0</td>
<td>4355</td>
<td>0.2</td>
<td>785</td>
</tr>
<tr>
<th>neuralmagic/DeepSeek-R1-Distill-Llama-8B-quantized.w4a16</th>
<td>0.91</td>
<td>10.9</td>
<td>48964</td>
<td>5.1</td>
<td>22989</td>
<td>4.8</td>
<td>21791</td>
<td>3.8</td>
<td>17039</td>
<td>2.2</td>
<td>9726</td>
<td>1.2</td>
<td>5434</td>
<td>0.6</td>
<td>2544</td>
<td>0.1</td>
<td>578</td>
</tr>
<tr>
<th rowspan="3" valign="top">A100x1</th>
<th>deepseek-ai/DeepSeek-R1-Distill-Llama-8B</th>
<td>---</td>
<td>24.5</td>
<td>49296</td>
<td>11.3</td>
<td>22657</td>
<td>13.0</td>
<td>26047</td>
<td>10.5</td>
<td>21020</td>
<td>3.5</td>
<td>7029</td>
<td>2.5</td>
<td>4995</td>
<td>1.7</td>
<td>3503</td>
<td>0.3</td>
<td>659</td>
</tr>
<tr>
<th>neuralmagic/DeepSeek-R1-Distill-Llama-8B-quantized.w8a8</th>
<td>1.27</td>
<td>30.8</td>
<td>62042</td>
<td>14.1</td>
<td>28419</td>
<td>17.2</td>
<td>34554</td>
<td>13.8</td>
<td>27719</td>
<td>4.6</td>
<td>9299</td>
<td>3.1</td>
<td>6215</td>
<td>2.2</td>
<td>4331</td>
<td>0.4</td>
<td>807</td>
</tr>
<tr>
<th>neuralmagic/DeepSeek-R1-Distill-Llama-8B-quantized.w4a16</th>
<td>0.97</td>
<td>22.7</td>
<td>45708</td>
<td>10.5</td>
<td>21216</td>
<td>11.1</td>
<td>22353</td>
<td>8.9</td>
<td>17939</td>
<td>3.9</td>
<td>7758</td>
<td>2.6</td>
<td>5241</td>
<td>1.6</td>
<td>3196</td>
<td>0.4</td>
<td>718</td>
</tr>
<tr>
<th rowspan="3" valign="top">H100x1</th>
<th>deepseek-ai/DeepSeek-R1-Distill-Llama-8B</th>
<td>---</td>
<td>49.0</td>
<td>53593</td>
<td>22.6</td>
<td>24750</td>
<td>28.3</td>
<td>30971</td>
<td>22.9</td>
<td>25035</td>
<td>7.2</td>
<td>7912</td>
<td>5.1</td>
<td>5561</td>
<td>3.6</td>
<td>3939</td>
<td>0.6</td>
<td>703</td>
</tr>
<tr>
<th>neuralmagic/DeepSeek-R1-Distill-Llama-8B-FP8-dynamic</th>
<td>1.14</td>
<td>57.1</td>
<td>62517</td>
<td>26.0</td>
<td>28440</td>
<td>34.5</td>
<td>37781</td>
<td>28.7</td>
<td>31360</td>
<td>7.2</td>
<td>7877</td>
<td>5.4</td>
<td>5923</td>
<td>4.3</td>
<td>4697</td>
<td>0.7</td>
<td>782</td>
</tr>
<tr>
<th>neuralmagic/DeepSeek-R1-Distill-Llama-8B-quantized.w4a16</th>
<td>1.01</td>
<td>49.8</td>
<td>54452</td>
<td>22.9</td>
<td>25035</td>
<td>28.5</td>
<td>31162</td>
<td>23.0</td>
<td>25200</td>
<td>6.8</td>
<td>7493</td>
<td>5.0</td>
<td>5431</td>
<td>3.7</td>
<td>4079</td>
<td>0.7</td>
<td>787</td>
</tr>
</tbody>
</table>

**Use case profiles:** prompt tokens / generation tokens

**QPS:** Queries per second.

**QPD:** Queries per dollar, based on on-demand cost at [Lambda Labs](https://lambdalabs.com/service/gpu-cloud) (observed on 2/18/2025).
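
The multi-stream QPD figures follow the same cost relationship, but start from throughput: queries per dollar is roughly QPS × 3600 seconds per hour ÷ the hourly on-demand rate. Again using the assumed ~$0.80/hr A6000 figure for illustration only:

```
# Illustrative only: 12.6 QPS (Instruction Following, baseline model) at an assumed ~$0.80/hr
echo "scale=0; (12.6 * 3600) / 0.80" | bc   # ~56700 queries per dollar, consistent with the 56742 reported above
```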