|
--- |
|
tags: |
|
- document-understanding
|
- endpoints-template |
|
library_name: generic |
|
--- |
|
|
|
# Deploy a Space as an Inference Endpoint
|
|
|
_This is a fork of the [naver-clova-ix/donut-base-finetuned-cord-v2](https://huggingface.co/spaces/naver-clova-ix/donut-base-finetuned-cord-v2) Space_ |
|
|
|
This repository implements a custom container for 🤗 Inference Endpoints using a Gradio Space.
|
|
|
To deploy this model as an Inference Endpoint, select `Custom` as the task and use a custom image with the following settings:
|
|
|
* CPU image: `philschmi/gradio-api:cpu` |
|
* GPU image: `philschmi/gradio-api:gpu` |
|
* PORT: `7860` |
|
* Health Route: `/` |
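
If you prefer to create the endpoint programmatically, the values above map onto the `custom_image` argument of `huggingface_hub.create_inference_endpoint`. The sketch below is only an illustration: the endpoint name, repository id, vendor, region, instance sizing, and the `framework`, `task`, and `port` fields are placeholders or assumptions to verify against the current Inference Endpoints documentation.

```python
from huggingface_hub import create_inference_endpoint

# Sketch only: replace the placeholder name, repository, vendor/region and
# instance values with your own before running.
endpoint = create_inference_endpoint(
    "gradio-donut-endpoint",                # hypothetical endpoint name
    repository="<user>/<this-repository>",  # the repository to deploy
    framework="custom",                     # assumed value for a custom container
    task="custom",                          # "Custom" task, as described above
    accelerator="cpu",                      # switch to "gpu" with the GPU image
    vendor="aws",                           # placeholder cloud/region
    region="us-east-1",
    type="public",                          # public, so the Gradio UI is reachable
    instance_size="x2",                     # placeholder sizing
    instance_type="intel-icl",
    custom_image={
        "url": "philschmi/gradio-api:cpu",  # CPU image from the list above
        "port": 7860,                       # assumed field name; matches PORT above
        "health_route": "/",                # Health Route from the list above
        "env": {},
    },
)
endpoint.wait()                             # block until the endpoint is running
print(endpoint.url)
```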
|
|
|
If you want to use the UI with the Inference Endpoint, you have to select `public` as the endpoint type and add [auth through Gradio](https://gradio.app/docs/#launch-header), e.g. by passing `auth` to `launch()` as in the sketch below.
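
A minimal sketch of what that could look like in the Space's app code, assuming a Gradio 3.x interface; the `run_donut` function and the credentials are placeholders, not the Space's actual code.

```python
import gradio as gr

# Hypothetical stand-in for the Donut inference function defined in the Space.
def run_donut(image):
    ...

demo = gr.Interface(fn=run_donut, inputs="image", outputs="json")

demo.launch(
    server_name="0.0.0.0",       # listen on all interfaces inside the container
    server_port=7860,            # must match the PORT configured above
    auth=("user", "change-me"),  # placeholder credentials; see the Gradio docs linked above
)
```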
|
|
|
### Example API Request Payload |
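
The Space's single input is a document image, passed as a base64-encoded data URI inside Gradio's standard `data` array. The payload below is a sketch of that format, assuming the Gradio 3.x API convention.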
|
|
|
```json
{
  "data": ["data:image/png;base64,<BASE64_ENCODED_IMAGE>"]
}
```
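
A hedged example of sending this payload from Python: the endpoint URL and image path are placeholders, and the `/api/predict` route assumes a Gradio 3.x app running behind the endpoint.

```python
import base64
import requests

# Placeholder values: replace with your endpoint URL and a local document image.
ENDPOINT_URL = "https://<your-endpoint>.endpoints.huggingface.cloud"

with open("document.png", "rb") as f:
    encoded_image = base64.b64encode(f.read()).decode("utf-8")

payload = {"data": [f"data:image/png;base64,{encoded_image}"]}

# Gradio 3.x exposes the prediction route at /api/predict; the parsed Donut
# output is returned in the "data" field of the response.
response = requests.post(f"{ENDPOINT_URL}/api/predict", json=payload)
print(response.json()["data"])
```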
|
|
|
|
|
|