Clement Vachet committed on
Commit b85c571 · 1 Parent(s): a35fd46

docs: add section about hugging face deployment

Files changed (1):
  1. README.md +25 -4

README.md CHANGED
@@ -16,21 +16,33 @@ short_description: Object detection Lambda
 - Front-end: user interface via Gradio library
 - Back-end: use of AWS Lambda function to run deployed ML models
 
+<b>Menu: </b>
+- [Local development](#1-local-development)
+- [AWS deployment](#2-deployment-to-aws)
+- [Hugging Face deployment](#3-deployment-to-hugging-face)
+
 ## 1. Local development
 
-### 1.1. Building the docker image
+### 1.1. Build and run the Docker container
+
+<details>
+
+Step 1 - Building the docker image
 
 ```bash
 > docker build -t object-detection-lambda .
 ```
 
-### 1.2. Running the docker container locally
+Step 2 - Running the docker container locally
 
 ```bash
 > docker run --name object-detection-lambda-cont -p 8080:8080 object-detection-lambda
 ```
 
-### 1.3. Execution via user interface
+</details>
+
+### 1.2. Execution via user interface
 
 Use of Gradio library for web interface
 
 <b>Note:</b> The environment variable ```AWS_API``` should point to the local container
@@ -42,7 +54,7 @@ Command line for execution:
 The Gradio web application should now be accessible at http://localhost:7860
 
 
-### 1.4. Execution via command line:
+### 1.3. Execution via command line:
 
 Example of a prediction request
 
@@ -142,3 +154,12 @@ python
 > --file ./tests/data/boats.jpg \
 > --model yolos-small
 
+
+## 3. Deployment to Hugging Face
+
+This web application is available on Hugging Face
+
+Hugging Face space URL:
+https://huggingface.co/spaces/cvachet/object_detection_lambda
+
+Note: This space uses the ML model deployed on AWS Lambda
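For readers applying this diff: the `docker run` command above publishes port 8080, where Lambda container images serve the AWS Lambda Runtime Interface Emulator, so the container can be exercised locally by POSTing to the emulator's standard invocation path. A minimal sketch, assuming a hypothetical event payload with a base64-encoded image and a model name (the repository's actual event schema is not shown in this diff):

```python
import base64
import json

# Standard invocation path served by the AWS Lambda Runtime Interface Emulator,
# matching the port published by `docker run -p 8080:8080`.
LOCAL_ENDPOINT = "http://localhost:8080/2015-03-31/functions/function/invocations"

def build_event(image_bytes: bytes, model: str = "yolos-small") -> dict:
    """Build a hypothetical invocation event; the real schema may differ."""
    return {
        "image": base64.b64encode(image_bytes).decode("utf-8"),
        "model": model,
    }

def invoke_local(image_bytes: bytes, model: str = "yolos-small") -> dict:
    """POST the event to the running container (requires the `requests` package)."""
    import requests  # third-party dependency, only needed for the actual call
    response = requests.post(LOCAL_ENDPOINT, data=json.dumps(build_event(image_bytes, model)))
    response.raise_for_status()
    return response.json()
```

`build_event` keeps the payload construction separate from the network call, so it can be checked without a running container.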
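The note about the ```AWS_API``` environment variable can be made concrete with a small helper: the Gradio front end would read the variable and fall back to the local container's endpoint when it is unset. A sketch under that assumption (the fallback URL and helper name are illustrative, not the repository's code):

```python
import os

# Fallback assumes the local container started with `docker run -p 8080:8080 ...`;
# in production, AWS_API would instead point at the deployed Lambda's HTTP endpoint.
_LOCAL_DEFAULT = "http://localhost:8080/2015-03-31/functions/function/invocations"

def resolve_api_url() -> str:
    """Return the endpoint the front end should call, honoring AWS_API if set."""
    return os.environ.get("AWS_API", _LOCAL_DEFAULT)
```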
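The command-line example in the last hunk invokes a Python script with `--file` and `--model` flags; a parser compatible with that invocation might look like the sketch below (the default model name follows the example, everything else is an assumption):

```python
import argparse

def make_parser() -> argparse.ArgumentParser:
    """Parser mirroring the flags used in the README's prediction example."""
    parser = argparse.ArgumentParser(
        description="Send a prediction request to the object-detection Lambda"
    )
    parser.add_argument("--file", required=True,
                        help="path to the input image, e.g. ./tests/data/boats.jpg")
    parser.add_argument("--model", default="yolos-small",
                        help="name of the deployed detection model")
    return parser
```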