Fixing bug introduced in app.py
README.md
CHANGED
@@ -1,6 +1,6 @@
 ---
-title:
-emoji:
+title: Engineering Summer Bridge Demo
+emoji:
 colorFrom: purple
 colorTo: gray
 sdk: gradio
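For context, the lines changed above are the YAML front matter at the top of the Space's README.md, which Hugging Face Spaces reads to configure the Space card. A minimal sketch of the fields this diff touches (the `emoji` value is left unset here because the diff does not show it):

```yaml
---
title: Engineering Summer Bridge Demo
emoji:            # value not shown in the diff, left unset
colorFrom: purple
colorTo: gray
sdk: gradio
---
```

Spaces also accept further fields (e.g. `sdk_version`, `app_file`), but only the ones present in the diff are shown here.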
app.py
CHANGED
@@ -27,8 +27,6 @@ As a derivate work of [Llama-2-7b-chat](https://huggingface.co/meta-llama/Llama-
 this demo is governed by the original [license](https://huggingface.co/spaces/huggingface-projects/llama-2-7b-chat/blob/main/LICENSE.txt) and [acceptable use policy](https://huggingface.co/spaces/huggingface-projects/llama-2-7b-chat/blob/main/USE_POLICY.md).
 """
 
-default
-
 if not torch.cuda.is_available():
     DESCRIPTION += "We won't be able to run this space! We need GPU processing"
 
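The lines the diff preserves implement a common Spaces pattern: append a warning to the UI description when no GPU is available. A minimal, self-contained sketch of that pattern (the `cuda_available` helper and the `DESCRIPTION` header text are assumptions for illustration, so the sketch runs even where `torch` is not installed; the real app.py calls `torch.cuda.is_available()` directly):

```python
# Base description shown in the Gradio UI (header text is a placeholder).
DESCRIPTION = "# Llama-2 7B Chat demo\n"


def cuda_available() -> bool:
    """Stand-in for torch.cuda.is_available(); hypothetical helper so this
    sketch runs without torch installed."""
    try:
        import torch
        return torch.cuda.is_available()
    except ImportError:
        return False


# Same check as in app.py: warn the user when the Space has no GPU.
if not cuda_available():
    DESCRIPTION += "We won't be able to run this space! We need GPU processing"
```

The commit itself only deletes a stray `default` line (and the blank line after it) that had been accidentally introduced at module level, which would have raised a `NameError` on import; the GPU check is unchanged.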