Deploying Pythonic RAG
Files changed:
- .gitignore +2 -1
- README.md +5 -0
- app.py +3 -0
- pythonic-rag +1 -0
.gitignore
CHANGED

```diff
@@ -1,4 +1,5 @@
 __pycache__/
 .chainlit/
 .venv/
-.env
+.env
+*.pdf
```
README.md
CHANGED

```diff
@@ -168,6 +168,9 @@ Simply put, this downloads the file as a temp file, we load it in with `TextFile
 
 Why do we want to support streaming? What about streaming is important, or useful?
 
+**[Answer]** For a seamless user experience, we want to stream the response to the user as soon as possible. The user sees the answer as it is being generated instead of waiting for the full completion, and the application stays responsive while generation is in progress.
+
+
 ### On Chat Start:
 
 The next scope is where "the magic happens". On Chat Start is when a user begins a chat session. This will happen whenever a user opens a new chat window, or refreshes an existing chat window.
```
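The streaming answer can be sketched without any framework: an async generator stands in for the model's token stream, and the consumer handles each token as it arrives. Everything here is illustrative — `fake_llm_stream` and `answer` are not part of the codebase; in the real app, Chainlit's message streaming would push each token to the UI at the marked point.

```python
import asyncio

async def fake_llm_stream(prompt: str):
    # Stand-in for a streaming LLM call: yields the completion one token
    # at a time instead of returning it all at once.
    for token in ["Paris", " is", " the", " capital", " of", " France", "."]:
        await asyncio.sleep(0)  # hand control back, as a network read would
        yield token

async def answer(prompt: str) -> str:
    reply = ""
    async for token in fake_llm_stream(prompt):
        # In the Chainlit app, this is where each token would be pushed to
        # the UI, so text appears as it is generated.
        reply += token
    return reply

result = asyncio.run(answer("What is the capital of France?"))
print(result)  # Paris is the capital of France.
```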
```diff
@@ -210,6 +213,8 @@ Now, we'll save that into our user session!
 
 Why are we using User Session here? What about Python makes us need to use this? Why not just store everything in a global variable?
 
+**[Answer]** We use the user session because a single Chainlit process serves many users at once: a global variable would be shared across every connected user, so one user's chain would overwrite another's. Storing the chain in the user session keeps it scoped to each individual chat session.
+
 ### On Message
 
 First, we load our chain from the user session:
```
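A toy model of the global-variable pitfall: one server process serving two concurrent users. The names here (`on_chat_start`, `user_sessions`) are illustrative stand-ins, not Chainlit's actual API.

```python
chain_global = None   # shared across every connected user -- the bug
user_sessions = {}    # per-session storage, in the spirit of cl.user_session

def on_chat_start(session_id: str, chain: str) -> None:
    global chain_global
    chain_global = chain                # second caller clobbers the first
    user_sessions[session_id] = chain   # each session keeps its own copy

# Two users start chats in the same process
on_chat_start("alice", "chain over alice.pdf")
on_chat_start("bob", "chain over bob.pdf")

print(chain_global)            # chain over bob.pdf -- Alice's chain is gone
print(user_sessions["alice"])  # chain over alice.pdf -- still intact
```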
app.py
CHANGED

```diff
@@ -12,6 +12,9 @@ from aimakerspace.vectordatabase import VectorDatabase
 from aimakerspace.openai_utils.chatmodel import ChatOpenAI
 import chainlit as cl
 
+from dotenv import load_dotenv
+load_dotenv()
+
 system_template = """\
 Use the following context to answer a users question. If you cannot find the answer in the context, say you don't know the answer."""
 system_role_prompt = SystemRolePrompt(system_template)
```
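The two added lines do the work of pulling secrets out of the source: `load_dotenv()` copies KEY=VALUE pairs from a `.env` file into the process environment before the OpenAI client looks for its key (which is why `.env` is in `.gitignore`). As a rough, dependency-free approximation of what it does — `load_env_file` and `DEMO_API_KEY` are made up for illustration:

```python
import os

def load_env_file(path: str = ".env") -> None:
    # Minimal stand-in for python-dotenv's load_dotenv(): copy KEY=VALUE
    # lines from a file into os.environ, skipping blanks and comments,
    # without overriding variables that are already set.
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip().strip('"'))

# Demo with a throwaway file and a made-up variable name
with open("demo.env", "w") as f:
    f.write("# local secrets -- keep this file out of git\n")
    f.write("DEMO_API_KEY=sk-demo-not-a-real-key\n")
load_env_file("demo.env")
print(os.environ["DEMO_API_KEY"])  # sk-demo-not-a-real-key
os.remove("demo.env")
```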
pythonic-rag
ADDED

```diff
@@ -0,0 +1 @@
+Subproject commit 05854ba074d94068784ee4330210ced8dfe073e8
```