from typing import List

import gradio as gr
from sentence_transformers import SentenceTransformer

# Load the embedding model on CPU; trust_remote_code is required because the
# nomic-embed checkpoint ships custom modeling code.
model = SentenceTransformer("nomic-ai/nomic-embed-text-v1.5", trust_remote_code=True, device='cpu')


def embed(document: str):
    # Encode the input text into a dense embedding vector.
    return model.encode(document)


with gr.Blocks(title="Nomic Text Embeddings") as app:
    gr.Markdown("# Nomic Text Embeddings v1.5")
    gr.Markdown("Generate embeddings for your text using the nomic-embed-text-v1.5 model.")

    # Create an input text box
    text_input = gr.Textbox(label="Enter text to embed", placeholder="Type or paste your text here...")

    # Create an output component to display the embedding
    output = gr.JSON(label="Text Embedding")

    # Add a submit button with API name
    submit_btn = gr.Button("Generate Embedding", variant="primary")

    # Handle both button click and text submission
    submit_btn.click(embed, inputs=text_input, outputs=output, api_name="predict")
    text_input.submit(embed, inputs=text_input, outputs=output)
    # Add API usage guide
    gr.Markdown("## API Usage")
    gr.Markdown("""
You can use this API programmatically. Hugging Face Spaces requires using the Gradio client libraries, which handle queuing automatically.

### Quick Command-Line Usage
```bash
# Install the Gradio client
pip install gradio_client

# Generate an embedding with one command
python -c "from gradio_client import Client; print(Client('ipepe/nomic-embeddings').predict('Your text here', api_name='/predict'))"
```

### Python Example (Recommended)
```python
from gradio_client import Client

client = Client("ipepe/nomic-embeddings")
result = client.predict(
    "Your text to embed goes here",
    api_name="/predict"
)
print(result)  # Returns the embedding array
```

### JavaScript/Node.js Example
```javascript
import { client } from "@gradio/client";

const app = await client("ipepe/nomic-embeddings");
const result = await app.predict("/predict", ["Your text to embed goes here"]);
console.log(result.data);
```

### Direct HTTP (Advanced)
Direct HTTP requests require implementing the Gradio queue protocol:

1. POST to `/queue/join` to join the queue.
2. Listen to `/queue/data` via SSE for the result.
3. Handle session management.

For direct HTTP we recommend the official Gradio clients above, which handle all of this automatically.

The response will contain the embedding array as a list of floats.
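
If you do need raw HTTP, the sketch below shows roughly what the queue round-trip can look like with plain `requests`. It assumes the Gradio 4.x queue endpoints listed above, the default Space URL scheme (`https://ipepe-nomic-embeddings.hf.space`), and `fn_index` 0 for the predict event; these details vary between Gradio versions, so treat it as an illustration rather than a supported interface.

```python
# Rough sketch only: raw Gradio queue protocol (field names assume Gradio 4.x).
import json
import uuid

import requests

base = "https://ipepe-nomic-embeddings.hf.space"
session_hash = uuid.uuid4().hex  # any unique string identifying this client session

# 1. Join the queue with the input payload (fn_index 0 is an assumption; check /config).
requests.post(
    f"{base}/queue/join",
    json={"data": ["Your text to embed goes here"], "fn_index": 0, "session_hash": session_hash},
).raise_for_status()

# 2. Stream server-sent events until our job reports completion.
with requests.get(f"{base}/queue/data", params={"session_hash": session_hash}, stream=True) as resp:
    for line in resp.iter_lines():
        if not line.startswith(b"data:"):
            continue
        event = json.loads(line[len(b"data:"):])
        if event.get("msg") == "process_completed":
            print(event["output"]["data"][0])  # the embedding as a list of floats
            break
```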
| """) | |
| if __name__ == '__main__': | |
| app.launch(server_name="0.0.0.0", show_error=True, server_port=7860) |