Building an MCP Server with Gradio

In this guide, we will describe how to launch your Gradio app so that it functions as an MCP Server.
Punchline: it's as simple as setting mcp_server=True in .launch().
Prerequisites

If not already installed, please install Gradio with the MCP extra:

pip install "gradio[mcp]"

This will install the necessary dependencies, including the mcp package. Also, you will need an LLM application that supports tool calling using the MCP protocol, such as Claude Desktop, Cursor, or Cline (these are known as "MCP Clients").
What is an MCP Server?

An MCP (Model Context Protocol) server is a standardized way to expose tools so that they can be used by LLMs. A tool can provide an LLM functionality that it does not have natively, such as the ability to generate images or calculate the prime factors of a number.
Example: Counting Letters in a Word

LLMs are famously not great at counting the number of letters in a word (e.g. the number of "r"-s in "strawberry"). But what if we equip them with a tool to help? Let's start by writing a simple Gradio app that counts the number of letters in a word or phrase:
import gradio as gr

def letter_counter(word, letter):
    """
    Count the number of occurrences of a letter in a word or text.

    Args:
        word (str): The input text to search through
        letter (str): The letter to search for

    Returns:
        int: The number of times the letter appears
    """
    word = word.lower()
    letter = letter.lower()
    count = word.count(letter)
    return count

demo = gr.Interface(
    fn=letter_counter,
    inputs=[gr.Textbox("strawberry"), gr.Textbox("r")],
    outputs=[gr.Number()],
    title="Letter Counter",
    description="Enter text and a letter to count how many times the letter appears in the text."
)

if __name__ == "__main__":
    demo.launch(mcp_server=True)

Notice that we have: (1) included a detailed docstring for our function, and (2) set mcp_server=True in .launch(). This is all that's needed for your Gradio app to serve as an MCP server! Now, when you run this app, it will:
- Start the regular Gradio web interface
- Start the MCP server
- Print the MCP server URL in the console

The MCP server will be accessible at:
http://your-server:port/gradio_api/mcp/sse

Gradio automatically converts the letter_counter function into an MCP tool that can be used by LLMs. The docstring of the function and the type hints of its arguments will be used to generate the description of the tool and its parameters. The name of the function will be used as the name of your tool. Any initial values you provide to your input components (e.g. "strawberry" and "r" in the gr.Textbox components above) will be used as the default values if your LLM doesn't specify a value for that particular input parameter.
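If you want to sanity-check what the LLM will actually see, you can fetch the generated schema directly. The snippet below is a minimal sketch, assuming the app is running locally on port 7860 and that the schema endpoint (described in the next section) returns JSON:

import json
import requests

# Fetch the auto-generated MCP tool schema from the running Gradio app.
# Assumption: the app is running locally on port 7860 and the endpoint returns JSON.
resp = requests.get("http://127.0.0.1:7860/gradio_api/mcp/schema")
resp.raise_for_status()
print(json.dumps(resp.json(), indent=2))  # tool names, descriptions, and input schemas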
Now, all you need to do is add this URL endpoint to your MCP Client (e.g. Claude Desktop, Cursor, or Cline), which typically means pasting this config in the settings:
{ "mcpServers": { "gradio": { "url": "http://your-server:port/gradio_api/mcp/sse" } } } (By the way, you can find the exact config to copy-paste by going to the "View API" link in the footer of your Gradio app, and then clicking on "MCP").
Key features of the Gradio <> MCP Integration

Tool Conversion: Each API endpoint in your Gradio app is automatically converted into an MCP tool with a corresponding name, description, and input schema. To view the tools and schemas, visit http://your-server:port/gradio_api/mcp/schema or go to the "View API" link in the footer of your Gradio app, and then click on "MCP".

Environment Variable Support: There are two ways to enable the MCP server functionality. Using the mcp_server parameter, as shown above:

demo.launch(mcp_server=True)

Or using an environment variable:

export GRADIO_MCP_SERVER=True

File Handling: The Gradio MCP server automatically handles file data conversions, including:

- Processing image files and returning them in the correct format
- Managing temporary file storage

By default, the Gradio MCP server accepts input images and files as full URLs ("http://..." or "https://..."). For convenience, an additional STDIO-based MCP server is also generated, which can be used to upload files to any remote Gradio app and which returns a URL that can be used for subsequent tool calls.
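To make the file handling concrete, here is a minimal sketch of a tool that accepts an image; the make_thumbnail function and sizes are illustrative, not part of the official examples. When an MCP client calls this tool, it supplies a public URL for the image and the Gradio server downloads the file before invoking the function:

import gradio as gr
from PIL import Image

def make_thumbnail(image: Image.Image) -> Image.Image:
    """
    Shrink an image to a thumbnail of at most 128x128 pixels.

    Args:
        image (Image.Image): The image to shrink.
    """
    image.thumbnail((128, 128))  # resizes in place, preserving aspect ratio
    return image

demo = gr.Interface(make_thumbnail, gr.Image(type="pil"), gr.Image())

if __name__ == "__main__":
    demo.launch(mcp_server=True)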
Hosted MCP Servers on Hugging Face Spaces

You can publish your Gradio application for free on Hugging Face Spaces, which will allow you to have a free hosted MCP server. Here's an example of such a Space: https://huggingface.co/spaces/abidlabs/mcp-tools. Notice that you can add this config to your MCP Client to start using the tools from this Space immediately:
{ "mcpServers": { "gradio": { "url": "https://abidlabs-mcp-tools.hf.space/gradio_api/mcp/sse" } } } Converting an Existing Space If there's an existing Space that you'd like to use an MCP server, you'll need to do three things:
1. First, duplicate the Space if it is not your own Space. This will allow you to make changes to the app. If the Space requires a GPU, set the hardware of the duplicated Space to be the same as the original Space. You can make it either a public Space or a private Space, since it is possible to use either as an MCP server, as described below.
2. Then, add docstrings to the functions that you'd like the LLM to be able to call as a tool. The docstring should be in the same format as the example code above.
3. Finally, add mcp_server=True in .launch().

That's it!
Private Spaces

You can use either a public Space or a private Space as an MCP server. If you'd like to use a private Space as an MCP server (or a ZeroGPU Space with your own quota), then you will need to provide your Hugging Face token when you make your request. To do this, simply add it as a header in your config like this:
{ "mcpServers": { "gradio": { "url": "https://abidlabs-mcp-tools.hf.space/gradio_api/mcp/sse", "headers": { "Authorization": "Bearer " } } } } Authentication and Credentials You may wish to authenticate users more precisely or let them provide other kinds of credentials or tokens in order to provide a custom experience for different users.
Gradio allows you to access the underlying starlette.Request that has made the tool call, which means that you can access headers, originating IP address, or any other information that is part of the network request. To do this, simply add a parameter in your function of the type gr.Request, and Gradio will automatically inject the request object as the parameter.
Here's an example:
import gradio as gr
def echo_headers(x, request: gr.Request):
    return str(dict(request.headers))

gr.Interface(echo_headers, "textbox", "textbox").launch(mcp_server=True)

This MCP server will simply ignore the user's input and echo back all of the headers from a user's request. One can build more complex apps using the same idea. See the docs on gr.Request for more information (note that only the core Starlette attributes of the gr.Request object will be present; attributes such as Gradio's .session_hash will not be present).
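Because the injected object exposes the core Starlette attributes, you can also inspect things like the originating client address. The following is a minimal sketch (greet_by_ip is an illustrative function, and it assumes the underlying Starlette request populates .client):

import gradio as gr

def greet_by_ip(name: str, request: gr.Request) -> str:
    """
    Greet the user and report which client address made the tool call.

    Args:
        name (str): The name to greet.
    """
    # Assumption: .client is populated on the underlying Starlette request
    client_host = request.client.host if request.client else "unknown"
    return f"Hello {name}, your request came from {client_host}."

gr.Interface(greet_by_ip, "textbox", "textbox").launch(mcp_server=True)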
Using the gr.Header class

A common pattern in MCP server development is to use authentication headers to call services on behalf of your users. Instead of using a gr.Request object like in the example above, you can use a gr.Header argument. Gradio will automatically extract that header from the incoming request (if it exists) and pass it to your function.
In the example below, the X-API-Token header is extracted from the incoming request and passed in as the x_api_token argument to make_api_request_on_behalf_of_user.
The benefit of using gr.Header is that the MCP connection docs will automatically display the headers you need to supply when connecting to the server! See the image below:
import gradio as gr
def make_api_request_on_behalf_of_user(prompt: str, x_api_token: gr.Header):
    """
    Make a request to everyone's favorite API.

    Args:
        prompt: The prompt to send to the API.

    Returns:
        The response from the API.

    Raises:
        AssertionError: If the API token is not valid.
    """
    return "Hello from the API" if not x_api_token else "Hello from the API with token!"

demo = gr.Interface(
    make_api_request_on_behalf_of_user,
    [gr.Textbox(label="Prompt")],
    gr.Textbox(label="Response"),
)

demo.launch(mcp_server=True)

[Image: MCP Header Connection Page]
Sending Progress Updates

The Gradio MCP server automatically sends progress updates to your MCP Client based on the queue in the Gradio application. If you'd like to send custom progress updates, you can do so using the same mechanism as you would use to display progress updates in the UI of your Gradio app: by using the gr.Progress class!
Here's an example of how to do this:
import gradio as gr
import time

def slow_text_reverser(text: str, progress=gr.Progress()):
    for i in range(len(text)):
        progress(i / len(text), desc="Reversing text")
        time.sleep(0.3)
    return text[::-1]

demo = gr.Interface(slow_text_reverser, gr.Textbox("Hello, world!"), gr.Textbox())

if __name__ == "__main__":
    demo.launch(mcp_server=True)

Here are the docs for the gr.Progress class, which can also automatically track tqdm calls.
Modifying Tool Descriptions

Gradio automatically sets the tool name based on the name of your function, and the description from the docstring of your function. But you may want to change how the description appears to your LLM. You can do this by using the api_description parameter in Interface, ChatInterface, or any event listener. This parameter takes three different kinds of values:
- None (default): the tool description is automatically created from the docstring of the function (or its parent's docstring if it does not have a docstring but inherits from a method that does).
- False: no tool description appears to the LLM.
- str: an arbitrary string to use as the tool description.

In addition to modifying the tool descriptions, you can also toggle which tools appear to the LLM. You can do this by setting the show_api parameter, which is True by default. Setting it to False hides the endpoint from the API docs and from the MCP server. If you expose multiple tools, users of your app will also be able to toggle which tools they'd like to add to their MCP server by checking boxes in the "view MCP or API" panel.
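Since api_description also works on event listeners, here is a minimal Blocks sketch (the shout function and labels are illustrative, not from the original guide):

import gradio as gr

def shout(text: str) -> str:
    """Return the input text in upper case with exclamation marks."""
    return text.upper() + "!!!"

with gr.Blocks() as demo:
    inp = gr.Textbox(label="Text")
    out = gr.Textbox(label="Shouted")
    btn = gr.Button("Shout")
    # Override the docstring-derived description for this event's MCP tool
    btn.click(
        shout,
        inp,
        out,
        api_description="Convert the input text to upper case and append exclamation marks.",
    )

if __name__ == "__main__":
    demo.launch(mcp_server=True)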
Here's an example that shows the api_description and show_api parameters in action:
import os
from pathlib import Path

import gradio as gr
import numpy as np
from PIL import Image
def prime_factors(n: str):
    """
    Compute the prime factorization of a positive integer.

    Args:
        n (str): The integer to factorize. Must be greater than 1.
    """
    n_int = int(n)
    if n_int <= 1:
        raise ValueError("Input must be an integer greater than 1.")
    factors = []
    while n_int % 2 == 0:
        factors.append(2)
        n_int //= 2
    divisor = 3
    while divisor * divisor <= n_int:
        while n_int % divisor == 0:
            factors.append(divisor)
            n_int //= divisor
        divisor += 2
    if n_int > 1:
        factors.append(n_int)
    return factors
def generate_cheetah_image():
    """
    Generate a cheetah image.

    Returns:
        The generated cheetah image.
    """
    return Path(os.path.dirname(__file__)) / "cheetah.jpg"
def image_orientation(image: Image.Image) -> str:
    """
    Returns whether image is portrait or landscape.

    Args:
        image (Image.Image): The image to check.

    Returns:
        str: "Portrait" if image is portrait, "Landscape" if image is landscape.
    """
    return "Portrait" if image.height > image.width else "Landscape"
def sepia(input_img):
    """
    Apply a sepia filter to the input image.

    Args:
        input_img (np.array): The input image to apply the sepia filter to.

    Returns:
        The sepia filtered image.
    """
    sepia_filter = np.array([
        [0.393, 0.769, 0.189],
        [0.349, 0.686, 0.168],
        [0.272, 0.534, 0.131]
    ])
    sepia_img = input_img.dot(sepia_filter.T)
    sepia_img /= sepia_img.max()
    return sepia_img
demo = gr.TabbedInterface(
    [
        gr.Interface(prime_factors, gr.Textbox("1001"), gr.Textbox()),
        gr.Interface(generate_cheetah_image, None, gr.Image(), api_description="Generates a cheetah image. No arguments are required."),
        gr.Interface(image_orientation, gr.Image(type="pil"), gr.Textbox(), show_api=False),
        gr.Interface(sepia, gr.Image(), gr.Image(), api_description=False),
    ],
    [
        "Prime Factors",
        "Cheetah Image",
        "Image Orientation Checker",
        "Sepia Filter",
    ]
)
if name == "main": demo.launch(mcp_server=True) MCP Resources and Prompts In addition to tools (which execute functions generally and are the default for any function exposed through the Gradio MCP integration), MCP supports two other important primitives: resources (for exposing data) and prompts (for defining reusable templates). Gradio provides decorators to easily create MCP servers with all three capabilities.
Creating MCP Resources

Use the @gr.mcp.resource decorator on any function to expose data through your Gradio app. Resources can be static (always available at a fixed URI) or templated (with parameters in the URI).
""" Adapts the FastMCP quickstart example to work with Gradio's MCP integration. """ import gradio as gr
@gr.mcp.tool() # Not needed as functions are registered as tools by default def add(a: int, b: int) -> int: """Add two numbers""" return a + b
@gr.mcp.resource("greeting://{name}") def get_greeting(name: str) -> str: """Get a personalized greeting""" return f"Hello, {name}!"
@gr.mcp.prompt() def greet_user(name: str, style: str = "friendly") -> str: """Generate a greeting prompt""" styles = { "friendly": "Please write a warm, friendly greeting", "formal": "Please write a formal, professional greeting", "casual": "Please write a casual, relaxed greeting", }
return f"{styles.get(style, styles['friendly'])} for someone named {name}."
demo = gr.TabbedInterface( [ gr.Interface(add, [gr.Number(value=1), gr.Number(value=2)], gr.Number()), gr.Interface(get_greeting, gr.Textbox("Abubakar"), gr.Textbox()), gr.Interface(greet_user, [gr.Textbox("Abubakar"), gr.Dropdown(choices=["friendly", "formal", "casual"])], gr.Textbox()), ], [ "Add", "Get Greeting", "Greet User", ] )
if name == "main": demo.launch(mcp_server=True) In this example:
- The get_greeting function is exposed as a resource with a URI template greeting://{name}
- When an MCP client requests greeting://Alice, it receives "Hello, Alice!"

Resources can also return images and other types of files or binary data. In order to return non-text data, you should specify the mime_type parameter in @gr.mcp.resource() and return a Base64 string from your function.

Creating MCP Prompts

Prompts help standardize how users interact with your tools. They're especially useful for complex workflows that require specific formatting or multiple steps.
The greet_user function in the example above is decorated with @gr.mcp.prompt(), which:
- Makes it available as a prompt template in MCP clients
- Accepts parameters (name and style) to customize the output
- Returns a structured prompt that guides the LLM's behavior

Adding MCP-Only Functions

So far, all of our MCP tools, resources, or prompts have corresponded to event listeners in the UI. This works well for functions that directly update the UI, but may not work if you wish to expose a "pure logic" function that should return raw data (e.g. a JSON object) without directly causing a UI update.
In order to expose such an MCP tool, you can create a pure Gradio API endpoint using gr.api (see full docs here). Here's an example of creating an MCP tool that slices a list:
import gradio as gr
def slice_list(lst: list, start: int, end: int) -> list:
    """
    A tool that slices a list given a start and end index.

    Args:
        lst: The list to slice.
        start: The start index.
        end: The end index.

    Returns:
        The sliced list.
    """
    return lst[start:end]
with gr.Blocks() as demo:
    gr.Markdown(
        """
        This is a demo of an MCP-only tool. This tool slices a list.
        This tool is MCP-only, so it does not have a UI.
        """
    )
    gr.api(slice_list)
_, url, _ = demo.launch(mcp_server=True)

Note that if you use this approach, your function signature must be fully typed, including the return value, as these signatures are used to determine the typing information for the MCP tool.
Gradio with FastMCP

In some cases, you may decide not to use Gradio's built-in integration and instead manually create a FastMCP server that calls a Gradio app. This approach is useful when you want to:
- Store state / identify users between calls, instead of treating every tool call completely independently
- Start the Gradio app MCP server only when a tool is called (if you are running multiple Gradio apps locally and want to save memory / GPU)

This is very doable thanks to the Gradio Python Client and the MCP Python SDK's FastMCP class. Here's an example of creating a custom MCP server that connects to various Gradio apps hosted on Hugging Face Spaces using the stdio protocol:
from mcp.server.fastmcp import FastMCP
from gradio_client import Client
import sys
import io
import json
mcp = FastMCP("gradio-spaces")
clients = {}
def get_client(space_id: str) -> Client:
    """Get or create a Gradio client for the specified space."""
    if space_id not in clients:
        clients[space_id] = Client(space_id)
    return clients[space_id]
@mcp.tool()
async def generate_image(prompt: str, space_id: str = "ysharma/SanaSprint") -> str:
    """Generate an image using Flux.

    Args:
        prompt: Text prompt describing the image to generate
        space_id: HuggingFace Space ID to use
    """
    client = get_client(space_id)
    result = client.predict(
        prompt=prompt,
        model_size="1.6B",
        seed=0,
        randomize_seed=True,
        width=1024,
        height=1024,
        guidance_scale=4.5,
        num_inference_steps=2,
        api_name="/infer"
    )
    return result
@mcp.tool()
async def run_dia_tts(prompt: str, space_id: str = "ysharma/Dia-1.6B") -> str:
    """Text-to-Speech Synthesis.

    Args:
        prompt: Text prompt describing the conversation between speakers S1, S2
        space_id: HuggingFace Space ID to use
    """
    client = get_client(space_id)
    result = client.predict(
        text_input=f"""{prompt}""",
        audio_prompt_input=None,
        max_new_tokens=3072,
        cfg_scale=3,
        temperature=1.3,
        top_p=0.95,
        cfg_filter_top_k=30,
        speed_factor=0.94,
        api_name="/generate_audio"
    )
    return result
if name == "main": import sys import io sys.stdout = io.TextIOWrapper(sys.stdout.buffer, encoding='utf-8')
mcp.run(transport='stdio')
This server exposes two tools:
- run_dia_tts: Generates a conversation for the given transcript in the form of [S1]first-sentence. [S2]second-sentence. [S1]...
- generate_image: Generates images using a fast text-to-image model

To use this MCP Server with Claude Desktop (as MCP Client):
1. Save the code to a file (e.g., gradio_mcp_server.py)
2. Install the required dependencies: pip install mcp gradio-client
3. Configure Claude Desktop to use your server by editing the configuration file at ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows):
{ "mcpServers": { "gradio-spaces": { "command": "python", "args": [ "/absolute/path/to/gradio_mcp_server.py" ] } } } Restart Claude Desktop Now, when you ask Claude about generating an image or transcribing audio, it can use your Gradio-powered tools to accomplish these tasks.
Troubleshooting your MCP Servers

The MCP protocol is still in its infancy and you might see issues connecting to an MCP Server that you've built. We generally recommend using the MCP Inspector Tool to try connecting to and debugging your MCP Server.
Here are some things that may help:
- Ensure that you've provided type hints and valid docstrings for your functions
As mentioned earlier, Gradio reads the docstrings for your functions and the type hints of input arguments to generate the description of the tool and parameters. A valid function and docstring looks like this (note the "Args:" block with indented parameter names underneath):
def image_orientation(image: Image.Image) -> str:
    """
    Returns whether image is portrait or landscape.

    Args:
        image (Image.Image): The image to check.
    """
    return "Portrait" if image.height > image.width else "Landscape"
Note: You can preview the schema that is created for your MCP server by visiting the http://your-server:port/gradio_api/mcp/schema URL.
- Try accepting input arguments as str
Some MCP Clients do not recognize parameters that are numeric or other complex types, but all of the MCP Clients that we've tested accept str input parameters. When in doubt, change your input parameter to be a str and then cast to a specific type in the function, as in this example:
def prime_factors(n: str):
    """
    Compute the prime factorization of a positive integer.

    Args:
        n (str): The integer to factorize. Must be greater than 1.
    """
    n_int = int(n)
    if n_int <= 1:
        raise ValueError("Input must be an integer greater than 1.")
    factors = []
    while n_int % 2 == 0:
        factors.append(2)
        n_int //= 2
    divisor = 3
    while divisor * divisor <= n_int:
        while n_int % divisor == 0:
            factors.append(divisor)
            n_int //= divisor
        divisor += 2
    if n_int > 1:
        factors.append(n_int)
    return factors
- Ensure that your MCP Client Supports SSE
Some MCP Clients, notably Claude Desktop, do not yet support SSE-based MCP Servers. In those cases, you can use a tool such as mcp-remote. First install Node.js. Then, add the following to your own MCP Client config:
{ "mcpServers": { "gradio": { "command": "npx", "args": [ "mcp-remote", "http://your-server:port/gradio_api/mcp/sse" ] } } } 4. Restart your MCP Client and MCP Server
Some MCP Clients require you to restart them every time you update the MCP configuration. Other times, if the connection between the MCP Client and servers breaks, you might need to restart the MCP server. If all else fails, try restarting both your MCP Client and MCP Servers!
Spaces as MCP servers

You can turn any public Space that has a visible MCP badge into a callable tool that will be available in any MCP-compatible client. You can add as many Spaces as you want, without writing a single line of code.
Setup your MCP Client

From your Hub MCP settings, select your MCP client (VSCode, Cursor, Claude Code, etc.), then follow the setup instructions.
You need a valid Hugging Face token with READ permissions to use MCP tools. If you don't have one, create a new "Read" access token here.

Add an existing Space to your MCP tools
Browse compatible Spaces to find Spaces that are usable via MCP. You can also look for the grey MCP badge on any Spaces card. Click the badge and choose Add to MCP tools, then confirm when asked. The Space should be listed in your MCP Server settings in the Spaces Tools section.
Use Spaces from your MCP client

If your MCP client is configured correctly, the Spaces you added will be available instantly without changing anything (if a Space doesn't appear, restart your client and it should). Most MCP clients will list which tools are currently loaded, so you can make sure the Space is available.
For ZeroGPU Spaces, your quota will be used when the tool is called. If you run out of quota, you can subscribe to PRO to get 25 minutes of daily quota (8x more than free users). For example, a PRO account lets you generate up to 600 images per day using FLUX.1-schnell.

Build your own MCP-compatible Gradio Space

To create your own MCP-enabled Space, create a new Gradio Space and make sure to enable MCP support in the code. Get started with Gradio Spaces and check the detailed MCP guide for more details.
First, install Gradio with MCP support:
pip install "gradio[mcp]"

Then create your app with clear type hints and docstrings:
import gradio as gr

def letter_counter(word: str, letter: str) -> int:
    """Count occurrences of a letter in a word.

    Args:
        word: The word to search in
        letter: The letter to count

    Returns:
        Number of times the letter appears in the word
    """
    return word.lower().count(letter.lower())

demo = gr.Interface(fn=letter_counter, inputs=["text", "text"], outputs="number")
demo.launch(mcp_server=True)  # exposes an MCP schema automatically

Push the app to a Gradio Space and it will automatically receive the MCP badge. Anyone can then add it as a tool with a single click.
It's also quite easy to convert an existing Gradio Space to an MCP server. Duplicate it from the context menu, add the mcp_server=True parameter to your launch() method, and ensure your functions have clear type hints and docstrings - you can use AI tools to automate this quite easily (example of AI generated docstrings).

Be creative by mixing Spaces!

As Hugging Face Spaces is the largest directory of AI apps, you can find many creative tools that can be used as MCP tools. Mixing and matching different Spaces can lead to powerful and creative workflows.
This video demonstrates the use of Lightricks/ltx-video-distilled and ResembleAI/Chatterbox in Claude Code to generate a video with audio.
The File Upload MCP Server

If you've tried to use a remote Gradio MCP server that takes a file as input (image, video, audio), you've probably run into an error.
The reason is that since the Gradio server is hosted on a different machine, any input files must be available via a public URL so that they can be downloaded on the remote machine.
There are many ways to host files on the internet, but they all require adding a manual step to your workflow. In the age of LLM agents, shouldn't we expect them to handle this step for us?
In this post, we'll show how you can connect your LLM to the "File Upload" MCP server so that it can handle the file uploading for you when appropriate!
Using the File Upload MCP Server

As of version 5.36.0, Gradio comes with a built-in MCP server that can upload files to a running Gradio application. In the View API page of the server, you should see a config snippet for this upload server if any of the tools require file inputs.
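The exact snippet is generated for your app, so copy the version shown on your own View API page. As a rough sketch of what it typically looks like (the uvx arguments and the <UPLOAD_DIRECTORY> placeholder below are assumptions):

"upload-files": {
  "command": "uvx",
  "args": [
    "--from",
    "gradio[mcp]",
    "gradio",
    "upload-mcp",
    "http://127.0.0.1:7860/",
    "<UPLOAD_DIRECTORY>"
  ]
}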
The command to start the MCP server takes two arguments:
1. The URL (or Hugging Face Space id) of the Gradio application to upload the files to. In this case, http://127.0.0.1:7860.
2. The local directory on your computer from which the server is allowed to upload files. For security, please make this directory as narrow as possible to prevent unintended file uploads.

You will also need to install uv (a Python package manager that can run Python scripts) before connecting from your MCP client.
If you have gradio installed locally and you don't want to install uv, you can replace the uvx command with the path to the gradio binary. It should look like this:
"upload-files": { "command": "", "args": [ "upload-mcp", "http://localhost:7860/", "/Users/freddyboulton/Pictures" ] } After connecting to the upload server, your LLM agent will know when to upload files for you automatically!
Conclusion

In this guide, we've covered how you can connect to the File Upload MCP Server so that your agent can upload files before using Gradio MCP servers. Remember to keep the allowed upload directory as narrow as possible to prevent unintended file uploads!