# MedSAM2 - Custom Inference Endpoint
This repository contains a custom handler to deploy the MedSAM2 model using Hugging Face Inference Endpoints.
MedSAM2 is a state-of-the-art medical image segmentation model designed for zero-shot and fine-tuned applications across ultrasound, CT, and MRI modalities.
## Files

- `handler.py`: Defines a custom inference handler for processing inputs and generating predictions.
- `model.py`: Loads the pretrained `MedSAM2_latest.pt` weights directly from the Hugging Face Hub.
- `requirements.txt`: Python dependencies required to run the model.
- `MedSAM2_latest.pt`: (Optional) Include this file if you want to use local weights instead of downloading them from the Hub.
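Because the weights can come either from a local copy or from the Hub, `model.py` needs some resolution logic. A minimal sketch, assuming the `huggingface_hub` package is installed (the function name `resolve_weights` is hypothetical, not part of this repo):

```python
import os


def resolve_weights(filename: str = "MedSAM2_latest.pt",
                    repo_id: str = "wanglab/MedSAM2") -> str:
    """Return a local path to the checkpoint, preferring an existing local copy."""
    if os.path.exists(filename):
        return filename
    # Otherwise download (and cache) the file from the Hub.
    # Requires the `huggingface_hub` package from requirements.txt.
    from huggingface_hub import hf_hub_download
    return hf_hub_download(repo_id=repo_id, filename=filename)
```

The deferred import keeps `huggingface_hub` optional when a local checkpoint is already present.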
## Input Format

The model expects a JSON payload with a base64-encoded image:

```json
{ "image": "<base64-encoded image data>" }
```

Encode an image in Python:

```python
import base64

with open("your_image.png", "rb") as f:
    encoded = base64.b64encode(f.read()).decode("utf-8")
```
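As a sanity check, the payload round-trips losslessly. This sketch builds the JSON body and decodes it the way a server-side handler would; the byte string is a stand-in for real image data:

```python
import base64
import json

raw = b"\x89PNG fake image bytes"  # stand-in for a real image file

# Client side: build the JSON body the endpoint expects.
payload = json.dumps({"image": base64.b64encode(raw).decode("utf-8")})

# Server side: parse the JSON and recover the original bytes.
decoded = base64.b64decode(json.loads(payload)["image"])
assert decoded == raw  # base64 encoding is lossless
```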
## Output Format

The model returns predictions as a JSON object:

```json
{ "output": [[...]] }
```

Note: the current `handler.py` uses placeholder preprocessing. Update it with the actual image preprocessing and model inference logic for the MedSAM2 architecture.
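For orientation, a minimal `EndpointHandler` skeleton, the class Hugging Face Inference Endpoints loads from `handler.py`, might look like the following. The model loading and returned mask here are placeholders, not MedSAM2's actual logic:

```python
import base64
from typing import Any, Dict


class EndpointHandler:
    def __init__(self, path: str = ""):
        # `path` points at the repository directory; a real handler would
        # load the MedSAM2 weights here. Omitted in this sketch.
        self.model = None

    def __call__(self, data: Dict[str, Any]) -> Dict[str, Any]:
        # Accept the image either at the top level or under "inputs".
        payload = data.get("inputs", data)
        image_bytes = base64.b64decode(payload["image"])
        # Placeholder: replace with real preprocessing + model inference.
        mask = [[0] * 4 for _ in range(4)]  # dummy 4x4 segmentation mask
        return {"output": mask}
```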
## Deployment Instructions

1. Push this repo to the Hugging Face Hub.
2. Go to https://huggingface.co/inference-endpoints.
3. Create a new endpoint.
4. Select:
   - Your model repo (e.g. `username/my-medsam2-endpoint`)
   - "Custom handler"
   - Hardware: choose a GPU if required
5. Deploy.
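Once deployed, the endpoint can be queried with a small client. A sketch using only the standard library; the endpoint URL and token are placeholders you would copy from your endpoint's settings page:

```python
import base64
import json
from urllib import request


def build_payload(image_bytes: bytes) -> bytes:
    """JSON body with the base64-encoded image, as the handler expects."""
    return json.dumps(
        {"image": base64.b64encode(image_bytes).decode("utf-8")}
    ).encode("utf-8")


def query_endpoint(image_path: str, endpoint_url: str, hf_token: str) -> dict:
    # endpoint_url and hf_token are hypothetical; use your own values.
    with open(image_path, "rb") as f:
        body = build_payload(f.read())
    req = request.Request(
        endpoint_url,
        data=body,
        headers={
            "Authorization": f"Bearer {hf_token}",
            "Content-Type": "application/json",
        },
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())
```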
## Resources

- MedSAM2 Model: https://huggingface.co/wanglab/MedSAM2
- MedSAM2 Paper: https://arxiv.org/abs/2403.03662
- WangLab GitHub: https://github.com/WANG-lab/MedSAM
## License

Please review the license of the original model: https://huggingface.co/wanglab/MedSAM2

This repository is intended for research and educational purposes.