Excited to announce *Open Responses* – a self-hosted alternative to OpenAI's new _Responses API_ that you can run locally and use with ANY LLM model or provider, not just OpenAI. What's more, it's also compatible with their agents-sdk, so everything works out of the box!
To try it out, just run:

`npx -y open-responses init`

(or `uvx`) and that's it! :)

Would love feedback and support for adding local HF models, @akhaliq @bartowski @prithivMLmods @julien-c @clefourrier @philschmid
We’d love feedback from the Hugging Face community on how it integrates with your pipelines (support for Hugging Face models landing soon!). Let’s push open-source AI forward together!
Docs:
https://docs.julep.ai/responses/quickstart
Repo:
https://github.com/julep-ai/open-responses
agents-sdk:
https://platform.openai.com/docs/guides/agents
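Once the server is up, requests follow the same shape as OpenAI's `POST /v1/responses` endpoint. Here's a minimal stdlib-only sketch of what a call might look like – the port (`8080`) and the `gpt-4o-mini` model name are assumptions for illustration; check the quickstart docs for the actual defaults your setup uses:

```python
import json
from urllib import request

# Assumed local endpoint -- the port your `open-responses` instance
# listens on may differ; see the quickstart docs.
BASE_URL = "http://localhost:8080/v1"

payload = {
    "model": "gpt-4o-mini",  # any model your configured provider serves
    "input": "Say hello!",
}

# Build a Responses-API-shaped request against the local server.
req = request.Request(
    f"{BASE_URL}/responses",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer not-needed-locally",  # placeholder key
    },
)

# resp = request.urlopen(req)  # uncomment once the server is running
print(json.loads(req.data)["model"])  # → gpt-4o-mini
```

Because the wire format matches, the official OpenAI SDKs should also work by pointing their `base_url` at the local server instead of api.openai.com.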