---
title: Advanced Dual Prompting
emoji: π
colorFrom: green
colorTo: purple
sdk: gradio
sdk_version: 4.16.0
app_file: app.py
pinned: false
---
# Simple Dual LLM Chatbot
This is a playground for testing Stanford's 'Meta-Prompting' logic (paper link): every user request is first passed to a 'meta' bot, which generates a system prompt describing a field-relevant 'expert' bot; that expert bot then answers the user's request.
That is, on each round the LLM assigns the best-suited expert for the user's specific request.
Stanford claims this simple setup yields 60%+ better accuracy than a standard 'system prompt + chat history' setup.
That is hard not to be curious about, so here is a simple implementation for everybody to play around with.
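The two-stage round described above can be sketched roughly as below. This is a hypothetical outline, not the app's actual code: `call_llm` is a stub standing in for whatever chat-completion client is used (e.g. the ChatGLM SDK), and `META_SYSTEM` is an assumed wording of the meta bot's instructions.

```python
# Instructions for the 'meta' bot (assumed wording, for illustration only).
META_SYSTEM = (
    "You are a meta assistant. Given the user's request, write a short "
    "system prompt describing the single best-suited expert to answer it. "
    "Reply with the system prompt only."
)

def call_llm(messages):
    # Stub so the sketch runs end-to-end; replace with a real API call
    # that takes a list of {role, content} messages and returns a string.
    return f"[reply to: {messages[-1]['content']}]"

def dual_prompt_round(user_request):
    # Stage 1: the meta bot drafts an expert persona for this request.
    expert_prompt = call_llm([
        {"role": "system", "content": META_SYSTEM},
        {"role": "user", "content": user_request},
    ])
    # Stage 2: a fresh 'expert' bot answers under that persona.
    return call_llm([
        {"role": "system", "content": expert_prompt},
        {"role": "user", "content": user_request},
    ])
```

Note that the expert's system prompt is regenerated from scratch every round, so consecutive requests on different topics get different experts.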
Something to keep in mind:
- Currently it requires an API key from ChatGLM (get one here if you don't have one: link)
- To balance contextual understanding against token cost, the meta bot's logic is modified so that it only sees the last round of chat plus the current user request when generating an expert.
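The token-saving trim in the last point can be sketched as follows. This is a minimal illustration, assuming `history` is a list of `(user_msg, bot_msg)` pairs (the shape Gradio's `Chatbot` component uses); the function name `meta_context` is made up for this example.

```python
def meta_context(history, user_request):
    """Build the message list the meta bot sees: only the most recent
    (user, assistant) exchange plus the new request, not the full history."""
    messages = []
    if history:
        last_user, last_bot = history[-1]  # keep only the last round
        messages.append({"role": "user", "content": last_user})
        messages.append({"role": "assistant", "content": last_bot})
    messages.append({"role": "user", "content": user_request})
    return messages
```

Older rounds are simply dropped, trading some long-range context for a much smaller prompt on every call.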