Dummy Agent Library

Unit 1 planning

This course is framework-agnostic because we want to focus on the concepts of AI agents and avoid getting bogged down in the specifics of a particular framework. We also want students to be able to apply the concepts they learn here in their own projects, using any framework.

Therefore, for this introductory unit, we will use a dummy agent library and a simple serverless API, which will be our LLM engine. In reality, you would not use these in production, but they will serve as a good starting point for understanding how agents work. In the following units we will use AI agent libraries like smolagents, LangGraph, LangChain, and LlamaIndex.

Good Old Python

To keep things simple, we will use simple Python functions as tools and agents. We will use built-in Python packages like datetime and os so that you can try it out in any environment.

Below is an example of a dummy agent that returns the full name of a person and the current date.

from datetime import datetime

def get_current_time():
    return datetime.now().strftime("%Y-%m-%d %H:%M:%S")

def get_surname(first_name: str):
    # Look up the surname; fall back to "unknown" for names not in the table
    # instead of raising a KeyError.
    return {
        "bill": "smith",
        "john": "doe",
        "jane": "smith",
    }.get(first_name, "unknown")

def agent(first_name: str):
    """ I'm a simple agent that returns the full name of a person and the date """
    surname = get_surname(first_name)
    date = get_current_time()
    return f"{first_name} {surname} {date}"
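To connect this to how agent frameworks operate, here is a minimal sketch of tool dispatch: the framework keeps a registry of tools, and when the LLM picks one by name, the framework calls the matching Python function. The `TOOLS` registry and `call_tool` helper below are illustrative names of our own, not part of any library.

```python
from datetime import datetime

def get_current_time():
    return datetime.now().strftime("%Y-%m-%d %H:%M:%S")

def get_surname(first_name: str):
    # Fall back to "unknown" for names not in the table.
    return {"bill": "smith", "john": "doe", "jane": "smith"}.get(first_name, "unknown")

# A registry mapping tool names to callables, as an agent framework might keep.
TOOLS = {
    "get_current_time": get_current_time,
    "get_surname": get_surname,
}

def call_tool(name: str, *args):
    """Dispatch a tool call the way a framework would after the LLM picks a tool."""
    return TOOLS[name](*args)

print(call_tool("get_surname", "john"))  # doe
```

In later units, the only real difference is that the tool name and arguments come from the LLM's output instead of being hard-coded.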

Serverless API

We will use a simple serverless API to interact with the LLM. We will use the huggingface_hub library to talk to an LLM served on Hugging Face via the Inference API. In reality, frameworks integrate with models or API providers like Hugging Face, OpenAI, Anthropic, and Google.

from huggingface_hub import InferenceClient

# Replace "hf_xxx" with a Hugging Face access token from your account settings.
client = InferenceClient(api_key="hf_xxx")
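As a sketch of what the client is used for, a helper like the one below sends a single chat message and returns the model's reply. The `ask` name and the model id are illustrative choices of ours, not part of the course; the call requires a valid token, so it is left commented out here.

```python
from huggingface_hub import InferenceClient

def ask(question: str, token: str) -> str:
    """Send one chat message to a hosted model and return the reply text."""
    client = InferenceClient(api_key=token)
    response = client.chat_completion(
        messages=[{"role": "user", "content": question}],
        model="meta-llama/Llama-3.2-3B-Instruct",  # illustrative model id
        max_tokens=100,
    )
    return response.choices[0].message.content

# ask("What is the capital of France?", token="hf_xxx")  # needs a real token
```

The messages list uses the OpenAI-style chat format (a list of role/content dicts), which is what most providers and frameworks expect.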