About this Learning Path

Who is this for?

This Learning Path is for LLM and IoT developers who want to run and interact with AI agents on edge devices like the Raspberry Pi 5. You'll learn how to deploy a lightweight Model Context Protocol (MCP) server and use the OpenAI Agent SDK to create and register tools for intelligent local inference.

What will you learn?

Upon completion of this Learning Path, you will be able to:

  • Deploy a lightweight Model Context Protocol (MCP) server on Raspberry Pi 5 for local AI agent execution.
  • Use the OpenAI Agent SDK to interact with a local AI agent.
  • Design and register custom tools for agent tasks (a short sketch follows this list).
  • Use uv, a fast and efficient Python package manager, to simplify local deployment.
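
As a preview of the tool-registration workflow, here is a minimal sketch using the openai-agents Python package. The `get_temperature` tool and the prompt are hypothetical examples; deploying the agent against a local model on the Raspberry Pi is what the Learning Path itself walks through.

```python
# Minimal sketch: registering a custom tool with the OpenAI Agents SDK.
# The tool below is a hypothetical stub; real tools would wrap sensors,
# GPIO, or other resources available on the Raspberry Pi.
from agents import Agent, Runner, function_tool


@function_tool
def get_temperature(location: str) -> str:
    """Return a (stubbed) temperature reading for a location."""
    return f"The temperature in {location} is 22 °C."


agent = Agent(
    name="Edge Assistant",
    instructions="You are a helpful assistant running on a Raspberry Pi.",
    tools=[get_temperature],
)

# Note: by default the SDK talks to the hosted OpenAI API; pointing it at a
# local model is part of the setup covered in this Learning Path.
result = Runner.run_sync(agent, "What's the temperature in the lab?")
print(result.final_output)
```

The `@function_tool` decorator derives the tool's schema from the function signature and docstring, so the agent can decide on its own when to call it.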

Prerequisites

Before starting, you will need the following:

  • A Raspberry Pi 5 with a Linux-based OS installed.
  • Familiarity with Python programming and prompt engineering techniques.
  • Basic understanding of Large Language Models (LLMs) and how they are used in local inference.
  • Understanding of AI agents and the OpenAI Agent SDK (or similar frameworks).