Writing mini-Auto-GPT
In this section, we will write a mini-Auto-GPT that uses a local LLM. To stay within the limits of small LLMs, we will build a slimmed-down version of Auto-GPT.
The mini-Auto-GPT will work with a context window of 4,000 tokens and generate up to 2,000 tokens per response.
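These two numbers imply a prompt budget: if 2,000 of the 4,000 context tokens are reserved for the response, the prompt (including conversation history) must fit in the remaining 2,000. Here is a hedged sketch of that budgeting; the word-based count is a rough stand-in for a real tokenizer, and the function names are placeholders rather than the actual mini_autogpt API:

```python
# Keeping the prompt inside a small model's context window.
CONTEXT_LIMIT = 4000      # total tokens the local LLM can handle
GENERATION_BUDGET = 2000  # tokens reserved for the model's reply
PROMPT_BUDGET = CONTEXT_LIMIT - GENERATION_BUDGET

def rough_token_count(text: str) -> int:
    # Very rough heuristic: ~1 token per word. Swap in your local
    # LLM's tokenizer for accurate counts.
    return len(text.split())

def trim_history(messages: list[str]) -> list[str]:
    """Drop the oldest messages until the prompt fits the budget."""
    kept: list[str] = []
    used = 0
    for msg in reversed(messages):  # walk newest-first
        cost = rough_token_count(msg)
        if used + cost > PROMPT_BUDGET:
            break
        kept.append(msg)
        used += cost
    return list(reversed(kept))  # restore chronological order
```

With this in place, a long conversation degrades gracefully: the oldest turns fall away first, and the most recent context always survives.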
I have created a mini-Auto-GPT model just for this book. It’s available on GitHub at https://github.com/Wladastic/mini_autogpt.
We will start by planning the structure of the mini-Auto-GPT model.
Planning the structure
The mini-Auto-GPT model will have the following components:
- Telegram chatbot
- Prompts for the LLM and basic thinking
- Simple memory to remember the conversation
Let’s take a closer look at these.
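Before examining each piece, here is a hedged sketch of how the three components could fit together. All names here (ConversationMemory, build_prompt) are hypothetical placeholders for illustration, not the actual mini_autogpt API:

```python
class ConversationMemory:
    """Simple memory: remember the conversation as a list of turns."""
    def __init__(self):
        self.turns = []

    def add(self, role: str, text: str):
        self.turns.append((role, text))

    def as_prompt(self) -> str:
        # Flatten the history into plain text the LLM can read.
        return "\n".join(f"{role}: {text}" for role, text in self.turns)

def build_prompt(memory: ConversationMemory, user_message: str) -> str:
    # Prompt for the LLM: system instructions, then the remembered
    # conversation, then the new message arriving from Telegram.
    system = "You are mini-Auto-GPT. Think step by step, then answer."
    return f"{system}\n{memory.as_prompt()}\nuser: {user_message}\nassistant:"
```

In the real application, the Telegram chatbot feeds `user_message` in, the assembled prompt goes to the local LLM, and the reply is both sent back over Telegram and appended to the memory.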
Telegram chatbot
Chatting with your AI over Telegram lets you interact with it from anywhere, so we will use a Telegram chatbot as the interface for the mini-Auto-GPT. We’...