# Custom AI Clients

Fellow allows you to define your own AI backend by implementing the `Client` protocol. This can be used to connect to services like Claude, Mistral, or even local LLMs.
## Core Concepts

A custom AI client must:

- Define a subclass of `ClientConfig` (a Pydantic model for your settings).
- Implement the `Client` protocol (either manually or using the `init-client` command generator).
- Register your client in your `config.yml`.
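Below is a minimal sketch of what such a configuration subclass could look like. The import path and the field names are illustrative assumptions; adjust them to your own setup.

```python
# Sketch only: the import path is an assumption, so point it at wherever your
# Fellow installation actually exposes ClientConfig.
from fellow.clients.Client import ClientConfig


class MyClientConfig(ClientConfig):
    # Example fields; use whatever settings your backend actually needs.
    system_content: str = "You are a helpful assistant."
    model: str = "custom-model-name"
```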
## Client Interface Overview

The `Client` protocol defines five key methods:

### `create(config: ClientConfig) -> Client`

Used to instantiate the client based on configuration data. This is the method Fellow uses when initializing the AI engine.
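A hedged sketch of how `create` might be implemented, reusing the `MyClientConfig` model from the sketch above; the exact shape of the factory may differ in your Fellow version, so follow the generated boilerplate.

```python
class MyClient:
    def __init__(self, config: "MyClientConfig"):
        self.config = config
        self.history: list = []  # running conversation history
        self.plan: str = ""

    @classmethod
    def create(cls, config: "MyClientConfig") -> "MyClient":
        # Called by Fellow when the AI engine is initialized.
        return cls(config)
```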
### `chat(...) -> ChatResult`

Handles user messages and optional function results. Returns a `ChatResult` object that may include:

- `message`: the assistant’s textual reply
- `function_name`: the name of the function the assistant wants to call (if any)
- `function_args`: the JSON-encoded arguments to that function

Parameters:

- `functions` (`List[Function]`)
  A list of all available commands (as JSON schemas) that the AI is allowed to call during this interaction. These are generated by `command.get_function_schema(...)`.
- `message` (`str`)
  The raw user prompt to be sent to the AI model. This may be empty if the previous step involved a tool call and you’re now injecting the result back.
- `function_result` (`Optional[FunctionResult]`)
  If the last action involved calling a command, this parameter contains the result of that command execution. Used for models that support tool-use feedback (e.g., OpenAI or Gemini with function response injection).

This method is the heart of your client implementation. It is called repeatedly during a reasoning cycle, alternating between message → function_call → function_result → message.
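Below is a hedged sketch of what a `chat` implementation could look like. The import path, the exact method signature, and the `_call_backend` helper are assumptions for illustration; the real types come from Fellow, and the real request goes to whatever service you are wrapping.

```python
from typing import List, Optional

# Assumed import path; adjust to wherever your Fellow installation exposes
# ChatResult, Function, and FunctionResult.
from fellow.clients.Client import ChatResult, Function, FunctionResult


class MyClient:  # continuing the sketch above
    def chat(
        self,
        functions: List[Function],
        message: str,
        function_result: Optional[FunctionResult] = None,
    ) -> ChatResult:
        # Feed back the result of the previous command execution, if any.
        if function_result is not None:
            self.history.append({"role": "tool", "content": str(function_result)})

        # Append the new user prompt (may be empty right after a tool call).
        if message:
            self.history.append({"role": "user", "content": message})

        # Ask the backend for a completion, exposing `functions` as tool schemas.
        # _call_backend is a hypothetical helper wrapping your service's API.
        response = self._call_backend(self.history, tools=functions)

        # Translate the backend's answer into a ChatResult.
        if response.get("tool_call"):
            return ChatResult(
                message=None,
                function_name=response["tool_call"]["name"],
                function_args=response["tool_call"]["arguments"],  # JSON-encoded string
            )
        return ChatResult(message=response["text"], function_name=None, function_args=None)
```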
### `store_memory(filename: str)`

Exports the full conversation history to a file (e.g. for saving session context).
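A sketch, assuming the client keeps its history as a list of JSON-serializable message dicts (as in the `chat` sketch above); the file format is entirely up to you.

```python
import json


class MyClient:  # continuing the sketch above
    def store_memory(self, filename: str) -> None:
        # Dump the running conversation history to disk as pretty-printed JSON.
        with open(filename, "w", encoding="utf-8") as f:
            json.dump(self.history, f, indent=2, ensure_ascii=False)
```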
### `set_plan(plan: str)`

Stores a plan for the assistant (usually injected during reasoning).
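A minimal sketch; how the stored plan is later woven into the conversation (for example as an extra system message) is up to your implementation.

```python
class MyClient:  # continuing the sketch above
    def set_plan(self, plan: str) -> None:
        # Keep the plan around so it can be added to the context of later chat() calls.
        self.plan = plan
```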
### `get_function_schema(command) -> Function`

Returns the JSON schema used for tool/function calling.
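The exact shape depends on your backend. Purely as an illustration, an OpenAI-style function schema for a hypothetical `create_file` command might look roughly like this; Fellow's `Function` type may wrap it differently.

```python
# Illustrative OpenAI-style function schema for a hypothetical "create_file"
# command; not necessarily the exact structure Fellow's Function type uses.
create_file_schema = {
    "name": "create_file",
    "description": "Create a new file with the given content.",
    "parameters": {
        "type": "object",
        "properties": {
            "filepath": {"type": "string", "description": "Path of the file to create"},
            "content": {"type": "string", "description": "Text to write into the file"},
        },
        "required": ["filepath"],
    },
}
```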
## Use the `init-client` Generator

Fellow includes a built-in helper to create a boilerplate file for your client:

```bash
fellow init-client my_client
```

This will generate a file like `MyClient.py` in the first directory listed under `custom_clients_paths`.

Make sure your `config.yml` points to the correct directory:

```yaml
custom_clients_paths:
  - .fellow/clients
```
## Configuration

Once your client is implemented and placed in a path listed in `custom_clients_paths`, you can enable it by updating your `config.yml`:

```yaml
ai_client:
  client: myclient  # lowercase name of your file (e.g., myclient -> MyClient)
  config:
    system_content: "You are a helpful assistant."
    model: "custom-model-name"
```

Fellow will automatically locate and load the class `MyClient` from `MyClient.py`.
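Putting both settings together, a complete `config.yml` for this example might look like the following; the keys under `config:` must match the fields of your `ClientConfig` subclass.

```yaml
custom_clients_paths:
  - .fellow/clients

ai_client:
  client: myclient
  config:
    system_content: "You are a helpful assistant."
    model: "custom-model-name"
```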