
Aider can connect to most LLMs


Best models

Aider works best with GPT-4 Turbo and Claude 3 Opus, as they are the very best models for editing code.

Free models

Aider works with a number of free API providers:

- Google’s Gemini 1.5 Pro
- Llama 3 70B on Groq
- Cohere’s Command-R+

See the sections below for details on connecting to each of them.

Local models

Aider can also work with local models, for example using Ollama, and can access any local model that provides an OpenAI compatible API.

Use a capable model

Check Aider’s LLM leaderboard to see which models work best with aider.

Be aware that aider may not work well with less capable models. If you see the model returning code but aider isn’t able to edit your files and commit the changes, this is usually because the model isn’t capable of properly returning “code edits”. Models weaker than GPT-3.5 may have problems working well with aider.

Configuring models

Aider uses the LiteLLM package to connect to LLM providers. The LiteLLM provider docs contain more detail on all the supported providers, their models and any required environment variables.

OpenAI

To work with OpenAI’s models, you need to provide your OpenAI API key either in the OPENAI_API_KEY environment variable or via the --openai-api-key command line switch.

Aider has some built-in shortcuts for the most popular OpenAI models and has been tested and benchmarked to work well with them:

pip install aider-chat

export OPENAI_API_KEY=<key> # Mac/Linux
setx   OPENAI_API_KEY <key> # Windows

# GPT-4 Turbo is used by default
aider

# GPT-4 Turbo with Vision
aider --4-turbo-vision

# GPT-3.5 Turbo
aider --35-turbo

# List models available from OpenAI
aider --models openai/

You can use aider --model <model-name> to use any other OpenAI model. For example, if you want to use a specific version of GPT-4 Turbo you could do aider --model gpt-4-0125-preview.
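For example, to pin that specific version:

aider --model gpt-4-0125-preview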

Anthropic

To work with Anthropic’s models, you need to provide your Anthropic API key either in the ANTHROPIC_API_KEY environment variable or via the --anthropic-api-key command line switch.

Aider has some built-in shortcuts for the most popular Anthropic models and has been tested and benchmarked to work well with them:

pip install aider-chat

export ANTHROPIC_API_KEY=<key> # Mac/Linux
setx   ANTHROPIC_API_KEY <key> # Windows

# Claude 3 Opus
aider --opus

# Claude 3 Sonnet
aider --sonnet

# List models available from Anthropic
aider --models anthropic/

You can use aider --model <model-name> to use any other Anthropic model. For example, if you want to use a specific version of Opus you could do aider --model claude-3-opus-20240229.
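For example:

aider --model claude-3-opus-20240229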

Gemini

Google currently offers free API access to the Gemini 1.5 Pro model. This is the most capable free model to use with aider, with code editing capability that’s comparable to GPT-3.5. You’ll need a Gemini API key.

pip install aider-chat

export GEMINI_API_KEY=<key> # Mac/Linux
setx   GEMINI_API_KEY <key> # Windows

aider --model gemini/gemini-1.5-pro-latest

# List models available from Gemini
aider --models gemini/

Groq

Groq currently offers free API access to the models they host. The Llama 3 70B model works well with aider and is comparable to GPT-3.5 in code editing performance. You’ll need a Groq API key.

To use Llama 3 70B:

pip install aider-chat

export GROQ_API_KEY=<key> # Mac/Linux
setx   GROQ_API_KEY <key> # Windows

aider --model groq/llama3-70b-8192

# List models available from Groq
aider --models groq/

Cohere

Cohere offers free API access to their models. Their Command-R+ model works well with aider as a very basic coding assistant. You’ll need a Cohere API key.

To use Command-R+:

pip install aider-chat

export COHERE_API_KEY=<key> # Mac/Linux
setx   COHERE_API_KEY <key> # Windows

aider --model command-r-plus

# List models available from Cohere
aider --models cohere_chat/

Azure

Aider can connect to the OpenAI models on Azure.

pip install aider-chat

# Mac/Linux:                                           
export AZURE_API_KEY=<key>
export AZURE_API_VERSION=2023-05-15
export AZURE_API_BASE=https://myendpt.openai.azure.com

# Windows:
setx AZURE_API_KEY <key>
setx AZURE_API_VERSION 2023-05-15
setx AZURE_API_BASE https://myendpt.openai.azure.com

aider --model azure/<your_deployment_name>

# List models available from Azure
aider --models azure/

OpenRouter

Aider can connect to models provided by OpenRouter. You’ll need an OpenRouter API key.

pip install aider-chat

export OPENROUTER_API_KEY=<key> # Mac/Linux
setx   OPENROUTER_API_KEY <key> # Windows

# Any OpenRouter model
aider --model openrouter/<provider>/<model>

# List models available from OpenRouter
aider --models openrouter/

In particular, Llama 3 70B works well with aider, at low cost:

pip install aider-chat

export OPENROUTER_API_KEY=<key> # Mac/Linux
setx   OPENROUTER_API_KEY <key> # Windows

aider --model openrouter/meta-llama/llama-3-70b-instruct

Ollama

Aider can connect to local Ollama models.

# Pull the model
ollama pull <model>

# Start your ollama server
ollama serve

# In another terminal window...
pip install aider-chat

export OLLAMA_API_BASE=http://127.0.0.1:11434 # Mac/Linux
setx   OLLAMA_API_BASE http://127.0.0.1:11434 # Windows

aider --model ollama/<model>

In particular, llama3:70b works very well with aider:

ollama pull llama3:70b
ollama serve

# In another terminal window...
export OLLAMA_API_BASE=http://127.0.0.1:11434 # Mac/Linux
setx   OLLAMA_API_BASE http://127.0.0.1:11434 # Windows

aider --model ollama/llama3:70b 

Also see the model warnings section for information on warnings which will occur when working with models that aider is not familiar with.

Deepseek

Aider can connect to the Deepseek API, which is OpenAI compatible. They appear to grant 5M tokens of free API usage to new accounts.

pip install aider-chat

# Mac/Linux:
export OPENAI_API_KEY=<key>
export OPENAI_API_BASE=https://api.deepseek.com/v1

# Windows:
setx OPENAI_API_KEY <key>
setx OPENAI_API_BASE https://api.deepseek.com/v1

aider --model openai/deepseek-coder

See the model warnings section for information on warnings which will occur when working with models that aider is not familiar with.

OpenAI compatible APIs

Aider can connect to any LLM which is accessible via an OpenAI compatible API endpoint.

pip install aider-chat

# Mac/Linux:
export OPENAI_API_BASE=<endpoint>
export OPENAI_API_KEY=<key>

# Windows:
setx OPENAI_API_BASE <endpoint>
setx OPENAI_API_KEY <key>

# Prefix the model name with openai/
aider --model openai/<model-name>

See the model warnings section for information on warnings which will occur when working with models that aider is not familiar with.

Other LLMs

Aider uses the litellm package to connect to hundreds of other models. You can use aider --model <model-name> to use any supported model.

To explore the list of supported models you can run aider --models <model-name> with a partial model name. If the supplied name is not an exact match for a known model, aider will return a list of possible matching models. For example:

$ aider --models turbo

Aider v0.29.3-dev
Models which match "turbo":
- gpt-4-turbo-preview (openai/gpt-4-turbo-preview)
- gpt-4-turbo (openai/gpt-4-turbo)
- gpt-4-turbo-2024-04-09 (openai/gpt-4-turbo-2024-04-09)
- gpt-3.5-turbo (openai/gpt-3.5-turbo)
- ...

See the list of providers supported by litellm for more details.

Model warnings

Aider supports connecting to almost any LLM, but it may not work well with less capable models. If you see the model returning code but aider isn’t able to edit your files and commit the changes, this is usually because the model isn’t capable of properly returning “code edits”. Models weaker than GPT-3.5 may have problems working well with aider.

Aider tries to sanity check that it is configured correctly to work with the specified model:

- It checks that the environment variables needed to connect to the model are set.
- It checks whether it knows the model’s context window size and token costs.

Sometimes one or both of these checks will fail, so aider will issue some of the following warnings.

Missing environment variables

Model azure/gpt-4-turbo: Missing these environment variables:
- AZURE_API_BASE
- AZURE_API_VERSION
- AZURE_API_KEY

You need to set the listed environment variables. Otherwise you will get error messages when you start chatting with the model.

Unknown which environment variables are required

Model gpt-5: Unknown which environment variables are required.

Aider is unable to verify the environment because it doesn’t know which variables are required for the model. If required variables are missing, you may get errors when you attempt to chat with the model. You can look in the litellm provider documentation to see if the required variables are listed there.

Unknown model, did you mean?

Model gpt-5: Unknown model, context window size and token costs unavailable.
Did you mean one of these?
- gpt-4

If you specify a model that aider has never heard of, you will get an “unknown model” warning. This means aider doesn’t know the context window size and token costs for that model. Some minor functionality will be limited when using such models, but it’s not really a significant problem.

Aider will also try to suggest similarly named models, in case you made a typo or mistake when specifying the model name.

Editing format

Aider uses different “edit formats” to collect code edits from different LLMs:

- The “whole” format asks the LLM to return an updated copy of the entire file.
- The “diff” format asks the LLM to return targeted search/replace blocks.
- The “udiff” format asks the LLM to return changes as unified diffs.

Different models work best with different editing formats. Aider is configured to use the best edit format for the popular OpenAI and Anthropic models and the other models recommended on this page.

For lesser known models aider will default to using the “whole” editing format. If you would like to experiment with the more advanced formats, you can use these switches: --edit-format diff or --edit-format udiff.
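For example, to try the diff format with a model served through an OpenAI compatible endpoint (the model name is a placeholder, as elsewhere on this page):

aider --model openai/<model-name> --edit-format diff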

Using a .env file

Aider will read environment variables from a .env file in the root of your git repo or in the current directory. You can give it an explicit file to load with the --env-file <filename> parameter.
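For example (the path here is just an illustration):

aider --env-file ~/aider-keys.env  # hypothetical path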

You can use a .env file to store various keys and other settings for the models you use with aider.

Here is an example .env file:

OPENAI_API_KEY=<key>
ANTHROPIC_API_KEY=<key>
GROQ_API_KEY=<key>
OPENROUTER_API_KEY=<key>

AZURE_API_KEY=<key>
AZURE_API_VERSION=2023-05-15
AZURE_API_BASE=https://example-endpoint.openai.azure.com

OLLAMA_API_BASE=http://127.0.0.1:11434