Environment Variables

After installing readme-ai, set the required environment variable for your chosen language model provider's API key. The following examples show how to set it for each provider supported by readme-ai:

Setting API Keys

OpenAI

export OPENAI_API_KEY=<your_api_key>

Anthropic

export ANTHROPIC_API_KEY=<your_api_key>

Google Gemini

export GOOGLE_API_KEY=<your_api_key>
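An export like the ones above only lasts for the current shell session. A minimal sketch of persisting the key, assuming a bash shell (zsh users would use ~/.zshrc instead):

```shell
# Append the export to your shell profile so the key survives new sessions,
# then reload the profile. Replace <your_api_key> with your actual key.
echo 'export OPENAI_API_KEY=<your_api_key>' >> ~/.bashrc
source ~/.bashrc

# Verify that the variable is set in the current session.
echo "$OPENAI_API_KEY"
```

The same pattern applies to ANTHROPIC_API_KEY and GOOGLE_API_KEY.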

Windows Users

On Windows, use set instead of export to set environment variables.

set OPENAI_API_KEY=<your_api_key>
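Note that set in cmd.exe only applies to the current command window. To persist the variable for future sessions, Windows also provides setx:

```
:: cmd.exe — applies to the current window only
set OPENAI_API_KEY=<your_api_key>

:: persists for future sessions (new windows pick it up)
setx OPENAI_API_KEY <your_api_key>
```

In PowerShell, the session-only equivalent is $env:OPENAI_API_KEY = "<your_api_key>".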

Ollama

1. Pull your preferred language model from the Ollama server:

ollama pull <model_name>:<model_version>

For example, to pull the mistral model:

ollama pull mistral:latest

2. Set the OLLAMA_HOST environment variable and start the Ollama server:

export OLLAMA_HOST=127.0.0.1 && ollama serve
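The two steps above can be sketched as follows, with a quick sanity check before running readme-ai. This assumes Ollama's default port, 11434; the server replies "Ollama is running" at its root endpoint once it is up:

```shell
# Bind the Ollama server to localhost and run it in the background.
# By default it listens on port 11434.
export OLLAMA_HOST=127.0.0.1
ollama serve &

# Sanity check: confirm the server is reachable before using readme-ai.
curl http://127.0.0.1:11434/
```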

Unsupported Language Models

If your preferred LLM API is not supported, open an issue or submit a pull request and we'll review it!