Prerequisites

This guide covers everything you need to know about setting up and using readme-ai, including system requirements, supported platforms, and LLM providers.

Prerequisites

ReadmeAI requires Python 3.9 or higher and one of the following installation methods:

| Requirement | Details |
|-------------|---------|
| Python ≥3.9 | Core runtime |
| **Installation Method** (choose one) | |
| pip | Default Python package manager |
| pipx | Isolated environment installer |
| uv | High-performance package manager |
| docker | Containerized environment |
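
As a quick reference, the commands below sketch how each installation method might be invoked. The PyPI package name `readmeai` and the Docker image `zeroxeli/readme-ai` are assumptions here; confirm both in the Installation Guide.

```bash
# Install with pip (default Python package manager)
pip install readmeai

# Or with pipx, keeping the CLI in its own isolated environment
pipx install readmeai

# Or with uv, installed as a standalone tool
uv tool install readmeai

# Or pull the Docker image (image name assumed) for a containerized run
docker pull zeroxeli/readme-ai:latest
```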

Supported Repository Platforms

To generate a README file, provide a path or URL to the source repository. ReadmeAI supports these platforms:

| Platform | Details |
|----------|---------|
| File System | Local repository access |
| GitHub | Industry-standard hosting |
| GitLab | Full DevOps integration |
| Bitbucket | Atlassian ecosystem |
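
As a minimal sketch, the commands below show how a repository might be passed to the CLI, assuming a `--repository` option that accepts either a remote URL or a local path (confirm the exact option name with `readmeai --help`):

```bash
# Generate a README for a remote repository
# (GitHub shown; GitLab and Bitbucket URLs work the same way)
readmeai --repository https://github.com/username/project

# Generate a README for a local clone on the file system
readmeai --repository /path/to/local/project
```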

Supported LLM Providers

ReadmeAI is model-agnostic and supports the following LLM API services:

| Provider | Best For | Details |
|----------|----------|---------|
| OpenAI | General use | Industry-leading models |
| Anthropic | Advanced tasks | Claude language models |
| Google Gemini | Multimodal AI | Latest Google technology |
| Ollama | Open source | No API key needed |
| Offline Mode | Local operation | No internet required |
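
The sketch below illustrates how a provider might be selected at the command line. The `--api` option and the `OPENAI_API_KEY` environment variable name are assumptions based on common conventions; check the CLI Reference for the exact flags your version supports.

```bash
# OpenAI: export an API key, then select the provider
export OPENAI_API_KEY="<your-api-key>"
readmeai --api openai --repository https://github.com/username/project

# Ollama: runs against a local Ollama server, so no API key is needed
readmeai --api ollama --repository https://github.com/username/project

# Offline mode: generate a README without any network calls or API keys
readmeai --api offline --repository https://github.com/username/project
```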

Next Steps

  1. Follow our Installation Guide to set up readme-ai
  2. Learn the basics in our CLI Reference guide
  3. Get help with common issues in our Troubleshooting guide