Unix process model for LLMs. Build with familiar abstractions.
Python (pip):
pip install llmproc

Run without installing:
uvx llmproc

Real-world automation using LLMProc in GitHub workflows:
@llmproc /resolve - Automatically resolve merge conflicts
Intelligently merges conflicting changes when possible

@llmproc /ask - Answer questions on issues/PRs
Get AI-powered answers with full context awareness

@llmproc /code - Implement features from comments
Turn feature requests into working code automatically
Quick Setup
# Install workflows automatically in your repository root:
uvx --from llmproc llmproc-install-actions

The installer will download workflow files and provide step-by-step instructions.
Required secrets in repository settings:
ANTHROPIC_API_KEY - API key for Claude
LLMPROC_WRITE_TOKEN - GitHub PAT with write permissions (contents, pull-requests)

No built-in system prompt overhead. Control every token sent to the LLM.
Register functions as tools with automatic schema generation from type hints (see the sketch after this feature list).
Define LLM programs in YAML/TOML with complete customization control (an illustrative config also follows the list).
Built-in tools, MCP protocol support, tool aliases, and access control.
Prompt caching, automatic retries, callbacks, and robust error handling.
Native support for Anthropic Claude, OpenAI, and Google Gemini models.
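
To make the type-hint idea concrete, here is a minimal sketch of deriving a tool schema from an ordinary annotated Python function using only the standard library. The helper name schema_from_function, the _JSON_TYPES mapping, and the get_weather example are hypothetical illustrations of the general mechanism, not llmproc's actual API:

import inspect
from typing import get_type_hints

# Illustrative mapping from Python annotations to JSON Schema type names.
_JSON_TYPES = {str: "string", int: "integer", float: "number", bool: "boolean"}

def schema_from_function(fn):
    """Build a JSON-Schema-style tool description from a function's signature."""
    hints = get_type_hints(fn)
    hints.pop("return", None)
    params = inspect.signature(fn).parameters
    properties = {
        name: {"type": _JSON_TYPES.get(hints.get(name, str), "string")}
        for name in params
    }
    required = [
        name for name, p in params.items() if p.default is inspect.Parameter.empty
    ]
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "input_schema": {"type": "object", "properties": properties, "required": required},
    }

def get_weather(city: str, units: str = "celsius") -> str:
    """Return the current weather for a city."""
    return f"Sunny in {city} ({units})"

print(schema_from_function(get_weather))

The point of the feature is that the schema the model sees is generated from the signature rather than hand-written, so a plain Python callable is enough to register a tool.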
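
And here is a sketch of what a declarative program definition might look like in TOML, parsed with the standard-library tomllib module (Python 3.11+). The section and field names (model, prompt, parameters) are illustrative placeholders rather than llmproc's documented schema; consult the project docs for the real format:

import tomllib  # standard-library TOML parser (Python 3.11+)

PROGRAM_TOML = """
[model]
name = "claude-3-5-sonnet-latest"
provider = "anthropic"

[prompt]
system_prompt = "You are a concise code-review assistant."

[parameters]
max_tokens = 1024
temperature = 0.2
"""

program = tomllib.loads(PROGRAM_TOML)
print(program["model"]["name"], program["parameters"]["max_tokens"])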
Production-ready LLM runtime. Complete control. Zero overhead.
Install via pip:
Basic installation (Anthropic support)
pip install llmproc

Run without installing (requires uv):

uvx llmproc --help

0 Token Overhead
100% Open Source
∞ Customization