LLMProc

Unix process model for LLMs. Build with familiar abstractions.

  • Zero token overhead - you control every token
  • Open-source alternative to Claude Code
  • Python SDK with type-safe function tools

Quick Install

Python (pip)

pip install llmproc

Run without installing

uvx llmproc

GitHub Actions Integration

Real-world automation using LLMProc in GitHub workflows:

  • @llmproc /resolve - Automatically resolve merge conflicts; merges conflicting changes intelligently when possible
  • @llmproc /ask - Answer questions on issues/PRs; get AI-powered answers with full context awareness
  • @llmproc /code - Implement features from comments, turning feature requests into working code automatically

Quick Setup

# Install the workflow files (run this from your repository root):

uvx --from llmproc llmproc-install-actions

The installer will download workflow files and provide step-by-step instructions.

Required secrets in repository settings:

  • ANTHROPIC_API_KEY - API key for Claude
  • LLMPROC_WRITE_TOKEN - GitHub PAT with write permissions (contents, pull-requests)

Production-Ready Features

Zero Token Overhead

No built-in system prompt overhead. Control every token sent to the LLM.

Type-Safe Python SDK

Register functions as tools with automatic schema generation from type hints.
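As a conceptual sketch (not llmproc's actual implementation), the snippet below shows how a type-hinted Python function already carries enough metadata to derive a JSON tool schema automatically; it uses only the standard library, and the function name and schema shape are illustrative.

import inspect
from typing import get_type_hints

# Conceptual sketch only: llmproc's real schema generation may differ.
# It shows how type hints plus the docstring are enough to build a tool schema.
PY_TO_JSON = {str: "string", int: "integer", float: "number", bool: "boolean"}

def get_weather(city: str, units: str = "celsius") -> str:
    """Return the current weather for a city."""
    return f"22 degrees {units} in {city}"

def tool_schema(fn):
    hints = get_type_hints(fn)
    params = inspect.signature(fn).parameters
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "input_schema": {
            "type": "object",
            "properties": {
                name: {"type": PY_TO_JSON.get(hints.get(name), "string")}
                for name in params
            },
            "required": [
                name for name, p in params.items()
                if p.default is inspect.Parameter.empty
            ],
        },
    }

print(tool_schema(get_weather))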

Configuration-Driven

Define LLM programs in YAML/TOML with complete customization control.
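To illustrate the configuration-driven idea, here is a sketch that parses a TOML program definition with Python's standard tomllib; the section and key names ([model], [prompt], [tools]) are assumptions made for the example, so check the llmproc documentation for the real program schema.

import tomllib  # Python 3.11+

# Illustrative only: the keys below are assumed for this sketch, not a spec.
PROGRAM_TOML = """
[model]
name = "claude-3-5-sonnet-20241022"
provider = "anthropic"

[prompt]
system_prompt = "You are a concise assistant."

[tools]
enabled = ["calculator", "read_file"]
"""

config = tomllib.loads(PROGRAM_TOML)
print(config["model"]["name"], config["tools"]["enabled"])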

Flexible Tool System

Built-in tools, MCP protocol support, tool aliases, and access control.

Production Features

Prompt caching, automatic retries, callbacks, and robust error handling.
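As a rough illustration of what automatic retries involve, the sketch below shows a generic retry-with-exponential-backoff helper; it is a standalone pattern, not llmproc's internal implementation.

import random
import time

# Generic retry-with-exponential-backoff pattern (not llmproc's internal code).
def call_with_retries(fn, max_attempts=4, base_delay=1.0):
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts:
                raise
            # Back off 1s, 2s, 4s, ... with a little jitter to avoid thundering herds.
            time.sleep(base_delay * 2 ** (attempt - 1) + random.random())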

Multi-Provider Support

Native support for Anthropic Claude, OpenAI, and Google Gemini models.

Ready to Build with LLMProc?

Production-ready LLM runtime. Complete control. Zero overhead.

Install via pip:

Basic installation (Anthropic support)

pip install llmproc

Run without installing (requires uv)

uvx llmproc --help
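A minimal usage sketch of the Python SDK follows; the class and method names (LLMProgram, start, run, get_last_message) reflect one reading of the llmproc documentation and should be treated as assumptions to verify against the version you install. It assumes ANTHROPIC_API_KEY is set in your environment.

import asyncio
from llmproc import LLMProgram  # name per the llmproc docs; verify for your installed version

async def main():
    # Everything sent to the model is explicit here: no hidden system prompt is added.
    program = LLMProgram(
        model_name="claude-3-5-sonnet-20241022",
        provider="anthropic",
        system_prompt="You are a concise assistant.",
    )
    process = await program.start()  # assumed: start() returns a running process
    await process.run("Summarize the Unix process model in one sentence.")
    print(process.get_last_message())  # assumed accessor for the final assistant reply

asyncio.run(main())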

Zero token overhead. 100% open source. Complete customization.