Show HN: Axe - A 12MB binary that replaces your AI framework
Axe is a lightweight CLI for running single-purpose AI agents, built on the Unix philosophy of small, composable tools. Developers define focused LLM agents in TOML and slot them into existing workflows such as Git hooks and cron jobs.
The Lowdown
Axe is a command-line interface (CLI) tool designed to manage and execute focused, single-purpose AI agents. Rejecting the prevalent model of large, monolithic chatbots, Axe champions the Unix philosophy: each LLM-powered agent is a small, specialized, and composable unit. Users define these agents via TOML configuration files, enabling straightforward integration with standard developer workflows and existing infrastructure.
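An agent definition might look something like the following. This is a hedged sketch: the table and key names (`model`, `system_prompt`, `skills`, `memory`, and the delegation settings) are illustrative guesses inferred from the features described here, not Axe's documented schema.

```toml
# Hypothetical Axe agent definition; all key names are illustrative,
# inferred from the feature list, not taken from Axe's actual schema.
[agent]
name = "reviewer"
model = "claude-sonnet"      # provider/model selection (Anthropic, OpenAI, Ollama)
system_prompt = "You are a strict code reviewer. Comment only on the diff you are given."

[skills]
paths = ["skills/code-review/SKILL.md"]  # reusable SKILL.md-based skills

[memory]
enabled = true               # persistent memory with LLM-assisted garbage collection

[delegation]
max_depth = 2                # sub-agent delegation with configurable depth
parallel = true              # parallel sub-agent execution
```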
- Unix-Inspired Design: Agents do one thing well and compose via pipes with standard CLI tools, e.g. `git diff | axe run reviewer`.
- TOML Configuration: Agent definitions are declarative TOML files specifying system prompts, models, skills, memory, and sub-agent delegation.
- Broad Provider Support: Compatible with Anthropic, OpenAI, and local Ollama models.
- Seamless Integration: Orchestrates agents without reinventing scheduling; it's designed to be used with cron, git hooks, and file watchers.
- Advanced Agent Functionality: Sub-agent delegation (with configurable depth and parallel execution), persistent memory with LLM-assisted garbage collection, and a reusable skill system based on `SKILL.md` files.
- Sandboxed Built-in Tools: Agents can read, write, edit, and list files and execute shell commands, confined to a sandboxed working directory.
- Developer Utilities: `stdin` piping, a dry-run mode for inspecting context, JSON output for scripting, and minimal Go-based dependencies.
- Docker Compatibility: A Docker image for hardened, isolated agent execution, with flexible volume mounting for configuration and data.
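As a sketch of the integration story, the same pipe pattern shown above can be dropped into a Git hook. The `reviewer` agent name is taken from the `git diff | axe run reviewer` example; everything else in this fragment is standard Git hook plumbing, not an Axe-specific feature.

```shell
#!/bin/sh
# Hypothetical .git/hooks/pre-commit: pipe the staged diff to the
# "reviewer" agent (agent name from the example above) before each commit.
# Exits non-zero if axe fails, which aborts the commit.
git diff --cached | axe run reviewer
```

The same one-liner works from a crontab entry or a file watcher, which is presumably what "orchestrates agents without reinventing scheduling" refers to: the scheduler stays cron, and Axe is just another process in the pipeline.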
In essence, Axe presents itself as a robust, minimalist alternative to traditional AI frameworks, offering a developer-centric methodology for crafting and deploying LLM agents. By embracing the Unix principle of doing one thing well, it lets developers compose pipelines of small, focused agents rather than configure a single monolithic framework.