Open Unmanned Artificial Intelligence — an autonomous AI agent that reacts to events, executes tasks, and runs as a single binary on any OS.
Available for Linux (amd64, arm64), macOS (universal), and Windows (amd64).
Plan-first execution with full tool access: filesystem, Bash, Git, and web browsing. The agent works autonomously and reports back.
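Plan-first means the agent drafts a step list before touching any tool, then executes the steps and reports back. A minimal Go sketch of that shape, with hypothetical `Step`/`Plan` types and tool names that are illustrative, not openuai's actual API:

```go
package main

import "fmt"

// Step and Plan are hypothetical types for illustration; the tool
// names mirror the feature list above ("filesystem", "bash", "git").
type Step struct {
	Tool string // which tool the step uses
	Args string // what the step does with it
}

type Plan struct {
	Goal  string
	Steps []Step
}

// execute walks the plan in order and builds the report the agent
// would hand back. A real agent would dispatch to each tool here.
func execute(p Plan) []string {
	report := make([]string, 0, len(p.Steps))
	for i, s := range p.Steps {
		report = append(report, fmt.Sprintf("step %d: %s(%s) ok", i+1, s.Tool, s.Args))
	}
	return report
}

func main() {
	p := Plan{
		Goal: "clone repo and list files",
		Steps: []Step{
			{Tool: "git", Args: "clone https://example.com/repo.git"},
			{Tool: "filesystem", Args: "list ./repo"},
		},
	}
	for _, line := range execute(p) {
		fmt.Println(line)
	}
}
```

The point of the pattern is that the whole plan exists before any side effect happens, so it can be shown to the user (or logged) up front.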
Subscribe to WhatsApp, Teams, email, webhooks, and more. Define rules and the agent reacts to events automatically.
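A rule pairs a match condition on an incoming event with an instruction for the agent. The rule format below is an assumption for illustration (openuai's real rule schema may differ); the source names mirror the channels above:

```go
package main

import (
	"fmt"
	"strings"
)

// Event and Rule are illustrative types, not openuai's schema.
type Event struct {
	Source string // "whatsapp", "teams", "email", "webhook", ...
	Body   string
}

type Rule struct {
	Source   string // channel the rule listens on
	Contains string // substring that must appear in the event body
	Action   string // instruction handed to the agent on a match
}

// match returns the first matching rule's action, if any.
func match(rules []Rule, e Event) (string, bool) {
	for _, r := range rules {
		if r.Source == e.Source && strings.Contains(e.Body, r.Contains) {
			return r.Action, true
		}
	}
	return "", false
}

func main() {
	rules := []Rule{
		{Source: "email", Contains: "invoice", Action: "file the attachment and reply with a confirmation"},
	}
	if action, ok := match(rules, Event{Source: "email", Body: "new invoice attached"}); ok {
		fmt.Println("agent task:", action)
	}
}
```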
Spawn concurrent sub-agents for parallel task execution. The parent decomposes complex tasks into independent sub-tasks.
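Because the sub-tasks are independent, fan-out/fan-in is the natural Go shape for this: one goroutine per sub-task, collected with a `sync.WaitGroup`. A minimal sketch, where `runSubAgent` is a placeholder for a full agent loop:

```go
package main

import (
	"fmt"
	"sync"
)

// runSubAgent stands in for a real sub-agent run (LLM calls, tools).
func runSubAgent(task string) string {
	return "done: " + task
}

// fanOut runs every sub-task concurrently and returns results in
// the same order the tasks were given.
func fanOut(tasks []string) []string {
	results := make([]string, len(tasks))
	var wg sync.WaitGroup
	for i, t := range tasks {
		wg.Add(1)
		go func(i int, t string) {
			defer wg.Done()
			results[i] = runSubAgent(t) // each index written by exactly one goroutine
		}(i, t)
	}
	wg.Wait()
	return results
}

func main() {
	fmt.Println(fanOut([]string{"summarize repo", "draft reply", "check CI"}))
}
```

Indexing into a pre-sized slice (rather than appending from many goroutines) keeps the collection race-free without a mutex.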
Connect to any MCP server to extend capabilities. Use community connectors for Slack, Notion, Google Drive, and more.
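MCP speaks JSON-RPC 2.0, so the smallest useful client piece is request framing, e.g. for the protocol's `tools/list` method. A sketch that only builds the wire message (transport, stdio vs. HTTP, and response handling omitted):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// rpcRequest is a minimal JSON-RPC 2.0 request envelope.
type rpcRequest struct {
	JSONRPC string `json:"jsonrpc"`
	ID      int    `json:"id"`
	Method  string `json:"method"`
	Params  any    `json:"params,omitempty"`
}

// newToolsList frames an MCP tools/list request, which asks a
// server to enumerate the tools it exposes.
func newToolsList(id int) ([]byte, error) {
	return json.Marshal(rpcRequest{JSONRPC: "2.0", ID: id, Method: "tools/list"})
}

func main() {
	b, _ := newToolsList(1)
	fmt.Println(string(b)) // → {"jsonrpc":"2.0","id":1,"method":"tools/list"}
}
```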
Push-to-talk speech input with Whisper transcription and text-to-speech output. Auto-detects language.
No Docker, no Node, no Python. Download one file and run it. Built with Go + Wails for native performance.
Runs in the background with native OS notifications. Quick actions from the tray menu; closing the window hides the app instead of quitting.
Run headless as a local API server. 18 endpoints + WebSocket for real-time integration with other apps.
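From another app, the headless server is just local HTTP. The endpoint path, port, and JSON body in this sketch are assumptions for illustration, not the documented API surface:

```go
package main

import (
	"bytes"
	"fmt"
	"io"
	"net/http"
)

// submitTask POSTs a prompt to a running openuai instance. The
// "/api/tasks" path and request shape are hypothetical placeholders.
func submitTask(baseURL, prompt string) (string, error) {
	body := bytes.NewBufferString(fmt.Sprintf(`{"prompt":%q}`, prompt))
	resp, err := http.Post(baseURL+"/api/tasks", "application/json", body)
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()
	b, err := io.ReadAll(resp.Body)
	return string(b), err
}

func main() {
	// Assumes a local instance on a hypothetical port.
	out, err := submitTask("http://127.0.0.1:8080", "summarize today's email")
	if err != nil {
		fmt.Println("server not running:", err)
		return
	}
	fmt.Println(out)
}
```

For push-style updates (task progress, streamed output), the WebSocket endpoint is the better fit than polling this way.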
Real-time token usage, cost per request, and accumulated costs. Track spending across all LLM providers.
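The accounting behind such a tracker is simple: per-request cost is token counts times the provider's per-token rate, summed into a running total. A sketch with illustrative placeholder prices (USD per 1M tokens), not real provider rates:

```go
package main

import "fmt"

// Pricing holds a provider's rates in USD per 1M tokens.
// The values used below are made up for illustration.
type Pricing struct {
	InPerM  float64 // prompt tokens
	OutPerM float64 // completion tokens
}

// Tracker accumulates spend across requests.
type Tracker struct {
	total float64
}

// Record computes one request's cost and adds it to the total.
func (t *Tracker) Record(p Pricing, inTok, outTok int) float64 {
	cost := float64(inTok)/1e6*p.InPerM + float64(outTok)/1e6*p.OutPerM
	t.total += cost
	return cost
}

func main() {
	var t Tracker
	p := Pricing{InPerM: 3.0, OutPerM: 15.0} // hypothetical rates
	fmt.Printf("request: $%.4f\n", t.Record(p, 1200, 400))
	fmt.Printf("accumulated: $%.4f\n", t.total)
}
```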
┌──────────────────────────────────────────────────────────────┐
│                   openuai (single binary)                    │
│                                                              │
│ [Event Sources]        [Engine]              [Actions]       │
│ WhatsApp ──┐                               ┌── Filesystem    │
│ Email ─────┤                               ├── Reply         │
│ Webhooks ──┤                               ├── Bash/Git      │
│ Teams ─────┼→ Event Bus → LLM ────┐        ├── APIs          │
│ Slack ─────┤              Agent   ├→ Agent ├── Voice Out     │
│ Cron ──────┤                      │        └── Notify        │
│ Clipboard ─┘                      │                          │
│                                   ├→ Sub-Agent (parallel)    │
│                                   └→ Sub-Agent (parallel)    │
│                                                              │
│ [System Tray] [Embedded UI] [LLM Client] [Cost Tracker]      │
│ [Memory] [MCP Client/Server] [REST API] [Voice In]           │
└──────────────────────────────────────────────────────────────┘
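The left side of the diagram is a fan-in: many event sources feed one bus that the engine drains. In Go that is many producer goroutines writing to a shared channel, closed once all producers finish. A minimal sketch, with source names taken from the diagram:

```go
package main

import (
	"fmt"
	"sync"
)

// Event is a simplified bus message for this sketch.
type Event struct {
	Source, Payload string
}

// fanIn starts one producer per source and merges everything onto
// a single channel; the channel is closed when all producers end.
func fanIn(sources map[string][]string) <-chan Event {
	bus := make(chan Event)
	var wg sync.WaitGroup
	for name, payloads := range sources {
		wg.Add(1)
		go func(name string, payloads []string) {
			defer wg.Done()
			for _, p := range payloads {
				bus <- Event{Source: name, Payload: p}
			}
		}(name, payloads)
	}
	go func() { wg.Wait(); close(bus) }()
	return bus
}

func main() {
	bus := fanIn(map[string][]string{
		"email":   {"new invoice"},
		"webhook": {"deploy finished"},
	})
	for e := range bus { // engine's consume loop
		fmt.Printf("%s: %s\n", e.Source, e.Payload)
	}
}
```

Closing the bus from a separate goroutine after `wg.Wait()` lets the consumer use a plain `range` loop with no sentinel values.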