Apfel: Unlocking Apple's On-Device AI as Free CLI Tool and OpenAI-Compatible Server
Apfel is a new open-source tool that unlocks Apple's built-in LLM on Apple Silicon Macs, providing free, 100% on-device AI through a CLI tool, OpenAI-compatible HTTP server, and interactive chat interface.
How It Works
Starting with macOS 26 (Tahoe), every Apple Silicon Mac includes a language model as part of Apple Intelligence, exposed to developers through the FoundationModels framework. Apple reserves it for its own features such as Siri and Writing Tools — Apfel sets it free.
Three Ways to Use It
- CLI Tool: Pipe-friendly, composable, JSON output, file attachments
- OpenAI Server: Drop-in replacement at localhost:11434 with streaming and tool calling
- Interactive Chat: Multi-turn with automatic context management
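Because the server speaks the OpenAI chat-completions protocol, any OpenAI-compatible client can point at it. A minimal sketch of the request shape (the `/v1/chat/completions` path and the `"apple-on-device"` model name are assumptions based on typical OpenAI-compatible servers, not confirmed from Apfel's docs):

```python
import json
import urllib.request

# Request body in the standard OpenAI chat-completions shape.
# "apple-on-device" is a placeholder model name -- check Apfel's
# docs for the identifier the server actually registers.
payload = {
    "model": "apple-on-device",
    "messages": [
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize this commit message in one line."},
    ],
    "stream": False,
}

req = urllib.request.Request(
    "http://localhost:11434/v1/chat/completions",  # Apfel's advertised port
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# With the Apfel server running locally, sending the request would
# return a standard chat.completion object:
#   resp = json.loads(urllib.request.urlopen(req).read())
#   print(resp["choices"][0]["message"]["content"])
```

Since port 11434 is also Ollama's default, existing Ollama-aware tooling should work against Apfel without reconfiguration.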
Key Specs
- Cost: $0 — no API keys, no subscriptions
- Privacy: 100% on-device, nothing leaves your machine
- Context window: 4,096 tokens (input + output combined)
- Install: `brew install Arthur-Ficial/tap/apfel`
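The 4,096-token window covers input and output together, so multi-turn chat must drop old turns as the conversation grows. A rough sketch of that kind of context trimming (the ~4-characters-per-token estimate and the budget split are assumptions for illustration; Apfel's actual strategy isn't documented here):

```python
def estimate_tokens(text: str) -> int:
    # Crude heuristic: roughly 4 characters per token for English text.
    return max(1, len(text) // 4)

def trim_history(messages: list[dict], budget: int = 3072) -> list[dict]:
    """Keep the newest messages that fit within `budget` tokens,
    reserving the rest of the 4,096-token window for the reply."""
    kept, used = [], 0
    for msg in reversed(messages):  # walk from newest to oldest
        cost = estimate_tokens(msg["content"])
        if used + cost > budget:
            break  # this message (and everything older) is dropped
        kept.append(msg)
        used += cost
    return list(reversed(kept))  # restore chronological order

history = [
    {"role": "user", "content": "a" * 8000},       # ~2000 tokens, oldest
    {"role": "assistant", "content": "b" * 8000},  # ~2000 tokens
    {"role": "user", "content": "newest question"},
]
trimmed = trim_history(history)  # oldest turn no longer fits
```

Dropping whole turns from the oldest end keeps each request well-formed; a fancier approach would summarize the dropped turns instead.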
Significance
Apfel demonstrates the growing capability of on-device AI and challenges the assumption that powerful AI requires cloud infrastructure. For simple tasks, local inference on Apple Silicon is increasingly sufficient.