Apfel: Unlocking Apple's On-Device AI as Free CLI Tool and OpenAI-Compatible Server

2026-04-03T11:14:44.686Z·1 min read
Apfel is a new open-source tool that unlocks Apple's built-in LLM on Apple Silicon Macs, providing free, 100% on-device AI through a CLI tool, OpenAI-compatible HTTP server, and interactive chat interface.

How It Works

Starting with macOS 26 (Tahoe), every Apple Silicon Mac ships with an on-device language model as part of Apple Intelligence, exposed to developers through the FoundationModels framework. Out of the box, Apple reserves it for its own features such as Siri; Apfel sets it free.

Three Ways to Use It

  1. CLI Tool: Pipe-friendly, composable, JSON output, file attachments
  2. OpenAI Server: Drop-in replacement at localhost:11434 with streaming and tool calling
  3. Interactive Chat: Multi-turn with automatic context management
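
Because the server speaks the OpenAI wire format, any existing OpenAI client can point at it. A minimal sketch of a chat-completions request body follows; the port 11434 comes from the article, but the model identifier and the exact endpoint path are assumptions:

```python
import json

# Build a standard OpenAI chat-completions payload. The model name
# "apple-on-device" is an assumption; Apfel may expose a different identifier.
payload = {
    "model": "apple-on-device",
    "stream": False,
    "messages": [
        {"role": "user", "content": "Summarize: on-device AI needs no cloud."},
    ],
}
body = json.dumps(payload)

# POST this to http://localhost:11434/v1/chat/completions with the header
# Content-Type: application/json while Apfel's server is running.
print(body)
```

In the same spirit, an existing OpenAI SDK should work as a drop-in by overriding its base URL to point at the local server instead of api.openai.com.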

Significance

Apfel demonstrates the growing capability of on-device AI and challenges the assumption that powerful AI requires cloud infrastructure. For simple tasks, local inference on Apple Silicon is increasingly sufficient.

↗ Original source · 2026-04-03T00:00:00.000Z