Your Personal AI Power Box

A self-contained AI box inside your device

Delta is a self-contained AI box inside your device. It includes AI models, search, document analysis, and other tools — all running offline, securely, and privately.

What's Inside Your AI Power Box?

Everything you need for powerful AI, all running locally on your device

Multiple AI Models

Run LLMs ranging from 0.5B to 8B+ parameters. Switch models as needed.

Document Analysis

Analyze documents, PDFs, and text files locally. Your data stays on your device.

Terminal & Web UI

Terminal interface and web UI. Choose how you want to interact with Delta.

Complete Privacy

Your data stays on your device. All processing happens locally. No cloud required.

GPU Acceleration

GPU support for CUDA, Metal, Vulkan, and ROCm. Faster inference on supported hardware.

Easy Management

One-command model downloads, automatic updates, and simple configuration.

Quick Installation

One command to install your Personal AI Power Box

macOS

macOS 10.15+

Install via Homebrew

Windows

Windows 10, 11

Install via Winget

Linux

Debian, Ubuntu, Fedora, RHEL, Arch

Install via Install Script

macOS Installation

Homebrew (Recommended):

brew tap nile-agi/delta-cli && brew install --HEAD nile-agi/delta-cli/delta-cli

What it does: Automatically clones the repository, installs dependencies (including Node.js for the web UI), builds from source (~40 seconds), builds the custom web UI, and adds Delta to your PATH. Users don't need to know about git!

Alternative - Installation Script:

curl -fsSL https://raw.githubusercontent.com/nile-agi/delta/main/install.sh | bash

Quick Start

1. Run Delta CLI

delta

This auto-downloads the default model (qwen2.5:0.5b, ~400 MB) if it isn't already installed, then starts interactive mode.

2. Download More Models (Optional)

delta pull llama3.1:8b    # 4.7 GB - powerful, versatile
delta pull mistral:7b     # 4.3 GB - great for coding
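Together, the two optional models above take roughly 9 GB (4.7 GB + 4.3 GB), so it's worth confirming free disk space before pulling them. A minimal sketch, using only the sizes listed above:

```shell
# Rough disk-space check before pulling both optional models:
# 4.7 GB + 4.3 GB ≈ 9 GB, per the sizes listed above.
needed_gb=9
# Free space (in GB) on the filesystem holding the current directory.
avail_gb=$(df -Pk . | awk 'NR==2 {print int($4 / 1024 / 1024)}')
echo "need ~${needed_gb} GB, have ${avail_gb} GB free"
```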

3. Start Web Server

delta server

Then open http://localhost:8080 in your browser to use the web interface.

Open Source & Built on llama.cpp

Delta CLI is released under the MIT License, which means it's free to use, modify, and distribute for any purpose, including commercial use.

Delta CLI is built on top of llama.cpp, a popular open-source project for running LLMs efficiently. This provides proven performance, active development, community support, model compatibility (all GGUF format models), and GPU acceleration (CUDA, Metal, Vulkan, ROCm).
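GGUF compatibility is easy to check by hand: every GGUF file starts with the 4-byte ASCII magic "GGUF", followed by a little-endian format version. The sketch below demonstrates the check on a stand-in file (a real model would be several gigabytes; the filename is just an example):

```shell
# GGUF files begin with the ASCII magic "GGUF" followed by a
# little-endian uint32 format version. Create a stand-in header,
# then read the magic back as you would for a real download:
printf 'GGUF\003\000\000\000' > fake-model.gguf   # magic + version 3
magic=$(head -c 4 fake-model.gguf)
echo "$magic"
```

If the first four bytes aren't "GGUF", the download is truncated, corrupted, or not a GGUF model at all.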

View on GitHub