Bodhi App - Run LLMs Locally - Your Personal, Private, Powerful AI Assistant | Free & OSS

Your Complete AI Infrastructure: Local Privacy, Cloud Power

Unified platform combining local GGUF models with API providers (OpenAI, Anthropic, Groq). Enterprise-ready with user management, OAuth2 security, and production deployment options.

Bodhi Chat Interface
Powered by llama.cpp
HuggingFace Ecosystem
OpenAI API Compatible

Core Features

Everything you need to build AI-powered applications

User Experience

Built-in Chat UI

Intuitive chat interface with full markdown rendering and configurable chat settings.

Privacy First

Run everything locally on your machine with complete data control.

Model Management

One-click downloads from HuggingFace with real-time progress tracking.

Hybrid AI Architecture

Use local GGUF models alongside API providers (OpenAI, Anthropic, Groq) in one unified interface.

Real-time Streaming

Server-Sent Events provide instant response feedback with live token streaming.
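As a rough illustration, any OpenAI-style client can consume this stream by setting stream: true; the server address, token, and model alias below are placeholders, not documented defaults.

```typescript
import OpenAI from "openai";

// Placeholder endpoint and model alias; substitute your Bodhi server
// address, an API token created in the app, and an installed model.
const client = new OpenAI({
  baseURL: "http://localhost:1135/v1",
  apiKey: process.env.BODHI_API_TOKEN ?? "bodhi-api-token",
});

// With stream: true the response arrives as Server-Sent Events and the
// client yields tokens as they are generated instead of waiting for the
// full answer.
const stream = await client.chat.completions.create({
  model: "llama3:instruct",
  messages: [{ role: "user", content: "Write a haiku about local inference." }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```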

Advanced Configuration

12+ parameters for fine-tuning: temperature, top-p, frequency penalty, and more.
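For a sense of how these are set programmatically, the sketch below uses the standard OpenAI-style sampling fields; the exact parameter set exposed by Bodhi is configured in its settings UI, and the endpoint and model alias here are placeholders.

```typescript
import OpenAI from "openai";

// Placeholder server address and model alias.
const client = new OpenAI({
  baseURL: "http://localhost:1135/v1",
  apiKey: process.env.BODHI_API_TOKEN ?? "bodhi-api-token",
});

// Common sampling parameters passed per request, OpenAI-style.
const completion = await client.chat.completions.create({
  model: "llama3:instruct",
  messages: [{ role: "user", content: "Give me three project name ideas." }],
  temperature: 0.7,        // randomness of sampling
  top_p: 0.9,              // nucleus sampling cutoff
  frequency_penalty: 0.5,  // discourage repeated tokens
  max_tokens: 256,         // cap on generated length
});

console.log(completion.choices[0].message.content);
```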

Technical Capabilities

API Compatibility

Drop-in replacement for OpenAI APIs. Use your existing code and tools.
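As a minimal sketch, switching an existing OpenAI client to Bodhi is typically just a base URL change; the address, token, and model alias below are placeholders rather than documented defaults.

```typescript
import OpenAI from "openai";

// Point the standard OpenAI client at the local Bodhi server instead of
// api.openai.com. Address, token, and model alias are placeholders.
const client = new OpenAI({
  baseURL: "http://localhost:1135/v1",
  apiKey: process.env.BODHI_API_TOKEN ?? "bodhi-api-token",
});

const completion = await client.chat.completions.create({
  model: "llama3:instruct",
  messages: [{ role: "user", content: "Summarize what Bodhi App does." }],
});

console.log(completion.choices[0].message.content);
```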

Local Processing

Run models on your hardware for enhanced privacy and control.

High Performance

Optimized inference with llama.cpp. 8-12x speedup with GPU acceleration (CUDA, ROCm).

Model Aliases

Save and switch between inference configurations instantly without restarts.

Performance Metrics

Real-time statistics showing tokens per second and processing speed.

Background Downloads

Download models asynchronously with progress tracking and auto-resumption.

Enterprise & Team Ready

Built for secure collaboration with enterprise-grade authentication and comprehensive user management

User Management Dashboard

Comprehensive admin interface for managing users, roles, and access requests.

Role-Based Access Control

Four role levels (User, PowerUser, Manager, Admin) with granular permission management.

Access Request Workflow

Self-service access requests with admin approval gates and audit trail.

OAuth2 + JWT Security

Enterprise-grade authentication with PKCE, session management, and token lifecycle control.

Multi-User Deployment

Secure team collaboration with session invalidation and role change enforcement.

Developer Tools & SDKs

Everything developers need to integrate AI into applications with production-ready tools

TypeScript SDK

Production-ready npm package @bodhiapp/ts-client for seamless integration.

API Token Management

Scope-based permissions with SHA-256 token hashing and database-backed storage.
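A brief sketch of using such a token from code, assuming tokens are presented as standard Bearer credentials against the OpenAI-compatible routes; the host and the /v1/models path are assumptions, not documented defaults.

```typescript
// Placeholder host; the API token is created in the Bodhi app and read
// here from the environment. The sketch assumes Bearer authentication
// against the OpenAI-compatible model-listing route.
const response = await fetch("http://localhost:1135/v1/models", {
  headers: { Authorization: `Bearer ${process.env.BODHI_API_TOKEN}` },
});

const models = await response.json();
console.log(models);
```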

OpenAPI/Swagger UI

Interactive API documentation with auto-generated specs and live testing.

OpenAI Compatible

Drop-in replacement for OpenAI APIs; use existing libraries and tools seamlessly.

Ollama Compatible

Additional API format support for Ollama chat and models endpoints.
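As a rough sketch, an Ollama-style client call might look like the following; it assumes Bodhi mirrors Ollama's /api/chat request and response shape on the same host, and the address and model alias are placeholders.

```typescript
// Ollama-style chat request. Host and model alias are placeholders; the
// /api/chat route shape follows Ollama's API, which this sketch assumes
// Bodhi exposes as-is.
const res = await fetch("http://localhost:1135/api/chat", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "llama3:instruct",
    messages: [{ role: "user", content: "Hello from an Ollama-style client." }],
    stream: false,
  }),
});

const data = await res.json();
console.log(data.message?.content);
```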

Flexible Deployment Options

Deploy anywhere, from desktop to cloud, with hardware-optimized variants for maximum performance

Multi-Platform Desktop

Native desktop apps for Windows, macOS (Intel and Apple Silicon), and Linux, built with Tauri.

Docker Variants

CPU (AMD64/ARM64), CUDA, ROCm, and Vulkan-optimized images for every hardware configuration.

Cloud Ready

RunPod auto-configuration and support for any Docker-compatible cloud platform.

GPU Acceleration

8-12x speedup with CUDA/ROCm GPU support for NVIDIA and AMD graphics cards.

Volume Management

Persistent storage with backup/restore strategies and migration support.

Production Ready

Health checks, monitoring, log management, and automatic database migrations.

Download for your platform

Choose your operating system to download Bodhi App. All platforms support running LLMs locally with full privacy.

macOS

Apple Silicon

File type: DMG

Download for macOS

Windows

x64

File type: MSI

Download for Windows

Linux

x64

File type: RPM

Download for Linux