Your Complete AI Infrastructure: Local Privacy, Cloud Power
Unified platform combining local GGUF models with API providers (OpenAI, Anthropic, Groq). Enterprise-ready with user management, OAuth2 security, and production deployment options.

Core Features
Everything you need to build AI-powered applications
User Experience
Use local GGUF models alongside API providers (OpenAI, Anthropic, Groq) in one unified interface.
Responses stream token by token over Server-Sent Events, so output appears the moment it is generated.
12+ tunable generation parameters: temperature, top-p, frequency penalty, and more (see the streaming sketch below).
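A rough sketch of what live token streaming and parameter control look like from client code: the snippet below POSTs to an OpenAI-compatible chat completions endpoint with stream: true and prints each Server-Sent Events delta as it arrives. The local URL and port, model id, and token are placeholder assumptions, not values taken from BodhiApp's documentation.

```typescript
// Sketch only: consume a streaming chat completion over Server-Sent Events.
// Assumed placeholders: the base URL/port, model id, and API token.
const response = await fetch("http://localhost:1135/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: "Bearer <your-api-token>",
  },
  body: JSON.stringify({
    model: "<local-gguf-or-provider-model>",
    messages: [{ role: "user", content: "Explain SSE in one sentence." }],
    stream: true,       // tokens arrive as SSE "data:" events
    temperature: 0.7,   // two of the tunable sampling parameters
    top_p: 0.9,
  }),
});

// Read the SSE stream line by line and print each token delta as it arrives.
const reader = response.body!.getReader();
const decoder = new TextDecoder();
let buffer = "";
while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  buffer += decoder.decode(value, { stream: true });
  const lines = buffer.split("\n");
  buffer = lines.pop() ?? ""; // keep any trailing partial line for the next chunk
  for (const line of lines) {
    if (!line.startsWith("data: ") || line.includes("[DONE]")) continue;
    const delta = JSON.parse(line.slice(6)).choices?.[0]?.delta?.content;
    if (delta) process.stdout.write(delta);
  }
}
```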
Technical Capabilities
Optimized inference with llama.cpp. 8-12x speedup with GPU acceleration (CUDA, ROCm).
Download models asynchronously, with progress tracking and automatic resume of interrupted downloads.
Enterprise & Team Ready
Built for secure collaboration with enterprise-grade authentication and comprehensive user management
Comprehensive admin interface for managing users, roles, and access requests.
4 role levels (User, PowerUser, Manager, Admin) with granular permission management.
Self-service access requests with admin approval gates and audit trail.
OAuth2 authentication with PKCE, session management, and token lifecycle control (see the PKCE sketch after this list).
Secure team collaboration with session invalidation and role change enforcement.
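For context on the PKCE step mentioned above, here is a generic sketch of the challenge derivation defined in RFC 7636; it illustrates the mechanism only and is not BodhiApp's internal implementation.

```typescript
// Generic PKCE sketch (RFC 7636): derive a code_challenge from a code_verifier.
import { randomBytes, createHash } from "node:crypto";

// 1. The client generates a high-entropy code_verifier and keeps it private.
const codeVerifier = randomBytes(32).toString("base64url");

// 2. It sends the SHA-256 digest (the code_challenge) with the authorization
//    request, using code_challenge_method=S256.
const codeChallenge = createHash("sha256").update(codeVerifier).digest("base64url");

// 3. At the token exchange it reveals the original verifier, proving it is the
//    same client that started the flow even if the authorization code leaked.
console.log({ codeVerifier, codeChallenge });
```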
Developer Tools & SDKs
Everything developers need to integrate AI into applications, with production-ready tools
Scope-based API tokens secured with SHA-256 hashing and database-backed storage.
Interactive API documentation with auto-generated specs and live testing.
Drop-in replacement for OpenAI APIs: use existing libraries and tools seamlessly (see the client sketch below).
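As a minimal sketch of that drop-in compatibility, the official openai npm client can be pointed at a local endpoint instead of api.openai.com. The base URL and port, model id, and token below are assumed placeholders; substitute the values from your own instance.

```typescript
// Sketch only: reuse the official OpenAI SDK against a local OpenAI-compatible endpoint.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://localhost:1135/v1", // assumed local address, not an official default
  apiKey: "<your-scoped-api-token>",   // token issued from the app's token settings
});

const completion = await client.chat.completions.create({
  model: "<local-gguf-or-provider-model>",
  messages: [{ role: "user", content: "Say hello from a local model." }],
});

console.log(completion.choices[0].message.content);
```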
Flexible Deployment Options
Deploy anywhere, from desktop to cloud, with hardware-optimized variants for maximum performance
Native desktop apps built with Tauri for Windows, macOS (Intel/ARM), and Linux.
CPU (AMD64/ARM64), CUDA, ROCm, and Vulkan-optimized images for every hardware target.
RunPod auto-configuration and support for any Docker-compatible cloud platform.
8-12x speedup with CUDA/ROCm GPU support for NVIDIA and AMD graphics cards.
Health checks, monitoring, log management, and automatic database migrations.
Download for your platform
Choose your operating system to download BodhiApp. All platforms support running LLMs locally with full privacy.
macOS
Apple Silicon
File type: DMG
Download for macOS

Windows
x64
File type: MSI
Download for Windows

Linux
x64
File type: RPM
Download for Linux