Bodhi App - Run LLMs locally

Your Personal, Private, Powerful AI Assistant - Free & Open Source

Bodhi Chat Interface

Core Features

Everything you need to build AI-powered applications

User Experience

Intuitive chat interface with full Markdown rendering and configurable settings.

Run everything locally on your machine with complete data control.

One-click downloads from HuggingFace with real-time progress tracking.

Use local GGUF models alongside API providers (OpenAI, Anthropic, Groq) in one unified interface.

Server-Sent Events provide instant response feedback with live token streaming.
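
As a minimal sketch of what this looks like over the wire (the BODHI_HOST address and the model alias below are placeholders, not documented defaults), a request with "stream": true to the OpenAI-compatible chat endpoint returns tokens as they are generated:

# Stream a chat completion from a locally running Bodhi App instance.
# BODHI_HOST (e.g. http://localhost:8080) and the model alias are placeholders.
curl -N "$BODHI_HOST/v1/chat/completions" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3:instruct",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": true
  }'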

12+ parameters for fine-tuning: temperature, top-p, frequency penalty, and more.

Technical Capabilities

Drop-in replacement for OpenAI APIs. Use your existing code and tools.
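
In practice, existing OpenAI-style requests work unchanged once they point at the local server; SDKs that allow overriding the base URL (the official OpenAI clients do) only need that one setting changed. The sketch below assumes a placeholder BODHI_HOST address and model alias rather than documented defaults:

# Same request shape as the OpenAI chat completions API, sent to a local Bodhi App
# server. BODHI_HOST and the model alias are placeholders for your own setup.
curl "$BODHI_HOST/v1/chat/completions" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3:instruct",
    "messages": [{"role": "user", "content": "Summarize llama.cpp in one sentence."}],
    "temperature": 0.7,
    "top_p": 0.9
  }'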

Run models on your hardware for enhanced privacy and control.

Optimized inference with llama.cpp. 8-12x speedup with GPU acceleration (CUDA, ROCm).

Save and switch between inference configurations instantly without restarts.

Real-time statistics showing tokens per second and processing speed.

Download models asynchronously with progress tracking and auto-resumption.

Enterprise & Team Ready

Built for secure collaboration with enterprise-grade authentication and comprehensive user management

User Management Dashboard

Comprehensive admin interface for managing users, roles, and access requests.

Role-Based Access Control

4 role levels (User, PowerUser, Manager, Admin) with granular permission management.

Self-service access requests with admin approval gates and audit trail.

Enterprise-grade authentication with PKCE, session management, and token lifecycle control.

Secure team collaboration with session invalidation and role change enforcement.

Developer Tools & SDKs

Everything developers need to integrate AI into applications with production-ready tools

Production-ready npm package @bodhiapp/ts-client for seamless integration.
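
Installation is a standard npm dependency; refer to the package's own documentation for its client API:

# Add the TypeScript client to a Node.js/TypeScript project.
npm install @bodhiapp/ts-client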

Scope-based permissions with SHA-256 hashing and database-backed security.

Interactive API documentation with auto-generated specs and live testing.

Drop-in replacement for OpenAI APIs - use existing libraries and tools seamlessly.

Additional API format support for Ollama chat and models endpoints.
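
Assuming these routes mirror Ollama's own /api/chat and /api/tags paths (an assumption to verify against the API docs; BODHI_HOST and the model name are placeholders), Ollama-style tooling can be pointed at the same server:

# List models in Ollama's "tags" format, then send an Ollama-style chat request.
# Whether Bodhi App exposes exactly these paths should be checked in its API docs.
curl "$BODHI_HOST/api/tags"
curl "$BODHI_HOST/api/chat" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3:instruct",
    "messages": [{"role": "user", "content": "Hi"}]
  }'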

Flexible Deployment Options

Deploy anywhere - from desktop to cloud, with hardware-optimized variants for maximum performance

Native desktop apps for Windows, macOS (Intel/ARM), and Linux with Tauri.

CPU (AMD64/ARM64), CUDA, ROCm, and Vulkan optimized images for every hardware configuration.
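
As an illustrative sketch only (the image name, tag, port, and volume below are placeholders, not the project's published values; check the project's container registry and docs for the real ones), running the variant that matches your hardware looks roughly like this:

# Placeholder image name/tag - substitute the published CPU, CUDA, ROCm, or Vulkan image.
# --gpus all applies to the NVIDIA/CUDA variant; the volume keeps models and settings.
docker run -d \
  --gpus all \
  -p 8080:8080 \
  -v bodhi-data:/data \
  <registry>/<bodhi-image>:cuda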

RunPod auto-configuration and support for any Docker-compatible cloud platform.

8-12x speedup with CUDA/ROCm GPU support for NVIDIA and AMD graphics cards.

Persistent storage with backup/restore strategies and migration support.

Health checks, monitoring, log management, and automatic database migrations.

Download for your platform

Choose your operating system to download BodhiApp. All platforms support running LLMs locally with full privacy.

Package Managers

brew install BodhiSearch/apps/bodhi
