LLM Controller Community Edition is a self-hosted browser app for chatting with local GGUF models through a configured llama-server executable.

Chat first
Users spend most of their time in the main chat workspace, working with the local model that an admin has loaded.
The main experience is the chat UI. Signed-in users create chats, send prompts to the currently loaded main model, stream responses, attach supported files, and manage their own history.
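To make the streaming behavior concrete, here is a minimal sketch of how a client might accumulate a streamed response, assuming llama-server's OpenAI-compatible server-sent-events format (`/v1/chat/completions` with `"stream": true`); the sample lines are canned stand-ins for what an HTTP client would yield, not output captured from CE itself.

```python
import json

def collect_stream(lines):
    """Accumulate assistant text from SSE 'data:' lines."""
    text = []
    for line in lines:
        if not line.startswith("data: "):
            continue
        payload = line[len("data: "):]
        if payload.strip() == "[DONE]":  # end-of-stream sentinel
            break
        delta = json.loads(payload)["choices"][0]["delta"]
        text.append(delta.get("content", ""))
    return "".join(text)

# Canned sample of the token-by-token deltas a streaming response carries.
sample = [
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    "data: [DONE]",
]
print(collect_stream(sample))  # prints "Hello"
```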
A drawer bar sits beside that chat experience. It opens model status, live logs, GPU monitoring, analytics, chat import/export, benchmarks, and admin settings so operators can see and control what is happening behind the scenes.
Runtime Visibility
Logs, model status, GPU telemetry, analytics, and chat history tools stay close to the chat without taking over the page.
Administration
Admins manage the local model registry, model launch controls, users, password policy, auth email settings, benchmarks, and destructive chat tools.
Setup & Settings
The installer and Admin Settings drawer define the database connection, runtime defaults, attachment limits, SMTP settings, and local prerequisites.
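As a rough illustration of the kind of values the installer collects, a configuration might resemble the following; the variable names and values here are hypothetical examples, not CE's actual keys or defaults.

```shell
# Hypothetical settings sketch -- names and values are illustrative only.
DB_HOST=localhost               # MySQL connection for accounts and settings
DB_NAME=llm_controller
SMTP_HOST=smtp.example.com      # auth email delivery
SMTP_PORT=587
MAX_ATTACHMENT_MB=25            # attachment size limit
LLAMA_SERVER_PATH=/usr/local/bin/llama-server
```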
CE launches the configured llama-server executable against local GGUF model files. Admins choose which scanned models are enabled, which one is loaded as the main chat model, and whether a separate title model process is running.
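For orientation, a launch of the underlying process might look like the sketch below; the flags are standard upstream llama.cpp llama-server options, while the paths and values are placeholders rather than CE's configured defaults.

```shell
# Illustrative llama-server invocation; paths and values are placeholders.
/usr/local/bin/llama-server \
  --model /models/main-chat.gguf \
  --host 127.0.0.1 \
  --port 8080 \
  --ctx-size 4096
```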
MySQL stores accounts, app settings, login attempt records, and benchmark metadata/results. A local SQLite database stores chat sessions, chat turns, response metadata, usage metrics, and version/variant state.
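The page does not show CE's actual schema, so as a purely hypothetical sketch, the chat side of that split might hold tables along these lines, with accounts living in MySQL and only a user ID referenced locally; all table and column names below are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE chat_sessions (
    id INTEGER PRIMARY KEY,
    user_id INTEGER NOT NULL,   -- account itself lives in MySQL
    title TEXT,
    created_at TEXT DEFAULT CURRENT_TIMESTAMP
);
CREATE TABLE chat_turns (
    id INTEGER PRIMARY KEY,
    session_id INTEGER REFERENCES chat_sessions(id),
    role TEXT CHECK (role IN ('user', 'assistant')),
    content TEXT,
    tokens_generated INTEGER    -- per-turn usage metric
);
""")
conn.execute("INSERT INTO chat_sessions (user_id, title) VALUES (1, 'demo')")
conn.execute(
    "INSERT INTO chat_turns (session_id, role, content) VALUES (1, 'user', 'hi')"
)
print(conn.execute("SELECT COUNT(*) FROM chat_turns").fetchone()[0])  # prints 1
```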