Aethera
A web dashboard for AI-powered conversations and image generation, backed by any OpenAI-compatible API.
Features
- Chat Interface — streaming responses with Markdown rendering and syntax highlighting
- Thinking Support — displays model reasoning/thinking content when available
- Multiple Conversations — switch between threads with auto-generated titles
- Image Generation & Editing — create and edit images with customizable prompts, masks, and seeds
- Token Statistics — real-time prompt/generation throughput and timing metrics
- Theme Support — light and dark mode toggle
- Structured Output — JSON schema-based structured responses from models
- Embedded Frontend — single binary deployment with assets compiled in
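As a sketch of what the Structured Output feature builds on: an OpenAI-compatible chat request can constrain the reply with a JSON schema via the `response_format` field (field names follow the OpenAI API; the model name and schema here are illustrative):

```json
{
  "model": "gpt-4o-mini",
  "messages": [
    {"role": "user", "content": "Extract the city and country from: Paris, France"}
  ],
  "response_format": {
    "type": "json_schema",
    "json_schema": {
      "name": "location",
      "schema": {
        "type": "object",
        "properties": {
          "city": {"type": "string"},
          "country": {"type": "string"}
        },
        "required": ["city", "country"]
      }
    }
  }
}
```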
Quick Start
Prerequisites
- Go 1.25.5+
- Bun
- An OpenAI-compatible API endpoint
Using Make
make all # Build frontend + backend
./backend/dist/aethera
Using Docker
make docker
docker run -p 8080:8080 \
-e AETHERA_LLM_ENDPOINT=https://api.example.com/v1 \
-e AETHERA_LLM_KEY=your-key \
-v aethera-data:/app/data aethera
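Equivalently, a minimal docker-compose.yml (image name, env vars, and volume mirror the `docker run` command above):

```yaml
services:
  aethera:
    image: aethera
    ports:
      - "8080:8080"
    environment:
      AETHERA_LLM_ENDPOINT: https://api.example.com/v1
      AETHERA_LLM_KEY: your-key
    volumes:
      - aethera-data:/app/data
volumes:
  aethera-data:
```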
Manual Build
# Frontend
cd frontend && bun install && bun run build && cd ..
# Copy assets to backend
mkdir -p backend/web/static
cp -R frontend/public/. backend/web/static/
# Backend
cd backend && go build -o ./dist/aethera ./cmd
./dist/aethera
Open http://localhost:8080 in your browser.
Configuration
Aethera can be configured via CLI flags or environment variables (prefixed with AETHERA_):
| Flag | Env Var | Default | Description |
|---|---|---|---|
| --llm-endpoint | AETHERA_LLM_ENDPOINT | (required) | OpenAI-compatible API endpoint URL |
| --llm-key | AETHERA_LLM_KEY | | API key for authentication |
| --data-dir | AETHERA_DATA_DIR | ./data | Directory for chats, settings, and images |
| --static-dir | AETHERA_STATIC_DIR | (embedded) | Serve frontend from disk (for development) |
| --listen | AETHERA_LISTEN | localhost | Listen address |
| --port | AETHERA_PORT | 8080 | Listen port |
Example:
AETHERA_LLM_ENDPOINT=https://api.example.com/v1 AETHERA_LLM_KEY=your-key ./backend/dist/aethera
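The same configuration can be passed as flags instead (per the table above):

```shell
./backend/dist/aethera --llm-endpoint https://api.example.com/v1 --llm-key your-key
```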
Development
A Nix flake is provided for the development environment:
nix develop # or use direnv with .envrc
This provides Go, Bun, gopls, typescript-language-server, golangci-lint, and watchman.
For hot-reload development:
make dev
This runs the Go backend (serving the frontend from disk) and the frontend build in watch mode concurrently.
Getting Started
- Configure Your API — set AETHERA_LLM_ENDPOINT and optionally AETHERA_LLM_KEY as environment variables
- Start the Server — run the binary and navigate to http://localhost:8080
- Configure Model Selectors — navigate to Settings to configure model selectors for chat and image generation
Supported AI Services
Aethera works with any OpenAI-compatible API, including:
- OpenAI
- Local LLMs (Ollama, llama.cpp, LocalAI, etc.)
- Any other compatible service
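For example, to point Aethera at a local Ollama instance (Ollama exposes an OpenAI-compatible API under /v1 on its default port, and needs no API key locally):

```shell
AETHERA_LLM_ENDPOINT=http://localhost:11434/v1 ./backend/dist/aethera
```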
Llama.cpp-specific features like per-token timings are automatically detected.
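As an illustration of what the token-statistics feature computes: llama.cpp's server reports generation timings as a token count plus elapsed milliseconds, from which throughput follows directly. A minimal sketch (the helper name is hypothetical, not part of Aethera's code):

```go
package main

import "fmt"

// tokensPerSecond converts a token count and elapsed milliseconds
// (as reported in llama.cpp-style timing data) into throughput.
func tokensPerSecond(n int, ms float64) float64 {
	if ms <= 0 {
		return 0 // avoid division by zero on empty or missing timings
	}
	return float64(n) / (ms / 1000.0)
}

func main() {
	// 100 generated tokens over 2000 ms → 50 tokens/s
	fmt.Println(tokensPerSecond(100, 2000))
}
```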
License
See LICENSE file for details.