- Add new `qwen3-8b-vision` model with multimodal support via an mmproj (multimodal projector) file
- Add new `qwen2.5-coder-7b-instruct` model with fill-in-the-middle (FIM) enabled via `--fim-qwen-7b-default` (both model definitions are sketched after this list)
- Update CUDA device usage from `CUDA0` to `CUDA1` for `olmoe-7b-instruct` and `phi-mini-8b-instruct`
- Upgrade llama.cpp to version 7426, updating the source hash and restricting CUDA architectures to `61;86` (see the pinning sketch after this list)
- Add Copilot acceptance shortcut `<C-J>` in insert mode and disable the default Tab mapping
- Tune cache type settings across multiple models to improve performance
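
As a rough sketch of what the two new model entries could look like, here is a Nix attrset in the shape of a llama-swap `models` section (meant to be rendered to YAML, e.g. with `pkgs.formats.yaml`). The GGUF paths, ports, device assignments, and attribute layout are assumptions; only the `--mmproj`, `--fim-qwen-7b-default`, and `--device` flags come from the bullets above.

```nix
# Sketch only -- not the repo's actual config. Paths and ports are placeholders.
{ pkgs, ... }:
let
  llamaServer = "${pkgs.llama-cpp}/bin/llama-server";
in
{
  models = {
    # Vision model: the main GGUF is paired with its mmproj projector file
    "qwen3-8b-vision" = {
      cmd = "${llamaServer} --port 9001 --device CUDA0"
        + " -m /models/qwen3-vl-8b-instruct.gguf"
        + " --mmproj /models/qwen3-vl-8b-instruct-mmproj.gguf";
    };

    # Coder model: the --fim-qwen-7b-default preset enables fill-in-the-middle
    "qwen2.5-coder-7b-instruct" = {
      cmd = "${llamaServer} --port 9002 --device CUDA1 --fim-qwen-7b-default";
    };
  };
}
```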
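The llama.cpp bump itself would plausibly be an overlay along these lines; the GitHub org, tag format, and hash are placeholders standing in for the commit's actual values, and the override arguments assume the nixpkgs `llama-cpp` package.

```nix
# Sketch of an overlay pinning llama.cpp and narrowing CUDA architectures.
final: prev: {
  llama-cpp = (prev.llama-cpp.override { cudaSupport = true; }).overrideAttrs (old: {
    version = "7426";
    src = prev.fetchFromGitHub {
      owner = "ggml-org";        # assumed upstream org
      repo = "llama.cpp";
      rev = "b7426";             # assumed tag naming for build 7426
      hash = prev.lib.fakeHash;  # the real updated hash lives in the commit
    };
    cmakeFlags = (old.cmakeFlags or [ ]) ++ [
      # Build only for the GPUs actually present (compute capability 6.1 and 8.6)
      "-DCMAKE_CUDA_ARCHITECTURES=61;86"
    ];
  });
}
```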
This commit makes significant updates to the Nix configuration, including:
- Added a common NixOS module with shared system packages (see the module sketch below)
- Updated home-manager configurations for various systems
- Enhanced terminal program configuration, including Claude Code integration
- Updated LLM server configurations with new models
- Improved btop configuration with a custom package
- Added opencode support with permission settings
- Updated tmux and nvim configurations
The changes introduce new features, update existing configurations, and improve the overall system setup.
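
For the shared module, a minimal sketch of the shape it might take; the path `modules/common.nix` and the package list are illustrative, not the repo's actual contents.

```nix
# modules/common.nix -- settings every host imports (illustrative example).
{ pkgs, ... }:
{
  environment.systemPackages = with pkgs; [
    git
    tmux
    btop
  ];

  nix.settings.experimental-features = [ "nix-command" "flakes" ];
}
```

Each host configuration would then pick it up with `imports = [ ../modules/common.nix ];`.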
- Remove old code-companion-config.lua file
- Move LLM configuration to llm.lua with LlamaSwap and Copilot integration
- Add copilot-vim plugin to default.nix
- Update which-key bindings to include new CodeCompanion commands
- Configure CodeCompanion with chat, inline, and command strategies using the LlamaSwap adapter (see the Neovim sketch after this list)
- Add memory configuration with default project files
- Update LSP configuration for CodeCompanion markdown rendering
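
Putting the Copilot keybinding and the CodeCompanion strategies together, here is a hedged sketch of how this could be wired through home-manager's `programs.neovim`. The repo keeps the Lua in `llm.lua` and the plugin list in `default.nix`; the plugin attribute names are assumed from nixpkgs, and the custom `llamaswap` adapter definition is omitted.

```nix
{ pkgs, ... }:
{
  programs.neovim = {
    enable = true;
    plugins = with pkgs.vimPlugins; [ copilot-vim codecompanion-nvim ];
    extraLuaConfig = ''
      -- Accept Copilot suggestions with <C-J> instead of the default <Tab>
      vim.g.copilot_no_tab_map = true
      vim.keymap.set("i", "<C-J>", 'copilot#Accept("\\<CR>")',
        { expr = true, replace_keycodes = false, silent = true })

      -- CodeCompanion: route chat, inline, and command strategies through
      -- a local llama-swap endpoint (adapter definition not shown here)
      require("codecompanion").setup({
        strategies = {
          chat   = { adapter = "llamaswap" },
          inline = { adapter = "llamaswap" },
          cmd    = { adapter = "llamaswap" },
        },
      })
    '';
  };
}
```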