A powerful coc.nvim extension that brings AI-powered code generation, editing, and assistance directly to your Neovim editor. Built on top of the Llamautoma framework.
- 🤖 AI-powered chat interface for code assistance
- ✏️ Smart code editing and modifications
- 🔨 Code generation and file composition
- 🔄 Workspace synchronization with AI context
- ⚡ Real-time streaming responses
- 🔒 Built-in safety controls
- Neovim >= 0.8.0
- coc.nvim
- Node.js >= 16.0.0
- Running Llamautoma server
- Ollama
  - Default model: `qwen2.5-coder:7b`

(A cloud-based solution is coming soon; a VSCode extension may follow.)
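To sanity-check these prerequisites from inside Neovim, a few built-in commands suffice (a minimal sketch; it assumes `node` is on your `PATH`):

```vim
" Quick prerequisite check from inside Neovim
:version                    " Neovim must report >= 0.8.0
:echo executable('node')    " prints 1 if Node.js is on your PATH
:CocInfo                    " confirms coc.nvim is installed and running
```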
Install using your preferred plugin manager:
" Using vim-plug
Plug 'neoclide/coc.nvim', {'branch': 'release'}
Plug 'dgpt/coc-llamautoma'
" Using packer.nvim
use {'neoclide/coc.nvim', branch = 'release'}
use {'dgpt/coc-llamautoma'}
" Using Vundle
Plugin 'neoclide/coc.nvim'
Plugin 'dgpt/coc-llamautoma'
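If the extension is published to the npm registry (an assumption; check the project page), coc.nvim's own extension manager works as well:

```vim
" Install via coc.nvim's extension manager (assumes an npm release exists)
:CocInstall coc-llamautoma
```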
Configure the extension in your `coc-settings.json`:
```json
{
  "llamautoma.enable": true,
  "llamautoma.url": "http://localhost:3000",
  "llamautoma.timeout": 30000,
  "llamautoma.model": "qwen2.5-coder:7b",
  "llamautoma.autoSync": true,
  "llamautoma.syncOnSave": true,
  "llamautoma.syncIgnorePatterns": [
    "node_modules",
    "dist",
    "build",
    ".git"
  ],
  "llamautoma.maxFileSize": 1000000,
  "llamautoma.logLevel": "info"
}
```
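coc.nvim can open this file for you, so there is no need to locate it manually:

```vim
" Open coc-settings.json for editing
:CocConfig
```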
The following commands can be executed with `:CocCommand`:

- `llamautoma.chat`: Open an interactive chat for code assistance
- `llamautoma.sync`: Synchronize the workspace with the AI context
Each command supports streaming responses for real-time feedback.
" Example key mappings
nmap <silent> <Leader>lc :CocCommand llamautoma.chat<CR>
nmap <silent> <Leader>ls :CocCommand llamautoma.sync<CR>
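The same commands are also reachable from Vimscript through coc's `CocActionAsync('runCommand', ...)` API, which is useful inside autocmds; for example, this sketch syncs the workspace once coc.nvim has initialized:

```vim
" Sync the workspace automatically once coc.nvim has initialized.
" CocActionAsync('runCommand', ...) is the scripting equivalent of :CocCommand.
autocmd User CocNvimInit call CocActionAsync('runCommand', 'llamautoma.sync')
```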
Available settings in `coc-settings.json`:

```jsonc
{
  // Server Configuration
  "llamautoma.enable": true,
  "llamautoma.url": "http://localhost:3000",  // Local server URL
  "llamautoma.timeout": 30000,                // Request timeout in ms

  // Model Configuration
  "llamautoma.model": "qwen2.5-coder:7b",     // Default model

  // Sync Configuration
  "llamautoma.autoSync": true,                // Auto-sync workspace
  "llamautoma.syncOnSave": true,              // Sync on file save
  "llamautoma.syncIgnorePatterns": [          // Files to ignore
    "node_modules",
    "dist",
    "build",
    ".git"
  ],

  // Safety Configuration
  "llamautoma.maxFileSize": 1000000,          // Max file size in bytes
  "llamautoma.logLevel": "info"               // Logging verbosity
}
```
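These settings can also be scoped to a single project: coc.nvim reads a workspace-level `.vim/coc-settings.json`, which it can open for you:

```vim
" Open the workspace-local configuration (.vim/coc-settings.json)
:CocLocalConfig
```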
coc-llamautoma includes several safety features:
- Request timeouts and size limits
- Configurable file exclusions
- Workspace synchronization controls
- Error handling and recovery
- Activity logging
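Most of these controls map directly to the settings above, so they can be tightened for a single session with coc's `coc#config()` helper; a sketch (the extra ignore pattern is illustrative):

```vim
" Tighten safety limits at runtime without editing coc-settings.json.
call coc#config('llamautoma.maxFileSize', 500000)
call coc#config('llamautoma.syncIgnorePatterns',
      \ ['node_modules', 'dist', 'build', '.git', 'secrets'])
```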
Common issues and solutions:

- Server Connection

  ```sh
  # Check server status
  curl http://localhost:3000/health
  ```

- Extension Loading

  ```vim
  :CocList extensions   " Check if the extension is loaded
  :CocInfo              " Check extension status
  ```

- Logs

  ```vim
  :CocCommand workspace.showOutput
  ```
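For recurring connection problems, a small wrapper keeps the health check inside the editor (a hypothetical helper; it assumes `curl` is installed and the default server URL):

```vim
" :LlamautomaHealth pings the server and reports the result (hypothetical helper).
function! s:LlamautomaHealth() abort
  let l:out = system('curl -sf http://localhost:3000/health')
  if v:shell_error
    echohl ErrorMsg | echo 'Llamautoma server unreachable' | echohl None
  else
    echo 'Llamautoma server OK: ' . trim(l:out)
  endif
endfunction
command! LlamautomaHealth call s:LlamautomaHealth()
```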
MIT License - see the LICENSE file for details.
Contributions are welcome! Please feel free to submit a Pull Request.