

magenta.nvim is a plugin for leveraging LLM agents in neovim. Think cursor-compose, cody or windsurf, but open source.
Rather than writing complex code to compress your repo and send it to the LLM (like a repomap, etc...), magenta is built around the idea that the LLM can ask for what it needs via tools. Flagship models will continue to get better at tool use.
Alongside general tools like reading or editing a file, and listing a directory, this plugin also grants the LLM access to the language server via nvim's lsp client.
See the implemented tools.
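The tool-use idea can be illustrated with a minimal dispatch sketch. The tool names and shapes below are illustrative assumptions for this README, not magenta.nvim's actual API; see the implemented tools for the real definitions.

```typescript
import * as fs from "fs";

// Hypothetical tool-call shapes: the model asks for context instead of
// receiving a compressed copy of the whole repo up front.
type ToolCall =
  | { name: "get_file"; params: { path: string } }
  | { name: "list_directory"; params: { path: string } };

// Execute a tool call and return its result as text.
function runTool(call: ToolCall): string {
  switch (call.name) {
    case "get_file":
      return fs.readFileSync(call.params.path, "utf8");
    case "list_directory":
      return fs.readdirSync(call.params.path).join("\n");
  }
}

// The result would be appended to the conversation, letting the model
// decide what to request next.
const listing = runTool({ name: "list_directory", params: { path: "." } });
```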
Install bun
```lua
{
  "dlants/magenta.nvim",
  lazy = false, -- you could also bind to <leader>m
  build = "bun install --frozen-lockfile",
  config = function()
    require('magenta').setup()
  end
},
```
By default, `<leader>m` will toggle the input and display windows of the magenta side panel. The chat window submits your query on `<CR>` in normal mode.
The display window is not modifiable; however, you can interact with some parts of the chat by pressing `<CR>`. For example, you can expand tool requests and responses to see their details, and you can trigger a diff to appear on file edits.
Currently there's no way to invoke context-gathering commands yourself (#TODO), but you can ask the LLM to gather context via tools. For example: "I have some buffers open, could you see if you can change abc to xyz?".
Some cool things I've gotten to work so far:
- It uses bun for faster startup, a lower memory footprint, and ease of development with Typescript.
- It uses the new rpc-based remote plugin setup. This means more flexible plugin development (you can easily use both lua and typescript), and no need for `:UpdateRemotePlugins`! (h/t wallpants).
- The state of the plugin is managed via an elm-inspired architecture (The Elm Architecture, or TEA). This makes it very predictable for code generation, makes adding new functionality easy and robust, eases testing, and makes some cool future features possible (like the ability to save a chat state into a file and restore previous chats from file on startup).
- In order to use TEA, I had to build a VDOM-like system for rendering text into a buffer. See the code example defining a tool view.
- Since it's mostly written in Typescript, we can leverage existing libraries to communicate with LLMs, and use async/await to manage side-effect chains, which greatly speeds up development.
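The TEA pattern above can be sketched in plain TypeScript. This is an illustrative toy (the `Model`, `Message`, `update`, and `view` names here are generic TEA conventions, not the plugin's real types): a model holds all state, a pure `update` function computes the next model from a message, and a `view` function renders the model into buffer lines.

```typescript
// Toy Elm-architecture (TEA) loop; all names are illustrative.
type Message = { type: "user-input"; text: string } | { type: "clear" };

type Model = { messages: string[] };

// Pure update: given the current model and a message, return the next model.
function update(model: Model, msg: Message): Model {
  switch (msg.type) {
    case "user-input":
      return { messages: [...model.messages, msg.text] };
    case "clear":
      return { messages: [] };
  }
}

// View: render the model into the lines of a (hypothetical) chat buffer.
function view(model: Model): string[] {
  return model.messages.map((m, i) => `${i + 1}: ${m}`);
}

let model: Model = { messages: [] };
model = update(model, { type: "user-input", text: "change abc to xyz" });
const lines = view(model); // ["1: change abc to xyz"]
```

Because `update` is pure and the model is plain data, the whole chat state can be serialized to a file and restored later, which is what makes features like chat persistence straightforward.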
If you'd like to contribute, please reach out to me. My email is listed at my blog: dlants.me