Fidget.nvim integration #813
5 comments · 18 replies
-
This is fantastic and such a clean implementation. To get additional information regarding the chat buffer you could do: `local chat = require("codecompanion").buf_get_chat(request.data.bufnr)`. That gives you access to everything in the chat buffer, so you could then access […]. My reason for only emitting the […]
I would very much love to add this to the docs pages. I've purposefully never added anything that resembles aesthetics because it's so subjective, but it would be great to showcase what others have done.
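For instance, a minimal sketch (my own, assuming the request event's `data` carries a `bufnr`, matching the snippets later in this thread) that pulls the chat table inside an autocmd and inspects what is available:

```lua
-- Sketch: reading chat-buffer state from a CodeCompanion User event.
-- buf_get_chat() is quoted from the comment above; the exact fields on
-- the returned `chat` table vary by version, so inspect rather than assume.
vim.api.nvim_create_autocmd("User", {
  pattern = "CodeCompanionRequestStarted",
  callback = function(request)
    local ok, codecompanion = pcall(require, "codecompanion")
    if not ok or not request.data.bufnr then
      return
    end
    local chat = codecompanion.buf_get_chat(request.data.bufnr)
    if chat then
      -- dump the available keys instead of guessing field names
      vim.print(vim.tbl_keys(chat))
    end
  end,
})
```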
-
@olimorris here's the final demo + code. Thanks for all of the input, quick responses, and CodeCompanion in general. I had a lot of fun hacking on this small integration this weekend. 😆
codecompanion-fidget-demo.mp4
Here's what I came up with in terms of code:
-- lua/plugins/codecompanion.lua
return {
{
"olimorris/codecompanion.nvim",
dependencies = {
-- ...
"j-hui/fidget.nvim"
},
opts = {
strategies = {
-- ...
},
},
init = function()
require("plugins.codecompanion.fidget-spinner"):init()
end,
},
}

-- lua/plugins/codecompanion/fidget-spinner.lua
local progress = require("fidget.progress")
local M = {}
function M:init()
local group = vim.api.nvim_create_augroup("CodeCompanionFidgetHooks", {})
vim.api.nvim_create_autocmd({ "User" }, {
pattern = "CodeCompanionRequestStarted",
group = group,
callback = function(request)
local handle = M:create_progress_handle(request)
M:store_progress_handle(request.data.id, handle)
end,
})
vim.api.nvim_create_autocmd({ "User" }, {
pattern = "CodeCompanionRequestFinished",
group = group,
callback = function(request)
local handle = M:pop_progress_handle(request.data.id)
if handle then
M:report_exit_status(handle, request)
handle:finish()
end
end,
})
end
M.handles = {}
function M:store_progress_handle(id, handle)
M.handles[id] = handle
end
function M:pop_progress_handle(id)
local handle = M.handles[id]
M.handles[id] = nil
return handle
end
function M:create_progress_handle(request)
return progress.handle.create({
title = " Requesting assistance (" .. request.data.strategy .. ")",
message = "In progress...",
lsp_client = {
name = M:llm_role_title(request.data.adapter),
},
})
end
function M:llm_role_title(adapter)
local parts = {}
table.insert(parts, adapter.formatted_name)
if adapter.model and adapter.model ~= "" then
table.insert(parts, "(" .. adapter.model .. ")")
end
return table.concat(parts, " ")
end
function M:report_exit_status(handle, request)
if request.data.status == "success" then
handle.message = "Completed"
elseif request.data.status == "error" then
handle.message = " Error"
else
handle.message = " Cancelled"
end
end
return M
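One small hardening idea (my own addition, not part of the module above): guard the `fidget.progress` require so the spinner degrades gracefully when fidget.nvim isn't installed, rather than erroring on load:

```lua
-- Sketch: optional-dependency guard for the fidget-spinner module above.
-- fidget.progress is the real fidget.nvim API used in this thread; the
-- early-return behaviour is my suggestion, not the original code.
local ok, progress = pcall(require, "fidget.progress")

local M = {}

function M:init()
  if not ok then
    -- fidget.nvim missing: skip registering autocmds instead of erroring
    vim.notify("fidget.nvim not found; CodeCompanion spinner disabled", vim.log.levels.WARN)
    return
  end
  -- ... register the CodeCompanionRequestStarted/Finished autocmds as above ...
end

return M
```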
-
I adapted your integration to work with noice.nvim:
-- plugins/ai/companion.lua
return {
"olimorris/codecompanion.nvim",
init = function()
require("plugins.ai.extensions.companion-notification").init()
end,
dependencies = {
"nvim-lua/plenary.nvim",
"nvim-treesitter/nvim-treesitter",
"folke/noice.nvim",
}
}

-- plugins/ai/extensions/companion-notification.lua
local Format = require("noice.text.format")
local Message = require("noice.message")
local Manager = require("noice.message.manager")
local Router = require("noice.message.router")
local ThrottleTime = 200
local M = {}
M.handles = {}
function M.init()
local group = vim.api.nvim_create_augroup("NoiceCompanionRequests", {})
vim.api.nvim_create_autocmd({ "User" }, {
pattern = "CodeCompanionRequestStarted",
group = group,
callback = function(request)
local handle = M.create_progress_message(request)
M.store_progress_handle(request.data.id, handle)
M.update(handle)
end,
})
vim.api.nvim_create_autocmd({ "User" }, {
pattern = "CodeCompanionRequestFinished",
group = group,
callback = function(request)
local message = M.pop_progress_message(request.data.id)
if message then
message.opts.progress.message = M.report_exit_status(request)
M.finish_progress_message(message)
end
end,
})
end
function M.store_progress_handle(id, handle)
M.handles[id] = handle
end
function M.pop_progress_message(id)
local handle = M.handles[id]
M.handles[id] = nil
return handle
end
function M.create_progress_message(request)
local msg = Message("lsp", "progress")
local id = request.data.id
msg.opts.progress = {
client_id = "client " .. id,
client = M.llm_role_title(request.data.adapter),
id = id,
message = "Awaiting Response: ",
}
return msg
end
function M.update(message)
if M.handles[message.opts.progress.id] then
Manager.add(Format.format(message, "lsp_progress"))
vim.defer_fn(function()
M.update(message)
end, ThrottleTime)
end
end
function M.finish_progress_message(message)
Manager.add(Format.format(message, "lsp_progress"))
Router.update()
Manager.remove(message)
end
function M.llm_role_title(adapter)
local parts = {}
table.insert(parts, adapter.formatted_name)
if adapter.model and adapter.model ~= "" then
table.insert(parts, "(" .. adapter.model .. ")")
end
return table.concat(parts, " ")
end
function M.report_exit_status(request)
if request.data.status == "success" then
return "Completed"
elseif request.data.status == "error" then
return " Error"
else
return " Cancelled"
end
end
return M

This gives a loading wheel that cycles (sorry, I don't have a workflow for screencasts lmao). It's probably a bit messier than it needs to be, since I basically copy-pasted your work as a starting point and then chipped away to match Noice's implementation for LSP progress events, but it gets the job done!
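A possible extension (my own sketch, placed inside the module before `return M`, not part of the adaptation above): the thread later mentions a `CodeCompanionRequestStreaming` User event, which could update the in-flight message text through the same `M.handles` table; the periodic `M.update()` loop would re-render it on the next tick:

```lua
-- Sketch: update the noice progress message once streaming begins.
-- Assumes the same M.handles table and ThrottleTime loop as above, and
-- that CodeCompanionRequestStreaming fires with the same request id.
vim.api.nvim_create_autocmd("User", {
  pattern = "CodeCompanionRequestStreaming",
  callback = function(request)
    local message = M.handles[request.data.id]
    if message then
      message.opts.progress.message = "Streaming response: "
    end
  end,
})
```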
-
Here's a version for fellow `vim.notify` users:

-- spinner frames (the original's definition was lost; these are standard Braille frames)
local spinner = { "⠋", "⠙", "⠹", "⠸", "⠼", "⠴", "⠦", "⠧", "⠇", "⠏" }
local group = vim.api.nvim_create_augroup("CodeCompanionFidgetHooks", { clear = true })
vim.api.nvim_create_autocmd({ "User" }, {
pattern = "CodeCompanionRequest*",
group = group,
callback = function(request)
local msg
if request.match == "CodeCompanionRequestStarted" then
msg = "[CodeCompanion] starting..."
elseif request.match == "CodeCompanionRequestStreaming" then
msg = "[CodeCompanion] streaming..."
else
msg = "[CodeCompanion] finished"
end
vim.notify(msg, "info", {
id = "code_companion_status",
title = "Code Companion Status",
opts = function(notif)
notif.icon = request.match == "CodeCompanionRequestFinished" and " "
---@diagnostic disable-next-line: undefined-field
or spinner[math.floor(vim.uv.hrtime() / (1e6 * 80)) % #spinner + 1]
end,
})
end,
})

NVIDIA_Overlay_2025-05-08_21-12-08.webm
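For the curious, the icon picker above derives the frame index from the clock: `vim.uv.hrtime()` returns nanoseconds, so dividing by `1e6 * 80` yields 80 ms ticks, and the modulo cycles through the frames. The same arithmetic factored into a plain Lua function (my own extraction, for illustration):

```lua
-- Frame selection used by the notify spinner above, factored out.
-- `hrtime_ns` is a monotonic timestamp in nanoseconds; each frame is
-- shown for `frame_ms` milliseconds before cycling to the next.
local function spinner_frame(frames, hrtime_ns, frame_ms)
  return frames[math.floor(hrtime_ns / (1e6 * frame_ms)) % #frames + 1]
end

local frames = { "⠋", "⠙", "⠹", "⠸" }
-- at t = 0 ns we get frame 1; at t = 80 ms we advance to frame 2
print(spinner_frame(frames, 0, 80))        --> ⠋
print(spinner_frame(frames, 80 * 1e6, 80)) --> ⠙
```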
-
I was inspired by this thread and made something similar for lualine. I have requested this here (nvim-lualine/lualine.nvim#1419), but I will leave the code below:

2025-05-22.23-45-16.mp4
-
CodeCompanion is AWESOME!
I'm working on a cool fidget.nvim integration to show signs of life when CodeCompanion is waiting.
It's pretty cool!
Tip
See the finalized demo & code here.
Demo 👀
Screen.Recording.2025-02-01.at.10.47.21.mov
See the source code 😎
Notes:
Anyway... I wanted to improve my experience a bit and show which adapter (and underlying model) is being used to make the request. I think this could improve the experience of `:CodeCompanion*` commands, where (unlike in the chat) you don't really know what the current adapter and model are. I don't have a CodeCompanion model selection in my status line or anything (since I use lualine with a unique status line on each buffer, indicating the current adapter and model on every single one seems a bit excessive).
Perhaps my use-case isn't the best... Nevertheless...
I figured providing data should be pretty easy! I traced the `CodeCompanionRequestStarted` event here:
codecompanion.nvim/lua/codecompanion/http.lua, line 174 (at d75641e)
codecompanion.nvim/lua/codecompanion/utils/init.lua, lines 8 to 11 (at d75641e)
I printed `event.data` and it seems pretty minimal for me. Output: […]
It would be great to also have something like `data.adapter` available.
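As a quick way to check what a given event actually carries, a throwaway autocmd (my own sketch) can dump the payload; the event names are taken from this thread, but the payload shape varies by CodeCompanion version, so inspect rather than assume fields:

```lua
-- Sketch: dump whatever CodeCompanion attaches to its User events.
vim.api.nvim_create_autocmd("User", {
  pattern = { "CodeCompanionRequestStarted", "CodeCompanionRequestFinished" },
  callback = function(event)
    vim.notify(vim.inspect(event.data), vim.log.levels.DEBUG)
  end,
})
```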