fix: XYN-175 chatpage space buffer reducer for sources/followps divs #1087
Summary of Changes

Hello @MayankBansal2004, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed.

This pull request refactors and enhances the dynamic bottom-spacing mechanism on the chat page. The primary goal is to ensure that the buffer space at the bottom of the chat window correctly accommodates the height of dynamically rendered elements such as sources, action buttons, and follow-up questions, particularly after an assistant message has completed streaming. This prevents UI elements from being cut off or leaving excessive whitespace, improving the overall user experience.
Note: CodeRabbit has detected other AI code review bot(s) in this pull request and will avoid duplicating their findings in the review comments. This may lead to a less comprehensive review.

Walkthrough

Centralizes chat bottom-space management by adding adjustBottomSpaceForContent with ResizeObserver + interval-driven updates during streaming, a post-stream finalization flow, and a follow-up-loaded callback. Threads a new onFollowUpQuestionsLoaded prop through VirtualizedMessages → ChatBox → FollowUpQuestions.
Sequence Diagram(s)

```mermaid
sequenceDiagram
autonumber
participant User
participant Chat as Chat Route
participant RO as ResizeObserver
participant T as Interval Timer
participant VM as VirtualizedMessages
participant CB as ChatBox
participant FU as FollowUpQuestions
User->>Chat: open chat / start streaming
Chat->>RO: observe last assistant message
Chat->>T: start periodic spacing ticks
RO-->>Chat: resize event
T-->>Chat: tick
Chat->>Chat: adjustBottomSpaceForContent()
Chat->>VM: render (onFollowUpQuestionsLoaded)
VM->>CB: pass onFollowUpQuestionsLoaded
CB->>FU: render (onQuestionsLoaded wired)
FU-->>CB: onQuestionsLoaded
CB-->>VM: onFollowUpQuestionsLoaded
VM-->>Chat: onFollowUpQuestionsLoaded
Chat->>Chat: adjustBottomSpaceForContent(forceUpdate)
Chat->>T: stop interval
Chat->>RO: finalize observe + timers
    Chat->>Chat: final adjust for actions/sources/follow-ups
```
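To make the walkthrough and diagram concrete, here is a minimal sketch of the streaming-time flow, assuming the names that appear elsewhere in this review (`adjustBottomSpaceForContent`, `messagesContainerRef`, the `[data-message-role="assistant"]` selector). The hook packaging and the buffer formula are illustrative assumptions, not the PR's actual code.

```tsx
import { useCallback, useEffect, useRef, useState } from "react"

// Hypothetical extraction of the spacing logic described in the walkthrough;
// the real chat.tsx wires these pieces into a much larger route component.
export function useBottomSpaceBuffer(isStreaming: boolean) {
  const messagesContainerRef = useRef<HTMLDivElement | null>(null)
  const [bottomSpace, setBottomSpace] = useState(0)

  const adjustBottomSpaceForContent = useCallback((forceUpdate = false) => {
    const container = messagesContainerRef.current
    if (!container) return
    const assistants = container.querySelectorAll<HTMLElement>(
      '[data-message-role="assistant"]',
    )
    const last = assistants[assistants.length - 1]
    if (!last) return
    // Assumed formula: leave just enough buffer below the last assistant
    // message (with its sources/actions/follow-ups) to keep it in view.
    const needed = Math.max(0, container.clientHeight - last.offsetHeight)
    // Functional update keeps the callback's dependency list empty.
    setBottomSpace((prev) =>
      forceUpdate || Math.abs(prev - needed) > 4 ? needed : prev,
    )
  }, [])

  // While streaming: a ResizeObserver plus a 100ms tick keep the buffer in sync.
  useEffect(() => {
    if (!isStreaming) return
    const observer = new ResizeObserver(() => adjustBottomSpaceForContent())
    const interval = setInterval(() => adjustBottomSpaceForContent(), 100)
    const container = messagesContainerRef.current
    if (container) observer.observe(container)
    return () => {
      observer.disconnect()
      clearInterval(interval)
    }
  }, [isStreaming, adjustBottomSpaceForContent])

  return { messagesContainerRef, bottomSpace, adjustBottomSpaceForContent }
}
```

The route would presumably render a spacer element sized by `bottomSpace` under the message list; the post-stream finalization and the `onFollowUpQuestionsLoaded` callback then force one last adjustment.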
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes
Pre-merge checks and finishing touches: ✅ Passed checks (3 passed)
Code Review
This pull request refactors the logic for adjusting the empty space at the bottom of the chat page to reduce it as content like sources and follow-up questions appear. The changes introduce a shared callback function and new effects to handle this adjustment both during and after message streaming.
My review focuses on improving the performance and robustness of this new implementation. I've identified a couple of areas for improvement:
- A potential performance issue due to a dependency cycle in a `useCallback`, which can be fixed by using a functional state update.
- The use of brittle `setTimeout` calls with magic numbers and `ResizeObserver`s attached to the wrong elements. I've suggested a more robust approach using `ResizeObserver` on the correct element, which should make the layout adjustments more reliable and remove the need for timers.
Overall, the goal of the PR is good, and the introduction of the `onFollowUpQuestionsLoaded` callback is a solid improvement. The suggested changes aim to make the implementation more efficient and maintainable.
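For the first point, a functional state update could look roughly like this sketch (the hook and the `applyBottomSpace` name are hypothetical; the PR's actual callback is `adjustBottomSpaceForContent` and does more work):

```tsx
import { useCallback, useState } from "react"

// By using a functional update, the callback never reads `bottomSpace`
// directly, so the state value can be dropped from the dependency array
// and the callback keeps a stable identity across renders.
function useStableSpacingUpdate() {
  const [bottomSpace, setBottomSpace] = useState(0)

  const applyBottomSpace = useCallback((next: number) => {
    setBottomSpace((prev) => (prev === next ? prev : next))
  }, []) // no state dependency -> no effect churn downstream

  return { bottomSpace, applyBottomSpace }
}
```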
Actionable comments posted: 0
🧹 Nitpick comments (1)
frontend/src/routes/_authenticated/chat.tsx (1)
Lines 891-918: Consider removing the redundant 400ms timeout for follow-up questions.

The effect uses a hardcoded 400ms timeout to catch follow-up questions loading (Line 904), but there's also a dedicated callback `handleFollowUpQuestionsLoaded` (Lines 920-923) that's triggered when follow-up questions actually finish loading (Line 2323). The callback-based approach is more reliable than a fixed timeout, so the 400ms timeout might be redundant.

Consider simplifying by removing the 400ms timeout and relying solely on the callback:
```diff
-      // Timed adjustments to catch elements as they appear:
-      // 50ms: action buttons, 200ms: sources, 400ms: follow-up loading, 600ms: safety check
-      const timeouts = [50, 200, 400, 600].map((delay) =>
+      // Timed adjustments to catch elements as they appear:
+      // 50ms: action buttons, 200ms: sources, 600ms: safety check
+      const timeouts = [50, 200, 600].map((delay) =>
         setTimeout(() => adjustBottomSpaceForContent(true), delay),
       )
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
frontend/src/routes/_authenticated/chat.tsx (6 hunks)
🧰 Additional context used
🧠 Learnings (1)
📓 Common learnings
Learnt from: MayankBansal2004
PR: xynehq/xyne#765
File: server/api/chat/chat.ts:5706-5712
Timestamp: 2025-09-04T08:53:33.995Z
Learning: MayankBansal2004 prefers to defer technical improvements like input validation and type safety when the current implementation is working, even when acknowledging potential benefits. He takes a pragmatic approach of "if it's not broken, don't fix it" and will revisit improvements when he has bandwidth.
Learnt from: MayankBansal2004
PR: xynehq/xyne#532
File: frontend/src/routes/_authenticated/chat.tsx:1238-1242
Timestamp: 2025-06-16T08:36:57.488Z
Learning: MayankBansal2004 prefers to keep working implementations even when security improvements are suggested, but appreciates being made aware of potential security considerations for future reference.
🔇 Additional comments (2)
frontend/src/routes/_authenticated/chat.tsx (2)
Lines 920-923: LGTM! Callback mechanism for follow-up questions is well-implemented.

The new `handleFollowUpQuestionsLoaded` callback and its integration through the component tree is clean:

- Callback triggers spacing recalculation after follow-ups load (Lines 920-923)
- Properly threaded through `VirtualizedMessages` props (Lines 1512, 1990, 2033)
- Correctly passed to `FollowUpQuestions` as `onQuestionsLoaded` (Line 2323)

This event-driven approach is more reliable than fixed timeouts for handling dynamic content loading.
Also applies to: 1512-1512, 1990-1990, 2033-2033, 2323-2323
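The threading praised here can be pictured as the skeleton below; the prop types and component bodies are trimmed to the single callback and are not the real components.

```tsx
import * as React from "react"

type FollowUpQuestionsProps = { onQuestionsLoaded?: () => void }

function FollowUpQuestions({ onQuestionsLoaded }: FollowUpQuestionsProps) {
  // ...fetches and renders follow-up questions, then calls onQuestionsLoaded()
  return null
}

function ChatBox({ onFollowUpQuestionsLoaded }: { onFollowUpQuestionsLoaded?: () => void }) {
  // ChatBox forwards the route's callback down as onQuestionsLoaded
  return <FollowUpQuestions onQuestionsLoaded={onFollowUpQuestionsLoaded} />
}

function VirtualizedMessages({ onFollowUpQuestionsLoaded }: { onFollowUpQuestionsLoaded?: () => void }) {
  // The chat route passes handleFollowUpQuestionsLoaded in at this level
  return <ChatBox onFollowUpQuestionsLoaded={onFollowUpQuestionsLoaded} />
}
```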
Lines 834-864: Verify assistant message wrapper includes all dynamic content.

Ensure the element with `data-message-role="assistant"` fully wraps each assistant message's action buttons, source citations, and follow-up questions; any content rendered outside that wrapper won't be included in `offsetHeight`, causing bottom-space miscalculations.
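To make the concern concrete, a measurement along these lines (the helper name is hypothetical) only sees what the wrapper element itself contains, so siblings of the wrapper are invisible to it:

```ts
// offsetHeight covers only the element's own box, so sources or follow-ups
// rendered as siblings of the wrapper would be excluded from the measurement.
function measureLastAssistantHeight(container: HTMLElement): number {
  const assistants = container.querySelectorAll<HTMLElement>(
    '[data-message-role="assistant"]',
  )
  const last = assistants[assistants.length - 1]
  return last ? last.offsetHeight : 0
}
```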
Actionable comments posted: 1
♻️ Duplicate comments (2)
frontend/src/routes/_authenticated/chat.tsx (2)
Lines 869-893: Observe the last assistant element; drop unused deps to prevent effect churn.

Observing the scrolling container won't fire when inner content grows; observe the last assistant message element instead. Also, `partial` and `currentResp` aren't used here and cause re-subscribes during streaming.

Apply:
```diff
   useEffect(() => {
     if (!streamingStarted || (!isStreaming && !retryIsStreaming)) return
     if (initialBottomSpace === 0) return
-    const observer = new ResizeObserver(() => adjustBottomSpaceForContent())
-    const interval = setInterval(() => adjustBottomSpaceForContent(), 100)
+    const observer = new ResizeObserver(() => adjustBottomSpaceForContent())
+    const interval = setInterval(() => adjustBottomSpaceForContent(), 100)
+    // Initial sync
+    adjustBottomSpaceForContent()
-    const container = messagesContainerRef.current
-    if (container) observer.observe(container)
+    const container = messagesContainerRef.current
+    if (container) {
+      const assistants = container.querySelectorAll(
+        '[data-message-role="assistant"]',
+      )
+      const lastMessage = assistants[assistants.length - 1] as HTMLElement | null
+      if (lastMessage) observer.observe(lastMessage)
+    }
     return () => {
       observer.disconnect()
       clearInterval(interval)
     }
   }, [
     streamingStarted,
     isStreaming,
     retryIsStreaming,
     initialBottomSpace,
-    adjustBottomSpaceForContent,
-    partial,
-    currentResp,
+    adjustBottomSpaceForContent,
   ])
```
Lines 894-920: Post-stream observer should also track the last assistant element; timers can be simplified.

Same as above: observe the last assistant message, not the container. Optional: the multiple timeouts can be replaced by a single RAF + one safety timeout, now that you also get an explicit `onQuestionsLoaded` callback from `FollowUpQuestions`.

Apply (observer target change):
```diff
-    const container = messagesContainerRef.current
-    if (container) observer.observe(container)
+    const container = messagesContainerRef.current
+    if (container) {
+      const assistants = container.querySelectorAll(
+        '[data-message-role="assistant"]',
+      )
+      const lastMessage = assistants[assistants.length - 1] as HTMLElement | null
+      if (lastMessage) observer.observe(lastMessage)
+    }
```

Optional simplification (timers):

```diff
-    const timeouts = [50, 200, 400, 600].map((delay) =>
-      setTimeout(() => adjustBottomSpaceForContent(true), delay),
-    )
+    const rafId = requestAnimationFrame(() =>
+      adjustBottomSpaceForContent(true),
+    )
+    const safetyTimeout = setTimeout(
+      () => adjustBottomSpaceForContent(true),
+      300,
+    )
     return () => {
       observer.disconnect()
-      timeouts.forEach(clearTimeout)
+      cancelAnimationFrame(rafId)
+      clearTimeout(safetyTimeout)
     }
```
🧹 Nitpick comments (2)
frontend/src/routes/_authenticated/chat.tsx (2)
Lines 834-867: Core logic OK; consider avoiding repeated DOM queries in the hot path.

`adjustBottomSpaceForContent` looks correct and stable. Minor: calling `querySelectorAll` on every tick (observer/interval) is extra work. Cache the last assistant element ref outside the function and reuse it across calls within an effect cycle.
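One possible shape for that caching, as a sketch (the hook name and packaging are assumptions, not code from the PR):

```tsx
import { useEffect, useRef, type RefObject } from "react"

// Resolve the last assistant element once per render and let the
// observer/interval ticks reuse the cached ref instead of re-querying the DOM.
function useLastAssistantRef(containerRef: RefObject<HTMLElement | null>) {
  const lastAssistantRef = useRef<HTMLElement | null>(null)

  useEffect(() => {
    const container = containerRef.current
    if (!container) return
    const assistants = container.querySelectorAll<HTMLElement>(
      '[data-message-role="assistant"]',
    )
    lastAssistantRef.current = assistants[assistants.length - 1] ?? null
  }) // runs after each render; hot-path ticks read lastAssistantRef.current

  return lastAssistantRef
}
```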
Lines 921-925: Good hook; prefer RAF over a fixed delay.

Use `requestAnimationFrame` to avoid the arbitrary 150ms delay and reduce jank.
```diff
-  const handleFollowUpQuestionsLoaded = useCallback(() => {
-    setTimeout(() => adjustBottomSpaceForContent(true), 150)
-  }, [adjustBottomSpaceForContent])
+  const handleFollowUpQuestionsLoaded = useCallback(() => {
+    requestAnimationFrame(() => adjustBottomSpaceForContent(true))
+  }, [adjustBottomSpaceForContent])
```
🔇 Additional comments (3)
frontend/src/routes/_authenticated/chat.tsx (3)
Lines 1991-1992: LGTM: Prop added to type surface.

`onFollowUpQuestionsLoaded` added to `VirtualizedMessagesProps` is clear and minimal.

Lines 2034-2035: LGTM: Prop threaded into component.

Destructuring `onFollowUpQuestionsLoaded` in `VirtualizedMessages` is correct.

Lines 2314-2325: LGTM: Follow-ups wired with `onQuestionsLoaded`.

Hooking `onQuestionsLoaded` to `onFollowUpQuestionsLoaded` enables precise spacing finalization after follow-ups load.
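For reference, the child-side half of that wiring might look like the sketch below; the props and data flow are simplified, and the real `FollowUpQuestions` component has more responsibilities.

```tsx
import * as React from "react"

// Once the questions have arrived and rendered, notify the parent so the
// chat route can finalize the bottom spacing.
function FollowUpQuestionsSketch({
  questions,
  onQuestionsLoaded,
}: {
  questions: string[]
  onQuestionsLoaded?: () => void
}) {
  React.useEffect(() => {
    if (questions.length > 0) onQuestionsLoaded?.()
  }, [questions, onQuestionsLoaded])

  return (
    <ul>
      {questions.map((q) => (
        <li key={q}>{q}</li>
      ))}
    </ul>
  )
}
```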
## [3.13.4](v3.13.3...v3.13.4) (2025-10-10)

### Bug Fixes

* XYN-175 chatpage space buffer reducer for sources/followps divs ([#1087](#1087)) ([6ed3d75](6ed3d75))
Description
chatpage space buffer reducer for sources/followps divs
Testing
Tested locally.