
New functions architecture - Refactoring - Query and display multiple LLM responses simultaneously - backend #266

@tubamos

Description

Related tasks

The current task #266 might interfere with how the queries are sent to the LLMs at the very beginning of the common user interaction with the model, during the answering of the questions included in the initialisation prompt, which is related to task

The current task #266 might interfere with how the LLM answers are saved, which is part of task:

Description

The main functionality that needs to be re-introduced is what has been the goal of:

The new way the frontend components work after the changes Preet made in sprint 11 requires changes to some of the features that interface with them.

User Story

Acceptance Criteria (generally the same as #208)

  • After the user has selected the LLMs they require answers from, the queries are sent to all of them.
  • Each LLM's reply is presented in its own subsection of the chat bubble component, distinct from the replies coming from the other LLMs.
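The fan-out described by the criteria above can be sketched as a small async backend helper: the same prompt goes to every selected LLM concurrently, and the replies come back keyed by model name so the frontend can render each one in its own chat bubble subsection. This is a minimal sketch, not the issue's actual implementation; the function names and the stub clients are hypothetical stand-ins for real LLM API calls.

```python
import asyncio


async def query_llm(name, client, prompt):
    # Send the prompt to one LLM client and tag the reply with the model name.
    reply = await client(prompt)
    return name, reply


async def fan_out(prompt, clients):
    # Send the same prompt to every selected LLM concurrently and collect
    # the replies keyed by model name, one entry per chat bubble subsection.
    results = await asyncio.gather(
        *(query_llm(name, client, prompt) for name, client in clients.items())
    )
    return dict(results)


# Hypothetical stub clients standing in for real LLM API calls.
async def gpt_stub(prompt):
    return f"gpt answer to: {prompt}"


async def claude_stub(prompt):
    return f"claude answer to: {prompt}"


replies = asyncio.run(
    fan_out("Hello?", {"gpt": gpt_stub, "claude": claude_stub})
)
for model, reply in replies.items():
    print(f"[{model}] {reply}")
```

A dict keyed by model name keeps each reply independently addressable, which matches the requirement that every LLM's answer occupy a distinct subsection of the chat bubble.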

Definition of Done

  • The feature has been fully implemented.
  • The feature has been manually tested and works as expected without critical bugs.
  • The feature code is documented with clear explanations of its functionality and usage.
  • The feature code has been reviewed and approved by at least one team member.
  • The feature branches have been merged into the main branch and closed.
  • The feature utility, function, and usage have been documented in the respective project wiki on GitHub.

Metadata

Labels

  • app backend: Items related to the app backend
  • sprint-12: Items assigned to sprint 12

Projects

Status

Item Archive

Development

No branches or pull requests
