Answer extraction without LLMs #25

@DrDub

Description

While this project contains the SOTA in terms of embeddings and local LLMs, answer extraction is too slow.

It might be possible to use an answer-extraction system based on recurrent architectures (e.g., state-space models such as Mamba) trained on the output of a local LLM.

This is a research endeavour.
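To make the idea concrete, here is a toy sketch of the proposed distillation setup: a small recurrent "student" (a minimal Elman RNN in NumPy) learns to tag answer tokens from pseudo-labels produced by a stand-in `teacher_labels` function, which in the real system would be the slow local LLM. All names, the vocabulary, and the training data are illustrative, not part of this project.

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB = ["what", "is", "the", "capital", "of", "france", "paris"]
IDX = {w: i for i, w in enumerate(VOCAB)}

def teacher_labels(tokens):
    # Stand-in for the LLM teacher: marks the answer span ("paris").
    return [1.0 if t == "paris" else 0.0 for t in tokens]

V, H = len(VOCAB), 8
Wx = rng.normal(0, 0.1, (H, V))   # input-to-hidden weights
Wh = rng.normal(0, 0.1, (H, H))   # hidden-to-hidden weights
wo = rng.normal(0, 0.1, H)        # hidden-to-output weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(tokens):
    # Run the Elman RNN, producing a per-token answer probability.
    hs, ps, h = [], [], np.zeros(H)
    for t in tokens:
        x = np.zeros(V); x[IDX[t]] = 1.0
        h = np.tanh(Wx @ x + Wh @ h)
        hs.append(h)
        ps.append(sigmoid(wo @ h))
    return hs, ps

def train_step(tokens, labels, lr=0.5):
    # One step of backprop-through-time on per-token cross-entropy.
    global Wx, Wh, wo
    hs, ps = forward(tokens)
    dWx = np.zeros_like(Wx); dWh = np.zeros_like(Wh); dwo = np.zeros_like(wo)
    dh_next = np.zeros(H)
    for t in reversed(range(len(tokens))):
        dp = ps[t] - labels[t]        # d(BCE)/d(logit) for sigmoid output
        dwo += dp * hs[t]
        dh = dp * wo + dh_next
        dz = dh * (1 - hs[t] ** 2)    # gradient through tanh
        x = np.zeros(V); x[IDX[tokens[t]]] = 1.0
        dWx += np.outer(dz, x)
        h_prev = hs[t - 1] if t > 0 else np.zeros(H)
        dWh += np.outer(dz, h_prev)
        dh_next = Wh.T @ dz
    Wx -= lr * dWx; Wh -= lr * dWh; wo -= lr * dwo

sentence = ["the", "capital", "of", "france", "is", "paris"]
labels = teacher_labels(sentence)
for _ in range(200):
    train_step(sentence, labels)

_, probs = forward(sentence)
pred = [sentence[i] for i, p in enumerate(probs) if p > 0.5]
```

The point of the sketch is only the data flow: teacher output becomes supervision for a cheap student that can then run at inference time without the LLM. A real implementation would distil over many question/context pairs and use a trained sequence model (e.g., Mamba) rather than a hand-rolled RNN.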

Metadata

Assignees: No one assigned

Labels: enhancement (New feature or request), help wanted (Extra attention is needed)
