[Feature]: Request for Support of Dense and Sparse Features in bge-m3 Embedding Model #15384

@Syl123Syl

Description

🚀 The feature, motivation and pitch

Dear vLLM Team,
I hope this message finds you well. I am writing to request support for the bge-m3 embedding model, specifically the ability to return both its dense and sparse representations.
The bge-m3 model, available at https://modelscope.cn/models/BAAI/bge-m3, is a promising embedding model that could meaningfully improve retrieval quality across a range of NLP tasks. However, to fully leverage it, vLLM would need to expose both its dense and sparse feature representations.
Dense embeddings capture nuanced semantic relationships in the text, while bge-m3's sparse (lexical) embeddings assign learned weights to individual tokens, enabling keyword-level matching and hybrid retrieval. Supporting both outputs would make serving bge-m3 through vLLM useful for a much wider range of use cases.
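To illustrate the behaviour being requested, here is a minimal sketch using the FlagEmbedding library (linked under Alternatives), which already returns both outputs from bge-m3. The model name, flags, and output keys follow FlagEmbedding's documented encode API; the snippet is only meant to show the desired dense-plus-sparse output, not a proposed vLLM interface.

```python
# Sketch using FlagEmbedding (not vLLM) to show bge-m3's dense and sparse outputs.
from FlagEmbedding import BGEM3FlagModel

model = BGEM3FlagModel("BAAI/bge-m3", use_fp16=True)

sentences = [
    "What is BGE M3?",
    "BM25 is a bag-of-words retrieval function.",
]

# Request both representations from a single forward pass.
output = model.encode(sentences, return_dense=True, return_sparse=True)

dense_vecs = output["dense_vecs"]            # one dense vector per sentence (1024-dim)
lexical_weights = output["lexical_weights"]  # per-sentence dict of token id -> weight (sparse)
```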
I kindly ask that you consider adding support for returning the dense and sparse features of the bge-m3 model. It would also be greatly appreciated if you could share a roadmap or timeline for this feature, if one is available.
Thank you very much for your attention to this request. I look forward to your response and to the possibility of seeing this feature supported in the future.
Best regards,

Alternatives

https://github.com/FlagOpen/FlagEmbedding

Additional context

No response

Before submitting a new issue...

  • Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.

Labels

feature request (New feature or request), stale (Over 90 days of inactivity)
