@elPapo It is an interesting idea. I wanted to incorporate something similar into llama.cui recently using transformers.js, which has a dedicated pipeline for this kind of text generation. Take a look at https://huggingface.co/tasks/text-generation#completion-generation-models and https://huggingface.co/docs/transformers.js/en/index
-
Hi there,
I would like to create a tool to help people (especially children) with specific speech and language difficulties that make it very hard for them to write by hand or with a keyboard.
The high-level idea is to provide a tool that helps them by offering (numerous) suggestions for the next word in a very visual way; for children, the words would be filtered through a lexicon of safe words, and some words would be illustrated. Clicker is a similar tool, but it only runs on Windows and doesn't have all the features I'm considering.
There are many ways to approach this, including Bayesian word prediction, but it sounded like a good fit for LLMs, since they can capture the context efficiently.
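To make the idea concrete, here is a minimal sketch of the suggestion step, assuming you already have next-token logits from some model (the logit values, the `suggest` helper, and the safe-word set below are all hypothetical stand-ins, not part of any real API):

```python
import math

def softmax(logits):
    """Turn a {word: logit} dict into a {word: probability} dict."""
    m = max(logits.values())  # subtract the max for numerical stability
    exps = {w: math.exp(v - m) for w, v in logits.items()}
    total = sum(exps.values())
    return {w: e / total for w, e in exps.items()}

def suggest(logits, safe_words, k=3):
    """Top-k next-word suggestions, restricted to a safe-word lexicon."""
    probs = softmax(logits)
    allowed = {w: p for w, p in probs.items() if w in safe_words}
    return sorted(allowed, key=allowed.get, reverse=True)[:k]

# Toy logits standing in for a real model's output for the next word.
logits = {"cat": 2.0, "dog": 1.5, "darn": 3.0, "fish": 0.5}
safe = {"cat", "dog", "fish"}
print(suggest(logits, safe))  # "darn" is filtered out by the lexicon
```

Since softmax is monotonic in the logits, filtering after the softmax (as here) or before it only changes how the remaining probabilities are normalized, not their order.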
This led me here. I have experimented with a couple of things, and I have so many questions!
I see two different approaches:
In the second approach, I could ideally keep track of the probability of whole sequences of tokens and repeat the softmax step so as to always keep the n best sequences, which would allow several sequences sharing the same top first token to survive when they lead to different high-probability words.
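What the second approach describes is essentially beam search over token sequences. Here is a minimal sketch under toy assumptions: the `TOY_PROBS` table stands in for a real model's softmax output (in practice these probabilities would come from llama.cpp or transformers.js logits), and all names are illustrative:

```python
import math

# Toy next-token distributions standing in for a real LLM's softmax output.
TOY_PROBS = {
    (): {"the": 0.6, "a": 0.4},
    ("the",): {"cat": 0.5, "dog": 0.5},
    ("a",): {"cat": 0.7, "dog": 0.3},
}

def next_token_probs(seq):
    """Return {token: probability} for the sequence so far (toy stand-in)."""
    return TOY_PROBS.get(tuple(seq), {"<end>": 1.0})

def beam_search(n=3, steps=2):
    """Keep the n most probable whole sequences at every step.

    Each beam is (tokens, cumulative log-probability); summing log-probs
    tracks the probability of the whole sequence, so several candidates
    that share the same first token can all survive the pruning.
    """
    beams = [([], 0.0)]
    for _ in range(steps):
        candidates = []
        for tokens, logp in beams:
            for tok, p in next_token_probs(tokens).items():
                candidates.append((tokens + [tok], logp + math.log(p)))
        # Prune back to the n best *sequences*, not the n best next tokens.
        beams = sorted(candidates, key=lambda b: b[1], reverse=True)[:n]
    return beams

for tokens, logp in beam_search():
    print(" ".join(tokens), round(math.exp(logp), 3))
```

With these toy numbers, "the cat" and "the dog" (probability 0.3 each) both survive alongside "a cat" (0.28), which is exactly the behavior described above: two sequences branching from the same first token are kept because both lead to high-probability words.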
And that's where I'd really appreciate your help!
Thank you very much for reading this far!