Replies: 1 comment
-
I know there's a project called Petals that does something like this, but I've never tried it. As far as I understand it, one of the big problems is that a lot of data has to be moved around, and the inference process (excluding prompt processing) is highly sequential: each new token depends on the previous one, so the layers have to be traversed in order for every token. That doesn't lend itself very well to running distributed over the internet.
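As a rough, back-of-envelope illustration (every number here is an assumption for the sake of argument, not a measurement), something like this shows why the sequential hop across peers dominates per-token latency when layers are pipelined over the internet:

```python
# Back-of-envelope sketch: per-token latency when transformer layers are
# pipelined across internet-connected peers. Token generation is
# autoregressive, so every new token has to pass through every peer in order.
# All numbers below are illustrative assumptions, not measurements.

def per_token_latency_ms(num_peers: int,
                         rtt_ms: float,
                         activation_kib: float,
                         uplink_mbps: float,
                         compute_ms_per_peer: float) -> float:
    """Rough latency for one token to traverse all pipeline stages once."""
    transfer_ms = activation_kib * 8 / uplink_mbps            # ship activations to the next peer
    hop_ms = rtt_ms / 2 + transfer_ms + compute_ms_per_peer   # one-way hop per peer
    return num_peers * hop_ms

# Example: 8 home peers, 40 ms RTT between them, ~16 KiB of fp16 activations
# per hop (an 8192-dim hidden state), 20 Mbit/s uplink, 5 ms of compute each.
ms = per_token_latency_ms(8, 40.0, 16.0, 20.0, 5.0)
print(f"~{ms:.0f} ms per token, i.e. ~{1000 / ms:.1f} tokens/s")
```

With those (made-up) numbers you end up around 250 ms per token, i.e. a few tokens per second, and almost all of it is network rather than GPU time. The same 8 stages inside one machine collapse to basically just the compute term, which is the gap a Petals-style setup has to live with.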
-
Hi, sorry in advance if this idea is really dumb (I'm a total noob at coding etc.), but would it be possible, if a user allows it, to make part of their GPU accessible to other users while it's idle, provided they have a good internet connection? I was thinking that if a large number of users allowed this, and with a good open-source model, it could build real free, decentralized GPT access for everybody.