What would it take to train it? #76
-
that's the billion-dollar question, quite literally. just join the LAION Discord and start collaborating with researchers and other ML folks out there! many people want this answered.
-
I'm thinking of something like crowdsourcing a GPU cloud, like Folding@home but for GPUs. Plenty of people would be willing to donate their idle GPU hours to this, I think. Maybe talk to run.ai and see if they're interested in helping out? This could be a great opportunity for them to back one of the most exciting projects in AI right now without investing tons of money or GPU resources.
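For a rough picture of what each donated machine would run, here is a minimal sketch assuming PyTorch's DistributedDataParallel over an NCCL backend; the model, batch, and coordination details are placeholders, not the project's actual code, and a real volunteer pool would need a rendezvous service on top of this.

```python
# Minimal data-parallel worker sketch (placeholder model and data).
# Each participating machine would launch this via torchrun or an equivalent
# launcher that sets RANK, WORLD_SIZE, MASTER_ADDR, and MASTER_PORT.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ.get("LOCAL_RANK", 0))
    torch.cuda.set_device(local_rank)

    model = torch.nn.Linear(512, 512).cuda(local_rank)   # placeholder model
    model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)

    for step in range(1000):
        x = torch.randn(8, 512, device=local_rank)        # placeholder batch
        loss = model(x).pow(2).mean()
        optimizer.zero_grad()
        loss.backward()        # gradients are all-reduced across all workers here
        optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```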
-
Have you heard of this?
-
What would it take to build my own, perhaps a minimal proof of concept trained on a smaller dataset?
I'm guessing it is not feasible for a single machine to do it (in less than a couple of months), so distributed training is a must.
And not just hardware-wise: on the code side, I've known about OpenAI for a long time but never actually got into it. What I mean is, is everything really open if I dig deep enough?
Is the training data that was used available somewhere?
This is assuming the already-trained parameters are not available for download somewhere, which I guess they are not.
And I'm sorry if I'm asking something that is answered elsewhere; I haven't dug into it much, but this got me pretty excited to get back into AI!
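For the minimal-proof-of-concept part of the question, a single-GPU run on a small public dataset is usually enough to sanity-check the pipeline before worrying about distribution. Here is a rough sketch assuming PyTorch and torchvision, with CIFAR-10 and a tiny convolutional net standing in for the real data and model:

```python
# Single-GPU proof-of-concept training loop (CIFAR-10 as a stand-in dataset).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"

train_set = datasets.CIFAR10(root="./data", train=True, download=True,
                             transform=transforms.ToTensor())
loader = DataLoader(train_set, batch_size=128, shuffle=True, num_workers=2)

# Tiny convolutional classifier, just to exercise the pipeline end to end.
model = nn.Sequential(
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(64 * 8 * 8, 10),
).to(device)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

for epoch in range(2):                 # a couple of epochs is enough to sanity-check
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        loss = criterion(model(images), labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: last batch loss {loss.item():.4f}")
```

Once something like this runs end to end, the same loop can be scaled out with distributed data parallelism as sketched in the earlier reply.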