Please help decode #4250
Unanswered
jmikedupont2 asked this question in Q&A
Replies: 2 comments 2 replies
-
You could use
2 replies
-
So in that case I don't really want to dump it out. That is how I got the 20 GB dump, because the model gets read in a few times. So that is good. Thanks for helping me understand.
On Wed, Nov 29, 2023, 07:45 slaren wrote:
> Yes, it is just the model weights.
-
Need help: how do I get a list of floats from this? num_elements = 4194304
```
(gdb) p *tensor
$4 = {type = GGML_TYPE_Q4_0, backend = GGML_BACKEND_CPU, buffer = 0x0, n_dims = 2, ne = {4096, 1024, 1, 1}, nb = {18, 2304, 2359296, 2359296}, op = GGML_OP_NONE, …
```
https://twitter.com/introsp3ctor/status/1729646601155600837?t=XEOrcmg9d3LvLq4TX6bAlg&s=19
https://twitter.com/introsp3ctor/status/1729651540892307469?t=w6k5V4gqXVyCzh6XfzmVwQ&s=19
Basically I'm trying to figure out how to decode these quantized columns, because they're showing up as absolutely huge; one of them had 300,000 elements in it.