Replies: 4 comments 8 replies
-
Update: this discussion is outdated; PackBytes now has a new schema system.

From your code it looks like you are trying to automatically determine the number of bits needed for a field based on the value of the field for each item individually. You are doing that with this line: `const bit = Math.floor(val ? Math.log2(val) + 1 : 1);`. This is a misconception of how it works. Your encoding and decoding functions should know what the bit sizes are before doing the encoding and decoding; it's a schema you design in advance. If you generate bit sizes for every value then you have to send the bit sizes along with the data to decode it, which roughly doubles the size sent and defeats the purpose.
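A minimal sketch of the idea (not the PackBytes API; field names and bit widths are hypothetical): encoder and decoder share the same fixed schema, so no bit sizes ever travel with the data.

```js
// Hypothetical shared schema: 4 + 7 + 5 = 16 bits, fits in a uint16.
const schema = [
  { name: 'type',  bits: 4 },  // 0-15
  { name: 'hp',    bits: 7 },  // 0-127
  { name: 'level', bits: 5 },  // 0-31
];

function encode(obj) {
  let packed = 0;
  for (const { name, bits } of schema) {
    packed = (packed << bits) | (obj[name] & ((1 << bits) - 1));
  }
  const buf = Buffer.alloc(2);
  buf.writeUInt16BE(packed);
  return buf;
}

function decode(buf) {
  let packed = buf.readUInt16BE();
  const obj = {};
  // Unpack in reverse order of packing.
  for (let i = schema.length - 1; i >= 0; i--) {
    const { name, bits } = schema[i];
    obj[name] = packed & ((1 << bits) - 1);
    packed >>>= bits;
  }
  return obj;
}

// decode(encode({ type: 2, hp: 100, level: 7 })) -> { type: 2, hp: 100, level: 7 }
```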
I use PackBytes to take large JSON objects provided from a different source and compress them down 25 or 50 times smaller; the JSON has objects and arrays and strings in it, etc. You simply loop through the arrays and objects and add them to your buffer one at a time, nothing complicated about it. You do need to code this process manually, though. It looks like you are trying to automate everything with a single simple encoding function, but each field has different needs, so it's more of a manual process for each field. Usually each field has different optimizations you can make before encoding the value to reduce the size, and your decoder then does the reverse process to get the original value back, as in the sketch below.
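A sketch of that looping process with plain Buffer calls (not the PackBytes API; the field names and sizes are hypothetical): write the array length first, then each object's fields one at a time, with a per-field optimization of scaling the float to integer cents.

```js
// Assumes qty fits in 1 byte (0-255) and price is under 655.36 with 2 decimals.
function encodeItems(items) {
  const buf = Buffer.alloc(1 + items.length * 3);
  let offset = buf.writeUInt8(items.length, 0);                       // array length first
  for (const item of items) {
    offset = buf.writeUInt8(item.qty, offset);                        // 1 byte
    offset = buf.writeUInt16BE(Math.round(item.price * 100), offset); // 2 bytes, cents
  }
  return buf;
}

function decodeItems(buf) {
  const count = buf.readUInt8(0);
  let offset = 1;
  const items = [];
  for (let i = 0; i < count; i++) {
    const qty = buf.readUInt8(offset); offset += 1;
    const price = buf.readUInt16BE(offset) / 100; offset += 2;        // reverse the scaling
    items.push({ qty, price });
  }
  return items;
}

// decodeItems(encodeItems([{ qty: 3, price: 12.75 }])) -> [{ qty: 3, price: 12.75 }]
```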
Buffer has the buf.write(string) method to write strings into the buffer. If you write a string into the middle of a buffer you also need to record the length of the string so the decoder knows where the string ends. Usually you do this by writing the string length to the buffer first, then the string itself. Your decoding function reads how long the string is, then decodes the string by slicing the buffer at that position and length. Strings take up a lot of space, so ideally map strings to integers if possible.
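For example, a length-prefixed string could look like this (a sketch, assuming the string is at most 255 UTF-8 bytes so the length fits in one byte):

```js
function writeString(buf, offset, str) {
  const len = Buffer.byteLength(str, 'utf8');
  offset = buf.writeUInt8(len, offset);     // length prefix first
  return offset + buf.write(str, offset);   // then the string bytes; returns next offset
}

function readString(buf, offset) {
  const len = buf.readUInt8(offset);
  offset += 1;
  const str = buf.toString('utf8', offset, offset + len);
  return { str, offset: offset + len };
}
```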
This value needs 34 bits, so it is outside the 32-bit range for PackBytes. For this number I would use the Buffer writeUIntBE function, which lets you write a custom number of bytes; here it needs 5 bytes. It will be less efficient than ideal because it wastes 6 bits, but it's a better option than writeBigUInt64BE, which uses 8 bytes and would waste 30 bits. However, if your target is the web browser, DataView can only decode 32- and 64-bit numbers, so the 5-byte encoding won't work there; in that case use writeBigUInt64BE. I had a packBytes64 function that could pack 64 bits into one space but I removed it because it's slower and usually you don't need it; I could possibly add it back in.
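A quick sketch of both options for 12345678901:

```js
// 5-byte encoding (Node-to-Node): writeUIntBE supports custom byte lengths.
const buf5 = Buffer.alloc(5);
buf5.writeUIntBE(12345678901, 0, 5);
buf5.readUIntBE(0, 5); // 12345678901

// 8-byte fallback for a browser DataView decoder, which only reads 32/64-bit values.
const buf8 = Buffer.alloc(8);
buf8.writeBigUInt64BE(12345678901n, 0);
new DataView(buf8.buffer, buf8.byteOffset, 8).getBigUint64(0); // 12345678901n
```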
-
Hi,

I understand now, so the bits number will have to be a fixed value based on the field type.

For strings or other values bigger than int32, would it have to be the bits of each character? For example, could we convert the character's ASCII code (120 for "x") and send that as an int? But then how would we know this int is a character code? Maybe use a specific split character and know that everything between those split characters is an ASCII code, and decode them as letters or characters... so "aaa" would have to end up as |aaa| // 7+7+7+7+7 = 35 bits.

And how can we send an array with x elements? Is it safe to use a custom character as a delimiter, so that we know that all values between them are array elements and decode them inside an array? Something like this: s = a special character meaning that from this point until its next occurrence we have a string, and the encoded data would be something like 1111111s222222sa333333333aos444444444sa55555555555ao66666666oo, translating to:

Or is the best way to send the array elements as a separate transmission? But for arrays of objects inside arrays of objects that will be quite hard to manage... Thanks for your time
-
Here is the original object so you can understand better. I've translated all strings to ints and I have them available in the client browser, so I just need to send a bunch of objects that contain arrays of objects with int and float values.

All the objects are the same, only the array lengths differ, so I wanted to pack the object and send something like this:

o._id is a MongoDB _id and needs to be sent as a string, so I was trying to create a function or a class that will scan my object, create a schema, and then pack everything. On the client I will need to have that schema to unpack and recreate my object, translating the values back to object properties.
-
o._id can be sent as a blob if I can decode it back to a string; I just need it in case any object gets edited and I have to update or remove it from the database later.
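A sketch of that round trip, assuming the _id is a standard 24-character hex MongoDB ObjectId (the value below is a hypothetical example):

```js
const idHex = '507f1f77bcf86cd799439011';   // hypothetical _id string
const idBytes = Buffer.from(idHex, 'hex');  // 12-byte blob on the wire
const restored = idBytes.toString('hex');   // back to '507f1f77bcf86cd799439011'
```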
-
Hi, I'm trying to implement PackBytes into my workflow and was wondering if it's possible to encode strings or other data like objects or arrays. Also, how can I process values like 12345678901?
This is what I did:
I get the bits and the packed data, but it's not working for big integers, strings, or anything else. Any ideas on how to solve this?
And if we sort this out, we could make a recursive function to process object values like arrays or other objects too, and maybe use a custom character to split those custom values so they can have variable length.
Just wondering if this is possible.
Thanks