I'm very excited about the new json v2 library, and I'm facing the challenge that I have to marshal and unmarshal JSON containing multiple big base64-encoded files (such as PDFs or images) without reading the whole files into memory. I have solved it by writing my own JSON parser, but I'm curious whether there is a way to solve it with the current implementation. To make the problem clearer, I have built this simple snippet:
package main

import (
	"bytes"
	"fmt"

	"github.com/go-json-experiment/json"
	"github.com/go-json-experiment/json/jsontext"
)

type Test struct {
	F BigBase64File
}

type BigBase64File string

func (f *BigBase64File) UnmarshalJSONFrom(dec *jsontext.Decoder) error {
	// Read a big file.
	// The problem: there is no io.Reader here, so the value cannot be
	// streamed to storage like S3 without reading it fully into memory.
	v, err := dec.ReadValue()
	if err != nil {
		return err
	}
	*f = BigBase64File(v.String())
	return nil
}

func main() {
	jsonStr := `{"F":"<big base64 encoded file>"}`
	b := bytes.NewBufferString(jsonStr)

	var t Test
	err := json.UnmarshalRead(b, &t)
	if err != nil {
		panic(err)
	}
	fmt.Printf("%+v\n", t)
}
Is there a way to read the value in chunks so it can be streamed to storage like S3?
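For reference, the closest I could get with the current API looks roughly like the sketch below. ReadValue still buffers the entire encoded JSON string in memory, but at least the base64 decoding can then be streamed, so the decoded bytes never need a second full copy. The dst writer is just a stand-in for something like an S3 uploader; in real code it would be plumbed in rather than held in a package variable.

package main

import (
	"encoding/base64"
	"io"
	"os"
	"strings"

	"github.com/go-json-experiment/json"
	"github.com/go-json-experiment/json/jsontext"
)

// dst stands in for the real destination, e.g. an S3 multipart uploader.
var dst io.Writer = os.Stdout

type Test struct {
	F BigBase64File
}

type BigBase64File struct{}

// UnmarshalJSONFrom buffers the encoded JSON string (ReadValue cannot
// stream a single value), then streams the decoded bytes to dst.
func (f *BigBase64File) UnmarshalJSONFrom(dec *jsontext.Decoder) error {
	v, err := dec.ReadValue() // still reads the whole encoded value into memory
	if err != nil {
		return err
	}
	var encoded string
	if err := json.Unmarshal(v, &encoded); err != nil {
		return err
	}
	// Decode the base64 payload as a stream into the destination.
	_, err = io.Copy(dst, base64.NewDecoder(base64.StdEncoding, strings.NewReader(encoded)))
	return err
}

func main() {
	jsonStr := `{"F":"aGVsbG8gd29ybGQ="}` // "hello world", base64-encoded
	var t Test
	if err := json.UnmarshalRead(strings.NewReader(jsonStr), &t); err != nil {
		panic(err)
	}
}

This only helps with the decoded copy, though; what I'm really after is a way to avoid buffering the encoded value in the first place.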