When a query has a result set too large to materialize all at once,
the SDK provides `query.each()`, which yields each matching object to
a processor, one at a time. This is a handy tool, but when the number
of records is that large, we can also take advantage of batching the
processing itself. Compare the following operations:
```js
// Processing N items involves N calls to Parse Server
new Parse.Query('Item').each((item) => {
  item.set('foo', 'bar');
  return item.save();
});

// Processing N items involves ceil(N / batchSize) calls to Parse Server
const batchSize = 200;
new Parse.Query('Item').eachBatch((items) => {
  items.forEach((item) => item.set('foo', 'bar'));
  return Parse.Object.saveAll(items, { batchSize });
}, { batchSize });
```
The `.each()` method is already written to fetch the objects in
batches; we are effectively splitting it in two (sketched below):
- `.eachBatch()` does the work of fetching objects in batches and
yielding each batch
- `.each()` calls `.eachBatch()` and invokes the callback for every
item in each batch
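As a rough sketch of that layering (not the SDK's exact source),
`.each()` can be expressed in terms of `.eachBatch()` like this:
```js
// Sketch only: each() delegates batch fetching to eachBatch() and
// invokes the per-item callback sequentially within every batch.
function each(query, callback, options = {}) {
  return query.eachBatch((items) => {
    let chain = Promise.resolve();
    for (const item of items) {
      // Wait for each callback's promise before processing the next item.
      chain = chain.then(() => callback(item));
    }
    return chain;
  }, options);
}
```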
Aside: I considered promoting the undocumented `batchSize` option
already accepted by `.each()`, `.filter()`, `.map()`, and `.reduce()`
to the public API, but I suspect that by the time you are
performance-sensitive enough to tune that parameter, you are better
served by switching to `.eachBatch()`: the current implementation of
`.each()` constructs a promise chain with a node for every value in
the batch, and my experience with very long promise chains has been a
bit frustrating.
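For example (with a hypothetical `processItem` helper standing in for
whatever per-item work you need), the explicit-batching version avoids
building that per-item promise chain:
```js
// Hypothetical per-item worker; the batchSize value is illustrative.
const processItem = async (item) => {
  item.set('foo', 'bar');
  return item.save();
};

// Rather than tuning the undocumented option on .each()...
// new Parse.Query('Item').each(processItem, { batchSize: 500 });

// ...switch to .eachBatch() and control batching explicitly:
new Parse.Query('Item').eachBatch(async (items) => {
  for (const item of items) {
    await processItem(item);
  }
}, { batchSize: 500 });
```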
Co-authored-by: Arthur Cinader <700572+acinader@users.noreply.github.com>