I'm reading the guide on parallel scraping with multiple crawlers, and I can't understand why I need to pass a specific request queue in this code sample from the guide:

```js
router.addHandler('CATEGORY', async ({ page, enqueueLinks, request, log }) => {
    log.debug(`Enqueueing pagination for: ${request.url}`);
    // We are now on a category page. We can use this to paginate through and enqueue all products,
    // as well as any subsequent pages we find
    await page.waitForSelector('.product-item > a');
    await enqueueLinks({
        selector: '.product-item > a',
        label: 'DETAIL',
        requestQueue: await getOrInitQueue(), // <= Is this really necessary?
    });
    const nextButton = await page.$('a.pagination__next');
    if (nextButton) {
        await enqueueLinks({
            selector: 'a.pagination__next',
            label: 'CATEGORY',
        });
    }
});
```

Source: https://crawlee.dev/js/docs/guides/parallel-scraping
Hi there! Judging from a quick glance at the guide, it probably isn't necessary to pass the request queue explicitly in every `enqueueLinks` call.
Yeah, just setting it at the crawler level is enough.
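For illustration, a minimal sketch of what "setting it at the crawler level" could look like. This reuses the guide's `getOrInitQueue()` helper; the import path, start URL, and surrounding setup are assumptions, not code from the guide:

```js
import { PlaywrightCrawler } from 'crawlee';
// Assumed locations for the router and queue helper shown above.
import { router } from './routes.js';
import { getOrInitQueue } from './requestQueue.js';

const crawler = new PlaywrightCrawler({
    // With the queue set here, enqueueLinks calls inside the handlers
    // default to it, so the per-call `requestQueue:` option can be dropped.
    requestQueue: await getOrInitQueue(),
    requestHandler: router,
});

await crawler.run(['https://example.com/categories']); // placeholder start URL
```

The trade-off is that a crawler-level queue applies to all handlers; passing `requestQueue` per call, as the guide does, only matters if some labels should land in a different queue.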