Replies: 1 comment
-
May I suggest an approach that will save you some RAM:
1. Chunk the results.
2. Add the chunks to a queue.
3. Divide the sitemap into parts based on the month or day, link them all in the index_sitemap, and save them to S3 or to disk.

Later you don't need to update everything: when you get content that has been updated, check its created_at and regenerate only that month's file.
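A minimal sketch of that idea using spatie/laravel-sitemap's `Sitemap` and `SitemapIndex` classes. The `Post` model, its `slug` column, and the URL scheme are assumptions for illustration; `cursor()` streams rows one at a time so memory stays flat:

```php
use Spatie\Sitemap\Sitemap;
use Spatie\Sitemap\SitemapIndex;
use Spatie\Sitemap\Tags\Url;
use App\Models\Post; // hypothetical model with `slug`, `created_at`, `updated_at`

// One sitemap file per month of created_at.
$months = Post::query()
    ->selectRaw("DATE_FORMAT(created_at, '%Y-%m') as month")
    ->groupBy('month')
    ->pluck('month');

$index = SitemapIndex::create();

foreach ($months as $month) {
    $sitemap = Sitemap::create();

    Post::query()
        ->whereRaw("DATE_FORMAT(created_at, '%Y-%m') = ?", [$month])
        ->cursor() // stream rows instead of loading them all into memory
        ->each(fn ($post) => $sitemap->add(
            Url::create("/posts/{$post->slug}")
                ->setLastModificationDate($post->updated_at)
        ));

    $file = "sitemaps/sitemap-{$month}.xml";
    $sitemap->writeToDisk('public', $file); // swap in an S3 disk if preferred
    $index->add("https://clientwebsite.org/{$file}");
}

$index->writeToFile(public_path('sitemap.xml'));
```

When a post changes later, you regenerate only the file for its created_at month and rewrite the small index, instead of rebuilding every sitemap.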
-
Hello,
I'm currently working with a client who is using a rather modest EC2 instance (t2.medium) with 2 vCPUs and 4 GiB of RAM. We ran into a problem while trying to update their sitemaps: one sitemap per locale, each on its own subdomain. My initial approach was to use laravel-sitemap directly through Laravel's Artisan Tinker, without any special configuration or dedicated controller methods/routes for the task.
Upon executing

```php
\Spatie\Sitemap\SitemapGenerator::create('https://clientwebsite.org')->getSitemap()->writeToDisk('public', 'sitemap-en.xml');
```

in Artisan's Tinker, the website crashed within 10 seconds, resulting in multiple 504 timeout errors. Monitoring the process with `htop`, I observed a spike in CPU usage that confirmed the strain on the server's resources.

Does anyone have recommendations or workarounds for generating sitemaps in such an environment without overwhelming the server? I already tried increasing the number of nginx workers, with no improvement.
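One likely factor: `SitemapGenerator::create()` crawls the site over HTTP, so the t2.medium is simultaneously running the crawler and serving every request it makes, which would explain the 504s. If you stay with the crawler rather than building the sitemap from the database, it can be throttled via the `configureCrawler` hook. A hedged sketch (the concurrency and delay values are guesses to tune for your box):

```php
use Spatie\Crawler\Crawler;
use Spatie\Sitemap\SitemapGenerator;

// Throttle the underlying spatie/crawler so it doesn't hammer the server.
SitemapGenerator::create('https://clientwebsite.org')
    ->configureCrawler(function (Crawler $crawler) {
        $crawler
            ->setConcurrency(1)              // default is 10 concurrent requests
            ->setDelayBetweenRequests(250);  // milliseconds between requests
    })
    ->getSitemap()
    ->writeToDisk('public', 'sitemap-en.xml');
```

Running this from a queued job or a scheduled artisan command, rather than an interactive Tinker session, also keeps the work off the request path.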