Allowed Memory Size Exhausted #1071
Hello! While trying to create a volume via the API, we get an "Allowed memory size exhausted" error in the admin logs and the volume fails to be created. The volume uses an S3 storage disk and the images are in a folder with a lot of images (100,000). Do you have an explanation for this and/or a workaround? Thanks a lot!
Can you please share the full stack trace? The problem is that the PHP application consumes too much memory because it tries to store the full directory listing of the 100k files at once. Usually something like this can be optimized with lazy loading and/or looping over the data in chunks, but I need the full stack trace to see where exactly this happens. As a workaround, you can first create the volume with only a subset of the files (e.g. 10k). Once the volume exists, you can add the remaining images in 10k chunks.
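The chunked workaround described above can be sketched generically. This is a minimal Python illustration (not the application's actual API or code) of the difference between materializing a full 100k-entry listing and consuming it lazily in fixed-size chunks; the filenames are made up for the example:

```python
def chunked(iterable, size):
    """Yield successive lists of at most `size` items,
    never holding more than one chunk in memory at a time."""
    chunk = []
    for item in iterable:
        chunk.append(item)
        if len(chunk) == size:
            yield chunk
            chunk = []
    if chunk:  # emit the final partial chunk, if any
        yield chunk

# A generator stands in for a lazy directory listing of 100k images.
filenames = (f"image_{i:06d}.jpg" for i in range(100_000))

# Each 10k chunk could then be submitted as one API request.
chunk_sizes = [len(c) for c in chunked(filenames, 10_000)]
# chunk_sizes == [10000] * 10
```

The same idea applies on the server side: iterating a storage listing lazily (e.g. with a generator or paginated listing call) keeps memory usage bounded by the chunk size instead of the total number of files.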
Thanks. While this makes it hard to pin down the exact source of the error, I had a suspicion. I've released a code change in v3.94.3 that could fix the issue. Please update your instance (once the updated Docker images are built) and try again. Also mind the new migrations required by v3.94.2.