Throttling sometimes results in cURL error 6 #50
Unanswered · readyonelabs asked this question in Q&A · 0 replies
Thanks for this great package for implementing Redis-backed rate limiting on Laravel queued jobs more elegantly. I recently started using it to throttle the processing of various queued jobs on a per-user basis. Each of these jobs makes at least one HTTP call to an external API endpoint. The endpoint imposes its own rate limit (2 calls per 1 second), so the throttle (via the RateLimited middleware) attempts to keep job processing below the remote service's limit.
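The post doesn't include the actual setup, but per-user throttling with a RateLimited job middleware typically looks something like the following minimal sketch, using Laravel's built-in rate limiter API. The limiter name `external-api`, the job class, and the `$this->user` property are all hypothetical placeholders, not taken from the discussion:

```php
<?php

use Illuminate\Cache\RateLimiting\Limit;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\Middleware\RateLimited;
use Illuminate\Support\Facades\RateLimiter;

// In a service provider's boot() method: define a named limiter that
// keys its buckets per user, matching the remote API's 2-per-second cap.
RateLimiter::for('external-api', function (object $job) {
    return Limit::perSecond(2)->by($job->user->id);
});

// In the queued job class: attach the middleware so the queue worker
// releases the job back onto the queue when the limit is exceeded.
class CallExternalApi implements ShouldQueue
{
    public object $user; // hypothetical: the user whose quota this job consumes

    public function middleware(): array
    {
        return [new RateLimited('external-api')];
    }
}
```

The package's own middleware may take different constructor arguments, but the shape (a named limiter plus a `middleware()` entry on the job) is the same.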
However, especially when there are a lot of jobs, they repeatedly fail (during the retryUntil period) with an error like this:
GuzzleHttp\Exception\ConnectException(code: 0): cURL error 6: getaddrinfo() thread failed to start
Here are the error lines from the stack trace relating to this package:
If I disable the RateLimited middleware, all of the jobs are processed by the queue without error, even though some get rate limited by the remote service; that case is handled in my code via a small sleep/retry loop.
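The sleep/retry loop itself isn't shown in the post; a minimal sketch of that pattern with Guzzle might look like this (the helper name, URL parameter, and retry counts are assumptions for illustration):

```php
<?php

use GuzzleHttp\Client;
use GuzzleHttp\Exception\ClientException;
use Psr\Http\Message\ResponseInterface;

// Hypothetical helper: retry a request a few times when the remote
// service answers 429 Too Many Requests, sleeping between attempts.
function requestWithRetry(Client $client, string $url, int $maxAttempts = 3): ResponseInterface
{
    for ($attempt = 1; ; $attempt++) {
        try {
            return $client->get($url);
        } catch (ClientException $e) {
            // Rethrow unless it is a rate-limit response and attempts remain.
            if ($e->getResponse()->getStatusCode() !== 429 || $attempt >= $maxAttempts) {
                throw $e;
            }
            usleep(500_000); // back off for half a second before retrying
        }
    }
}
```

Note this only catches HTTP-level 429 responses; a `ConnectException` like the cURL error 6 above is thrown before any response exists, so it bypasses a loop of this shape.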
My implementation is as follows:
Do you have any ideas why rate limiting, or my implementation of it, would seemingly exhaust the system's ability to resolve hosts and/or "break" cURL? Are there better ways to implement job throttling? Thank you very kindly in advance.