Default to a 1 second backoff when hitting 429s #142
Merged
When importing or deleting a large number of entries, we hit the general API rate limit.
Using a worker pool improves our throughput, but we still see the odd failure where the 10 routines contend with one another enough that a request fails even after exhausting its retries (we tried up to 10 retries and still saw issues).
The fix that worked here was to enforce a minimum backoff of 1 second: any computed backoff that is smaller (or even negative) is now clamped up to 1 second.
With this change, our submission rate peaks at several hundred requests per second until we hit the per-minute API rate limit, after which we're at the mercy of the token bucket refill - effectively doing the maximum we can as the limit refills.
This slows things down a little, but in theory it's the best we can do under the rate limit, and we're a better citizen when talking to the API.