Answer from zamp on Stack Overflow:
After hours of searching and thinking, I found out that 'User Rate Limit Exceeded' is spam protection that allows a maximum of 10 requests per second.
So I used a lazy trick to work around it by delaying the calls with:
usleep(rand(1000000, 2000000));
It simply delays the call by a random duration between one and two seconds.
There are several types of quotas with Google APIs.
Project-based quotas affect your project itself. These quotas can be extended: if, for example, your project can make 10,000 requests per 100 seconds, you can request that this limit be raised.
Then there are user-based quotas. These quotas limit how much each individual user can send.
User Rate Limit Exceeded
This means that you are hitting a user rate quota. User rate quotas are flood protection: they ensure that a single user of your application cannot make too many requests at once.
These quotas cannot be extended.
If you are hitting a user rate quota, you need to slow down your application and implement exponential backoff.
How you implement exponential backoff is up to you and the language you are using, but it basically involves retrying the same request and adding a longer wait time each time it fails, as sketched below.
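Here is a minimal sketch of that idea in Node.js. makeRequest is a placeholder for whatever API call you are wrapping, and the retry count, delays, and error-shape check are assumptions to adapt to your own client library:

async function withBackoff(makeRequest, maxRetries = 5) {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await makeRequest();
    } catch (err) {
      // The error shape follows the 403 JSON shown further down;
      // adjust this check for your client library.
      const reason = err.errors && err.errors[0] && err.errors[0].reason;
      const retryable = err.code === 403 && reason === 'userRateLimitExceeded';
      if (!retryable || attempt === maxRetries) throw err;
      // Wait 1s, 2s, 4s, 8s, ... plus a little random jitter.
      const delayMs = Math.pow(2, attempt) * 1000 + Math.random() * 250;
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}

// Usage (drive being a googleapis Drive client):
// await withBackoff(() => drive.files.list({ pageSize: 100 }));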
The graph
The graph in the Google Cloud console is a guesstimate, and it is not by any means accurate. If you are getting the error message, you should go by that and not by what the graph says.
403: User Rate Limit Exceeded is flood protection. A user can only make so many requests at a time. Unfortunately, the user rate limit is not shown in the graph you are looking at; that graph is actually really bad at showing what is truly happening. Google checks in the background and kicks back the error if you are exceeding your limit; they are not required to actually show that in the graph.
403: User Rate Limit Exceeded
The per-user limit has been reached. This may be the limit from the Developer Console or a limit from the Drive backend.
{ "error": { "errors": [ { "domain": "usageLimits", "reason": "userRateLimitExceeded", "message": "User Rate Limit Exceeded" } ], "code": 403, "message": "User Rate Limit Exceeded" } }
Suggested actions:
- Raise the per-user quota in the Developer Console project.
- If one user is making a lot of requests on behalf of many users of a G Suite domain, consider a Service Account with authority delegation, setting the quotaUser parameter (see the sketch after this list).
- Use exponential backoff.
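As a rough illustration of the quotaUser suggestion: quotaUser is a standard Google API system parameter that attributes a request to an end user for quota accounting. The endpoint is the real Drive v3 files list URL, but the token and user ID below are placeholders, not values from this thread:

// Sketch (Node 18+ global fetch): count this request against the quota of
// 'end-user-123' instead of the single delegated account.
// accessToken and 'end-user-123' are placeholders.
async function listFilesAs(accessToken) {
  const res = await fetch(
    'https://www.googleapis.com/drive/v3/files?quotaUser=end-user-123',
    { headers: { Authorization: 'Bearer ' + accessToken } }
  );
  return res.json();
}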
IMO the main thing to do when you begin to encounter this error message is to implement exponential backoff; that way your application will be able to slow down and make the request again.
In my case, I was recursing through Google Drive folders in parallel and getting this error. I solved the problem by implementing client-side rate limiting using the Bottleneck library with a 110ms delay between requests:
const Bottleneck = require('bottleneck');

const limiter = new Bottleneck({
// Google allows 1000 requests per 100 seconds per user,
// which is 100ms per request on average. Adding a delay
// of 100ms still triggers "rate limit exceeded" errors,
// so going with 110ms.
minTime: 110,
});
// Wrap every API request with the rate limiter
await limiter.schedule(() => drive.files.list({
// Params...
}));
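For completeness, the drive object used above is assumed to be the official googleapis Node client; a rough setup sketch, with the OAuth values as placeholders that depend on your own auth flow:

// Sketch of the assumed client setup; CLIENT_ID, CLIENT_SECRET,
// REDIRECT_URI and REFRESH_TOKEN are placeholders.
const { google } = require('googleapis');

const auth = new google.auth.OAuth2(CLIENT_ID, CLIENT_SECRET, REDIRECT_URI);
auth.setCredentials({ refresh_token: REFRESH_TOKEN });

const drive = google.drive({ version: 'v3', auth });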
Hey all, I’ve been facing a persistent issue using the Google Docs API through Make and I’m hoping someone else has been through this.
Last Friday (3 days ago), my scenario failed with the error [403] User rate limit exceeded.
It uses the “Create Document from Template” module, which replaces some placeholders (e.g. {{refNo}}, {{name}}, etc.). The document is automatically saved in Google Drive.
The scenario ran twice: the first run was successful, but the second run encountered the error [403] User rate limit exceeded.
I waited the entire weekend without running anything, hoping that any relevant quotas imposed by Google would reset over time.
I did a simple test scenario today (Monday) and it still throws the same error [403] User rate limit exceeded when I do anything involving Google Docs (Create Document, Get Content, Download file, etc.). I can use the Google Sheets and Google Drive modules (except downloading a Google Doc) without any errors. I also checked the Google Cloud Platform quotas section, but:
I didn’t see anything that looked like I exceeded any limits (everything is at 0% usage),
… or maybe I just don’t know where to look as I am not very familiar with GCP.
Could my account have been silently throttled, or is there something else I might have missed? Any help or experience would be appreciated. Thanks!
Edit:
I tried looking at the API page but it shows that my usage is at 0%.
(Although I'm not sure if this is the correct page)
[Screenshot: Google Cloud Platform API usage page]