In today’s newsletter we are going to discuss:
What is rate limiting
How it works
Pros and cons
Code implementation
Adding middleware
Types of algorithms
How to verify rate limiting
What is rate limiting
Rate limiting is a technique used to control the amount of incoming and outgoing traffic to a network or service.
It is often used to protect servers and other resources from being overwhelmed by too many requests, or to prevent abuses such as distributed denial of service (DDoS) attacks.
How it works
It works by setting a limit on the number of requests that a client can make to a server within a specified time period.
If the client exceeds the rate limit, the server will return an error, typically an HTTP status code 429 (Too Many Requests), to the client.
Pros and cons
Benefits:
Improving user experience
Maintaining service availability
Reducing resource consumption
Detecting & blocking malicious behavior
Protecting against certain types of denial-of-service attacks
Cons:
Basic rate limiting cannot distinguish between good and bad traffic; it typically keys on the client IP address and request count, so an attacker who rotates IP addresses may still get through.
How to add rate limiting middleware in .NET 7
It is quite simple: the middleware ships with the framework, so there is no NuGet package to install. Register the services and add the middleware to the pipeline:
builder.Services.AddRateLimiter(options => { /* configure policies here */ });
app.UseRateLimiter();
This middleware lets us choose between four built-in algorithms, depending on our requirements:
Fixed window
Sliding window
Token bucket
Concurrency
Let’s see each one in depth:
1/ Fixed Window Rate Limiting
This policy says, "allow at most X requests per Y time frame." For example, it might allow 5 requests per 10 seconds; any further requests within that window are rejected, and the counter resets when a new window begins.
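As a minimal sketch, a fixed-window policy with those numbers could be registered like this (the policy name "FiveRequestsPerTenSeconds" is illustrative):

```csharp
using System.Threading.RateLimiting;
using Microsoft.AspNetCore.RateLimiting;

builder.Services.AddRateLimiter(options =>
    options.AddFixedWindowLimiter("FiveRequestsPerTenSeconds", opt =>
    {
        opt.PermitLimit = 5;                    // at most 5 requests...
        opt.Window = TimeSpan.FromSeconds(10);  // ...per 10-second window
        opt.QueueLimit = 0;                     // reject excess requests instead of queueing them
    }));
```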
2/ Sliding Window Limiter
A sliding window algorithm helps control the number of requests allowed in a specific time frame, like X requests in Y seconds.
For example, to allow 5 requests per 10 seconds, the 10-second window might be divided into 2 segments of 5 seconds each. The limit of 5 applies to the sliding window as a whole: as the oldest segment expires, the requests counted against it are released and that capacity becomes available again. This smooths out the bursts a fixed window permits at its boundaries, while still ensuring you never exceed 5 requests in any 10-second span.
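A sliding-window policy matching the example above could be configured like this (policy name is illustrative):

```csharp
using System.Threading.RateLimiting;
using Microsoft.AspNetCore.RateLimiting;

builder.Services.AddRateLimiter(options =>
    options.AddSlidingWindowLimiter("SlidingFiveRequestsPerTenSeconds", opt =>
    {
        opt.PermitLimit = 5;                    // 5 requests per full window
        opt.Window = TimeSpan.FromSeconds(10);  // 10-second window...
        opt.SegmentsPerWindow = 2;              // ...split into two 5-second segments
    }));
```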
3/ Token Bucket Limiter
The token bucket limiter, like the sliding window limiter, controls requests over time. It adds a fixed number of tokens at set intervals (e.g., 10 tokens every 10 seconds) to a bucket with a limit (e.g., 100 tokens). You can make requests by spending tokens from the bucket, ensuring you don't exceed the token limit.
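A token-bucket policy with the numbers from the text could look like this sketch (policy name is illustrative):

```csharp
using System.Threading.RateLimiting;
using Microsoft.AspNetCore.RateLimiting;

builder.Services.AddRateLimiter(options =>
    options.AddTokenBucketLimiter("BucketPolicy", opt =>
    {
        opt.TokenLimit = 100;                                // bucket holds at most 100 tokens
        opt.TokensPerPeriod = 10;                            // add 10 tokens...
        opt.ReplenishmentPeriod = TimeSpan.FromSeconds(10);  // ...every 10 seconds
    }));
```

Because unused tokens accumulate up to the bucket limit, this algorithm tolerates short bursts better than a fixed or sliding window.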
4/ Concurrency Limiter
The concurrency limiter restricts how many requests can happen simultaneously. For every request, it reduces the allowed concurrency by one. When a request finishes, the concurrency limit goes up by one.
Unlike other request limiters that restrict the total requests in a given time frame, this limiter focuses solely on how many requests can occur at the same time, without specifying a cap on requests over time.
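A concurrency policy could be sketched like this (the numbers and policy name are illustrative):

```csharp
using System.Threading.RateLimiting;
using Microsoft.AspNetCore.RateLimiting;

builder.Services.AddRateLimiter(options =>
    options.AddConcurrencyLimiter("ConcurrencyPolicy", opt =>
    {
        opt.PermitLimit = 10;  // at most 10 requests in flight at once
        opt.QueueLimit = 5;    // up to 5 more wait for a permit to free up
    }));
```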
The policy name can be any string. I prefer to keep policy names and rate limiting options in appsettings.json and read them from configuration.
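One way this could look is the sketch below; the "RateLimiting" section name and its keys are hypothetical, not a built-in convention:

```csharp
using System.Threading.RateLimiting;
using Microsoft.AspNetCore.RateLimiting;

// Hypothetical appsettings.json section:
// "RateLimiting": { "PolicyName": "FiveRequestsPerTenSeconds", "PermitLimit": 5, "WindowSeconds": 10 }
var section = builder.Configuration.GetSection("RateLimiting");
var policyName = section["PolicyName"] ?? "FiveRequestsPerTenSeconds";
var permitLimit = section.GetValue("PermitLimit", 5);
var windowSeconds = section.GetValue("WindowSeconds", 10);

builder.Services.AddRateLimiter(options =>
    options.AddFixedWindowLimiter(policyName, opt =>
    {
        opt.PermitLimit = permitLimit;
        opt.Window = TimeSpan.FromSeconds(windowSeconds);
    }));
```

Keeping the numbers in configuration lets you tune limits per environment without recompiling.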
How to Verify Rate Limiting
We have the following attributes:
EnableRateLimiting
DisableRateLimiting
Using EnableRateLimiting, we can apply rate limiting at the controller level or the action level, like this:
[EnableRateLimiting("FiveRequestsPerTenSecond")]
If rate limiting is applied at the controller level but we want to disable it for certain actions, we can add the [DisableRateLimiting] attribute to those actions.
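Put together, a controller might look like this sketch (the controller and action names are made up for illustration; the policy name matches the attribute example above):

```csharp
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.RateLimiting;

[ApiController]
[Route("api/[controller]")]
[EnableRateLimiting("FiveRequestsPerTenSecond")]  // applies to every action in the controller
public class OrdersController : ControllerBase
{
    [HttpGet]
    public IActionResult GetOrders() => Ok();     // limited by the controller-level policy

    [HttpGet("health")]
    [DisableRateLimiting]                         // opts this action out of the policy
    public IActionResult Health() => Ok();
}
```

To verify the behavior, send more requests than the policy allows within the window (for example with curl in a loop) and confirm the extra requests receive HTTP 429.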
Whenever you’re ready, there are 3 ways I can help you
Promote yourself to 7600+ subscribers by Sponsoring my Newsletter
Become a Patron and get access to 140+ .NET Questions and Answers
Get a FREE eBook from Gumroad that contains 30 .NET Tips (2900+ downloads)
Special Offers 📢
Ultimate ASP.NET Core Web API Second Edition - Premium Package
10% off with discount code: 9s6nuez