Amazon Web Services
docs.aws.amazon.com › aws lambda › developer guide › understanding lambda function scaling › configuring reserved concurrency for a function
Configuring reserved concurrency for a function - AWS Lambda
Reserving concurrency for a function impacts the concurrency pool that's available to other functions.
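For concreteness, a minimal sketch of setting reserved concurrency from code, assuming boto3; the function name and the limit of 100 are placeholder assumptions:

```python
# Minimal sketch: setting reserved concurrency with boto3.
# "my-function" and the limit of 100 are placeholder assumptions.
import boto3

lambda_client = boto3.client("lambda")

# Reserve 100 concurrent executions for this function. These 100 executions
# are subtracted from the unreserved pool shared by other functions.
lambda_client.put_function_concurrency(
    FunctionName="my-function",
    ReservedConcurrentExecutions=100,
)
```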
Amazon Web Services
docs.aws.amazon.com › aws lambda › developer guide › understanding lambda function scaling
Understanding Lambda function scaling - AWS Lambda
Both reserved concurrency and provisioned concurrency count towards your account concurrency limit and Regional quotas. In other words, allocating reserved and provisioned concurrency can impact the concurrency pool that's available to other functions. Configuring provisioned concurrency incurs ...
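As a quick way to see how that shared pool is affected, a hedged sketch that reads the account-level limits via boto3; the printed fields come from the GetAccountSettings response:

```python
# Minimal sketch: checking the account-level concurrency quota and the
# unreserved pool left over after reserved concurrency is allocated.
import boto3

lambda_client = boto3.client("lambda")

settings = lambda_client.get_account_settings()
limits = settings["AccountLimit"]
print("Total account concurrency:", limits["ConcurrentExecutions"])
print("Unreserved concurrency:", limits["UnreservedConcurrentExecutions"])
```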
Videos
AWS Lambda Concurrency Explained - YouTube (13:09)
AWS Lambda Concurrency Explained | Reserved vs Provisioned ... (02:33)
AWS Lambda Concurrency - Provisional & Reserved - YouTube (13:02)
How does AWS Lambda Concurrency Work? - YouTube (17:51)
How to achieve Concurrency Control in AWS Lambda - YouTube (08:23)
Serverless
serverless.com › blog › aws-lambda-provisioned-concurrency
Provisioned Concurrency: What it is and how to use it with the Serverless Framework
It was a little difficult to exactly control how many warm instances you wanted simultaneously and you then had to execute the Lambda you wanted with some kind of branching logic that determined whether this was a warm up execution or an actual execution. It was rather ugly. But it helped folks step past the cold start issues to some degree. However, AWS has now launched Provisioned Concurrency as a feature.
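The Serverless Framework wraps this feature in configuration, but the underlying Lambda API call can be sketched with boto3; the function name, alias, and count below are placeholder assumptions:

```python
# Minimal sketch: enabling provisioned concurrency on a published alias.
# "my-function", the "live" alias, and the count of 5 are placeholder
# assumptions; provisioned concurrency targets a version or alias, not $LATEST.
import boto3

lambda_client = boto3.client("lambda")

lambda_client.put_provisioned_concurrency_config(
    FunctionName="my-function",
    Qualifier="live",                    # alias (or version) to keep warm
    ProvisionedConcurrentExecutions=5,
)
```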
Amazon Web Services
docs.aws.amazon.com › aws lambda › developer guide › understanding lambda function scaling › configuring provisioned concurrency for a function
Configuring provisioned concurrency for a function - AWS Lambda
For functions using provisioned concurrency, Lambda runs any initialization code, such as loading libraries and instantiating clients, during allocation time. Therefore, it's advisable to move as much initialization outside of the main function handler to avoid impacting latency during actual function invocations...
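A minimal Python sketch of that advice, with the S3 client and bucket name as placeholder assumptions: initialization happens at module scope, so with provisioned concurrency it runs during allocation rather than on the invocation path.

```python
# Minimal sketch: keep expensive initialization outside the handler.
# The S3 client and bucket name are placeholder assumptions.
import boto3

# Runs once per execution environment; with provisioned concurrency this
# happens during allocation, before any request arrives.
s3 = boto3.client("s3")

def handler(event, context):
    # Only per-request work happens here.
    key = event.get("key", "example.txt")
    obj = s3.get_object(Bucket="my-bucket", Key=key)
    return {"size": obj["ContentLength"]}
```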
Adveng
adveng.com › home › understanding aws lambda concurrency
Understanding AWS Lambda Concurrency | Advanced Engineering, Inc.
August 10, 2023 - AWS Lambda stands as a powerful tool for executing code without the burden of server provisioning or management. However, as applications scale and traffic surges, managing the concurrent execution of Lambda functions becomes critical. Running Lambda at scale is a balancing act between the ...
Caylent
caylent.com › blog › use-concurrency-on-aws-lambda-to-maximize-performance
Use Concurrency on AWS Lambda to Maximize Performance | Caylent
The best-case scenario for achieving concurrency from the post office example is exactly how Lambda operates: multiple requests for a Lambda function can execute multiple instances of that single function in parallel. For example, as shown in the image below, if there are multiple requests for the Lambda function, AWS will automatically provision new environments to execute the required number of concurrent Lambda functions.
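To make that concrete, a hedged client-side sketch that sends several requests to the same function in parallel; each in-flight request is served by its own execution environment (the function name is a placeholder assumption):

```python
# Minimal sketch: issuing several requests to the same function in parallel.
# Lambda serves each in-flight request from its own execution environment,
# creating new ones if none are idle. "my-function" is a placeholder.
import json
from concurrent.futures import ThreadPoolExecutor

import boto3

lambda_client = boto3.client("lambda")

def invoke(i):
    response = lambda_client.invoke(
        FunctionName="my-function",
        Payload=json.dumps({"request_id": i}),
    )
    return response["StatusCode"]

with ThreadPoolExecutor(max_workers=10) as pool:
    print(list(pool.map(invoke, range(10))))
```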
AWS
aws.amazon.com › blogs › compute › investigating-spikes-in-aws-lambda-function-concurrency
Investigating spikes in AWS Lambda function concurrency | Amazon Web Services
January 25, 2023 - As mentioned in an earlier post, a key benefit of serverless applications is the ease with which they can scale to meet traffic demands or requests. AWS Lambda is at the core of this platform. Although this flexibility is hugely beneficial for our customers, sometimes an errant bit of code or upstream scaling can lead to spikes in concurrency.
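One hedged way to investigate such a spike is to pull the AWS/Lambda ConcurrentExecutions metric from CloudWatch, sketched below for the last hour:

```python
# Minimal sketch: pulling the ConcurrentExecutions metric to spot a spike.
# Queried without dimensions this is the account-wide aggregate; a
# FunctionName dimension narrows it where the per-function metric is emitted.
from datetime import datetime, timedelta, timezone

import boto3

cloudwatch = boto3.client("cloudwatch")

now = datetime.now(timezone.utc)
stats = cloudwatch.get_metric_statistics(
    Namespace="AWS/Lambda",
    MetricName="ConcurrentExecutions",
    StartTime=now - timedelta(hours=1),
    EndTime=now,
    Period=60,                # one-minute resolution
    Statistics=["Maximum"],
)
for point in sorted(stats["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"], point["Maximum"])
```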
Top answer 1 of 2 · 4 votes
First of all, **YES, you're right**: you should think about your Lambda function as "single request = single thread-safe, stateless invocation".
This paradigm forces us, as engineers, to detach compute from data (state), scale each independently, and avoid shared state and side effects, which ultimately lets us reach a high level of parallelism while avoiding the hard-to-debug pitfalls of parallel programming.
Regarding the second half of your question: the real beauty of AWS Lambda is that it lets you move away from thinking about low-level concepts such as CPU utilization and IO wait time, and focus only on the "business logic" and what exactly you want to achieve with your code. (And of course, internally, AWS Lambda does quite extensive under-the-hood optimization to avoid wasting resources.)
So technically you can run your own event loop inside a single Lambda invocation and handle multiple requests within it. However, I would call that an anti-pattern, and maybe a sign that you might not need Lambda here at all.
I would recommend simply giving it a try and not thinking too much about optimizing the underlying resources. Also, if I am not mistaken, the default account-level concurrency quota is 1,000 concurrent executions, so that should be enough to experiment with high traffic.
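As an illustration of the "single request = single thread-safe stateless invocation" framing above, a minimal sketch that keeps state in an external store rather than in the function; the DynamoDB table name and key schema are placeholder assumptions:

```python
# Minimal sketch: a stateless handler that keeps all state in an external
# store instead of module-level mutable variables. The "counters" table and
# its key schema are placeholder assumptions.
import boto3

# Client creation is safe at module scope: it is read-only shared setup,
# not mutable request state.
table = boto3.resource("dynamodb").Table("counters")

def handler(event, context):
    # Each invocation handles exactly one request and touches only its own data.
    counter_id = event["counter_id"]
    result = table.update_item(
        Key={"id": counter_id},
        UpdateExpression="ADD hits :one",
        ExpressionAttributeValues={":one": 1},
        ReturnValues="UPDATED_NEW",
    )
    return {"hits": int(result["Attributes"]["hits"])}
```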
Answer 2 of 2 · 0 votes
Lambda functions run in execution environments, with each instance in its own environment. Each instance can handle a single request at a time; there is no way to route multiple requests to a single instance at the same time.
That said, there are event sources that support batching, e.g., SQS, Kinesis Data Streams, etc. For those event sources you can configure the function to be invoked with multiple events and handle them within a single invocation, which makes better use of the CPU and reduces cost for IO-intensive workloads. If your workload can be asynchronous, consider using this method (e.g., if the request is received from API Gateway, don't invoke the function directly but rather send it to SQS and then invoke the function using batching).
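A minimal sketch of such a batch handler for SQS events; the message body format and the per-message processing step are placeholder assumptions:

```python
# Minimal sketch: handling a batch of SQS messages in one invocation.
# The JSON body format and process_order() are placeholder assumptions.
import json

def process_order(order):
    # Placeholder for the real per-message work.
    print("processing", order)

def handler(event, context):
    # SQS delivers up to the configured batch size in event["Records"].
    for record in event["Records"]:
        order = json.loads(record["body"])
        process_order(order)
    # An empty batchItemFailures list reports that every message succeeded
    # (used when partial batch responses are enabled on the event source mapping).
    return {"batchItemFailures": []}
```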