I'm working on an open source CLI / Python SDK called LaunchFlow that sets up the exact automation you described! All you need to do is supply a requirements.txt for the Python dependencies, and the tool will zip up your local workspace + Python dependencies and release it to your Lambda function.
The code looks like this:
import launchflow as lf
import numpy as np

def handler(event, context):
    result = np.random.randint(0, 100)
    return {
        "statusCode": 200,
        "body": f"Random Number: {result}",
    }

api = lf.aws.LambdaService(
    name="my-lambda-api",
    handler=handler,
    runtime=lf.aws.lambda_service.PythonRuntime(
        requirements_txt_path="requirements.txt"
    ),
)
LambdaService Docs link
Just pip install launchflow[aws] + lf deploy to set up the automation with your local AWS credentials.
If you'd rather create your own automation, you could look at the LambdaService source code to see how we automate the Lambda build / release steps with boto3.
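If you do roll your own automation, the core loop is: zip the workspace plus the folder you pip-installed dependencies into (files at the zip root), then push the bytes with boto3's update_function_code. A minimal sketch, assuming the function already exists and the directory paths are yours:

```python
import os
import zipfile

def build_package(src_dirs, out_path):
    """Zip one or more directories (workspace + installed deps) into a
    flat archive with files at the zip root, as Lambda expects."""
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for src in src_dirs:
            for root, _dirs, files in os.walk(src):
                for name in files:
                    full = os.path.join(root, name)
                    rel = os.path.relpath(full, src).replace(os.sep, "/")
                    zf.write(full, rel)
    return out_path

def release(function_name, zip_path):
    """Push the packaged zip to an existing Lambda function."""
    import boto3  # deferred so build_package works without AWS deps installed
    client = boto3.client("lambda")
    with open(zip_path, "rb") as f:
        client.update_function_code(FunctionName=function_name, ZipFile=f.read())
```

Usage would be something like build_package(["src", "deps"], "package.zip") followed by release("my-lambda-api", "package.zip"), where "deps" is the folder you ran pip install -r requirements.txt -t deps into.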
Answer from Joshua Tanke on Stack Overflow.
For the git piece: I'd recommend just checking in your main Python file and the related requirements.txt file(s).
The two paths that I've used for easier Lambda deployments are Serverless and Terraform. I'll outline them briefly (there's also Amazon's CDK but I am less familiar with it).
- Serverless is basically a wrapper around Amazon's Infrastructure as Code tool (CloudFormation): a set of JSON or YAML files that define your Lambda and any associated resources. Here's a decent example of your setup, i.e. a git-tracked handler.py, a requirements.txt file, and a serverless.yml to deploy the Lambda.
- Terraform is a more general-purpose Infrastructure as Code tool that has plugins for each major cloud provider. The best example I could dig up was a simplified repo.
Terraform's main advantage is change tracking in a more generic way, but Serverless is, I think, simpler if you're not familiar with either one. (Serverless uses CloudFormation stacks for tracking changes, which are a bit funkier than Terraform's state management.)
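As a rough sketch of what that serverless.yml might contain (service name, region, and runtime are placeholders, not anything specific to your project; the serverless-python-requirements plugin is one common way to bundle requirements.txt):

```yaml
# sketch: minimal serverless.yml for a single Python Lambda
service: my-lambda-api

provider:
  name: aws
  runtime: python3.9
  region: us-east-1        # placeholder

functions:
  api:
    handler: handler.handler   # handler.py -> def handler(event, context)

plugins:
  - serverless-python-requirements   # packages requirements.txt into the zip
```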
Ideally, you'd want these things to run in a pipeline on merge, so that your Lambda is updated with new code changes after a code review and merge into a git repo. It might look like:
- Someone submits a Pull Request to your Git repository
- The request is reviewed and merged
- Before (or on) the merge, some tests run to verify the code
- You run the Serverless/Terraform deploy to update the Lambda
- Do some final, end-to-end verification on the new deployment
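The steps above could be wired up in CI; a rough GitHub Actions sketch (workflow name, branch, test command, and secret names are all placeholder assumptions):

```yaml
# sketch: deploy the Lambda on merge to main (names/secrets are placeholders)
name: deploy-lambda
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.9"
      - run: pip install -r requirements.txt
      - run: python -m pytest            # verify the code
      - run: npx serverless deploy       # or: terraform apply -auto-approve
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
```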
Let me know if you need me to clarify anything!
pip install python-lambda
To use any 3rd-party library in Lambda, you can use a Lambda layer.
Install the dependency into a python/ folder (Lambda expects Python layer packages under a top-level python/ directory in the zip):
pip3 install <your_package> -t python/
Zip the package:
zip -r your_pkg_layer.zip python
Create the layer in AWS and upload the zip; after that, add the layer to your Lambda function.
You can follow this blog on Medium.
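As a sanity check on that layout, here's a small Python sketch (directory names are placeholders) that zips an installed-dependencies folder under the python/ prefix Lambda expects for Python layers:

```python
import os
import zipfile

def build_layer_zip(deps_dir: str, out_path: str) -> str:
    """Zip an installed-dependencies folder so every entry sits under
    the python/ prefix that Lambda requires for Python layers."""
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(deps_dir):
            for name in files:
                full = os.path.join(root, name)
                rel = os.path.relpath(full, deps_dir).replace(os.sep, "/")
                zf.write(full, "python/" + rel)  # prefix every entry
    return out_path
```

Typical usage: pip3 install <your_package> -t ./deps, then build_layer_zip("deps", "your_pkg_layer.zip"), and upload the result as the layer.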
I recommend that you look at AWS SAM
AWS SAM is an extension of CloudFormation that simplifies the development of serverless applications.
To deploy an AWS Lambda function using the AWS Serverless Application Model (SAM), you need to follow these steps:
Create a SAM template: This is a YAML file that defines the AWS resources you want to deploy, including the Lambda function and its dependencies.
Package the function: Package the function code and any dependencies into a .zip file. (For this you'll need a requirements.txt file with all the dependencies your code needs)
Deploy the function: Use the sam deploy command to deploy the SAM template and function code to AWS. The command creates or updates a CloudFormation stack, which in turn creates or updates the specified AWS resources.
Example SAM template:
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  MyFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: main.handler
      Runtime: python3.8
      CodeUri: .
      Description: "This is my SAM function"
      MemorySize: 128
      Timeout: 3
Example AWS SAM CLI commands:
sam build --debug --template-file template.yaml
sam package --s3-bucket your_s3_bucket --s3-prefix sam-deployments \
--output-template-file packaged-template.yaml
sam deploy -t packaged-template.yaml --stack-name your_stack_name \
--region your_aws_region --capabilities CAPABILITY_IAM
A common folder structure for a Lambda project using AWS Serverless Application Model (SAM) would look something like this:
my-lambda-project/
├── main.py # Lambda function code
├── template.yaml # SAM template
├── requirements.txt # Python dependencies
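For completeness, a minimal main.py matching the Handler: main.handler setting above might look like this (a sketch; the response shape follows the usual Lambda proxy-integration convention, and the message body is made up):

```python
import json

def handler(event, context):
    """Minimal Lambda entry point matching `Handler: main.handler`."""
    return {
        "statusCode": 200,
        "body": json.dumps({"message": "hello from SAM"}),
    }
```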
I am dynamically creating Lambda functions based on different requirements.txt files and code supplied.
Sometimes the requirements.txt changes and sometimes the code supplied changes.
Currently I have a disgusting process of creating a venv, pip installing everything to a folder, zipping it up, sending it to S3, and providing that S3 object zip file to my Lambda for creation.
It's really gross, especially if I want to modify the code: I download it, unzip it, compare, then zip it again and merge.
There HAS to be a better way. How are folks doing it?
EDIT: Wrote a quick script to test containerizing the code and using ECR, then sending the image URL to Lambda: https://github.com/esteininger/aws-docker-ecr-lambda/blob/main/ecr.py
Unfortunately, it's still much slower than zipping locally.
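One way to cut out the download-unzip-compare step entirely (a sketch, assuming boto3 and an already-created function): Lambda exposes a CodeSha256 for the deployed package, which is the base64-encoded SHA-256 of the zip bytes, so you can hash your local zip and skip the upload when nothing changed:

```python
import base64
import hashlib

def local_code_sha256(zip_path: str) -> str:
    """Base64-encoded SHA-256 of the zip, the same format as Lambda's CodeSha256."""
    with open(zip_path, "rb") as f:
        digest = hashlib.sha256(f.read()).digest()
    return base64.b64encode(digest).decode()

def deploy_if_changed(function_name: str, zip_path: str) -> bool:
    """Upload the zip only when it differs from what's already deployed."""
    import boto3  # deferred so the hashing helper works without boto3 installed
    client = boto3.client("lambda")
    deployed = client.get_function(FunctionName=function_name)
    if deployed["Configuration"]["CodeSha256"] == local_code_sha256(zip_path):
        return False  # deployed code already matches; nothing to do
    with open(zip_path, "rb") as f:
        client.update_function_code(FunctionName=function_name, ZipFile=f.read())
    return True
```

Note that zip archives embed timestamps, so for the hashes to match across rebuilds you'd also want a deterministic zip step (fixed mtimes), otherwise identical code can still produce a different CodeSha256.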