Best way to build a python lambda layer? - Stack Overflow
There are lots of different articles online describing ways to build dependency layers for Python Lambda functions. Is the definitive way to build a Lambda layer with packages the following:
spin up a (linux) EC2
build a virtualenv and source it
pip install the packages you want
download the /pythonx.x/ directory and zip it
upload it as a lambda layer
This seems to be the most common recommendation, but I wanted to know what the actual best practice is. Thanks.
If I may, I would like to recommend an alternative technique which has never failed me. The technique uses Docker, as described in the recent AWS article:
- How do I create a Lambda layer using a simulated Lambda environment with Docker?
Thus for this question, I verified it using elasticsearch as follows:
- Create an empty folder, e.g. mylayer.
- Go to the folder and create a requirements.txt file with the content:
elasticsearch
- Run the following docker command (adjust the python version to your needs; note that the lambci/lambda images are no longer updated, so the AWS-maintained image public.ecr.aws/sam/build-python3.8 can be used instead):
docker run -v "$PWD":/var/task "lambci/lambda:build-python3.8" /bin/sh -c "pip install -r requirements.txt -t python/lib/python3.8/site-packages/; exit"
- Create layer as zip:
zip -r elastic.zip python > /dev/null
- Create the lambda layer based on elastic.zip in the AWS Console. Don't forget to set Compatible runtimes to python3.8.
- Test the layer in lambda using the following lambda function:
import json
from elasticsearch import Elasticsearch, RequestsHttpConnection

def lambda_handler(event, context):
    # printing the client's methods confirms the layer was imported
    print(dir(Elasticsearch))
    return {
        'statusCode': 200,
        'body': json.dumps('Hello from Lambda!')
    }
The function executes correctly:
['__class__', '__delattr__', '__dict__', '__dir__', '__doc__', '__enter__', '__eq__', '__exit__', '__format__', '__ge__', '__getattribute__', '__gt__', '__hash__', '__init__', '__init_subclass__', '__le__', '__lt__', '__module__', '__ne__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__', '__str__', '__subclasshook__', '__weakref__', 'bulk', 'clear_scroll', 'close', 'count', 'create', 'delete', 'delete_by_query', 'delete_by_query_rethrottle', 'delete_script', 'exists', 'exists_source', 'explain', 'field_caps', 'get', 'get_script', 'get_script_context', 'get_script_languages', 'get_source', 'index', 'info', 'mget', 'msearch', 'msearch_template', 'mtermvectors', 'ping', 'put_script', 'rank_eval', 'reindex', 'reindex_rethrottle', 'render_search_template', 'scripts_painless_execute', 'scroll', 'search', 'search_shards', 'search_template', 'termvectors', 'update', 'update_by_query', 'update_by_query_rethrottle']
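The directory layout inside the zip is what makes this work: Lambda extracts the layer under /opt, and the Python runtime picks up python/ and python/lib/python3.8/site-packages/ from there. A quick local sanity check of a layer zip before uploading (a minimal sketch; the file names here are illustrative, not part of the steps above):

```python
import zipfile

# Expected top-level prefix inside a Python Lambda layer zip
EXPECTED_PREFIX = "python/"

def check_layer_zip(zip_path):
    """Return the entries that are NOT under the python/ prefix (should be empty)."""
    with zipfile.ZipFile(zip_path) as zf:
        return [n for n in zf.namelist() if not n.startswith(EXPECTED_PREFIX)]

# Build a tiny example zip in the expected layout and verify it
with zipfile.ZipFile("example_layer.zip", "w") as zf:
    zf.writestr("python/lib/python3.8/site-packages/mypkg/__init__.py", "")

print(check_layer_zip("example_layer.zip"))  # [] -> every entry is under python/
```

If the list is non-empty, the packages were zipped at the wrong level and imports will fail at runtime.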
The option already mentioned by @Marcin requires Docker to be installed on the machine. If you want to skip Docker, you can use the script below to create and publish a layer to AWS.
All you need is:
./creater_layer.sh <package_name> <layer_name>
for example:
./creater_layer.sh elasticsearch my-layer
The script creater_layer.sh:
#!/bin/bash
# Usage: ./creater_layer.sh <package_name> <layer_name>
path="app"
package="${1}"
layername="${2}"

mkdir -p "${path}"
# Install the package into the directory layout Lambda expects for a layer
pip3 install "${package}" --target "${path}/python/lib/python3.8/site-packages/"
# Zip the contents of app/ so that python/ sits at the zip root
cd "${path}" && zip -r ../lambdalayer.zip .
# Publish the zip as a new layer version (still inside app/, hence ../)
aws lambda publish-layer-version --layer-name "${layername}" --description "My layer" --license-info "MIT" --zip-file "fileb://../lambdalayer.zip" --compatible-runtimes python3.8
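For reference, the zip step of the script can also be done without the zip CLI; a minimal Python sketch using only the standard library (the app directory name mirrors the script above, and the dummy mypkg package stands in for whatever pip3 install --target would place there; publishing the result is still left to the aws CLI):

```python
import os
import shutil

def zip_layer(staging_dir="app", archive_name="lambdalayer"):
    """Zip the *contents* of staging_dir so that python/ sits at the zip root,
    which is the layout Lambda expects when extracting the layer under /opt."""
    # shutil.make_archive returns the path of the created archive
    return shutil.make_archive(archive_name, "zip", root_dir=staging_dir)

# Example: stage a dummy package the same way pip3 install --target would
os.makedirs("app/python/lib/python3.8/site-packages/mypkg", exist_ok=True)
open("app/python/lib/python3.8/site-packages/mypkg/__init__.py", "w").close()
print(zip_layer())  # path to the created lambdalayer.zip
```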