Deploy an AWS Lambda function from a container image

Subham Kumar Sahoo
8 min read · Jun 5, 2022


As some of you might know, there is a limit on the size of the package we can deploy to AWS Lambda: 50 MB for a zipped deployment package and 250 MB unzipped. So, if our code needs external libraries and packages, we might exceed this limit.

So, one of the recommended workarounds is to deploy the Lambda function as a container image: we package our function code as a Docker image, push it to ECR (Elastic Container Registry) on AWS, and build the Lambda function from that image. Lambda functions built from container images can be up to 10 GB in size, which is sufficient for most cases.

Note- Choose a suitable region in AWS (the nearest one) in the top-right corner of the AWS console and use that same region in the other services and the AWS CLI.

Step 1 : Create an EC2 Ubuntu instance

(You can also create and deploy the Lambda code as a Docker image from your local system if it has the AWS CLI and Docker installed. If you face any issues with those installations, it is better to go with an EC2 instance.)

  • Log in to the AWS console and search for EC2.
  • Click on Launch instances.
  • Give it a suitable name (ex- lambda-container).
  • Choose the Ubuntu image.
  • Choose t2.micro as the instance type (comes under free-tier).

Note : If your chosen region does not have t2.micro, it will have t3.micro in free-tier.

  • Create a new key pair (or use an existing one if you have it). Give it a suitable name (ex- lambda-sks).

Type- RSA

Format- .ppk (PuTTY private key, as we will be using PuTTY to log in). If you want to log in through the SSH command, use the .pem format.

Note : We can convert .pem to .ppk using the PuTTYgen software.

  • Create and download the .ppk file.
  • Keep the other configurations as default and create the instance.
  • Wait for some time and the instance will reach the running state.

Step 2 : Connect to the Ubuntu instance

Download PuTTY (putty.exe); on the PuTTY download page it is listed under the alternative binary files.

  • Select your EC2 Ubuntu instance. Under the Details tab, copy the public IPv4 address.
  • In PuTTY, put ubuntu@<IPv4 address> as the Host Name. The default username for an Ubuntu instance is "ubuntu"; similarly, for an Amazon Linux instance it is "ec2-user".
  • Click on Connection and set the keep-alive interval to 180 seconds, so that PuTTY pings your instance every 3 minutes and the session does not get disconnected due to inactivity.
  • Go to Connection > SSH > Auth and browse to your .ppk file.
  • Then click on Open to connect to your EC2 instance. (If you created a .pem key instead, see the SSH example after this list.)
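
If you created a .pem key instead (as mentioned in Step 1), you can skip PuTTY and connect with plain SSH. A minimal sketch, assuming the key file is named lambda-sks.pem and using your instance's public IPv4 address:

chmod 400 lambda-sks.pem
ssh -i lambda-sks.pem ubuntu@<public-IPv4-address>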

Step 3 : Basic installations

Install Docker Compose and the AWS CLI on your instance.

sudo apt-get update
sudo apt-get install docker-compose
sudo apt install awscli -y
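
To make sure the installations went through, you can check the versions (the exact version numbers will vary):

docker --version
docker-compose --version
aws --version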

Step 4 : Create folder and files

mkdir lambda-docker
cd lambda-docker

Create files app.py, requirements.txt and Dockerfile.

Note- To create a file you can use vim editor.

Command - $ vi app.py

Then press "i" on the keyboard to go into insert/edit mode. Then type your code.

Then press "Escape" and type ":wq" to save and exit, or ":q!" to exit without saving.

app.py

import json
import requests

def handler(event, context):
    # TODO implementation
    return {
        'headers': {'Content-Type': 'application/json'},
        'statusCode': 200,
        'body': json.dumps({
            "message": "Lambda Container image invoked!",
            "event": event
        })
    }

We have imported the "requests" library just to show how to use external libraries (ones not already available in the Lambda runtime).

requirements.txt

(mention all the external libraries required)

requests==2.25.1

Dockerfile

# Import the AWS Lambda base image for Python
FROM public.ecr.aws/lambda/python:3.8
# Copy the function code to a specific directory i.e. /var/task
COPY app.py ${LAMBDA_TASK_ROOT}
# Install the function's dependencies using the file requirements.txt
# from your project folder.
COPY requirements.txt .
RUN pip3 install -r requirements.txt --target "${LAMBDA_TASK_ROOT}"
# Set the CMD to your handler (could also be done as a parameter override outside of the Dockerfile)
CMD [ "app.handler" ]

The AWS base images provide the following environment variables:
LAMBDA_TASK_ROOT=/var/task
LAMBDA_RUNTIME_DIR=/var/runtime

Where to get the base image name?

  • Go to gallery.ecr.aws and search for "python".
  • Click on the "lambda/python" repository in the results.
  • Choose a suitable image tag (for this I have chosen the Python 3.8 image) and copy the image URI.

Step 5 : Create ECR repository

We need an ECR repo to push our Docker image to, and from that image we will create a Lambda function.

  • Search for Elastic Container Registry and go to Repositories.
  • Click on Create repository. Select a private repo, give it a name and create it.
  • Click on that repository and click View push commands (top-right corner). Then select the Linux tab.
    This will give you the commands to log in to the repo from the EC2 instance and then build, tag and push the Docker image.

Step 6 : Build and push image

To use the AWS CLI to access AWS services, we need an access key and a secret access key for our account.

Note - Here I have created a new user in my account (with administrator access) and I am using that to log in, which is recommended over using the root account. Else you can use the root account too.

  • Search "IAM" in AWS and go to the IAM dashboard. Then click on "My security credentials" under Quick links.
  • Click on "Create access key" and either copy the access key and secret key or download them as a CSV file.

Now we will configure our AWS CLI.

  • In your EC2 Ubuntu instance (in the PuTTY window), run:
aws configure

And provide the access key, secret key, region (like us-east-1, ap-south-1 etc.) and output format (like text, json etc.).
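
To verify that the CLI is configured correctly, you can run the command below; it should print the account ID and the ARN of the IAM user whose keys you just entered:

aws sts get-caller-identity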

  • Copy the first push command and run it to authenticate the Docker client (on EC2) to the ECR registry (an example of that command is shown below).
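
For reference, that first push command typically looks something like the line below; the account ID and region here are placeholders, so always use the exact command shown in the ECR console:

aws ecr get-login-password --region <region> | sudo docker login --username AWS --password-stdin <account-id>.dkr.ecr.<region>.amazonaws.com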

If you get the error "error saving credentials: ... cannot autolaunch D-Bus without X11 DISPLAY", then run the command below. (Most probably you will not need it.)

sudo apt-get install gnupg2 pass

Note- Prefix every "docker" command henceforth with "sudo".

  • Then build the Docker image.
  • Tag the image. (Example build and tag commands are shown just after this step.)
  • Run the image locally and test your function:
sudo docker run -p 9000:8080 docker-lambda
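
For reference, the build and tag commands (run them before the docker run command above) look roughly like the lines below; the image name docker-lambda and the repository URI are placeholders, so stick to the exact commands from View push commands for your repository:

sudo docker build -t docker-lambda .
sudo docker tag docker-lambda:latest <account-id>.dkr.ecr.<region>.amazonaws.com/lambda-docker:latest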

To test, right-click on the PuTTY terminal's title bar and choose Duplicate Session. In the new terminal, run:

curl -XPOST "http://localhost:9000/2015-03-31/functions/function/invocations" -d '{}'

It will give you the result as JSON, including the message that we set in app.py.
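
Going by the handler in app.py, the output should look something like this (the event is empty because we posted an empty JSON payload):

{"headers": {"Content-Type": "application/json"}, "statusCode": 200, "body": "{\"message\": \"Lambda Container image invoked!\", \"event\": {}}"}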

  • Push the image to the repository on ECR (example command below).
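
That push command (the last one under View push commands) is roughly the following, again with a placeholder repository URI:

sudo docker push <account-id>.dkr.ecr.<region>.amazonaws.com/lambda-docker:latest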

After pushing, we can see the image inside our ECR repository.

Step 7 : Create lambda function from the image

  • Go to the Lambda dashboard on the AWS console and click on Create function.
  • Select "Container image" and give the Lambda function a suitable name.
  • Click on "Browse" for the container image URI and choose the ECR repo and image.
  • Expand "Container image overrides" and put /var/task under WORKDIR, because as per our Dockerfile our app.py file is copied to this path. And choose x86_64 as the architecture.
  • Then under Permissions, choose "Create a new role". This will create an IAM role for your Lambda function with CloudWatch Logs access, which will be helpful for viewing logs while or after we execute the function.
    You can attach other policies to the role after it gets created, or you can attach an existing role that has at least the "AWSLambdaBasicExecutionRole" policy attached.

Then create function.
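
If you prefer the AWS CLI over the console, an equivalent create call looks roughly like the one below; the function name, image URI and role ARN are placeholders, and the role must already exist with at least the AWSLambdaBasicExecutionRole policy attached:

aws lambda create-function --function-name lambda-container-demo --package-type Image --code ImageUri=<account-id>.dkr.ecr.<region>.amazonaws.com/lambda-docker:latest --role arn:aws:iam::<account-id>:role/<lambda-execution-role>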

Step 8 : Test the function

  • Click on the function. It will not show you the function code, as it was created from an image, but we can still test it here.
  • Go to the Test tab and select "Create new event". Then just give the event a name, save it and click on Test. It should give you a status 200 response.
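
You can also invoke the function from the EC2 instance using the AWS CLI; assuming the function is named lambda-container-demo, the test would look like this:

aws lambda invoke --function-name lambda-container-demo response.json
cat response.json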

Step 9 : Add API gateway

We cannot expect an end user to log in to AWS and click on Test. So, we will add an API (an HTTP URL) to our Lambda function. When someone pings that URL, our function will be invoked and it will send the response back to the user.

  • Click on the Lambda function and, at the top, click on Add trigger.
  • Then choose "API Gateway", "Create an API", "HTTP", Security as "Open" and, under Additional settings, Deployment stage as "v1" (optional).

Then click on Add.

  • Click on that API Gateway trigger and open the API endpoint in your browser. You should see a response.

Response example -
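
The screenshot is not reproduced here, but based on the handler's return value, the browser should display the body of the response, which looks something like this (the event will contain the details of the API Gateway request):

{"message": "Lambda Container image invoked!", "event": { ... }}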

Note- You can also use Postman to test out this API.

Done!! We have successfully created and tested our Lambda function with API Gateway.

I will try to make another post on doing the same using the AWS CDK (using code instead of the AWS console).

Stay tuned for more such interesting use-cases.

References -

https://www.youtube.com/watch?v=23fE57EFRK4

https://docs.aws.amazon.com/lambda/latest/dg/images-create.html

Feel free to post your questions and feedback. Happy Data Engineering!! 😇

Connect with me on LinkedIn: https://www.linkedin.com/in/subham-kumar-sahoo-55563a136/
