
I have a couple of AWS Lambda functions. All of those functions use some common helper functions, which I have placed in a separate file called helper_functions.py. I want to import this module in all of my AWS Lambda functions. I am unable to find a place to store this module (helper_functions.py) such that, when I make a change to it, I don't have to change anything in my Lambda functions.

Some of the options I thought of are:

  1. Uploading the module to AWS S3 and then loading it from S3 at the start of each Lambda function, then using its functions (if that is even possible)

  2. Writing some script (which I haven't figured out yet) that packages the module along with each Lambda function's Python file in a zip and uploads it to AWS Lambda

Please suggest a better solution to manage the module and import it more efficiently.
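For reference, here is roughly what I had in mind for option 2 — a minimal sketch, assuming hypothetical file names (`handler.py` for the Lambda entry point), with the upload step done separately via the AWS CLI:

```python
import os
import zipfile

def package_lambda(function_file, helper_file, zip_path):
    """Bundle a Lambda handler and the shared helper module into one zip.

    Both files are stored at the archive root, so the handler can do
    `import helper_functions` directly once deployed.
    """
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(function_file, arcname=os.path.basename(function_file))
        zf.write(helper_file, arcname=os.path.basename(helper_file))
    return zip_path
```

Running this in a loop over all the handlers, then calling `aws lambda update-function-code --zip-file fileb://<zip>` for each, would re-deploy every function after a helper change — but it still means touching every function.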

5 Comments
  • A Lambda package has a size limit. You should include all the modules that fit within that limit inside the zip package; the rest of the modules you can store in S3 and import from there. Commented Dec 14, 2016 at 6:55
  • If I include the module in all the Lambda packages and then I want to change something in the module, I will have to update all the Lambda packages, right? Commented Dec 14, 2016 at 7:00
  • Including modules in the zip packages will be more efficient, I guess. You can write a small script to update the module in all the packages. Check this: docs.aws.amazon.com/cli/latest/reference/lambda/… Commented Dec 14, 2016 at 7:07
  • I don't quite understand the problem. If you have that single module and can import it from S3, changing it means you won't have to change anything in your scripts anyway. If there's a technical problem I'm missing here, you could always download that module at runtime and sys.path.append its path to use it. Not amazing, but it will work. Commented Dec 14, 2016 at 11:55
  • Look at the parts about pip (and virtualenvs) here: docs.aws.amazon.com/lambda/latest/dg/… Commented Dec 30, 2016 at 11:49

1 Answer


I struggled with this for a long time. Here's my solution (there might be a better way):

Set up your helper functions in your file system like this:

pathToSomewhere/my_helper/helper_functions.py
pathToSomewhere/my_helper/__init__.py
pathToSomewhere/setup.py

Where __init__.py is:

from .helper_functions import *

and setup.py is:

from setuptools import setup

setup(name='my_helper',
      version='0.10000',
      description='My helper functions',
      url='http://github.com/user/example',
      license='Proprietary',
      author='Null',
      author_email='[email protected]',
      packages=['my_helper'],
      install_requires=['boto3'],
      zip_safe=False)

Now let's package up my_helper. From pathToSomewhere/ run:

python setup.py sdist

I'm assuming you already know how to create and upload a virtual environment for running your lambda function. If not, let me know.

Now let's install my_helper into the virtual env of your lambda function. Let's assume your virtual environment is called worker-env:

./worker-env/bin/pip install file://pathToSomewhere/my_helper

Now zip up worker-env and your actual lambda script and upload that.
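That last zipping step can also be scripted — a sketch, assuming the usual virtualenv layout (`worker-env/lib/pythonX.Y/site-packages`), since both the installed packages and the handler need to sit at the archive root for Lambda to import them:

```python
import os
import zipfile

def zip_env_and_handler(site_packages, handler_file, zip_path):
    """Zip the contents of site-packages plus the handler at the archive root."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(site_packages):
            for name in files:
                full = os.path.join(root, name)
                # Archive paths are relative to site-packages, not to cwd,
                # so `import my_helper` works inside Lambda.
                zf.write(full, arcname=os.path.relpath(full, site_packages))
        zf.write(handler_file, arcname=os.path.basename(handler_file))
    return zip_path
```

The resulting zip can then be uploaded through the console or with `aws lambda update-function-code`.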


2 Comments

Thanks for the reply! But currently I am one step ahead, doing continuous delivery using AWS CodePipeline, which is really cool and makes it easy to package and deploy any of your serverless applications. Check this out to get started.
I have a similar issue, see here: stackoverflow.com/questions/49730389/… However, with your solution, if I change helper_functions.py I'll have to re-zip all my AWS functions and re-upload them. Can helper_functions.py be stored somewhere (e.g. S3 maybe) and be imported by all my AWS Lambdas, so that if I change it, I don't have to re-zip everything?
