I have several different Python APIs (i.e. Python scripts) that run on AWS Lambda. The standard approach is to generate a zip file containing all the external libraries the Lambda function needs and then upload it to AWS. Now, I have some functions that are shared between different APIs (e.g. custom utility functions for parsing text files or dates). Currently, I am simply duplicating the file utils.py in every zip file. However, this approach is quite inefficient (I don't like duplicating code). I'd like to have an S3 bucket that contains all my shared .py files and have my APIs load them directly. Is this possible?
A simple approach would be to download the files to a tmp folder and load them, but I am not sure this is the best/fastest way:
import boto3

# Download the shared module from S3 into Lambda's writable /tmp directory
client_s3 = boto3.client("s3")
client_s3.download_file("mybucket", "utils.py", "/tmp/utils.py")
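For completeness, after the download the file would still have to be made importable, roughly like this (a minimal sketch that reuses the bucket/key above and assumes utils.py has no dependencies of its own):

import sys

# /tmp is not on the module search path by default, so add it before importing
sys.path.insert(0, "/tmp")
import utils  # the module downloaded above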
Can this be done in a more elegant way?