Merge pull request #2 from satyamsoni2211/feature/lambda_concurrency
Feature/lambda concurrency
satyamsoni2211 authored Jun 14, 2022
2 parents eae710a + d19016f commit ef5b7e9
Showing 16 changed files with 428 additions and 71 deletions.
20 changes: 20 additions & 0 deletions .github/workflows/test.yaml
@@ -0,0 +1,20 @@
name: testing-python-code
on:
  push:
    branches:
      - "feature/**"
  pull_request:
    branches:
      - "feature/**"
jobs:
  test-code:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v3
        with:
          python-version: 3.9
      - run: pip install awscli tox
      - run: aws configure set region us-west-2
      - run: tox

2 changes: 2 additions & 0 deletions .gitignore
@@ -58,3 +58,5 @@ venv
tfplan.json
build/
dist/
.vscode
.tox
8 changes: 8 additions & 0 deletions CHANGELOG.md
@@ -5,6 +5,14 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](http://keepachangelog.com/)
and this project adheres to [Semantic Versioning](http://semver.org/).

### [0.1.4] - 2022-06-14

### Changed

- Added support for concurrency
- Added test cases for testing code
- Enhanced documentation

### [0.1.3] - 2022-05-19

### Changed
3 changes: 3 additions & 0 deletions Pipfile
@@ -7,9 +7,12 @@ name = "pypi"
pyyaml = "*"
pydantic = "*"
croniter = "*"
boto3 = "*"

[dev-packages]
twine = "*"
pytest = "*"
tox = "*"

[requires]
python_version = "3.9"
276 changes: 214 additions & 62 deletions Pipfile.lock

Large diffs are not rendered by default.

54 changes: 50 additions & 4 deletions README.md
@@ -6,6 +6,7 @@ This is a utility project designed to cater to the necessities of warming up `Lambda`
- [Installing Warmer](#installing-warmer)
- [Using Warmer](#using-warmer)
- [Setting up Event Bridge Notifications](#setting-up-event-bridge-notifications)
- [Working on enhancements](#working-on-enhancements)

#

@@ -33,29 +34,37 @@ This is very easy to incorporate in your existing Python Lambda Handlers. Follow

```python
from warmer import warmer
@warmer(flag="custom_event_key")
@warmer(flag="custom_event_key", _concurrency=1)
def handler(event, context):
pass
```

> Parameters:
> *flag* (type: str) - name of the event flag to look for
> *_concurrency* (type: int) - (optional) number of concurrent handlers to warm up; default: 1

If your handler is a Flask/FastAPI application, you may follow the steps below:

```python
from warmer import warmer
from flask import Flask
app = Flask()
@warmer(flag="custom_event_key")
@warmer(flag="custom_event_key",_concurrency=1)
def application(event, context):
return app(event, context)

# or

application = warmer(flag="custom_event_key")(app)
application = warmer(flag="custom_event_key",_concurrency=1)(app)

# you may now use application as your handler
```

> `warmer` helps you handle the custom events that arrive to warm the _Lambda_ function.
> `_concurrency` is optional; by default only the current execution is warmed up. If you want to warm up multiple instances of the Lambda handler, set `_concurrency` to the *number of handlers running*.
> `warmer` uses a threading mechanism to ensure that the warming calls actually happen concurrently, not serially.
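As a minimal, self-contained sketch of the control flow described above (the name `warmer_sketch`, the `handler`, and the exact reply shape here are illustrative, not the library's actual implementation):

```python
from functools import wraps

def warmer_sketch(flag="warmer", _concurrency=1):
    """Illustrative sketch of the warming decorator's control flow."""
    def decorator(func):
        @wraps(func)
        def inner(event, context, *args, **kwargs):
            if event.get(flag):
                # warming call: short-circuit instead of doing real work
                return {"statusCode": 200,
                        "body": {"status": "warmed up", "eventFlag": flag}}
            # normal call: pass straight through to the wrapped handler
            return func(event, context, *args, **kwargs)
        return inner
    return decorator

@warmer_sketch(flag="warmer")
def handler(event, context):
    return "real work"

print(handler({"warmer": True}, {})["body"]["status"])  # warmed up
print(handler({}, {}))                                  # real work
```

The key point is that a warming event never reaches your application code; the decorator answers it immediately.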
<a name="setting-up-event-bridge-notifications"></a>

@@ -102,4 +111,41 @@ TransactionCompsAPI:
Input: '{"warmer": true}' # this refers to the warmer flag
```
If you want to include concurrent executions, you may add a `concurrency` key to the scheduled event input as shown below.
```yaml
TransactionCompsAPI:
  Type: "AWS::Serverless::Function"
  Properties:
    FunctionName: fake-function
    Events:
      WarmerSchedule: # add this event to the same template
        Type: Schedule
        Properties:
          Schedule: cron(*/5 * ? * 2-6 *)
          Name: fake-function-warmer-event
          Description: Warmer Event for Lambda Function
          Enabled: true
          Input: '{"warmer": true, "concurrency": 5}' # this refers to the warmer flag and concurrency
```
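The fan-out implied by `"concurrency": 5` can be sketched as follows. This is a simplified model, not the library's actual code: `invoke` stands in for the Lambda `Invoke` API call, and the nested payload pins `"concurrency": 1` so warmed instances do not fan out again.

```python
import json
from concurrent.futures import ThreadPoolExecutor

def fan_out(invoke, concurrency, flag="warmer"):
    # The instance currently handling the event is already warm,
    # so only concurrency - 1 extra invocations are needed.
    payload = json.dumps({flag: True, "concurrency": 1}).encode("utf-8")
    with ThreadPoolExecutor() as exe:
        futures = [exe.submit(invoke, payload)
                   for _ in range(concurrency - 1)]
        return [f.result() for f in futures]

calls = []
fan_out(calls.append, 5)
print(len(calls))  # 4
```

This also explains why, in the test suite, `concurrency=5` is expected to produce four invoke calls rather than five.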
<a name="working-on-enhancements"></a>
## Working on enhancements
If you want to work on enhancements or development, clone the project and run the commands below to set up the environment:
```bash
python -m pip install pipenv
pipenv shell

# or

python -m pip install virtualenv
virtualenv venv
source venv/bin/activate
python -m pip install -r dev_requirements.txt
```

You may also raise a `PR` to get your changes merged into the existing project.

Happy Warming.
3 changes: 3 additions & 0 deletions dev_requirements.txt
@@ -0,0 +1,3 @@
-r requirements.txt
pytest
tox
4 changes: 4 additions & 0 deletions pytest.ini
@@ -0,0 +1,4 @@
[pytest]
console_output_style = progress
python_files = test_*.py
addopts = --capture=tee-sys -v
2 changes: 1 addition & 1 deletion release
@@ -1 +1 @@
0.1.3
0.1.4
2 changes: 2 additions & 0 deletions requirements.txt
@@ -0,0 +1,2 @@
-i https://pypi.org/simple
boto3==1.24.8
13 changes: 11 additions & 2 deletions setup.py
@@ -4,7 +4,7 @@
    long_description = fr.read()
setup(
    name="py_lambda_warmer",
    version="0.1.4",
    description="Warmer Utility for Lambda Function",
    long_description=long_description,
    long_description_content_type='text/markdown',
@@ -20,5 +20,14 @@
        "Operating System :: OS Independent",
        "Programming Language :: Python :: 3.8",
        "Programming Language :: Python :: 3.9"
    ],
    install_requires=[
        'boto3',
    ],
    extras_require={
        "dev": [
            "pytest",
            "tox"
        ]
    }
)
3 changes: 2 additions & 1 deletion terraform_resources/resource.tf
@@ -22,7 +22,8 @@ resource "aws_cloudwatch_event_target" "lambda_warmer_target" {
  rule  = aws_cloudwatch_event_rule.trigger_lambda.name
  input = <<-INPUT
  {
    "warmer": true,
    "concurrency": 1
  }
  INPUT
}
Empty file added tests/__init__.py
Empty file.
62 changes: 62 additions & 0 deletions tests/test_warmer.py
@@ -0,0 +1,62 @@
from warmer import warmer
from unittest import TestCase
from unittest.mock import patch, Mock


@warmer(_concurrency=5)
def fake_wsgi(*args, **kwargs):
return "_response"


@warmer()
def fake_wsgi_with_concurrency_event(*args, **kwargs):
return "_response"


class TestWarmerFunction(TestCase):
    @patch("botocore.client.BaseClient._make_api_call")
    def test_warmer_concurrency(self, mock: Mock):
        # checking call for warmer event
        response = fake_wsgi({"warmer": True}, {})
        mock.assert_called()
        self.assertEqual(mock.call_count, 4)
        self.assertIn("statusCode", response)

    @patch("botocore.client.BaseClient._make_api_call")
    def test_warmer_response_body(self, mock: Mock):
        # checking call for warmer event
        response = fake_wsgi({"warmer": True}, {})
        self.assertIn("statusCode", response)
        self.assertIn("isBase64Encoded", response)
        body = response.get("body")
        self.assertIn("eventFlag", body)
        self.assertEqual("warmed up", body.get("status"))

    def test_wsgi_function_response(self):
        # the call should be passed through to the function
        # when there is no warmer event
        self.assertEqual("_response", fake_wsgi({}, {}))

    @patch("warmer.call_function_concurrently")
    def test_concurrency_function_args(self, mock: Mock):
        # the function should be passed the same flag,
        # which is an identifier for the warming event
        fake_wsgi({"warmer": True}, {})
        mock.assert_called()
        mock.assert_called_with(5, "warmer")

    @patch("botocore.client.BaseClient._make_api_call")
    def test_for_concurrency_using_event(self, mock: Mock):
        # check for concurrent calls when concurrency is passed via the event
        fake_wsgi_with_concurrency_event(
            {"warmer": True, "concurrency": 5}, {})
        self.assertEqual(mock.call_count, 4)

    @patch("botocore.client.BaseClient._make_api_call")
    def test_for_concurrency_using_default_event(self, mock: Mock):
        # no concurrent calls should be made when concurrency = 1
        fake_wsgi_with_concurrency_event(
            {"warmer": True}, {})
        self.assertEqual(mock.call_count, 0)
        mock.assert_not_called()
12 changes: 12 additions & 0 deletions tox.ini
@@ -0,0 +1,12 @@
[tox]
envlist = python3.6,python3.7,python3.8,python3.9

[testenv]
# install pytest in the virtualenv where commands will be executed
deps =
    pytest
    boto3

commands =
    # NOTE: you can run any command line tool here - not just tests
    pytest
35 changes: 34 additions & 1 deletion warmer.py
@@ -1,5 +1,8 @@
import os
import json
import boto3
from functools import wraps
from concurrent.futures import ThreadPoolExecutor, as_completed

_AWS_LAMBDA_FUNCTION_NAME = os.getenv("AWS_LAMBDA_FUNCTION_NAME")
_AWS_LAMBDA_FUNCTION_VERSION = os.getenv("AWS_LAMBDA_FUNCTION_VERSION")
@@ -31,7 +34,34 @@ def generate_custom_reply(event, flag):
}


def warmer(flag="warmer"):
def call_function_concurrently(concurrent_calls: int, flag: str):
    """Invoke the Lambda function concurrently.

    Uses a ThreadPoolExecutor to fire the warming invocations in
    parallel, ensuring an already-warmed Lambda instance is not
    free to pick up the next warming request.

    Args:
        concurrent_calls (int): number of concurrent calls for warming the Lambda
        flag (str): event flag the Lambda uses to identify a warming call
    """
    client = boto3.client('lambda')
    data = json.dumps({flag: True, "concurrency": 1})
    with ThreadPoolExecutor(max_workers=os.cpu_count()) as exe:
        map_ = []
        for _ in range(concurrent_calls - 1):
            map_.append(
                exe.submit(client.invoke, FunctionName=_AWS_LAMBDA_FUNCTION_NAME,
                           Payload=data.encode("utf-8"),
                           Qualifier=_AWS_LAMBDA_FUNCTION_VERSION)
            )
        for f in as_completed(map_):
            try:
                print(f.result())
            except Exception:
                # ignore failures of individual warming invocations
                pass


def warmer(flag="warmer", _concurrency=1):
    """This decorator adds an additional layer
    on top of your callable for warming the
    lambda
@@ -43,6 +73,9 @@ def decorator(func):
        @wraps(func)
        def inner_wrapper(event, context, *args, **kwargs):
            if event.get(flag):
                concurrency = event.get("concurrency") or _concurrency
                if concurrency > 1:
                    call_function_concurrently(concurrency, flag)
                print(
                    f"warming {_AWS_LAMBDA_FUNCTION_NAME}:{_AWS_LAMBDA_FUNCTION_VERSION} with custom event")
                return generate_custom_reply(event=event, flag=flag)
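The `event.get("concurrency") or _concurrency` expression gives the event-supplied value precedence over the decorator default, with any falsy or missing value falling back to `_concurrency`. A minimal illustration (the helper name `resolve_concurrency` is ours, not part of the library):

```python
def resolve_concurrency(event, _concurrency=1):
    # the event's "concurrency" key, when truthy, overrides the
    # decorator's _concurrency default
    return event.get("concurrency") or _concurrency

print(resolve_concurrency({"warmer": True, "concurrency": 5}))  # 5
print(resolve_concurrency({"warmer": True}, _concurrency=3))    # 3
print(resolve_concurrency({"warmer": True, "concurrency": 0}))  # 1
```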
