I’m a big fan of AWS Lambda. It has allowed me to implement services and automate processes very quickly and cheaply. It has a great free tier and I can’t emphasize enough how much agility it has brought to my development activities. Its built-in integrations with other AWS services have saved me weeks (even months) of work.
But there is one very important risk. Since it’s so easy and cheap to get started with Lambda, it’s very common to forget about cost during development. But once you start running serious workloads across many functions, cost can be significant. If you don’t estimate and optimize Lambda cost early, your functions can unnecessarily cost you thousands of dollars. That’s money you could use in other important areas of your business.
Pricing is straightforward (kind of), but it’s deceptively cheap at low volume. Unlike EC2, there’s really no incentive to optimize cost when you’re developing, or even running your functions in production at low volume. The thing is, once you start executing Lambda functions at SCALE, cost can’t be ignored. With the mechanisms available today in AWS, calculating, monitoring and optimizing Lambda cost can be extremely time-consuming.
That’s why I’ve built some tools to help me and my clients keep the cost of Lambda functions under control, and in many cases save thousands of dollars.
First, let’s take a look at how Lambda pricing works.
AWS Lambda is the ultimate pay-as-you-go cloud computing service. Upload your function code to the cloud and execute it. If you don’t execute a function, you don’t pay anything. You can have a dormant Lambda function waiting to be triggered and you won’t pay a penny until the function is executed.
These are the four factors that determine the cost of your Lambda function:
- Number of executions. You pay for each execution ($0.0000002 per execution, or $0.20 per million)
- Duration of each execution. The longer your function takes to execute, the more you pay. This is a good incentive to write efficient application code that runs fast. You get charged in 100ms increments and there is a maximum function timeout of 5 minutes.
- Memory allocated to the function. When you create a Lambda function, you allocate an amount of memory to it, ranging from 128MB to 1,536MB. If you allocate 512MB of memory to your function but each execution only uses 10MB, you still pay for the whole 512MB. If an execution needs more memory than the amount allocated to the function, it will fail. This means you have to configure an amount of memory that guarantees successful executions, while avoiding excessive over-allocation.
- Data transfer. You pay standard EC2 rates for data transfer (out to the internet, inter-region, intra-region).
A great advantage over EC2 is that you don’t have to provision any billable infrastructure while your function is idle.
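To make the math concrete, here’s a minimal Python sketch of the pricing formula, using the published prices at the time of writing ($0.20 per million requests and $0.00001667 per GB-second, free tier and data transfer excluded). The tools later in this post automate this, but it’s useful to see the raw calculation; the call at the bottom reproduces Example 1 from the next section.

```python
import math

# Lambda prices at the time of writing (free tier and data transfer excluded).
PRICE_PER_REQUEST = 0.0000002     # $0.20 per 1M requests
PRICE_PER_GB_SECOND = 0.00001667  # compute price per GB-second

def monthly_lambda_cost(executions_per_month, avg_duration_ms, memory_mb):
    # Duration is billed in 100ms increments, rounded up.
    billed_ms = math.ceil(avg_duration_ms / 100.0) * 100
    gb_seconds = executions_per_month * (billed_ms / 1000.0) * (memory_mb / 1024.0)
    return executions_per_month * PRICE_PER_REQUEST + gb_seconds * PRICE_PER_GB_SECOND

# 100 executions/second (259M per month), 500ms average, 512MB -> roughly $1,130
print(monthly_lambda_cost(259200000, 500, 512))
```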
Examples of Lambda executions at scale
Let’s take a look at some examples of Lambda functions that go beyond the free-tier.
Example 1: Function triggered from a Kinesis stream
This is a very useful use case. Many AWS customers trigger a Lambda function to process incoming records in a Kinesis stream. The Lambda function can do transformations before sending data to other architecture components, such as S3 buckets, Elasticsearch or Redshift clusters, etc.
Example: 100 executions per second (259M executions per month), Average execution time: 500ms, Provisioned memory: 512MB. Approximate monthly cost: $1130 USD.
Example 2: Function triggered from an S3 event
Also a common use case. You can configure an S3 bucket such that it executes a Lambda function each time a new file is uploaded. Let’s say the function processes the contents of a relatively large file, therefore it takes some time and it consumes a high amount of memory, but it’s executed less frequently.
Example: 1 execution per second (2.6M executions per month), Average execution time: 30s, Provisioned memory: 1024MB. Approximate monthly cost: $1300 USD.
Example 3: Function triggered by a CloudWatch Logs subscription
You can integrate CloudWatch Logs very easily with AWS Lambda using Subscription Filters. This is very useful for server log analysis. You can have your application logs delivered to CloudWatch Logs and then have them pre-processed, analyzed or forwarded to other locations using a Lambda function.
Example: 10 EC2 servers, each sending 1 record per second (26M executions per month), Average execution time: 5s, provisioned memory: 256MB. Approximate monthly cost: $547 USD.
By the way, the price calculations above took me only a few seconds to complete… more on that below.
So, what can I do to keep AWS Lambda cost down?
As you can see, there are combinations of usage and configuration that could be problematic for some budgets. If you’re a small startup, you definitely don’t want to spend $1,000 per month on a sub-optimal Lambda function. Multiply this by many Lambda functions and you could be wasting a considerable amount of money.
Here are some ways in which you can keep your Lambda cost down:
Make sure your functions are executed at the right frequency
There are factors that can affect how frequently your Lambda function is triggered. For example, if you’re using Kinesis as a Lambda function trigger, you could adjust the batch size. A higher batch size means your Lambda function will be executed less frequently. Take a look at your triggers and see if you can do something to reduce the number of executions.
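For example, raising the batch size on a Kinesis event source mapping is a one-line change. Here’s a boto3 sketch (the function name and batch size are placeholders I picked for illustration):

```python
import boto3

lambda_client = boto3.client('lambda')

# Find the event source mapping(s) that feed the function from Kinesis.
mappings = lambda_client.list_event_source_mappings(FunctionName='my-stream-processor')

for mapping in mappings['EventSourceMappings']:
    # A larger batch size means fewer invocations, each processing more records.
    lambda_client.update_event_source_mapping(
        UUID=mapping['UUID'],
        BatchSize=500,
    )
```

The trade-off is that each invocation processes more records, so expect longer durations and higher memory usage per execution.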
Write efficient code that executes fast
A function that executes in half the time is a function that will cost you half the money. Since execution duration is directly proportional to how much you’ll pay, it’s important to keep an eye on the Duration metric in CloudWatch. If you see your function is taking suspiciously long to complete, then it’s time to look at ways to optimize it.
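If you’d rather check this programmatically than in the console, here’s a boto3 sketch that pulls the average and maximum Duration for a function over the last 24 hours (the function name is a placeholder):

```python
import boto3
from datetime import datetime, timedelta

cloudwatch = boto3.client('cloudwatch')

# Average and maximum execution duration over the last 24 hours, in 1-hour buckets.
response = cloudwatch.get_metric_statistics(
    Namespace='AWS/Lambda',
    MetricName='Duration',
    Dimensions=[{'Name': 'FunctionName', 'Value': 'my-stream-processor'}],
    StartTime=datetime.utcnow() - timedelta(hours=24),
    EndTime=datetime.utcnow(),
    Period=3600,
    Statistics=['Average', 'Maximum'],
)

for point in sorted(response['Datapoints'], key=lambda p: p['Timestamp']):
    print(point['Timestamp'], round(point['Average']), round(point['Maximum']))
```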
Thankfully, AWS X-Ray can help.
Provision the right amount of memory
Avoid a record like this in your Lambda function’s logstream:
Duration: 702.16 ms Billed Duration: 800 ms Memory Size: 512 MB Max Memory Used: 15 MB
You see the problem? This function is using 15MB in this execution, but it has 512MB configured (in other words, it’s running at 3% of its capacity)! If that log entry belonged to a function that gets executed 100 times per second, you’d be looking at a $1785 USD charge in your AWS monthly bill. If you reduced the provisioned memory to 128MB, and the execution time did not change, you’d be looking at $485 USD. That’s $1,300 USD you could save each month ($15,600 at the end of the year), instead of spending that money on an over-provisioned Lambda function.
Keep in mind that a higher memory allocation often results in more CPU capacity allocated to your function, which could result in faster executions - and potentially lower cost. Therefore, you would have to test your function at scale with different memory allocations, measure execution times and calculate cost.
Keep an eye on data transfer
When you run a Lambda function, you get charged at standard EC2 data transfer rates. This means you have to keep an eye on data transferred out to the internet, to other regions or to other AZs. Regarding inter-AZ data transfer, there’s not much you can do, since you can’t control which AZ your Lambda function runs in. Also, there’s no CloudWatch metric that tells you how much data is transferred in or out of your Lambda function. If data transfer is a concern, there are two things you can do:
- Take a look at your AWS Cost and Usage report. Filter by resourceId (your function), find the different values in the transferType column, then get the usage amounts. This can be time-consuming, which is why I recommend using tools like QuickSight. Take a look at this article I wrote about using QuickSight to do AWS Cost Optimization.
- Log the size of data transfer operations in your Lambda code, then configure a CloudWatch Logs Metric Filter that turns those log entries into a CloudWatch metric (see the sketch below).
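Here’s a rough sketch of the second option, assuming your handler logs a line like DATA_TRANSFER_OUT_BYTES: <n> (the log format, metric names and function name are placeholders I made up for illustration):

```python
import boto3

# Inside your Lambda handler, log the payload size in a predictable format, e.g.:
#   print('DATA_TRANSFER_OUT_BYTES: %d' % len(payload))

# Then, once, create a metric filter that turns those log lines into a metric.
logs = boto3.client('logs')

logs.put_metric_filter(
    logGroupName='/aws/lambda/my-stream-processor',
    filterName='DataTransferOutBytes',
    # Space-delimited pattern: first field is the label, second is the value.
    filterPattern='[label="DATA_TRANSFER_OUT_BYTES:", bytes]',
    metricTransformations=[{
        'metricName': 'DataTransferOutBytes',
        'metricNamespace': 'Custom/Lambda',
        'metricValue': '$bytes',
    }],
)
```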
Ok, I get it, but that’s time consuming… do you have any tools that can help me lower the cost of my Lambda functions?
The answer is YES! That’s why I wrote this blog post!
Tool #1 - Near Real-time Price Calculator for Lambda executions
I started this GitHub project some time ago, while doing load tests for a client project. I wanted to estimate AWS cost using real system metrics at different transaction volumes, both for launch-day usage and over time, given growth projections.
This exercise involved a lot of manual work, so I decided to experiment with the AWS Price List API. It worked, but every time I ran a new test with a different volume, I still had to run some scripts manually, which I didn’t like. I realized there was no way to get a fresh estimate of how much my client would pay at the end of the month for EC2 resources: Cost and Usage reports get delivered to an S3 bucket once per day, Cost Explorer is refreshed once per day, and CloudWatch Billing Alarms have a 6-hour delay and are too broad.
What if I had a quick, automated way to see estimated monthly AWS cost, based on current usage, in near real time, with only a 10-minute delay?
Well, here it is:
The project consists of a Lambda function that is triggered by a CloudWatch Events schedule. The schedule contains a tag key/value pair that is passed to the Lambda function. The function looks for resources with that tag, fetches their CloudWatch metrics and calculates a monthly usage projection. Then it uses the Price List API to turn that projection into an estimated monthly cost.
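The actual implementation is in the repo, but conceptually the core of the function looks something like this simplified sketch (placeholder function name; free tier, data transfer and 100ms rounding are ignored):

```python
import boto3
from datetime import datetime, timedelta

cloudwatch = boto3.client('cloudwatch')

HOURS_PER_MONTH = 730

def projected_monthly_cost(function_name, memory_mb):
    """Project this month's Lambda cost from the last hour of CloudWatch metrics."""
    end = datetime.utcnow()
    start = end - timedelta(hours=1)

    def metric_sum(metric_name):
        datapoints = cloudwatch.get_metric_statistics(
            Namespace='AWS/Lambda',
            MetricName=metric_name,
            Dimensions=[{'Name': 'FunctionName', 'Value': function_name}],
            StartTime=start, EndTime=end, Period=3600, Statistics=['Sum'],
        )['Datapoints']
        return datapoints[0]['Sum'] if datapoints else 0.0

    invocations = metric_sum('Invocations')   # executions in the last hour
    duration_ms = metric_sum('Duration')      # total execution time in the last hour

    gb_seconds = (duration_ms / 1000.0) * (memory_mb / 1024.0)
    hourly_cost = invocations * 0.0000002 + gb_seconds * 0.00001667
    return hourly_cost * HOURS_PER_MONTH

print(projected_monthly_cost('my-stream-processor', 512))
```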
Here is the architecture:
Getting it to work goes like this:
- Tag your Lambda function (e.g. stack=datastreaming)
- Install the Near Real-time Price Calculator function using the CloudFormation template below.
- Specify the tag you want to calculate price for (e.g. stack=datastreaming)
- Make sure the tagged Lambda function is being executed, then wait 10 minutes.
- Look for the EstimatedCharges CloudWatch metric.
In this example, I ran a short test which resulted in a maximum projected monthly cost of $895 USD. You can see how the number of invocations affects price.
You can also compare function execution Duration against Price.
If you’re interested in the pricing details, you can take a look at the function’s CloudWatch Logs output:
For more details, see the GitHub repo.
Disclaimer: I mainly use this tool to estimate cost at scale while I’m doing load tests. You could use it in production, but at this time it’s not yet optimized to support a large number of AWS resources. I’ve made a lot of performance improvements to this tool and I’m planning to keep working on it. Also, it gives only an approximation and doesn’t cover all possible usage; only AWS billing records contain the truth regarding your AWS monthly bill.
Tool #2 - Memory Metric Filters
Like I mentioned earlier, memory allocation can make a big difference in your Lambda function cost. Too much allocated memory and you’ll overpay. Too little and your function will be at risk of failing. Therefore, you want to keep a healthy balance when it comes to memory allocation.
The problem is, there’s no CloudWatch metric that tells you either memory usage or memory allocation.
The good news is, every time a Lambda function completes execution, it prints the following record:
REPORT Duration: {duration} ms Billed Duration: {billed duration} ms Memory Size: {allocated memory} MB Max Memory Used: {consumed memory} MB
That’s why I created a simple CloudWatch Logs Metric Filter that extracts “Memory Size” and “Max Memory Used” from the Lambda function execution logs.
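If you’re curious what such a filter can look like, here’s a boto3 sketch of one possible pattern (not necessarily the exact filter in my template). It assumes the standard REPORT line, which also includes a RequestId field that the excerpts above omit, so double-check it against your own log output:

```python
import boto3

logs = boto3.client('logs')

# Space-delimited pattern matching the standard REPORT line:
# REPORT RequestId: <id> Duration: <d> ms Billed Duration: <b> ms
#   Memory Size: <size> MB Max Memory Used: <used> MB
REPORT_PATTERN = (
    '[report="REPORT", request_id_label="RequestId:", request_id, '
    'duration_label="Duration:", duration, duration_unit="ms", '
    'billed_label1="Billed", billed_label2="Duration:", billed_duration, billed_unit="ms", '
    'size_label1="Memory", size_label2="Size:", memory_size, size_unit="MB", '
    'used_label1="Max", used_label2="Memory", used_label3="Used:", max_memory_used, used_unit="MB"]'
)

# CloudWatch Logs allows one metric transformation per filter, so repeat this
# with metricValue='$memory_size' to also publish the allocated memory.
logs.put_metric_filter(
    logGroupName='/aws/lambda/hello-staged-world',
    filterName='MaxMemoryUsedMB',
    filterPattern=REPORT_PATTERN,
    metricTransformations=[{
        'metricName': 'MaxMemoryUsedMB',
        'metricNamespace': 'Custom/Lambda',
        'metricValue': '$max_memory_used',
    }],
)
```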
All you have to do is launch the following CloudFormation stack:
In the CloudFormation console, specify the name of the function you want to monitor:
… and that’s it! In a few minutes you’ll be able to monitor your Lambda memory allocation. You can create an alarm that tells you when memory allocation is either too low (risk of failure) or too high (risk of over-paying).
In this example, I first had 448MB allocated to this function, which was too high given that the average used memory was 26MB. Therefore I decided to reduce the memory allocation to the lowest possible value of 128MB.
In addition, I set up a CloudWatch Alarm to notify me when my Used Memory is 70% or more, relative to my Memory Size.
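If you want something similar, here’s a boto3 sketch of a simple approximation: since you know how much memory is allocated, you can alarm on a static threshold equal to 70% of it. It assumes the metric filter above publishes a Custom/Lambda MaxMemoryUsedMB metric, and the SNS topic ARN is a placeholder:

```python
import boto3

cloudwatch = boto3.client('cloudwatch')

ALLOCATED_MB = 128  # the function's current memory allocation

cloudwatch.put_metric_alarm(
    AlarmName='hello-staged-world-memory-pressure',
    Namespace='Custom/Lambda',          # published by the metric filter above
    MetricName='MaxMemoryUsedMB',
    Statistic='Maximum',
    Period=300,
    EvaluationPeriods=3,
    Threshold=ALLOCATED_MB * 0.7,       # alert at 70% of allocated memory
    ComparisonOperator='GreaterThanOrEqualToThreshold',
    AlarmActions=['arn:aws:sns:us-east-1:123456789012:lambda-cost-alerts'],  # placeholder topic
)
```

The catch with a static threshold is that you have to update it whenever you change the function’s memory allocation.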
Tool #3 - Lambda Cost Optimization script
My final tool in this article is the Lambda Optimization script.
This script parses usage information from a function’s execution records in CloudWatch Logs. It finds log data for a given time window (e.g. the past 10 minutes), calculates pricing for different scenarios and tells you the potential savings.
Like I mentioned earlier, keep in mind that higher memory allocation can sometimes result in lower execution times.
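The full script is in the repo, but conceptually its core loop is roughly this (a simplified sketch with placeholder names, not the actual implementation):

```python
import re
import time
import boto3

logs = boto3.client('logs')

# Parse the relevant numbers out of each REPORT record.
REPORT_RE = re.compile(
    r'Billed Duration: (?P<billed_ms>\d+) ms\s+'
    r'Memory Size: (?P<size_mb>\d+) MB\s+'
    r'Max Memory Used: (?P<used_mb>\d+) MB'
)

def sample_report_records(log_group, minutes=10):
    """Collect the REPORT records emitted during the last N minutes."""
    end_ms = int(time.time() * 1000)
    start_ms = end_ms - minutes * 60 * 1000
    paginator = logs.get_paginator('filter_log_events')
    for page in paginator.paginate(logGroupName=log_group, filterPattern='REPORT',
                                   startTime=start_ms, endTime=end_ms):
        for event in page['events']:
            match = REPORT_RE.search(event['message'])
            if match:
                yield {k: int(v) for k, v in match.groupdict().items()}

def monthly_cost(execs_per_month, billed_ms, memory_mb):
    gb_seconds = execs_per_month * (billed_ms / 1000.0) * (memory_mb / 1024.0)
    return execs_per_month * 0.0000002 + gb_seconds * 0.00001667

# Project the 10-minute sample to a 30-day month (assumes at least one record).
records = list(sample_report_records('/aws/lambda/hello-staged-world'))
execs_per_month = len(records) / (10 * 60.0) * 2592000
avg_billed_ms = sum(r['billed_ms'] for r in records) / float(len(records))
avg_used_mb = sum(r['used_mb'] for r in records) / float(len(records))
current_mb = records[-1]['size_mb']

current_cost = monthly_cost(execs_per_month, avg_billed_ms, current_mb)
for candidate_mb in range(128, current_mb, 64):
    savings = current_cost - monthly_cost(execs_per_month, avg_billed_ms, candidate_mb)
    # memSizeMb, memUsedPct, monthly savings
    print(candidate_mb, round(100.0 * avg_used_mb / candidate_mb, 1), round(savings, 2))
```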
I generated some load against a Lambda function that had 1024MB of allocated memory. Here is some sample output:
OPTIMIZATION SUMMARY
**Data sample used for calculation:**
CloudWatch Log Group: [/aws/lambda/hello-staged-world]
First Event time:[2017-05-16 02:32:17 UTC]
Last Event time:[2017-05-16 02:42:27 UTC]
Number of executions:[1668]
Average executions per second:[2.7343]
**Usage for Lambda function [hello-staged-world] in the sample period is the following:**
Average duration per Lambda execution: 7581ms
Average consumed memory per execution: 27MB
Configured memory in your Lambda function: 1024MB
Memory utilization (used/allocated): 2.64%
Total projected cost: $899.16USD - MONTHLY
The following Lambda memory configurations could save you money (assuming constant execution time):
memSizeMb|memUsedPct|cost|timePeriod|savingsAmt
---| ---| ---| ---| ---
128| 21.0|113.35| MONTHLY| 785.81
192| 14.0|169.32| MONTHLY| 729.84
256| 10.0|225.29| MONTHLY| 673.87
320| 8.0|281.26| MONTHLY| 617.9
384| 7.0|337.23| MONTHLY| 561.93
448| 6.0|393.2| MONTHLY| 505.96
512| 5.0|449.17| MONTHLY| 449.99
576| 4.0|505.13| MONTHLY| 394.03
640| 4.0|561.1| MONTHLY| 338.06
704| 3.0|617.07| MONTHLY| 282.09
768| 3.0|673.04| MONTHLY| 226.12
832| 3.0|729.01| MONTHLY| 170.15
896| 3.0|784.98| MONTHLY| 114.18
960| 2.0|840.95| MONTHLY| 58.21
Can you make your function execute faster? The following Lambda execution durations will save you money:
durationMs|cost|timePeriod|savingsAmt
---| ---| ---| ---
7500|887.35| MONTHLY| 11.81
7400|875.53| MONTHLY| 23.63
7300|863.72| MONTHLY| 35.44
7200|851.91| MONTHLY| 47.25
7100|840.1| MONTHLY| 59.06
7000|828.29| MONTHLY| 70.87
6900|816.47| MONTHLY| 82.69
6800|804.66| MONTHLY| 94.5
6700|792.85| MONTHLY| 106.31
6600|781.04| MONTHLY| 118.12
6500|769.22| MONTHLY| 129.94
6400|757.41| MONTHLY| 141.75
6300|745.6| MONTHLY| 153.56
6200|733.79| MONTHLY| 165.37
6100|721.97| MONTHLY| 177.19
6000|710.16| MONTHLY| 189.0
5900|698.35| MONTHLY| 200.81
5800|686.54| MONTHLY| 212.62
5700|674.72| MONTHLY| 224.44
5600|662.91| MONTHLY| 236.25
5500|651.1| MONTHLY| 248.06
5400|639.29| MONTHLY| 259.87
5300|627.47| MONTHLY| 271.69
5200|615.66| MONTHLY| 283.5
5100|603.85| MONTHLY| 295.31
5000|592.04| MONTHLY| 307.12
4900|580.22| MONTHLY| 318.94
4800|568.41| MONTHLY| 330.75
4700|556.6| MONTHLY| 342.56
4600|544.79| MONTHLY| 354.37
...
The script is telling me that at the current usage, this function will cost me $899 USD per month, but if I lower the allocated memory to 128MB I could save $785 USD per month. Since my average usage is 27MB, I would be using 21% of my allocated memory, which is a very safe range.
Then we have execution time. The script is telling me that if I reduce execution time to 6600ms (a 1 second optimization), I could save $118 USD per month. If I reduce it by 3 seconds to 4600ms, then I could save $354 per month.
If you want to give it a try, you can find detailed execution instructions in the GitHub repo.
Conclusion
Getting started with Lambda is easy - you don’t have to provision any infrastructure and it’s very cheap to have something useful up and running. This is extremely powerful. But it’s also one of Lambda’s biggest risks. It’s very easy to start with inefficiencies that go unnoticed. That is, until your application handles some real volume and then those inefficiencies turn into very expensive situations.
That’s why it’s very important to keep track of Lambda cost and usage before they become a problem.
I hope you found these tools useful. If you want to know more about keeping your Lambda cost down, don’t hesitate to schedule a free 30-minute consultation or contact me using the form below. I’ll be happy to help!