WHAT IS AN AWS COLD START, AND HOW IS IT HANDLED? EXPLAINED WITH A REAL-LIFE EXAMPLE
Oh, it’s Monday again. It was a fun weekend, and every Monday is a continuation of the Sunday that was celebrated. My favorite star’s movie from yesterday is still rolling around in my thoughts. Where is my laptop charger? Who kept it under the table? Where else would I have put it, heh heh… When did Friday end? These early Monday hours are going to be followed by a lazy, crazy workday.
Still, why do organizations provide weekends? Is it their fault?
You are totally wrong, bro! The weekend is your organization’s technique to restart and refresh your mind after stressful, pressurized working days.
It’s the same with cold starts in AWS. If you mistook the cold start for a flaw in AWS, you are wrong: it is a feature. When this is said, two types of people may frown:
1. Those who do not know what a cold start is.
2. Those who have had to listen to a client yelling about AWS’s lazy, cold starts.
Let’s first consider the first group: what is a cold start?
Cold start
_____________
A cold start is a feature of AWS serverless applications. To understand it, it is necessary to first understand what serverless applications are.
Serverless is a way for developers to run code without worrying about managing servers.
In other words, before serverless, in order to run a service you would buy a system from a service provider, configure it according to your needs, and deploy it. One of the main issues with this approach was application scaling. Suppose you built a system that can handle 1,000 requests at a time, but due to a sudden increase in demand it received 5,000 requests.
Your system would go down and your application would be unable to provide its service properly. Similarly, if your system received no more than 100 requests for a whole month, you would still have to pay the service provider for the 1,000-request system you had configured.
This is where serverless applications come into play.
We just upload the code, its dependencies, and the related services to the cloud, and scaling and server management are handled by the provider itself. In the case of AWS, Lambda functions, API Gateway, and other related services are used for this.
These are the two important advantages of using a serverless application:
1. Easy scaling
2. Pay-per-use
Now let’s get to the cold start. AWS saves the uploaded code to its S3 storage, downloads it into containers, and triggers and runs the Lambda function according to user requests.
Thus AWS provides the smoothest service when requests are constantly coming in and the container is up. But suppose no request reaches your application for 10–15 minutes while your container sits ready to handle one.
Assume the container waits 15 minutes, 30 minutes, 1 hour, 1 day, 1 week, and still no request arrives!
Is it a good idea to keep the container up all that time? Even if it works, is it efficient?
You might look at your application alone and say yes, but that is a big mistake from the point of view of AWS, which handles hundreds of thousands of functions (containers) per second. While AWS needs those containers to serve hundreds of thousands of requests every second, yours sits idle waiting. To avoid this, if an application stays idle for more than a minimum time, AWS releases its containers and associated resources and moves the application’s code back to S3 storage. Now suppose a new request arrives for the idle application. What must AWS do to handle it?
1. Find a container that is free
2. Download the code from S3 into it
3. Download and install the associated dependencies and services
4. Run the handler and return the response
Unlike the warm scenario, AWS now spends extra time finding the container, downloading the code, and setting up the environment.
We call this a cold start.
Thanks to this behaviour, containers are only up when we need them and we pay only for the time of use. So although the cold start is a feature of AWS, end users may feel a lag.
Now let us consider the second group,
Since the cold start is a feature of AWS (like an organization’s weekend), there is no mechanism to handle it fully. But we can mitigate it to some extent by reducing the cold start time.
(I am using Node.js for the example code here. Why? My hero is Node.js, which survives with a single thread, hard work, and persistence among geniuses with more than enough threads.)
1. Warm Functions
_______________
If you go to Google or YouTube to find ways to overcome AWS cold starts, most of the solutions you will find relate to warm functions.
Before discussing warm functions, let me start with a way to overcome a lazy Monday morning.
That is: you work on Saturday and Sunday too, so the laptop charger is never lost and you always remember where you left off… So, isn’t that okay?
“Hey brother, are you crazy?”
A similarly crazy thing happens with warm functions.
You keep pinging your service using a warm (ping) function; as long as you do, the application never goes idle and the container keeps running. For this, the function is invoked at fixed intervals using a cron-style schedule, such as a CloudWatch Events/EventBridge rule.
In the case of Node.js, packages such as:
Lambda warmer: lambda-warmer – npm
Serverless-plugin-warmup: https://www.npmjs.com/package/serverless-plugin-warmup
can be used, or a custom warm-up function can be written.
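A minimal hand-rolled version might look like this. It assumes a scheduled rule pings the function every few minutes with a payload like `{ "warmer": true }` — the `warmer` field name is just a convention for this sketch, not an AWS one:

```javascript
// Hypothetical Lambda handler with a warm-up short circuit.
// Assumes a scheduled EventBridge rule invokes it with { "warmer": true }
// every ~5 minutes; the "warmer" field is our own convention.
const handler = async (event) => {
  // Warm-up ping: return immediately so the ping costs almost nothing
  if (event && event.warmer) {
    return { statusCode: 200, body: "warmed" };
  }

  // ...normal business logic would go here...
  return {
    statusCode: 200,
    body: JSON.stringify({ message: "handled a real request" }),
  };
};

exports.handler = handler;
```

Packages like lambda-warmer wrap this same idea and also handle keeping several concurrent containers warm.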
The main problem with warm functions is that they undermine the serverless concept. One of the main advantages of serverless is paying only for usage, yet here you are constantly invoking, and paying for, the application and its functions just to avoid the cold start.
2. Increase memory
________________
How do you easily find your forgotten laptop charger on a cold Monday morning? One way to easily find anything you’ve forgotten is to increase your memory power. How?
“I don’t have any suggestions bro… plz search it on youtube it will show you more than a hundred solutions” (Company is not responsible for the result of such videos)
But in the case of AWS, it is very easy!
Increase the memory in template.yml, or increase it directly in the AWS console.
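In an AWS SAM template.yml, for example, the change is a single property (the function name and runtime below are placeholders):

```yaml
Resources:
  MyFunction:                 # placeholder function name
    Type: AWS::Serverless::Function
    Properties:
      Handler: index.handler
      Runtime: nodejs18.x
      MemorySize: 1024        # raised from the 128 MB default
```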
How does this reduce cold starts? In Lambda, memory allocation is directly tied to CPU allocation, so providing more memory gives the function more CPU and makes initialization faster.
This results in a reduced cold start time.
Now you may suspect that the cost of the service will increase as the memory increases. But this often turns out to be wrong when you examine it with respect to execution time: with more memory the function finishes faster, so the price per invocation can stay roughly the same.
(A price-versus-memory comparison for 1,000 Lambda invocations appeared here as a table.)
For more information about pricing, please refer to: https://docs.aws.amazon.com/lambda/latest/operatorguide/computing-power.html
3. Optimize your service
____________________
How do you clear your mind of cluttered thoughts? The easiest way is to avoid unnecessary ones. After watching my favorite hero in a movie theater, the only way to deal with the emotional impact is to leave the theater and return to real life.
In the case of AWS, this works in several ways.
a. Optimize Code With Neat Code:
__________________________
Writing code cleanly, avoiding unnecessary loops and leftover comments, reduces the size of the bundle and the run time, which helps the function load quickly and reduces the cold start.
For example, if an array operation can be done with a chain of map, filter, and reduce, a single forEach (or plain loop) pass is often cheaper, because it avoids allocating intermediate arrays.
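As a small sketch of that point, here is the same transformation written both ways (the speed difference only matters for large arrays, but the single pass skips the intermediate array):

```javascript
const nums = [1, 2, 3, 4, 5];

// Chained version: map allocates an intermediate array before filter runs
const chained = nums.map((n) => n * 2).filter((n) => n > 4);

// Single-pass version: one forEach, no intermediate array
const single = [];
nums.forEach((n) => {
  const doubled = n * 2;
  if (doubled > 4) single.push(doubled);
});

// both produce [6, 8, 10]
```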
b. Accurate Import in Lambda Functions
______________________________
The first thing is to avoid unnecessary imports, since the time taken to load a function’s dependencies contributes to the cold start. If you import a module you never use, you still pay the time to load it.
The second thing to note is to import correctly: rather than importing a whole library, import only the functions you need.
c. Minifying Your Code:
______________________
Just like optimizing front-end code with webpack, minifying back-end code using similar libraries helps reduce cold starts. Things like database connection configuration should be handled carefully while optimizing this way.
In a Node.js program, minification means reducing the size of the source code by removing unnecessary characters like whitespace, comments, and line breaks, while preserving its functionality. This can be done with a tool like UglifyJS, a popular JavaScript minifier.
Here is an example of minifying a simple Node.js program using UglifyJS.
In index.js you have a program for adding two numbers:
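For example, index.js might contain something like this (a toy Lambda-style handler, written only to have something to minify):

```javascript
// index.js — a small program that adds two numbers
function add(a, b) {
  return a + b;
}

const handler = async (event) => {
  const result = add(event.a, event.b);
  return { statusCode: 200, body: JSON.stringify({ result }) };
};

exports.handler = handler;
```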
First, install UglifyJS using npm by running the following command in your terminal:
npm install uglify-js -g
To minify this code, run the following command in your terminal:
uglifyjs index.js -o index.min.js -c -m
Then the resulting “index.min.js” file will look something like this:
4. Provisioned Concurrency
__________________________
Why does the laptop charger go under the table?
Would this problem be solved by providing a proper place to store the laptop and charger?
The same is the case with provisioned concurrency.
We prepare execution environments (containers) in advance so they are ready to run; environments reserved this way are called provisioned concurrency.
Provisioned concurrency is the most important way AWS recommends to reduce cold starts, since it removes the warm-up time for the reserved containers.
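In a SAM template, for example, provisioned concurrency is attached to a published version or alias — the handler, runtime, and count below are placeholders:

```yaml
Resources:
  UserCrudFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: index.handler
      Runtime: nodejs18.x
      AutoPublishAlias: live         # provisioned concurrency needs a version/alias
      ProvisionedConcurrencyConfig:
        ProvisionedConcurrentExecutions: 5
```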
Comparing X-Ray service maps before and after (screenshots omitted here): the warm-up time of userCrudFunction is reduced from 1.2 sec to 238 msec after adding provisioned concurrency.
A point to note when adding provisioned concurrency is that you pay for the reserved containers even while they are idle.
Another important disadvantage: suppose we set a provisioned concurrency of 5 for a project, and 6 requests arrive in parallel after an idle period. While five requests are handled by the provisioned containers, the sixth request will still encounter a cold start.
5. Other Ways
____________
Constantly tracking and optimizing applications using tools like AWS X-Ray and CloudWatch also helps reduce cold starts.
Conclusion
_______________
The cold start is one of the phenomena of AWS serverless, explained here through a weekend analogy.
From my research, I understand that we can’t remove cold starts completely, but we can minimize the start time with the help of warm functions (not recommended), increased memory, optimized code, provisioned concurrency, etc.