
How Does Serverless Computing Work with Containers and Microservices?

By Hubert Yoshida posted 04-24-2018 00:00


According to Wikipedia, serverless computing is a cloud computing model in which the cloud service provider dynamically manages the allocation of machine resources. Serverless computing still requires servers. The name "serverless computing" is used because the server management and capacity planning decisions are completely hidden from the developer or operator. Serverless code can be used in conjunction with code deployed in traditional styles, such as microservices, and run in containers. The drawing below illustrates what the cloud service provider manages and what the developer or operator provides with serverless computing, compared to Infrastructure as a Service.

[Figure: serverless computing vs. Infrastructure as a Service, showing the provider-managed and user-managed layers]

Serverless computing is provided by a cloud service such as AWS Lambda. To use it, you write code (in C#, Java, Node.js, or Python), set a few simple configuration parameters, and upload everything (along with the required dependencies) to Lambda. This package is what Lambda calls a function, which can be triggered automatically by other AWS services or called directly from a web or mobile app. Lambda takes care of everything else: all you do is provide the code, and Lambda deploys the function in a container and provides everything required to run and scale it with high availability. Lambda persists the container until the function has done its job, then tears it down. In this way, serverless computing is built on containers. Another way to describe serverless computing is Function as a Service, or FaaS. AWS Lambda was introduced in 2014, and since then other cloud providers have rushed to offer similar capabilities.
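
To make that concrete, here is a minimal sketch of what such a function can look like in Python; the handler and field names are illustrative, not taken from any particular application.

```python
# Minimal sketch of a Python Lambda "function" (handler and field names are illustrative).
# Lambda invokes the handler with the triggering event and a runtime context object;
# the containers, scaling, and availability behind it are managed by the provider.
import json


def lambda_handler(event, context):
    # 'event' carries the trigger payload, e.g. an API Gateway request or an S3 notification
    name = event.get("name", "world")

    # Return a simple JSON response; for an HTTP trigger this shape maps to the reply
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}"}),
    }
```

You zip this handler together with its dependencies, upload the package, and wire it to a trigger; there is no server, container, or operating system for you to manage.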

The benefits, according to AWS, include no servers to manage; continuous scaling precisely to the size of the workload, by running code in parallel in response to individual triggers; and sub-second metering, where you are charged for every 100ms that the code executes and pay nothing when your code is not running. Serverless computing is inexpensive. It uses containers but does not require you to deploy and manage them. It is low maintenance, since you do not need to provision containers, set system policies and availability levels, or handle any backend server tasks. The standardized programming environment and the lack of server and container overhead mean that you can focus on writing code.
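
As a rough illustration of that metering model, the sketch below rounds a function's run time up to the next 100ms increment and multiplies by memory and invocation count; the per-GB-second rate is a placeholder, not current AWS pricing.

```python
# Hedged sketch of the 100 ms metering model described above.
# The rate below is a placeholder for illustration, not actual AWS pricing.
import math

HYPOTHETICAL_RATE_PER_GB_SECOND = 0.0000166  # illustrative only


def estimate_charge(duration_ms: float, memory_mb: int, invocations: int) -> float:
    # Billed duration is rounded up to the next 100 ms increment
    billed_ms = math.ceil(duration_ms / 100) * 100
    gb_seconds = (memory_mb / 1024) * (billed_ms / 1000) * invocations
    return gb_seconds * HYPOTHETICAL_RATE_PER_GB_SECOND


# Example: a 128 MB function that runs for 230 ms, invoked one million times
print(estimate_charge(230, 128, 1_000_000))
```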

Serverless computing has some very definite limits. You are bound by the implementation constraints of the cloud service provider. For example, Lambda has built-in restrictions on package size, memory use, and the time available for a function to run. There is also a limited list of natively supported programming languages, and it is important to keep functions small, since a few high-demand functions can overload the platform or lock everyone else out. Serverless computing runs in a multi-tenant environment, so there is always exposure to speed and response-time variations and outages caused by the demands or bad behavior of other tenants. Monitoring, debugging, and performance analysis may also be restricted by the lack of visibility into the backend services that Lambda provides. And since your software is hardwired into the provider's (Lambda's) interfaces, there is vendor lock-in; if you Google "serverless computing" and "vendor lock-in" you will find plenty of arguments, pro and con.
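
The memory and timeout constraints show up directly when you register a function. Below is a hedged sketch using the boto3 SDK; the function name, role ARN, and package file are placeholders.

```python
# Hedged sketch of registering a Lambda function with boto3, showing where the
# memory and timeout knobs live. The name, role ARN, and zip file are placeholders.
import boto3

client = boto3.client("lambda")

with open("function.zip", "rb") as f:
    package = f.read()

client.create_function(
    FunctionName="anomaly-filter",                       # placeholder name
    Runtime="python3.6",
    Role="arn:aws:iam::123456789012:role/lambda-exec",   # placeholder ARN
    Handler="handler.lambda_handler",
    Code={"ZipFile": package},
    MemorySize=256,   # MB, capped by the provider's per-function limit
    Timeout=60,       # seconds; functions that run past this are terminated
)
```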

So, what are the use cases for serverless computing? The best candidates are functions that are short lived, small, and do not run for lengthy periods. Functions that fit this model include real-time analytics triggered by anomalies in a data stream; ETL that performs data validation, filtering, sorting, and other transformations before loading the transformed data into another data store; and the backend for an IoT application, where a sensor detects the need for a spare part and a function automatically places the order. Here are a few use cases that AWS Lambda proposes (a minimal sketch of the stream-filtering case follows the figure):

[Figure: AWS Lambda use cases]
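
For illustration, here is a minimal sketch of the stream-filtering/ETL case: a function triggered by a batch of Kinesis records that validates and transforms them before handing them off. The field names and validation rule are assumptions, and the load into the target store is left as a comment.

```python
# Minimal sketch of the ETL/stream-filtering use case: triggered by a batch of
# stream records, validate and transform them, then pass the cleaned batch onward.
# Field names and the validation rule are illustrative assumptions.
import base64
import json


def lambda_handler(event, context):
    cleaned = []
    for record in event.get("Records", []):
        # Kinesis delivers record payloads base64-encoded
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))

        # Simple validation/filtering: keep only readings within an expected range
        if 0 <= payload.get("temperature", -1) <= 150:
            payload["temperature_f"] = payload["temperature"] * 9 / 5 + 32
            cleaned.append(payload)

    # In a real pipeline the cleaned batch would be loaded into the target data store here
    return {"accepted": len(cleaned)}
```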

In some ways serverless computing is the next abstraction beyond containers. A container gives the user more control and portability, but that also comes with more administration. The main benefit of a container is that it packages an entire runtime environment: the application, plus all its dependencies, libraries, other binaries, and the configuration files needed to run it, are bundled into one package that runs reliably when moved from one computing environment to another. Containers let developers deploy, replicate, move, and back up a workload even more quickly and easily than they can with virtual machines. A container-based application can be as large and complex as you need it to be. It would be easier to redesign a monolithic application into container-based microservices than to redesign it around serverless computing, which would run into multiple bottlenecks from the size and memory constraints.

Serverless computing is also compared with microservices. Microservices are a change in architecture where a single monolithic application is broken down into a set of self-contained small services running on their own machines (or instances). They use lightweight mechanisms such as REST interfaces for communication into each service. Microservices can be reused across applications, eliminating the duplication of effort when the same service is required by different applications. Microservices carry an operational overhead that serverless computing does not: they require an underlying operating system, which must be deployed and monitored for availability, along with application deployment and configuration overhead and ongoing support and maintenance. With serverless computing you leave all of that to the cloud provider and pay only for the time of use, in 100ms increments. On the other hand, the advantage of microservices with containers is full control of the environment, while with serverless computing you are limited to what the cloud service provider enables for you.
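
For contrast, here is a minimal sketch of the same kind of "hello" logic exposed as a standalone REST microservice, using Flask purely for illustration. Unlike the Lambda version earlier, you own the process, the port, the operating system it runs on, and all of its monitoring and scaling.

```python
# Minimal sketch of a REST microservice (Flask used purely for illustration).
# Unlike a serverless function, you operate the process, host, and OS yourself.
from flask import Flask, jsonify, request

app = Flask(__name__)


@app.route("/hello")
def hello():
    name = request.args.get("name", "world")
    return jsonify(message=f"Hello, {name}")


if __name__ == "__main__":
    # You choose, and must operate, the host, port, scaling, and availability story
    app.run(host="0.0.0.0", port=8080)
```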

Serverless computing, microservices, and containers are not competing systems; they are complementary. Serverless computing is another computing model that should be considered to increase agility and efficiency in code development and application deployment.


#Hu'sPlace
#Blog