Platform9 Blog

The Gorilla Guide to Serverless on Kubernetes – Chapter 1: The Serverless Revolution

This is an excerpt from The Gorilla Guide to Serverless on Kubernetes, written by Joep Piscaer. You can download the full guide here.


Serverless computing is a code execution model that abstracts away all the infrastructural plumbing underneath the code, allowing the developer to focus solely on their code. A serverless application is run by a platform that hides the implementation details from the user. These applications are made up of independent smaller services, many of which are event-driven, short-lived, and stateless.

Let’s get some terms defined for consistency. Functions-as-a-Service (FaaS) is a technology for serverless computation. In a FaaS system, the unit of execution is a function of code written in a programming language (most FaaS systems support a wide range of languages). A developer specifies one or more functions and the conditions (events) under which those functions shall execute. Since it is serverless, the FaaS system automatically provisions resources to host and execute the functions when the specified conditions are met, and later tears them down when no longer needed. Since FaaS is the most widely used form of serverless, those terms will be used interchangeably.
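The execution model described above can be sketched in a few lines. This is a minimal, platform-agnostic illustration, not any specific provider's API: the handler name, event shape, and simulated invocation are all assumptions made for the example.

```python
# A minimal FaaS-style function: the platform supplies the event, and the
# developer writes only the handler body. The handler name, event shape,
# and the simulated invocation below are illustrative assumptions, not
# any specific provider's API.

def handle(event, context=None):
    """Respond to an 'order.created' event by totaling the order in cents."""
    items = event.get("items", [])
    total = sum(item["price"] * item["qty"] for item in items)  # prices in cents
    return {"status": "ok", "order_id": event.get("order_id"), "total": total}

# Simulated invocation: in a real FaaS system the platform provisions a
# runtime when the trigger condition is met, calls the handler, and tears
# the runtime down again when it is no longer needed.
event = {"order_id": "42", "items": [{"price": 999, "qty": 2}]}
print(handle(event))  # {'status': 'ok', 'order_id': '42', 'total': 1998}
```

Everything outside the handler body — provisioning, scaling, teardown — is the platform's job, which is exactly the abstraction serverless promises.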

Serverless is a level of abstraction of, and a decoupling from, the underlying infrastructure constructs. In a microservice architecture, monolithic applications are broken up into small services that can be developed, deployed, and scaled individually. Serverless architectures are at the extreme end of the microservice spectrum, being even more fine-grained and loosely coupled. Serverless functions complement more traditional microservice and virtual machine (VM)-based approaches and regular third-party cloud services for event queueing, messaging, databases, and more.

It's worth noting that a key characteristic of serverless is its statelessness. Functions are invoked from a clean state every time; any state that must persist across invocations has to be stored externally. This mirrors the twelve-factor app methodology for building Software-as-a-Service (SaaS) apps. In practice, functions typically use a database, a distributed cache, or an object store to hold state across requests.
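A small sketch makes the stateless pattern concrete. The in-memory `KeyValueStore` below is a stand-in for a real external store such as Redis or DynamoDB, used here purely so the example is self-contained; the function itself keeps no state between invocations.

```python
# Because serverless functions start from a clean slate on every
# invocation, any state that must survive across requests lives in an
# external store (database, cache, or object store). The dict-backed
# class below stands in for that store purely for illustration.

class KeyValueStore:
    """Stand-in for an external store such as Redis or DynamoDB."""
    def __init__(self):
        self._data = {}

    def incr(self, key):
        self._data[key] = self._data.get(key, 0) + 1
        return self._data[key]

def count_visit(event, store):
    # The function holds no state of its own; on each invocation it reads
    # and writes the counter through the external store.
    return {"page": event["page"], "visits": store.incr(event["page"])}

store = KeyValueStore()
count_visit({"page": "/home"}, store)            # first invocation
result = count_visit({"page": "/home"}, store)   # fresh invocation, shared store
print(result)  # {'page': '/home', 'visits': 2}
```

Because the counter lives outside the function, any replica of the function — spun up or torn down at any time — observes the same state.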

With serverless, the organization or person writing the code doesn’t have to care about the infrastructure underneath. As such, it’s a form of utility computing. The serverless architecture is a boon for developers, allowing them to focus on just the code rather than all the surrounding plumbing like containers, deployment scripts, and monitoring. As is always the case, though, there are trade-offs in language support, code compatibility, performance, and cost.

Another way to understand FaaS is to compare it to the popular ‘IFTTT’ (‘If This, Then That’) web service, which lets you glue together devices, web apps, and more using Triggers (this) and Actions (that). IFTTT is popular in home automation scenarios, creating interactions between the environment (the weather, the time of day) and IoT devices like thermostats, video doorbells, and lighting systems.

Serverless serves similar “glue between services” use cases, and was first popularized in mobile app development to stitch together databases, authentication, and other commodity services that make up the backend of an app. It continues to be used in similar ways today, building IoT back-ends, APIs, and data processing pipelines.

Not only do they serve similar use cases; their architectures are alike as well, with triggers (this) and functions (that). We’ll dive into the specifics of the serverless architecture later.
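The trigger-to-function shape can be shown in miniature. The event names and handlers below are invented for illustration; real platforms register these bindings declaratively rather than in code, but the routing idea is the same.

```python
# A toy dispatcher illustrating the trigger ("this") to function ("that")
# shape shared by IFTTT and FaaS platforms. The event names and handlers
# are made up for illustration.

def notify_on_signup(event):
    return f"welcome email queued for {event['email']}"

def archive_upload(event):
    return f"stored {event['file']} in the archive bucket"

# Trigger -> function bindings: the platform's routing table, in miniature.
bindings = {
    "user.signup": notify_on_signup,
    "file.uploaded": archive_upload,
}

def dispatch(event_type, event):
    # When a trigger fires, the platform looks up and invokes the bound function.
    return bindings[event_type](event)

print(dispatch("user.signup", {"email": "ada@example.com"}))
```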

Serverless Greases the DevOps Wheel

Although at first glance serverless may seem counterintuitive in DevOps-culture organizations, where development teams run their own operations, it really isn’t. Serverless decouples the runtime (language-specific environments, containers, operating systems, VMs, physical hardware, networks, storage, and so on) and the developer’s pipeline tooling (for building, testing, and deploying code) from the actual code running in production, minimizing the operational part of the development workflow. It frees the coder from any concern about the plumbing.

This diagram more clearly shows the relationship between serverless and the underlying architecture and how this is different from other service deployment models.

Comparing service architectures

This isn’t to say that operations aren’t done anymore, but rather that it’s abstracted away from the developer. The function still has to be monitored, deployed, secured, supported, scaled, and debugged; these things are still happening, but they’re merely packaged up as part of the service or platform.

While cost benefits are often cited as the major reason for using FaaS, the reduction in lead time may be the most exciting improvement. Instead of losing time to pipeline inefficiencies, product development teams can spend more of it on continuous experimentation, which leads to more innovation and greater market advantage.

Waste Not

Looking at this from the lean software development perspective, most steps in the developer pipeline are ‘waste.’ Waste, in this context, refers to technically necessary steps, like compiling or packaging, that provide no direct value to the customer. Even steps that raise code quality, like writing unit tests or deployment specifications, deliver little customer value in themselves. Reducing variation and other waste in the pipeline therefore means better customer value, delivered more quickly.

The parallels to serverless computing are obvious: serverless removes a tremendous amount of waste from the developer pipeline by abstracting and standardizing it. And, as any developer can tell you, shortening the path from writing a line of code to putting it in production is a major advantage. Any optimization of the compilation, testing, and packaging portions of the pipeline seriously enhances developer speed and efficiency. It also creates short, specific feedback cycles, which improves code quality, since the developer doesn’t have to switch context between the different features they might be working on.

In upcoming posts we’ll dive deeper into the serverless landscape – on the public clouds and on-premises – as well as key benefits, challenges, and use cases for real-world applications.


Can’t wait?

To learn more about Serverless, download the Gorilla Guide: Serverless on Kubernetes.

Vamsi Chemitiganti
Chief Strategist at Platform9 Systems. Vamsi works with Platform9's client CXOs and architects to help them with key business transformation initiatives. In previous roles, Vamsi was the CTO for RiskCounts, a FinTech based in NYC. Prior to that, he spent eight years as the Chief Architect for Red Hat’s Global Financial Services vertical, based out of NYC. Vamsi also spent two years as the General Manager (Financial Services) at Hortonworks. In both roles, Vamsi was responsible for driving Red Hat and Hortonworks technology vision from a client business standpoint. The clients Vamsi engages with on a daily basis span marquee financial services names across major banking centers on Wall Street and in Toronto, London, and Asia. These include businesses in capital markets, retail banking, wealth management, and IT operations. Vamsi holds a BS in Computer Science and Engineering as well as an MBA from the University of Maryland, College Park. He is also a regular speaker at industry events on topics ranging from cloud computing and big data to AI, high-performance computing, and enterprise middleware. In 2013, Wall Street and Technology Magazine identified Vamsi as a Global Thought Leader. Vamsi writes weekly on the financial services business and technology landscape at his highly influential blog – http://www.vamsitalkstech.com
