Compare microservices deployment patterns for best strategy
To solve pesky deployment woes, match your app to a microservices deployment pattern, like single service instance per host, multiple service instances per host or even serverless.
In a monolithic application, all -- or nearly all -- components are developed, deployed and scaled as a single unit. That single-unit deployment is straightforward, but changes to such an application can be risky. Conversely, microservices are flexible; they scale and update independently. However, when you deploy microservices, you'll often end up with many interdependent services written in different languages and frameworks, which means resource requirements can differ significantly from service to service.
There are several microservices deployment patterns available -- running either single or multiple service instances per host -- that tackle these problems. Some service instances can share a host, either as individual processes or as multiple instances within the same process. Others run one service instance per container or VM.
Serverless deployment is a viable approach for microservices, as well, typically via a cloud provider's offering. Let's evaluate these microservices deployment patterns, including their benefits and drawbacks.
Multiple service instances per host
In the multiple-service-instances-per-host pattern, one or more physical or virtual hosts are provisioned, and then multiple services are executed on each of the hosts. This reflects the traditional approach to deploying applications, and there are two variants of this pattern.
In one variant, each service instance runs as its own process. For example, you might deploy a .NET service instance as a web application on Internet Information Services (IIS). In the other variant, more than one service instance executes in the same process -- for example, several .NET web applications running inside the same IIS process.
In this pattern, several services typically share the server and OS. Efficient resource usage is the biggest benefit, along with straightforward deployment. Fast startup is another advantage: with little per-instance overhead, a service instance starts quickly.
A major drawback of this pattern is that service instances are isolated only if each runs as a separate process. Another disadvantage is that, when several service instances are deployed in the same process, it becomes difficult to determine and monitor the resource consumption of each instance.
Single service instance per host
In the single-service-instance-per-host pattern, each microservice runs isolated on its own host. This pattern has two variants: service instance per VM and service instance per container.
In the service-instance-per-VM pattern, each microservice is packaged as a VM image, and each service instance runs as a separate VM. This pattern makes it easy to scale the service -- all you have to do is increase the number of VM instances. Netflix, for example, uses this pattern to deploy its video streaming service.
The biggest benefit of this pattern is that each service instance runs in isolation. Another benefit is that you can take advantage of cloud infrastructure features such as autoscaling and load balancing. The VM image also encapsulates the service's implementation details and technical intricacies.
But one primary downside of this pattern is that resource utilization is relatively inefficient. VM images are slow to build and instantiate, so deploying a new version of a service takes considerable time. This pattern also demands a fair amount of IT operations work to build and manage an ever-changing fleet of VMs at scale.
The service-instance-per-container pattern is lightweight yet retains many of the isolation benefits that VMs provide, making it an attractive alternative to the service-instance-per-VM pattern. In this pattern, each microservice instance runs in its own container: the service is packaged as a container image and deployed as a container instance.
A container is a portable, resource-controlled operating environment virtualized at the OS or application layer, rather than at the hardware layer where VMs operate. It is an environment in which one or more processes can execute at the same time. Popular container technologies include Docker, Linux Containers and Solaris Zones.
Like VMs, containers encapsulate the technology used to implement a service, but they are far more lightweight. A container image is a stand-alone, executable piece of software that includes everything needed to run the service: code, runtime environment and dependencies. Because these are all packaged together, the chances of environment-related failures are minimal. Containers also start and scale quickly and require far fewer resources than VMs.
The service-instance-per-container pattern promotes easy deployment and scalability, and the service instances are isolated from one another. This pattern enables quick container image builds and eases container management.
But despite these benefits, there are certain downsides to this pattern. Even though container technology is rapidly evolving, containers aren't as mature as VMs. Container security also differs from VM security: because containers share the host's OS kernel, IT organizations must learn new security approaches.
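To make the "code, runtime and dependencies packaged together" point concrete, here is a hedged sketch of a container image definition for a hypothetical Python service; the base image tag, file names and port are illustrative assumptions, not taken from the article.

```dockerfile
# Illustrative Dockerfile for one microservice image.
# Base image tag, file names and port are hypothetical examples.
FROM python:3.11-slim

WORKDIR /app

# Dependencies and code are baked into the image, so every container
# started from it carries an identical runtime environment.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY service.py .

EXPOSE 8000
CMD ["python", "service.py"]
```

Each `docker run` of the built image then yields one isolated service instance, which is exactly the service-instance-per-container pattern described above.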
Serverless deployment
Serverless deployment is another strategy to deploy a microservice-based application. In a serverless model, the underlying infrastructure is abstracted away, and the cloud provider handles resource provisioning for you. However, serverless deployment often fits only niche use cases, whereas containers and VMs serve more flexible purposes.
If you use AWS to deploy a microservice, you can package the microservice as a zip file and upload it to AWS Lambda. On Azure, you would use Azure Functions, and on Google Cloud, Google Cloud Functions. Plenty of tutorials exist to help you deploy a microservice in each of these serverless environments.
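To show what the code inside such a zip file looks like, here is a minimal AWS Lambda-style handler in Python. Lambda invokes a function with the signature `handler(event, context)`; the "greeting" service and its event fields are hypothetical examples.

```python
# handler.py -- a minimal, hypothetical Lambda handler for a microservice.
# Lambda calls handler(event, context); 'event' carries the request payload.
import json

def handler(event, context):
    # A hypothetical greeting microservice: read a name from the event
    # and return an API Gateway-style response object.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

The zip file you upload simply contains this module plus any dependencies; the provider handles provisioning, scaling and invocation.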