Microservices is an approach to software development that has seen a rising tide of interest over the past decade or so, going hand in hand with other trends such as cloud-native computing, agile development and, most notably, the use of containers as a vehicle for deploying software components.

Adoption of microservices has been increasing over the past several years. A survey of more than 1,500 organisations carried out by O’Reilly in 2020 found that only about a quarter were not using microservices at all. Of the roughly 75% that were, only about 10% had been using them for more than five years, which means the majority took the plunge during the past few years.

Microservices is not a specific technology, but instead is a style of software architecture and an approach to designing applications and services. Instead of creating an application as a single monolithic entity, the microservices approach is to break down the required functionality into a collection of smaller, independently deployable services that communicate with each other.
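
To make the idea concrete, the sketch below shows roughly what one such component might look like: a hypothetical “orders” service that does a single job and exposes it over HTTP, small enough to be built, deployed and scaled on its own. The service name, endpoint and data are purely illustrative, not drawn from any particular product.

```python
# A minimal sketch of a single microservice: a hypothetical "orders" service
# exposing one HTTP endpoint. Each part of a larger application would be a
# similarly small, independently deployable process.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in for the service's own data store.
ORDERS = {"1001": {"status": "shipped"}}

class OrdersHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # e.g. GET /orders/1001
        order_id = self.path.rstrip("/").split("/")[-1]
        order = ORDERS.get(order_id)
        body = json.dumps(order or {"error": "not found"}).encode()
        self.send_response(200 if order else 404)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), OrdersHandler).serve_forever()
```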

This approach has several advantages, one of which is easier scalability, as the individual components can be scaled independently of each other. Only the parts of the application experiencing high demand need to be scaled, typically by starting up new instances of the relevant component to share the workload, rather than scaling the entire application.
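
As an illustration of scaling just one component, the sketch below assumes the hypothetical “orders” service is already running as a deployment under an orchestrator such as Kubernetes, and uses the official Kubernetes Python client; the deployment name, namespace and replica count are assumptions made for the example.

```python
# A sketch of scaling one component independently, assuming a Kubernetes
# deployment named "orders" already exists and the official Python client
# (pip install kubernetes) is available.
from kubernetes import client, config

config.load_kube_config()      # use local kubeconfig credentials
apps = client.AppsV1Api()

# Scale only the "orders" deployment up to 5 replicas;
# the rest of the application is left untouched.
apps.patch_namespaced_deployment_scale(
    name="orders",
    namespace="default",
    body={"spec": {"replicas": 5}},
)
```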

Microservices also lends itself to an agile development process, because the smaller size of the component parts makes continuous improvement easier and allows for faster updates to the code. It also makes it possible to use different programming languages for different components, as some languages may be better suited to certain kinds of task. And because the component parts are small, the code is easier for developers to understand, which can have a knock-on benefit for reliability.

Another advantage is the potential to reduce downtime through better fault isolation. If a single microservice instance fails, it is less likely to bring down the entire application or service as a result.

Potential disadvantages

While there are advantages to a microservices approach, there are also some downsides that organisations need to consider. For example, although the development of each microservice component might be simpler, the application or service in its entirety might become more complex to manage.

This is especially so with a deployment of any scale, which might involve dozens or hundreds of individual instances of different microservice components. The complexity comes not just from managing the communication between all these separate instances, but also from monitoring each of them to ensure they are operating within expected parameters and are not consuming more resources than they normally would, which can be a sign of a problem.

Testing and debugging may also become more of a challenge with microservices, simply because tracing the source of a bug can be more difficult among a web of microservice instances, each with its own set of logs. Testing the entire application can also be tricky, because each microservice can only be tested properly once all the services it depends on have also been tested.
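
One common way to ease the tracing problem is to pass a correlation ID between services, so that entries scattered across separate log files can be stitched back into a single transaction. The sketch below shows the idea in outline; the header name and downstream URL are illustrative, not a standard prescribed here.

```python
# A sketch of request tracing via a correlation ID: each outgoing call carries
# an "X-Request-ID" header, and every service writes that ID into its own logs
# so one transaction can be followed across instances.
import logging
import uuid
import urllib.request

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")

def call_downstream(url, request_id=None):
    # Reuse the caller's ID if one was supplied, otherwise start a new trace.
    request_id = request_id or str(uuid.uuid4())
    logging.info("request_id=%s calling %s", request_id, url)
    req = urllib.request.Request(url, headers={"X-Request-ID": request_id})
    with urllib.request.urlopen(req) as resp:
        logging.info("request_id=%s status=%s", request_id, resp.status)
        return resp.read()

# Example (hypothetical service address):
# call_downstream("http://orders:8080/orders/1001")
```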

Monitoring in particular becomes more important in a microservices deployment, but the distributed nature of the components makes it more complex than it is for traditional applications, which are largely self-contained. The monitoring system has to operate at the level of each individual microservice instance, while at the same time keeping an eye on the web of dependencies between the different components.
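
As a rough illustration of instance-level monitoring, the sketch below assumes the prometheus_client Python library and shows a single instance exposing its own metrics endpoint for a central monitoring system to scrape; the metric names are invented for the example.

```python
# A sketch of per-instance monitoring: each microservice instance publishes
# its own metrics on a local HTTP port, which a central system scrapes.
# Assumes the prometheus_client library (pip install prometheus-client).
import random
import time

from prometheus_client import Counter, Gauge, start_http_server

REQUESTS = Counter("orders_requests_total", "Requests handled by this instance")
IN_FLIGHT = Gauge("orders_in_flight", "Requests currently being processed")

if __name__ == "__main__":
    start_http_server(9100)          # metrics served on this instance's port 9100
    while True:                      # stand-in for real request handling
        with IN_FLIGHT.track_inprogress():
            REQUESTS.inc()
            time.sleep(random.random())
```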

The way that microservices operate also has implications for the organisation’s infrastructure. Supporting automatic scaling to meet increased demand means that resources for new microservice instances must be capable of being provisioned by application programming interface (API) calls, which in turn requires a certain level of cloud-like, software-defined infrastructure.
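
By way of illustration, the sketch below uses the Docker SDK for Python to start an extra instance of a hypothetical “orders” image entirely through API calls, the kind of programmatic provisioning described above; the image name and port mapping are assumptions for the example.

```python
# A sketch of API-driven provisioning, assuming the Docker SDK for Python
# (pip install docker) and a locally built, hypothetical "orders:latest" image.
import docker

client = docker.from_env()

# Launch one more instance of the service; an autoscaler or deployment
# pipeline would make the same call in response to rising demand.
container = client.containers.run(
    "orders:latest",
    detach=True,
    ports={"8080/tcp": None},   # let Docker pick a free host port
)
print("started", container.short_id)
```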

Data can be another thorny issue when building an application or service based on a microservices architecture, or, more precisely, where to store that data. This is because each microservice instance is likely to have its own data store, but some applications may call for access to a shared repository. Different services will also have different data storage requirements, with some in the industry saying that a NoSQL database makes the most sense, while others advocate sticking with SQL, if that is what the organisation has already deployed.

There are other differing opinions on this issue, with some experts advising that a single database (though perhaps not a single schema) shared by multiple services is the best approach, because, for one thing, it allows organisations to reuse the procedures they have in place for database backup and restore. Others advise against this, because it creates a potential single point of failure that goes against the microservices ethos.

Plan carefully

What this all means is that the microservices architecture may not suit every organisation, nor every type of application. However, adoption is growing because microservices make it easier to implement the more agile approach to deploying services that many organisations are now seeking.

“Organisations going down the microservices route tend to be more cutting-edge than the rest,” says independent analyst Clive Longbottom. “As such, they will also tend to be more open to thinking of what a move to a new architectural topology needs. Historically, the majority of changes have been evolutionary: successful microservices architectures are revolutionary, requiring a complete rethink of what is being done.”

In other words, microservices might be better suited to a “green field” deployment that is being built from scratch than to an attempt to refactor or update an existing application.

As already noted, Docker-style software containers have become associated in the minds of many with microservices, although they are just one way of implementing such a distributed deployment. Other options include lightweight virtual machines, or even running microservice instances as non-virtualised code in a server environment, just like everyday applications. Serverless computing functions are another possibility.

Containers are perhaps better suited to this than virtual machines, because they are less resource-heavy and it is much quicker to spawn a new container instance than to spin up a new virtual machine. Containers are also now a relatively mature technology, with a broad ecosystem of tools to support orchestration (such as Kubernetes), communications (such as Istio) and monitoring.

Interestingly, the O’Reilly survey found that a higher-than-average proportion of respondents who reported success with microservices chose to instantiate them using containers, while a higher proportion of respondents who had described their microservices efforts as unsuccessful had not used containers.

This might suggest that containers are a less risky option when implementing microservices, but again it is more a matter of choosing the right technology for the organisation’s specific application and requirements.

“If we just look at a microservice, it is just a functional stub,” says Longbottom. “The container should provide the environment the microservice needs, with orchestration and so on managing the provisioning, patching, updating and movement of the microservices as required across the broader platforms.”

In other words, building microservices involves a different kind of complexity from traditional, more monolithic application styles. For this reason, it may be regarded as a technology better suited for new-build modern or cloud-native applications, or for organisations overhauling their IT as part of a digital transformation process.
