Serverless computing, also known simply as serverless, is still relatively new to the tech scene. It is an exciting advancement that could revolutionize the way businesses operate, and the PaaS industry, in the coming years.
When you couple serverless computing with advancements in enterprise service bus, or ESB, and iPaaS software, which now provide more internal control over applications, it will be interesting to monitor how companies choose to run and manage business applications in 2018.
Simply put, it is a type of cloud service where hosting providers, such as Amazon, Microsoft, IBM and Google, manage applications and distribute resources for businesses as they are needed, as opposed to charging for dedicated servers or capacity in advance. The serverless pay-as-you-go model could be compared to a pay-per-visit gym membership, where you only pay for usage instead of a fixed time frame.
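To make the pay-per-use comparison concrete, here is a minimal sketch of how a monthly serverless bill might be computed. The function name is my own, and the default rates are illustrative figures modeled loosely on AWS Lambda-style pricing (a per-request fee plus a fee per GB-second of compute); real pricing varies by provider and region.

```python
def monthly_cost(requests, avg_duration_ms, memory_gb,
                 per_million_requests=0.20, per_gb_second=0.0000166667):
    """Illustrative pay-per-use bill for a serverless function.

    You pay only for invocations and the compute they consume,
    rather than for dedicated capacity reserved in advance.
    """
    # Charge per request, billed in units of one million
    request_cost = requests / 1_000_000 * per_million_requests
    # Compute is metered in GB-seconds: duration x allocated memory
    gb_seconds = requests * (avg_duration_ms / 1000) * memory_gb
    compute_cost = gb_seconds * per_gb_second
    return round(request_cost + compute_cost, 2)
```

With these example rates, a function handling 3 million requests a month at 120 ms each with 0.5 GB of memory costs only a few dollars; a month with zero traffic costs nothing, which is the core of the pay-as-you-go appeal.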
Serverless computing aims to save companies significant amounts of time, money and resources by hosting, running and managing applications and the core functions of a business. Using this model, a company is freed from the bulkiness and effort of running and maintaining front- and back-end server applications.
Many deem this the way of the future, especially with the increased interest in creating more agile, customizable software ecosystems. By running applications this way, especially with the speed performance edge computing brings to the table, serverless computing should continue to gain attention throughout 2018.
Google searches for “serverless” have increased significantly since 2015.
When compared to the number of people searching for microservices and containerization environments, serverless computing has only started to gain traction in the past two years. The interest level in this search term peaked at the end of 2017, meaning startups and enterprises alike are considering moving toward serverless technology as 2018 begins.
The serverless revolution
Serverless computing is thought of by some as the next phase in the evolution of infrastructure-as-a-service, or IaaS, by completely abstracting the underlying infrastructure from developers, and essentially virtualizing runtime and operational management through a third-party provider.
When you break down serverless computing, it can be thought of as a full-fledged SaaS solution. It allows businesses to pay a third party to manage servers, databases, application logic and other core business functions that have more commonly been run and managed internally in the past.
In the early days of cloud computing, businesses remotely leased or owned massive data warehouses and servers for anything from storage, to processing, to security. In many cases this is still happening today, as the public cloud continues to offer a flexible, scalable and remote-friendly option for modern and long-standing companies alike.
Companies poured copious amounts of cash into IT infrastructure in the pre-cloud days, investing in servers, applications, networks and more. But as IaaS and PaaS emerged by way of the cloud, money and resources have been reallocated to operational costs instead of capital and assets.
Serverless computing is the next phase of cloud computing as revenue models for cloud services continue to evolve. The pay-as-you-go nature of serverless looks to be a massive benefit not previously imagined with traditional and early-cloud services, and its relationship with edge computing and IoT connectivity could have massive impacts across industries.
From an infrastructure standpoint, serverless architectures are composed of applications that significantly depend on external IaaS and BaaS providers, or on custom code that’s run in ephemeral (temporary) containers. This has been referred to by many as FaaS, or function as a service, currently provided by the likes of Amazon, Microsoft and more.
Using FaaS allows for this activity to move to the front end, ultimately removing the need for a traditional server system that sits behind an application. These serverless systems can significantly reduce operational cost and complexity, but the shift to this model certainly places a dependence on third-party vendors and the current immaturity of the market in general.
FaaS, simply put, allows businesses to run back-end code without managing any of their own server systems or server applications. This is a major shift from traditional architectures, moving from a linear, underperforming model to a low-maintenance, flexible architecture operated by a third party such as AWS Lambda, IBM OpenWhisk or Azure Functions.
These products, and others like them, allow developers to easily upload their desired code to the platform without worrying about anything else. The third-party provider takes that code and manages everything required to run, manage and scale your code while still providing instant access to it should you wish to update, tweak or roll back.
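As a sketch of what developers actually upload, a FaaS function is typically just a stateless handler that the platform invokes once per event. The shape below follows AWS Lambda's Python handler convention; the `name` field in the sample event is a hypothetical payload field, not part of any platform API.

```python
import json

def handler(event, context):
    """Entry point the FaaS platform invokes once per event.

    The provider provisions the runtime, executes this function,
    and scales instances up or down automatically; there is no
    server process for the developer to run or maintain.
    """
    # 'name' is a hypothetical field in the incoming event payload
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

Because the handler is a plain function, it can be invoked locally for testing, e.g. `handler({"name": "dev"}, None)`, before being uploaded to the platform.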
Advantages of embracing the budding serverless revolution are related directly to its similarities to DevOps models. DevOps has become increasingly popular throughout the business landscape because of the flexibility and speed it promotes.
Because third-party vendors deploy, manage and scale servers in a serverless model, businesses can reallocate development resources away from infrastructure administration. This is not only a huge savings in overhead costs, but also allows developers to focus on producing the best possible products without worrying about how they will be run and maintained. Microservices and containers also play a significant role in the increased agility serverless offers, as the approaches are highly compatible.
The future of serverless
The future of serverless is yet to be determined as it is still a relatively young technology. As ecosystem environments continue to get more refined and security and IoT advance, IT infrastructure as we currently know it could radically change in the coming years.
Jason McGee, VP and CTO of IBM Cloud, said at ServerlessConf 2017 that IBM analysts predict the serverless market will grow 7–10 times by 2021, meaning cloud vendors are noting the early success of AWS Lambda. A Markets and Markets report predicted smaller, yet still substantial, growth: from a $1.88 billion market in 2016 to $7.72 billion by 2021, roughly a fourfold increase.
Serverless and IoT
Serverless computing could have big benefits when working in unison with the increasing amount of IoT connected devices in the world. Not only does it give companies more flexibility when it comes to allocating resources due to management by a third party, but it also increases speed because it is only running the business functions a company is actually using.
As a result, if a distribution company is using IoT sensors to track a fleet of vehicles through a serverless model, the sensors could be triggered in case of a late arrival to notify the recipient. Or, more critically, the sensors could be triggered if there is an accident or emergency to provide data on location and severity to the nearest hospital.
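The fleet-tracking scenario above can be sketched as a single event-driven function that only runs (and only bills) when a sensor actually fires. The event fields and notification targets below are hypothetical stand-ins, not a real sensor or provider API.

```python
def handle_fleet_event(event):
    """Hypothetical serverless handler for fleet-sensor events.

    Routes each sensor reading to the appropriate notification;
    the platform invokes it per event, so idle vehicles cost nothing.
    """
    notifications = []
    if event.get("type") == "accident":
        # Emergency: forward location and severity to the nearest hospital
        notifications.append({
            "to": "nearest_hospital",
            "location": event["location"],
            "severity": event.get("severity", "unknown"),
        })
    elif event.get("eta_delay_minutes", 0) > 15:
        # Late arrival: notify the recipient of the delay
        notifications.append({
            "to": "recipient",
            "delay_minutes": event["eta_delay_minutes"],
        })
    return notifications
```

The 15-minute threshold is an arbitrary example; the point is that the routing logic lives in a function the provider runs on demand, with no always-on fleet-monitoring server.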
Serverless and security
Although stopping cyberattacks is always a challenge, serverless architecture provides a unique set of security benefits. DDoS attacks and other threats are considerably less worrisome to businesses using serverless, as computing is managed externally.
Additionally, the elastic nature and automatic scaling capabilities of serverless functions leave fewer holes and soft spots in your system. For the same reason, they also decrease the chances of someone installing malicious software on your servers or attacking your operating system.
The litany of cyberattacks in 2017 is of great concern for companies heading into 2018, and serverless, along with advancements in microservices and containerization, could help limit the devastation caused by security threats.
We may not see serverless take hold for a bit, but in 2018 we should expect FaaS to gain traction and improve platform, infrastructure and database management environments, and have a significant impact on resource management.