Benefits of Serverless Computing
Serverless computing offers a number of advantages over traditional cloud-based or server-centric infrastructure. For many applications, serverless architectures provide greater scalability, more flexibility, and quicker time to release, all at reduced cost. With serverless architectures, developers do not need to worry about purchasing, provisioning, or managing backend servers.
By their nature, serverless platforms, particularly Function-as-a-Service (FaaS) offerings such as AWS Lambda, foster a distributed, service-oriented architecture. This is because the service provides discrete units of compute that are provisioned just-in-time (in most cases, though this is configurable) to handle a call to the function each unit supports. Because of how the compute resources are organized, an application implemented on AWS Lambda is naturally decomposed into methods or functions (hence the name). This type of organization yields several benefits:
Functional Clarity Through Decomposition
In order to use AWS Lambda's Function-as-a-Service model, developers and architects must decompose their application into functions. Functions can then be configured (using a number of techniques and technologies, such as API Gateway, EventBridge, Step Functions, etc.) to respond to events. This requires that the scope of each function be tightly focused on a singular purpose. The side effect of building in this manner is that an application's functionality ends up broken down into discrete logical components (the functions), each laser-focused on the task it was designed to handle. That laser focus within the parts brings clarity to the whole: this kind of decomposition is often easier to map directly to requirements, and when designers step back to view the whole system, they can more easily see exactly what gaps exist between the system and its requirements.
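As a minimal sketch of what such a tightly focused function looks like (the handler name, route, and field names are illustrative, and an API Gateway proxy integration is assumed), the function reduces to a single handler that does exactly one thing in response to an event:

import json

def lambda_handler(event, context):
    """Single-purpose function: accept one order submission (illustrative)."""
    # API Gateway delivers the HTTP request body as a JSON string
    order = json.loads(event.get("body") or "{}")

    if "orderId" not in order:
        # The function has one job; anything else is rejected up front
        return {"statusCode": 400, "body": json.dumps({"error": "orderId is required"})}

    # ... persist the order (e.g., to a data store) ...

    return {"statusCode": 202, "body": json.dumps({"accepted": order["orderId"]})}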
Function-level decomposition also decouples the way each function is linked to the others from the function itself, which ties directly into scalability and modularity.
Scalability and Modularity
AWS Lambda functions are independently scalable, with several options available to manage function warm-up and to respond to bursts of activity. The advantage is that scaling is not only handled automatically, it is applied only to the functions seeing increased activity. This lowers costs because the system need not scale resources for components that are not under load (this ties into serverless's pay-as-you-go model), and because each function's execution environment is decoupled from the others', a flood of calls to one portion of the system cannot starve another portion of resources.
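As a sketch of the per-function knobs involved (the function name, alias, and numbers are illustrative), reserved concurrency caps how far one function can scale so it cannot consume the shared pool, while provisioned concurrency keeps execution environments warm for bursty or latency-sensitive functions:

import boto3

lambda_client = boto3.client("lambda")

# Cap this function at 50 concurrent executions so a burst here
# cannot exhaust the account's unreserved concurrency pool
lambda_client.put_function_concurrency(
    FunctionName="order-consumer",          # illustrative name
    ReservedConcurrentExecutions=50,
)

# Keep 5 execution environments warm on a published alias
# to avoid cold starts for latency-sensitive traffic
lambda_client.put_provisioned_concurrency_config(
    FunctionName="order-consumer",
    Qualifier="live",                       # illustrative alias
    ProvisionedConcurrentExecutions=5,
)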
Implementing applications as orchestrated collections of independently managed functions also makes systems extremely modular. As long as the functions are tightly focused with clearly defined scope, new functions can be added to provide new capabilities. Functions can be treated as "building blocks" of application logic, reconfigured and reordered as processes change, while changes to the orchestration (whether Step Functions or EventBridge) add new stages or capabilities.
In the example below, we have a Lambda function that generates events (the Generator), and another that processes events (the Consumer). If a new capability is needed, it can be added to the workflow as a new function:
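A minimal sketch of such a Generator/Consumer pair, assuming EventBridge as the glue between them (function names, the event bus, and the event shape are all illustrative rather than taken from the original example):

import json
import boto3

events = boto3.client("events")

# --- Generator: emits an event describing work to be done ---
def generator_handler(event, context):
    events.put_events(Entries=[{
        "Source": "app.generator",               # illustrative source
        "DetailType": "WorkItemCreated",
        "Detail": json.dumps({"itemId": event.get("itemId")}),
        "EventBusName": "app-bus",               # illustrative bus
    }])
    return {"status": "published"}

# --- Consumer: triggered by an EventBridge rule matching WorkItemCreated ---
def consumer_handler(event, context):
    item_id = event["detail"]["itemId"]
    # ... process the work item ...
    return {"processed": item_id}

# A new capability (e.g., auditing) is a new function plus a new rule;
# neither the Generator nor the Consumer has to change.
def auditor_handler(event, context):
    print("audit:", json.dumps(event["detail"]))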

With AWS Step Functions, the orchestration layer itself can decide how data flows between Lambda functions, further reinforcing the modular nature of the system and reducing the amount of change that cascades through the system when new requirements are introduced:
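One way to express that kind of decision in the orchestration layer is a Choice state in the state machine definition. The sketch below shows an Amazon States Language document written as a Python dictionary; the state names, account/function ARNs, and the routing field are illustrative:

import json

# Illustrative state machine: route each item to a different Lambda
# function based on a field in the data, without changing the functions.
state_machine = {
    "StartAt": "Generate",
    "States": {
        "Generate": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:generator",
            "Next": "Route",
        },
        "Route": {
            "Type": "Choice",
            "Choices": [
                {"Variable": "$.priority", "StringEquals": "high", "Next": "ExpressConsumer"},
            ],
            "Default": "StandardConsumer",
        },
        "ExpressConsumer": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:express-consumer",
            "End": True,
        },
        "StandardConsumer": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:standard-consumer",
            "End": True,
        },
    },
}

print(json.dumps(state_machine, indent=2))  # definition supplied to Step Functions at creation time

Adding a new stage then becomes a matter of adding a state and, if needed, a new choice rule, rather than modifying the existing functions.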

As this example indicates, maintenance is one area where modularity has the potential for great impact.
Deployability and Maintainability
The modularity of applications designed against serverless FaaS, discussed previously, also has positive impacts on maintainability and deployability. The decoupled components that implement an application can be deployed independently, allowing faster changes with less disruption. Smaller deployments equate to shorter (or no) downtime, and smaller units typically take less time to deploy as well.
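As a sketch of what an independent deployment might look like (the function name and artifact path are illustrative), only the changed function's package is updated; the rest of the system keeps running on the versions already deployed:

import boto3

lambda_client = boto3.client("lambda")

# Deploy only the function that changed; other functions are untouched
with open("build/order-consumer.zip", "rb") as package:   # illustrative artifact
    lambda_client.update_function_code(
        FunctionName="order-consumer",                     # illustrative name
        ZipFile=package.read(),
        Publish=True,   # publish an immutable version to make rollback easy
    )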
Modularity also facilitates maintainability. Beyond easing the introduction of new components that support new capabilities, the focused nature of each unit makes it easier to identify which components are affected by a requirement change; only those components need to be modified and redeployed.
Data Encapsulation
A key aspect of modern Service-Oriented Architectures is the concept of data encapsulation. This is the idea that data structures representing a core business concept or data entity are only to be accessed through dedicated service APIs. By requiring all interactions with the data (think the typical CRUD operations -- Create, Read, Update, and Delete) to go through a single API, two things occur:
- Consumers of the data are decoupled from the manner in which the data is stored
- As long as the contract represented by the API does not change, changes can be made to how the service operates or to how the data is stored
These are really two sides of the same coin, and they work together to prevent data implementation changes from impacting consumers. This protection can be further strengthened with API versioning, which allows some consumers of a service to move to a new version of the API (meaning a version with new behavior, as opposed to an implementation change that preserves existing behavior) while others move to the new version at a later date.
API versioning decouples consumers from one another: a new feature can be introduced in an API or method in support of Consumer A without waiting until Consumers B through F are all ready to adopt the change in lockstep. These benefits are only available if the system already forces all data interaction to occur through an API.
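To make the idea concrete, here is a minimal sketch of a single Lambda-backed API that owns all CRUD access to an entity, with the version carried in the path so that v1 consumers are unaffected when v2 changes its response shape. The table name, routes, fields, and the assumption of an API Gateway HTTP API (payload version 2) event are all illustrative:

import json
import boto3

# The table is an implementation detail hidden behind the API;
# consumers never read or write it directly.
table = boto3.resource("dynamodb").Table("customers")        # illustrative table

def lambda_handler(event, context):
    method = event["requestContext"]["http"]["method"]       # HTTP API (payload v2) event shape
    path = event["rawPath"]                                   # e.g. "/v1/customers/42"
    version = "v2" if path.startswith("/v2/") else "v1"
    customer_id = path.rstrip("/").split("/")[-1]

    if method == "GET":
        item = table.get_item(Key={"id": customer_id}).get("Item") or {}
        if version == "v1":
            return respond(200, item)                         # v1 contract: flat record
        return respond(200, {"customer": item, "schema": "2024-01"})  # v2 contract: new shape

    if method == "PUT":
        table.put_item(Item={"id": customer_id, **json.loads(event.get("body") or "{}")})
        return respond(204, None)

    if method == "DELETE":
        table.delete_item(Key={"id": customer_id})
        return respond(204, None)

    return respond(405, {"error": f"{method} not supported"})

def respond(status, body):
    return {"statusCode": status, "body": json.dumps(body) if body is not None else ""}

Because every consumer goes through this handler, the storage layer could be swapped or reshaped without any consumer noticing, and the v2 behavior rides alongside v1 until the remaining consumers are ready to move.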
Goldilocks and the Granularity of Decomposition
There are a great many benefits to a service-oriented architecture, especially when implemented on a serverless platform. However, decomposition must be done carefully: too coarse-grained, and a system loses many of the benefits of microservices; too fine-grained, and complexity shifts from the code into the management processes. A system needs to strike a happy medium, such that its logical components are decoupled enough to gain the benefits of modularity without having so many components that managing them becomes a nightmare. This balance comes from understanding the problem space and using that understanding to guide decomposition along the characteristics appropriate to the system.
See SOA Application Decomposition and Organization Around Data Entities in the Cloud for details.