At the AWS re:Invent 2016 conference this week, Amazon Web Services (AWS) signaled for the first time that its ambitions now extend beyond application workloads running in its own data centers.
While stopping short of letting IT organizations use its core software to stand up private clouds outside an AWS data center, AWS is introducing Greengrass, a platform that extends the Lambda serverless computing environment all the way out to specific classes of endpoints.
New opportunities with serverless computing
In the case of AWS, serverless computing is an instance of a microservices architecture that lets IT infrastructure scale up and down dynamically to match changing application workload requirements. The Lambda service presents developers with a functional programming model through which they can invoke IT infrastructure resources. That approach is more efficient from the perspective of AWS because it maximizes compute and storage utilization in a way that traditional computing frameworks can't match.
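The functional model described above can be sketched in a few lines. This is an illustrative, self-contained example of a Lambda-style handler (the event payload and field names are hypothetical, not taken from any real deployment): the developer supplies only a stateless function, and the platform handles provisioning and scaling.

```python
import json

def handler(event, context):
    """A Lambda-style function: stateless, input in, result out.
    The cloud platform provisions and scales the compute behind it;
    the developer writes only this function."""
    name = event.get("name", "world")
    return {"statusCode": 200, "body": json.dumps({"greeting": f"Hello, {name}"})}

# Invoked locally here for illustration; on AWS, the service calls the
# handler in response to an event (API request, queue message, etc.).
if __name__ == "__main__":
    print(handler({"name": "re:Invent"}, None))
```

Because the unit of deployment is a single function rather than a server, the provider can bill and schedule at function granularity, which is where the utilization gains come from.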
Now AWS is making it possible to execute Lambda functions on a Snowball appliance deployed locally. Previously, a Snowball appliance simply made it easier to physically ship data to AWS, which would then migrate that data into its cloud. Now AWS is making available a 100 TB Snowball Edge appliance capable of running Greengrass software, which also includes support for messaging and data synchronization.
Initially at least, AWS sees Greengrass combined with Lambda and Snowball being used to drive Internet of Things (IoT) applications that require analytics to be run locally before data is shipped back to the AWS cloud. But there is no limit on the types of use cases for which developers might employ Greengrass on an AWS appliance that now provides access to both local compute and storage resources. In addition, AWS says Greengrass software should be able to run on a variety of IoT hardware, including Raspberry Pi boards and Qualcomm processors. The base requirement is a 1 GHz CPU with 128 MB of memory on a platform capable of running either Amazon Linux or Ubuntu from Canonical.
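The local-analytics pattern AWS is describing can be sketched as follows. This is a minimal, hypothetical example (the sensor readings, field names, and summary shape are invented for illustration, and the real Greengrass SDK is not used): raw readings are reduced to a compact summary on the edge device, so only aggregates, not every sample, need to cross the network to the cloud.

```python
import json
import statistics

def summarize(readings):
    """Reduce raw sensor readings to a compact local summary,
    so only aggregates are shipped upstream."""
    return {
        "count": len(readings),
        "mean": statistics.mean(readings),
        "max": max(readings),
        "min": min(readings),
    }

def handler(event, context):
    # In a Greengrass-style deployment this function would run on the
    # edge device; here the upstream publish is just a return value.
    summary = summarize(event["readings"])
    return json.dumps(summary)

if __name__ == "__main__":
    print(handler({"readings": [21.0, 22.5, 23.1, 20.9]}, None))
```

The design point is bandwidth: a device streaming thousands of samples per minute can send one summary record instead, which is exactly the case for running analytics locally before data goes back to the cloud.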
Impact on IT service providers
The message from AWS for IT service providers is clear. Serverless computing models will, for the first time, make it possible for AWS to extend its reach beyond the public cloud. In fact, AWS is now so far ahead of other public cloud platform providers that 451 Research is recommending that IT service providers adopt an "AWS + 1" strategy: they can't afford to ignore AWS, but they should balance its market power by maintaining ties with at least one other public cloud service provider. Naturally, AWS CEO Andy Jassy doesn't agree; AWS is advising partners that they will be much better off concentrating their resources on AWS.
Of course, AWS is not the only public cloud service provider to discover the potential of serverless computing. Whatever strategy they employ, IT service providers need to be aware that multiple public cloud providers are now on the verge of using a variety of approaches to serverless computing to extend their reach and influence well beyond the data centers they physically manage.