Cloud computing is a big part of most companies’ technology plans, and it will continue to play an even bigger role in the future. This guest blog from our partners at Rackspace covers three trends to watch in the public cloud world.
Public cloud adoption is soaring. As business leaders turn to public cloud to transform their applications and optimize their workloads, leading public cloud providers such as Amazon, Microsoft and Google are flush with demand.
Your organization is likely already using public cloud (or multiple clouds) to power parts of your business. Or you’re weighing the options for moving applications out of a legacy data center. Either way, there are some trends to be aware of that could affect which public cloud your company uses, and how you use it.
During a recent roundtable discussion, Rackspace public cloud experts Jeff DeVerter, Derek Remund and Eric Johnson noted three areas to keep an eye on:
Public clouds: more alike than different, but those differences matter
Although the leading cloud platforms have specific features that set them apart from one another, our experts agreed that the gap between today’s hyperscale clouds (AWS, Azure and Google Cloud Platform) in raw compute performance is rapidly shrinking.
“We alluded to this in our Cloud Smackdown, but the previously major differences between the leading clouds are diminishing,” said DeVerter, who has spent the last decade overseeing Microsoft cloud technologies at Rackspace. That’s a big difference from even two or three years ago, when users would often find basic performance or feature deficits when comparing clouds.
For example, Google is known for data storage and networking, Microsoft for business processes, and AWS for serverless computing and emerging technologies like machine learning and artificial intelligence.
So a customer with an existing Microsoft Enterprise Agreement and an application heavily reliant on Active Directory should likely go with Azure, whereas a multinational research team working on a DevOps-heavy project where servers need to be provisioned on a per-second basis would probably find AWS more suitable.
What’s the bottom line when it comes to this new era of cloud equality? It means the planning process is now, more than ever, the most critical part of your cloud journey.
Containers will truly enable multi-cloud solutions
Containers have slowly been making their way into the mainstream over recent years, and our Rackspace cloud experts see 2018 as the year this technology comes into its own, offering greater flexibility across multiple clouds.
Because containers package up applications in a way that makes them faster, more reliable and — most importantly — portable, they offer a number of new possibilities when it comes to testing out different cloud platforms.
“If the point of a multi-cloud strategy is to optimize on a workload by workload basis,” noted DeVerter, “containers offer an easy way for organizations to move applications around and figure out where they fit best.”
Recent announcements from Amazon and Microsoft also mean Kubernetes is now available as a container orchestration engine across all three leading public cloud providers, enabling truly portable workloads — not just as a disaster recovery option, but actively running across multiple clouds, even taking advantage of each cloud’s native capabilities.
This increased portability will allow companies to discover whether a particular containerized workload runs better on AWS, Azure or Google.
Developers can now pick up a containerized application running in AWS and move it seamlessly to another cloud. Containers also make life much easier for developers working across multiple clouds, allowing them to take the same application from development to production across platforms, delivering a huge leap in efficiency.
“If an app works on the developer’s machine, then it will most likely work in production,” said Johnson, “and that adds a lot of reliability to the equation.”
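The portability our experts describe comes from declaring a workload in cloud-agnostic Kubernetes terms rather than in any one provider’s API. As an illustrative sketch (the application name, image and port are placeholders), the same Deployment manifest can be applied unchanged to a managed Kubernetes cluster on any of the three providers:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-app                # placeholder application name
spec:
  replicas: 3                  # scale independently of the underlying cloud
  selector:
    matchLabels:
      app: web-app
  template:
    metadata:
      labels:
        app: web-app
    spec:
      containers:
      - name: web-app
        image: registry.example.com/web-app:1.0   # hypothetical container image
        ports:
        - containerPort: 8080
```

Because the manifest references only the container image and standard Kubernetes primitives, `kubectl apply -f deployment.yaml` works the same against any conformant cluster; only the cluster credentials change when moving between clouds.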
Public cloud will support the next frontier: machine learning
Machine learning has already begun to play a crucial role in much of the innovation we’re seeing in 2018 and beyond. While only a few companies are successfully using machine learning in production environments today, public cloud platforms will play a major role helping businesses get up to speed.
Machine learning requires a great deal of compute resources. The cloud enables the rapid provisioning and scalability needed to build, test and run machine learning models and helps supply the massive amounts of needed input data. Public cloud also delivers prebuilt machine learning models used for common machine learning tasks — things like image recognition, phoneme separation for voice recognition and video analysis.
While APIs for these technologies exist already, each major cloud provider is racing to improve its machine learning capabilities. AWS has already begun delivering on this with offerings like SageMaker, a managed service with prebuilt algorithms that simplifies building, training and deploying machine learning models, and DeepLens, a deep learning-enabled video camera that lets developers run trained models directly on the device.
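As a concrete sketch of how a hosted model like those above is consumed, the snippet below builds a JSON payload and sends it to an already-deployed SageMaker endpoint via boto3’s `invoke_endpoint` call. The endpoint name and the `{"instances": [...]}` payload shape are assumptions for illustration; the real request schema depends on the model you deploy.

```python
import json


def build_payload(features):
    """Serialize a feature vector into a JSON request body.

    The {"instances": [...]} shape is a common convention, not a fixed
    SageMaker requirement; the deployed model defines the actual schema.
    """
    return json.dumps({"instances": list(features)}).encode("utf-8")


def invoke(endpoint_name, features):
    """Call a deployed SageMaker endpoint (requires AWS credentials)."""
    import boto3  # imported here so build_payload works without AWS installed

    client = boto3.client("sagemaker-runtime")
    response = client.invoke_endpoint(
        EndpointName=endpoint_name,       # placeholder endpoint name
        ContentType="application/json",
        Body=build_payload(features),
    )
    return json.loads(response["Body"].read())


# Usage against a real endpoint, e.g.:
#   result = invoke("image-classifier-endpoint", [0.1, 0.2, 0.7])
```

The design choice here is that the model itself lives in the cloud: the client only serializes inputs and reads back predictions, which is what lets the provider swap in better prebuilt models without any change on the caller’s side.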
As this trend continues to accelerate, we’ll also see the emergence of machine learning in the cloud as a service. After all, if a business executive can use Amazon’s Alexa to order home goods, read the news or play music, she’ll likely carry that expectation into the workplace. That might look like a machine learning-enabled device calling up data center analytics or shipping and receiving stats, in real time, and progress to the point where systems are able to optimize themselves based on those outputs.