When Cloud Native Computing Foundation general manager Priyanka Sharma has to explain what she does to someone at a dinner party, she starts by asking if they’ve heard of Kubernetes: CNCF’s primary open-source software project, which helps organizations manage their cloud computing resources.
Often, people say yes. Sometimes, they don’t. In that case, she’ll ask if they’ve heard of Amazon Web Services, a popular Amazon subsidiary that rents out cloud computing resources.
That usually takes care of the “cloud” portion. But what about the “native”?
“Cloud-native technology is when engineers and software people utilize cloud computing to build tech that’s faster and more resilient, and they do that to meet customer demand really quickly,” she explained.
What Is Cloud Native?
Cloud Native 101
What Is Cloud Computing?
To understand cloud-native technology, it’s important to understand cloud computing.
Storing and sharing digital information requires servers that provide computing power and storage capabilities. For a long time, most companies bought their own servers and kept them close by. They could adjust their computing power by adding or subtracting servers — as long as they had the necessary in-house IT support.
Then, things got a little easier with the popularization of virtual machines (VMs): software-based computers, each with its own operating system, that can run alongside one another on a single physical machine. With VMs, companies could get the same computing power with fewer physical servers. But still, there was room to grow.
Technologists turned physical machines into virtual machines by installing pieces of software called hypervisors. By equipping multiple machines with hypervisors, they found they could make virtual machines share resources like one big system. It was a cluster of nebulous computing resources — like a cloud.
Today, technologists can spin up their own cloud environments on private servers, or pay companies like Microsoft, Amazon or Google for computing resources in public clouds.
What Is Cloud Native?
Running software in public clouds relieved companies of some IT requirements and scalability headaches. But the transition to cloud alone didn’t create cloud-native technology.
“You could totally be on somebody else’s [computing resources] and do things exactly as you did in the past,” Sharma said.
As technologists understood more about the benefits of cloud environments, they started building software that took advantage of those benefits. The most notable change, Sharma said, was the move from tightly coupled systems with many dependent components to loosely coupled systems made of tiny components that could run quasi-independently.
Those components are containers. Thanks to Docker, technologists figured out how to package up — or containerize — software with all its libraries, so it could run anywhere. The accompanying architecture is microservices: applications broken into small, independent services, each typically running in its own container.
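A container image is defined declaratively in a short text file. As a hedged illustration — the file names, base image and entry point below are assumptions for the sake of the example, not anything from the article — a minimal Dockerfile for a small Python service might look like this:

```dockerfile
# Minimal image for a hypothetical Python web service.
FROM python:3.12-slim

WORKDIR /app

# Package the service's libraries alongside its code,
# so the container runs the same way anywhere.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

CMD ["python", "app.py"]
```

Building this file produces an image that behaves identically on a laptop, a private server or a public cloud — which is the portability the article describes.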
With containers and microservices, individual software components could be scaled up, scaled down, modified or removed in response to customer needs.
Consider a weather app, Sharma said. If a city experiences a natural disaster, traffic from that city would likely skyrocket. The app’s owners could increase the computing resources dedicated to data from that city, without affecting app speed for users elsewhere.
“Cloud native has been extremely successful because the business impact is straightforward,” she said. “People are able to address customer needs faster, delight them and make more money. So proliferation has been huge.”
And with that proliferation came a boom of platforms and services that support cloud-native technology — like Kubernetes, which helps organizations deploy and manage containerized applications across cloud environments.
“That’s why, when you hear the words ‘cloud native,’ you may hear all kinds of things,” Sharma said. “It’s a huge ecosystem.”
What Are the Advantages of Cloud-Native Technologies?
Perhaps the biggest advantage of cloud-native tech is fast deployment. Think about that weather app again. Let’s say a particular region of the country needed Air Quality Index data — and quickly.
If the app’s software system ran as dependent services, deploying that feature could take months of planning and dozens of conversations to ensure nothing else breaks in the process. With independent services, though, a team of two engineers could potentially build and deploy the feature in a matter of days.
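As a rough sketch of how small such an independent service can be, here is a self-contained Python microservice that maps an AQI value to its EPA category. The endpoint shape is invented for illustration; only the category breakpoints are standard EPA values:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import json

# EPA Air Quality Index categories (upper bound of each range).
AQI_CATEGORIES = [
    (50, "Good"),
    (100, "Moderate"),
    (150, "Unhealthy for Sensitive Groups"),
    (200, "Unhealthy"),
    (300, "Very Unhealthy"),
]

def aqi_category(index: int) -> str:
    """Map an AQI value to its EPA category name."""
    for upper, name in AQI_CATEGORIES:
        if index <= upper:
            return name
    return "Hazardous"

class AqiHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # e.g. GET /aqi/137 -> {"index": 137, "category": "Unhealthy for Sensitive Groups"}
        try:
            index = int(self.path.rstrip("/").rsplit("/", 1)[-1])
        except ValueError:
            self.send_error(400, "AQI value must be an integer")
            return
        body = json.dumps({"index": index, "category": aqi_category(index)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To run locally: HTTPServer(("", 8080), AqiHandler).serve_forever()
```

Containerized, this service can be deployed and scaled on its own, without coordinating with any other part of the app.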
Easier automation is another benefit. CNCF projects and related technologies — like GitOps tooling and HashiCorp’s Vault — support secure, reusable deployment pipelines that considerably speed things up. Now, each distinct deployment environment doesn’t require custom scripts.
Then there’s scalability and reliability. With cloud-based software, development teams can add features, traffic capacity, storage and more without messing with physical hardware. That’s because containerized software running in a distributed system doesn’t need to know much about the rest of the system to work. Engineers and ops professionals can add, reallocate or delete without affecting the rest of the system or risking data loss.
And last, there’s cost savings. Easy scalability makes for easy optimization. Particularly with managed cloud services — in which cloud providers or intermediaries help companies allocate their computing loads — companies only pay for the resources they need, rather than having a set number of servers running at all times.
What Are the Challenges of Cloud-Native Technologies?
With new technologies come new challenges, and, in the cloud-native world, observability — or the ability to see what’s going on inside a computer system — is a big one.
“People are trying to figure out how to make sure that they have the same information and ability to understand their own systems as they did before,” Sharma said. “One downside of complex, loosely coupled, distributed systems is that it’s very difficult as the company gets bigger for one person to understand it all.”
In other words, all those containers are running on all those VMs on all those different servers. Microservices — and even VMs themselves — can be spun up and shifted around at will. But who’s keeping track? If a system’s observability is poor, it’s tough to understand what happens to requests after they’re submitted.
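One common answer to that question is distributed tracing: tagging each request with an ID that follows it from service to service, so its path can be reconstructed from logs. The snippet below is a simplified, illustrative sketch of the idea in Python — not a real tracing system such as the CNCF’s Jaeger or OpenTelemetry, and the service names are invented:

```python
import uuid
import logging

logging.basicConfig(level=logging.INFO, format="%(message)s")
log = logging.getLogger("trace-demo")

def handle_request(payload: dict, trace_id=None) -> dict:
    # If a request arrives without a trace ID, mint one at the edge;
    # every downstream hop reuses the same ID in its log lines.
    trace_id = trace_id or uuid.uuid4().hex
    log.info("trace=%s service=frontend event=received", trace_id)
    return lookup_forecast(payload, trace_id)

def lookup_forecast(payload: dict, trace_id: str) -> dict:
    # A downstream service logs with the same trace ID it was handed.
    log.info("trace=%s service=forecast event=lookup city=%s", trace_id, payload["city"])
    return {"city": payload["city"], "trace_id": trace_id}

result = handle_request({"city": "Austin"})
```

Searching the logs for one trace ID then reveals every service a request touched — which is exactly the visibility that gets hard to maintain as the system grows.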
How Has Cloud Native Changed the Work of Developers?
In some ways, cloud native hasn’t changed what developers do.
“You’re writing the same code, and you’re writing the same languages — the nuance here being some languages are better than others, Golang being the best, as of now, anyway,” Sharma said. “But you’re doing the same work, except you’re operationalizing it differently.”
For many tech companies, that difference comes in the form of DevOps — the hybridization of software development and IT operations — and continuous integration and deployment (CI/CD). With easily packaged-up microservices, it makes sense for the people who write code to deploy and manage it. Similarly, automated pipelines powered by the cloud support fast and frequent deployments.
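A typical CI/CD pipeline of this kind can be expressed in a short declarative file. The sketch below uses GitHub Actions syntax as one example; the job names, image URL and commands are hypothetical:

```yaml
# Illustrative CI/CD pipeline: test, build a container image, roll it out.
name: deploy-weather-api
on:
  push:
    branches: [main]
jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run tests
        run: make test
      - name: Build and push container image
        run: |
          docker build -t example.com/weather-api:${{ github.sha }} .
          docker push example.com/weather-api:${{ github.sha }}
      - name: Roll out to Kubernetes
        run: kubectl set image deployment/weather-api weather-api=example.com/weather-api:${{ github.sha }}
```

Every push to the main branch runs the same tests and deployment steps automatically — the developers who wrote the code are the ones shipping it.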
For up-and-coming developers, preparing for work at cloud-native companies largely means understanding distributed systems and how containerization changed the way we compute.
Someday, colleges will catch up and teach this thoroughly in systems classes, Sharma said. But until then, programmers can bridge that gap with other training opportunities. CNCF offers cloud engineer boot camps and Kubernetes certifications.
Another way to learn is through participation in open-source projects.
“If you have contributed to something in the cloud-native project world, you would be golden for a lot of employers,” Sharma added. “One [misconception about cloud native] is that it’s a very rarefied world where you have to have, like, 20 years of experience to understand it. That’s not true.”
Cloud Native 201
Cloud-Native Software Can Run Almost Anywhere
Cloud-hosted workflow and analytics platforms made a splash across most industries, but many hardware manufacturers, like automotive and aerospace companies, have struggled to jump on board. Their clunky software required a lot of physical infrastructure, and their data security needs were high.
First Resonance, founded by former SpaceX and NASA engineers, built a SaaS platform for that underserved market, and cloud-native capabilities have been essential to the startup’s success.
By architecting its product as microservices, First Resonance became flexible enough to meet the differing needs of potential customers. Some manufacturers want to run the platform in a public AWS cloud. Others — like aerospace companies — want it hosted in their own private clouds for security reasons. Still others require such low latency that they want it installed on premises.
Right now, about three-quarters of the company’s customers contentedly use cloud setups, Chris Magorian, a DevOps and backend engineer at First Resonance, told Built In. But plenty were hesitant about making the switch, either out of habit or due to strict data security protocols.
“A lot of people feel like running things on premises is the only way to keep your data secure,” Magorian said. “And when we end up talking to them, we say, ‘Your stuff is AES 256 encrypted from every endpoint, and we offer end-to-end encryption for all data transits.’ In some ways, it’s enlightening for them, because they say, ‘Oh, OK, I didn’t realize that’s how that worked.’”
Once data security concerns are addressed, moving from elaborate on-premises software systems to a contemporary SaaS platform is usually a welcome change for manufacturers, Magorian said. The platform lives at a single URL, and client-side engineers can build custom visualizations and analytics with First Resonance’s API.
But the real benefit of a cloud-native platform, according to Magorian, is fast feature delivery.
Let’s say a manufacturer wants a feature that lets it input the serial number of a part and learn the number of items that use that part. Before, that manufacturer would talk to a rep from its software provider, and that feature would get added to a roadmap. Eighteen months and a few contracts later, the manufacturer may see the feature in production.
With cloud-based software, that feature arrives in a matter of weeks. First Resonance shares Slack channels with its customers, so it sees feature requests and feedback as they come.
“Customers get to see immediate benefits by refreshing their pages, versus a redeployment or reinstall and an update of an on-prem database,” Magorian said.
But even customers that decide to run the First Resonance platform on premises can still enjoy the upsides of cloud-native tech. AWS and other providers like Oxide Computer are offering on-prem servers with cloud-like capabilities, and containerized applications will make it possible for First Resonance to install its platform on premises and maintain the same delivery schedule.
“If we can automate that whole process for them — where updating something is just simply clicking a few buttons in a webpage, instead of contacting several reps to do it for you — that’s still a huge win,” Magorian said. “That’s something cloud native is bringing both to customers who use SaaS products in the cloud, as well as customers that want to basically run SaaS products on prem.”
Where Is Cloud Computing Heading Next?
Taking cloud technology and running it close by instead of at a faraway data center — like some First Resonance customers want to do — is called edge computing.
For companies whose latency, security or bandwidth needs exceed what centralized cloud computing can offer, edge computing is the wave of the future. Big cloud providers are investing in edge outposts, as opportunities to expand centralized clouds dwindle. Individual companies can even pay for their own edge systems.
Consider a manufacturer that wants to process data from thousands of connected devices attached to various factory parts, and push that data to iPads used by employees on the factory floor. Right now, much of the data from connected devices is recorded manually, Magorian said, or by totally separate systems. For machines and devices to feed information directly into the First Resonance platform, clients need an edge computing system with high observability.
“There are several ways we can do it. But we’re probably going to do it with Kubernetes,” he said. “With that, and the other software technologies that we’ve chosen, we’ll be able to deliver them super low latency on prem. And then that stuff can get bridged over a secure tunnel, like a VPN or a few VPNs, to our clouds, so they can get higher compute without having to purchase all that equipment themselves.”
With a setup like that, client-side engineers can learn a little about First Resonance’s API and then start writing their own scripts for bulk data ingestion from IoT devices.
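A bulk-ingestion script of that kind might run only a few dozen lines. Everything below — the endpoint URL, the payload shape, the field names — is hypothetical, sketched with only the Python standard library:

```python
import json
import urllib.request

API_URL = "https://api.example.com/v1/measurements/bulk"  # hypothetical endpoint

def batch(readings: list, size: int = 100) -> list:
    """Split device readings into fixed-size batches for bulk ingestion."""
    return [readings[i:i + size] for i in range(0, len(readings), size)]

def post_batch(measurements: list, token: str) -> None:
    """POST one batch of readings to the (hypothetical) bulk API."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps({"measurements": measurements}).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {token}"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        resp.read()

# 250 simulated torque readings from factory-floor devices.
readings = [{"device": f"press-{n}", "torque_nm": 41.2 + n} for n in range(250)]
batches = batch(readings)   # 3 batches: 100 + 100 + 50
```

Each batch would then be sent with `post_batch(b, token)` — the point being that once an edge system feeds device data somewhere reachable, the ingestion side is ordinary application code.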
In summary: An understanding of the aerospace industry helped First Resonance recognize a market need, but cloud-native technology helped the startup address it.
“Cloud native has given us the ability to deliver [software] in three environments that are wildly different,” Magorian said.