Kubernetes and AI are like peas and carrots


Monday, February 11, 2019

Carmine Rimi

Developing AI systems on Kubernetes makes sense because of the many attributes the platform offers, such as the ability to scale quickly to large amounts of processing power - something artificial intelligence workloads love.

Kubernetes (commonly known as k8s) grew out of Borg, the cluster management system Google built internally in the early 2000s, and was open-sourced in 2014. Today, it’s by far the leading container management tool, with 83 percent adoption, according to the latest Cloud Native Computing Foundation survey. Forrester has declared, “Kubernetes has won the war for container orchestration dominance and should be at the heart of your microservices plans.”

Meanwhile, artificial intelligence (AI) has become one of the biggest tech buzzwords. McKinsey estimates AI techniques could create a staggering $3.5 trillion to $5.8 trillion in value annually across 19 industries.

But creating AI applications is hard. Large AI programs are difficult to design and write because they involve many types of data, and porting them from one platform to another tends to be a struggle.

Furthermore, several steps are required at each stage to begin building even the most rudimentary AI application, and each requires different skills. Feature extraction, data collection, verification and analysis, and machine resource management make up most of the code base needed to support a comparatively small amount of actual ML code.

There’s a lot of work to do just to get to the starting line, as well as a large amount of ongoing effort required to keep the applications current.

Enter Kubernetes, the open-source platform that automates the deployment and management of containerized applications, including complicated workloads like AI and machine learning.
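To make that concrete, here is a minimal sketch (not taken from the article) of how a containerized model-serving workload might be declared with the official Kubernetes Python client. The image name, labels, replica count and resource limits are hypothetical placeholders.

# Minimal sketch using the official Kubernetes Python client (pip install kubernetes).
# Image name, labels, replica count and resource limits are hypothetical placeholders.
from kubernetes import client, config

config.load_kube_config()  # reads ~/.kube/config, the same credentials kubectl uses

container = client.V1Container(
    name="model-server",
    image="example.registry.io/sentiment-model:1.0",   # hypothetical model-serving image
    ports=[client.V1ContainerPort(container_port=8080)],
    resources=client.V1ResourceRequirements(
        limits={"cpu": "2", "memory": "4Gi"}            # GPUs could be requested here as well
    ),
)

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="sentiment-model"),
    spec=client.V1DeploymentSpec(
        replicas=3,                                     # Kubernetes keeps three replicas running
        selector=client.V1LabelSelector(match_labels={"app": "sentiment-model"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "sentiment-model"}),
            spec=client.V1PodSpec(containers=[container]),
        ),
    ),
)

client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)

The same Deployment could just as easily be written as a YAML manifest and applied with kubectl; the point is that Kubernetes, not the application, is responsible for keeping the declared replicas running and replacing them if they fail.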

Kubernetes and AI represent converging trends. Most companies are running Kubernetes as a platform for their workloads, or plan to soon, and AI is an increasingly important workload.

As organizations shift their attention to AI to reduce operating costs, improve decision-making and serve customers in new ways, Kubernetes-based containers are becoming the go-to technology to help enterprises adopt AI and machine learning. 

Why is Kubernetes an ideal platform for AI?

Containers provide a compact environment for processes to run. They’re easy to scale and portable across a range of environments, from development to test to production. They also enable large, monolithic applications to be broken into targeted, easier-to-maintain microservices.

Kubernetes is right for the job because AI algorithms must be able to scale to be optimally effective. Some deep learning algorithms and data sets require enormous amounts of compute, and Kubernetes is built to scale workloads up and down based on demand.
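As a rough sketch of what scaling based on demand looks like, a Horizontal Pod Autoscaler can grow or shrink a Deployment automatically as load changes. The example below reuses the hypothetical sentiment-model Deployment from the earlier sketch, assumes the standard autoscaling/v1 API, and uses made-up thresholds.

# Sketch: autoscale the hypothetical "sentiment-model" Deployment between 2 and 20
# replicas, targeting 70% average CPU utilization, via the Kubernetes Python client.
from kubernetes import client, config

config.load_kube_config()

hpa = client.V1HorizontalPodAutoscaler(
    metadata=client.V1ObjectMeta(name="sentiment-model-hpa"),
    spec=client.V1HorizontalPodAutoscalerSpec(
        scale_target_ref=client.V1CrossVersionObjectReference(
            api_version="apps/v1", kind="Deployment", name="sentiment-model"
        ),
        min_replicas=2,
        max_replicas=20,
        target_cpu_utilization_percentage=70,  # add replicas when average CPU exceeds 70%
    ),
)

client.AutoscalingV1Api().create_namespaced_horizontal_pod_autoscaler(
    namespace="default", body=hpa
)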

It also provides a way to deploy AI-enabled workloads over multiple commodity servers across the software pipeline while abstracting away the management overhead. 

Innovations on the platform keep coming. In December 2017, the Kubernetes project introduced Kubeflow, “the machine learning toolkit for Kubernetes.” The open-source platform is designed “to make scaling machine learning models and deploying them to production as simple as possible, by letting Kubernetes do what it’s great at: easy, repeatable, portable deployments on a diverse infrastructure, deploying and managing loosely-coupled microservices, and scaling based on demand.”
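For a flavor of what that offers developers, Kubeflow’s Pipelines SDK (kfp) lets you describe a multi-step ML workflow in ordinary Python and compile it into a spec the cluster can run. The sketch below uses the v1-era ContainerOp style of the SDK; the step names, images and arguments are hypothetical.

# Minimal sketch of a two-step Kubeflow pipeline using the v1-style kfp SDK
# (pip install kfp). Step names, container images and arguments are hypothetical.
import kfp
from kfp import dsl


@dsl.pipeline(name="train-and-serve", description="Preprocess data, then train a model.")
def train_and_serve_pipeline(data_path: str = "gs://example-bucket/raw"):
    preprocess = dsl.ContainerOp(
        name="preprocess",
        image="example.registry.io/preprocess:1.0",   # hypothetical preprocessing image
        arguments=["--input", data_path, "--output", "/data/clean"],
    )

    train = dsl.ContainerOp(
        name="train",
        image="example.registry.io/train:1.0",        # hypothetical training image
        arguments=["--data", "/data/clean", "--epochs", "10"],
    )
    train.after(preprocess)                           # run training only after preprocessing


if __name__ == "__main__":
    # Compile to a workflow spec that a Kubeflow Pipelines installation can execute.
    kfp.compiler.Compiler().compile(train_and_serve_pipeline, "train_and_serve.yaml")

The compiled file can then be uploaded to Kubeflow Pipelines, which schedules each step as a container on the cluster and tracks the runs.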

While Kubernetes started with just stateless services, the project said, “customers have begun to move complex workloads to the platform, taking advantage of rich APIs, reliability, and performance provided by Kubernetes. One of the fastest growing use cases is to use Kubernetes as the deployment platform of choice for machine learning.”

As 2017 dawned, among the three major public cloud vendors, only the Google Cloud Platform supported Kubernetes, with its Google Kubernetes Engine. By the end of the year, all were on board, after Microsoft added Kubernetes support to the Azure Container Service and Amazon debuted the Amazon Elastic Container Service for Kubernetes. 

Now, the uses for Kubernetes and the ways it’s being deployed by companies appear limitless.

Tech vendors and their customers are rallying around the notion that containers offer huge benefits in developing and managing AI components of applications.

The rise of AI is helping fuel the growing interest in containers as an excellent way to bring repeatability and fault tolerance to these complex workloads. And Kubernetes is becoming a de facto standard for managing these containerized AI applications.

It’s a terrific match that should benefit enterprises for years to come.

