Intel Bets Big On Kubernetes For Nauta Deep Learning Platform

Intel announced Nauta, an open source deep learning project based on Kubernetes. The project comes with select open source components and Intel-developed custom applications, tools, and scripts for building deep learning models.

According to Intel, Nauta provides a multi-user, distributed computing environment for running deep learning model training experiments on systems based on Intel Xeon processors. The software foundation for the distributed platform is built on Kubernetes, the industry's leading container orchestration engine. The project leverages other open source projects including Kubeflow and Docker. Mainstream deep learning tools and frameworks such as TensorFlow, TensorBoard, and JupyterHub are tightly integrated with the platform. Intel claims that it has optimized the software for its Xeon processors to deliver the best possible performance.
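
To give a sense of what such an experiment looks like, here is a minimal TensorFlow training script of the kind a data scientist might package and submit to the platform. The dataset, model, and log directory are illustrative placeholders rather than anything specific to Nauta.

```python
# A minimal TensorFlow training script of the kind that could be packaged
# as a containerized experiment; model, data, and paths are illustrative.
import tensorflow as tf

# Toy dataset: MNIST, loaded through Keras' built-in dataset helper.
(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train.astype("float32") / 255.0

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Writing summaries to a shared log directory lets TensorBoard (bundled
# with the platform) visualize the run; "/mnt/output/tb" is a placeholder.
tensorboard_cb = tf.keras.callbacks.TensorBoard(log_dir="/mnt/output/tb")
model.fit(x_train, y_train, epochs=2, callbacks=[tensorboard_cb])
```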

Nauta is designed to cover the full spectrum of machine learning and data processing pipelines. From data preparation to inference, developers and data scientists can rely on it across the workflow.

With Nauta, developers can define and schedule containerized deep learning experiments using Kubernetes on one or more worker nodes, check the status and results of those experiments to adjust and run further experiments, or prepare the trained model for deployment.
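
Under the hood, this amounts to creating and inspecting Kubeflow training resources on the cluster. The sketch below, which uses the official Kubernetes Python client, shows roughly what submitting a distributed TensorFlow job as a Kubeflow TFJob and polling its status looks like. Nauta wraps these mechanics behind its own tooling; the container image, namespace, and job name here are assumptions for illustration.

```python
# Sketch of the Kubernetes/Kubeflow mechanics Nauta builds on: submitting a
# containerized TensorFlow training job as a TFJob custom resource and then
# polling its status. The image, namespace, and name are placeholders.
from kubernetes import client, config

config.load_kube_config()            # reads the local kubeconfig
api = client.CustomObjectsApi()

tfjob = {
    "apiVersion": "kubeflow.org/v1",
    "kind": "TFJob",
    "metadata": {"name": "mnist-experiment"},
    "spec": {
        "tfReplicaSpecs": {
            "Worker": {
                "replicas": 2,       # distributed training on two worker nodes
                "template": {
                    "spec": {
                        "containers": [{
                            "name": "tensorflow",
                            "image": "example.registry/mnist-train:latest",
                        }]
                    }
                },
            }
        }
    },
}

# Schedule the experiment on the cluster.
api.create_namespaced_custom_object(
    group="kubeflow.org", version="v1", namespace="default",
    plural="tfjobs", body=tfjob)

# Check the experiment's status to decide whether to adjust and rerun it.
status = api.get_namespaced_custom_object(
    group="kubeflow.org", version="v1", namespace="default",
    plural="tfjobs", name="mnist-experiment")
print(status.get("status", {}).get("conditions", []))
```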

Fully trained models can be tested within Nauta for inference. The platform supports both streaming and batch inference, handling either individual data points or a complete dataset sent as a batch.
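
As an illustration, the snippet below sends both a single data point and a batch of records to a TensorFlow Serving-style REST endpoint, the kind of model server a trained model could be exposed through. The host, port, and model name are placeholders rather than the exact interface Nauta exposes.

```python
# Illustrative client calls against a TensorFlow Serving-style REST endpoint;
# the host, port, and model name are placeholders.
import requests

URL = "http://nauta-inference.example.com:8501/v1/models/mnist:predict"

# Streaming-style request: score a single data point as it arrives.
single = requests.post(URL, json={"instances": [[0.0] * 784]})
print(single.json())

# Batch request: send a complete dataset (here, 64 dummy rows) in one call.
batch = requests.post(URL, json={"instances": [[0.0] * 784 for _ in range(64)]})
print(batch.json())
```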

Nauta was designed from the start to support multiple users and facilitate collaboration among team members. Job inputs and outputs can be shared between team members and used to help debug issues by launching TensorBoard against others' job checkpoints.
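
For example, a team member could point a TensorBoard instance at a colleague's shared job output, roughly as in the sketch below; the shared path is a placeholder for wherever the platform stores checkpoints and event files.

```python
# Sketch of launching TensorBoard against a colleague's job output to debug
# a run; the log directory is a placeholder path on shared storage.
from tensorboard import program

tb = program.TensorBoard()
tb.configure(argv=[None, "--logdir", "/mnt/shared/jobs/alice/mnist-experiment/tb"])
url = tb.launch()                    # starts TensorBoard in-process
print(f"TensorBoard available at {url}")
```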

Nauta can run in public cloud and enterprise data center environments. The team at Intel is working on making the installation experience seamless for administrators and DevOps engineers. In its current form, Nauta can be tested on Google Cloud Platform; the GitHub repository contains scripts and tools to set up and configure a test bed on Google Cloud.

Kubernetes is becoming the de facto standard for running modern, distributed workloads. Kubeflow, an open source project initiated by Google, aims to bring the best of machine learning and container orchestration to model management and experimentation. Intel Nauta embraces and extends Kubeflow to support additional scenarios.

As data platforms gain prominence, public cloud vendors including Amazon, Google, IBM, and Microsoft are investing in next-generation PaaS offerings that aim to simplify machine learning model management and experimentation. Open source projects such as Intel's Nauta attempt to bring these ML PaaS capabilities to enterprises. Customers deploying ML platforms in their own data centers would be able to give data scientists and developers the same experience as a managed service hosted in the public cloud.

Nauta is the latest attempt from Intel to capture the enterprise data platform and machine learning markets. It hopes to drive adoption of its Xeon family of processors for advanced deep learning and AI use cases.
