Cloud native data and model lifecycle management for AI

As the integration of Artificial Intelligence (AI) into business ecosystems accelerates, the demand for robust, scalable infrastructure to support AI workloads has become paramount. Cloud-native technologies offer unparalleled flexibility and scalability, making them ideal for deploying AI infrastructure. Yet they also decouple AI compute from the datasets and models held in storage systems, which can introduce bottlenecks throughout the AI pipeline.
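
To make the compute/storage decoupling concrete, here is a minimal sketch (not from the talk itself) of a training pod that streams dataset shards from object storage instead of local disk; the bucket name, shard layout, and boto3-based prefetching are illustrative assumptions, showing one way to keep remote storage from stalling the pipeline.

```python
# Minimal sketch (hypothetical bucket/key names): a cloud-native training pod
# has no local copy of the dataset, so it streams shards from object storage.
# Overlapping downloads with compute via a background prefetch thread is one
# way to keep the decoupled storage tier from becoming the bottleneck.
import queue
import threading

import boto3

BUCKET = "example-training-data"  # assumption: illustrative bucket name
SHARD_KEYS = [f"dataset/shard-{i:05d}.tar" for i in range(8)]  # hypothetical layout

s3 = boto3.client("s3")
prefetched = queue.Queue(maxsize=2)  # bounded buffer: prefetch stays just ahead of training

def prefetch_worker():
    """Download shards ahead of the training loop so accelerators are not left idle."""
    for key in SHARD_KEYS:
        body = s3.get_object(Bucket=BUCKET, Key=key)["Body"].read()
        prefetched.put(body)
    prefetched.put(None)  # sentinel: no more shards

threading.Thread(target=prefetch_worker, daemon=True).start()

while True:
    shard = prefetched.get()
    if shard is None:
        break
    # train_on_shard(shard)  # placeholder for the actual training step
    print(f"consumed shard of {len(shard)} bytes")
```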

This talk delves into the intricacies of managing the full lifecycle of AI data and models in a cloud-native environment. We explore key challenges, including data ingestion, preprocessing, versioning, model training, deployment, and serving. We also highlight best practices and cloud-native solutions that enable fast data pipelines, efficient resource utilization, and rapid model deployment, ultimately ensuring the delivery of high-quality AI applications. Real-world case studies from industry leaders such as Meta, Uber, and Shopee illustrate how these challenges are solved in production environments.
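
As a small illustration of the versioning and deployment stages listed above (not taken from the talk), the sketch below assumes object storage is used as a simple model registry; the bucket name, key layout, and helper functions are hypothetical.

```python
# Minimal sketch (hypothetical names/paths): one common cloud-native pattern
# is to treat object storage as the model registry. Each training run uploads
# an immutable, content-addressed artifact under a versioned prefix, and the
# serving layer resolves a "production" alias to a concrete version.
import hashlib
import json
from pathlib import Path

import boto3

BUCKET = "example-model-registry"  # assumption: illustrative bucket name
s3 = boto3.client("s3")

def publish_model(artifact: Path, name: str) -> str:
    """Upload an immutable model version keyed by content hash; return its key."""
    digest = hashlib.sha256(artifact.read_bytes()).hexdigest()[:12]
    key = f"{name}/versions/{digest}/model.bin"
    s3.upload_file(str(artifact), BUCKET, key)
    return key

def promote(name: str, version_key: str) -> None:
    """Point the 'production' alias at a specific version; servers read this pointer."""
    pointer = json.dumps({"model_key": version_key}).encode()
    s3.put_object(Bucket=BUCKET, Key=f"{name}/aliases/production.json", Body=pointer)

# Example usage: publish a newly trained artifact, then promote it for serving.
# key = publish_model(Path("model.bin"), "recsys-ranker")
# promote("recsys-ranker", key)
```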

Room:
Ballroom H
Time:
Saturday, March 16, 2024 - 12:30 to 13:30