Talk

Green AI in Cloud Native Ecosystems: Strategies for Sustainability and Efficiency

Abstract

The rapid proliferation of AI is drawing increasing attention to the environmental costs of large-scale model training and deployment. As cloud-native technologies form the backbone of modern AI systems, the Cloud Native Computing Foundation (CNCF) is spearheading efforts to balance AI innovation with sustainability. This session will provide an overview of the CNCF effort to identify key areas, techniques, and best practices for energy-efficient AI in cloud-native environments. Attendees will gain insights into a newly developed taxonomy that categorises remediation patterns and sustainable practices across AI lifecycle phases, deployment environments, and personas.

We will also explore real-world applications and discuss reference architectures for optimising resource use, such as GPU slicing for inference efficiency, power capping during training, and carbon-aware scheduling, while maintaining performance and scalability.
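
To give a feel for the carbon-aware scheduling idea mentioned above, the sketch below shows the core selection logic in Python: given an hourly forecast of grid carbon intensity, pick the contiguous window with the lowest average intensity for a deferrable training job. This is an illustrative toy, not a CNCF reference implementation; the forecast values, the Window type, and the best_window function are invented for this example, and a real deployment would pull forecasts from a carbon-intensity API and hand the decision to the cluster scheduler.

```python
from dataclasses import dataclass


@dataclass
class Window:
    start_hour: int       # offset from now, in hours
    avg_intensity: float  # mean grid carbon intensity (gCO2e/kWh) over the window


def best_window(forecast: list[float], job_hours: int) -> Window:
    """Return the contiguous window of `job_hours` with the lowest mean carbon intensity."""
    best = None
    for start in range(len(forecast) - job_hours + 1):
        avg = sum(forecast[start:start + job_hours]) / job_hours
        if best is None or avg < best.avg_intensity:
            best = Window(start_hour=start, avg_intensity=avg)
    return best


# Made-up 12-hour forecast purely for illustration.
forecast = [420, 390, 350, 310, 280, 260, 300, 380, 450, 470, 430, 400]
print(best_window(forecast, job_hours=3))  # Window(start_hour=4, avg_intensity=280.0)
```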
