The far edge is quickly becoming more than a home for IoT sensors and small devices. It's where local processing, automation, and AI inference increasingly need to run. As more intelligence moves closer to where data is created, teams face challenges around footprint, scale, and reliable operations across distributed and often unstable environments.
This talk looks at practical ways to run Kubernetes at the far edge to support both IoT and AI workloads. It covers three deployment patterns: a single‑node edge cluster that serves tightly constrained locations; an edge‑only cluster, with both the control plane and workers running locally, that provides full independence; and an externally hosted control plane, running in the cloud or a datacenter, that manages remote edge workers to keep operations lightweight at scale.
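As a rough sketch of the third pattern, a tool like k0sctl can describe a cluster whose controller lives in the cloud while workers sit at edge sites; the addresses below are hypothetical placeholders, and the exact fields may vary by k0sctl version:

```yaml
apiVersion: k0sctl.k0sproject.io/v1beta1
kind: Cluster
metadata:
  name: far-edge-demo
spec:
  hosts:
    # Control plane hosted centrally (cloud or datacenter)
    - role: controller
      ssh:
        address: 203.0.113.10   # hypothetical controller address
        user: root
    # Worker running at the remote edge site
    - role: worker
      ssh:
        address: 198.51.100.20  # hypothetical edge-node address
        user: root
```

With this split, the edge site runs only the worker components, which keeps the local footprint small while day‑to‑day operations stay centralized.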
As an example use case, we examine these cluster topologies by bridging the worlds of IoT and AI while running entirely at the edge. Using lightweight Kubernetes distributions such as k0s and device‑orchestration tools such as Akri, we’ll show how open source tooling can surface sensors, cameras, and other devices as native Kubernetes resources and provide practical ways to push applications, including AI inference, to the edge.
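To give a flavor of the device‑orchestration side, an Akri Configuration might use the udev discovery handler to expose local cameras as Kubernetes resources and launch a broker pod against each one. This is a sketch only: the broker image is hypothetical, and field details depend on the Akri version in use:

```yaml
apiVersion: akri.sh/v0
kind: Configuration
metadata:
  name: akri-udev-video
spec:
  discoveryHandler:
    name: udev
    discoveryDetails: |+
      udevRules:
      - 'KERNEL=="video[0-9]*"'   # match local V4L camera devices
  brokerSpec:
    brokerPodSpec:
      containers:
        - name: camera-broker
          image: example.com/inference-broker:latest  # hypothetical image
  capacity: 1
```

Once discovered, each camera appears as a schedulable resource, so inference workloads can request it like any other Kubernetes resource.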
Attendees will leave with a clear understanding of practical architectural choices and tooling for running IoT and AI workloads at the edge, along with strategies to build systems that remain manageable and reliable even in challenging environments.