There is a growing interest in running software at the edge. This course takes a deep dive into the use cases and applications of Kubernetes at the edge using examples, labs, and a technical overview of the K3s project and the cloud native edge ecosystem.
This course is designed for those interested in learning more about Kubernetes, as well as in deploying applications or embedded sensors at edge locations. Learners do not need a Kubernetes certification, and programming experience is not strictly required, but experience with a Linux operating system and shell scripting will be beneficial. Learners will need to be able to run Docker on their computers.
In this course, you will learn about the use cases for running compute at edge locations and about supporting projects and foundations such as LF Edge and the CNCF. The course covers how to deploy applications to the edge with open source tools such as K3s and k3sup, and how those tools can be applied to low-power hardware such as the Raspberry Pi. You will learn about the challenges associated with edge compute, such as partial availability and the need for remote access. Through practical examples, you will gain experience deploying applications to Kubernetes and get hands-on with object storage, MQTT, and OpenFaaS. The course also introduces the fleet management and GitOps models of deployment, and explains messaging and how to interface with sensors and real hardware.
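To give a sense of the kind of hands-on messaging exercise involved, the sketch below publishes a simulated sensor reading over MQTT using the paho-mqtt Python client. The broker address, topic name, and payload are illustrative assumptions, not values taken from the course material.

```python
# Minimal sketch: publish a simulated sensor reading to an MQTT broker.
# Uses the paho-mqtt 1.x client API ("pip install 'paho-mqtt<2'").
# The broker host, topic, and payload are illustrative assumptions.
import json
import time

import paho.mqtt.client as mqtt

BROKER_HOST = "mqtt.example.local"   # assumed broker address
TOPIC = "sensors/pi-01/temperature"  # assumed topic layout

client = mqtt.Client(client_id="edge-demo")
client.connect(BROKER_HOST, 1883, keepalive=60)
client.loop_start()  # handle network traffic in a background thread

# Publish one reading; a real edge device would read from actual hardware.
reading = {"celsius": 21.5, "timestamp": time.time()}
client.publish(TOPIC, json.dumps(reading), qos=1)

client.loop_stop()
client.disconnect()
```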
This course will enable developers to learn about the growing impact the cloud native movement is having on modernizing edge deployments. They will also learn about the challenges of deploying Kubernetes at the edge through a concrete example: the K3s project.