From experiments in robotics to old-school video game play research, OpenAI’s work in artificial intelligence technology is meant to be shared. With a mission to ensure the safety of powerful AI systems, they care deeply about open source—both benefiting from it and contributing to it.
In this case study, OpenAI describes how their use of Kubernetes on Azure made it easy to launch experiments, scale nodes up and down, reduce costs for idle nodes, and achieve low latency and rapid iteration, all with minimal management effort.
Containers are connected and coordinated using Kubernetes, which serves as both a batch scheduler and a workload manager. Workloads run both in the cloud and in OpenAI's own data center, a hybrid model that lets them take advantage of lower cloud costs while retaining access to the specialized hardware in their data center.
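Kubernetes' built-in Job resource illustrates the batch-scheduling role described above: a container runs to completion and is retried on failure. The manifest below is a minimal sketch, not OpenAI's actual configuration; the name, image, and resource figures are placeholders.

```yaml
# Hypothetical example, not OpenAI's configuration: a minimal
# Kubernetes batch Job that runs one experiment container to completion.
apiVersion: batch/v1
kind: Job
metadata:
  name: experiment-run              # placeholder name
spec:
  backoffLimit: 2                   # retry a failed pod up to twice
  template:
    spec:
      restartPolicy: Never          # Jobs require Never or OnFailure
      containers:
        - name: trainer
          image: example.com/experiment:latest   # placeholder image
          resources:
            requests:
              cpu: "4"              # illustrative resource request
              memory: 8Gi
```

Submitted with `kubectl apply -f`, the scheduler places the pod on a node with free capacity, which is what makes it straightforward to scale experiments across cloud and on-premises nodes alike.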