Winning Buzzword Bingo With AWS Deep Learning Containers
Amazon’s AWS Santa Clara Summit ‘19 has been chock-full of exciting product announcements, including AWS Deep Learning Containers, a service that provides Docker images intended to simplify the deployment of TensorFlow or Apache MXNet workloads for training deep learning models (at least, according to Amazon).
The idea is to provide ready-to-run images that let end users focus on customizations and workflows instead of configuring largely standard environments. The goal, according to AWS’s general manager of deep learning and AI, is for users to “do less of the undifferentiated heavy lifting and installing these very, very complicated frameworks and then maintaining them.”
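To make that workflow concrete, here is a minimal sketch of launching a training job from a Deep Learning Container image using the Docker SDK for Python. The image URI and paths below are illustrative assumptions, not from the announcement; the actual registry paths and tags vary by region, framework, and version, so check AWS’s documentation before using one.

```python
# Sketch: running a training script inside a prebuilt DLC image.
# The image URI below is illustrative -- real DLC URIs vary by
# region, framework, and version.
import docker

client = docker.from_env()

# Hypothetical TensorFlow training image from AWS's DLC registry.
image = "763104351884.dkr.ecr.us-east-1.amazonaws.com/tensorflow-training:latest"

# Mount a local project directory and run the user's own train.py
# (sketched further below) inside the pre-built environment --
# no framework installation or driver wrangling required.
logs = client.containers.run(
    image,
    command="python /workspace/train.py",
    volumes={"/home/me/project": {"bind": "/workspace", "mode": "rw"}},
    remove=True,  # the container is destroyed when training finishes
)
print(logs.decode())
```

Note the `remove=True` flag: the environment is disposable by design, which is exactly the “destructible environment” value proposition discussed next.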
The cloud’s key value propositions include on-demand self-service and rapid elasticity. The ability to spin up a destructible environment suitable for AI development in a few minutes is valuable, and providing these prebuilt images is one of the ways Amazon will “make machine learning boring.”
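For contrast, here is the kind of `train.py` a user might bring to that container. This is a generic sketch, not code from Amazon: everything in it relies only on the TensorFlow already baked into the image, which is the point, since the user writes the model while the container supplies the environment.

```python
# train.py -- a minimal sketch of a user-supplied training script.
# It assumes only the TensorFlow bundled in the DLC image; nothing
# here installs or configures a framework.
import tensorflow as tf

# Load and normalize the standard MNIST digits dataset.
(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train / 255.0

# A small feed-forward classifier.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1)
```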
Our Take
This story is about operating-system–level virtualization (Docker containers) and deep learning, both of which will change IT. Containers are powerful tools for reducing overhead and building microservices; applications driven by deep learning are the future, and as they become easier to build, users will come to expect them. The democratization of cloud artificial intelligence should not be ignored. If your developers have not yet explored machine or deep learning (for whatever reason), now might be a good time to come aboard.