MathWorks has introduced a GPU-accelerated Matlab container, available from the Nvidia GPU Cloud (NGC) container registry for DGX systems and other supported NGC platforms.
Researchers and developers can now run Matlab's deep learning workflow on multiple GPUs in Nvidia DGX systems, on supported cloud service providers, and on select Nvidia GPUs in PCs and workstations.
Teams building AI solutions need access to cloud and HPC resources to minimize training time. With the GPU-accelerated Matlab container from NGC, users can significantly speed up deep learning network training, as well as create, modify, visualize and analyze deep learning networks with Matlab apps and tools.
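As a rough illustration of that workflow, the sketch below trains a small network on Matlab's built-in digit dataset with the execution environment set to 'multi-gpu'. The layer sizes and option values are illustrative assumptions, not settings taken from the container itself, and it assumes Deep Learning Toolbox and Parallel Computing Toolbox are available in the environment.

% Minimal multi-GPU training sketch; all hyperparameters are placeholders.
[XTrain, YTrain] = digitTrain4DArrayData;      % built-in 28x28 digit images

layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3, 16, 'Padding', 'same')
    reluLayer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];

options = trainingOptions('sgdm', ...
    'ExecutionEnvironment', 'multi-gpu', ...   % spread training across all local GPUs
    'MaxEpochs', 5, ...
    'Verbose', true);

net = trainNetwork(XTrain, YTrain, layers, options);

Switching 'multi-gpu' to 'parallel' points the same training call at the current parallel pool, which is how the workflow scales from a single workstation to cluster or cloud resources.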
“NGC provides simple access to fully integrated and optimized software for Nvidia GPU accelerated systems and cloud services,” said Paresh Kharya, director of accelerated computing at Nvidia. “The new Matlab container delivers breakthrough performance with simple installation on all supported NGC platforms, helping developers focus on creating innovative AI solutions.”
David Rich, Matlab marketing director at MathWorks, added: “Accelerating deep learning is key to getting AI projects done, but the process of migrating to cloud or HPC resources can be complex and time-intensive. Preconfigured containers eliminate software installation and integration time, which improves access to Matlab in new compute environments. Now, the NGC user community has access to Matlab and its integrated deep learning workflow, from research to prototype to production.”