Microsoft's Computational Network Toolkit (CNTK) stacks up well against the competition

Kareem Anderson

Microsoft’s push into deep learning has benefited projects such as Cortana, Skype Translator, and the Project Oxford Speech APIs. In keeping with the company’s modern development stance, Microsoft has been sharing its secret sauce with the world. The tool behind the company’s success in deep learning is its Computational Network Toolkit (CNTK), which was released under an open source license during the ICASSP conference back in April of this year.
CNTK is a computational network framework that provides algorithms for both forward computation and gradient calculation. Beyond the mechanics of CNTK, Microsoft is preparing a workable GPU platform to more easily analyze the massive amounts of data it collects. Microsoft will combine CNTK with the upcoming Azure GPU Lab, which is intended to modernize current GPU platforms and offer a more attractive environment for advanced deep learning research.
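To make the two core operations concrete, here is a minimal NumPy sketch of what a computational network framework like CNTK automates: a forward pass through a tiny one-hidden-layer network, and the matching gradient calculation via backpropagation. This is an illustration of the general technique only, not CNTK's actual API; all names (`forward`, `gradients`, the weight shapes) are invented for this example.

```python
import numpy as np

# Illustrative sketch (not CNTK's API): a tiny computational network
# with one hidden tanh layer, showing the two operations such a
# framework provides -- forward computation and gradient calculation.

rng = np.random.default_rng(0)
W1 = rng.standard_normal((3, 4)) * 0.1   # input -> hidden weights
W2 = rng.standard_normal((4, 1)) * 0.1   # hidden -> output weights

def forward(x):
    """Forward computation: propagate an input through the network."""
    h = np.tanh(x @ W1)          # hidden activations
    y = h @ W2                   # linear output
    return h, y

def gradients(x, target):
    """Gradient calculation: backpropagate a squared-error loss."""
    h, y = forward(x)
    dy = y - target                 # dLoss/dy for 0.5 * (y - target)^2
    dW2 = h.T @ dy                  # gradient w.r.t. output weights
    dh = dy @ W2.T                  # error flowing back into hidden layer
    dW1 = x.T @ (dh * (1 - h**2))   # tanh'(z) = 1 - tanh(z)^2
    return dW1, dW2

x = rng.standard_normal((1, 3))
target = np.array([[1.0]])
dW1, dW2 = gradients(x, target)
```

In a toolkit like CNTK the user describes only the network structure; the framework derives both passes automatically, which is what makes scaling them across GPUs worthwhile.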
Microsoft’s chief speech scientist, Xuedong Huang, has found that combining Azure GPU Lab and CNTK delivers a significant boost in machine learning performance.

“The combination of CNTK and Azure GPU Lab allows us to build and train deep neural nets for Cortana speech recognition up to 10 times faster than our previous deep learning system. Our Microsoft colleagues also have used CNTK to run other tasks, such as ImageNet classification and a deep structured semantic model. We’ve seen firsthand the kind of performance CNTK can deliver, and we think it could make an even greater impact within the broader machine learning and AI community.”

Huang and company have also pitted CNTK against other well-known deep learning toolkits such as Torch, Theano, and Caffe, as well as open source initiatives from competitors IBM and Google, and found that CNTK is just as good as, and in some cases better than, the competition. When it comes to scalability, CNTK handles 4 or 8 GPUs and scales well beyond a couple of machines and 8 GPUs while still performing very well.
[Image: speed comparison of deep learning toolkits]
We reported last week about Huang’s efforts with AI, speech translation, and the work Microsoft has put into projects such as Cortana and Skype Translator. Huang and fellow Microsoft Research colleague Dong Yu are looking forward to presenting their deep learning work at the Neural Information Processing Systems (NIPS) conference this year.