After spending a week talking about artificial intelligence in Atlanta, Georgia, during the Ignite 2016 conference, Microsoft turned around and hosted its first-ever Machine Learning & Data Science Summit in the same city, this time speaking to big data engineers and machine learning developers.
While easily mistaken for one another, machine learning and artificial intelligence are not one and the same, as many people believe them to be.
Joseph Sirosh, corporate vice president of the Data Group at Microsoft, spent much of the summit explaining the differences, similarities, and ties between machine learning and artificial intelligence, as well as the company’s goals for combining the two in the future.
For a glimpse of what to expect, here are just a couple of highlights from the summit.
This is the pattern where intelligence lives with the data in the database. Imagine a core transactional enterprise application built on a database like SQL Server. What if you could embed intelligence – i.e., advanced analytics algorithms and intelligent data transformations – within the database itself to make every transaction intelligent in real time? This is now possible for the first time with R and machine learning built into SQL Server 2016.
At the Summit, we showed off a fascinating demo titled Machine Learning @ 1,000,000 predictions per second, in which we showcased real-time predictive fraud detection and scoring in SQL Server. By combining the performance of SQL Server in-memory OLTP and in-memory columnstore with R and machine learning, apps can get remarkable performance in production, as well as the throughput, parallelism, security, reliability, compliance certifications and manageability of an industrial-strength database engine. To put it simply, intelligence – i.e., models – becomes just like data, allowing models to be managed in the database and to exploit all the sophisticated capabilities of the database engine. The performance of models can be reported, their access can be controlled, and, moreover, because these models live in the database, they can be shared by multiple applications. There is no longer any reason for intelligence to be “locked up” in a particular app.
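The “models as data” idea above can be illustrated in miniature. The sketch below uses SQLite and pickle purely as stand-ins for SQL Server and its R/ML runtime, and the “fraud model” is a hypothetical threshold rule, not anything from the demo: the model is stored as a row, loaded from the database, and exposed as a SQL function so queries can score transactions in place.

```python
# Minimal sketch of the "intelligence lives with the data" pattern.
# SQLite + pickle stand in for SQL Server and its in-database ML runtime;
# the model here is a toy threshold rule, purely for illustration.
import pickle
import sqlite3

# A toy "model": flag transactions above a learned amount threshold.
model = {"threshold": 900.0}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE models (name TEXT PRIMARY KEY, blob BLOB)")
conn.execute("CREATE TABLE txns (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO txns VALUES (?, ?)",
                 [(1, 120.0), (2, 950.0), (3, 430.0)])

# The model is managed as a row, just like data.
conn.execute("INSERT INTO models VALUES (?, ?)",
             ("fraud_v1", pickle.dumps(model)))

# Load the stored model and expose scoring as a SQL function,
# so every query can make predictions in place.
blob = conn.execute(
    "SELECT blob FROM models WHERE name = 'fraud_v1'").fetchone()[0]
loaded = pickle.loads(blob)

def score(amount):
    """Return 1 if the transaction looks fraudulent under the stored model."""
    return int(amount > loaded["threshold"])

conn.create_function("score", 1, score)

flagged = conn.execute(
    "SELECT id FROM txns WHERE score(amount) = 1").fetchall()
print(flagged)  # ids of transactions flagged by the in-database model
```

Because the model lives in the same store as the transactions, any application with database access can share it, which is the point the pattern makes.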
I had two customers come up on stage to showcase this pattern: PROS, which uses ML for revenue management in a SaaS app on Azure, and Jack Henry & Associates, which uses it for loan charge-off prediction.
One of PROS’ customers is an airline that needs to respond to over 100 million unique, individualized price requests each day. They use Bayesian statistics, linear programming, dynamic programming and a slew of other technologies to create price curves. Each response must be delivered in under 200 milliseconds, so the system has to be incredibly fast. It’s practically impossible for humans to do this – to understand the market economics using all available data, and to do so in under 200 milliseconds – so you really need autonomous software handling it. The combination of SQL Server 2016 and Azure provided the unified platform and global footprint that made it much easier for PROS to accomplish this. In fact, 57% of the over 3.5 billion people who travel on planes annually are touched by PROS software powered by Azure – it’s pretty amazing.
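A common way to meet a latency budget like this is to split the work: the expensive optimization (the Bayesian and linear-programming models) runs offline to produce price curves, while the request path does only a cheap lookup. The sketch below shows that split under illustrative assumptions; the route name, breakpoints and prices are invented, not PROS’s actual system.

```python
# Hypothetical sketch: expensive pricing models run offline to produce
# price curves; the online request path is just a fast lookup.
# Route names, breakpoints and prices are illustrative only.
import bisect
import time

# Offline step: imagine the Bayesian/LP models emitting, per route, a
# sorted list of (days_until_departure, price) breakpoints.
price_curves = {
    "OSL-SEA": [(0, 980.0), (7, 640.0), (30, 410.0), (90, 290.0)],
}

def quote(route, days_out):
    """Online step: find the applicable breakpoint with a binary search."""
    curve = price_curves[route]
    days = [d for d, _ in curve]
    # Rightmost breakpoint whose days_until_departure <= days_out.
    i = bisect.bisect_right(days, days_out) - 1
    return curve[max(i, 0)][1]

start = time.perf_counter()
price = quote("OSL-SEA", 14)  # booking 14 days before departure
elapsed_ms = (time.perf_counter() - start) * 1000
print(price, elapsed_ms < 200)
```

The design choice is the interesting part: none of the heavy modeling happens per request, which is what makes a sub-200-millisecond answer feasible at 100-million-requests-per-day scale.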
The third pattern I talked about is Deep Intelligence. Azure now enables users to perform very sophisticated deep learning with ease, using our new GPU VMs. These VMs combine powerful hardware (NVIDIA Tesla K80 or M60 GPUs) with cutting-edge, high-efficiency integration technologies such as Discrete Device Assignment, bringing a new level of deep learning capability to public clouds. Joining me on stage last week was one of our partners, eSmart Systems, who presented a demo of their Connected Drones, which are capable of inspecting electric power lines for faults using image recognition.
eSmart Systems is a small, dynamic and young company out of Norway that made an early decision to build all of their products completely on the Microsoft Azure platform. Their mission is to bring big data analytics to utilities and smart cities, and Connected Drone is their next step in delivering value to customers through Azure. The way they put it, “Connected Drone is a way of making Azure intelligence mobile”. The objective of Connected Drone is to support inspections of power lines which, today, are performed either by ground crews walking miles and miles of power lines, or through dangerous helicopter missions to monitor these lines aerially (if there is one place you don’t want humans in helicopters, it’s over high-voltage power lines). With Connected Drone, eSmart uses deep learning to automate as much of the inspection process as possible.
Connected Drones cover all stages of the inspection process, from the initiation of the inspection plan to the planning of the drone mission and then the execution of the mission. As they fly over power lines, the drones stream live data through Azure for analytics. eSmart Systems uses different types of neural networks, including deep neural networks (DNNs), to do this, and these are all deployed on Azure GPUs. They analyze both still images and videos from the drones and are able to recognize objects in real time. The system can currently process video at 10 to 50 frames per second. Among the many challenges they’ve faced, one they called out on stage is class imbalance: they have many different types of objects to recognize, and some are far more common than others. To prevent a biased classifier, the solution they came up with is to mix real and synthetic images to train their models. This is truly machine teaching. I call it Mission NN-possible!
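The balancing idea eSmart described can be sketched simply: count each class in the real training data, then top up the rare classes with synthetic examples until every class carries equal weight. In the sketch below, `make_synthetic`, the class labels and the counts are all hypothetical stand-ins; a real pipeline would render synthetic images of power-line components rather than return placeholders.

```python
# Sketch of balancing a skewed training set by mixing real and
# synthetic examples. make_synthetic, the labels and the counts are
# hypothetical; a real system would generate synthetic images.
from collections import Counter

def make_synthetic(label):
    """Hypothetical synthetic-image generator; returns a placeholder."""
    return ("synthetic", label)

# A skewed set of real labeled images: intact insulators are common,
# cracked insulators (the faults we most need to detect) are rare.
real = [("real", "insulator")] * 50 + [("real", "cracked_insulator")] * 5

counts = Counter(label for _, label in real)
target = max(counts.values())

# Top up every class to the size of the largest one.
balanced = list(real)
for label, n in counts.items():
    balanced += [make_synthetic(label) for _ in range(target - n)]

print(Counter(label for _, label in balanced))
```

After balancing, every class appears the same number of times, so a classifier trained on the mix is no longer pushed toward always predicting the majority class.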
Other core topics of discussion included intelligent bots built with Cognitive Services on Azure, the intersection of humans and machine learning, and a look into Intelligent Lake.
For a word-for-word rundown of everything discussed during Microsoft’s first-ever Machine Learning & Data Science Summit, interested readers can grab the PDF here or visit Microsoft’s TechNet blog on Server and Tools.