Here's how Microsoft Teams plans to use AI to suppress background noise


One of the most annoying things during a video conferencing call is background noise. That could be someone in the meeting "jamming out" while typing meeting notes on their keyboard, or just someone eating a bag of chips. This week, Microsoft announced it will soon deliver a feature for Teams known as real-time noise suppression, which should help resolve those common problems, and the folks at VentureBeat also recently got an inside look at how it all works.

According to the report, noise suppression isn't new; it has already shipped in Teams and Skype. What will be new, however, is how A.I. is used to suppress noise. Teams will soon be able to tell the difference between what are called stationary noises (like a fan) and non-stationary noises (like police sirens).
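The stationary/non-stationary distinction the report describes can be illustrated with a toy heuristic (this is an assumption for illustration, not Microsoft's actual model): stationary noise like a fan keeps roughly constant energy from audio frame to audio frame, while a siren's energy swings widely, so the relative spread of per-frame energies separates the two.

```python
import numpy as np

def is_stationary(signal, frame_len=256, threshold=0.5):
    """Toy classifier: stationary noise (e.g. a fan) has nearly
    constant per-frame energy; non-stationary noise (e.g. a siren)
    fluctuates, giving a large relative spread of frame energies."""
    n_frames = len(signal) // frame_len
    frames = signal[: n_frames * frame_len].reshape(n_frames, frame_len)
    energies = np.mean(frames ** 2, axis=1)          # energy per frame
    spread = np.std(energies) / (np.mean(energies) + 1e-12)
    return spread < threshold

rng = np.random.default_rng(0)
t = np.arange(16000) / 16000.0                       # one second at 16 kHz

fan = rng.normal(0, 0.1, size=t.size)                # steady hiss
siren = np.sin(2 * np.pi * 600 * t) * np.abs(np.sin(2 * np.pi * 1.5 * t))

print(is_stationary(fan))    # True: energy barely varies
print(is_stationary(siren))  # False: amplitude-modulated tone
```

A real system would of course operate on spectral features and a trained neural network rather than a single variance threshold, but the underlying signal property is the same.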

To accomplish this, Microsoft open-sourced its noise-suppression training data on GitHub to extend the data set and learn more about how to train the A.I. At the same time, though, some types of noise, like singing or laughing, might not be filtered out of calls. Microsoft says in the report that it can't isolate those sounds from human voices since they occur at similar frequencies.

The company compared machine learning models for noise suppression against models for speech recognition and trained its models to understand the differences. This process involved picking out representative data sets, applying machine learning, and tweaking the models accordingly.
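That pick-a-dataset, fit, then evaluate-and-tweak loop can be sketched in miniature. This is a hypothetical stand-in, not Microsoft's pipeline: the "model" here is just one gain per frequency bin, fit by least squares on a training split so that the gain applied to a noisy spectrum approximates the clean one, then checked on a held-out split.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in data: per-frame magnitude spectra where
# noisy = clean speech energy + additive noise energy.
n_bins, n_train, n_test = 64, 500, 100
clean = rng.uniform(0.5, 1.0, size=(n_train + n_test, n_bins))
noise = rng.uniform(0.0, 0.4, size=(n_train + n_test, n_bins))
noisy = clean + noise

# "Training": fit one suppression gain per frequency bin by least
# squares so that gain * noisy ≈ clean on the training split.
tr_noisy, tr_clean = noisy[:n_train], clean[:n_train]
gain = np.sum(tr_noisy * tr_clean, axis=0) / np.sum(tr_noisy ** 2, axis=0)

# "Tweaking": evaluate on the held-out split and compare against
# doing nothing; a worse score would send you back to the data.
te_noisy, te_clean = noisy[n_train:], clean[n_train:]
err_before = np.mean((te_noisy - te_clean) ** 2)
err_after = np.mean((gain * te_noisy - te_clean) ** 2)
print(err_after < err_before)  # the fitted gains reduce test error
```

The production version replaces the per-bin gains with a neural network, but the train-on-one-set, score-on-a-representative-test-set loop Aichner describes later in the article has the same shape.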

Of course, there are privacy concerns about all this, but Robert Aichner, Microsoft Teams group program manager, told VentureBeat that it shouldn't be a worry. He says Microsoft can't look at customer data or listen in on Teams calls. “I can’t just simply say, ‘Now I record every meeting,'” said Aichner.

The company also has what it calls a "smaller-scale" effort to collect real recordings. "So we have a test set which we believe is even more representative of real meetings. And then, we see if we use a certain training set, how well does that do on the test set? So ideally yes, I would love to have a training set," said Aichner.

Other issues discussed in the VentureBeat report include building and improving a neural network. The report also looks at how data moves between the cloud (Azure) and the edge.

(Thanks for the tip, Ben!)
