New US financial regulations, such as the "Dodd-Frank" Wall Street Reform and Consumer Protection Act, have compelled many banks in the US, Europe, and Asia to look for ways to verify the audibility and intelligibility of voice recordings. Such recordings capture the activities of traders, wealth brokers, and contact-center workers.
One of the main factors that degrades the ability to verify the audibility and intelligibility of voice recordings is the presence of background noise, such as babble, street noise, and car noise.
As part of my work here in IR, I have developed a deep learning-based method that can verify the audibility of the human voice in the presence of background noise. The method, called "Ava", is insensitive to factors such as speaker gender, language, and the level of signal-to-noise ratio (SNR). Ava verifies the audibility of voice recordings with an accuracy of 99.44%.
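For readers unfamiliar with the SNR term used above, the sketch below shows how a signal-to-noise ratio in decibels is conventionally computed from a clean signal and a noise signal. This is a generic illustration of the SNR concept only, not part of Ava's implementation; the synthetic tone and noise levels are made up for the example.

```python
import math
import random

def power(samples):
    # Mean squared amplitude of a sample sequence
    return sum(s * s for s in samples) / len(samples)

def snr_db(signal, noise):
    # Signal-to-noise ratio in decibels: 10 * log10(P_signal / P_noise)
    return 10 * math.log10(power(signal) / power(noise))

# Synthetic example: one second of a 440 Hz tone plus low-level Gaussian noise
rate = 8000
tone = [math.sin(2 * math.pi * 440 * t / rate) for t in range(rate)]
rng = random.Random(0)
noise = [rng.gauss(0, 0.05) for _ in range(rate)]
noisy = [s + n for s, n in zip(tone, noise)]

print(f"SNR: {snr_db(tone, noise):.1f} dB")
```

With the noise standard deviation of 0.05 chosen here, the tone sits at roughly 23 dB above the noise floor; lowering the SNR toward 0 dB is what makes recordings hard to verify as audible.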
We are very excited about the accuracy achieved on the test data and the real-world impact it will have. Using this method, we can dramatically reduce the business risks and costs associated with non-compliance.