Amazon is using self-learning techniques to help Alexa better understand users

From theverge.com


In a developer blog post published today, Alexa AI director of applied science Ruhi Sarikaya detailed the advances in machine learning technologies that have allowed Alexa to better understand users through contextual clues. According to Sarikaya, these improvements have played a role in reducing user friction and making Alexa more conversational.

Since this fall, Amazon has been working on self-learning techniques that teach Alexa to automatically recover from its own errors. The system, which had been in beta until now, launched in the US this week. It doesn’t require any human annotation; according to Sarikaya, it uses customers’ “implicit or explicit contextual signals to detect unsatisfactory interactions or failures of understanding.” Those signals range from a customer’s historical activity, preferences, and the Alexa skills they use to where the Alexa device is located in the home and what kind of device it is.
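To make the general idea concrete, here is a minimal Python sketch of how contextual signals could flag a bad interaction and pair it with the user’s own correction, yielding a training signal with no human labeling. The class, field names, and heuristics below are hypothetical illustrations under simple assumptions, not Amazon’s actual system.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical schema for one voice interaction and its surrounding context.
@dataclass
class Interaction:
    transcript: str                    # what Alexa heard
    followup: Optional[str]            # what the user said next, if anything
    barged_in: bool                    # user interrupted Alexa's response
    device_type: str                   # e.g. "Echo Dot"
    device_location: str               # e.g. "kitchen"
    enabled_skills: List[str] = field(default_factory=list)

def looks_unsatisfactory(turn: Interaction) -> bool:
    """Use implicit/explicit signals to guess that a turn went wrong."""
    # Explicit signal: the user corrects Alexa in the very next utterance.
    if turn.followup and turn.followup.lower().startswith(("no,", "i said", "i meant")):
        return True
    # Implicit signal: the user cuts off the response and immediately rephrases.
    if turn.barged_in and turn.followup:
        return True
    return False

def proposed_rewrite(turn: Interaction) -> Optional[str]:
    """Treat the user's rephrase as the corrected intent for a flagged turn."""
    return turn.followup if looks_unsatisfactory(turn) else None

# Example: a misheard music request in the kitchen gets paired with its rephrase.
turn = Interaction(
    transcript="play the song good for what",
    followup="no, I meant play God's Plan",
    barged_in=True,
    device_type="Echo Dot",
    device_location="kitchen",
    enabled_skills=["music"],
)
print(proposed_rewrite(turn))  # -> "no, I meant play God's Plan"
```

A production system would of course learn these patterns statistically from the full range of signals the post describes (history, preferences, skills, device type and location) rather than rely on hand-written rules like these.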
