Alexa may soon imitate dead people’s voices based on recordings

Alexa would have to learn how to produce “high-quality voice” from short recordings.

Amazon did not provide further details about the feature, which is bound to raise more privacy concerns and ethical questions about consent.

Amazon’s Alexa may soon be able to replicate the voices of family members, even if they’re dead. The capability, unveiled at Amazon’s Re:Mars conference in Las Vegas, is in development and would allow the virtual assistant to mimic a specific person’s voice based on less than a minute of provided recording. Rohit Prasad, senior vice president and principal scientist at Alexa, said at the event on Wednesday that the goal behind the feature was to build more trust with users by instilling more “human attributes of empathy and affect” in Alexa.
“These attributes have become even more important during the ongoing pandemic, when many of us have lost our dear ones,” Prasad said. “While AI may not eliminate that pain of loss, it can certainly make their memories last.” In a video played by Amazon at the event, a young child asks, “Alexa, can Grandma finish reading The Wizard of Oz to me?” Alexa acknowledges the request and switches to another voice, imitating the child’s grandmother. The voice assistant then continues reading the book in that voice.
To build the feature, Prasad said, the company had to learn how to produce “high-quality voice” from short recordings, as opposed to hours of recording in a studio. Amazon didn’t provide further details about the feature, which is bound to raise more privacy concerns and ethical questions about consent.
Amazon’s push comes as competitor Microsoft said earlier this week that it was scaling back its synthetic voice offerings and setting strict guidelines to “ensure the active participation of the speaker” whose voice is recreated. Microsoft said Tuesday that it is limiting which customers have access to the service, while continuing to highlight acceptable uses such as an interactive Bugs Bunny character at AT&T stores. “This technology has exciting potential in education, accessibility and entertainment, and yet it is also easy to imagine how it could be used to impersonate speakers and deceive listeners,” Natasha Crampton, who heads Microsoft’s AI ethics division, said in a blog post.
