Amazon uses kid’s dead grandma in morbid demo of Alexa audio deepfake

The 4th-gen Amazon Echo Dot smart speaker.

Amazon

Amazon is figuring out how to make its Alexa voice assistant deepfake the voice of anyone, dead or alive, with just a short recording. The company demoed the feature at its re:Mars conference in Las Vegas on Wednesday, using the emotional trauma of the ongoing pandemic and grief to sell interest.

Amazon’s re:Mars focuses on artificial intelligence, machine learning, robotics, and other emerging technologies, with technical experts and industry leaders taking the stage. During the second-day keynote, Rohit Prasad, senior vice president and head scientist of Alexa AI at Amazon, showed off a feature being developed for Alexa.

In the demo, a child asks Alexa, “Can Grandma finish reading me The Wizard of Oz?” Alexa responds, “Okay,” in her typical feminine, robotic voice. But next, the voice of the child’s grandma comes out of the speaker to read L. Frank Baum’s story.

You can watch the demo below:

Amazon re:MARS 2022 – Day 2 – Keynote.

Prasad only said Amazon is “working on” the Alexa capability and did not specify what work remains or when/if it will be available.

He did provide minute technical details, however.

“This required invention where we had to learn to produce a high-quality voice with less than a minute of recording versus hours of recording in a studio,” he said. “The way we made it happen is by framing the problem as a voice-conversion task and not a speech-generation task.”

Prasad very briefly discussed how the feature works.
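To make the distinction Prasad drew concrete: a speech-generation framing would train a whole text-to-speech model on the target speaker, which typically takes hours of studio recordings, while a voice-conversion framing synthesizes speech in any stock voice first and then re-renders its timbre from a short reference clip. The following is a minimal, purely hypothetical Python sketch of that pipeline split; every function here is a stand-in stub invented for illustration and has no connection to Amazon’s actual system.

```python
from dataclasses import dataclass

@dataclass
class Mel:
    """Placeholder for a mel-spectrogram (acoustic features of speech)."""
    frames: list

def text_to_speech(text: str) -> Mel:
    """Stand-in for an ordinary TTS model speaking in a stock voice."""
    return Mel(frames=[f"frame:{word}" for word in text.split()])

def extract_speaker_embedding(reference_seconds: float) -> list:
    """Stand-in for an encoder that summarizes a target speaker's timbre
    from a short reference clip (here, under a minute of audio)."""
    return [0.1, -0.3, 0.7]  # dummy fixed-length embedding

def convert_voice(source: Mel, speaker_embedding: list) -> Mel:
    """Stand-in for a voice-conversion model: keeps the words and prosody
    of `source` but re-renders them in the target speaker's timbre."""
    return Mel(frames=[f"{f}->target" for f in source.frames])

# Speech-generation framing: training a bespoke TTS model on the target
# speaker would require hours of clean studio recordings.
#
# Voice-conversion framing: synthesize in any stock voice first, then
# convert, so only a short reference clip of the target voice is needed.
stock = text_to_speech("Once upon a time in the Land of Oz")
grandma = extract_speaker_embedding(reference_seconds=45.0)
converted = convert_voice(stock, grandma)
print(f"{len(converted.frames)} frames rendered in the target voice")
```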

Of course, deepfaking has earned a controversial reputation. Still, there has been some effort to use the tech as a tool rather than a means for creepiness.

Audio deepfakes specifically, as noted by The Verge, have been leveraged in the media to help compensate for when, say, a podcaster flubs a line or when the star of a project dies suddenly, as happened with the Anthony Bourdain documentary Roadrunner.

There are even instances of people using AI to create chatbots that converse as if they were a lost loved one, the publication noted.

Alexa wouldn’t even be the first consumer product to use deepfake audio to fill in for a family member who can’t be there in person. The Takara Tomy smart speaker, as pointed out by Gizmodo, uses AI to read kids bedtime stories in a parent’s voice. Parents reportedly upload their voices, so to speak, by reading a script for about 15 minutes. Notably, though, this differs from Amazon’s demo in that the owner of the product decides to provide their vocals, rather than the product using the voice of someone likely unable to give their permission.

Beyond worries about deepfakes being used for scams, rip-offs, and other nefarious activity, there are already some troubling things about how Amazon is framing the feature, which doesn’t even have a release date yet.

Before showing the demo, Prasad talked about Alexa giving users a “companionship relationship.”

“In this companionship role, human attributes of empathy and affect are key for building trust,” the exec said. “These attributes have become even more important in these times of the ongoing pandemic, when so many of us have lost someone we love. While AI can’t eliminate that pain of loss, it can definitely make their memories last.”

Prasad added that the feature “enables lasting personal relationships.”

It’s true that countless people are in serious search of human “empathy and affect” in response to emotional distress brought on by the COVID-19 pandemic. However, Amazon’s AI voice assistant isn’t the place to satisfy those human needs. And Alexa can’t enable “lasting personal relationships” with people who are no longer with us.

It’s not hard to believe that there are good intentions behind this developing feature and that hearing the voice of someone you miss can be a great comfort. We could even see ourselves having fun with a feature like this, theoretically. Getting Alexa to make a friend sound like they said something silly is harmless. And as we’ve discussed above, there are other companies leveraging deepfake tech in ways similar to what Amazon demoed.

But framing a developing Alexa capability as a way to revive a connection to late family members is a giant, unrealistic, problematic leap. Meanwhile, tugging at the heartstrings by bringing in pandemic-related grief and loneliness feels gratuitous. There are some places Amazon doesn’t belong, and grief counseling is one of them.
