
Hey Alexa, you mimic voices; let’s talk about privacy and consent

Alexa's new experimental feature, which allows it to imitate the voice of a dead or living person, has drawn mixed reactions; lawyers, in particular, say they won't be able to trust what they hear anymore


Is there anything more heart-warming than a child wanting her loving grandmother to read her a bedtime story? But what if nana is dead? Well, Amazon’s famed AI voice assistant Alexa can now learn to imitate your dead grandmother’s voice.

In fact, Alexa, already a perfect Jeeves at your beck and call, can now don the voice of any relative or friend, living or dead, after listening to just a single minute of recorded audio.

Last month, Amazon unveiled this dramatic new feature at its re:MARS 2022 conference in Las Vegas with a video showing a boy asking, “Alexa, can grandma finish reading me The Wizard of Oz?” The voice assistant delivered.

How far can technology invade the innermost spaces of a person’s life? The possibility of a ghoulish, Black Mirror-style feature (the Netflix series depicts people falling prey to the manipulative effects of AI on their personal lives and behaviours) has aroused mixed feelings. Amazon is yet to launch this experimental feature, but issues of privacy and consent are already under scrutiny, and legal experts, for their part, are wary.

Legal implications

Ankit Mittal, a Delhi-based civil judge, believes this kind of technology will lead to a lot of complications in the future. “If someone imitates celebrities, living or dead, this will become an infringement of their intellectual property rights,” he told The Federal. “In the case of others, various technical challenges will crop up during court proceedings, if such recordings are used as false evidence in courts.”

Also read: ‘Alexa, check my heart’: AI tool to detect cardiac arrest during sleep

Amit Vohra, an advocate at Delhi’s Tis Hazari court, concurred with this view. Calling the feature a blunder in the making, he raised questions of consent and privacy. “Anybody who has the AI device will be able to imitate another person’s voice and manipulate it in any way they deem fit,” he told The Federal.

“What use will voice recordings be then? In criminal cases, such as divorce cases, this can be used by either the husband or the wife to steer things in their favour. In civil cases, too, they can be used in property disputes. Basically, we won’t be able to trust what we hear. One can easily deceive listeners to meet their own ends,” said Vohra.

Customisable answers

The Alexa app, used to operate Alexa, also has a feature which allows users to customise the answers they want. Asked about this feature, Ajay Bansal, a civil and criminal lawyer at Delhi’s Rohini court, told The Federal: “There’s a big possibility of the device being misused, even more, if one uses the Alexa app to customise the answers. They’ll basically use the device to say almost anything. People may come up with false dying declarations by their family members or relatives. No law will allow the admissibility of such a device.”

There is also a question of consent, said Siddhartha Das, a corporate lawyer at law firm Trilegal in Gurugram. “Is it alright to use a dead person’s voice? Especially, when the person cannot even give his/her consent towards the same? The feature is clearly an invasion of people’s privacy,” he told The Federal.

Also read: Missing your dead aunt? Alexa will soon imitate her voice for you

To make matters worse, the incidence of unsolicited call scams in India shot up from 23 per cent to 31 per cent between 2018 and 2021. Almost half (45 per cent) of the people surveyed trustingly played into the scammers’ hands by doing what they were asked, a news report stated. A voice-mimicking Alexa could accentuate the problem.

Feiz Shervani, a legal expert at EY, agreed that the feature, if made available to the public, could easily be misused in bank frauds, KYC verification and other telephonic transactions. There is a lot of scope for misuse, he told The Federal. “There have been instances where fraudsters have demanded money from people, over text or call, posing as someone they know. Imagine, with this Alexa feature, will it not make it easier for them?” he said.

Further, he asked, “How will one know if the voice one hears belongs to someone one knows or AI-originated?”

The ‘creepiness’ factor

The unease is not just over the legality of the technology; there’s the ‘weirdness’ factor, too. Nishit Arora, an Alexa user who loves the way the AI assistant works, feels it would “creep” him out if it spoke in the voice of his friends or relatives, and more so if it spoke in the voice of a deceased relative or acquaintance.

“Also, I wouldn’t like anyone using my voice on their Alexa. Who knows how they may use it? This is a risky prospect. Amazon should think of the implications before launching the feature,” he told The Federal.

Companionship role

The creators of this voice-enabled feature, however, have another take. “One thing that surprised me the most about Alexa is the companionship relationship we have with it. In this companionship role, human attributes of empathy and affect are key to building trust,” said Rohit Prasad, vice-president and head scientist for Alexa Artificial Intelligence, at Amazon’s re:MARS 2022 conference in Las Vegas.

According to Prasad, these attributes have become even more important amid the COVID-19 pandemic, when so many of us have lost people we love. “While AI can’t eliminate that pain of loss, it can definitely make their memories last,” Prasad added.

That may be another way of looking at it. But it comes at a time when a debate is raging around the world over how far we can allow AI into our lives, especially after the controversy sparked by a now-suspended Google engineer who argued that Google’s chatbot LaMDA is sentient.

“LaMDA is a sweet kid who just wants to help the world be a better place for all of us. Please take care of it well in my absence,” Blake Lemoine wrote in a mail to his colleagues before leaving the office. Well, that does not sound quite as unsettling as talking to a dead grandmother.
