Use of AI to Clone Voices for Creative Purposes

Context

∙ Recently, music composer A.R. Rahman used Artificial Intelligence (AI) software to recreate the voices of the late singers Bamba Bakya and Shahul Hameed.

About 

∙ A report by Market US has revealed that the global market for voice cloning applications stood at $1.2 billion in 2022 and is estimated to touch almost $5 billion by 2032, growing at a CAGR of about 15.4%.

Voice cloning

∙ Voice cloning technology employs sophisticated AI algorithms to replicate the intricacies of human speech patterns.

∙ The process hinges on training neural networks, a fundamental aspect of artificial intelligence, on extensive datasets of recorded speech (a rough code sketch of this workflow follows this list).

∙ A host of such applications is available online, with popular ones including Murf, Resemble and Speechify.

∙ Recently, former Pakistani Prime Minister Imran Khan's political party used an AI-generated speech in the voice of the now-imprisoned leader in an attempt to rally votes.
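
To make this concrete: in a typical cloning workflow, a neural text-to-speech model pre-trained on large speech corpora is conditioned on a short reference recording of the target speaker and then synthesises arbitrary new text in that voice. The minimal Python sketch below illustrates the idea using the open-source Coqui TTS library and its XTTS v2 model; these are assumptions chosen purely for illustration and are not tools named in this article, though commercial services such as Murf and Resemble expose comparable text-plus-reference-audio interfaces.

    # Illustrative sketch only: assumes the open-source Coqui TTS package
    # (pip install TTS) and its multilingual XTTS v2 voice-cloning model.
    from TTS.api import TTS

    # Load a pre-trained model capable of zero-shot voice cloning.
    tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

    # A few seconds of clean, consented reference audio of the target speaker
    # is enough for the model to condition its output on that voice.
    tts.tts_to_file(
        text="This sentence was never actually spoken by the reference speaker.",
        speaker_wav="reference_speaker.wav",  # hypothetical path to a consented voice sample
        language="en",
        file_path="cloned_output.wav",
    )

The same conditioning step is what makes misuse easy: any short public clip of a person's voice can serve as the reference, which is why the consent, watermarking and regulatory measures discussed later in this note matter.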

Applications

∙ Preserving legacy: Can keep the voices of loved ones alive for future generations.

∙ Apple introduced a voice cloning feature in iOS 17 intended to help people who may be in danger of losing their voice to a degenerative disease.

∙ Personalized experiences: Custom virtual assistants, interactive storytelling, and more immersive digital interactions.

∙ Translation: Prominent tech companies also have a hand in the AI voice game. Recently, Meta launched SeamlessM4T, which can understand nearly 100 languages from speech or text and generate translations in real time.

∙ Accessibility: Can offer a voice to those who have lost it or will lose it due to illness or disability.

∙ Song creation: YouTube took a similar route and announced Dream Track, which allows creators to make song clips featuring AI vocals, with permission from pop stars like Demi Lovato, Sia and John Legend.

∙ Creative applications: Enhancing storytelling, audio games, and immersive experiences.

Issues/Concerns

∙ Scams: In April 2023, a family in Arizona, U.S., was threatened and told to pay a ransom over a fake kidnapping staged with an AI-cloned voice.

∙ Under-reporting: Many such cases go unreported, and only a few come to light.

∙ Fake news: Easy access to AI voice clones has also spawned disinformation.

∙ A cloned voice of Harry Potter actress Emma Watson allegedly read out a portion of Mein Kampf.

∙ Privacy and consent: Concerns about unauthorized recording and use of voices without consent need to be addressed.

∙ Ethical considerations: Potential for exploitation, manipulation, and emotional harm through impersonation and misuse.

∙ Social implications: Impact on identity, trust, and communication dynamics in the digital age.

∙ Hate speech: Recently, users started flocking to free AI voice cloning tools to generate celebrity hate speech.

∙ A cloned voice of conservative political pundit Ben Shapiro allegedly made racist comments against Democratic politician Alexandria Ocasio-Cortez.

India: a major target for AI voice clone scams

∙ A report titled 'The Artificial Imposter', published in May last year, revealed that 47% of surveyed Indians have either been a victim of an AI-generated voice scam or know someone who has.

∙ This is almost twice the global average of 25%; in fact, India topped the list with the maximum number of victims of AI voice scams.

∙ In December, a Lucknow resident fell prey to a cyberattack that used AI to impersonate the voice of the victim's relative, requesting the person to transfer a substantial amount through UPI.

∙ Indians have been found to be particularly vulnerable to scams of this nature. According to McAfee, 66% of Indian participants admitted that they would respond to a voice or phone call that appeared to be from a friend or family member in urgent need of money.

∙ The report also shared that 86% of Indians were prone to sharing their voice data online or via voice notes at least once a week, which has made these tools potent.

Measures

∙ Regulatory frameworks: Robust legal and ethical guidelines are crucial to prevent misuse and protect privacy.

∙ The U.S. Federal Trade Commission is considering the adoption of a recently proposed Impersonation Rule that would help deter deceptive voice cloning.

∙ Technological safeguards: Watermarking and other authentication mechanisms can help identify and verify cloned voices.

∙ Public awareness and education: Educating the public about voice cloning technology and its potential risks is vital.

∙ The U.S. Federal Trade Commission has also launched a Voice Cloning Challenge, which asked the public to send in ideas to detect, evaluate and monitor cloned voices.

∙ Responsible development and application: Promoting ethical and transparent use of voice cloning for positive societal impact.

Way Ahead

∙ The future of voice cloning hinges on responsible development and utilization, balancing its potential benefits with ethical considerations and safeguards to avoid its misuse.
