The rapid development of artificial intelligence (AI) has brought both benefits and risks.
One worrying trend is the misuse of voice cloning. Scammers can clone a voice in seconds and trick people into believing a friend or family member urgently needs money.
News outlets, including CNN, warn that this type of scam could affect millions of people.
Technology has made it easier for criminals to invade our personal space, so it’s more important than ever to be careful about its use.
What is voice cloning?
The rise of AI has enabled the generation of images, text, and audio, alongside advances in machine learning.
While AI offers many benefits, it also provides scammers with new ways to exploit individuals for money.
You may have heard of “deepfakes”, where AI is used to create fake images, videos, and even audio, often featuring celebrities or politicians.
Voice cloning is a type of deepfake technology that captures a person’s speech patterns, accent, and breathing from short audio samples to create a digital replica of their voice.
Once the voice pattern is captured, an AI voice generator can convert text input into highly realistic speech that resembles the target’s voice.
As the technology has advanced, voice cloning can now be achieved with an audio sample as short as three seconds.
Even a simple phrase like “Hello, is anyone there?” can enable voice cloning fraud, and longer conversations give scammers more vocal detail to capture. It is therefore best to keep calls brief until you can verify the caller’s identity.
Voice cloning has valuable applications in entertainment and healthcare, enabling remote voice work for artists (even posthumously) and assisting people with speech impairments.
However, this raises serious privacy and security concerns and highlights the need for safeguards.
How criminals exploit it
Cybercriminals exploit voice cloning technology to impersonate celebrities, authorities, or ordinary people to commit fraud.
They create urgency, gain the victim’s trust, and demand money via gift cards, wire transfers, or cryptocurrency.
The process begins with collecting audio samples from public sources such as YouTube and TikTok.
The technology then analyzes the audio and generates a new recording.
Once a voice is cloned, it can be used in deceptive communications, often paired with caller ID spoofing to appear trustworthy.
Many voice cloning fraud cases have made headlines.
For example, criminals cloned the voice of a company director to organize a $51 million heist in the United Arab Emirates.
A Mumbai businessman fell victim to a voice cloning scam involving a fake call from the Indian embassy in Dubai.
In Australia, scammers recently used a voice clone of Queensland Premier Steven Miles to try to trick people into investing in Bitcoin.
Children and teenagers are also targeted. In a kidnapping scam in the United States, the voices of teenage girls were cloned to pressure their parents into complying with ransom demands.
How widespread is it?
According to a recent study, 28% of UK adults fell victim to a voice cloning scam last year, while 46% didn’t know this type of scam existed.
This highlights a significant knowledge gap, leaving millions of people at risk of fraud.
In 2022, around 240,000 Australians reported falling victim to voice cloning fraud, with financial losses of $568 million.
How can people and organizations prevent it?
The risks posed by voice cloning demand an interdisciplinary response.
Individuals and organizations can take several steps to prevent the misuse of voice cloning technology.
First, awareness campaigns and education can help protect people and organizations and reduce this type of fraud.
Public-private collaboration can provide clear information and consent options regarding voice cloning.
Second, people and organizations should use biometric security with liveness detection, a new technology that can distinguish and verify a live voice from a fake one. Organizations that rely on voice recognition should also consider implementing multi-factor authentication.
Third, strengthening law enforcement’s investigative capabilities against voice cloning is also important.
Finally, countries need accurate and up-to-date regulations to manage the associated risks.
Australian law enforcement agencies are recognizing the potential benefits of AI.
However, concerns about the “dark side” of this technology have prompted calls for research into the criminal use of artificial intelligence for victim targeting.
There have also been calls for intervention strategies that law enforcement can use to address this problem.
Such efforts should feed into the overall National Plan to Combat Cybercrime, which focuses on preventive, reactive, and restorative strategies.
The National Plan sets out a duty of care for service providers, reflected in the Australian Government’s proposed new laws to protect the public and small businesses.
This legislation aims to impose new obligations on regulated organizations, such as telecommunications companies, banks, and digital platform providers, to prevent, detect, report, and disrupt cyber fraud involving deception, with the goal of protecting customers.
Risk reduction
With cybercrime estimated to cost the Australian economy AUD 42 billion, public awareness and strong protective measures are essential.
Countries such as Australia recognize the growing risk. The effectiveness of countermeasures against voice cloning and other fraud depends on their applicability, cost, feasibility, and regulatory compliance.
All stakeholders, including governments, citizens, and law enforcement, must remain vigilant and raise public awareness to reduce the risk of harm.
Leo S.F. Lin, Senior Lecturer in Policing Studies, Charles Sturt University; Duane Aslett, Senior Lecturer in Policing Studies, Charles Sturt University; Geberew Tulu Mekonnen, Lecturer, School of Policing Studies, Charles Sturt University; and Mladen Zecevic, Police Academy instructor, Charles Sturt University
This article is republished from The Conversation under a Creative Commons license. Read the original article.