Posted March 17, 2023 at 5:02 p.m. EST
Artificial intelligence specialist Marie Haynes says AI tools will soon make it harder to distinguish an AI-generated voice from a real person's. (Dave Charbonneau/CTV News Ottawa)
As AI technology continues to advance, fraudsters are finding new ways to exploit it.
Voice cloning has emerged as a particularly dangerous tool, with fraudsters using it to imitate the voices of people their victims know and trust in order to trick them into handing over money.
“People will soon be able to use tools like ChatGPT, or even Bing and eventually Google, to create voices that sound very similar to their own voice and mimic their rhythm,” said Marie Haynes, an artificial intelligence specialist. “And it will be very, very hard to distinguish from a real live person.”
She warns that voice cloning will become a new tool for fraudsters pretending to be someone else.
Carmi Levy, a technology analyst, explains that scammers can even spoof the phone numbers of family and friends, making it appear as though the call is actually coming from the person they are impersonating.
“Fraudsters are using increasingly sophisticated tools to convince us that when the phone rings, it's actually coming from that family member or that significant other, these people we know,” he says.
Levy advises people who receive suspicious calls to hang up and call the person they believe is on the line directly.
“If you get a call and it sounds a little off, the first thing you should do is say, ‘Okay, thank you very much for letting me know. I'm going to call my grandson, my granddaughter, whoever it is you tell me is in trouble, directly.’ Then pick up the phone and call them,” he advises.
Haynes also warns that voice cloning is just the beginning, with artificial intelligence powerful enough to clone someone's face as well.
“Soon, if I get a FaceTime call, how will I know it's someone I know?” she says. “Maybe someone is pretending to be that person.”
As this technology becomes more widespread, experts urge people to be cautious and to verify calls from friends and family before sending money.
“There are all kinds of tools that can take the written word and create a voice from it,” Haynes says. “We're going to find that scam calls are really, really on the rise.”