Real AI fraud warning: why our voices, faces, and money could be next in line. In this stock image, the OpenAI ChatGPT logo appears on a smartphone with CEO Sam Altman in the background. Credit: El Editorial, Shutterstock
“Just because we don’t release technology doesn’t mean it doesn’t exist… a bad actor is going to release it. This is coming very quickly.”
You might want to think twice before trusting a FaceTime call from your mom or an emergency voicemail from your boss. According to OpenAI CEO Sam Altman, the era of deepfake scams isn’t on its way. It’s already here, and it sounds exactly like you. And this, Altman warns, is just the beginning of a global crisis.
At a recent event in Washington, DC, Altman issued an ominous warning: generative AI will soon allow bad actors to convincingly mimic people’s voices, faces, and even personalities, and use them to scam you out of your money, your data, or both. And anyone will be able to do it.
“Now, it’s an audio call. Soon it’s going to be a video or FaceTime that’s indistinguishable from reality,” Altman told US Federal Reserve Vice Chair Michelle Bowman.
So where are we actually heading, and do you need to worry?
Voiceprints and Video Fakes: The Scammers’ New Weapons
Altman’s concerns focus on the fact that some banks and businesses still use voiceprint authentication, meaning your voice alone can be enough to move money and access your account. But with today’s AI tools, it takes only seconds of audio to clone someone’s voice. There are currently dozens of apps, some of them free, that can do it.
Scammers are already calling people simply to record their voices when they answer the phone. With a single sample, they can create a realistic version of your voice saying whatever they want.
Combine that with increasingly realistic AI-generated video and you have a perfect storm. Scammers can now create entirely fake FaceTime or video calls that look like your spouse, your boss, or your child. You’re not just receiving suspicious emails anymore; you’re getting fake people.
Real-world fraud: When your “son” isn’t really your son
These warnings are not theoretical. Here are some examples of how AI scams are already unfolding.
As reported by CBC Canada, a con artist cloned the voice of a woman’s son and called her, claiming he “needed to talk”. “It was his voice,” she said. The Manitoba mother heard her son on the phone, but it wasn’t him.
Leann Friesen, a mother of three from the small community of Miami, Manitoba, received a strange call from a private number a few weeks ago. What she heard on the other end stopped her in her tracks: it was her son’s voice, and he sounded like he was in pain.
“He said ‘Hello mom,'” I recalls Friesen. “He said, ‘Mama, can I tell you something?” And I said, ‘No judgment.’
That’s when the alarm bells started to ring.
“I’m a bit confused at that point. Why are you asking me this?” she said.
Something about the conversation felt wrong. Friesen decided to cut it short, telling the caller she would call him back on her son’s cell phone, and hung up.
She immediately dialed his number.
She said she woke him up. He had been asleep the whole time because he had been working shifts. “He said, ‘Mom, I didn’t call you.’”
“It was absolutely my son’s voice on the other end of the line.”
The Hong Kong Deepfake Video and the FBI Case
In Hong Kong, a finance worker was duped into transferring US$25 million by a deepfake video call, after believing they were in a meeting with their company’s CFO.
According to the FBI, impersonators in the US have posed as government officials to gain access to confidential information. In one case, someone pretended to be Marco Rubio in a call to a foreign diplomat.
So, what exactly is OpenAI doing?
Altman claims OpenAI hasn’t built any spoofing tools. Technically, that’s true, but some of its projects could be used that way.
OpenAI’s video generator, Sora, creates strikingly realistic videos from text prompts. It’s a step forward for creative AI, but it could be a step forward for fraud, too. Imagine handing it a script and asking for a “video where Joe Bloggs calls the bank and requests a password reset.”
Eye scanner controversy
Altman also backs Worldcoin’s Orb, a controversial biometric device that scans your eyeball to confirm your identity. It is being sold as a new kind of proof of personhood, but critics argue it is a dystopian answer to a digital problem.
OpenAI says it won’t tolerate misuse, but Altman admits that others may not play so nicely.
“Just because we don’t release technology doesn’t mean it doesn’t exist… a bad actor is going to release it. This is coming very quickly.”
The technology is outpacing the law
Governments are still rushing to catch up. The FBI and Europol have issued warnings, but global laws on AI impersonation are patchy at best. The UK’s online safety laws still don’t cover all forms of synthetic media, and regulators are still debating how to define AI-generated fraud.
Meanwhile, the scammers are exploiting the delay.
What can you do to protect yourself?
Altman may be worried, but there are ways to protect yourself and your accounts. Here is what you should consider doing today:
- Stop using voice authentication: if your bank uses it, ask for an alternative method. It’s no longer safe.
- Use strong, unique passwords and two-factor authentication (2FA): prefer app-based 2FA over SMS whenever possible. It remains one of your best defences.
- Verify through another channel: if you receive a suspicious call or video message, even one that looks authentic, contact the person directly on a different platform or phone number.
- Educate your family: older relatives can be particularly vulnerable. Help them understand what AI scams look and sound like.
- Be careful with your voice online: creating a convincing fake takes just a few seconds of clear audio. Don’t post long videos or voicemails you don’t need to.
Final Thoughts: We’re not in Kansas anymore
AI tools that can mimic your voice and face with chilling accuracy are no longer science fiction. They are out in the wild. Sam Altman’s warning may sound self-serving, but he’s not wrong. This will get worse before it gets better.
And while the scammers move fast, our institutions, from banks to regulators, are moving slowly and painfully.
Until the system catches up, the best security you have is your own scepticism.
The next time the “boss” sends a video message at 4am asking for a wire transfer? You might want to sleep on it.