The FBI is warning citizens about deep fake scams, urging them to pay close attention to who they're communicating with online – particularly over video calls. As deep fake technology advances, criminals may be able to impersonate people victims know and trust, making scams harder to detect and easier to fall for.
In a recent case, a woman was scammed out of $300,000 on a dating app by criminals posing as Sean Buck, a U.S. Navy vice admiral and superintendent of the Naval Academy. The fraudsters used manipulated videos of him and even cloned his voice to trick the victim into sending money and sharing personal information.
For businesses, this means a greater need to ensure employees are trained in detecting phishing attacks that use deep fakes. With the right technology, threat actors could impersonate executives and gain access to internal systems that contain customer PII or proprietary data.
While well-crafted deep fakes require high-end computing resources and time, criminals are already leveraging this technology to increase the effectiveness of their attacks. For example, they're using cloned voices on Zoom calls, a tactic that has surfaced in banking and corporate fraud cases.
Criminals also use deep fakes to manipulate images of children without their consent. In March, NBC News reported that criminals are demanding that victims, some of them children, send cash or gift cards or share authentic sexual imagery on the open internet, and are threatening to publish the manipulated images on social media or send them to friends and family.