How AI fuels 'virtual kidnapping' scams: FBI warning and tips

Danny Weber

15:18 10-12-2025


The FBI warns of AI-driven virtual kidnapping scams that use fake "proof of life" to force quick ransom payments. Learn the key red flags and how to verify that a loved one is safe.

Scammers are increasingly leaning on AI to stage supposed "kidnappings" without any real crime involved. The FBI has issued a warning about a new wave of schemes in which victims receive fabricated "proof of life"—photos or videos that appear to show a relative being held—and are pushed to pay an urgent ransom. The setup preys on a basic reflex: fear for loved ones, which can make even cautious people stumble.

According to the bureau, perpetrators pick a target, gather photos and clips from social networks and other open sources, then refine the material with generative tools. The result can look convincing enough to trigger panic in the first minutes—and that timing is the strategy, nudging someone to pay before they have a chance to calmly verify anything.

The FBI also notes that today’s forgeries are increasingly hard to spot at a glance, and scammers may send such "evidence" via disappearing messages to narrow the window for careful review. Even polished fakes, however, often trip over small inconsistencies: odd proportions, warped facial details, or the absence of distinctive marks that show up in genuine photos. It’s the tiny tells that still give the game away.
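For readers who want a rough technical sanity check, one weak signal is metadata: images that come straight out of generative tools often carry no camera EXIF data. The sketch below, in Python with the Pillow library, simply prints whatever EXIF tags a saved file still has. It is an illustration, not an FBI-endorsed test: the file name is hypothetical, messaging apps routinely strip metadata from genuine photos, and tags can be forged, so a missing or present tag proves nothing on its own.

```python
# Minimal sketch: list any surviving EXIF metadata in a suspicious image.
# Assumes Pillow is installed (pip install Pillow) and the image has been
# saved locally; "suspicious_photo.jpg" is a hypothetical file name.
from PIL import Image
from PIL.ExifTags import TAGS

def describe_metadata(path: str) -> None:
    """Print whatever EXIF tags the file still carries."""
    exif = Image.open(path).getexif()
    if not exif:
        # Common for AI-generated images, but also for genuine photos
        # re-sent through apps that strip metadata, so not proof of a fake.
        print(f"{path}: no EXIF metadata found.")
        return
    for tag_id, value in exif.items():
        name = TAGS.get(tag_id, tag_id)  # map numeric tag IDs to readable names
        print(f"{name}: {value}")

describe_metadata("suspicious_photo.jpg")
```

The visual tells the bureau describes, such as warped facial details and missing distinctive marks, remain the more reliable check; metadata inspection only complements them.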

The core advice is straightforward: before taking any action, try to reach the supposedly abducted person directly using a known number or through close relatives, rather than the contacts the scammer provides. It also helps to agree on a family code word for emergencies in advance—simple and memorable, and often enough to puncture the illusion quickly.