Social Security scams have surged in recent years, reaching alarming heights in 2022: Americans aged 60 and older reported $3.1 billion in scam losses that year, an 84% increase from the previous year, according to FBI data.
The convergence of the COVID-19 pandemic and increasingly sophisticated technology has created fertile ground for fraudsters. Now a new player has entered the scene: artificial intelligence (AI), which elevates the risk of scams targeting the elderly.
The Surge of Fraud in Government Benefit Programs
Amanda D’Amico, Vice President of Risk and Fraud Strategy & Operations at Thomson Reuters, highlighted the alarming trend. “In the last few years, we have seen an explosion of the use of advanced technology by criminal actors to commit fraud and financial crimes targeting government benefit programs, including Social Security,” she told Forbes. The advent of AI, particularly exemplified by technologies like ChatGPT, has provided scammers with powerful tools to enhance their fraudulent activities.
Seniors, often less technologically savvy, are prime targets for these scams, given their potential accumulated savings and retirement funds.
David Derigiotis, Chief Insurance Officer for Embroker, emphasized the need for seniors to verify the legitimacy of communications from federal programs, such as Social Security, to avoid falling victim to scams.
In 2022, fraud complaints involving seniors totaled 88,262, with an average loss per victim amounting to $35,101. A concerning 5,456 victims experienced losses exceeding $100,000.
AI has introduced a new dimension to fraud, enabling criminals to manipulate personal information and engage in identity theft. D’Amico outlined the following ways in which AI contributes to Social Security scams.
AI-Powered Scams: Identity Theft and Fraud
Creation of Phony Accounts: AI assists criminals in searching personal records to create fraudulent Social Security accounts, allowing them to impersonate individuals and manipulate personal data.
AI-Generated Video Scams: Fraudsters can use AI-generated video to impersonate public figures, such as the president or elected officials, creating lifelike likenesses. Victims are then deceived into providing sensitive personal information.
Despite its capabilities, AI is not flawless. D’Amico pointed out that it still struggles to replicate certain features, such as teeth and hands. Vigilance is crucial: images in which proportions look off can be a telltale sign of fraudulent activity.
Identity Change on Social Media: Criminals use AI to change identities on social media platforms, facilitating the creation of fake profiles to manipulate victims.
Impersonation of SSA Personnel: Scammers may impersonate Social Security Administration personnel, contacting victims through various channels, including phone calls, emails, texts, or social media messages. AI-generated photos of real SSA employees add an extra layer of deception.
The Social Security Administration (SSA) identifies four basic signs of a scam:
Trust Exploitation: Scammers pretend to be from a known agency or organization to gain trust.
Fabricated Issues or Prizes: Scammers claim there is a problem or a prize to manipulate victims.
Urgency and Pressure: Scammers create urgency, pressuring victims to act immediately.
Specific Payment Demands: Scammers dictate a specific payment method, often indicating fraudulent activity.
As scams evolve with the integration of AI, staying informed about these tactics and maintaining healthy skepticism are crucial for individuals, particularly seniors, to protect themselves from financial exploitation and identity theft. Seeking a second opinion from friends or loved ones before acting on a request can provide an additional layer of security against increasingly sophisticated scams.