Frightening is a word that quickly comes to mind when those who have worked closely with the development of artificial intelligence (AI) consider how it can be subverted. It is frightening in its capacity to deceive, its impact on jobs, and its availability to individuals, groups, and countries that would abuse it for financial gain.
“Across the country, fraudsters are using artificial intelligence voice cloning technology to trick people into giving up their personal information or money,” Minnesota’s Democratic US Sen. Amy Klobuchar and Maine’s Republican Sen. Susan Collins wrote in a letter to the Federal Trade Commission and the Federal Communications Commission.
The senators are urging the agencies to take action to raise awareness of the increasing dangers and deceptive capabilities of scammers whose efforts are powered by AI. They also ask whether additional resources are needed to combat the growing threat of AI-enhanced scams.
“Recently in Minnesota, a father received a call from someone that sounded just like his son, crying ‘Mom, dad, can you hear me?’ He and his wife were terrified—they had not heard from their son, a Marine deployed abroad, in weeks. Thankfully, the father was able to decipher that the call was not from his son but was an impersonation of his son’s voice generated by AI technology,” the senators wrote.
“These scams are putting too many parents and grandparents through this emotional toll, and as technology improves, voice clones will only become more convincing,” they wrote.
In another case, a mother testified before Congress that she received a terrifying call that sounded just like her 15-year-old daughter. “Mom, these bad men have me. Help me, help me, help me,” she cried. In this case, the daughter was safe at home. Yet if that call were placed a hundred times, enabled by AI’s instant capability to create new identities, how many of those daughters might have been on vacation with friends, on a field trip, or just out for the evening?
These scams will only get more sophisticated as AI’s rapid development accelerates. Sophisticated AI in the hands of criminals will generate realistic audio and convincing visual scenes of pleas and despair from loved ones.
Hundreds of millions of Americans have willingly posted photos and videos of themselves, their friends, and their families to the internet, all of which can be mined to build libraries of false identities.
We’ve posted enough about ourselves online that these false identities will be able to talk about past events we’ve shared with friends and family, adding to their believability.
Their efforts are aided by the numerous business, local government, and medical facility data breaches in recent years. Through these breaches, the scammers have collected Social Security numbers, driver’s license numbers, employment data, financial history, home addresses, and medical records.
By marrying the stolen breach information with everything we put online through social media, AI-powered scammers can build detailed identities of their current and future victims.
They can build avatars: audio and visual replicas of their targets that look and sound real.
Scammers, with their AI-enhanced skills, aren’t just targeting individuals but financial institutions and businesses as well.
“A new survey of 500 fraud and risk professionals, first reviewed by ABC News, shows widespread concern in the financial industry about the growing scale of these fake online customers and whether security and identity detection technology at banks and loan servicers can keep up,” Quinn Owen of ABC News writes.
AI-powered scammers create false identities that can apply for financing from credit card companies and lenders, take the money, and disappear. Their efforts are assisted by the migration of so many of our financial transactions online. Loans and credit are extended without people sitting across from each other or talking on the phone.
“As banking and payments security becomes increasingly advanced, fraudsters have shifted their focus to impersonation tactics,” Thomson Reuters wrote. “Their goal is to convince people and businesses to send them money, thinking the transfer is to a legitimate person or entity.” Thomson Reuters advises financial institutions on their security efforts.
What makes AI-generated fraud schemes increasingly powerful and frightening is that AI can learn. It will instantly discard what doesn’t work as it refines its pitch to get better results. Imagine the hours or days it would take someone to build a detailed profile of an individual from stolen information and social media profiles. AI can do that in seconds, radically increasing the number of scams that can be run simultaneously across the world.
As AI allows scammers to target us with what appear to be authentic calls, videos, and audio of loved ones in trouble, we need ways to assess whether they are real or fake. One recommendation is to have a family code word, kept secret and never posted to any electronic device from which it could be stolen.
Apple co-founder Steve Wozniak told the BBC that “we can’t stop the technology,” but we can prepare people so they are better educated to spot fraud and malicious attempts to take personal information.
“You don’t have years, months or weeks to study this, because it is here right now,” Haywood Talcove, CEO of the government business of LexisNexis Risk Solutions, told CNBC.
There are several reasons for that, Talcove said: it’s easy, there is “virtually zero” probability of getting caught, and the government never runs out of money.
The government is also overwhelmed by fraud and doesn’t have the staff to track it down. Too often these days, it simply writes off the losses, letting the scammers get away with their crimes and giving them incentives to continue their criminal ways.
Federal and state governments have to do a better job of educating Americans about fraud schemes and prevention. If they do so only online, they will miss tens of millions of America’s most vulnerable citizens: seniors. Advertisements in community newspapers would reach many of them.