Starling Bank research
Credibility Rating
Good quality. Reputable source with community review or editorial standards, but less rigorous than peer-reviewed venues.
Rating inherited from publication venue: CNN
A CNN news article reporting on Starling Bank's consumer-facing research into AI voice cloning fraud, relevant to discussions of AI misuse, social engineering, and the real-world harms of accessible generative AI tools.
Metadata
Summary
Starling Bank research highlights the growing threat of AI-powered voice cloning scams, where fraudsters replicate individuals' voices to deceive friends and family into transferring money. The article warns that even small audio samples from social media can be sufficient to create convincing voice clones. It offers practical advice for individuals to establish safe code words with trusted contacts to verify identity.
Key Points
- AI voice cloning technology can replicate a person's voice from as little as 3 seconds of audio found on social media.
- Starling Bank research found a majority of people surveyed had never heard of voice cloning scams, leaving them highly vulnerable.
- Fraudsters use cloned voices to impersonate victims in calls to family members, requesting urgent money transfers.
- The bank recommends establishing a 'safe phrase' or code word with close contacts to authenticate identity in suspicious calls.
- Voice cloning scams represent a rapidly growing category of AI-enabled fraud with significant financial and trust implications.
Cited by 1 page
| Page | Type | Quality |
|---|---|---|
| AI-Era Epistemic Security | Approach | 63.0 |
Cached Content Preview
AI voice-cloning scams could target millions of people, Starling Bank warns | CNN Business
This bank says ‘millions’ of people could be targeted by AI voice-cloning scams
By Anna Cooban, CNN
Published 7:24 AM EDT, Wed September 18, 2024
Starling Bank said fraudsters are capable of using AI to replicate a person’s voice from just three seconds of audio found in, for example, a video posted online.
Adrian Dennis/AFP/Getty Images
London (CNN) —
“Millions” of people could fall victim to scams using artificial intelligence to clone their voices, a UK bank has warned.
Starling Bank, an online-only lender, said fraudsters are capable of using AI to replicate a person’s voice from just three seconds of audio found in, for example, a video the person has posted online. Scammers can then identify the person’s friends and family members and use the AI-cloned voice to stage a phone call to ask for money.
These types of scams have the potential to “catch millions out,” Starling Bank said in a press release Wednesday.
They have already affected hundreds. According to a survey of more than 3,000 adults that the bank conducted with Mortar Research last month, more than a quarter of respondents said they have been targeted by an AI voice-cloning scam in the past 12 months.
The survey also showed that 46% of respondents weren’t aware that such scams existed, and that 8% would send over as much money as requested by a friend or family member, even
... (truncated, 6 KB total)