CyLab research highlights the human side of scam recovery
Michael Cunningham
Nov 12, 2025
Source: Alexandra Nisenoff
Elijah Bouma-Sims presents his research on scam victims and online communities at ACM CCS 2025 in Taipei, Taiwan
When people fall victim to online scams—whether through phishing emails, fraudulent sellers, or even extortion—they often turn to the internet for help.
A team of CyLab researchers recently investigated how scam victims use online communities as a resource when seeking answers to difficult questions.
At the Association for Computing Machinery Special Interest Group on Security, Audit and Control’s (SIGSAC) 2025 Conference on Computer and Communications Security (ACM CCS), Carnegie Mellon Ph.D. student Elijah Bouma-Sims presented new research exploring how online Reddit communities provide support, information, and reassurance to scam victims.
The study was co-authored with Mandy Lanyon, a Carnegie Mellon research associate, and Lorrie Cranor, CyLab director.
The paper, “‘Is this a scam?’: The Nature and Quality of Reddit Discussion about Scams,” analyzes more than 1,500 posts across four Reddit forums where users gather to discuss scams. Bouma-Sims and his collaborators found that these digital spaces often serve as grassroots support groups, where users help one another identify scams, cope with losses, and in some cases, simply find someone who understands what they’re going through.
“People go to the police, and there’s often not much they can do,” said Bouma-Sims, a Ph.D. candidate in the School of Computer Science. “So the internet becomes a natural place to ask for help, and Reddit’s community structure makes it a particularly useful platform for that.”
The research team used a multi-stage thematic analysis to categorize the types of discussions taking place in scam-related Reddit communities, such as the general forum /r/Scams, which hosts more than 1.3 million weekly visitors, and the smaller but more specific /r/Sextortion board. Through careful coding of thousands of posts and comments, the team identified a range of support mechanisms that users provided to one another.
Posts often began as calls for help: a message from someone unsure if an email or text was legitimate, or a panicked note from someone who had already sent money or personal information. The responses that followed frequently included advice on what steps to take—canceling cards, changing passwords, or reporting to authorities—but also words of empathy and reassurance.
“Security is often framed as procedural: change your password, lock your account,” said Bouma-Sims. “But there’s also an emotional aspect. People want to feel safe.”
That emotional support was particularly evident in communities like /r/Sextortion, where victims often face intense fear and shame after being threatened with the release of private images.
“Those communities tended to be more focused on the reassurance of telling victims they would be okay, and that the threats were unlikely to be real,” said Bouma-Sims.
The research also uncovered challenges inherent to these open online forums. While moderators work to delete malicious comments, scammers sometimes infiltrate threads to prey on victims a second time, posing as people who can recover stolen money. Victim-blaming also emerged as a recurring problem, with some users chastising others for falling for scams.
“There’s a stereotype that only certain kinds of people get scammed,” said Bouma-Sims. “But what we saw on Reddit shows that deception is complex, and it can happen to anyone.”
Despite those risks, Bouma-Sims’s findings suggest that Reddit’s scam-related communities are a largely positive force for digital self-help. Beyond aiding individual users, they collectively form a vast, crowdsourced record of emerging fraud tactics, a dataset that could help researchers, policymakers, and law enforcement officials understand the evolving landscape of cybercrime.
The study has broader implications for both human-centered cybersecurity and the design of future tools. By examining how people naturally seek and share advice online, Bouma-Sims hopes to inspire new ways of integrating trustworthy, automated guidance into the platforms where people already ask for help.
“Reddit discussions show us the real questions people have about scams,” he said. “That’s valuable information for researchers who want to design systems that respond to those needs.”