CyLab’s Sarah Scheffler to discuss new research on age verification technologies at FTC workshop

Michael Cunningham

Jan 26, 2026

Photo of Sarah Scheffler speaking at the 2025 CyLab Partners Conference

Source: David Cochran

Sarah Scheffler, assistant professor in Carnegie Mellon’s Departments of Engineering and Public Policy and Software and Societal Systems, will serve as a panelist at the FTC Age Verification Workshop on January 28.

As governments and technology companies grapple with how to protect children online without compromising privacy and security, the Federal Trade Commission (FTC) will convene a public Age Verification Workshop on January 28 to examine the rapidly evolving landscape of age verification and age estimation tools.

CyLab faculty member Sarah Scheffler, assistant professor in Carnegie Mellon’s Departments of Engineering and Public Policy and Software and Societal Systems, will participate as a panelist in a session focused on the technical design and implications of these systems.

The workshop will bring together researchers, policymakers, consumer advocates, and industry representatives to discuss why age verification matters, how current tools work, and how those tools intersect with regulation, including the Children’s Online Privacy Protection Act (COPPA). Participants will also discuss risks and tradeoffs that some of these tools may introduce. Due to expected inclement weather, the event will be held virtually. No registration is required, and a livestream link will be available on this page.

Scheffler’s panel, titled “From Biometrics to Behavioral Signals: Age Verification Tools,” will begin at approximately 11 a.m. ET and will be moderated by FTC Staff Attorney Elizabeth Averill. The panelists will examine the technical approaches companies use to determine a person’s age and the consequences of those approaches for privacy, security, and usability.

Scheffler’s participation reflects her ongoing research into how people respond to different age verification methods, and how those systems can be designed to minimize unnecessary data collection.

“Right now, a lot of laws and proposals treat age verification as if it were a simple problem, but it’s actually a very complicated technical and social issue,” said Scheffler. “How you design these systems matters enormously for people’s privacy, security, and trust.”

In recent work, Scheffler and her research collaborators conducted a large-scale online experiment to study how users react to different age-verification prompts. Participants were asked to access age-restricted content and were shown one of several types of age checks — ranging from a simple checkbox confirming they were over 18, to uploading a photo ID, to allowing an AI system to estimate their age from a facial scan.

A preliminary report, released by the researchers this week, shows striking differences in user behavior between study conditions. Nearly everyone was willing to click a checkbox, but far fewer people were willing to upload a government ID or submit to more invasive verification methods, often choosing instead to abandon the site entirely.

“That tells us something important,” Scheffler explained. “The more intrusive the method, the more people push back. That has implications not just for privacy, but also for whether these systems actually work as intended.”


We should not turn age checks into identity checks by default.

Sarah Scheffler, assistant professor, Engineering and Public Policy, Software and Societal Systems

Beyond usability, Scheffler is concerned that some current approaches risk turning age verification into full-scale identity verification, collecting far more information than is necessary simply to determine whether someone is over or under a certain age.

“If all you need to know is whether I’m over 18, you shouldn’t need my name, my address, or a copy of my ID,” she said. “Once you start collecting and storing that kind of data, you create new risks, such as identity theft and long-term surveillance, that can be much worse than the original problem you were trying to solve.”

Scheffler has been active in broader technical and policy discussions on this topic, including participating last October in the Internet Architecture Board (IAB) and World Wide Web Consortium (W3C) Workshop on Age-Based Restrictions on Content Access, which explored whether technical standards for age verification should be developed and what privacy-preserving designs might look like. Those conversations highlight the need to carefully balance child protection goals with fundamental privacy and security principles.

Scheffler’s message to policymakers and technologists is that age verification should be narrowly tailored, proportionate, and designed with privacy in mind from the start.

“We should not turn age checks into identity checks by default,” Scheffler said. “We should be asking: what is the least intrusive way to accomplish the goal, and how do we prevent these systems from becoming a new source of harm?”

Read the full preliminary report, User (Non-)Compliance with Age Verification: Preliminary Evidence from a Deceptive Web Experiment.

Co-Authors: Yanzi Lin, Cheng Zhang, Madelyne Xiao, Lorrie Faith Cranor, and Sarah Scheffler; Carnegie Mellon University’s CyLab Security and Privacy Institute