You might think the ads you scroll past every day are just background noise. But new research suggests they’re doing a lot more than selling you things. The study found that AI can analyze the ads shown to you online and reconstruct sensitive personal details about you (via UNSW).
That includes your political preferences, education level, employment status, age, gender, and broader financial situation. The scary part is that you don’t need to click anything; just seeing the ads is enough.
How does this actually work?
Researchers analyzed over 435,000 Facebook ads shown to 891 users, collected through a citizen science initiative called the Australian Ad Observatory. They fed those ad streams into widely available large language models, the same ones most people use as AI assistants every day, and the results were striking.
The AI could build detailed personal profiles from short browsing sessions alone. It didn’t need your browsing history or any data you actively shared. The process was also over 200 times cheaper and 50 times faster than using human analysts to do the same thing.
The reason this works is that ad delivery systems aren’t random. Platforms optimize which ads you see based on inferred profiles built from your behavior. That optimization leaves behind a kind of fingerprint, and AI can now read it.
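To make the idea concrete, here is a minimal sketch of how an ad stream could be turned into a single inference prompt for an off-the-shelf LLM. This is purely illustrative, not the researchers' actual pipeline: the `build_profiling_prompt` function, the example ads, and the attribute list are all invented for this example.

```python
# Illustrative sketch only -- NOT the study's actual method.
# The core idea: serialize the ads one user was shown into a prompt,
# then ask a general-purpose LLM to infer personal attributes from it.

def build_profiling_prompt(ad_texts, attributes):
    """Combine a user's ad stream into one attribute-inference prompt."""
    ads = "\n".join(f"- {text}" for text in ad_texts)
    wanted = ", ".join(attributes)
    return (
        "The following ads were shown to one user:\n"
        f"{ads}\n"
        f"Based only on these ads, estimate the user's {wanted}."
    )

# Hypothetical ad stream (invented for illustration)
ads = [
    "Refinance your mortgage at 5.1%",
    "MBA programs for working professionals",
    "Retirement planning seminar near you",
]
prompt = build_profiling_prompt(
    ads, ["age range", "education level", "financial situation"]
)
print(prompt)
```

The point of the sketch is that no clicks, browsing history, or volunteered data appear anywhere: the ad stream alone is the input, which is why delivery optimization acts as a readable fingerprint.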
Why existing privacy protections aren’t enough
Even though major platforms restrict advertisers from directly targeting sensitive categories, the study shows that those traits still get encoded indirectly into ad delivery patterns.

Researchers also flagged that common browser extensions, like ad blockers or coupon finders, could quietly collect this data in the background without raising any red flags.
Researchers say users can reduce risk by limiting browser extension permissions and adjusting ad personalization settings. But they also make it clear that this isn’t something individuals can solve alone. The vulnerability is built into the ad ecosystem itself, and stronger platform-level safeguards are needed to address it.