The $30 Question: Can You Really Anonymize and Monetize Your Conversations?

October 9, 2025


Privacy Plus+


Privacy, Technology and Perspective

 

This week, let's consider the Neon Mobile app, which rocketed to popularity in September with a provocative promise: pay users "hundreds or even thousands of dollars per year" to record their conversations and sell them to AI companies for model training. The app's spectacular rise and crash together crystallize a fundamental problem: the commodification of data often comes at the expense of consumers.

Background

In September 2025, TechCrunch broke news about the Neon app, which had risen to #2 on Apple's App Store with a blunt pitch: "Companies collect and sell your data every day. We think you deserve a cut." Neon offered to pay its users up to $30 per day to record their phone calls, anonymize that data, and then sell it to AI firms. After TechCrunch discovered a security flaw that allowed any user to access other users' phone numbers, recordings, and transcripts, Neon took itself offline.

 Beyond the security breach, TechCrunch's investigation revealed several troubling privacy issues:

Wiretapping Concerns: Thirteen U.S. states require all-party consent for call recording, yet Neon appeared designed to circumvent these laws. The app claimed to record only the user's side of conversations, but experts suggested it likely recorded entire calls and simply removed non-users' voices from transcripts—a technical workaround that may not satisfy legal requirements.

Third-Party Privacy Violations: Non-users had no opportunity to consent to their calls being recorded. When a Neon user called someone who didn't use the app, that person's voice—a unique biometric identifier—could be captured, monetized, and potentially retained indefinitely without their knowledge or agreement.

Exploitative Terms: Neon's terms of service granted the company "worldwide, exclusive, irrevocable, transferable, royalty-free" rights to users' recordings across all media formats. Users received one-time payments while permanently surrendering control over their personal data.

The TechCrunch investigation is worth reading in full, and you can do so by clicking on the following link:

https://techcrunch.com/2025/09/24/neon-the-no-2-social-app-on-the-apple-app-store-pays-users-to-record-their-phone-calls-and-sells-data-to-ai-firms/

Our Thoughts

Rather than revisit the many privacy issues already raised by TechCrunch, we will highlight here the issues associated with Neon's promises of anonymization and its efforts to compensate its users for their data.

The Anonymization Fiction: Neon claimed to filter out "names, numbers, and other personal details," but voice recordings contain speech patterns, conversational content, relationship dynamics, and rich contextual information that resist meaningful anonymization. Such promises of so-called "anonymization" echo throughout the data economy. We have written before about how these promises are illusory because re-identification of such data is often straightforward, and you can read one such post, entitled "When Anonymous Isn't," by clicking on the following link:

https://www.hoschmorris.com/privacy-plus-news/when-anonymous-isnt

The reality is that computer scientists have repeatedly demonstrated that effective anonymization of rich datasets is all but impossible. Yet regulations continue to exempt "anonymized" data from privacy protections, creating a massive loophole through which surveillance operates unchecked.
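To make the point concrete, consider a minimal, hypothetical sketch of a classic linkage attack. Every name and record below is invented for illustration, but the technique mirrors the re-identification research discussed in "When Anonymous Isn't": quasi-identifiers that survive redaction (here, call time, duration, and area code) can be joined against outside data to restore identity.

```python
# Hypothetical linkage attack: re-identifying "anonymized" call records.
# All data below is invented for illustration.

# "Anonymized" records: names and numbers stripped, but quasi-identifiers
# (timestamp, duration, caller area code) survive redaction.
anonymized_calls = [
    {"id": "rec-001", "start": "2025-09-20T14:03", "duration_s": 412, "area": "512"},
    {"id": "rec-002", "start": "2025-09-20T14:03", "duration_s": 95,  "area": "214"},
    {"id": "rec-003", "start": "2025-09-21T09:47", "duration_s": 412, "area": "512"},
]

# Auxiliary data an attacker might hold (e.g., a target's own phone bill,
# a leaked carrier log, or a social-media post mentioning a call).
auxiliary = [
    {"name": "Alice Example", "start": "2025-09-20T14:03", "duration_s": 412, "area": "512"},
]

def link(anon_rows, aux_rows):
    """Join on quasi-identifiers; any unique match is a re-identification."""
    hits = []
    for aux in aux_rows:
        matches = [r for r in anon_rows
                   if (r["start"], r["duration_s"], r["area"]) ==
                      (aux["start"], aux["duration_s"], aux["area"])]
        if len(matches) == 1:  # a unique combination restores identity
            hits.append((aux["name"], matches[0]["id"]))
    return hits

print(link(anonymized_calls, auxiliary))
# [('Alice Example', 'rec-001')] -- the "anonymous" recording now has a name.
```

With voice recordings, the joinable surface is far richer than this toy example: the speaker's voiceprint itself, the relationships and places mentioned in a call, and conversational patterns all function as quasi-identifiers, which is why stripping "names, numbers, and other personal details" falls so far short of anonymization.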

Data Dignity—And Why Neon Betrayed It: We have also written before about the concept of "data dignity," coined by technologist Jaron Lanier and economist E. Glen Weyl, which proposes that people should be compensated for the data they create and should maintain meaningful control over its use. You can read our previous post, "Data Dignity and Inverse Privacy," by clicking on the following link:

https://www.hoschmorris.com/privacy-plus-news/data-dignity-and-inverse-privacy

While Neon paid its users (apparently to their resounding delight and enthusiasm), it otherwise failed to align with data dignity's core requirements:

• No Real Control: Despite payment, users surrendered exclusive, irrevocable rights to their data. Once data entered Neon's ecosystem, users' agency evaporated.

• No Collective Representation: Users accepted Neon's terms without negotiation and with vastly unequal bargaining power.

• Exploitative Compensation: Users received pennies (a maximum of $30 daily) while the company built million-dollar datasets for resale.

• False Security: Catastrophic security failures exposed users' most intimate conversations to anyone.

• The Anonymization Lie: As "When Anonymous Isn't" demonstrates, anonymization promises are often meaningless, exposing users to re-identification risks they were never warned about.

What We Need: As AI systems grow hungrier for training data, as data brokers proliferate, and as authoritarian governments refine surveillance capabilities, this crisis of data commodification intensifies daily. Ultimately, we must choose:

Do we want a future where every human interaction becomes a commodity circulating through markets we don't control, to buyers we'll never know, or one that establishes meaningful protections reflecting technological reality rather than comforting fictions about anonymization?

We urge not just data dignity, but justice.

-- 

Hosch & Morris, PLLC is a boutique law firm dedicated to data privacy and protection, cybersecurity, the Internet and technology. Open the Future℠.

 
