CMG Leak Unveils Controversial ‘Active Listening’ Ad Technology
TEHRAN (Tasnim) – A leaked document from Cox Media Group’s subsidiary, CMG Local Solutions, has raised concerns over the use of AI-driven "active listening" to target smartphone users with ads based on voice data.
For years, smartphone users have worried that their devices may be listening to their conversations to push targeted ads. Recent revelations suggest that these concerns might have merit.
A leaked presentation from CMG Local Solutions, a branch of Cox Media Group (CMG), describes a method termed "active listening." This technique reportedly uses AI to combine voice data with online behavioral data to deliver hyper-targeted ads.
The pitch deck, obtained by 404 Media, explains that "advertisers can pair this voice data with behavioral data to target in-market consumers." The presentation adds that the technology can identify "ready-to-buy" consumers, creating ad lists based on their spoken intentions.
A CMG spokesperson told Newsweek that "CMG businesses have never listened to any conversations nor had access to anything beyond third-party aggregated, anonymized, and fully encrypted data sets used for ad placement." The company clarified that it had sourced existing voice datasets from third-party providers to integrate with other data. CMG also called the presentation "outdated materials for a product that CMG Local Solutions no longer sells," adding, "Although the product never listened to customers, it has been discontinued to avoid misperceptions."
While tech giants like Google, Meta (Facebook’s parent company), and Amazon were listed as CMG clients in the presentation, all three denied any involvement in the "active listening" program.
An Amazon spokesperson told Newsweek, "Amazon Ads has never worked with CMG on this program and has no plans to do so."
A Google spokesperson stated, "All advertisers must comply with all applicable laws and regulations as well as our Google Ads policies, and when we identify ads or advertisers that violate these policies, we will take appropriate action." Google further confirmed that Cox Media Group had been removed from its Partners Program as part of a review process.
Meta also responded, with a spokesperson telling Newsweek, "Meta does not use your phone’s microphone for ads, and we’ve been public about this for years. We are reaching out to CMG to clarify that their program is not based on Meta data." Meta is investigating whether CMG violated its terms and conditions and says it will take appropriate action if necessary.
The third-party data used by CMG often originates from smartphone apps that collect data, including voice recordings, with consent granted through their terms and conditions (T&Cs). Research shows that 91% of users accept T&Cs without reading them, a figure that rises to 97% among those aged 18 to 24.
For users who have already agreed to an app's T&Cs, the first step is to review which permissions that app has been granted.
"For an app to perform active listening, it would need microphone access permissions. Both Android and iOS devices explicitly request this permission when the app is installed or updated," said Luis Corrons, researcher and Norton Security Evangelist. He added that apps might also request background access, which allows them to continue listening even when not in active use.
"iOS now shows an orange or green dot in the status bar when the microphone or camera is in use. Android has similar indicators to alert users when the microphone is being accessed," Corrons explained. He also recommended that users regularly review app permissions to detect unnecessary access.
Addressing the differences between virtual assistants and other apps using "active listening" for ads, Corrons noted, "Assistants like Siri, Alexa, and Google listen for trigger words like 'Hey Siri.' Once activated, they show the user that the microphone is listening. Other apps also require microphone permissions and will trigger a similar icon."
To avoid unwanted background listening, Corrons advised users to limit permissions for voice assistants. "Only allow Siri or Alexa to activate with wake words and disable listening when the device is locked," he said.