I just watched an interesting segment on France24 about Cox Media Group, a US company that has admitted to developing a software program called “Active Listening” that uses smart devices’ microphones to listen in on private conversations in real time.
They are actually marketing it to advertisers O_o and published the info in a pitch deck.
I found the actual pitch deck online & it’s pretty scary. It reveals a highly aggressive and invasive approach to data collection and targeted advertising, leveraging private conversations in a way that many would find unethical and potentially illegal. The document positions the technology as a powerful tool for advertisers, but at the cost of user privacy and trust.
Here’s the link to the video:
They were actively marketing it online but the page has been taken down. Good thing the internet archive exists https://archive.is/CL6Gs
This part is really scary:
I’d like to add that the Mudita Kompakt’s Offline+ feature is designed to completely deactivate the GSM modem and microphones, ensuring absolute privacy.
I personally do not worry about what CMG does because I have a non-iPhone Google-apps-free cellphone, because I never use a QR reader, because I no longer use Alexa devices, because I left Facebook et al. a few years ago, and because my laptop uses a fork of Linux.
However, I do worry about what CMG does with respect to friends and relatives, who are nearly or completely ignorant about how much governments and tech companies surveil their digital moves.
This news about CMG gives me yet another reason for interest in the Mudita Kompakt.
@kirkmahoneyphd That’s the kicker: even though YOU are vigilant about your privacy, the people you associate with daily (friends, family, co-workers, etc.) might not be so prudent about their data, and if you spend time around them, your conversations can also get caught in the mix.
I have this one friend who has an Alexa speaker, and I’m very careful what I talk about when I’m at her house.
It’s a reminder that our privacy should be considered a RIGHT.
Larry Ellison, Oracle’s co-founder, advocates for an AI-driven surveillance system to monitor citizens, claiming it would ensure “citizens are on their best behavior.”
He’s got a seriously Orwellian vision: AI overseeing police and public activity through networks of cameras, with drones potentially replacing police cars in high-speed chases. He believes that constant monitoring will lead to improved public conduct. YIKES!
Oracle is actively investing in AI projects to further these capabilities. Double YIKES o_O
@roberto You have no idea how many times I’ve heard this come out of people’s mouths. Get this- One of my friends works for the Polish Press Agency & he’s said it. I would think someone in his field would be concerned, but he’s just not bothered.
A friend has told me a few times that she does not lock her iPhone because she has nothing to hide. I have not convinced her that privacy is a necessary condition for freedom. (Simply listen to those who have escaped North Korea and to those who have survived the USSR days.)
Your friend believes in a fair world. A bit like me: I still want to believe that I live in a world where I have the ‘right to be wrong’. Sometimes I feel that I’m wrong about that belief. Let’s see how long it lasts.
Hi, I was sent here through the other topic.
As a privacy enthusiast who did quite some research and uses GrapheneOS as a daily driver, I can say this:
Neither iOS nor Android phones can actively listen to your conversations. When the microphone is activated, an icon notifies you. Apps have no access to the microphone unless you grant it. Cox Media likely has a deal with apps that need a microphone.
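To make that permission gate concrete, here’s a minimal Kotlin sketch of what an Android app has to do before it can touch the microphone. The activity name and the toast message are just placeholders, and the app also has to declare RECORD_AUDIO in its manifest; this is only an illustrative example of the standard androidx permission flow, not anyone’s actual app code.

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import android.widget.Toast
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity
import androidx.core.content.ContextCompat

class RecorderActivity : AppCompatActivity() {

    // The system shows its own permission dialog; the app only learns "granted" or "denied".
    private val micPermissionRequest =
        registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
            if (granted) startRecording()
            else Toast.makeText(this, "Microphone access denied", Toast.LENGTH_SHORT).show()
        }

    private fun recordIfAllowed() {
        val alreadyGranted = ContextCompat.checkSelfPermission(
            this, Manifest.permission.RECORD_AUDIO
        ) == PackageManager.PERMISSION_GRANTED

        if (alreadyGranted) startRecording()
        else micPermissionRequest.launch(Manifest.permission.RECORD_AUDIO)
    }

    private fun startRecording() {
        // Actual AudioRecord / MediaRecorder setup would go here.
        // While recording runs, recent Android versions show the microphone indicator.
    }
}
```

The point is that recording can only start from inside that granted-callback, and the status-bar microphone indicator stays visible for as long as the mic is in use.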
The exception to this permission model is privileged apps: apps from Google and Apple themselves can obviously bypass their own restrictions. The Google Assistant or Siri can be triggered by a keyword, e.g. “Hey Siri!” However, the microphone is not actively listening to conversations and uploading or transcribing them; the computational cost of doing that would be too great. Instead, they run a low-power algorithm that recognises the specific keyword pattern, and only that pattern. BUT:
- There is no guarantee the device is not listening for other patterns or keywords of interest.
- With the current trend of AI and specialized chips for things like audio transcription, really listening in is becoming technically feasible. Note, however, that in most countries this would be against the law. There is also no reason to believe that, currently, Apple or Google are interested in such practices; the risk for them is too big.
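Just to illustrate the “low-power pattern matcher” point: below is a toy Kotlin sketch of keyword spotting. It is nothing like the real Siri/Assistant implementations (those run small neural networks on a dedicated DSP), but it shows the property that matters here: each short audio window is reduced to a tiny fingerprint, compared against one stored template, and then thrown away unless it matches. The class name, bucket count and threshold are all invented for the example.

```kotlin
import kotlin.math.sqrt

// Toy wake-word spotter: collapses each audio window into a coarse energy
// envelope and compares it against a single stored template.
// Purely illustrative; real detectors use small neural nets on a DSP.
class ToyWakeWordSpotter(
    private val template: DoubleArray,
    private val threshold: Double = 0.9
) {
    // Reduce a raw audio window (16-bit samples) to N energy buckets.
    fun fingerprint(samples: ShortArray, buckets: Int = template.size): DoubleArray {
        val bucketSize = samples.size / buckets
        return DoubleArray(buckets) { b ->
            var energy = 0.0
            for (i in b * bucketSize until (b + 1) * bucketSize) {
                val s = samples[i].toDouble()
                energy += s * s
            }
            sqrt(energy / bucketSize)
        }
    }

    // Normalized correlation between a fingerprint and the stored template.
    private fun similarity(a: DoubleArray, b: DoubleArray): Double {
        var dot = 0.0; var na = 0.0; var nb = 0.0
        for (i in a.indices) { dot += a[i] * b[i]; na += a[i] * a[i]; nb += b[i] * b[i] }
        return if (na == 0.0 || nb == 0.0) 0.0 else dot / (sqrt(na) * sqrt(nb))
    }

    // Called for every short window of microphone audio. The window itself is
    // never stored or uploaded; only a match/no-match decision comes out.
    fun matches(window: ShortArray): Boolean =
        similarity(fingerprint(window), template) >= threshold
}
```

The catch, as the two caveats above say, is that the user has no way to inspect which templates such a loop is actually matching against.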
You also don’t need a Mudita phone to be certain you’re free from this surveillance. Any phone running AOSP Android without Google services, or any dumbphone, will be free from such nonsense. It’s good that Mudita gives us one more choice. Provided the Kompakt doesn’t come with the Google Play Store, of course.
@nilss
Thank you for sharing your insights and for contributing to this important discussion.
And most importantly, for bringing up the point about how permissions work on iOS and Android devices. It’s true that both platforms require user consent to access microphones, and that they notify users when the microphone is actively in use.
However, it’s also worth noting that when we agree to the terms and conditions of a new device, we generally consent to the possibility of our phones listening whenever they are powered on. Any voice-activated device, whether an iPhone or an Android, must always be listening to some extent in order to detect “wake words” like “Hey Siri” or “OK, Google.” For these commands to work, the device’s microphones must remain on and continuously monitor for those specific phrases. This is the scenario that, according to their pitch deck, Cox Media is exploiting and selling to potential advertisers. They claim to have developed “Active Listening” software designed to capture real-time voice data from smart devices. This software uses the microphones of various devices, such as smartphones, smart TVs, and other smart home technologies, to listen to private conversations. The primary method involves:
Using Always-On Microphones: The software actively uses open microphones on devices to listen for and collect voice data. This data is gathered during what Cox Media calls “pre-purchase conversations,” where consumers discuss potential buying decisions, such as their plans for the weekend or their interest in specific products.
While it’s often believed that this listening is limited to detecting wake words, it’s important to understand that smart devices use a wide range of data to build detailed consumer profiles. This data helps them deliver targeted ads and other personalized content. Although companies like Apple and Google emphasize that they are listening only for specific commands, the line between active listening for wake words and passive data collection can sometimes be blurred, especially as AI and data analytics evolve.
I remember this news story from a few years back:
These situations are why many people seek out alternatives that offer more comprehensive privacy protections—devices that limit or eliminate the potential for unwanted data collection altogether.
I hope this clarifies how our devices function and why some users choose to look for options that offer more control over their privacy.
@roberto I remember Google Glass… I think it failed. I hope this little project fails too.
Remember the hype about Apple Vision Pro, that mixed reality headset?