A media firm that has worked with the likes of Google and Meta has admitted that it can target adverts based on what you say out loud near device microphones.

Media conglomerate Cox Media Group (CMG) has been pitching tech companies on a new targeted advertising tool that uses audio recordings collected from smart home devices, according to a 404 Media investigation. The company lists Facebook, Google, Amazon, and Bing as partners.

In a pitch deck presented to Google, Facebook, and others in November 2023, CMG referred to the technology used for monitoring and active listening as “Voice Data.” The firm also mentioned using artificial intelligence to collect data about consumers’ online behavior.

  • IsThisAnAI@lemmy.world · 4 months ago

    Ignorant lemmings who don’t understand there are billions and billions of dollars going into sec-ops research, whose researchers would give both their nuts, ovaries, or whatever to make this discovery, and who can’t be bothered to understand how the hardware chip powering the wake word works.

    • N0body@lemmy.dbzer0.com · 4 months ago

      If a device can start listening when you say, “Hey Siri,” it can also start listening when you say other words.

      • dan@upvote.au · 4 months ago

        So far, nobody has proven that any major phone apps are constantly listening and sending that data somewhere. That would be huge news if it ever happened.

        As for voice assistants, they’re tuned to listen specifically for wake words like “hey Siri”; a dedicated low-power chip matches only that phrase, and the main audio pipeline stays off until it fires.
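        To make the distinction concrete, here is a toy sketch of the gating idea: an always-on detector compares each audio frame against the wake phrase only, and full capture starts solely after a match. This is an illustration of the concept, not any vendor’s actual firmware or API; all names here are made up.

        ```python
        # Toy model of a wake-word gate. Frames stand in for short chunks
        # of audio; real detectors match acoustic features, not text.
        WAKE_WORD = "hey siri"

        class WakeWordGate:
            def __init__(self, wake_word: str = WAKE_WORD):
                self.wake_word = wake_word
                self.listening = False  # full capture is off by default
                self.captured = []      # frames recorded after the wake word

            def feed(self, frame: str) -> None:
                """Process one incoming audio frame."""
                if not self.listening:
                    # Low-power path: the frame is compared against the wake
                    # word and then discarded; nothing is stored or sent.
                    if self.wake_word in frame.lower():
                        self.listening = True
                else:
                    # Post-wake path: frames are now buffered for processing.
                    self.captured.append(frame)

        gate = WakeWordGate()
        gate.feed("random chatter in the room")      # discarded
        gate.feed("hey siri what's the weather")     # trips the gate
        gate.feed("in Boston tomorrow")              # captured
        print(gate.captured)
        ```

        The point of the design is that the “listening” everyone worries about is a tiny, fixed pattern match; recording anything else requires the gate to have tripped first.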

      • IsThisAnAI@lemmy.world · 4 months ago

        Let me ask: do you understand how the wake-word chip works? Or are you just imagining an absurd edge case where a researcher somehow bridges the gapped chips while having the device in person?

        What do you believe is a viable attack vector that somehow nobody knows about?