Google’s latest flagship smartphone raises concerns about user privacy and security. It frequently transmits private user data to the tech giant before any app is installed. Moreover, the Cybernews research team has discovered that it potentially has remote management capabilities without user awareness or approval.
Cybernews researchers analyzed the new Pixel 9 Pro XL smartphone’s web traffic, focusing on what a new smartphone sends to Google.
“Every 15 minutes, Google Pixel 9 Pro XL sends a data packet to Google. The device shares location, email address, phone number, network status, and other telemetry. Even more concerning, the phone periodically attempts to download and run new code, potentially opening up security risks,” said Aras Nazarovas, a security researcher at Cybernews…
… “The amount of data transmitted and the potential for remote management casts doubt on who truly owns the device. Users may have paid for it, but the deep integration of surveillance systems in the ecosystem may leave users vulnerable to privacy violations,” Nazarovas said…
Yes you can: https://grapheneos.org/
I was just wondering earlier today whether Google only kept the bootloader open for custom OS installation because they have other hardware on the phone that would send them user data anyway, possibly through covert side channels.
Like they could have circuitry that encodes data in the lower bits of the timestamps attached to outgoing packets, with listeners on the cell network side picking it up. That would be very difficult to detect (I'm having trouble thinking of a way to determine whether it's happening even if you knew to look for it).
Or maybe there's sleeper code that can be sent to "wake up" the phone's secret circuitry and have it send data in bulk when Google decides they want something specific (since encoding in timestamps would be pretty low bandwidth). That would also make detection by traffic analysis more difficult, since most of the time it isn't sending anything at all.
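To make the timestamp idea concrete, here's a rough, entirely hypothetical sketch of what encoding one secret bit per packet into the low bit of a microsecond timestamp could look like; the point is that a ±1 µs nudge disappears into normal clock jitter, which is why it would be so hard to prove from the outside:

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical sketch: hide one secret bit per packet in the least
 * significant bit of a microsecond timestamp. The +/-1 us change is far
 * smaller than normal clock jitter, which is why spotting it from the
 * outside is so hard. */
static uint64_t embed_bit(uint64_t ts_us, int secret_bit)
{
    return (ts_us & ~1ULL) | (uint64_t)(secret_bit & 1);
}

/* Receiver side: collect the low bits of observed timestamps and
 * reassemble them into bytes. */
static void extract_bits(const uint64_t *ts, size_t n, uint8_t *out)
{
    for (size_t i = 0; i < n; i++)
        out[i / 8] |= (uint8_t)((ts[i] & 1) << (i % 8));
}

int main(void)
{
    uint64_t ts[8];
    uint64_t base = 1726500000000000ULL;  /* made-up starting time, in us */
    uint8_t secret = 'G';                 /* one byte to smuggle out */
    uint8_t recovered = 0;

    /* One check-in every 15 minutes (900,000,000 us), one bit each. */
    for (int i = 0; i < 8; i++)
        ts[i] = embed_bit(base + (uint64_t)i * 900000000ULL, (secret >> i) & 1);

    extract_bits(ts, 8, &recovered);
    printf("recovered byte: %c\n", recovered);  /* prints: G */
    return 0;
}
```

At one bit per periodic check-in the bandwidth is tiny, which is exactly why the bulk "wake-up" path would be the more worrying half of the speculation above.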
This is just speculation, but I've noticed a pattern: I speculate that something is technically possible, assume there's no way they'd actually be doing it, and later find out I was actually underestimating what they were doing.
I don't mean to discredit your opinion, but it is pure speculation and falls into the category of conspiracy theories. There are plenty of compelling arguments why this is likely completely wrong:
The Pixel 9 is sold in 32 countries around the world (my bad, I originally had an outdated number in mind). Do you really think Google would spend all that money on research, custom manufacturing, software development and maintenance to extract this tiny bit of data from a relatively small number of users? I'd say more than 90% of Pixel owners use the stock OS anyway, so it really doesn't matter. And Google already has access to all the user data on around 70% of the smartphones in the world through their rootkits (Google Play services and framework, which are installed as system apps and granted special privileges), which lets them collect far more data than they ever could from Pixel users.

You're right that it's pure speculation just based on technical possibilities, and I hope you're right to think it should be dismissed.
But with the way microchip design and manufacturing work (it wouldn't be at the PCB level, it would be hidden inside the SoC), I think it's possible for a small number of people to make this happen, maybe even a single technical actor on the right team. Chips are typically designed with a lot of diagnostic circuitry that could be used to access arbitrary data on the chip; the only secret part would be, say, a bridge from the cell signal to that diagnostic bus. The rest would be designed and validated by teams thinking it's perfectly normal (and it is, other than leaving an open pathway to it).
Then if you have access to arbitrary registers or memory on the chip, you can use that to load arbitrary firmware into one of the many microprocessors on the SoC (not the main CPU cores, where someone might notice that a core has woken up and is running code that came from nowhere), then write its program counter to make it run that code, which can then do whatever that microprocessor is capable of.
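To give a sense of how small the "secret" piece could be, below is a purely illustrative, bare-metal-style fragment with made-up register addresses that don't belong to any real SoC. It sketches the sequence being described: copy a blob into a coprocessor's SRAM through a diagnostic address/data window, point its boot vector at it, release it from reset. It's essentially the same sequence a legitimate firmware loader performs; the only "secret" ingredient in the scenario above would be an unadvertised path into that window.

```c
#include <stdint.h>
#include <stddef.h>

/* Purely illustrative: every address and register name below is made up
 * and does not correspond to any real SoC. */
#define DIAG_ADDR_REG   ((volatile uint32_t *)0x40001000u) /* hypothetical */
#define DIAG_DATA_REG   ((volatile uint32_t *)0x40001004u) /* hypothetical */
#define COPROC_SRAM     0x20000000u                        /* hypothetical */
#define COPROC_BOOTVEC  ((volatile uint32_t *)0x40002000u) /* hypothetical */
#define COPROC_RESET    ((volatile uint32_t *)0x40002004u) /* hypothetical */

/* Write one word anywhere in the coprocessor's address space through a
 * diagnostic address/data register pair. */
static void diag_write(uint32_t addr, uint32_t value)
{
    *DIAG_ADDR_REG = addr;
    *DIAG_DATA_REG = value;
}

/* Copy a firmware image into SRAM word by word, point the boot vector at
 * it, then release the core from reset so it starts executing the blob. */
static void load_and_start(const uint32_t *image, size_t words)
{
    for (size_t i = 0; i < words; i++)
        diag_write(COPROC_SRAM + 4u * (uint32_t)i, image[i]);

    *COPROC_BOOTVEC = COPROC_SRAM;  /* effectively the program counter after reset */
    *COPROC_RESET   = 0;            /* deassert reset: core runs the blob */
}
```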
I don’t think it would be feasible for mass surveillance, because that would take infrastructure that would require a team that understands what’s going on to build, run, and maintain.
But it could be used for smaller scale surveillance, like targeted at specific individuals.
But yeah, this is just speculation based on what's technically possible. The only reason I'm giving it serious thought is that I once figured it was technically possible for apps to listen in on your mic, feed the audio into a speech-to-text algorithm, and send the text back home hidden among other normal packets, but assumed they probably weren't doing it. Then I kept hearing stories about uncanny ads popping up about a discussion held in the presence of the phone, and more recently it came out that FB was doing that. So I wouldn't put it past them to actually do something like this.
Why would this only be present in Pixels, then? Google isn't interested in specific people; intelligence agencies are. This would mean that every phone in the world would need to be compromised using this sophisticated, stealthy technology, which is even more unlikely.
If it is present there, it doesn’t imply it’s only present there.
And we really have no idea how close of a relationship Google, or any other corp for that matter, has with various intelligence agencies. Same thing with infiltrations by intelligence agencies.
And no, it doesn't mean that every phone in the world is compromised with this, which wouldn't even need to be that sophisticated, just stealthy. The sophisticated part would already exist as part of the normal design process: it's called DFT, or design-for-test, if you want to read about it, and it's used legitimately to work out which parts of a chip have manufacturing flaws for chip binning.
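For anyone curious what that test circuitry actually does: a scan chain strings the chip's internal flip-flops into one long shift register, so a tester can clock internal state out (or new state in) one bit at a time. A toy software model of the idea, nothing vendor-specific:

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Toy model of a DFT scan chain: the chip's internal flip-flops are strung
 * together into one long shift register. In scan mode, each clock shifts
 * the whole chain by one bit, so whatever drives the chain can read out
 * internal state or shift new state in. */
#define CHAIN_LEN 16

struct scan_chain {
    uint8_t ff[CHAIN_LEN];   /* one bit per flip-flop */
};

/* One scan clock: shift the chain toward the output, feed scan_in at the
 * other end, and return the bit that falls off the end (scan_out). */
static uint8_t scan_clock(struct scan_chain *c, uint8_t scan_in)
{
    uint8_t scan_out = c->ff[CHAIN_LEN - 1];
    memmove(&c->ff[1], &c->ff[0], CHAIN_LEN - 1);
    c->ff[0] = scan_in & 1;
    return scan_out;
}

int main(void)
{
    /* Pretend these flip-flops currently hold some internal register value. */
    struct scan_chain c = {{1,0,1,1, 0,0,1,0, 1,1,1,0, 0,1,0,1}};

    printf("dumped state: ");
    for (int i = 0; i < CHAIN_LEN; i++)
        printf("%u", scan_clock(&c, 0));   /* shift the whole state out */
    printf("\n");
    return 0;
}
```

The legitimate use is exactly that: dump state, compare against expected values, bin the chip. The speculation above boils down to asking what happens if something other than the external tester can drive that chain.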
Most phones don't have an unlockable bootloader, and this post is about the data Google is pulling on factory Pixels.
Why would they do all the work on the software side and then offer a device themselves that allows you to remove their software entirely? And if it's worth it just for the "make more money from people who only want unlocked phones" angle, why isn't it more common?
Mind you, my next phone might still be a Pixel. Even if this stuff is actually there, I wouldn't expect to be targeted. I can't help but wonder about it, though: just how deep does the surveillance, or the potential for it, go?
The Pixel is a good phone for testing the latest Android features for development purposes. I would imagine that to some degree they are targeting developers interested in testing software by offering the ability to unlock and relock the bootloader. This fosters a vibrant developer community and encourages innovation. Certain things can be tested in an Android emulator, but it helps to have a real device to test on as well.
Pixels often ship with hardware features that other phones include later. For example, the Pixel 8 was the first phone with hardware memory tagging (the Arm Memory Tagging Extension, MTE); developers who wanted to test that feature would buy a Pixel first and then apply that experience to the devices their own company is manufacturing. Pixels are also released with new Android versions that implement Android features and APIs the way they were intended to work; there have been cases of OEMs releasing devices with broken implementations of standard Android features.
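To make the MTE example concrete, here's roughly the kind of smoke test a developer might run on a Pixel 8: a deliberate one-byte heap overflow that a build with memory tagging enabled (e.g. android:memtagMode="sync" in the app manifest, on MTE-capable hardware) turns into an immediate tag-check fault, while most other devices let it slide silently.

```c
#include <stdlib.h>
#include <string.h>
#include <stdio.h>

/* Rough MTE smoke test. Without memory tagging this one-byte heap overflow
 * usually goes unnoticed; with MTE in sync mode the store lands in a granule
 * carrying a different tag, and the process gets a tag-check fault (SIGSEGV)
 * right at the bad write, pointing straight at the bug. */
int main(void)
{
    char *buf = malloc(16);
    if (!buf)
        return 1;

    memset(buf, 'A', 16);
    buf[16] = 'B';   /* off-by-one write past the end of the allocation */

    printf("no fault: memory tagging is off or unsupported here\n");
    free(buf);
    return 0;
}
```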
The Pixel was the first phone with StrongBox as well. Additionally, it was the first Android phone with satellite connectivity.
It also attracts the segment of the market that just enjoys modifying their phones. So basically they are targeting the power-user community and developers. Even though the Pixel lets you install custom verified boot keys and custom OSes, Google knows that very few users use those features, so it doesn't cut into their Play Store and Play Services market share very much.
Ok let’s assume this is true, and US intelligence agencies have actually backdoored all US phone manufacturers. What about foreign phones? If this was true, someone the NSA is interested in could just defend themselves by e.g. buying a Chinese phone. All this effort, just to be defeated by foreign phone manufacturers? It wouldn’t be worth it, which is why it’s so highly unlikely.
Well, to this point (I don't 100% believe this flavor of state-surveillance theory, but): you cannot buy phones made by foreign manufacturers and have them work in the US. For example, Oppo, Huawei and Xiaomi all don't work on US cell networks, and you can't buy them unless you go through an import process, just to name a few of the many. Granted, those are all Chinese manufacturers.
Wait what? Is that actually true? What if you are a foreigner visiting the US and bring your e.g. Oppo phone with you? You can’t use it? Even with a foreign SIM?
As the saying goes, just because you’re paranoid, doesn’t mean you’re wrong.
The answer that will put this question to bed is open source hardware. Thankfully we’re close to having viable options, finally.
I will never understand buying a Google phone just to deGoogle it. Why would you give them money?
I’ve seen the reasoning, I just …
Because I want a secure phone with relatively good specs, relatively good design, battery life and camera quality. And because it is one of the very few devices with a user-unlockable and re-lockable bootloader.
@averyminya @Andromxda GrapheneOS is the SOTA of Android security, and it only supports Pixels, that's why.
Right, like I said I’ve seen the reasoning. It just seems like giving money to the very company you’re all trying to avoid, which in turn is just funding for Google to be more invasive.
@averyminya bought it secondhand, problem solved
Certainly helps!