this post was submitted on 22 May 2024
199 points (98.5% liked)
you are viewing a single comment's thread
Anything is legal when you force the customer to either agree to it or stop using your product. They can say whatever they want in the ToS because it's 365 pages long and only attorneys can understand what it actually says.
You can turn this feature off without any problems.
And based on their track record, they will just quietly turn it back on.
Microsoft is so far beyond the benefit of the doubt they couldn't get back to it if they tried.
Are there actually any documented cases of them re-enabling userland features after they've been disabled? The only thing I'd heard of before was registry edits / telemetry changes being undone. Not that that's cool either, of course, but at least it's not like it asks for your privacy settings during setup and then undoes your choices. Maybe I'm just out of the loop.
Generally though, what do you think would actually be Microsoft's motivation to randomly re-enable this particular feature? Do you think that the claim that the data doesn't leave the device is a lie?
Does it get much worse than telemetry settings being quietly re-enabled? It's spyware at the best of times, let alone when they get sneaky about it. And I've definitely had privacy/telemetry options that I set during setup get changed back, multiple times.
I don't think they're stupid enough to come out with the full data-harvesting machine on day one. They'll release what they can get away with - in this case, taking screenshots and storing them locally - and boil the metaphorical frog from there. Maybe they offer more powerful AI by running it through their servers, and then they can start "accidentally" opting people into that "service".
I'm not even necessarily saying there's some grand scheme going on here, but nobody can possibly deny they have every incentive to push that boundary until it breaks, and they have consistently shown that they will pursue that incentive without any regard for user privacy whatsoever.
We know this because they have done it so many times before.