This seems to be the case, but Congress is doing an awful job of communicating the danger to the public. A lot of people will likely be angry at Biden when he signs this if there's no effort to justify the targeted action.
I feel like it isn't Congress's job to do that. They shouldn't have to relay information to the public on subjects they aren't experts in. They can share their thought process and rationale for supporting legislation, but we shouldn't expect them to be technical experts. I bet fewer than 10 congressional representatives could look at a portion of code and make an educated statement about what's going on and how the authors may be performing abnormal operations or obfuscating other actions.
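To make that concrete, here's a purely hypothetical sketch (made up for illustration, not taken from TikTok or any real app) of how data-collection code can be hidden in plain sight: the destination is tucked into an encoded string and only assembled at runtime, so someone skimming it sees nothing unusual.

```python
import base64

# Hypothetical, illustrative example only -- not from any real application.
# At a glance this reads like routine "metrics reporting", but the endpoint
# is hidden in an encoded blob and only decoded at runtime, so a casual
# reviewer can't tell where the data is going.
_BLOB = "aHR0cHM6Ly9leGFtcGxlLmNvbS9jb2xsZWN0"  # decodes to https://example.com/collect

def report_metrics(device_id: str, contacts: list[str]) -> None:
    endpoint = base64.b64decode(_BLOB).decode()
    payload = {"i": device_id, "c": contacts}  # terse keys further mask intent
    # Stand-in for an actual network call:
    print(f"Would POST {payload} to {endpoint}")

report_metrics("device-123", ["alice", "bob"])
```

Spotting that kind of thing takes time and familiarity with the tooling, which is exactly the expertise most legislators don't have.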
It's the job of the organization(s) that prepare the security briefing, and we've already been hearing this kind of thing in the cybersecurity field for years. Those in the know, know; those who aren't tend not to believe it. Warnings about the potential for data harvesting and information operations via platforms like (and specifically) TikTok aren't new.
This is like public health information during COVID. Medical professionals have the training and experience to share their professional assessments, but large portions of the population were instead solely relying on politicians to deliver medical information.
So ban data harvesting? That seems pretty common sense. Instead we're giving our government autocratic powers to pick winners in the market.
So they can't talk to people about it because they're not technical experts, but they also have the authority to make decisions despite not being experts.
They can talk about it if they want to, but we shouldn't be using them as our only source of information. Curious why politicians voted X instead of Y? Look it up! See what experts in the field are saying.
You shouldn’t rely on them to tell you why TikTok is a threat the same way we shouldn’t rely on them to inform us on why weakening EPA standards is good for the environment, why taxing foreign trucks is good for the economy, or why drawing voting maps to concentrate demographics is good for democracy.
These politicians probably know enough to make an informed decision if they care to seek out the information, but they don't always have the time or desire to do that. Even if that's only true one time in a hundred, it covers a handful of politicians for every single piece of legislation that comes out, every single time.
The same way you may care about many things but only know a lot about a few of them, they legislate on everything while people act like they're experts on all of it. Why assume they know what they're talking about on every single topic?
Because I would hope they know what they're talking about when they write the legislation. Or at least can explain their reasoning for voting a certain way. Especially when nobody has made the case that TikTok is any more dangerous than Facebook or Twitter.
They largely don’t write the legislation. Lobby groups draft the materials and if we’re lucky, the congressional aides make a pass and clean things up.
You can search for why TikTok is dangerous. There are plenty of examples of how the application and platform are not being forthright with how they collect your identifiers and weaponize them for information operations campaigns.
As I mentioned earlier, the powers that be aren’t as worried about Facebook and the like because they’re US-based and have working relationships with law enforcement. Facebook has been used for the very campaigns that TikTok is being used for now, but a large difference is that another nation has near complete control over the platform.
Yes but they should at least be able to explain why they voted a certain way on that legislation, right?
I agree that social media is dangerous for all the reasons mentioned, but I don't see why Zuckerberg and Musk still get to do all those things.
I'm not arguing against them explaining their rationale. I originally argued that they shouldn't be taken as experts.
Zuckerberg and Musk "get" to do these things because they are in the US, with majority US-based workers, running on US-based infrastructure. If any of these platforms are being used to facilitate attacks against the US, the government can choose any number of methods to step in and enforce compliance to mitigate the threat. That's it. It's about free speech only in the sense that not all speech is protected: if somebody uses TikTok to perform the digital equivalent of yelling fire in a crowded theater, the government sees a need to control it.
If Facebook was run and operated out of Tunisia, I'd expect these same conversations to be happening with them as well.
Except Facebook and Twitter are being used to attack the US and spread disinformation and the government isn't doing anything.
But I guess that's what I should expect from a government that cares more about national security than the privacy of its citizens.
The US government has been caught doing the same thing... poorly. You probably aren't going to find a lot of sources showing that the US is fighting these fights on Facebook and Twitter, but you can read between the lines in interviews. In general, these kinds of things aren't performed out in the open.
Agree with you though. National security has trumped privacy. 9/11 changed a lot of things in a bad way.