this post was submitted on 11 Jul 2023
202 points (95.5% liked)
Technology
You're not entirely wrong, but I have a related degree and actually did polling back in the day, so I'll add some nuance.
Most reputable political polls are surprisingly good. Pollsters get it wrong far, far less often than people think they do. Which is astonishing, given you're often polling a thousand people to discover the opinions of millions. The problem is that people fail to read the small print, don't understand basic statistics or probabilities, and the media misreports what the polls actually say.
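To see why a thousand respondents can stand in for millions, here's a minimal sketch of the standard 95%-confidence margin-of-error formula for a simple random sample (the numbers are illustrative, not from any particular poll):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case 95% margin of error for a simple random sample.

    p = 0.5 maximizes p*(1-p), giving the familiar "plus or minus"
    figure pollsters report. Assumes a simple random sample; real
    polls use weighting, which changes this somewhat.
    """
    return z * math.sqrt(p * (1 - p) / n)

# A poll of 1,000 people: roughly +/- 3.1 percentage points
print(round(margin_of_error(1000) * 100, 1))  # 3.1
```

Note the margin shrinks with the square root of the sample size, so polling 4,000 people instead of 1,000 only halves the error, which is why ~1,000 is such a common sample size.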
Best example: the 2016 US election. No one who knew a bit about polling was at all surprised by Trump winning. IIRC, if you aggregated the polls, he had about a 1/3 chance of winning. Him winning was invariably within the margin of error of many, many polls. But the media misinterpreted them and then blamed bad polling for their own mistakes.
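A 1-in-3 chance is not a fringe event, and a toy simulation makes that concrete (the 1/3 figure here is just the aggregate estimate mentioned above, not real election data):

```python
import random

# Simulate many elections where the "underdog" has a 1-in-3 chance.
# An event with probability 1/3 happens about a third of the time --
# roughly as often as rolling a 1 or 2 on a die. Not a shock.
random.seed(42)
trials = 100_000
wins = sum(random.random() < 1 / 3 for _ in range(trials))
print(wins / trials)  # close to 0.333
```

The forecast wasn't wrong; the surprise was people rounding "1/3" down to "won't happen".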
And that's not surprising. Polling how someone will tick a box on election day in the near future, by asking them to do the equivalent of ticking a box in a poll? That's likely to be quite a good predictor.
More vague stuff like this is harder. You're not necessarily measuring what you think you're measuring, and because the media invariably misrepresents scientific studies and polls, you need to read the small print and see what they actually asked.
In any case, here's the pollster's article on it (including sample size, methodology, etc.):
https://www.pewresearch.org/short-reads/2023/07/10/majority-of-americans-say-tiktok-is-a-threat-to-national-security/
And the questions they asked:
https://www.pewresearch.org/wp-content/uploads/2023/07/SR_2023.07.10.23_tiktok_topline.pdf
For example, it would have been interesting if they'd asked "Is TikTok a threat to national security in the United States?" rather than "How much of a threat...?"
Changing the answer scale would likely also have resulted in different answers.
Also, do respondents know what national security is? It's a pretty vague term for a layperson.
Hell, do all respondents even know what TikTok is? Because if you asked people whether the Umbrella Corporation is a threat to national security, it's likely that many would answer yes.