I am a long-time user of the NoScript extension (https://noscript.net/). For those who don't know it: NoScript automatically blocks all JavaScript and lets you accept scripts (temporarily or permanently) based on their origin domain.

NoScript has some quality-of-life options, like accepting scripts from the current page's domain by default so that only third parties are blocked (useful on mobile, where it is tedious to go through the menu).
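
To make that model concrete, here is a minimal sketch of NoScript-style per-origin filtering. This is invented pseudologic with hypothetical example domains, not NoScript's actual implementation:

```typescript
// Illustrative sketch of NoScript-style per-origin script filtering.
// This is invented pseudologic, not NoScript's actual implementation.
type Policy = "allow" | "temp" | "block";

// Per-domain rules the user has saved (hypothetical example data).
const rules = new Map<string, Policy>([
  ["trusted.example", "allow"],
  ["cdn.example", "temp"],
]);

function decide(scriptUrl: string, pageUrl: string, allowFirstParty: boolean): Policy {
  const scriptHost = new URL(scriptUrl).hostname;
  const pageHost = new URL(pageUrl).hostname;
  // The quality-of-life option mentioned above: first-party scripts pass
  // automatically, so only third parties need explicit rules.
  if (allowFirstParty && scriptHost === pageHost) return "allow";
  // Everything else falls back to the saved rule, defaulting to "block".
  return rules.get(scriptHost) ?? "block";
}

console.log(decide("https://news.example/app.js", "https://news.example/", true)); // "allow" (first party)
console.log(decide("https://cdn.example/lib.js", "https://news.example/", true));  // "temp" (saved rule)
console.log(decide("https://ads.example/t.js", "https://news.example/", true));    // "block" (default)
```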

When I saw LibreJS (https://www.gnu.org/software/librejs/), I thought it would be a better version of NoScript, but it is quite different in usage: it cares about the license a script declares, not about whether the code is actually open source (maybe it can't do more).

Am I the only one who has thought about filtering for open-source JS scripts (at least by default)? This would require reproducible 'compilation'/packaging. I think that with lock files (npm, yarn, etc.) this could be doable, and we could have some automated checks of the code.
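
As a rough illustration of what such an automated check could look like, here is a sketch that rebuilds a site's bundle from its published source and compares hashes against what the server serves. The repo URL, pinned commit, build command, and output path are all hypothetical; a real scheme would need a standard way for sites to publish them:

```typescript
// Hypothetical reproducibility check: rebuild a site's JS bundle from its
// published source and compare it against what the server actually serves.
// Repo URL, commit, build command, and output path are assumptions for
// illustration; a real scheme would need a standard way to publish them.
import { createHash } from "node:crypto";
import { readFileSync } from "node:fs";
import { execSync } from "node:child_process";

const repo = "https://example.com/site.git";                  // hypothetical
const commit = "abc123";                                      // hypothetical pinned commit
const servedBundleUrl = "https://example.com/assets/app.js";  // hypothetical

function sha256(data: Uint8Array): string {
  return createHash("sha256").update(data).digest("hex");
}

async function main() {
  // 1. Fetch exactly what the browser would receive.
  const served = new Uint8Array(await (await fetch(servedBundleUrl)).arrayBuffer());

  // 2. Rebuild from source. `npm ci` installs the exact versions pinned in
  //    the lock file, which is what makes this check plausible at all.
  execSync(`git clone ${repo} build && git -C build checkout ${commit}`);
  execSync("npm ci && npm run build", { cwd: "build" });
  const rebuilt = readFileSync("build/dist/app.js");           // hypothetical output path

  // 3. This only proves anything if the build is byte-for-byte reproducible.
  console.log(sha256(served) === sha256(rebuilt)
    ? "served bundle matches the source"
    : "served bundle does NOT match the source");
}

main().catch(console.error);
```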

The trust system for who does the checking could be a problem, though. I've wanted to discuss this for a while.

9point6@lemmy.world 2 points 4 months ago (last edited 4 months ago)

No need to get aggravated; I completely grasp it. If that's your takeaway, you've possibly misunderstood or not entirely read my comment.

I'm not talking about server code specifically; I'm going through the stages between the source code repo(s) and what your browser ends up receiving when you request a site.

NodeJS is relevant here because it's what runs nearly all major JS bundlers (webpack, Vite, etc.), which produce the code that ultimately runs in the browser for most websites you use. In a mathematical sense, the full set of dependencies for that process is part of the input to the function that outputs the JS bundle(s).
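
To illustrate that point, here's a conceptual stand-in for the bundling pipeline, with invented names (this is not how any actual bundler is implemented):

```typescript
// Conceptual stand-in for the bundling pipeline, with invented names: the
// bundle is a pure function of *all* its inputs, not just the site's own source.
import { createHash } from "node:crypto";

interface BuildInputs {
  appSource: string;    // the site's own code
  lockfile: string;     // package-lock.json / yarn.lock: pins every dependency
  bundler: string;      // e.g. "webpack@5.90.0"
  nodeVersion: string;  // the runtime doing the bundling
  config: string;       // bundler config, env vars, feature flags...
}

// Any change to any input yields a different output, so verifying the
// output means knowing every input exactly.
function bundleFingerprint(inputs: BuildInputs): string {
  return createHash("sha256").update(JSON.stringify(inputs)).digest("hex");
}

const base: BuildInputs = {
  appSource: "export const x = 1;",
  lockfile: "left-pad@1.3.0",
  bundler: "webpack@5.90.0",
  nodeVersion: "20.11.0",
  config: "mode=production",
};

// Bumping one transitive dependency changes the result, even though the
// site's own source is untouched.
const bumped = { ...base, lockfile: "left-pad@1.3.1" };
console.log(bundleFingerprint(base) === bundleFingerprint(bumped)); // false
```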

I'm not really sure what you mean by that last part. Anyone hosting something on the internet has to care about that stuff, not just businesses. GDPR can target individuals just as easily as for-profit companies; it's about the safety of the data, not who holds it. I'm assuming you would not want to go personally bankrupt over a deliberate neglect of security? Similarly, if your website doesn't hit the performance NFRs that search engines set, no one will ever find it in search results, because it'll be down on page 100. You will not be visiting websites that don't care about this stuff.

Either way, all of that is wider reasoning for the main point, which we're getting away from a bit, so I'll try to summarise as best I can:

Basically, unless you intend your idea to work only on entirely open-source websites (which comprise a tiny percentage of the web), you're going to have to contend with these JS bundles, which, as I've gone into, is basically an insurmountable task because you don't have the complete set of inputs.

If you do intend it to work only with those completely open-source websites, then crack on, I guess. There's still what looks to me like a crazy amount of things to figure out in order to create a filter that won't be able to work with nearly all web traffic, but if that's still worth it to you, then don't let me convince you otherwise.

Edit: typo