Thanks, it's nice to have someone knowledgeable here.
Would you say Matrix is censorship-resistant? I have very limited knowledge of it, but given what you said, I imagine that if I were trying to block Matrix I would just need to query the URL of the text file and check for the DNS entry; if either exists, add the domain to the blocklist.
OK, this raises a question for me. How do you find a URL like this when it isn't linked on their site or anywhere else? I know it must be possible to dump a site's URL list to a text file; I'm just wondering how.
Say I want to find all the super secret pages on www.subgenius.com. They link some of them, but suppose www.subgenius.com/pam1/pamphlet.html wasn't directly linked (it is, but pretend lol) yet could still be reached at that URL. How would I find it? Can you just run something like `someprogram -a www.subgenius.com -o subgenius.txt`? Because that would be cool.
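For illustration, a toy version of that could look like the sketch below (Python; `someprogram` above is made up, and everything here is too; in practice you'd reach for an existing crawler such as wget's recursive spider mode). Note that a crawler like this can only ever find pages that are linked from somewhere, which is exactly the gap the brute-force discussion below addresses.

```python
# Toy crawler: breadth-first walk of same-host links, dumping each
# fetched URL to a text file. Illustrative only -- real crawlers
# handle robots.txt, retries, content types, and much more.
import urllib.parse
from collections import deque
from html.parser import HTMLParser

import requests


class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, out_path, limit=500):
    host = urllib.parse.urlparse(start_url).netloc
    seen = {start_url}
    queue = deque([start_url])
    with open(out_path, "w") as out:
        while queue:
            url = queue.popleft()
            try:
                resp = requests.get(url, timeout=10)
            except requests.RequestException:
                continue
            out.write(url + "\n")
            parser = LinkExtractor()
            parser.feed(resp.text)
            for href in parser.links:
                absolute = urllib.parse.urljoin(url, href).split("#")[0]
                # Stay on the same host; a crawler only ever finds
                # pages that some other page links to.
                if (urllib.parse.urlparse(absolute).netloc == host
                        and absolute not in seen and len(seen) < limit):
                    seen.add(absolute)
                    queue.append(absolute)


crawl("https://www.subgenius.com/", "subgenius.txt")
```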
Maybe I've misunderstood how it works. I thought that when connecting to a Matrix instance you would point at the domain name, and the text file would sit at a standard location (as with `/robots.txt`, or all the files in `/.well-known/`), so it would be easily discoverable. In fact, I just checked, and Matrix does use `/.well-known/`, so one should be able to identify Matrix servers by querying those URLs (see the sketch below). Unless there is a way to use a non-standard location, but that would require further configuration on the client, I guess.

And just to answer your question: the only way to find a hidden file like that would be to brute-force it. That could obviously be extremely time-consuming if the URL is long and random enough, especially if you add rate limiting (though that last measure could be circumvented by scanning from multiple IPs, which would be easy for a state actor).
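To make that concrete: the Matrix spec publishes discovery data at `/.well-known/matrix/server` (server-to-server delegation) and `/.well-known/matrix/client` (client homeserver discovery), so a probe could be as simple as the sketch below. The function name and the `m.`-prefix heuristic are made up for the example, not part of any real tool.

```python
# Sketch of fingerprinting a Matrix homeserver via its well-known
# endpoints, which are defined in the Matrix spec. The "m." key
# prefix check is an illustrative heuristic.
import requests

WELL_KNOWN_PATHS = (
    "/.well-known/matrix/server",  # server-to-server delegation
    "/.well-known/matrix/client",  # client homeserver discovery
)


def looks_like_matrix(domain):
    for path in WELL_KNOWN_PATHS:
        try:
            data = requests.get(f"https://{domain}{path}", timeout=5).json()
        except (requests.RequestException, ValueError):
            continue
        # Spec-defined keys like "m.server" and "m.homeserver"
        # all carry the "m." prefix.
        if isinstance(data, dict) and any(k.startswith("m.") for k in data):
            return True
    return False


print(looks_like_matrix("matrix.org"))  # expected: True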
Edit: I've just realized I wasn't replying to the same person; the first part of this message was meant more for @TarantulaFudge@startrek.website.
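To illustrate the brute-force point above, here's a minimal wordlist probe in the spirit of real tools like dirb or gobuster. The candidate list is a stand-in (real tools ship lists with many thousands of entries), and as noted above, this becomes hopeless for long random names or rate-limited servers.

```python
# Illustrative wordlist probe: try candidate paths and report the
# ones that respond. Real tools also fall back to GET, since some
# servers reject HEAD requests.
import requests

CANDIDATES = ["admin", "secret", "pam1/pamphlet.html", "backup.zip"]


def probe(base_url):
    found = []
    for path in CANDIDATES:
        url = f"{base_url.rstrip('/')}/{path}"
        try:
            resp = requests.head(url, timeout=5, allow_redirects=True)
        except requests.RequestException:
            continue
        if resp.status_code == 200:
            found.append(url)
    return found


print(probe("https://www.subgenius.com"))
```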
Ah maybe I've misunderstood then, lol. I didn't know any of that. Oh well!
Yeah, the main thing is that the ports and addresses can change and it's no big deal. From a firewall perspective it's impossible to block them all, especially when the clients are making mundane HTTPS requests. Even if the server goes down or only has partial connectivity, the channel can still be used.
But this seems easy to block automatically, no? If a client queries an unknown domain, check for Matrix-related data in `/.well-known/`, and add the domain to the blocklist if any is there. And since the servers publicly advertise the port used, you just need to periodically re-check the list of known Matrix domains you built in the first step (a sketch of such a pipeline is below).

Russia is already doing DPI and blocking ESNI, so that seems easy for them. More widespread usage of ECH would help everyone, as Signal is advocating, but that's not the case yet.
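Sketching that hypothetical censor-side pipeline: the file path and domain feed are stand-ins, and the probe is the same well-known check as in the earlier sketch.

```python
# Hypothetical blocklist pipeline: probe domains seen in client
# traffic for Matrix well-known data, keep a persistent blocklist,
# and re-run periodically so the list keeps up as servers move.
import requests

BLOCKLIST_FILE = "matrix_blocklist.txt"  # stand-in path


def looks_like_matrix(domain):
    """Same well-known probe as sketched earlier in the thread."""
    for path in ("/.well-known/matrix/server", "/.well-known/matrix/client"):
        try:
            data = requests.get(f"https://{domain}{path}", timeout=5).json()
        except (requests.RequestException, ValueError):
            continue
        if isinstance(data, dict) and any(k.startswith("m.") for k in data):
            return True
    return False


def update_blocklist(candidate_domains):
    try:
        with open(BLOCKLIST_FILE) as f:
            blocked = {line.strip() for line in f if line.strip()}
    except FileNotFoundError:
        blocked = set()
    # Step 1: probe domains freshly observed in client traffic.
    blocked |= {d for d in candidate_domains if looks_like_matrix(d)}
    # Step 2: schedule this function periodically so known domains
    # keep getting re-checked as ports and addresses change.
    with open(BLOCKLIST_FILE, "w") as f:
        f.write("\n".join(sorted(blocked)) + "\n")
```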