I was under the impression that higher-bandwidth wireless networks required higher frequency bands for that data. Like, a given frequency band should have a theoretical maximum data transfer rate, and the only way to get around that would be some kind of fancy compression algorithm.
That is correct.
However, the lowest GSM frequency was 300 MHz, so there is still quite a lot of bandwidth there (if I'm not mistaken, up to a theoretical maximum of 600 Mbit/s for a 2-level signal, though in practice quite a lot less, since these are radio waves rather than signals on circuit lines, so encoding schemes have to be designed for a lot more noise and other problems).
Anyway, the point being that the right encoding scheme can extract quite a few Mbit/s even from the 300 MHz band.
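A minimal sketch of the back-of-the-envelope math above, assuming the Nyquist limit of 2 symbols per second per Hz of bandwidth and 1 bit per symbol for a 2-level signal; the 300 MHz figure is just the one quoted in the comment, and noise, coding overhead, etc. are ignored:

```python
# Back-of-the-envelope sketch (assumptions: Nyquist rate of 2 symbols/s per Hz,
# 1 bit per symbol for a 2-level signal, no noise or coding overhead).
bandwidth_hz = 300e6                      # figure quoted in the comment above
bits_per_symbol = 1                       # 2-level signal -> 1 bit per symbol
symbol_rate = 2 * bandwidth_hz            # Nyquist: at most 2*B symbols per second
bit_rate = symbol_rate * bits_per_symbol  # theoretical maximum bit rate
print(f"{bit_rate / 1e6:.0f} Mbit/s")     # -> 600 Mbit/s
```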
Frequency isn't that relevant; it's the frequency bandwidth that matters. The bit rate is n/T, with n being bits per symbol and T the symbol duration, which itself is 1/B, with B being the frequency bandwidth. If you want to increase the bit rate, you can either increase the number of bits per symbol or increase the frequency bandwidth. 5G allows bandwidths up to 400 MHz per channel; there isn't enough space in the lower frequency ranges for such large bandwidths, so you go up.
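A quick sketch of the relation described above: R = n/T with T = 1/B, so R = n·B. The 400 MHz channel width comes from the comment; the 8 bits per symbol (e.g. 256-QAM) is just an illustrative assumption, and coding overhead, guard bands, MIMO, etc. are ignored:

```python
# Idealised single-carrier bit rate: R = n / T, with T = 1 / B, so R = n * B.
# Illustrative only -- ignores coding overhead, guard bands, MIMO, etc.
def bit_rate(bits_per_symbol: int, bandwidth_hz: float) -> float:
    symbol_duration = 1.0 / bandwidth_hz      # T = 1 / B
    return bits_per_symbol / symbol_duration  # n / T = n * B

# Same bandwidth, more bits per symbol -> higher bit rate:
print(bit_rate(1, 400e6) / 1e6)  # 2-level signal over a 400 MHz channel: 400 Mbit/s
print(bit_rate(8, 400e6) / 1e6)  # 8 bits/symbol (assumed, e.g. 256-QAM): 3200 Mbit/s
```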