I thought they weren’t turning off 2G. What’s the benefit? Other than forcing it sooner? Most places that used 2G still get exceptional coverage from it.
Coverage is from frequency, not generation of signal encoding.
The benefit is you can reuse the frequency bands for something better, like 5G. That’s what they did in my country, among others. So, now we get 5G on 3 different frequency ranges. High speed and long range.
I was under the impression that higher-bandwidth wireless networks required higher frequency bands for that data. As in, a specific frequency should have a theoretical maximum data transfer rate, and the only way to get around that would be some kind of fancy compression algorithm.
That is correct.
However, the lowest GSM frequency was 300 MHz, so there is still quite a lot of bandwidth there (if I'm not mistaken, up to a theoretical maximum of 600 Mbit/s for a 2-level signal, though in practice quite a lot less, as these are radio waves rather than signals in circuit lines, so encoding schemes have to be designed for a lot more noise and other problems).
Anyway, the point being that the right encoding scheme can extract some Mbit/s even from the 300 MHz band.
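To make that concrete, here's a minimal sketch of the arithmetic behind the 600 Mbit/s figure, assuming the classic Nyquist limit of 2 symbols per second per hertz and, purely for illustration, treating the whole 300 MHz as one usable band (real GSM channel plans are far narrower):

```python
import math

# Sketch of the Nyquist-limit arithmetic (assumption: the full 300 MHz is
# treated as a single usable band; actual GSM channels are much narrower).
bandwidth_hz = 300e6                 # assumed bandwidth, in Hz
levels = 2                           # 2-level (binary) signalling

bits_per_symbol = math.log2(levels)  # 1 bit per symbol for 2 levels
symbol_rate = 2 * bandwidth_hz       # Nyquist: at most 2 symbols/s per Hz
bit_rate = symbol_rate * bits_per_symbol

print(f"{bit_rate / 1e6:.0f} Mbit/s")  # -> 600 Mbit/s, a noise-free upper bound
```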
Frequency isn't that relevant; it's frequency bandwidth. The bit rate is n/T, with n being bits per symbol and T the symbol duration, which itself is 1/B, with B being the frequency bandwidth. If you want to increase the bit rate, you can either increase the number of bits per symbol or increase the frequency bandwidth. 5G allows bandwidths up to 400 MHz per channel, and there isn't enough space in the lower frequency ranges for such large bandwidths, so you go up.
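A rough sketch of that relation (bit rate = n/T = n × B) with illustrative numbers; the 8 bits per symbol (256-QAM-style modulation) is just an assumption, and real links lose throughput to coding overhead, guard bands and noise:

```python
def bit_rate(bits_per_symbol: float, bandwidth_hz: float) -> float:
    """Ideal bit rate n/T with symbol duration T = 1/B, i.e. simply n * B."""
    symbol_duration = 1 / bandwidth_hz          # T = 1/B
    return bits_per_symbol / symbol_duration    # n/T = n * B

# Same assumed modulation (8 bits/symbol), different channel widths:
print(bit_rate(8, 20e6) / 1e6, "Mbit/s")    # 20 MHz low-band channel -> 160.0 Mbit/s
print(bit_rate(8, 400e6) / 1e9, "Gbit/s")   # 400 MHz 5G channel      -> 3.2 Gbit/s
```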
Isn't the infrastructure for 2G also a factor? Over here, for example, we have lots of towers in remote mountain regions, and it's rather complicated to upgrade all of them. It can be done, but it will take a while.
Not complicated at all. For the most part, all they do is swap one box - the transmitter. That’s it.
(However, that doesn’t consider other things, like improvements in redundancy and safety, or construction standards that didn’t exist back then.)
But really, all that needs to be done is pull out one box, and slide in a new box. Not complicated at all.