I really wish standards like WiFi simply covered more use cases like this. All it would take is a few extra modulation and coding modes in the standard.
I should be able to walk 5 miles from my house, and still have my house wifi work, but at a data rate of 0.3 kbps.
I should never have wifi 'cutting in and out' or 'on the edge of the range' - it should be a simple matter of not being fast enough if I'm too far away. If Shannon[1] allows it, the standard should allow it too.
Obviously my phone would probably choose to switch to mobile data long before that. But for some use cases, like instant messaging, 0.3 kbps is fine, and there shouldn't be a need to use a totally different standard to achieve it.
> All it would take is a few extra modulation and coding modes in the standard. [...] but at a data rate of 0.3 kbps.
It would take more than just "a few extra modulation and coding modes". At 0.3 kbps, even a single 100-byte packet would take almost three seconds of airtime. (At the current slowest WiFi rate of 1 Mb/s, that packet takes well under a millisecond, and even that is slow enough that some newer APs disable the older 802.11b rates by default and make 6 Mb/s the slowest rate.) And due to how the WiFi protocol works, each AP periodically broadcasts a beacon at the slowest rate it accepts (usually every 100 time units of 1024 µs each, i.e. every 102.4 ms, so around 10 times per second), and AFAIK beacons are nowadays larger than 100 bytes.
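The airtime arithmetic is easy to check. A rough sketch (it ignores PHY preambles, MAC headers, ACKs, and inter-frame spacing, all of which only make things worse):

```python
# Rough airtime check (ignores preambles, headers, ACKs and
# inter-frame spacing, which all add further overhead).

PACKET_BITS = 100 * 8          # a single 100-byte packet

# Transmit time at the hypothetical 0.3 kbps long-range rate
t_slow = PACKET_BITS / 300     # seconds
print(f"100 B at 0.3 kbps: {t_slow:.2f} s of airtime")    # ~2.67 s

# Transmit time at the slowest standard 802.11b rate, 1 Mb/s
t_fast = PACKET_BITS / 1e6
print(f"100 B at 1 Mb/s:   {t_fast * 1000:.2f} ms")       # 0.80 ms

# Beacon interval is specified in 'time units' of 1024 us;
# the common default of 100 TU is 102.4 ms, i.e. ~9.8 beacons/s.
beacon_interval_s = 100 * 1024e-6
print(f"Beacons per second: {1 / beacon_interval_s:.2f}")  # ~9.77
```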
It would probably require at least changing how the clear channel assessment works, to allow for overlapping transmissions, but that's a fundamental part of the standard. Once you're going that far, it probably makes more sense to define a new protocol, one better tuned for that use case.
You'd probably simply have to define that beacons are still sent at 1 Mbps, but define a new 'beaconless' mode which can connect and work from a long distance away without the beacons.
Good news, everyone! There is a WiFi standard for it: 802.11ah[1]. It seems you can even buy hardware for it[2] today. It's most likely not going to show up in your phone anytime soon, though.
It also wouldn't be trivial on the hardware side. LoRa/NB-IoT and other long-range, low-speed, low-power protocols generally use sub-GHz bands: the 915 MHz ISM band in the US, around 868 MHz in Europe, with some using 433 MHz. That physically requires a completely separate RF path from the 2.4 GHz and 5.8 GHz bands WiFi uses. (No, even "integrated wideband transceivers" like the AD936x won't do: 1) their isolation is unusably bad and 2) they still have filtering requirements.)
This is also disregarding that "a few extra modulation schemes" is costly to implement. Narrowband schemes (like GMSK) are incompatible with wideband protocols like LoRa or WiFi that use spread spectrum/chipping or OFDM (at least if you want to stay in the same price/performance ballpark). Of course many radios nowadays use a digital baseband, so in theory they're flexible, but the front end is still a major limiting factor, unless you use many-GHz RF ADCs/DACs, which still suffer from the same wideband-vs-narrowband problems.
Also, most long range protocols are meant for ultra low power applications like smart-ag or energy harvesting which WiFi does not support.
Considering the existing amount of congestion with WiFi, if we want to increase range further, we'd likely need more chipping codes to fit in the same frequency allocation, which will increase the noise floor for everybody, thus reducing the range... etc
You'd run the whole thing over the same 2.4/5 GHz 20/40/80 MHz channels as wifi, but probably using a Gold-code spreading scheme. You'd pre-arrange which code you'll use (perhaps derived from the base station MAC address so there are no collisions even across a city). Simulations would have to be done of nearby high-bandwidth uncoded users coexisting with distant low-bandwidth coded users, along with defining a fair-ish sharing scheme between them.
The whole thing could probably be implemented with only custom firmware (i.e. no hardware changes). GPS receivers already do below-noise-floor reception of GPS signals - software-defined ones do it entirely in software - and this would be the same. You might want custom hardware to avoid the rather large power cost of leaving the main high-bandwidth receivers turned on the whole time, though!
You could use the same MIMO mechanisms to get a better channel gain and increase the total amount of data users can transfer in a given city-sized area.
Even with a very directive antenna (or its black-magic alternative, beamforming) it would use up a lot of spectrum over those 5 miles to deliver that 0.3 kbps, worsening connectivity for many people all around you. You'd hate it if everyone else near you were doing it too. Battery life would suck as well.
Better connect to something else nearer to you, making a fairer use of the available radio spectrum, or if you absolutely need to connect to your own equipment, switch to a protocol more suited for such needs, like LoRa or even a custom-purpose version of Wi-Fi called HaLow (aka .11ah: https://en.wikipedia.org/wiki/IEEE_802.11ah)
Very roughly, in the power-limited (low-SNR) regime, every doubling of distance cuts received power, and therefore the theoretical data rate, by a factor of 4. At 20 yards I can get 2 Gbps out of my wifi, so at 5 miles (about 440 times the distance) the same transmit power and antennas would give on the order of 10 kbps. Still plenty for instant messaging.
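That back-of-the-envelope scaling can be written down explicitly, assuming free-space inverse-square path loss and the power-limited regime where capacity scales roughly linearly with received power (both generous assumptions at 5 miles):

```python
# Back-of-the-envelope rate scaling with distance, assuming
# free-space path loss (received power ~ 1/d^2) and the
# power-limited regime (rate roughly linear in received power).

YARDS_PER_MILE = 1760

ref_rate_bps = 2e9                  # measured: 2 Gbps at 20 yards
ref_dist_yd = 20
far_dist_yd = 5 * YARDS_PER_MILE    # 8800 yards

# Power drops by (d_ref/d)^2, and so (in this regime) does the rate.
scale = (ref_dist_yd / far_dist_yd) ** 2
far_rate_bps = ref_rate_bps * scale

print(f"{far_rate_bps / 1e3:.1f} kbps")  # ~10.3 kbps
```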
> From what I'm gathering, Canopy can be deployed over unlicensed frequencies (2.4 and 5 GHz), allows for hundreds of subscribers connecting to a single Access Point, can provide bandwidth in the 5-10 Mbps range, etc.
You can't beat physics. You'll need some sort of line-of-sight over those 5 miles, or at least some path that allows signals to be reflected off something. Unless you venture into frequency ranges and transmit power levels where you can bounce signals off the ionosphere (or the moon), you are pretty limited in range without a repeater.
The closest to what you describe would actually be Bluetooth, but that's also a kitchen sink standard that tried to do everything and ended up with a lot of compatibility and connectivity issues that we still live with today.
[1]: https://en.wikipedia.org/wiki/Shannon%E2%80%93Hartley_theore...