The reason 6 GHz was introduced with WiFi 6E and 7 is that 2.4 GHz and 5 GHz were very congested.
My question is: why isn’t there anything in between? Why isn’t there a 3 GHz, 3.5 GHz, 4 GHz, etc.?
Also, what if things that require very little data transmission used something lower than 2.4 GHz for longer range? (1 GHz or something?)
When we talk about 2.4, 5, or 6 GHz, the devices don’t operate at exactly that frequency, but within a band centered more or less on that number. For example, 5 GHz is actually a set of channels between 5150 and 5895 MHz.
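For what it’s worth, the channel numbering inside each band follows a simple linear formula in the 802.11 standard. A rough Python sketch (the band labels and the channel-14 special case are my own simplification):

```python
# Sketch: 802.11 channel numbers map to center frequencies
# via simple linear formulas; channel 14 in 2.4 GHz is a
# legacy Japan-only exception that breaks the pattern.

def center_freq_mhz(band: str, channel: int) -> int:
    """Return the center frequency of a Wi-Fi channel in MHz."""
    if band == "2.4":
        if channel == 14:              # Japan-only legacy channel
            return 2484
        return 2407 + 5 * channel      # channels 1-13
    if band == "5":
        return 5000 + 5 * channel      # e.g. channel 36 -> 5180 MHz
    raise ValueError(f"unknown band: {band}")

print(center_freq_mhz("2.4", 6))       # 2437
print(center_freq_mhz("5", 36))        # 5180
```

So “5 GHz” is really shorthand for a whole set of 5xxx MHz channels, each 5 MHz apart in numbering (the actual channel widths are 20/40/80/160 MHz on top of that).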
Technically there’s 802.11y (3.65 GHz), 802.11j (4.9–5.0 GHz), etc. It’s just that several of these bands cannot be used universally across the globe, because they may be reserved for other purposes. By and large, the bands that end up being widely used are the ones that don’t require a license to operate in.
Reference: https://en.wikipedia.org/wiki/List_of_WLAN_channels
IIRC Ubiquiti makes a line of point-to-point Ethernet bridges that operate in the 20 GHz range (because more bandwidth, and if you have line of sight you don’t care about interference as much). Responsible vendors won’t even sell you one without sighting a license, because they can also get in trouble for selling it to you if it turns out you’re operating it illegally.
I remember that post from slazer2au… https://lemmy.world/post/19338754
The difference there, though, is that those devices are point-to-point, not broadcast/receive. IIRC there are different rules in place for direct line-of-sight devices, and misuse will land you a meeting with the FCC, hence why the more powerful variants require a license.
Great answer getting to the point of the question.
That makes sense: regulations and such over radio waves.