is that there are so many to choose from.
I just found out that 802.11a, which I'd had pegged as the weaker precursor of 802.11b (the WiFi standard), actually runs at 54 Mbits/second, just like 802.11g.
What does this mean? All I’ve been able to determine is that 802.11a is better in a lot of ways, most notably that it operates in the 5 GHz band rather than the increasingly crowded 2.4 GHz band where 802.11b, Bluetooth, and cordless phones operate.
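To keep the alphabet soup straight, here are the nominal numbers as I understand them; a quick sketch, and note these are signaling rates, not the throughput you'd actually see:

    # Nominal signaling rates and bands for the three standards.
    standards = {
        "802.11b": {"band_ghz": 2.4, "max_mbps": 11},
        "802.11a": {"band_ghz": 5.0, "max_mbps": 54},
        "802.11g": {"band_ghz": 2.4, "max_mbps": 54},
    }

    for name, spec in standards.items():
        print(f"{name}: {spec['max_mbps']} Mbit/s in the {spec['band_ghz']} GHz band")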
And to add to my confusion, mixed-mode a/b equipment has been out for a year, well ahead of 802.11g. So why has Apple embraced the g standard? I can only assume that its interoperability with the 802.11b hardware they already support was a strong selling point.
For the life of me, I can’t see a whole lot of use for 54 Mbit anyway. It’s not 100 Mbit, so it’s not the next logical step up from 10. I’d like to dig up the article where Bill Joy talks about the wireless network they (Sun) set up in Aspen and his conclusion that 1 Mbit was about all anyone used: more than that made no difference. That may not be true in these rich-media days, but is 10 Mbits/second enough?
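Just to put some rough numbers on that question, here's a back-of-envelope sketch; the 50% efficiency factor and the file sizes are assumptions I'm pulling out of thin air for illustration, not measurements:

    # Rough transfer-time estimates. EFFICIENCY and the file sizes are
    # assumptions for illustration only (real throughput varies widely).
    EFFICIENCY = 0.5  # assume roughly half the nominal rate is usable

    files_mb = {"5 MB MP3": 5, "700 MB movie": 700}
    links_mbps = {
        "1 Mbit (Joy's figure)": 1,
        "802.11b (11 Mbit)": 11,
        "802.11a/g (54 Mbit)": 54,
        "100BASE-T (100 Mbit)": 100,
    }

    for file_name, size_mb in files_mb.items():
        for link_name, rate_mbps in links_mbps.items():
            seconds = (size_mb * 8) / (rate_mbps * EFFICIENCY)  # MB -> Mbit, then divide
            print(f"{file_name} over {link_name}: ~{seconds:,.0f} s")

By that crude math, even plain 802.11b moves an MP3 in a handful of seconds; it's only the really big transfers where the jump to 54 would be felt.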