Thu Jan 04, 2007 9:56 am
Since knowledge might impart some civility and understanding:
Written only with the intention of helping to reduce the proliferation of misinformation, as best I know it. The standards evolved out of order, but here goes:
First came spread spectrum, which could hop across about 80 small channels. They called it FHSS (Frequency Hopping Spread Spectrum), with bit rates around 1Mbps, maybe higher.
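For anyone who wants to see the hopping idea in code, here is a toy Python sketch. The real 802.11 FHSS hop patterns were predefined tables in the standard, not a random generator, and the channel count here is only roughly right; this just shows the concept of both ends hopping in lock-step:

```python
import random

CHANNELS = 79  # roughly the number of small hop channels in the 2.4GHz band

def hop_sequence(seed, hops):
    """Both ends of the link derive the same pseudo-random channel list
    from a shared seed, so they hop in lock-step and stay connected."""
    rng = random.Random(seed)
    return [rng.randrange(CHANNELS) for _ in range(hops)]

print(hop_sequence(seed=42, hops=10))  # ten channel numbers, identical at both ends
```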
Next came 802.11b. This put the signal on the air using a 20MHz-wide channel, using CCK (Complementary Code Keying) to code data symbols into the DSSS (Direct Sequence Spread Spectrum) signal: one big, wide, 'spread out' signal. Different and higher data rates were achieved by 'coding' more data into the 'symbols', up to 11Mbps.
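To make 'coding data into symbols' concrete, here is a toy Python sketch of the DSSS spreading used at the low 802.11 rates, where each data bit is multiplied by an 11-chip Barker sequence (CCK, used for 5.5 and 11Mbps, is a fancier cousin of this same idea):

```python
# The 11-chip Barker sequence used by 802.11 DSSS at 1 and 2 Mbps
BARKER_11 = [+1, +1, +1, -1, -1, -1, +1, -1, -1, +1, -1]

def spread(bits):
    """Spread each data bit across 11 chips: a 1 sends the Barker
    sequence as-is, a 0 sends it inverted. The receiver correlates
    against the same sequence to pull each bit back out of the noise."""
    chips = []
    for b in bits:
        sign = 1 if b else -1
        chips.extend(sign * c for c in BARKER_11)
    return chips

print(spread([1, 0]))  # 22 chips for 2 data bits
```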
Then they figured out it didn't always work, or that it needed help with reflections (multipath) caused by obstructions and clutter, and marketing wanted to say it was Non-Line-of-Sight. The engineers finally got mad and told marketing: no, it's 'Near' Line-of-Sight, since it will go through a wall you can't see through, etc.
An innovative engineer figured out that the original FHSS idea of lots of little channels had its merits. So they invented OFDM: Orthogonal Frequency Division Multiplexing. This concept uses lots of small channels, each carrying a smaller but easier-to-decode amount of data; the receiver then has to put all the small channels back together to deliver the higher data rate. Simplistically, consider a piano. If you lay your arm across the keys, the ear can actually hear all the notes at once. The OFDM receiver can hear all the channels too, but it needs to put them back together in the right order, because some of them bounce and take a little longer to get there. So the OFDM processor must be fast enough to reassemble them all before the 'next' set of data starts coming in.

They couldn't put this standard on the 2.4GHz band, because they would never be able to do enough testing to make it work there (the simple explanation). So they decided to put it on 5GHz and call it 802.11a. This standard could deliver 54Mbps raw throughput in the same channel size, and it had the enhanced characteristic of actually using the bouncing and reflected signals to its advantage; it turned out to deliver reliable signals even through cluttered paths, therefore it was NLOS!
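For the curious, here is a minimal numpy sketch of the OFDM transmit side: one data symbol is placed on each small channel (subcarrier), and a single inverse FFT turns them all into one combined time-domain signal. The subcarrier counts are 802.11a-like, but the mapping is simplified; real OFDM adds pilots, coding, and so on:

```python
import numpy as np

N_FFT = 64    # total subcarrier slots in a 20MHz 802.11a channel
N_DATA = 48   # subcarriers that actually carry data symbols
CP_LEN = 16   # cyclic prefix: guard time that absorbs late-arriving echoes

def ofdm_symbol(qpsk_symbols):
    """Place one QPSK symbol on each data subcarrier, IFFT them into a
    single time-domain waveform, and prepend the cyclic prefix."""
    assert len(qpsk_symbols) == N_DATA
    freq = np.zeros(N_FFT, dtype=complex)
    freq[1:N_DATA + 1] = qpsk_symbols              # simplified subcarrier mapping
    time = np.fft.ifft(freq)
    return np.concatenate([time[-CP_LEN:], time])  # cyclic prefix + symbol

bits = np.random.randint(0, 2, size=2 * N_DATA)
qpsk = (1 - 2.0 * bits[0::2]) + 1j * (1 - 2.0 * bits[1::2])  # 2 bits -> 1 symbol
print(len(ofdm_symbol(qpsk / np.sqrt(2))))  # 80 samples per OFDM symbol
```

The cyclic prefix is the part that tolerates the piano-note echoes: as long as a reflection arrives within that guard time, the receiver can still sort the subcarriers back out.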
Well, once they had 802.11a OFDM worked out, using the same bandwidth, all they had to do was prove that it didn't trash the band it was operating in, or appear as a significantly louder signal. They measured the average power and other things, and worked out the specs to allow the OFDM to operate in the 2.4GHz band. Then they had to give it a standard number: 802.11g. So the G signal is identical to the A signal.
Once they got all that down, it was somewhat a combination of trickery, engineering, and marketing that gives us the rest of the story. Bonding had been around for a long time before wireless, so put two channels together and you might get twice the throughput. However, the wider that signal, the more subject it is to every negative effect that can happen to a digital signal. Now if it's Super A or G, it has to put back together twice the number of small channels, and the chance that you could lose some of the channels is probably more than 'twice'.
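A toy back-of-the-envelope in Python, assuming (unrealistically) that each subcarrier fades independently with the same small probability, shows how quickly the odds of losing at least one subcarrier climb as the channel gets wider:

```python
def p_any_lost(p_sub, n_subcarriers):
    """Probability that at least one of n independent subcarriers is lost."""
    return 1 - (1 - p_sub) ** n_subcarriers

P = 0.01  # assumed per-subcarrier loss probability (purely illustrative)
for n in (48, 96):  # one 802.11a/g channel vs. two bonded channels
    print(f"{n} subcarriers: {p_any_lost(P, n):.1%} chance of losing at least one")
# 48 subcarriers: 38.3% chance of losing at least one
# 96 subcarriers: 61.9% chance of losing at least one
```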
Other engineers figured: why not put more transmitters (in the SAME band) and/or antennas running in sync with each other to gain more throughput? This idea has merit, but the benefit, as it should be, is more reliability, and that reliability is what really gets you the extra throughput.
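To see why extra radio chains buy reliability first, here is a small numpy sketch of generic two-antenna receive diversity (maximal-ratio combining), not any particular vendor's MIMO: two independently faded, noisy copies of the same symbol are combined, and the combined decision is wrong far less often than either antenna alone:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
symbols = rng.choice([-1.0, +1.0], size=N)        # BPSK data
h = rng.rayleigh(scale=0.7, size=(2, N))          # independent fading per antenna
noise = 0.5 * rng.standard_normal((2, N))
rx = h * symbols + noise                          # what each antenna hears

single = np.sign(rx[0])                           # decide from one antenna only
combined = np.sign((h * rx).sum(axis=0))          # maximal-ratio combining

print("1 antenna error rate:", np.mean(single != symbols))
print("2 antennas combined:", np.mean(combined != symbols))
```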
Further proprietary variations and enhancements to receiver and antenna technology, without affecting the 'standard', also yield better reliability and ultimately deliver more data. MIMO, XR, beam forming, XSPAN, Adaptive Radio, and other magic are all good things.
Finally, the marketing guys figured out that if the engineers developed stuff that worked better, and they did their own jobs getting the word out, they wouldn't have to play numbers games to sell more product, because the equipment would actually sell itself if it did what the specifications said.
One more thing the engineers could do without much problem is make this essentially client-driven, indoor-LAN-based technology more service-provider driven. 802.11 is a 'CLIENT DRIVEN' protocol, and to all the service providers out there: why would you want your 'clients' dictating how you, 'the provider', deliver 'ACCESS' to your 'SERVICE'? So some engineers in some companies created software-driven methods for the 'service provider access point' to exert control over when the client's radio, or the far end of a backhaul circuit, transmits and receives, and maybe even provide a method to 'tune' parameters to improve reliability. If you have read this far, I appreciate it, and this is why the 'Nstreme' technology doesn't disconnect and reconnect (I am not an expert at this): it is a result of putting the transmission control back where it belongs. This is not the only way to accomplish it, but it sure helps when attempting to operate a reliable service, and I am just glad to see that someone figured out how to overcome the inherent limitations of the upside-down protocol rules.
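Nstreme itself is proprietary and I won't pretend to know its internals, but the general idea of AP-controlled polling, as opposed to 802.11's everyone-contends approach, can be sketched in a few lines of Python (all the names here are made up for illustration):

```python
from collections import deque

class PollingAP:
    """Toy access point that owns the airtime: clients never transmit
    on their own, they wait until the AP explicitly polls them."""

    def __init__(self, clients):
        self.schedule = deque(clients)   # round-robin polling order

    def run_cycle(self):
        client = self.schedule[0]
        self.schedule.rotate(-1)
        self.send_poll(client)           # grant: "your turn to transmit"
        return self.receive_from(client)

    def send_poll(self, client):
        print(f"AP -> {client}: transmit now")

    def receive_from(self, client):
        return [f"{client}-frame"]       # stand-in for the client's queued data

ap = PollingAP(["cpe-1", "cpe-2", "cpe-3"])
for _ in range(3):
    ap.run_cycle()
```

Because the AP decides who talks and when, two distant clients that cannot hear each other can never collide, which is exactly the hidden-node problem that plagues contention-based outdoor links.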
So, now here comes WiMax. That's just got to be another thread, because it's 802.16! BTW, WiMax uses OFDM. Guess who controls the transmission in that protocol, and guess what market segment and application requirement it is designed for?
BTW, I am new to using Mikrotik and to this forum. What I have seen in this thread is a very enthusiastic group of wireless advocates all arguing intensely toward the same end: putting more viable and reliable networks out there that don't break us to operate. I also operate other technologies that are virtually plug and play, but the opportunity to customize, integrate, control, and tune to your own liking, and for that matter create 244km links from mountains going over water, is a lot more fun than pluggin' and playin'.
As far as the original question, there is no right answer. My two cents, for an answer that would give you 'A/B/G' in a way no one actually suggested: use three radio cards and three feedlines, with two omni antennas and one physically isolated (as much as possible) 5GHz directional antenna with as much front-to-back, side-to-side, and cross-polarization rejection as possible. Make the isolated 5GHz antenna horizontal polarization and operate it as low in the band as possible, with the 'A' access point as far up the band as possible. That's all I can come up with for what could be an integrated ABG access point with backhaul......
That's my story, and I am sticking to it. The verbosity was the only way to completely respond to this wild thread.
Good propagation to everyone...73s