Not in my case! With 'illegal' 40 MHz bandwidth forced on and streaming a Slingbox wirelessly over 2.4 GHz for 6+ hours, only 1 dropped packet disproves this!
Not "illegal" per the FCC (the FCC makes the regulations for the US, not the IEEE or the WiFi Alliance).
A dropped packet is an IP-layer event. Lots of things can cause that, most often the internet, not WiFi. As said above, the measure of WiFi is an 802.11 analyzer that tallies up MAC ACK timeouts with retransmissions, plus uncorrectable frames (FER = frame error rate). FER is well hidden from consumers.
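A minimal sketch of what such an analyzer tallies up (the counter names and numbers are hypothetical, just to show how FER and the retry rate are derived):

```python
# Hypothetical counters, as an 802.11 analyzer might report them.
frames_sent = 10_000        # data frames transmitted
retransmissions = 1_400     # frames resent after a MAC ACK timeout
uncorrectable = 250         # frames the receiver could not decode

fer = uncorrectable / frames_sent           # frame error rate
retry_rate = retransmissions / frames_sent  # retry ratio

print(f"FER: {fer:.1%}, retry rate: {retry_rate:.1%}")
```

None of this is visible in a consumer UI; it takes capture hardware that sees the raw 802.11 MAC traffic.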
Good thing, too, because crummy hardware abounds that transmits distorted waveforms (bad FER as the signal leaves the transmitter, before the impairments of propagating some distance). Some (DD-WRT) users try to coerce the hardware to higher power than it was designed for at a given modulation rate (bit rate). The higher the rate, the lower the RMS power in the channel. That's the laws of physics plus the economics of the final amplifier in WiFi products, which must be very, very linear at the highest OFDM bit rates. This is called the "OFDM back-off", meaning the transmitter's RMS power has to be reduced to avoid distortion at the higher bit rates. The back-off is around 5 dB (6 dB would be quarter power). Avoiding that back-off is costly in hardware in a competitive market, so usually it's not done. The lowest rates in 802.11 are not OFDM: no back-off, higher power specs.
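The dB arithmetic behind those back-off figures is easy to check (a quick sketch, nothing product-specific):

```python
# Convert a dB change to a linear power ratio: ratio = 10 ** (dB / 10).
def db_to_power_ratio(db: float) -> float:
    return 10 ** (db / 10)

# A ~5 dB OFDM back-off cuts RMS power to roughly a third;
# 6 dB is almost exactly quarter power, as noted above.
print(round(db_to_power_ratio(-5), 3))  # 0.316
print(round(db_to_power_ratio(-6), 3))  # 0.251
```

So a radio advertised at, say, 500 mW for its lowest legacy rates may legitimately be putting out well under 200 mW at the top OFDM rates.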
For those that can't afford the CWNA course, you can also read http://www.amazon.com/Certified-Wire.../dp/0470438908, which will help you understand a few things like 802.11 frame retransmits and frame corruption. It is a 700+ page book though, so I wouldn't consider this "light" reading.
Understanding how wireless works at layers 1 and 2 will help you understand why it is behaving the way it is, and why bullying your neighbors by forcing 40 MHz on 2.4 GHz isn't going to help anyone.
So what if it says a link rate of 300 Mbps? You will get so many frame errors that the effective rate will plummet (hence my suggestion to do an actual iperf/jperf benchmark).
That's internet speed, so that speedtest is useless. 75 Mbps is roughly half of what you can see on a good 300 Mbps WiFi connection on the LAN.
So if all you do is test the internet, you will probably never notice issues. Your wireless could be dropping/retransmitting frames like crazy and you probably wouldn't notice it, because your internet is considerably slower than your wireless.
Test with jperf between two PCs, one connected by cable and one connected via 300 Mbps wireless. If you see 130+ Mbps stable throughput, I would consider that good. If you get less than 100, you know your wireless has issues.
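That rule of thumb can be written down as a tiny helper (the thresholds are just the ones suggested in this post, not anything from a standard):

```python
def wifi_verdict(measured_mbps: float) -> str:
    """Judge a LAN iperf/jperf result for a 300 Mbps (link rate) connection,
    using the rough thresholds from the post above."""
    if measured_mbps >= 130:
        return "good"
    if measured_mbps < 100:
        return "likely wireless issues"
    return "borderline"

print(wifi_verdict(145))  # good
print(wifi_verdict(80))   # likely wireless issues
```

The point is simply that real TCP throughput on a clean link runs at roughly 40-50% of the advertised PHY rate; much less than that suggests retransmissions are eating the airtime.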
And to find out what exactly is causing the issues would require a lot of effort, but it would most likely (80%+) be related to your forced 40 MHz plus outside interference.
There is a reason why it is required to fall back to 20 MHz if you want to actually be 802.11 compliant.
40 MHz on 2.4 GHz will basically only work well if you are inside a shielded bunker or in a very rural area where no other wireless exists.
I tried the 40 MHz only mode for 2.4 GHz on my RT-N66U with the latest stock firmware. I'm in an apartment building, so interference is plentiful. At 40 MHz, the transfer rates are extremely inconsistent, plummeting to as low as 5 Mbps in my testing. It does seem easier for my Logitech Revue to stream HD video at 40 MHz on 2.4 GHz. At 20 MHz I sometimes get a forced "low bandwidth" switch to SD, despite my 30 Mbps down, 5 Mbps up cable connection. I never see that at 40 MHz. Browsing is where the inconsistencies are most pronounced: there's either a little stutter when the page is loading, or it stalls completely on my tablet.
It makes sense to use 20 MHz in my situation (or 5 GHz where I can). Using 40 MHz on 2.4 GHz gives you more bandwidth, but opens you up to that much more interference precisely because you're using more spectrum. In an apartment building saturated with AT&T and TWC 2.4 GHz routers, not to mention other appliances, it's better to get as much stuff as you can onto 5 GHz. I was shocked to see that my 5 GHz network was the only one inSSIDer detected around me.
Last thought: besides the conspiracy to eliminate 40 MHz on the 2.4 GHz frequency, what happened to external antennas on routers? I'm sure that antenna technology has progressed some, but they've become a rarity. Did OEMs mostly do away with them simply because the range wasn't needed or did antenna technology improve that much? It sort of makes sense to lessen the range of a router as a way to limit interference (once newer routers are adopted).
Last thought: besides the conspiracy to eliminate 40 MHz on the 2.4 GHz frequency,
To repeat information presented earlier in the thread: 40 MHz operation in 2.4 GHz with the coexistence mechanism is an 802.11-2012 spec requirement.
Originally Posted by small_law
what happened to external antennas on routers? I'm sure that antenna technology has progressed some, but they've become a rarity. Did OEMs mostly do away with them simply because the range wasn't needed or did antenna technology improve that much? It sort of makes sense to lessen the range of a router as a way to limit interference (once newer routers are adopted).
Three reasons: Cost, aesthetics and performance.
Dual-band, three-stream routers require three antennas per radio. Single-band antennas perform better than dual-band ones, so that would mean six antennas. That would have very low WAF (wife acceptance factor).
Internal antennas are less costly than external, especially when they can be integrated on the main board.
Finally, there is no performance tradeoff. Antenna technology has indeed progressed, and the plastic case is transparent to the RF. The main disadvantage is losing the ability to play with independent antenna positioning. This has become less important with 802.11n, where signal multipath is actually a good thing (it improves receiver gain).
Antenna technology has indeed progressed, and the plastic case is transparent to the RF. The main disadvantage is losing the ability to play with independent antenna positioning. This has become less important with 802.11n, where signal multipath is actually a good thing (it improves receiver gain).
The jargon: "beneficial multipath" versus "destructive multipath". For many years the latter has been converted into the former using (way back) "adaptive equalizers", which essentially create delay windows in time to recombine the later-arriving signal power with the earlier. The later copies arrive late due to the longer path that multipath takes. Ye olde analog TV ghost picture, from a mountain, an airplane, etc., is classic multipath.
DOCSIS cable modems have to deal with multipath due to reflections in imperfectly terminated coax and in-line amplifiers. So they use adaptive equalizers, where "adaptive" means they can change with varying conditions. These are non-OFDM ("single carrier") signals.
In OFDM signals, as in popular 802.11, the same concepts apply for combining signals after making them essentially time-coincident despite the differing delays (multiple delay paths: tree leaves, an icy road, some building materials, etc.). MIMO also adds space/time diversity, the subject of many a graduate thesis, and the praise goes to the likes of the NSA from decades back for inventing it. It was used in the '60s for tropospheric-scatter, over-the-horizon communications.
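As a toy illustration of the equalizer idea described above (not a real adaptive equalizer: here the echo's delay and gain are assumed known, whereas real hardware has to estimate them continuously):

```python
# Toy channel: the receiver sees the signal plus an echo that arrives
# `delay` samples late, attenuated by `gain` (classic multipath).
delay, gain = 2, 0.5
signal = [0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0]
received = [s + (gain * signal[i - delay] if i >= delay else 0.0)
            for i, s in enumerate(signal)]

# Zero-forcing-style echo canceller: subtract the echo of the symbols
# already recovered, undoing the smearing sample by sample.
equalized = []
for i, r in enumerate(received):
    echo = gain * equalized[i - delay] if i >= delay else 0.0
    equalized.append(r - echo)

print(equalized == signal)  # True: the original pulse train is recovered
```

The "adaptive" part in real equalizers is learning `delay` and `gain` (the channel taps) on the fly as conditions change, which is exactly what cheap, fast DSPs made practical.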
The enabler wasn't antennas per se, but low cost high speed digital signal processors.
It's amazing that today's economies have taken this to, what, 25% of $125 manufacturing cost?
I thought 20/40 coexistence only kicked in if legacy a/b/g networks (20 MHz only) encroached into the channel range, and not other 20/40 N devices? Is that a wrong assumption?
That's incorrect. Any network that meets the coexistence interference criteria should prevent 40 MHz mode operation.
Important to note that networks sitting exactly on the primary or secondary channel do not trigger coexistence fallback. So if you have an AP in Auto 20/40 mode with its primary channel on 1 and secondary on 5, and a neighboring network is operating on either of those channels, 40 MHz operation will still be allowed.
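In code terms, the behavior described above looks something like this (a deliberately simplified model of the rule; the real 802.11-2012 procedure scans for overlap across a frequency range, not just channel equality):

```python
def allow_40mhz(primary: int, secondary: int, neighbor_channels: list[int]) -> bool:
    """40 MHz stays allowed only if every detected 2.4 GHz neighbor sits
    exactly on our primary or secondary channel; anything else should
    trigger the fallback to 20 MHz."""
    return all(ch in (primary, secondary) for ch in neighbor_channels)

print(allow_40mhz(1, 5, [1, 5]))  # True: aligned neighbors, no fallback
print(allow_40mhz(1, 5, [6]))     # False: an off-channel network forces 20 MHz
```

This also shows why "40 MHz only" switches are problematic: they skip the fallback branch entirely, regardless of what the scan finds.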
Originally Posted by ggbris
Also, if a 40 MHz Only mode exists in the router, would that fail WiFi Alliance certification, or is that an IEEE standard requirement only?
An AP can have that mode for marketing reasons. But if the AP does not obey 20/40 coexistence, i.e. properly fall back to 20 MHz mode, it is in violation of both the 802.11 standard and WiFi Alliance certification requirements.