Wi-Fi has become so ubiquitous over the past decade and a half that we talk about it – and complain about it – like it’s part of the weather. Be honest, average user – the first thing you think when your connection starts acting up is “damn it, what’s wrong with the Wi-Fi now?”
But the degree to which Wi-Fi is likely to be the limiting factor for any given connection is shrinking. Wi-Fi has evolved quickly over the past few years, so much so that it can seem like wireless is outstripping wired networks in terms of raw capability.
So is Wi-Fi finally "fast enough"? The answer, unsurprisingly, is pretty complicated.
Maximum data rates for Wi-Fi, outlined in the IEEE’s 802.11 standards, have long outstripped the average American’s home internet connection. Data collected for Akamai’s State of the Internet reports show that the national average broadband connection provided about 3.7Mbps of throughput in 2007 – well below the theoretical maximum of 54Mbps offered by even 802.11g, a version of the standard first published in 2003. (The latest figure for average U.S. broadband connection speed is 15.3Mbps, as of the first quarter of 2016.)
In practice, however, production hardware rarely comes close to the theoretical maximum throughput. Multiple clients, interference, and a host of other issues mean that the actual rate at which wireless access points can move information around is far below the theoretical limit.
Greg Ferro, a well-known networking analyst who blogs at Ethereal Mind, told Network World that the concept of “speed,” as applied to wireless networks, is a lot more complicated than it is for wired ones.
“[It] isn't just the rate at which data moves from handset to [access point],” he said. “It’s also about duty cycles and availability because the wireless spectrum is shared. Faster data rates means that frequencies are less used over time and thus more devices are able to use a given base station.”
Not just speeds and feeds
What this also means is that higher data rates can help address modern Wi-Fi’s biggest problem – density.
Joel Coehoorn, director of IT at York College in Nebraska, says there are three key problems that make rated maximum Wi-Fi throughput figures largely useless. First, the stated maximum speeds are frequently the product of unusual configurations that would be unsuitable for real-world use – and most client devices aren’t set up to handle them anyway.
Second, many people don’t realize that an access point has to support the lowest data rate client on its network, he noted. If, for example, a router capable of 150Mbps is supporting four clients, all of which are trying to download a 100MB file, the client that’s only capable of 24Mbps will limit the speeds of the other clients by using its share of the airtime to download less data.
“This one download which could have taken only about five seconds worth of air time on the channel now used six times as much,” he says.
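The arithmetic behind Coehoorn’s example is straightforward. As a back-of-envelope sketch (the rates and file size come from his example; real Wi-Fi adds protocol overhead, so treat these as rough figures, not measurements):

```python
# Rough airtime arithmetic for the scenario above: a 100MB download on a
# 150Mbps router vs. the same download on a 24Mbps legacy client.

FILE_MBITS = 100 * 8   # 100MB file expressed in megabits

FAST_RATE_MBPS = 150   # the router's rated speed
SLOW_RATE_MBPS = 24    # the legacy client's rate

fast_airtime = FILE_MBITS / FAST_RATE_MBPS  # ~5.3 s of channel time
slow_airtime = FILE_MBITS / SLOW_RATE_MBPS  # ~33.3 s of channel time

print(f"fast client: {fast_airtime:.1f} s of airtime")
print(f"slow client: {slow_airtime:.1f} s of airtime")
print(f"ratio: {slow_airtime / fast_airtime:.2f}x")  # 6.25x: "six times as much"
```

Because the channel is shared, every second the slow client occupies is a second no other client can transmit, which is how one legacy device drags down everyone else.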
Finally, of course, multiple clients will consume their own slices of bandwidth, dividing the possible throughput of a given connection by the number of clients on an AP – a problem that’s mitigated with wired connections, which are generally full-duplex.
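In the simplest case, that third point reduces to plain division: with airtime split evenly, each client sees roughly the AP’s usable throughput over the client count. A minimal sketch, reusing the 150Mbps figure from the example above (real per-client shares vary with overhead and per-client data rates):

```python
def per_client_share(ap_throughput_mbps: float, n_clients: int) -> float:
    """Naive per-client throughput, assuming the AP's airtime is split evenly."""
    return ap_throughput_mbps / n_clients

# Four clients on the 150Mbps router from the example above:
print(per_client_share(150, 4))  # 37.5 Mbps each, before any protocol overhead
```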
For Coehoorn, while technological improvements have made Wi-Fi considerably faster and better able to handle the demands of its ubiquity, it’s still going to be a bottleneck in certain situations.
“Older clients and fundamental limitations will continue to mean wired networks outperform wireless, even when the wireless has a higher raw data rate,” he says.
More Wi-Fi innovations to come
Still, even if the latest generation of Wi-Fi devices hasn’t become the worry-free wonder-hardware that the more enthusiastic marketing efforts suggest, that isn’t to say that great strides haven’t been made. The beamforming and multi-user capabilities in 802.11ac Wave 2 mean the technology could remain the standard for quite a long time. (Wave 2 gear boasts a theoretical maximum throughput of up to 3.47Gbps, compared to Wave 1’s 1.3Gbps, thanks mostly to the aforementioned beamforming technology and the use of wider channels.)
Craig Mathias, a wireless expert and Network World contributor, says that fully optimized wireless implementations that use 802.11ac Wave 2 gear should be able to handle most workloads for the next five years or so. However, he cautioned, demand can be difficult to predict.
“[A]dding new [access points], denser deployments, and more efficient management and analytics will be on the shopping lists of most network managers,” Mathias says. “We're not to set-and-forget just yet.”
Mike Leibovitz, director of the office of the CTO at Wi-Fi vendor Extreme Networks, concurs, saying that the appetite among CIOs for better, faster networks is pushing demand higher.
“Certainly the customers I talk to, and others in the industry, people are always interested in putting more cars on the highway and seeing how fast they can go,” he says. “It seems today that most conversations, definitely with higher people in organizations … they’re certainly more focused on the experience, and what they can do on top of the infrastructure.”