How much bandwidth do we need in day-to-day life? Do we need enough to stream 4K video at 60 frames per second while driving down the highway? How quickly do we need interactions to round-trip from our phones to provide a real-time feel and interact with new devices—like self-driving cars? In a world where billions have little access to high-speed data, why would a gigabits-per-second standard even matter?
Those are the questions that we should ask as cellular data networks continue to mature. Apple devoted a “Stan Sigman of Cingular at the iPhone introduction” level of time and attention to 5G at its recent iPhone 12 introduction, and many of us in the industry are still puzzling over why. Apple doesn’t usually parrot marketing points or let speakers from other companies drone on about things that aren’t immediately useful to Apple or its customers.
Fifth-generation (5G) cellular networks have already achieved a reasonable level of rollout across the US and a few other countries, and many more countries are aggressively pushing private companies to build out the infrastructure as a national goal. It will eventually allow phones, tablets, fixed devices, and other equipment to transfer data at speeds ranging from hundreds of megabits per second to several gigabits per second. That’s impressive, given that it’s far faster than the vast majority of broadband Internet connections in the developed world.
5G is inevitable, and it would be a simple joke to say that it’s “just one more” than 4G, but to some extent, that’s true. 3G was the first Internet-focused flavor of cell networks, and 4G and 5G built on those principles. But 5G is being marketed as The Next Big Thing that will have some kind of transformative effect on everyday life and business.
Even the current generation of 5G-equipped devices that really have 5G tech—not the “5GE” label that is just fast 4G—have the potential to move data faster and with fewer delays. In practice, though, true 5G is hard to find in the field, where 4G LTE often outpaces 5G with current-generation devices. (Apple’s 5G-enabled iPhones aren’t yet widespread, so we can’t compare their performance; it’s unlikely to be much different.)
However, 5G won’t be transformative for most people or purposes. Its advantages primarily accrue to cellular carriers, even more so than 3G or 4G, which offered significant boosts in throughput and allowed higher rates over broader areas. 5G will let carriers charge more for service in some cases, handle more customers simultaneously, break into new markets that require higher throughput or low latency, and equip more kinds of devices with ubiquitous high-speed cellular data connections.
For users, it will gradually feel like we have broadband no matter where we might be, which is not terribly exciting except when you want to stream a 4K movie in the backseat of a car on a highway or download a 5 GB file in a minute in a coffee shop. The level of excitement should be more akin to finding out your city has silently dug up the streets while you were sleeping, replaced 10-inch water mains with 20-inch ones, and then cleaned it all up without you knowing. 5G is better network plumbing that your “Internet utility” has to install to deal with the amount of data and new data connections it wants to move around a city.
(If you’re concerned about health issues related to 5G, I wrote an extensive article about why the current debate is mostly manufactured. See “Worried about 5G and Cancer? Here’s Why Wireless Networks Pose No Known Health Risk,” 6 December 2019.)
Let’s start with the 5G technology and move into its applications.
Five Gee Whiz
The cellular industry has advanced across five generations of standards, about one generation per decade, starting in the 1980s. The 1G standard was analog and entirely focused on voice, although slow data rates could be crammed through. (I once filed a newspaper column over 1G at 9600 bps.) 2G switched to digital, improved voice quality, and enabled throughput close to that of the 56 Kbps dial-up modems of the 1990s. Next, an interim 2.5G improvement called EDGE, a bridge to 3G, upped data rates to as fast as 200 Kbps in Apple’s first iPhone. (That iPhone avoided 3G because the chips available in mid-2007 drained batteries like the dickens.)
It wasn’t until 3G emerged that we saw glimmerings of modern, high-speed, ubiquitous Internet availability. While 3G came in many flavors, it started at roughly hundreds of Kbps upstream and just over 1 Mbps downstream in the best conditions. Over a few years, improved phone chips and base stations enabled 3G to reach over 7 Mbps downstream. Some versions allowed voice and data to flow simultaneously; others had to pause data while a call was active.
While the future of cellular was still in development as Long Term Evolution (LTE), which would be the underpinning of 4G networks, carriers in the US got antsy. They started labeling their faster 3G networks as “4G,” presaging what’s happening today with 5G. Early “4G” networks were only slightly faster. True 4G LTE boosted speeds into the current tens of Mbps range, although LTE’s specification allowed for up to 1 Gbps for fixed usage and 100 Mbps for mobile purposes. (4G and LTE are sometimes used together, as “4G LTE,” and sometimes LTE is used preferentially to 4G.)
Along with the evolution of these generations came an increase in the number of electromagnetic frequency ranges that cellular carriers could use. Every country slices its spectrum up a little differently, though North America and much of Europe are aligned, as are many adjacent countries in other regions. While you may be accustomed to unlicensed spectrum used for Wi-Fi, cellular carriers must generally purchase licenses—often time-limited leases—for swaths of spectrum at auction or in carefully arranged government deals that in some countries reek of patronage, nepotism, or outright corruption.
As cellular standards advanced, radio-chip manufacturing became more sophisticated, processing power and bandwidth demands from phones grew ever heavier, and spectrum availability became more baroque. Early cell phones, even well into the 3G era, had chips that could handle only a handful of popular bands. Apple made several models of iPhones to cope with worldwide differences. Over just a few years, though, Apple, Samsung, HTC, and others generally gained the ability to produce as few as two worldwide models that could handle dozens of bands. While 3G moved a bit in this direction, 4G was more substantial, and 5G takes the cake. If you want to get a sense of how many different frequency bands are currently used, consult Apple’s 5G and LTE iPhone bands page.
If you read down that list with a gimlet eye, you will note something intriguing: while most frequencies are listed as MHz (megahertz), just a few have GHz (gigahertz) following their names—and only on the newest iPhone models sold in the United States.
That’s because the actual innovation in 5G isn’t in better data rates in spectrum ranges used by 4G and earlier standards. Rather, it’s about millimeter-wave (mmWave) transmissions that work at extraordinarily high rates over very short distances. Let’s dig into that along with what else 5G offers.
Long and Slow or Short and Fast
When trying to increase data throughput in any communications system, wired or wireless, engineers are constrained by the Shannon-Hartley theorem, a proof developed by three brilliant people (Harry Nyquist was the third) and named for two of them. The theorem effectively explains the upper limit of information—in digital communications, the data rate—that a system can carry and how the presence of noise reduces that maximum rate.
There’s always noise, which disorganizes information. Noise is why you might see a Wi-Fi device advertised as carrying a maximum of 3.2 Gbps but measure only 500 Mbps of actual throughput when you copy a large file: with any interference or signal degradation over distance, the maximum data rate quickly drops down. (Wireless networking also has a fair amount of overhead—from 20 to 40 percent of throughput—that’s necessary for managing traffic and preventing competition among devices on the same and nearby networks.)
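The Shannon-Hartley relationship is easy to see with a few lines of Python. This is a sketch with illustrative numbers (the 80 MHz channel width and the SNR figures are assumptions, not measurements from any specific network); it shows how a drop in signal-to-noise ratio slashes the theoretical ceiling on throughput:

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley upper bound: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

def db_to_linear(db: float) -> float:
    """Convert a decibel SNR figure to a linear power ratio."""
    return 10 ** (db / 10)

# An 80 MHz channel with a strong 30 dB SNR (illustrative numbers):
clean = shannon_capacity_bps(80e6, db_to_linear(30))
# The same channel after distance and interference drop SNR to 10 dB:
noisy = shannon_capacity_bps(80e6, db_to_linear(10))

print(f"30 dB SNR: {clean / 1e6:.0f} Mbps ceiling")  # ~797 Mbps
print(f"10 dB SNR: {noisy / 1e6:.0f} Mbps ceiling")  # ~277 Mbps
```

Note that these are ceilings: real-world protocol overhead, as mentioned above, eats a further chunk of whatever the physics allows.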
Throughput = Spectrum x Antennas
There are several methods to improve throughput within the constraints of Shannon-Hartley. One is to add spectrum: expand the frequency ranges to increase the amount of data that can flow. But adding frequencies requires the aforementioned government interaction. Countries are eager to spur innovation and investment, so they have regularly made more spectrum available to gain the ostensible future benefits of 5G.
Another method of improving throughput relies on adding antennas. That might sound like just improving reception or transmission, but for over 15 years, multiple-in, multiple-out (MIMO) radio systems have allowed devices to transmit simultaneous streams of data that a receiver can distinguish. By changing certain wireless characteristics and using different combinations of antennas, cellular and Wi-Fi base stations can even aim signals at specific devices, a technique called beamforming.
MIMO allows frequency reuse in the same space, effectively multiplying throughput. It doesn’t violate Shannon-Hartley because it leverages distinct paths across the same volume of space. Imagine a billiard table on which you send balls caroming around along unique paths. The difference is that as long as wireless signals are on different paths, they pass through each other, unlike billiard balls.
But MIMO has a physical constraint: antennas have to be a particular length that corresponds with the frequency wavelength. The 2.4 GHz wavelength used in Wi-Fi is about 5 inches (12.5 cm), and commonly used antennas are designed to be half a wavelength. You’ve probably seen Wi-Fi routers festooned with antennas—some have 8, 9, 12, or even more external ones! But there’s a practical limit on adding more antennas, even for cellular towers, due to their size and the complexity of attaching them.
The millimeter-wave (mmWave) ranges available for 5G start at 24 GHz, which allows for extremely small antennas that can be packed together tightly. (A half-wavelength antenna at 24 GHz is 0.25 inch or 6.35 mm.) Cellular base stations might be equipped with several dozen antennas linked together into a phased array, which enables precise beamforming across a huge number of combinations of antennas. The industry calls this “massive MIMO.” Many, many more devices can each receive essentially their own full-speed data stream, even in a crowded environment. (A famous Wi-Fi failure in the early 2000s was a phased-array antenna that was so far ahead of its time that, despite successful prototypes, the company couldn’t take it into actual production. But the idea was sound—particularly at mmWave scale.)
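The half-wavelength arithmetic behind those antenna sizes is straightforward: wavelength is the speed of light divided by frequency. This quick sketch (the frequency choices are just the bands discussed above) shows why mmWave antennas can be packed into a phased array while low-band antennas can’t:

```python
SPEED_OF_LIGHT = 299_792_458  # meters per second

def half_wavelength_cm(freq_hz: float) -> float:
    """Half of the wavelength (lambda = c / f), in centimeters."""
    return (SPEED_OF_LIGHT / freq_hz) / 2 * 100

for label, freq in [("700 MHz cellular", 700e6),
                    ("2.4 GHz Wi-Fi", 2.4e9),
                    ("24 GHz mmWave", 24e9)]:
    print(f"{label}: half-wave antenna ~{half_wavelength_cm(freq):.2f} cm")
# 700 MHz needs a ~21 cm element; 24 GHz needs only ~0.62 cm
```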
The downside of mmWave hinges on the relationship between signal power and wavelength. Higher frequencies require more power than lower frequencies to achieve the same range at the same signal-to-noise ratio (the commonly seen SNR measurement). At the same power level, lower frequencies can’t transmit as much information as higher frequencies, but they travel further and penetrate solid objects better.
Range and penetration are the two reasons 2.4 GHz was originally preferred for Wi-Fi: with the original, very narrow Wi-Fi bands, transmissions could pass through objects, walls, and ceilings while maintaining a passable data rate. Wi-Fi in 5 GHz (and soon in 6 GHz in the US) relies on rules that allow for greater power and the capability to use much larger swaths of frequencies.
With mmWave, because the frequencies used are so high (starting at 24 GHz), its estimated range is similar to Wi-Fi’s: about a 500-foot (150-meter) radius. In comparison, cellular frequencies at 2 GHz enjoy a roughly 3-mile radius, and when you drop down to the even lower-frequency 700 MHz range, signals can travel within a 6-mile radius. (In practical terms, cell towers have to overlap to ensure seamless handoff and are placed far more densely than those maximum ranges to handle large numbers of users in dense urban areas.)
There’s one more parameter here, too, that can affect throughput. Network systems encode data through modulation, which (more or less) maps bits into an analog pattern. Quadrature amplitude modulation (QAM) is heavily used for wireless communications. You can think of it as a square containing a pattern of dots spread out across rows and columns, called a constellation. The dots as transmitted should be received exactly at the intersections where rows and columns cross, but QAM is designed to let a receiver nudge dots that don’t line up back into the right place.
Each generation of digital cellular and Wi-Fi technology has increased the size of this constellation, making it possible to cram more data into each time-slice of wireless transmission. Larger constellations require cleaner signals, which typically means that a device has to be relatively close to a transmitter to achieve the higher throughputs.
Conveniently, the high frequencies of mmWave require base stations to be located close together to provide coverage at all. That fits nicely with large QAM constellations requiring clean signals.
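That “nudging dots back into place” can be sketched in a few lines. This toy model of a 16-QAM constellation (16 dots, 4 bits per symbol; real receivers do this in signal-processing hardware, not Python) shows why a clean signal matters: a dot displaced slightly by noise snaps back correctly, but one displaced too far snaps to the wrong grid point, producing an error:

```python
import itertools

# A 16-QAM constellation: a 4x4 grid of dots on the I and Q axes.
LEVELS = (-3, -1, 1, 3)
CONSTELLATION = list(itertools.product(LEVELS, LEVELS))  # 16 (I, Q) pairs

def snap(received):
    """Nudge a noisy symbol back to the nearest constellation point."""
    i, q = received
    return (min(LEVELS, key=lambda p: abs(p - i)),
            min(LEVELS, key=lambda p: abs(p - q)))

# A transmitted dot at (1, -3) arrives slightly displaced by noise...
print(snap((1.4, -2.7)))  # ...and is decoded correctly as (1, -3)
# Too much noise pushes the dot past the midpoint between columns:
print(snap((2.1, -2.7)))  # decoded wrongly as (3, -3): a symbol error
```

A larger constellation packs the dots closer together, so the midpoints arrive sooner and less noise is needed to cause an error; that’s the trade-off behind “larger constellations require cleaner signals.”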
Alongside all of these changes to increase throughput is the potential for 5G to reduce latency, an area where cellular has lagged behind responsive wired and Wi-Fi networks. Latency measures the amount of time it takes for a network transmission to pass from its origin to its destination, no matter how fast it goes. Think of the flow of water to a faucet: the water pressure and pipe width control the throughput—how much water can be delivered in a period of time—while latency measures how long it takes from turning the tap until water comes out.
4G networks have a latency of about 50 milliseconds. 5G should typically be closer to 10 ms, which is similar to modern Wi-Fi and roughly equivalent to the limits of human visual perception—the time between an image appearing and us processing it. However, 5G has the potential to drop even lower, down to 1 ms, which is the same latency that wired Ethernet can achieve.
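For interactive traffic, latency dominates the experience because the payloads are tiny. A rough back-of-the-envelope calculation (the 20 KB payload and 100 Mbps throughput are assumed, illustrative values) makes the point: at the latency figures above, the 4G-to-5G improvement cuts total delivery time by nearly a factor of five even though the pipe is the same width:

```python
def delivery_time_ms(payload_bytes: int, throughput_mbps: float,
                     latency_ms: float) -> float:
    """Time until the last byte arrives: one-way latency + transfer time."""
    transfer_ms = payload_bytes * 8 / (throughput_mbps * 1e6) * 1000
    return latency_ms + transfer_ms

# A small 20 KB interactive update at the same 100 Mbps throughput:
print(f"4G (~50 ms latency): {delivery_time_ms(20_000, 100, 50):.1f} ms")
print(f"5G (~10 ms latency): {delivery_time_ms(20_000, 100, 10):.1f} ms")
# The transfer itself takes only 1.6 ms; latency is almost the whole wait.
```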
For interactive purposes, high latency is a killer: it’s what makes you see or hear a lag when using videoconferencing or VoIP calls, and it prevents things from happening in what feels like a real-time way. That’s critical for gaming, but also for many industrial and business purposes, where the lag has to be as close to zero as possible.
There’s one more trick up cellular’s sleeve. Both 4G and 5G also employ a technique—used earlier in Wi-Fi standards—that breaks a wide swath of frequency set up as a channel into tiny sub-channels, each of which has its own modulation. If there’s interference or a reflection problem in one sub-channel, it doesn’t downgrade the throughput of the entire transmission. It’s like plowing a field and avoiding rocks.
For further reading, I suggest this highly understandable article about 5G at Waveform.
The Purported Potential Uses of 5G
The US is the first country in which 5G will rely on a triad of cellular frequencies: existing ones across a range of bands, new allocations up near the bands currently used for 5 GHz Wi-Fi and soon for 6 GHz Wi-Fi, and mmWave starting at 24 GHz. It’s a grand experiment in delivering broad-scale higher performance in the lower bands and super-fast throughput, where needed, in the much higher bands.
The uses cited for 5G include all the things we do now, though carriers actually don’t mention video streaming all that often. Perhaps 4K-quality video streams just aren’t that compelling, especially given that some carriers already downscale video automatically or require a higher-priced subscription to get higher fidelity than 480p, and more expensive plans top out at 1080p.
Carriers are excited about (and investing in) 5G because they anticipate new money-making opportunities, particularly in industries in which low-latency, high-bandwidth, high-coverage wireless enables new products or services, or allows shifting intelligence from edge devices to central processing.
Some of the most compelling cases are:
- Augmented reality: In recent years, Apple has focused significant attention on AR, which can require a lot of constantly updated data that’s processed centrally and streamed to a device, all while responding to movements in the physical environment.
- Gaming: Gamers have often required wired Ethernet connections in their homes for the best results. 5G will make mobile gaming more responsive.
- Rural access: Every generation of cellular technology promises better coverage for rural residents. Every generation often disappoints them, too, because carriers prefer to deploy service where they can more easily make money. However, 5G’s greater efficiency and variety of frequency options, particularly in some new frequency territory around 5 GHz and 6 GHz, should generally improve rural service.
- Urban/suburban access: In some cases, carriers and other parties might find it feasible to deliver high-speed urban and suburban residential broadband over 5G. It’s more likely to happen outside the US because in this country there’s sufficient inexpensive wired infrastructure (cable, phone wire, and fiber) in more densely populated areas. I pay $85 per month for unlimited gigabit Internet in Seattle; it’s hard to imagine a wireless provider offering even 100 Mbps at that price for residential-scale video and other use in the US. However, in some developed and developing countries, even relatively populated or dense areas lack wired or fiber-optic infrastructure at the level demanded.
- Remote medical procedures: We’ve all become more familiar with telemedicine consultations in the last few months, but with sufficient bandwidth, remote medical procedures are here today. Diagnosis and even robot-assisted surgery can be performed through remote linkages, but setting up a stable, low-latency, high-bandwidth network where a wired, low-latency broadband connection is unavailable, or for facilities that aren’t able to wire Ethernet into existing areas, would open up new possibilities. (That said, would you want a wireless surgeon operating on you? Seems like a hard sell.)
- Autonomous cars: A car can’t rely solely on a 5G network for robotic operations while it’s zooming down the highway, but it could overlay its onboard capabilities with information gathered around and ahead of it to reduce accidents and improve safety.
- Expanded sensor networks: 5G will enable massively scaled sensor networks for monitoring infrastructure. A Deloitte report suggested, “Imagine a scenario where millions of such devices can be connected in a city center, measuring temperature, humidity, air quality, flood levels, pedestrian traffic, and more.” I can imagine plenty of negative uses, too, but after suffering from weeks of bad air in Seattle recently, I can also acknowledge some of the more constructive purposes.
- Industrial robots: Robots used in factories have to be hard-wired for control to keep latency low. Wi-Fi relies on unlicensed frequencies, which makes depending on throughput sometimes iffy, as we’ve all seen. Licensed 5G inside manufacturing facilities could enable wireless robots and make it easy to move them or add new ones without rewiring the factory floor. These private 5G networks would be like Wi-Fi but with higher power, lower latency, and more stability thanks to running over restricted frequencies.
Additional use cases will surely arise as the networks are deployed, but you’re excused if you don’t find the list above compelling. That’s a problem for carriers, who are largely eating the cost of network updates, except Verizon, which is charging customers more for it; see below. It also troubles phone makers who want the engineering effort of adding 5G support to be seen as a major reason to buy the next generation of phones that have only incremental improvements otherwise. Smartphones haven’t reached the end of innovation in their features, but the camera, display, and processing improvements make less of a difference with each release.
In short, although 5G is inevitable and may become an important aspect of society’s networking infrastructure, there’s no reason for most people to upgrade to get it right now.
Carriers Plan Their Plans
When it comes to 5G rollouts, cellular carriers face a lot of competing problems and employ different marketing and pricing approaches, even as they have more or less adopted the same technology. It’s a bit like Coke and Pepsi if Coke only let you buy its sugar water in 12-packs of cans and Pepsi could only be purchased in 2-liter bottles.
For now, we’re seeing the major cellular firms roll out 5G networks in order to claim they have 5G networks in place—they want competitive bragging rights. Only a few limited areas have 1 Gbps or faster mmWave service available for customers. PCMag dug into maps for Verizon’s mmWave service and found it was scarce so far and, as expected, clustered in places that likely also have high-speed free or paid Wi-Fi. AT&T and T-Mobile have not yet announced mmWave plans. Here’s how it shakes out now:
- Verizon says its mmWave “5G Ultra Wideband” (UWB) can be found in 55 cities, while it has regular 5G across swaths of metropolitan areas nationwide. It charges $10 extra on its unlimited plans per line for 5G data rates.
- AT&T seemingly calls its current 4G network “5G,” but says “5G+” (actual 5G) is “available in select innovation zones in over 15 states across the US.” AT&T includes 5G throughput on its “Unlimited Starter, Extra, and Elite plans,” which start at $35 per month and require at least four lines.
- T-Mobile claims it has the biggest deployment, with over 7500 cities and towns having 5G in place, but given that the company also promises that “our network will be 8x faster than current LTE in just a few years, and 15x faster in the next six years,” it’s unclear which part of the network is faster 4G and which is actually next-generation 5G. At least T-Mobile says it won’t charge more for 5G service. (T-Mobile acquired Sprint earlier this year and has developed a 5G plan that coordinates the two brands.)
Verizon’s early mmWave deployments are promising, providing fiber-optic broadband and high-end Wi-Fi speeds in the extremely limited areas they cover, though I will ask again—to what end? I don’t need 1 Gbps while strolling down Newbury Street in Boston. But I can imagine appreciating excellent throughput when we’re once again surrounded by thousands of people in public.
More disappointing, however, is that the “normal” flavor of 5G, the generational upgrade to 4G, appears to be lagging behind 4G LTE performance in some areas where they overlap. That will change, but it seems odd that your fancy new iPhone with 5G capability could see worse performance than 4G in some places.
Are We Ready for 5G?
I hate to be a downer when it comes to improved technology that actually does what it says on the tin. 5G networks will provide substantial improvements in throughput and availability that we will notice—in a year or maybe two. Until then, not so much.
I’d almost rather the entire industry didn’t talk about it for a while, but 5G-involved companies have to talk about something because that’s how marketing works. Advertising that “we keep making things slightly faster” is not a winning campaign, particularly when your competitor is shooting off 5G fireworks.
5G is inevitable, in that all phones and cellular-capable devices will transition to supporting early flavors of it over the next year, including some relatively fast versions that use mmWave. The question is when we’ll see use cases that impact our everyday lives.