Tim Newton of Neul–the company running the radio side of the white-space trials in the U.K.’s Bute and Cambridge–detailed the engineering challenges of the task when he spoke at the Future of Wireless conference in late June.
His thesis (PDF)–that the radio world white-space systems inhabit is very different to that of classic wireless engineering, and much novel thinking is needed–raised some interesting and important points.
The theory behind white-space wireless is simple. In order to avoid interference, TV transmitters on a common frequency are always geographically distant from each other, leaving the channel unused over much of the intervening land. As TV frequencies are particularly useful–combining good in-building penetration, copious bandwidth, decent range and small antennas–the thinking goes that whole new data services can be rolled out in those gaps, known as white space. To do this, the frequencies need to be carefully interleaved with the TV transmitters and low power used to prevent interference.
The difficult part
It’s a powerful concept. The side that sounds difficult–making sure that the white-space systems are properly configured to work alongside the complex interlocking channel allocations of the TV stations–isn’t so hard. White-space systems know their location via GPS and every couple of hours interrogate a central database that tells them what frequencies to use; they can also simply refuse to initialize if they detect a strong TV station on a channel the database says should be clear.
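In outline, that start-up check amounts to something like the sketch below. The channel list, power limits and sensing threshold here are invented for illustration–they are not taken from any real white-space database or regulatory rule:

```python
# Illustrative sketch of a white-space device's start-up decision.
# All channel numbers, power limits and thresholds are made up.

ALLOWED_CHANNELS = {  # channel -> permitted transmit power (dBm) from the database
    21: 20.0,
    24: 17.0,
    27: 20.0,
}

STRONG_TV_THRESHOLD_DBM = -60.0  # sensing threshold: a signal this strong means "TV present"

def may_transmit(channel, measured_power_dbm):
    """Return the permitted power for this channel, or None if the device must stay silent."""
    limit = ALLOWED_CHANNELS.get(channel)
    if limit is None:
        return None  # database says this channel is in local use
    if measured_power_dbm > STRONG_TV_THRESHOLD_DBM:
        return None  # strong TV signal where none is expected: refuse to initialize
    return limit

print(may_transmit(21, -95.0))  # → 20.0 (channel clear, transmit at the database limit)
print(may_transmit(21, -40.0))  # → None (unexpected strong station, stay off the air)
```

The double check–database first, sensing second–is the belt-and-braces part: even a stale or wrong database entry shouldn't put the device on top of a live TV station.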
The main problem with white space is that despite the maps, it’s not white.
In classical radio theory, a receiver tuned to an empty channel picks up random noise that’s primarily generated by thermal effects within the receiver, the antenna and the surroundings. Everything that’s above absolute zero creates some electromagnetic noise by the random movement of electrons, and it’s this very well modeled and understood effect that puts limits on the lowest strength signal you can pick up.
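As a back-of-the-envelope illustration of that limit (textbook values, nothing specific to Neul's kit): an ideal receiver's thermal noise floor is kTB, and for one 8 MHz U.K. TV channel at the standard reference temperature that works out to roughly -105 dBm:

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K
T = 290.0          # standard reference temperature, K
B = 8e6            # one U.K. TV channel is 8 MHz wide

noise_watts = k * T * B                         # thermal noise power kTB
noise_dbm = 10 * math.log10(noise_watts / 1e-3) # convert watts to dBm
print(round(noise_dbm, 1))  # → -104.9
```

That -105 dBm figure is the floor an ideal receiver would see in a genuinely empty channel; the point of the following paragraphs is that in the TV bands the real floor sits well above it.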
This isn’t true for the TV bands in which white-space systems play. Although the transmitter coverage maps for TV stations seem to show large areas of the U.K. blank for each channel, this is only true if you’re trying to pick up enough signal to create a watchable picture. Detectable signals turn up much further afield than the maps suggest.
For a start, the TV transmitters themselves are very powerful. Each digital multiplex can be up to 250 kW, and there will be six multiplexes per site, with plans for a further two. That’s two megawatts from the strongest transmitters. Furthermore, the antennas are perched on top of very tall masts, giving them line of sight to horizons that can be 100 km away. Even beyond those horizons, the transmissions refract over the edge of the Earth and can be detected well beyond that limit.
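To see why horizons of that order are plausible, the standard rule of thumb for the radio horizon (assuming 4/3-earth refraction, with antenna heights in metres and distance in kilometres) is d ≈ 4.12(√h_tx + √h_rx). This is a sketch with illustrative mast heights, not a propagation model:

```python
import math

def radio_horizon_km(h_tx_m, h_rx_m=0.0):
    """Radio horizon with standard 4/3-earth refraction: d ≈ 4.12(√h_tx + √h_rx) km."""
    return 4.12 * (math.sqrt(h_tx_m) + math.sqrt(h_rx_m))

# A 300 m mast alone sees about 70 km; a receive antenna at 10 m stretches that further.
print(round(radio_horizon_km(300.0), 1))        # → 71.4
print(round(radio_horizon_km(300.0, 10.0), 1))  # → 84.4
```

Add a mast on elevated terrain and the diffraction beyond the geometric horizon, and 100 km-plus detection ranges stop looking surprising.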
The result is that for almost every part of the U.K., in almost every channel, there are detectable signals from distant transmitters that the white-space system must overcome. And adding to the fun, the local TV transmissions are very strong.
Mixed signals
One of the hardest parts of radio design is preventing very strong signals from mixing with each other and creating internal interference in the initial stages of the receiver. One of the normal tricks for avoiding this is to make the receiver slightly deaf, perhaps increasing the strength of the transmitter it’s picking up to compensate, but neither is an option for white-space radio. Thus, the receiver design must be excellent and use top-quality components.
That’s rather ironic, as one of the absolute rules for white-space radio is that it mustn’t interfere with TV reception–and TVs tend to have some of the worst radio circuits ever perpetrated on the public. Built to a price and with maximum flexibility taking precedence over high performance, TVs are liable to interference from perfectly well-behaved transmitters. That means white-space systems have to use as low a power as possible, even though they’ll operate on unused local frequencies.
The final source of noise is gadget-based. All digital electronic equipment emits some radio signals, and various standards exist to minimize the interference this causes. However, there’s a lot of it about, as you can easily hear if you hold a radio tuned to a dead channel close to a laptop, network wiring, Wi-Fi router, or other gizmo. Devices such as powerline networks and DSL modems use radio-frequency signals to carry data, and are particularly promiscuous in their emissions.
Again, if you’re a TV station or a mobile phone company, you can in general set your transmission level to overcome the low-level interference from all these factors; it’s your radio channel. White-space systems can’t do that.
Be clever
The answer is to be clever: analyze the noise, find out what’s causing it, and work around it. One technique is called kurtotic analysis. In effect, this looks for the characteristic spikes in energy that fingerprint a transmission that may be too weak to decode but is nevertheless strong enough to be a problem.
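As a toy illustration of the statistics involved (not Neul's actual algorithm): pure Gaussian thermal noise has an excess kurtosis of zero, so any structured transmission hiding in the noise drags the measured kurtosis away from that value even when it's far too weak to decode. Here a sinusoid buried at roughly -3 dB SNR shows up clearly:

```python
import math
import random

random.seed(42)

def excess_kurtosis(samples):
    """Sample excess kurtosis: 0 for Gaussian noise, nonzero when structure hides inside."""
    n = len(samples)
    mean = sum(samples) / n
    m2 = sum((x - mean) ** 2 for x in samples) / n  # second central moment (variance)
    m4 = sum((x - mean) ** 4 for x in samples) / n  # fourth central moment
    return m4 / (m2 ** 2) - 3.0

N = 100_000
noise_only = [random.gauss(0.0, 1.0) for _ in range(N)]
# A sinusoid at about -3 dB SNR: hopeless to decode, but statistically visible.
buried = [random.gauss(0.0, 1.0) + math.sin(0.1 * i) for i in range(N)]

print(round(excess_kurtosis(noise_only), 2))  # close to zero
print(round(excess_kurtosis(buried), 2))      # clearly negative: something is in there
```

The sub-Gaussian dip for the sinusoid (the theoretical value here is about -0.17) is exactly the kind of statistical fingerprint a white-space receiver can use to decide a "blank" channel isn't really blank.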
Digital TV transmissions are very well characterized, and there’s a good chance that in the future it’ll be practical to mathematically cancel them out in the receiver. Currently, it’s too computationally intensive to implement economically, but Moore’s Law may fix that.
It’s a mark of Neul’s confidence in its radio technology that despite this hostile radio environment, it reckons it can cover well over 99 percent of the U.K. population with between 4,000 and 5,000 base stations. A lot depends on the field trials, though: it’s not surprising that the first one was in the radio wastelands of the remote Scottish Isle of Bute. The second, in Cambridge, will be more interesting.
This article originally was posted on ZDNet UK as “The secret life of white-space radio.”