Frequency planning for microwave communications
In this extract from his new microwave handbook, Trevor Manning explains some of the intricacies of planning radio networks.
Most people have experienced a crackle on their radio or TV sound during an electrical storm. The crackle is caused by stray signals from the burst of broadband noise generated by the lightning strike breaking through to the receiver. Telecoms receiver equipment will demodulate any signal within its passband, and any signal that does not originate from the intended source is considered interference.
Interference analysis is a specialist field, but the basic issues should be understood to ensure a quality design as, in digital systems, interference can be a hidden gremlin. People may think their network is interference-free, but in fact, the interference may be present and only become an issue as the link fades. For example, a light rain storm that should have no impact on system performance could result in excessive errors and poor performance due to the latent interference, rather than the rain itself.
In analog radio networks, interference was a primary factor limiting network performance even in unfaded conditions. Any signal degradation had an immediate effect on the system. Digital equipment is far more robust and much less sensitive to interference, so it is usually only in a faded condition that interference is noticed. Many designers therefore only consider threshold (T) to interference (I) conditions, as analysis of the unfaded carrier (C) is not of concern.
Ironically, the robustness of digital systems has often resulted in worse-performing, rather than better-performing networks. The reason for this is that with analog radio any performance issues were noticed at the time of commissioning in unfaded conditions. It was imperative to resolve them as everyone was aware there was an issue. In digital systems, very high levels of interference may be present but until the system fades there is no noticeable degradation in link performance, and so nothing is done to address the issue. Often, network operators are not even aware that they have this hidden performance issue.
The fundamental design issue is that the demodulator requires a minimum signal-to-noise ratio (SNR) to operate error-free. This SNR value varies depending on the modulation scheme. With complex modulation schemes the value may be very high, so even a very low-level unwanted signal may degrade demodulator performance by adding to the noise floor. When the threshold is degraded, the link may not have an adequate effective fade margin (EFM), despite having a good flat fade margin (FFM). The degraded threshold is illustrated in Figure 1.
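As a rough illustration of this effect, the sketch below treats the interferer as additional noise power and shows how it erodes the flat fade margin. All the numbers (noise floor, required SNR, interference and receive levels) are assumed values chosen for illustration, not figures from the text.

```python
import math

def dbm_to_mw(dbm):
    return 10 ** (dbm / 10)

def mw_to_dbm(mw):
    return 10 * math.log10(mw)

# Assumed, illustrative values (not from the article)
noise_floor_dbm = -96.0     # receiver thermal noise floor
required_snr_db = 30.0      # SNR the demodulator needs for error-free operation
interferer_dbm = -92.0      # latent interference level at the receiver input
rx_level_dbm = -40.0        # unfaded receive level

# Threshold with no interference, and the flat fade margin (FFM)
threshold_clean_dbm = noise_floor_dbm + required_snr_db
ffm_db = rx_level_dbm - threshold_clean_dbm

# The interferer adds to the noise floor (powers sum in milliwatts),
# degrading the threshold and hence the effective fade margin (EFM)
floor_dbm = mw_to_dbm(dbm_to_mw(noise_floor_dbm) + dbm_to_mw(interferer_dbm))
threshold_degraded_dbm = floor_dbm + required_snr_db
efm_db = rx_level_dbm - threshold_degraded_dbm

print(f"Threshold degradation: {threshold_degraded_dbm - threshold_clean_dbm:.1f} dB")
print(f"FFM = {ffm_db:.1f} dB, EFM = {efm_db:.1f} dB")
```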
Spectrum allocation
The body that coordinates the international allocation of frequencies to ensure interference-free operation is the International Telecommunication Union (ITU), an agency of the United Nations (UN). Frequency plans are created in the ITU-R F series of recommendations, and any changes to global frequency allocations are decided at the World Radiocommunication Conference, which is held every three to four years.
It is up to the government of each country to appoint a frequency regulator who then uses the ITU standards to regulate spectrum use. Two ITU-approved plans may interfere with one another, which can create cross-border complications. It is also the reason that two plans cannot be used simultaneously within an individual country. It is the job of the regulator to pick one plan per frequency band, which all operators must use in that country.
Two main types of licensing exist: individual licences and general use licences.
With individual licensing, where spectrum is usually controlled and allocated by the country’s telecoms regulator, specific apparatus is registered and approved for a given effective isotropic radiated power (EIRP) output at a given location. Individual licensing is sometimes referred to as apparatus licensing. In apparatus licensing, each end of a radio link is registered, and detailed interference calculations are done to guarantee the link will be interference free. A fee is paid to the regulator for this privilege, but in return, if any interference does occur, the regulator is obliged to resolve it.
With general use licensing the apparatus is licence-exempt when deployed, and the rules for use are covered under regulations for short-range devices. The licence covers the entire class of equipment, so an individual licence is not required for each deployment. Broad technical parameters of the equipment are defined, but the equipment may be deployed in any geographic location. In the case of licence-exempt equipment, the usage of the same spectrum by others is not controlled, which creates an interference risk, and it is the operator’s responsibility to resolve any interference that occurs.
General use or licence-exempt licensing is sometimes called unlicensed operation, although I prefer the term class licence, as used by some regulators: even though the entire class of equipment is approved for generic use in any geography, the equipment is still licensed.
A new variant called light licensing refers to self-coordinating bands, such as E-band, where the user defines the channel to be used and it is registered with the regulator. Work is also being done on more advanced approaches to licensing, such as Dynamic Spectrum Allocation and Licensed Shared Access, under which spectrum would be freed up when not in use to increase spectrum efficiency.
Frequency planning
Spectrum is extremely valuable as it is a scarce resource. Operators can pay hundreds of millions of dollars to secure spectrum and with the massive capacity demands from 4G and 5G cellular networks, more and more fixed microwave link bands are being repurposed for mobile use. Spectrum planning is thus essential.
When interference calculations are done, all analysis is assumed to be co-channel. Adjacent channels are dealt with by converting them to an equivalent co-channel level using the filter roll-off characteristic, called the Net Filter Discrimination (NFD), as shown in Figure 2.
Hence:
C/I (co) = C/I (adj) + NFD
where
C/I (co) = Carrier-to-Interference ratio of the co-channel signal
C/I (adj) = Carrier-to-Interference ratio of the adjacent channel signal
NFD = Net Filter Discrimination
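As a simple worked example of this conversion, the sketch below applies the formula directly; the C/I and NFD figures are assumed values, not taken from the article.

```python
# Converting an adjacent-channel interferer to its co-channel equivalent
# using the Net Filter Discrimination (NFD). Values are illustrative only.
c_over_i_adj_db = 15.0   # C/I measured against the adjacent-channel interferer
nfd_db = 25.0            # NFD from the filter roll-off (Figure 2)

c_over_i_co_db = c_over_i_adj_db + nfd_db   # C/I (co) = C/I (adj) + NFD
print(f"Equivalent co-channel C/I: {c_over_i_co_db:.1f} dB")   # 40.0 dB
```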
As discussed earlier, in digital systems it is the relative level of the faded carrier and the unwanted interferer that is of interest. If the fading event affects the carrier and the interferer equally, the interference is considered correlated, the C/I is unaffected by fading, and the fade margin can be ignored in the calculation.
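The distinction matters because an uncorrelated interferer eats directly into the fade margin, as this small sketch with assumed values shows:

```python
# Correlated vs uncorrelated fading and its effect on C/I (assumed values).
unfaded_c_over_i_db = 40.0   # C/I before the fading event
fade_depth_db = 25.0         # fade depth on the wanted carrier

# Correlated: the interferer fades with the carrier, so C/I is unchanged
c_over_i_correlated_db = unfaded_c_over_i_db

# Uncorrelated: only the carrier fades, so C/I degrades by the full fade depth
c_over_i_uncorrelated_db = unfaded_c_over_i_db - fade_depth_db

print(c_over_i_correlated_db, c_over_i_uncorrelated_db)   # 40.0 15.0
```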
There are three main interference considerations: bucking interference, nodal interference and overshoot interference.
Bucking interference, also called site sense interference, refers to the interference experienced by a receiver that is operating in the same portion of the frequency band as another transmitter at the local site. The rule is that all transmitters at a site should be tuned to either the high or the low half of a particular frequency band. Transmit low sites are called A-ends and transmit high sites are called B-ends, so site sense planning is also sometimes called A-B planning, as illustrated in Figure 3.
All transmitters at an A site transmit low, so it is not possible for another operator’s transmitter at the site to break through the antenna discrimination and duplexer isolation and cause interference. You may be forgiven for thinking that because you are on a different frequency channel in the band, and the interference is coming out of the side of the unwanted transmit antenna into the side of your receive antenna, it will not cause problems. The issue is the dynamic range involved: a transmit signal could be as high as 20–30 dBm, while a faded receive level could be as low as -90 dBm, a dynamic range of 110–120 dB!
The only way to reduce the transmit signal to an acceptable level is to use site sense planning and exploit the filtering inherent in the T-R spacing of the channel plan.
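To put a rough number on the isolation needed, here is the back-of-envelope calculation using the levels quoted above:

```python
# Isolation needed between a local transmitter and a deeply faded receiver
# at the same site, using the levels quoted in the text.
tx_power_dbm = 30.0          # local transmitter output (20-30 dBm)
faded_rx_level_dbm = -90.0   # deeply faded wanted receive level

isolation_needed_db = tx_power_dbm - faded_rx_level_dbm
# Antenna side-lobe discrimination plus the duplexer/T-R spacing filtering
# must provide this isolation, which is why site sense (A-B) planning matters.
print(f"Isolation required: {isolation_needed_db:.0f} dB")   # 120 dB
```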
In the past, the spectrum regulator would specify the site sense, so it was not as essential for the operator to check this, but today in many frequency bands, such as E-band, it is assumed that the operator is aware of this problem and has planned accordingly. Site sense is also something to be aware of in licence-exempt bands, as there is no guarantee that there are no high-low site conflicts.
Site sense is also critical with any ring topology, as you cannot have an odd number of links in a loop. Referring to Figure 3, it can be seen that the two sites on the right could not be connected with a microwave link in the same frequency band, as you would be attempting to connect two B-ends, creating a Hi-Lo clash. An additional site would have to be added so that an even number of links is used to complete the ring. Alternatively, a different frequency band could be used, if allowed, where a different site sense could be allocated to the new frequency band.
The second type of interference, nodal interference, occurs when we reuse the same frequency channel at a node. The practice of reusing channels as many times as possible before moving to a different channel is good frequency planning. Frequency channels are an expensive and scarce resource and so should be preserved as much as possible. By reusing channels in a planned manner, much greater spectrum efficiencies can be gained. Nodal interference couples into the antenna of the adjacent link at the same site, as shown in Figure 4.
Choosing a high-performance antenna with low side lobes and an excellent front-to-back (F/B) ratio enables a planner to reuse a channel multiple times at a node. Bands like E-band, with their very small antenna apertures, enable a significant number of links to be packed into a node. Referring to the earlier discussion on pencil-thin beams, it is often this insight about the efficiency of frequency reuse at a node that leads planners to think, erroneously, that they can therefore reuse the frequency at multiple sites.
In terms of nodal interference, changing polarisation only helps to reduce the interference when the interfering link is directed into the front half of the antenna. The cross-polar discrimination improvement can be determined by referencing the supplier’s Radiation Pattern Envelope (RPE) diagrams.
The final type of interference covered here is overshoot interference. This type of interference occurs at a subsequent site in a radio route, where the receiver is tuned to the transmitter site two links away, as shown in Figure 5. For each site, the transmit high (TX-H), transmit low (TX-L), receive high (RX-H) and receive low (RX-L) assignments are specified.
Despite the interferer being a long way away, distance itself has limited impact on the interference considerations. We have discussed how much discrimination is required between the wanted carrier and the unwanted interferer, and it typically exceeds 40 or 50 dB. Distance helps indirectly through Earth bulge and physical blocking of the line of sight, but from distance alone the interfering signal from a 50 km hop is only reduced by a further 6 dB at 100 km and 12 dB at 200 km.
Considering that most long hops over 30 km would have a fade margin in the order of 40 dB, and that the minimum SNR would probably be around 30 dB, a reduction of 70 dB in signal strength would be required before the interference could be ignored.
Even 13,000 km away, the signal would only have reduced by about 50 dB based on distance alone. This is why a microwave signal can happily reach a geostationary satellite 36,000 km away. It is also why planners should pay more attention to reducing antenna heights and relying on diffraction loss to protect against interference.
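The distance figures quoted above follow directly from the free-space spreading law, and can be checked with a short sketch:

```python
import math

# Extra free-space loss relative to a 50 km hop, from distance alone
# (20*log10(d/d0)); this reproduces the figures quoted in the text.
hop_km = 50.0
for distance_km in (100.0, 200.0, 13_000.0, 36_000.0):
    extra_loss_db = 20 * math.log10(distance_km / hop_km)
    print(f"{distance_km:>8.0f} km: +{extra_loss_db:.1f} dB")
# 100 km: +6.0 dB, 200 km: +12.0 dB, 13,000 km: +48.3 dB, 36,000 km: +57.1 dB
```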
In this case, the interference is into the front of the antenna, and so alternating polarisation is a good strategy.
From the analysis above, it can be seen that an entire radio chain could be designed on a single frequency pair by using the following planning guide:
Channel 1 (H) – Channel 1 (H) – Channel 1 (V) – Channel 1 (V) – etc.
It should be noted that if polarisation alternation is already being exploited for network capacity reasons, for example in a Co-Channel Dual Polarisation (CCDP) mode, then it cannot be used again for interference reduction.
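As a sketch only, with an arbitrary hop count and channel number assumed for illustration, the alternating pattern above can be generated programmatically, which also makes it easy to check that any two links separated by two hops end up cross-polarised:

```python
# Generate the single-frequency-pair plan along a chain: polarisation is
# swapped every second hop, so the overshoot interferer two links away is
# always cross-polarised. Hop count and channel number are illustrative.
def chain_plan(num_hops: int, channel: int = 1) -> list[str]:
    plan = []
    for hop in range(num_hops):
        pol = "H" if (hop // 2) % 2 == 0 else "V"
        plan.append(f"Channel {channel} ({pol})")
    return plan

print(" - ".join(chain_plan(6)))
# Channel 1 (H) - Channel 1 (H) - Channel 1 (V) - Channel 1 (V) - Channel 1 (H) - Channel 1 (H)
```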
In conclusion, spectrum planning is critical to preserve scarce and valuable spectrum, as well as to eliminate the hidden gremlin of interference, in order to guarantee a well-performing network.
This is an edited extract from Trevor Manning’s book, Microwave Radio: Handy Reference Guide, which is available on Amazon.com. Manning runs an international business (TMC Global) that specialises in training and development of technical people who have transitioned into management, and is actively involved in microwave radio system design and planning through his advisory board position with Vertel. He has also presented microwave training workshops for ARCIA at Comms Connect conferences.