New Tools in Frequency Planning
Europe's network planners are turning to advanced frequency planning aids to speed network deployment and increase spectral efficiency, and mobile network operators face a number of technical and management challenges in this frequency planning process. These challenges apply equally whether the operators are new entrants rolling out service for the first time or mature carriers adapting to the highly dynamic market and technology forces acting on all networks. Effective use of the frequency spectrum, one of the scarcest resources for any operator, means better network quality and increased capacity. The key lies in solving the most fundamental problem in cell planning: allocating the frequencies in the network in an optimal way.
However, even today, frequency planning is often performed manually because few good automated planning tools are available. Few such tools have established and retained credibility in the marketplace, and the shortcomings of what is available so far have tended to make operators skeptical of automated tools as a whole. In theory, everyone agrees that the more planning can be automated, the better. But what if automatic frequency planning (AFP) tools are overtaken by the advancing complexity of the challenges they are required to meet, or are never truly able to meet the more basic challenges? In that case, the tools can at best be described as providing useful management information, information from which manual analyses, evaluations and decisions still have to be made.
Cell planners who must undertake those processes are scarce. If they can be freed from the basic frequency planning loop, operators will realise time and money savings. Then, the planners can concentrate on those tasks that can only be solved manually, and thus produce quality improvements in the network.
Manual cell planning takes weeks, sometimes months, to complete. But as things stand, the current generation of frequency planning tools often brings about few, if any, time savings over all-manual methods. Correcting these frequency plans manually typically takes several weeks because the plans the tools produce contain so many obvious design errors. Also, since the frequency-assignment problem is intrinsically interdependent, one change often leads to another. First-generation tools are limited in their ability to solve the frequency problem accurately because their optimisation algorithms are often poorly devised. These early tools derive from one of three approaches: an engineering approach, in which the computer automates the manual planning process; a semi-automatic approach; or an academic approach, in which the frequency planning problem is attacked with a standard optimisation method.
The engineering method, in which the computer is set to simulate the manual process, is not effective simply because the frequency problem cannot be solved well with this kind of sequential approach. The quality of the plan will be no better than a manually derived plan. The only benefit of using a computer in this way is that a flawed planning process runs slightly faster; the process fails to exploit the computer's real strength, which is determining an optimised plan through extensive number crunching.
Some current tools are based on a semi-automatic approach, providing interactive support for a planning process led by the planner. The interactive support takes the form of suggested planning options proposed to the planner based on programmed calculations. Because the approach is still fundamentally sequential, the quality of the plans produced by these interactive tools will largely not differ from that of plans derived manually. However, semi-automatic tools do have an advantage over existing fully automated tools: the planner has more control over the planning process and spends less time correcting the plan afterwards.
Algorithms produced with a lot of academic influence have a sounder theoretical basis, but tend to be very inefficient in solving the substantial problems of real-world cell planners. The algorithms often model the problem inaccurately and inflexibly because they sacrifice precision in order to fit into predetermined, standard optimisation methods. This approach produces solutions riddled with obvious mistakes that must be corrected.
One reason why so many existing tools fail is that they assume that all the specified planning rules pertaining to the optimisation algorithm must be fulfilled or else no plan can be formulated. In reality, planning problems encompass so many rules that no plan could be designed to obey them all. It would be impossible for cell planners to know which constraints to take away without fundamentally undermining the frequency plan. A better approach is to specify as many planning rules as needed, but also build in a mechanism to specify the relative importance of the planning rules and allow the computer to make choices based on these preferences.

A requirement in establishing automatic, implementation-ready frequency plans is the ability to model and optimise the advanced system features and engineering approaches common in modern cell planning. These advanced methods are applied to enhance either the network capacity or quality, and AFP tools must be able to support them. Current tools offer very limited support for these new features, mainly because of the way they model the problem. Typically, existing tools represent the problem on the station level; that is, the station requires a particular number of carriers, and constraints between stations must be fulfilled. However, most of the new advanced system features and engineering approaches to enhance quality and capacity make clever use of characteristic differences within the station. Thus, unless AFP tools model the problem on the more detailed level of the transceiver, these differences cannot be taken into account.

To avoid this frustration and waste of cell planning resources, an AFP tool must be capable of generating implementation-ready frequency plans automatically. For many reasons, few AFP tools are able to meet this requirement, but those that can will form the promising next generation and undo much of the skepticism about their value.
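The soft-rule approach described above can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation: the stations, frequencies, rules and penalty weights are invented for the example, and a real tool would use far richer interference data.

```python
# Illustrative sketch: planning rules carry penalty weights, and a plan
# is scored by the total weight of the rules it violates rather than
# being rejected outright when not every rule can be satisfied.

def co_channel_violations(plan, interferers):
    """Count co-channel assignments between pairs of interfering stations."""
    count = 0
    for a, b in interferers:
        count += len(set(plan[a]) & set(plan[b]))
    return count

def adjacent_channel_violations(plan, interferers):
    """Count adjacent-channel assignments between interfering stations."""
    count = 0
    for a, b in interferers:
        count += sum(1 for fa in plan[a] for fb in plan[b] if abs(fa - fb) == 1)
    return count

def quality_index(plan, interferers, w_co=10.0, w_adj=3.0):
    """Lower is better: the weighted sum of rule violations."""
    return (w_co * co_channel_violations(plan, interferers)
            + w_adj * adjacent_channel_violations(plan, interferers))

# Three stations, two carriers each; A-B and B-C interfere with each other.
plan = {"A": [1, 5], "B": [1, 9], "C": [5, 9]}
interferers = [("A", "B"), ("B", "C")]
print(quality_index(plan, interferers))  # two co-channel clashes -> 20.0
```

The point of the weights is that the computer, like the planner, can now rank imperfect plans instead of declaring the whole problem infeasible.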
Those tools are a necessity for any operator anxious to enhance quality and capacity. Currently available AFP tools and manual planning fail completely to realise the potential savings that are becoming increasingly necessary in the cost-competitive mobile telecommunications marketplace. In practical terms, these savings will be realised through increasing the network capacity by tightening frequency re-use in the network. That is, savings can be realised by introducing more transceivers per station instead of increasing the number of sites and stations. Because each site costs approximately US $160 K to install and has an annual rental value of US $16 K, it is easy to derive enormous savings by obviating the need to install tens or even hundreds of these sites. Moreover, a further financial benefit to operators will be in the form of customer goodwill — and churn reduction — as the drop-call rate in the network will improve immediately.
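The savings arithmetic above is easy to make concrete. The per-site figures come from the article; the number of sites avoided and the time horizon are illustrative assumptions.

```python
# Back-of-envelope savings from avoiding new sites, using the article's
# figures: roughly US $160 K to install a site plus US $16 K annual rent.
INSTALL_COST = 160_000   # US $ per site (from the article)
ANNUAL_RENT = 16_000     # US $ per site per year (from the article)

def sites_avoided_savings(n_sites, years):
    """Total saving from not building n_sites, over a rental horizon in years."""
    return n_sites * (INSTALL_COST + ANNUAL_RENT * years)

# e.g. obviating 100 sites over a five-year horizon:
print(sites_avoided_savings(100, 5))  # -> 24000000, i.e. US $24 M
```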
Several factors enable an AFP tool to generate truly implementation-ready frequency plans automatically. A key factor is how well the AFP tool models the frequency assignment problem. The problem inherently involves trade-offs and, therefore, must be modeled and optimised as such. For instance, it may occasionally be acceptable for carriers from neighboring stations to be assigned adjacent frequencies if that means that the interference situation will improve significantly. The AFP tool must allow such trade-off choices to be modeled and optimised accurately. Newer tools use penalty weights to express how undesirable it is for a particular planning rule to be violated. The penalty weights allow the optimisation algorithm to choose, just as the planner would, which planning rules to violate. Penalty weights in tune with the cell planner's trade-off preferences, combined with a robust optimisation algorithm, make it possible to produce a solution that requires no manual changes afterwards. To generate implementation-ready plans automatically, the AFP tool must consider several different kinds of planning preferences and model the many innovative techniques and features that modern cell planning applies to enhance capacity and quality. These techniques include tightening the re-use through intelligent underlaid/overlaid (IUO) operation, which allows traffic carriers to have a tighter re-use than control carriers; cell sectorisation and splitting; and introducing a multilayered network with micro cells. Other techniques include half-rate coding and adding more frequencies, possibly from the frequency band of another service. There is no doubt that innovative users of new tools are deriving benefits already. According to Jill Wells, RF planner at BellSouth in New Zealand, "It is great to find a tool that keeps up to date with new technology, offering features such as layers to support IUO plans."
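The adjacency trade-off just described can be shown in miniature. The weights and frequencies below are illustrative assumptions, not values from any real tool; the sketch only demonstrates how a weighted model can prefer a small adjacency violation over a worse co-channel clash.

```python
# Sketch of a penalty-weighted trade-off when assigning one carrier:
# an adjacent-channel violation may be accepted if the alternative is
# co-channel interference with a strong interferer. Weights are assumed.
W_CO_CHANNEL = 10.0   # penalty for a co-channel clash with an interferer
W_ADJACENT = 3.0      # smaller penalty for adjacency with a neighbour

def candidate_penalty(freq, neighbour_freqs, interferer_freqs):
    """Total penalty incurred by assigning `freq` to the carrier."""
    penalty = W_ADJACENT * sum(1 for f in neighbour_freqs if abs(f - freq) == 1)
    penalty += W_CO_CHANNEL * sum(1 for f in interferer_freqs if f == freq)
    return penalty

# Frequency 7 collides with an interferer; frequency 4 is merely adjacent
# to a neighbour on 3. The weighted model picks 4 despite the adjacency.
best = min([4, 7], key=lambda f: candidate_penalty(f, [3], [7]))
print(best)  # -> 4
```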
Innovative modeling of the interference table's input parameters is required to support the approaches that tighten frequency re-use. The interference table describes the predicted amount of traffic interference if the carriers in a station are assigned the same or adjacent frequencies as the carriers in another, interfering station. Several underlying assumptions in the interference table parameters must be dealt with to achieve accurate modeling. First, the calculated interference parameter must be scaled down significantly to reflect the much lower pollution from traffic carriers, which do not broadcast constantly. Second, it is necessary to adjust the interference parameters according to the type of carrier. One parameter per station is not sufficient for assessing the implications of the different carrier types within the station. The control carrier is critical for the functioning of a station and requires extra protection. This protection should be modeled by increasing the interference parameter for the control carrier relative to the interference parameter for the traffic carriers. In the case of IUO a similar scaling is necessary: the overlaid carriers, operating only in areas where the signal strength of the carrier is strong, will account for a much smaller portion of the total interference of the station than the underlaid carriers, as shown in Figure 1.
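The per-carrier-type adjustments described above amount to scaling one raw station parameter by different factors. The factors below are illustrative assumptions for the sketch, not calibrated values; a real tool would derive them from measured activity and coverage data.

```python
# Illustrative scaling of a station's raw interference parameter by
# carrier type, following the adjustments the article describes.
ACTIVITY_FACTOR = 0.5      # traffic carriers do not broadcast constantly
CONTROL_PROTECTION = 2.0   # the control carrier needs extra protection
OVERLAY_FACTOR = 0.3       # IUO overlaid carriers serve only strong-signal areas

def scaled_interference(raw, carrier_type):
    """Adjust a raw per-station interference parameter for one carrier type."""
    if carrier_type == "control":
        # No activity scaling: the control carrier transmits continuously.
        return raw * CONTROL_PROTECTION
    if carrier_type == "overlaid":
        return raw * ACTIVITY_FACTOR * OVERLAY_FACTOR
    return raw * ACTIVITY_FACTOR  # underlaid traffic carrier

print(scaled_interference(1.0, "control"))   # -> 2.0
print(scaled_interference(1.0, "traffic"))   # -> 0.5
print(scaled_interference(1.0, "overlaid"))  # -> 0.15
```

The ordering matters more than the exact numbers: the control carrier is protected most, ordinary traffic carriers less, and overlaid carriers least, mirroring their shares of the station's total interference.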
The same sort of modeling applies to features such as intracell handover and preferred carriers within the station. The effect of these features is that the traffic distribution is skewed within the station. To allow the AFP tool to protect the carriers with relatively more traffic, it is necessary to model how much traffic is distributed to each carrier within the station.

Obviously, the optimisation algorithm is also important for an AFP tool. To generate implementation-ready plans automatically, the optimisation algorithm must be extremely robust, converging reliably and rapidly to sound solutions. The only realistic way to solve this difficult problem is with an iterative, heuristic, local search algorithm. Reliable convergence is important to guarantee a high quality plan and to ensure that the optimised solution does not contain obvious mistakes requiring manual changes afterwards. Rapid convergence is also important in carrying out what-if analyses effectively.

Typically, a particular frequency plan will violate some of the defined planning rules. The sum of the penalty weights for all the violated planning rules defines a quality index for the frequency plan: the lower the quality index, the better the frequency plan. Minimising this quality index is the objective of an automatic frequency planning optimisation algorithm. The quality index is useful in comparing the quality of different frequency plans quantitatively, complementing the current visual comparisons of interference plots. Further, the quality index may also be used to understand the quality impact of employing various planning strategies with frequency groups, various separation requirements, spare frequencies, different numbers of frequencies per layer and new carrier layers. Obviously, the quality index is related to frequency re-use. A tighter frequency re-use implies a higher quality index.
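An iterative, heuristic local search of the kind called for above can be sketched simply: try single-carrier re-assignments at random and keep any move that does not worsen the quality index. The stations, weights and frequency pool are illustrative assumptions, and real algorithms are far more sophisticated about move selection and escaping local minima.

```python
# Minimal sketch of an iterative local search minimising a quality index.
import random

def quality_index(plan, interferers, w_co=10.0, w_adj=3.0):
    """Weighted sum of co-channel and adjacent-channel violations."""
    q = 0.0
    for a, b in interferers:
        for fa in plan[a]:
            for fb in plan[b]:
                if fa == fb:
                    q += w_co
                elif abs(fa - fb) == 1:
                    q += w_adj
    return q

def local_search(plan, interferers, freqs, iterations=2000, seed=0):
    """Greedy random descent: keep non-worsening moves, revert the rest."""
    rng = random.Random(seed)
    best = quality_index(plan, interferers)
    for _ in range(iterations):
        station = rng.choice(list(plan))
        idx = rng.randrange(len(plan[station]))
        old = plan[station][idx]
        plan[station][idx] = rng.choice(freqs)
        new = quality_index(plan, interferers)
        if new <= best:
            best = new                 # accept improving (or equal) moves
        else:
            plan[station][idx] = old   # revert worsening moves
    return best

# Three mutually interfering stations start with a badly clashing plan.
plan = {"A": [1, 2], "B": [1, 3], "C": [2, 3]}
interferers = [("A", "B"), ("B", "C"), ("A", "C")]
print(local_search(plan, interferers, freqs=list(range(1, 13))))
```

The starting plan scores 48; with twelve frequencies available for six carriers, the search rapidly drives the index down, illustrating why reliable convergence rather than constraint satisfiability is the right framing.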
Here, frequency re-use is defined as the number of available frequencies divided by the average number of carriers required per station. The quality index also relates predictably to tightening re-use: in numerous benchmark studies of real frequency assignment problems, the rate of change in the quality index has proved constant per frequency re-use unit. A good optimisation algorithm will yield significant quality improvements. Typically, the quality index can be improved by a factor of 10, and sometimes even by a factor of 100, compared to the quality index of a frequency plan generated manually or in conjunction with traditional AFP tools. If an operator can achieve quality of this order, tighter re-use will be possible and improved capacity will result, as shown in Figure 2.
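The re-use definition above is a simple ratio. The station carrier counts below are invented for the example.

```python
# Frequency re-use as defined in the article: available frequencies
# divided by the average number of carriers required per station.
def frequency_reuse(n_frequencies, carriers_per_station):
    avg = sum(carriers_per_station) / len(carriers_per_station)
    return n_frequencies / avg

# e.g. 36 available frequencies across stations needing 2, 3 and 4 carriers:
print(frequency_reuse(36, [2, 3, 4]))  # average 3 carriers -> re-use of 12.0
```

Tightening re-use means lowering this ratio by packing more carriers per station into the same spectrum, which is exactly where the quality index rises and a strong optimiser pays off.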
Members of the mobile telephony engineering community are well aware of the continuous integration occurring in both processes and support tools. At least three distinct processes are merging: planning and design, network verification and optimisation, and performance analysis. Today's planning and design process relies heavily on network design software with geographical information systems and signal propagation prediction capabilities. Practitioners of this process have been early adopters of decision-support applications such as AFP tools. The network verification and optimisation process verifies the network quality and, by analysing drive test data, makes any necessary local corrections. This field exercise usually is performed by optimisation engineers equipped with drive test data tools, but seldom with AFP tools; attempts to create such combined applications are underway. The performance analysis process derives information from analysing network element statistics. This process relies on tools such as Metrica/NPR, which can analyse and present the network data as useful information. The integration of these tools with decision-support applications, such as AFP tools, will also re-engineer the planning and design process by making it dynamic and essentially iterative.
Currently there is swift and unstoppable momentum towards integration in the processes of managing network quality, a force that calls for tighter integration of the support tools. Decision-support applications such as AFP tools play an important role here, as they are an integral part of the forthcoming merged network quality-management process.
Joachim Samuelsson received his MSc from Linköping Institute of Technology and is the founding partner and managing director of cell planning specialist ComOpt (www.comopt.com). Previously, he worked at Ericsson. Samuelsson can be contacted via e-mail at email@example.com.