A long while ago I wrote a piece about surface multipath interference over the sea and recently, I had to explain that theory to someone. It was regarding their C band microwave access system which was not performing to their expectations. This system had one end station on the land and the other in the form of a tracking antenna mounted on a ship. Being a modern system, the signal level data could be recalled and graphed. In front of me I observed a great example of the fading phenomenon, so I dug out the old Excel model, tweaked the parameters for the atmospheric state of the day et voila:
The sea was a little unsettled on the day in question and the boat had several metres of heave. The multipath effects are clearly visible, including the impact of the heave, which can be observed in the little oscillations in the last measured lobe. Beyond that, the signal diffracts due to the sea state, causing the horizon to fall short of the smooth-sea prediction.
Mobile broadband can be quite challenging from an interference perspective, with certain band configurations posing significant risks. For example, when 700 MHz and 800 MHz are implemented in the same locale there is a huge potential for intermodulation products polluting the receiver uplink band, as the downlink bands of 700 and 800 MHz are neighbours and many channel arrangements can lead to problematic third order products. The risk can be especially difficult to manage when working within metallic structures, because external interference can be generated by the semiconducting properties of rust when illuminated by antenna systems. This is particularly acute on a boat or offshore structure, as rust is often not hard to find!
To help illustrate the issue to others, I updated the IMP calculator: I have improved the legend, added labelling and made numerous other small enhancements.
An example of the new graphical display is a typical scenario in which the highest of the three 10 MHz blocks at 800 MHz intermodulates with the middle 700 MHz block, causing third order products in every 700 MHz uplink block.
Other configurations are worse with products falling in both uplink bands as shown below.
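For readers who want to experiment with the arithmetic behind these plots, the sketch below (in Python rather than the tool itself) shows the block-based third order calculation: for two transmit blocks A and B, the products 2A−B and 2B−A occupy frequency ranges that can be tested for overlap with the uplink blocks. The block edges are illustrative assumptions based on typical European 700/800 MHz arrangements, not a specific national assignment, so which uplink blocks are hit will depend on the actual channel plan.

```python
# Minimal sketch of block-based third order intermodulation arithmetic.
# Block edges are illustrative assumptions, not a specific channel plan.

def third_order_ranges(a, b):
    """Frequency ranges spanned by the 2A-B and 2B-A products of two
    transmit blocks a and b, each given as (low, high) in MHz."""
    return ((2 * a[0] - b[1], 2 * a[1] - b[0]),
            (2 * b[0] - a[1], 2 * b[1] - a[0]))

def overlaps(x, y):
    """True if frequency ranges x and y overlap."""
    return x[0] < y[1] and y[0] < x[1]

dl_800_high = (811.0, 821.0)   # assumed highest 800 MHz downlink block
dl_700_mid = (768.0, 778.0)    # assumed middle 700 MHz downlink block

uplinks = {                    # uplink blocks to protect (assumed edges, MHz)
    "700 UL block A": (703.0, 713.0),
    "700 UL block B": (713.0, 723.0),
    "700 UL block C": (723.0, 733.0),
    "800 UL": (832.0, 862.0),
}

for lo, hi in third_order_ranges(dl_800_high, dl_700_mid):
    hits = [name for name, band in uplinks.items() if overlaps((lo, hi), band)]
    print(f"Product range {lo:.0f}-{hi:.0f} MHz falls in: {hits or 'no uplink block'}")
```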
You can find the software on the downloads page and it is yours to enjoy as-is. The software still doesn’t have an installer so you will need to ensure the runtime dependencies are present such as provided by Visual Basic 6.0 Runtime Plus.
A long while ago I had a need for an intermodulation calculator for an aeronautical project on a very busy site, where a design was required for a spurious-free reception environment. Due to the complexity of the situation, a piece of software was born to help work out what intermodulation could be created when changing the frequency mix and to evaluate the best combinations.
The concept of intermodulation distortion is quite simple: unwanted signals are generated as a result of non-linear processes. It is more than just simple harmonics, which are usually spaced far away from bands of interest; for intermodulation it is the sums and differences of multiple fundamental components and their harmonics that are the main issue, because these products can fall in or near other bands of interest where they are impractical to filter out, hence the need to avoid them. Usually, the lower the order of the products, the higher the probability they exist at significant amplitudes, so often there is a greater focus on judicious choice of frequency combinations to reduce the impact of lower order interference.
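As a minimal illustration of this sum-and-difference arithmetic (this is not the IMP Calculator's own algorithm, just a two-signal sketch in Python with hypothetical frequencies), the snippet below enumerates the products of two fundamentals and their harmonics, labelled by order.

```python
from itertools import product as cartesian

def intermod_products(f1, f2, max_order=5):
    """Enumerate the positive m*f1 + n*f2 combinations of two fundamentals
    and their harmonics, labelled by order |m| + |n|. Pure harmonics
    (m or n equal to zero) are excluded."""
    results = []
    for m, n in cartesian(range(-max_order, max_order + 1), repeat=2):
        order = abs(m) + abs(n)
        if m != 0 and n != 0 and 2 <= order <= max_order:
            f = m * f1 + n * f2
            if f > 0:
                results.append((order, m, n, f))
    return sorted(results)

# Two hypothetical fundamentals (MHz)
for order, m, n, f in intermod_products(460.0, 465.0):
    print(f"order {order}: ({m:+d})*f1 + ({n:+d})*f2 = {f:.1f} MHz")
```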
The products of intermodulation are manifest as discrete tones or spectrum shoulders adjacent to signals of interest. Predicting where the spectra may appear can be done with simple mathematics, but their amplitude depends upon many factors and is harder to predict. For example, in a power amplifier with a specified third order intercept, estimating the level of the third order products is trivial, but in passive components it is often the deterioration of the components over time or the quality of the installation that leads to a rise in level; for example, poor installation and weathering can cause oxidisation of connections, leading to unexpected intermodulation interference. Some guidance on how to predict the amplitude exists in ITU-R Recommendation SM.1134-1.
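For the amplifier case, the usual rule of thumb is that the two-tone third order products sit below the carrier by twice the margin between the carrier level and the output intercept point, i.e. P_IM3 ≈ 3·Pout − 2·OIP3. A tiny worked sketch with hypothetical figures:

```python
def third_order_product_level(p_out_dbm, oip3_dbm):
    """Rule-of-thumb level of a two-tone third order product at an
    amplifier output: P_IM3 = 3*P_out - 2*OIP3 (all in dBm)."""
    return 3 * p_out_dbm - 2 * oip3_dbm

# Hypothetical example: +30 dBm per tone from an amplifier with +45 dBm OIP3
p_im3 = third_order_product_level(30.0, 45.0)
print(f"Estimated third order product: {p_im3:.1f} dBm "
      f"({30.0 - p_im3:.0f} dB below each carrier)")
```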
Recent site engineering projects with powerful emissions in close proximity to safety critical equipment caused me to spend some evening time recreating an IMP Calculator based upon the features that proved so useful long ago, which I offer for you to use in your projects as-is. This software creates lists and graphs of frequencies or blocks characterised by order of intermodulation, in an endeavour to help avoid problematic combinations. Enjoy…
p.s. I update the software periodically depending upon my project needs and have recently uploaded a version with many small enhancements, such as showing the fundamental components and showing the receiver as a hatched box. The software doesn’t yet have an installer, so you will need to ensure the runtime dependencies are present; these can be downloaded direct from Microsoft or, preferably, as Visual Basic 6.0 Runtime Plus from SourceForge, which is more extensive and up to date.
Generic antenna radiation patterns are very useful in coverage planning before specific choices have been made and also in interference coordination to help make robust plans.
These patterns are typically based upon mathematical models that describe a broad family of antenna technology and lead to enveloping masks that generally describe the worst radiation that may be expected at any azimuth or elevation angle. This approach avoids the need to describe specific side or back lobes that may change when an antenna is mounted in a real environment compared to a measurement in an anechoic chamber.
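To give a flavour of how such masks are parameterised (this is a simplified illustrative envelope, not the HCM codification discussed below), a generic pattern can often be reduced to a main-lobe roll-off bounded by a worst-case floor:

```python
def envelope_gain_db(angle_deg, beamwidth_deg, floor_db=-25.0):
    """Illustrative enveloping mask, relative to boresight gain: a parabolic
    main-lobe roll-off limited by a worst-case side/back-lobe floor.
    Simplified example only, not the HCM pattern formulation."""
    return max(-12.0 * (angle_deg / beamwidth_deg) ** 2, floor_db)

# Sample the mask for an assumed 65 degree half-power beamwidth sector antenna
for az in range(0, 181, 30):
    print(f"{az:3d} deg : {envelope_gain_db(az, 65.0):6.1f} dB")
```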
To help with planning I made a utility to visualise generic antenna patterns and find antenna codes matching a specific antenna. The software is based upon the pattern types codified in the Harmonised Calculation Method (HCM) that is common for cross-border interference management between many European nations and beyond. For more information on HCM see the official website, where you can find more on the antenna patterns and their formulation.
The concept of the software is that you either specify a pattern directly using the codification scheme, or you paste in numerical data describing a known pattern and then find a matching specification.
I made it some time ago but have recently dusted it down and done a bit of maintenance, as I wanted to use it to make consistent plots on harmonised scales. You can download HCMPLOT and use it freely; however, as with all the freebies on this site, you do so without any warranty whatsoever. At the moment the executable is not signed, so I will update it at some stage, and maybe I will add more on file import/export for common antenna pattern formats.
In setting a challenging project for a very bright physics and mathematics student, I had to stay one step ahead and go back to fundamentals to work out the availability of a fading path as a function of direct and indirect signals.
This is useful in many applications where the fade margin required for reliable communications in narrowband channels can be characterised as a function of the amplitude of indirect scattered components relative to the direct line-of-sight component.
This led me to develop this little fading calculator based upon Marcum’s Q function, which I have implemented as a custom function in a spreadsheet to reproduce the Nakagami-Rice distribution as expressed in ITU-R P.1057-4 and plotted in figure 4.
I chose the numerical approximation of the Q function based upon ‘Another Recursive Method of Computing the Q Function’ by W F McGee (IEEE Transactions on Information Theory, July 1970).
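A minimal sketch of the same calculation in Python is shown below. Rather than McGee's recursion used in the spreadsheet, it evaluates the first order Marcum Q function via its equivalence with the non-central chi-squared survival function (scipy), and the direct-to-scattered power ratio in the example is an assumption for illustration.

```python
import math
from scipy.stats import ncx2

def marcum_q1(a, b):
    """First order Marcum Q function, Q1(a, b), evaluated via the
    non-central chi-squared survival function: P(X > b^2) with
    X ~ ncx2(df=2, nc=a^2)."""
    return ncx2.sf(b ** 2, df=2, nc=a ** 2)

def prob_envelope_exceeds(level_db, k_db):
    """Probability that a Nakagami-Rice (Ricean) envelope exceeds a level
    given in dB relative to the direct component, for a direct-to-scattered
    power ratio K in dB. Sketch only; the direct amplitude is taken as 1."""
    k = 10 ** (k_db / 10)            # direct / scattered power ratio (linear)
    sigma = math.sqrt(1 / (2 * k))   # per-dimension scattered component std dev
    r = 10 ** (level_db / 20)        # threshold envelope relative to direct amplitude
    return marcum_q1(1 / sigma, r / sigma)

# Example: with scattered power 15 dB below the direct component (K = 15 dB),
# how often does the envelope stay within a 10 dB fade of the direct level?
print(f"P(envelope > -10 dB) = {prob_envelope_exceeds(-10.0, 15.0):.4f}")
```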
A few years ago I had a number of projects in the broadcast world, where I was looking at national digital television and radio plans. I needed to look at what assignments were coordinated internationally, to reconcile this with national data and to look at neighbouring use. I then looked at opportunities to use coordinated assignments in some way, as some of the assignment plans were created before there was a good vision of national requirements.
Usually this kind of process involves using the ITU BRIFIC, which is not for the faint-hearted! Having much work to perform, I set about creating a tool to gather the relevant ITU data and import data from national databases, so I could compare, then modify and use it in a coverage planning tool. The software needed good filtering, sorting, editing and comparison tools as well as a nice GUI with a map interface. These needs gave birth to the Broadcast data pump!
The tool can read BRIFIC databases with aplomb and has a number of file import filters for many types of ITU eNotice and the CEPT formats from Wiesbaden 95 and Chester 97. When it comes to export, it has good support for Excel, Google Earth and planning tools such as ATDI ICS Telecom, along with some limited support for LS Telcom CHIRplus_BC. The tool also has support for various legacy Band II FM tools such as ITU GE84PLN and even EBU LEGBAC (which requires a compiled version of the FORTRAN source).
The tool is available for download here as a free version, with some of the specialist database connectors and obscure tools removed. Beyond this, I may still develop the software occasionally to correct issues or add minor new features on request, and offer some ad-hoc support, but just don’t ask for a manual!
GPS gives precise and reasonably accurate fixes of position. However, as with all complex systems its limitations need to be understood so that it can be used appropriately.
In radio system planning, a reasonably accurate three-dimensional coordinate for the location of an antenna or the base of a mast can be useful for predicting signal propagation, to help assess coverage or potential interference between systems.
Nowadays, with GPS selective availability turned off, we can typically expect accuracies of the order of 20 metres, or an order of magnitude better with secondary corrections from the Wide Area Augmentation System (WAAS) or other ancillary systems.
The accuracy is a function of many things, including the number of satellites in view, their geometry relative to the receiver and radio propagation effects along the paths. The error is generally expressed with dilution of precision ratios in space and time, which, when combined, result in positions being expressed as a confidence within a circular area in the horizontal plane or within a sphere in three dimensions.
The first simple observation is that we have different metrics for error in the horizontal and combined horizontal/vertical planes. Conditions that lead to low error in the horizontal plane may not coincide with the vertical. For example, if all the satellites in view are on the horizon and none overhead then we may get a low horizontal error, but a high vertical error. Typically the vertical error is several times worse than the horizontal.
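As a rough illustration of how these figures translate to expected error, the usual rule of thumb is to multiply the dilution of precision by the user equivalent range error (UERE); the values below are illustrative assumptions only.

```python
def expected_error_m(dop, uere_m):
    """Rule-of-thumb 1-sigma position error: dilution of precision
    multiplied by the user equivalent range error. Illustrative only."""
    return dop * uere_m

# Hypothetical fix: good horizontal geometry, poorer vertical geometry
hdop, vdop, uere = 1.2, 2.8, 5.0   # assumed values; UERE in metres
print(f"Horizontal 1-sigma error ~ {expected_error_m(hdop, uere):.1f} m")
print(f"Vertical   1-sigma error ~ {expected_error_m(vdop, uere):.1f} m")
```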
The size of error in each plane may have disproportionate effects when applied to propagation prediction. This is particularly true when considering diffraction geometry to terrain or clutter obstacles. In the horizontal plane, a 20 metre position error is often insignificant when we are looking at systems with separation distances of the order hundreds of metres or kilometres, especially when using general purpose gridded terrain models to model terrain obstacles surrounding the location of the fix. However, the same kind of error in the vertical plane can lead to gross changes in diffraction geometry, especially when considering antenna heights that are low in comparison to the terrain or local clutter.
The vertical height reported by default in most GPS receivers is the height above the WGS84 ellipsoid. This ellipsoid is a hypothetical construct, an oblate spheroid that is a coarse approximation to the Earth’s surface. In fact the Earth has a more complex shape than an oblate spheroid, and a better datum for measuring height is that of a geoid, which is defined as a surface of equal gravitational potential, approximating mean sea level.
Many mistake GPS heights for height above mean sea level, which can lead to gross errors. To give an example of the height difference between the WGS84 ellipsoid and a geoid model, taking EGM96, which is a global geoid model, and using the handy UNAVCO geoid height calculator page, the geoid height at, say, Newlyn harbour is 53.5 metres above WGS84. If the raw GPS height is used without correction, the vertical height error will be at least 53.5 m along with the other sources mentioned. Couple this with a terrain elevation database for terrestrial propagation prediction and the results will be grossly errored, as most terrain data will be referenced to a local datum which will be closer to the geoid. In the case of Great Britain, heights are most often quoted against the average tidal height datum measured at Newlyn from 1912-1921. Note that there are usually better local geoid models than the global EGM96, such as OSGM02, which is generally applicable in Great Britain.
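The correction itself is trivial once the geoid undulation is known: height above mean sea level is approximately the ellipsoidal height minus the geoid height. A small sketch using the Newlyn figure quoted above and an assumed raw GPS height:

```python
def ellipsoid_to_msl(h_ellipsoid_m, geoid_undulation_m):
    """Approximate height above mean sea level from a GPS (WGS84
    ellipsoidal) height and the local geoid undulation N: H = h - N."""
    return h_ellipsoid_m - geoid_undulation_m

# Assumed raw GPS height of 60 m near Newlyn, EGM96 undulation of 53.5 m
print(f"Approximate height above MSL: {ellipsoid_to_msl(60.0, 53.5):.1f} m")
```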
Are GPS heights too good to be true? Well not if used appropriately!
An activity in radio network planning that is all too often left until last is that of assessing the radiated contribution spilling over international borders, to see if the level exceeds the threshold requiring formal coordination. This can be a problem if not tackled up front, because coordination activities can be lengthy.
Within CEPT this situation is anticipated, and there are various agreed criteria to allow radiation either side of a border without extraordinary activity. Typically these agreements permit an administration to implement a station that radiates towards a border, as long as the field strength incident at and beyond the border does not exceed defined limits for more than 10% of the time. The criteria include common methods for predicting propagation, which include long term statistics to satisfy the time criterion.
To expedite network roll-out, it is often interesting to engineer the network design to meet the limits. This can be done by limiting sites close to the border, omitting sectors pointing towards the border, using terrain to screen radiation towards the border, or optimisation of power delivered to the antenna and sector pointing. Often these measures need to be used in combination. For example, a site on a hill close to a border will probably require more than antenna down-tilting to be satisfactory. Of course if a configuration causing these limits to be exceeded is essential, classic coordination activities are still possible, but plenty of time must be allowed, hence it is best to identify sites requiring coordination early on in network planning.
In respect of LTE, there are several recommendations for the various harmonised bands in CEPT including the ECC recommendations (08)02 for 900/1800 MHz, (11)04 for 800 MHz, (11)05 for 2600 MHz and (14)04 for 2300 MHz. The attached CEPT LTE cross border coordination spreadsheet gives an example of how to assess if a site requires coordination for the 800/900/1800 MHz bands, but could be easily extended to cover others.
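A minimal sketch of the kind of trigger check involved is given below. For brevity it uses a free-space field strength estimate, whereas the recommendations prescribe agreed propagation methods with 10% time statistics, and the trigger level shown is a placeholder rather than a figure taken from the ECC texts.

```python
import math

def field_strength_dbuvm(eirp_dbw, distance_km):
    """Free-space field strength (dBuV/m) at a distance from an EIRP in dBW:
    E = EIRP + 74.8 - 20*log10(d_km). Simplified sketch only; the ECC
    recommendations specify agreed propagation models and time percentages."""
    return eirp_dbw + 74.8 - 20 * math.log10(distance_km)

# Hypothetical sector: 59 dBm (29 dBW) EIRP towards a border 8 km away
trigger_dbuvm = 59.0   # placeholder trigger level, not taken from the ECC texts
e = field_strength_dbuvm(29.0, 8.0)
verdict = "exceeds" if e > trigger_dbuvm else "is within"
print(f"Estimated field strength at the border: {e:.1f} dBuV/m "
      f"({verdict} the {trigger_dbuvm:.0f} dBuV/m trigger)")
```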
At a receiving antenna system we generally find a series of signal vectors that arrive and aggregate in positive and negative ways. Often we have a significant wanted signal and multiple unwanted ones from other co-channel users, which lead to the planned service being limited by interference rather than noise. The question arises: how do we aggregate these signals to compute a wanted-to-unwanted ratio, so that we can ultimately determine the availability of the desired service as location varies? This post examines some of the approaches and gives a practical spreadsheet example.
Simple summing of the means of these signals does not suffice, because generally they will fade independently due to the diversity of the interference paths, so a meaningful prediction requires a method that takes account of their stochastic nature. To simplify matters, these signals are generally characterised by a mean and variance assuming a log-normal distribution, and the aggregate is assumed to follow that distribution too. A straightforward approach is to use a Monte-Carlo method, summing many random samples to compute the aggregate mean and variance. With modern computing power this can be done with a high degree of confidence; however, it may require a significant number of samples. If pixel plotting techniques are used to predict the service at a huge number of locations, then the simulation times can become significantly long, which can be limiting when network plans are being evaluated.
However, there are many alternative techniques using numerical approximations to estimate the sum of log-normal variates that require substantially less computing power. These include the seminal method by L F Fenton, often known as the Fenton-Wilkinson method. Fenton’s approach has the benefit of being simple, but suffers from significant accuracy issues which are quite limiting when dealing with a large number of different vectors. The Schwartz-Yeh method is an improvement under most circumstances and is especially useful for summing uncorrelated sources, but it is more complex. Other techniques improve upon these methods and may offer better fidelity in dealing with correlation and larger numbers of signals, but there is no perfect method, and Schwartz-Yeh stands as a good general purpose choice for a small number of uncorrelated signals.
The interference summing methods compared spreadsheet implements a number of functions that can be called to compare Monte-Carlo, Fenton-Wilkinson and Schwartz-Yeh methods. The spreadsheet contains an example with 20 components that are assumed to be uncorrelated. These components are simply passed as arrays to user defined VBA functions making the worksheet compact and easy to experiment with. Aside from the helper functions, there are only a few core functions whose code is easily modified for re-use in other programming environments.
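For anyone who prefers to experiment outside the spreadsheet, a compact Python sketch of the Monte-Carlo and Fenton-Wilkinson estimates is shown below. It mirrors the idea rather than the spreadsheet's VBA code, assumes uncorrelated components specified as mean and standard deviation in dB, and uses hypothetical interferer levels.

```python
import numpy as np

LAMBDA = np.log(10) / 10  # dB to natural-log scale factor

def fenton_wilkinson(means_db, sigmas_db):
    """Approximate the sum of uncorrelated log-normal signals by a single
    log-normal, matching the first two moments (Fenton-Wilkinson).
    Inputs and outputs are mean and standard deviation in dB."""
    mu = np.asarray(means_db) * LAMBDA
    s2 = (np.asarray(sigmas_db) * LAMBDA) ** 2
    m1 = np.sum(np.exp(mu + s2 / 2))                      # sum of linear means
    m2 = np.sum((np.exp(s2) - 1) * np.exp(2 * mu + s2))   # sum of linear variances
    sz2 = np.log(1 + m2 / m1 ** 2)
    muz = np.log(m1) - sz2 / 2
    return muz / LAMBDA, np.sqrt(sz2) / LAMBDA

def monte_carlo(means_db, sigmas_db, n=200_000, seed=1):
    """Monte-Carlo estimate of the same aggregate, returned in dB."""
    rng = np.random.default_rng(seed)
    samples_db = rng.normal(means_db, sigmas_db, size=(n, len(means_db)))
    total_db = 10 * np.log10(np.sum(10 ** (samples_db / 10), axis=1))
    return total_db.mean(), total_db.std()

# Hypothetical interferers: mean received levels in dBm with 6 dB std dev
means = [-100.0, -104.0, -107.0, -110.0]
sigmas = [6.0] * len(means)
print("Fenton-Wilkinson:", fenton_wilkinson(means, sigmas))
print("Monte-Carlo     :", monte_carlo(means, sigmas))
```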
Why is the aeronautical radio path between the ground and an elevated platform different to, say, a path between two stations on the surface? Surely the same principles are at play? Well, yes they are, but it is more that the terrestrial path is often simplified when modelling stations near the surface of the Earth compared to the aeronautical case.
A simplification often applied is to model the atmosphere as uniform rather than inhomogeneous. Of course the density of air is greatest at the surface and ‘thins out’ with altitude. This may be fine for short range applications, or for those close to the surface of the Earth, but for a slanted path between a terrestrial station on the surface and an aircraft at altitude, the radio wave will encounter different densities of atmosphere, and so it will refract, causing the ray to bend with a magnitude that varies with altitude.
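A small sketch of why this matters: with an exponential reference atmosphere (the surface refractivity and scale height below are typical reference values, assumed for illustration), the refractivity gradient, and hence the local effective Earth radius factor, changes markedly between the surface and aircraft altitudes.

```python
import math

N_S = 315.0      # surface refractivity, N-units (assumed reference value)
H_SCALE = 7.35   # scale height, km (assumed reference value)

def refractivity(h_km):
    """Exponential reference atmosphere: N(h) = Ns * exp(-h / h0)."""
    return N_S * math.exp(-h_km / H_SCALE)

def local_k_factor(h_km):
    """Local effective Earth radius factor from the refractivity gradient
    (N-units per km) at height h: k = 157 / (157 + dN/dh)."""
    dn_dh = -N_S / H_SCALE * math.exp(-h_km / H_SCALE)
    return 157.0 / (157.0 + dn_dh)

for h in (0.0, 3.0, 10.0):
    print(f"h = {h:4.1f} km : N = {refractivity(h):6.1f} N-units, k = {local_k_factor(h):.2f}")
```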
To model the ray bending, ray tracing is often used with an exponential model of the atmosphere. An example of this is contained within the ITS IF77 model published in 1983 by Johnson and Gierhart. IF77 is the basis for ITU-R Recommendation P.528-3 on propagation for aeronautical mobile and radionavigation services using the VHF, UHF and SHF bands. The diagram below depicts an exaggerated example of the ray tracing used to generate key metrics for estimating propagation losses in IF77.
Straight-line geometry over-estimates the ray bending, leading to horizons being predicted as further away, hence a longer line-of-sight region and thus smaller propagation losses. In IF77, the ray is traced using iterative methods to find the radio horizons, and then to correct the end station heights to lower effective heights. This aspect of the model is by no means unique to aeronautical paths, but it is more significant than for, say, land mobile over short paths. Other effects that are significant in an aeronautical context include the role of antenna systems in attenuating the Earth-reflected path, some of the principles of which were discussed in the post Plane old Earth on the sea!
Despite the age of the Johnson Gierhart work, it is a current topic of discussion in ITU SG3, because it was a well researched piece, and provides a solid basis from which to model the propagation path, which is required to help satisfy the need for improved spectrum planning, mostly for sharing and coexistence studies. This need arises from the relentless pressure for more spectrum to serve both land and aeronautical mobile systems.
Broadband cellular systems are becoming commonplace in maritime environments, with public and private networks desired over large sea areas to cover needs which were once uniquely served by satellite. To take advantage of economies of scale, land mobile cellular technology is often simply transplanted into the maritime context. In principle that is fine; however, propagation planning principles common in land mobile should not be applied without forethought.
For example, planning methods based upon propagation models such as COST231 Hata, often used in land mobile for predicting coverage and frequency planning, are not suited to the maritime environment. These models account for a combination of physical effects specific to a particular type of environment, and are not sufficient when taken out of context. Model tuning is the cell planner’s usual tool to improve validity, but it doesn’t really help here, as fundamentally these models account for different physical effects, so a rethink in modelling approach is required.
Of course the maritime environment is not characterised by building or vegetation clutter, hence any approach based upon them is likely to come unstuck. However, whilst the environment is largely uncluttered in terms of obstacles above the surface of the sea which would cause diffraction losses, the sea surface itself is an excellent source of scattering that can lead to undesirable effects, as well as a source of shadowing due to waves in high sea states. Furthermore, high seas can make for unstable antenna platforms. These issues and many others are well understood by designers of other types of radio system deployed in a maritime context, and something useful can be learned from them.
In the radar world, designers are familiar with sea clutter, which is the term used to describe the excess returns that can appear on a radar display as a function of scattering for a given sea state. Other issues include lobing effects limiting range in the elevation plane due to surface reflection in calmer seas. Much of a radar system’s processing is devoted to alleviating these effects without compromising the ability to detect valid targets. In the fixed link world, paths crossing larger bodies of water may suffer heavy fading, particularly in calm conditions, and space diversity in the vertical plane is often used to mitigate the fading.
So, for maritime mobile networks, some of the considerations and techniques can be borrowed from fixed link and radar planning. One of the most basic things is to understand the nature of reflections from the surface, and in fact early planners of land mobile systems were quite aware of this and simply talked of plane Earth modelling. In this case the effect of a specular reflection from the surface is considered, and the interference pattern predicted for typical path geometry. Unfortunately, common mobile radio diversity techniques with small displacements between antennas offer less relief than in a cluttered land mobile environment, which means that the antenna height of the mobile system must be carefully considered to avoid the effects of severe fading.
The plot below shows an example of the fading effect from surface multipath for a system with a base height of 100 m operating at 1.5 GHz to a receiver antenna height in the range 0 – 40 m above the sea surface at a range of 10 km.
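For anyone wishing to reproduce this kind of curve, a minimal two-ray (plane Earth) sketch for the same parameters is given below. It assumes a smooth, flat, perfectly reflecting sea and ignores Earth curvature, divergence and antenna patterns, so it illustrates the lobing rather than reproducing the exact plot.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def two_ray_relative_level_db(h_base_m, h_rx_m, dist_m, freq_hz, refl_coeff=-1.0):
    """Level (dB) of the direct plus sea-reflected rays relative to the direct
    ray alone, for a smooth, flat, perfectly reflecting surface. Sketch only."""
    lam = C / freq_hz
    d_direct = np.hypot(dist_m, h_base_m - h_rx_m)
    d_reflected = np.hypot(dist_m, h_base_m + h_rx_m)  # via the image of the base
    phase = 2 * np.pi * (d_reflected - d_direct) / lam
    return 20 * np.log10(np.abs(1 + refl_coeff * np.exp(1j * phase)))

# Base antenna 100 m above the sea, 1.5 GHz, receiver 0-40 m high at 10 km range
heights = np.linspace(0.1, 40.0, 400)
levels = two_ray_relative_level_db(100.0, heights, 10_000.0, 1.5e9)
print(f"Deepest sampled fade: {levels.min():.1f} dB at h = {heights[levels.argmin()]:.1f} m")
```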
In this case significant fades are present at multiples of 10 m, so unless the fades can be tolerated, the antenna configuration would require optimisation to ensure that bad heights are avoided, or a diversity configuration could be chosen with optimal antenna separation. These issues are well understood and have a clear impact that can be demonstrated whatever the sophistication of current mobile technology.
The disappearance of MH370 from air traffic control displays was possibly deliberate, or perhaps just the unfortunate result of a series of failures. Time may reveal. Perhaps more worrying is the disappearance of many planes over the well-monitored skies of Europe. Earlier this month, there were reports from air traffic control centres in Austria, Germany, Slovakia and the Czech Republic that a number of aircraft disappeared from radar displays, presumably arising from the failure of secondary surveillance radar. Classic symptoms of interference were in evidence, with these aircraft being lost intermittently over a few minutes. Given that multiple radars and multiple planes were subject to the same issue, it seems like a case of unintentional interference, with some rogue system blocking the transponders on the aircraft. Initially a NATO electronic warfare exercise in Hungary was blamed. The Hungarian defence ministry denied the issues, citing that the electronic warfare devices only had a range of 4000 metres. Whilst it is credible that these devices are only used for intentional interference over ranges of 4 km, it is rather odd to think that the denial implies that radio waves stop their interference potential at 4 km! However, more convincing is the fact that these military exercises coincided with only one of the occasions where the interference was reported. The investigation continues to try and identify the source, but until then I wonder if we may see some more unscientific comment, including perhaps the odd conspiracy theory or two!
So what is an optimum plan? I hear the word optimum all the time, and in my experience it is often used to describe a situation which is complex and needs improving, but those who use it are often not sure how. So how do we take this and make it more meaningful?
Optimum means the best. So what is best? Does it mean the most efficient, perhaps the most complex? Hang on, doesn’t that imply lengthy and costly? Maybe effective and practical are better descriptions? Well, this can be framed if we have an objective, as this allows us to direct the plan, giving us the practical approach, and to evaluate it, giving us the effectiveness. Note that there may be several objectives, but it is best not to have too big a shopping list, as some may be in conflict and dilute the plan, e.g. a plan that gives maximum flexibility will not be the most efficient in terms of its use of resources.

Resources are all the things required to enable the plan, e.g. the sub-band available for planning. A resource is also the effort to create the plan as well as that available to enact it. We also need to consider other practical aspects, such as constraints. These may be straightforward, e.g. the limitations of receiver technology and how we represent it. Constraints may also exist in the form of assumptions that arise due to poor definition of the problem, e.g. a lack of data describing the service area. Assumptions can also be constraints if they are too general or invalid, or there may be limited resources available to do the optimisation.

Assumptions are also needed to say something about the future. In general it is very hard to predict the future, so an optimum plan will have to make assumptions. If the plan is not forgiving of change, then that is a danger. The best plans are perhaps those that are based upon simple and manageable criteria (i.e. SMART), and are subject to periodic review to ensure that they remain appropriate throughout the lifetime of the plan.
It is therefore paramount that optimum is not confused with ideal; perhaps a better phrase is a practical plan that is effective in delivering a tangible benefit compared to the situation of today. So optimum is perhaps a word that is best avoided, as it can be interpreted in so many ways and lead to unrealistic expectations. Effective and practical are terms that seem to be more balanced and meaningful…
In my post on Safeguarding aeronautical communications from windfarms, I said that I would try to understand more on the significant work underway and report back, because the official guidance was quite limited and it was clear to me from my own projects that work had been done but, frustratingly, nothing published.
The Aviation Management Board Meeting held at MOD, Eskmeals on Thursday, 18 July 2013 gave a hint of the work in progress. The public minutes from the meeting contain a reference regarding work done:
“NATS was funding this £1.8m research programme over two years. It has five aims: to build on earlier work; to develop scientifically credible evidence on detrimental effects; to determine operational impacts; to develop guidance material; and for NATS to develop software tools to assess wind farm applications. QinetiQ and Pipehawk were the main contractors and the work would be undertaken at the Shooters Bottom and Red Tile sites. The research was due to start in August and run until the end of September 2013. The software would be developed by January 2015. The guidance document would need to be timed to fit in with CAA’s CAP schedule.”
So nothing tangible yet, although it turns out that NATS have published some of the trial data in an FAQ in relation to a windfarm planning application, but it is a little obscure, as it only seems to appear on the Preston Council planning portal rather than being more prominent on the NATS website. The FAQ confirms a little on the nature of the issues and notes that, whilst measurements have been done, notably at Shooters Bottom Farm in 2009 and Goonhilly in 2007, there are no concrete criteria and more work needs to be done.
Since this post was created in 2014, the FAQ link to the Preston City Council planning portal in respect of AGA impact no longer functions, so I attach a copy of the FAQ from my archives. This cites a NATS internal report from December 2009 concerning the Shooters Bottom Farm field trial, which yielded an RCS of 48.6 dBm² in the back scatter region and 54.9 dBm² in the forward scatter region at VHF for a turbine of hub height 65 m and tip height 100 m.
Finally, in 2019, CAP670 was updated in a more comprehensive way and now contains “Appendix A to GEN 02: Methodology for the Prediction of Wind Turbine Interference Impact on Aeronautical Radio Station Infrastructure”, which I presume is the culmination of the work that was due in 2015. This section includes specific methods and planning figures that can be used to make safeguarding assessments. The RCS planning figure for a large industrial turbine is 41 dBm² at VHF in the back scatter region and 51 dBm² in the forward scatter region. These figures are significantly lower than the Shooters Bottom trial, so presumably reflect other data, perhaps of a more statistical nature.
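To get a rough feel for what an RCS of this size means on a VHF communications path (this is a generic bistatic scattering estimate with an assumed geometry, not the CAP670 Appendix A methodology), the turbine can be treated as a point scatterer and the scattered path compared with the direct path:

```python
import math

def scattered_minus_direct_db(rcs_dbsm, d_direct_m, d_tx_turbine_m, d_turbine_rx_m):
    """Level of the turbine-scattered path relative to the direct path (dB),
    treating the turbine as a point scatterer of the given RCS and assuming
    equal antenna gains towards both paths:
        ratio = sigma * d_direct^2 / (4*pi * d1^2 * d2^2)
    Generic bistatic estimate only, not the CAP670 Appendix A method."""
    return (rcs_dbsm
            + 20 * math.log10(d_direct_m)
            - 20 * math.log10(d_tx_turbine_m)
            - 20 * math.log10(d_turbine_rx_m)
            - 10 * math.log10(4 * math.pi))

# Assumed geometry: aircraft 50 km from the ground station, turbine 5 km from
# the station and 46 km from the aircraft, using the 41 dBm^2 planning figure
ratio = scattered_minus_direct_db(41.0, 50_000.0, 5_000.0, 46_000.0)
print(f"Scattered path relative to direct path: {ratio:.1f} dB")
```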
In doing a review of current literature on wind farm interference to aeronautical radiocommunication services, it seems that the guidance for safeguarding VHF communications has advanced little.
Specifically in the UK, we have the CAA policy in the form of CAP764, which says:
“Until further information is available, issues concerning wind turbines and VHF communications should be dealt with on a case-by-case basis and reference made to the guidance contained in Section GEN-01 of CAP 670.”
Consulting CAP670, it offers safeguarding distances around the facilities and the following note:
“A wind farm whose blade tips, at their maximum height, are below the visual horizon when viewed from a point situated 25 m above an aeronautical radio station site may be acceptable to an ANSP.”
So does that mean that, even after a decade in which specific measurements have certainly been made, there are no accepted improvements to line-of-sight modelling? When I have finished my research I will try to answer.
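The CAP670 note quoted above amounts to a simple smooth-Earth visibility test, which the sketch below implements purely to illustrate the note (geometric visual horizon, no terrain data, assumed turbine geometry; real assessments use terrain profiles):

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def horizon_distance_m(height_m):
    """Geometric (visual) horizon distance over a smooth Earth."""
    return math.sqrt(2 * EARTH_RADIUS_M * height_m)

def tips_below_horizon(distance_m, tip_height_m, viewpoint_height_m=25.0):
    """True if blade tips at their maximum height are below the visual horizon
    seen from a point 25 m above the radio station (smooth Earth, terrain
    ignored) -- a sketch of the CAP670 note, not a safeguarding tool."""
    return distance_m > horizon_distance_m(viewpoint_height_m) + horizon_distance_m(tip_height_m)

# Hypothetical turbine with a 100 m tip height at two candidate distances
for d_km in (45, 60):
    below = tips_below_horizon(d_km * 1000.0, 100.0)
    print(f"{d_km} km: blade tips {'below' if below else 'above'} the visual horizon")
```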
…but then the Age of Enlightenment arrived when a movement of philosophers and scientists advocated a rational set of values based upon critical questioning and reasoning that gave birth to the scientific method.
This blog attempts to bring these values to bear in the form of observations on the radio spectrum and the technologies using it for telecommunication purposes.
Interested readers may be those involved in radio spectrum engineering and network planning, from markets spanning national regulators, network operators, equipment vendors, system integrators and consultants.
The endeavour is to bring clear thinking to the planning and evaluation of radiocommunication services.