
Bandwidth and noise

As communications systems have grown in capacity, they have expanded in physical bandwidth in an increasingly congested RF spectrum.

Effective digital communications planning requires more than knowledge of antennas and propagation fundamentals. It now also requires an intimate understanding of bandwidth and noise in order to co-exist with other users and communicate efficiently.

Unfortunately, this key aspect of modern RF systems is often taught badly, or in some cases not at all, often leading to unwelcome surprises for equipment operators in the field. As a technology and market agnostic platform we’ve observed poor bandwidth knowledge in many markets, notably MANET and 5G, both of which are accelerating the deployment of wideband systems, often with little to no planning beyond a topographic map study.

Both radio markets are evolving from narrower channel technologies (in the case of MANET, the VHF Combat Net Radio (CNR) it replaces used channels measured in kHz), so they need to update their theory training content and associated software to convey these potentially complex topics to novice students in a digestible manner.

Increasing bandwidth increases noise, which reduces coverage

Teaching noise

As bandwidth increases, so does channel noise. This simple concept might seem easy for a student to remember but, without visual aids (and, since the demise of analogue systems, audible aids), it is hard to demonstrate in practice.

A good teacher may show visual aids like noise charts, FFTs and spectrograms; a bad one may show some Johnson-Nyquist formulas buried within an all-day PowerPoint, which is not helpful except for getting paid.

FFT showing a narrow signal and wideband interference

A student can tick the right box on their exam(s) but spend their career wasting bandwidth and struggling to establish communications because they believe that bigger is better (it isn’t), or worse still, that bandwidth has no effect on coverage since that’s a function of transmitted power and/or height. Having an intimate understanding of the interplay between bandwidth, receiver sensitivity and thermal noise will make spectrum users more efficient, effective and considerate.

Bandwidth (MHz)    Thermal noise (dBm)
0.1                -124
1                  -114
2                  -111
4                  -108
8                  -105
16                 -102
32                 -99
64                 -96

Bandwidth and thermal noise, based on a temperature of 21°C
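These values come straight from the thermal noise formula kTB. A minimal sketch in Python which reproduces the table above, assuming a temperature of 294K (21°C):

from math import log10

BOLTZMANN = 1.380649e-23  # J/K

def thermal_noise_dbm(bandwidth_hz, temp_kelvin=294.0):
    """Thermal noise power (kTB) in dBm for a given bandwidth."""
    noise_watts = BOLTZMANN * temp_kelvin * bandwidth_hz
    return 10 * log10(noise_watts * 1000)  # watts to milliwatts, then dBm

for mhz in (0.1, 1, 2, 4, 8, 16, 32, 64):
    print(f"{mhz} MHz: {thermal_noise_dbm(mhz * 1e6):.0f} dBm")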

Which waveform is best?

Comparing digital radios is complicated due to the myriad of features, waveforms and software.

A given waveform will have characteristics such as a minimum Signal-to-Noise Ratio (SNR) it requires to achieve the symbol rate needed to deliver, for example, a fast data link. This dB value must sit proud of the noise floor, so if the noise floor is high at -90dBm, coverage will be reduced; conversely, by taking the system somewhere quiet, e.g. -110dBm, coverage will improve by 20 decibels – a huge difference.

To compare waveforms precisely, the same noise floor should be simulated initially with fixed values, to eliminate the random error inherent in field testing. The required SNR values will be somewhere between 3 and 20dB depending on the waveform and target Bit-Error-Rate (BER).

Bit Error Rate (BER) describes the ratio of errors in a data stream. An OK value is 10^-3, i.e. 1 bad bit in 1,000 or 0.1%. This increases with noise until a signal is unrecoverable. For more on BER see an older blog here.

For ground radios designed for noisy environments, a BER of 10^-2 (1 error in 100, or 1%) is used here for extracting our planning thresholds from a chart of SNR curves. For an airborne system without obstacles a stricter target could be used, for example 10^-5.

Signal to Noise Ratio for different modulation schemas against error rates.

A narrow waveform eg. QPSK gives better coverage and works better in noisy conditions. This is the fallback telemetry mode used in many data radios.

A wide waveform eg. QAM64 is capable of better throughput and delivering high bandwidth streams such as HD video.

The best radio is one which can use different waveforms to satisfy both coverage and capacity.

Modelling bandwidth: A tutorial

Modelling RF Bandwidth and noise

Quick reference guide

A quick reference guide for using bandwidth and noise is available here. For other guides see here.

Conclusion

An understanding of bandwidth and noise is essential for anyone deploying wideband systems or comparing waveforms.

RF theory training can be enhanced (and needs to be) with visual tooling to let students quickly observe the impact of different inputs in a controlled environment with templates to minimise user error.

For information on how SOOTHSAYER can help with signals training see here.


Connecting smart cows to moove data

Smart cow

Smart farming

Smart farming is the use of Internet of Things (IoT) technologies in agriculture to enable efficient use of resources.

For this blog we’re focused on cattle farming on large, fence-less farms in New Zealand. The farms in question are vast and remote so connectivity options are limited. This is why an off-grid sub-GHz LPWAN network is ideal: it has long range and only needs to send small, infrequent packets of data.

For the solution to be cost-effective compared with satellite, as little infrastructure as possible is needed: in this case an LPWAN gateway on a pole, some collars for the herd and an app to manage the system via a web service.

Siting the LPWAN gateway(s) properly is critical to achieving not only coverage across the farm(s) but to reduce the number of gateways, which reduces complexity and cost.

Sub GHz LPWAN on the farm

An 868MHz LPWAN signal can go for many miles under the right conditions. We know this well from powering the Helium LPWAN network’s planning tool, Helium Vision, where people can communicate data over 50 miles with a fraction of a watt of RF power and an omnidirectional antenna.

Despite its useful diffraction properties, which enable it to work non-line-of-sight (NLOS), it’s still sensitive to obstructions, so clutter on the farm such as buildings and trees needs modelling accurately. CloudRF has 10m land cover for New Zealand from the European Space Agency and 10m DSM from the LINZ geospatial agency.

These data sets are adequate for most outdoor scenarios but are not fine enough to model a farm complex of buildings, such as tall grain silos, metal sheds and seasonal obstacles. For high resolution you could source your own surface model, as our customer Halter did…

Farm buildings and silos

Use case: Halter

Halter are a novel agri-tech startup focused on cattle management with a unique solar powered collar.

They needed accessible RF planning software to help their engineers site LPWAN gateways. Having used and liked Cloud-RF, they needed higher resolution surface models of the farms, and no pesky API restrictions!

They also planned to build their own tools on top of our powerful physics based API which is smart as it allows their R&D team to focus on their primary product, and not waste time reinventing the wheel.

Their options were either to buy expensive commercial data or to self-generate it using a drone and photogrammetry software such as Pix4D. Given the prohibitive cost of high resolution commercial LiDAR, it would only take a few jobs to make a return on the purchase of a decent drone!

https://halterhq.com/

Halter purchased a private Keyhole Radio server from us which included the API they needed. The server runs as a virtual machine and, crucially, lets them import their own terrain data.

They were quickly able to import high resolution, organic data into their server as GeoTIFF files. This allowed them to work with data which was very current, even hours old, so would be an accurate model of tree heights and man made obstructions.

The terrain format accepted by Keyhole Radio and SOOTHSAYER is GeoTIFF with Int16 values in the WGS84 (EPSG:4326) projection.

LPWAN coverage on a farm in New Zealand

1m resolution

It wasn’t all plain sailing though: they found there was a memory-imposed limit on the physical tile sizes our server could use. The solution was to reprocess the large tile into smaller tiles to make it digestible.

A 5000 x 5000 GeoTIFF at Int16 resolution will require 50MB of disk space. If this is 5m LiDAR, the physical width is 25km x 25km. Our engine can super-sample, so if you used this tile, but requested 1m resolution, it would create a raster in memory measuring 25,000 x 25,000 pixels which would need 1.25GB of memory.

For 1m resolution however, tiles measuring 1000 x 1000px would only require 2MB of disk and memory. You may need to load in a few, let’s say 16, to do your model but that’s still only 32MB.

You could also resolve this by increasing the memory available to the server but it’s recommended to prepare data into smaller parcels. We support 1m resolution in our API but don’t hold a lot of 1m data sets due to their substantial cost and size. If you already have 1m data, a Keyhole Radio or SOOTHSAYER server is the answer.
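As an illustration, a large GeoTIFF can be split into 1000 x 1000 pixel tiles with a few lines of rasterio. This is only a sketch; the input filename is hypothetical and any GDAL-based tiling tool would do the same job:

import rasterio
from rasterio.windows import Window

TILE = 1000  # pixels per tile edge

with rasterio.open("farm_dsm.tif") as src:  # hypothetical large input tile
    meta = src.meta.copy()
    for row in range(0, src.height, TILE):
        for col in range(0, src.width, TILE):
            window = Window(col, row, min(TILE, src.width - col), min(TILE, src.height - row))
            meta.update(width=window.width, height=window.height,
                        transform=src.window_transform(window))
            with rasterio.open(f"tile_{row}_{col}.tif", "w", **meta) as dst:
                dst.write(src.read(window=window))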

1m resolution

Summary

Cloud-RF’s powerful API is ideal for efficient smart farming.

Our private servers will let you take it to the next level with terrain data you can source yourself, no API restrictions and as a bonus, they work without an internet connection!

Finally, all our jokes are offal.


Improving LTE modelling with field test data

Network Signal Guru

We took a field trip to model LTE (4G) coverage in order to collect data we can use to develop calibration utilities and improve modelling, since modelling is only as accurate as your inputs.

We focused on a single remote cell in the Peak District national park, identified through cellmapper.net. We expected to find only one cell but were surprised to be served by several distant LTE cells not evident in the crowd-sourced app. Equally significant, we established limited or no coverage in an area where a national map suggested coverage was available.

Key findings

  • Data revealed the crowd-sourced coverage app was conservative in rural areas
  • Data revealed the operator’s network map was optimistic in rural areas
  • Modelling was matched with 2.5dB RMSE for a cell 12km away
  • Modelling was on average accurate to 5.5 dB RMSE
  • Improvements to modelling have been identified

Equipment and process

We used a rooted Samsung Galaxy Tab with an integrated Qualcomm X11 LTE modem, running both Network Signal Guru (NSG) and Cellmapper. NSG requires root access to lock to a cell, which was necessary to prevent our survey tablet from hopping around not only protocols (2G, 3G) but also neighbouring cells.

Cellmapper is a crowd-sourcing app which writes signal strength readings to a CSV file, convenient for our analysis. Before embarking we planned a route around a remote cell on the edge of available coverage maps.

Both apps record various LTE power levels such as Reference Signal Received Power (RSRP), Reference Signal Received Quality (RSRQ) and Received Signal Strength Indicator (RSSI). For this test we use RSSI, which is typically a stronger value than the others as it is the measured carrier power, irrespective of bandwidth.

Cellmapper
Network Signal Guru

Receiver measurement calibration

Radio receivers are subject to measurement error, typically in the range of 0.5 to 3.0dB for very expensive and consumer grade equipment respectively. As we were using a consumer grade Snapdragon 662 SoC with an X11 LTE modem, we needed to find out its measurement error. The Qualcomm datasheets we could find didn’t list this value so we used empirical measurements to establish it.

During our survey we paused at a site 3km from a tower with line of sight where we recorded continuous power readings with the tablet static on the ground for about 15 minutes. Consistency is essential for calibration. We have analysed these readings to establish a standard deviation in readings of 3.1dB for the X11 LTE modem, which puts it at the consumer end of the spectrum for survey device accuracy, in accordance with its price.

We use the error of 3.1dB in our analysis by subtracting it from the Root Mean Square Error (RMSE).

python3 receiver_calibration.py receiver_calibration_3km_LOS.csv 
[-71.0, -71.0, -67.0, -71.0, -67.0, -69.0, -69.0, -69.0, -71.0, -77.0, -67.0, -71.0, -73.0, -71.0, -71.0, -71.0, -75.0, -69.0, -73.0, -69.0, -75.0, -69.0, -71.0, -79.0, -73.0, -65.0, -73.0, -71.0, -71.0, -75.0, -71.0, -73.0, -77.0, -75.0, -69.0, -75.0, -75.0, -73.0, -77.0, -73.0, -71.0, -75.0, -73.0, -79.0, -75.0, -71.0, -71.0, -75.0, -67.0, -73.0, -73.0]
Mean: -72.1dB
Error: 3.1dB
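The maths behind the error figure is simple: it is the standard deviation of the static readings. A minimal sketch of the idea, assuming a one-reading-per-row CSV of dBm values (the real receiver_calibration.py and its input format may differ):

import csv, statistics, sys

readings = []
with open(sys.argv[1]) as f:
    for row in csv.reader(f):
        readings.append(float(row[0]))  # one RSSI reading in dBm per row

print(f"Mean: {statistics.mean(readings):.1f}dB")
print(f"Error: {statistics.stdev(readings):.1f}dB")  # standard deviation used as the measurement error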

The survey

Setting off uphill from Derwent Dam car park with Sheffield’s man-of-the-mountain, Chris, we approached our target cell located on the hillside below us.

As we neared 500m we used NSG to lock onto a strong LTE signal which we believed was the target (CID 130256660, PCI 270), based on proximity and strength (-60dBm RSSI). It was in fact a tower on a distant hill 12 km to the south with line of sight, CID 131413770. Surprise number 1.

Field testing in the Peak District

At the top of the hill we could see the target tower’s directional panels which confirmed it was configured to serve the A57 “Snake pass” road below. One panel was oriented north-west towards Manchester (CID 130256660) and the other south-east towards Sheffield (CID 130256650). Based on the dimensions of the panels we estimated their beamwidth as at least 120 degrees and gain of at least 10dBi.

As we passed the eNodeB along the hill top we were conscious of a number of cell neighbours and performed a targeted re-selection where we ended up briefly attached via the antenna back-scatter of *6650, made possible by our proximity. This didn’t last for long before we re-selected to a strong signal (PCI 337) which we were convinced was CID *6660. It wasn’t. Surprise number 2.

Surveying LTE cells with Network Signal Guru

We later found out this was also a distant cell with line of sight near Hope, 7km due south of us!

Marching on happily with a great signal, we started a gentle descent until we lost the horizon behind us. At this point the neighbours we observed disappeared and our serving cell (in Hope) became very weak as we entered the signal’s (diffracted) beyond-line-of-sight (BLOS) shadow. As predicted in the video based on the surrounding high plateau we could see, we lost the signal as we continued to head north toward Alport Castles, a local feature. We descended into the valley below without a signal (despite a national map suggesting otherwise) and continued the next 3.5km without any coverage at all 🙁

Losing all cells on a high plateau

As we exited the remote valley, heading towards the A57 road, we reacquired a signal and finally locked onto our target, *6660, with an excellent signal and line of sight (Credit to Chris for spotting the tower in the trees at 2.7km!). As observed, the directional pattern was focused on the road and we were in the main beam.

The elusive target cell!

Acquisition of the target cell

A quick map study and we elected to march west until we lost the signal. This event occurred after only 1.5km thankfully as we hand railed the road over undulating terrain. We followed the same route back to the acquisition point which doubled our measurements for this section.

Confirming the direction of the powerful 10MHz LTE800 bearer with a portable spectrum analyzer with a bad LCD.

From the A57, in the main lobe, we climbed the hill to the south east and headed back toward the target cell. Knowing it had a directional pattern, we anticipated signal strength decreasing as we exited the main lobe which we confirmed as we drew parallel with the cell, eventually circumnavigating it to the south.

As we exited the beam of *6660 and entered the influence of *6650, we re-selected for the final phase of our journey which would take us into a steep ravine and then up the hill, right past the cell.

A sweaty climb up a steep hill behind the cell saw signal strength and field-testing enthusiasm collapse, which was fixed with some fizzy snakes. We lost the cell for good only 500m behind it due to the convex hill and directional pattern.

Moral of the story is, in RF, proximity to an access point is no guarantee of service!

Serving cells

Hagg Farm eNodeB
Cell ID      PCI   Location                 Technology          Frequency   Comments
130256650    272   Hagg Farm (South East)   LTE Band 20 10MHz   806MHz DL   ~15m AGL
130256660    270   Hagg Farm (North West)   LTE Band 20 10MHz   806MHz DL   ~15m AGL
131377930    337   Hope Quarry              LTE Band 20 10MHz   806MHz DL   ~15m AGL
131413770    441   Little Hucklow           LTE Band 20 10MHz   806MHz DL   ~20m AGL

Table of serving cells featured in our data

The cells all have a downlink and an uplink frequency. As these four cells share the same downlink frequency, they are distinguished by their Physical Cell Identifier (PCI) codes. If we had only taken out a spectrum analyser we would never have known which cell we were looking at.

Data analysis

We chose to model after field testing. We could have done it before but it would have ruined all the surprises that came up during analysis, like the serving cell 12km away!

We extracted the CSV data (1034 rows) from the survey tablet which for cell mapper was located at /storage/emulated/0/Android/data/cellmapper.net.cellmapper/files/.

We sorted it by cell and created clean CSV files for each cell with only the location and RSSI.

Cellmapper CSV survey data

Gap analysis

Formatted CSV data. group = CID, id = RSSI

We used our new “coverage check” CSV import tool. This tool allows the import of customer locations which can be tested against visible coverage layers to report a correlation.

This is a binary yes/no comparison with a summary report eg. “87% coverage” which is handy for comparing options.

It cannot automatically calibrate field test measurements but is useful for gap analysis as a “first pass” toward calibration.

This tool is handy for manually aligning the modelling until it matches visually but is too simplistic for calibration.

Coverage check tool showing an 87% correlation. The deadspot is where modelling was conservative.

Fine tuning inputs

Our confidence level for the inputs started around 50%, based on known frequencies, heights and power levels for the UK network. For the first cell, we used a combination of known, observed and assumed values.

You can be forgiven for asking why not do field testing with known transmission parameters, but even then you must calibrate, as old batteries, weathered connectors and battered antennas will all impact a transmitter’s actual effective radiated power (ERP).

As we were working with LTE800 we used the ITM model, designed for this UHF band when it was conceived for TV broadcasting. This general purpose model has built-in diffraction and also has a reliability variable which we can use for fine tuning.

Known values: frequency, location, approximate height, approximate azimuth
Estimated values: Antenna azimuth, beamwidth, gain, RF power, exact height

Once we had a coverage plot using some sensible power values and the coverage-check tool reported a correlation > 90%, we rendered it using the Greyscale GIS colour schema and downloaded a GeoTIFF raster. This contains fine-grained signal values to 1 dB resolution.

Calibration process

We suggest this workflow for the calibration process.

We also have an API capable of returning data in open vector and raster formats including SHP and GeoTIFF so there are other ways to do this...

  1. Gap analysis with the coverage check tool in the web interface and approximate/rough inputs
  2. Power balancing with the path profile tool for selected points only (Recommend a LOS link at long range)
  3. Gap analysis with the coverage check tool in the web interface and power balanced inputs
  4. Regenerate the layer with the GIS schema and export for precision offline calibration
  5. Make minor (1-2dB) adjustments to either the loss or gain values for LOS links, and/or clutter profiles for BLOS until the calibration script reports an RMSE value < 10.

Offline calibration

Using a Python script and the rasterio library we were able to query each row of the CSV data against the GeoTIFF raster instantly, negating the need for many repeated API calls.

The offline method is more efficient when working with large point-to-multipoint layers and spreadsheets than calling the API directly. It computes a mean error which can be positive or negative and a more useful root mean square error (RMSE) which is always positive. A lower figure is better with 0dB being ideal (and also impossible).

The API method is still valid for testing select points or calibrating dynamically.
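As an illustration of the offline approach, here is a minimal sketch, assuming a CSV of latitude, longitude and RSSI columns and a greyscale GeoTIFF whose pixel values map directly to dBm (the real Offline_Calibration.py may differ):

import csv, math, sys
import rasterio

# Load the survey measurements: assumed column order is latitude, longitude, RSSI (dBm).
measurements = []
with open(sys.argv[1]) as f:
    for lat, lon, rssi in csv.reader(f):
        measurements.append((float(lat), float(lon), float(rssi)))

errors = []
with rasterio.open(sys.argv[2]) as raster:
    coords = [(lon, lat) for lat, lon, _ in measurements]  # rasterio expects (x, y) = (lon, lat)
    for (_, _, measured), pixel in zip(measurements, raster.sample(coords)):
        modelled = float(pixel[0])  # assumes the greyscale pixel value is the modelled signal in dBm
        errors.append(measured - modelled)

mean_error = sum(errors) / len(errors)
rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
print(f"Mean error {mean_error:.1f}dB, RMSE {rmse:.1f}dB from {len(errors)} measurements")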

python3 Offline_Calibration.py 131413770.csv 131413770.tiff 
...
Lat: 53.402250 Lon -1.760703 Measured: -65.0dBm Modelled: -68.0dBm Error: 3.0dB Mean error: 0.5dB
Lat: 53.402252 Lon -1.760699 Measured: -59.0dBm Modelled: -68.0dBm Error: 9.0dB Mean error: 0.6dB
Lat: 53.402253 Lon -1.760699 Measured: -61.0dBm Modelled: -68.0dBm Error: 7.0dB Mean error: 0.7dB
Lat: 53.402252 Lon -1.760698 Measured: -63.0dBm Modelled: -68.0dBm Error: 5.0dB Mean error: 0.8dB

Model error is mean 0.8dB, pure RMSE 5.6dB based upon 84 measurements
Receiver measurement error: 3.1dB
RMSE adjusted for receiver error: 2.5dB
The modelling inputs are excellent.

RMSE (dB)   Description   Comment
0 to 3      Excellent     Calibrated!
3 to 6      Very good     Inputs are very close. Fine tuning needed.
6 to 9      Good          Inputs are good but more tuning needed.
9 to 12     OK            Inputs are OK but not tuned.
> 12        Bad           Inputs are bad. Check basics e.g. height, frequency, power.

Suggested interpretation of calibration scores. Requirements will vary by scenario.

The results

We achieved results better than expected. We were aiming for under 6dB RMSE and achieved 3dB, at 7km range, which is excellent, and coincidentally as accurate as the measurement accuracy of our survey device.

Manual calibration can be time consuming, and collecting good data definitely is. We felt we could have improved the scores further with more data, like the antenna data sheets for starters, but were happy with our 3dB.

The good

The best results came from the distant cells where LOS was achieved. This makes sense as without obstacles to complicate things the path loss decays at a predictable rate, based on wavelength, which can be plotted as a clean curve. Once we had this power balanced using the path profile tool and manual adjustments, it produced a great match with the data due to the open nature of the high plateau.

Calibrated modelling for a cell 7km away with line of sight

The ugly

The other cells, like our target at Hagg Farm (South East), served a more complex piece of ground in the valley which had steep ravines and tall trees. As expected we didn’t fare as well here and achieved 10dB RMSE. Analysis of where we lost accuracy can be summarised as follows:

  • Trees. We found 2m LiDAR to be too conservative here as this contains the tree canopy. We tried smoother DSM with clutter profiles which gave a better result but didn’t go as far as adjusting our clutter profile. This is a future trees blog!
  • Proximity. Counter-intuitively, being closer to the cell is not better for measurements and calibration than being further away. This is due both to the way path loss decays on a curve and the highly directional panels cells use. Small differences near a cell produce large differences in data, compared with very small differences up on the plateau with the distant LOS cells. We can model directional panels but are guessing what the *actual* beamwidths are.

Cell         Points   Mean (dB)   RMSE (dB)
130256650    164      -0.7        9.8
130256660    460      0.9         6.9
131377930    127      -1.6        3.0
131413770    84       0.8         2.5
TOTAL        835      -0.15       5.5

Table of scores for the calibration of the four measured cells

Look forward

The ‘so what’ of all this is that we have proven the software is capable of high accuracy, given the right inputs, and we have identified areas to improve it further.

  • We will be adjusting our LiDAR to soften it in areas where this is available to fully exploit recent developments with our custom clutter profiles.
  • We will be integrating recent developments with offline calibration into our user interface to make this manual process smoother, simpler and faster.
  • We will work on automating calibration. Some might call this machine learning blah but it’s just software.

Expect another field testing blog all about……trees.

Consolidated and calibrated LTE coverage for the valley, served by four cells

Scripts and data

You can download our field test data and Python scripts here, as well as Google Earth KMZs showing the route, cells and measurements.


GPU propagation engine

5G cell

Our fast GPU engine is perfect for modelling wireless coverage

We have developed the next generation of fast radio simulation engines for urban modelling with NVIDIA CUDA technology and Graphics Processing Units (GPUs).

The engine was made to meet demand for speed and accuracy across many sectors, especially FWA, 5G and CBRS.

As well as fast viewsheds, it enables a new automated best-site-analysis capability, which will accelerate site selection and improve efficiency whilst keeping a human in the loop. As we can do clutter attenuation, it’s suitable for VHF and LPWAN also.

Designed for 5G

5G networks are much denser than legacy standards due to the limited range of mmWave signals, necessary for high bandwidth data. The same limitation means these signals are very sensitive to obstructions, and Line of Sight (LOS) coverage is essential for performance.

With 1 metre accuracy and support for LiDAR, 3D clutter and custom clutter profiles, you can model rural and urban areas with high precision.

We can do Trees too

Unlike simplistic viewshed GPU tools designed for speed, we can model actual tree attenuation for beyond line of sight sub-GHz signals such as LPWAN and VHF. Trees can be configured as clutter profiles, along with shrubs, swamps, urban areas and 18 classes of Land cover and custom clutter.

Area coverage

The simplest mode is a fast “2.5D” viewshed (with a path loss model) which creates a point-to-multipoint heatmap around a given site using LiDAR data. Ours has better Physics than some of the “line of sight” eye candy on the market and doesn’t have trouble with Sub-GHz frequencies which are harder to model accurately.

This is up to 50 times faster than our multi-threaded CPU engine, SLEIPNIR.

GPU demo January 2022

In this mode we can do diffraction and material attenuation with our custom clutter classes.

Best site analysis

Best Site Analysis (BSA) is a Monte Carlo analysis technique applied across a wide area of interest to identify the best locations for a transmitter. This can be done quickly with a new /bsa API call. The output will identify optimal sites, and just as importantly, inefficient sites.

This feature is powerful for IoT gateway placement, 5G deployments and ad-hoc networking where the best site might presently be determined by a map study based on contours as opposed to a LiDAR model.

Best Site Analysis on ATAK

High speed

Our GPU engine is up to 50 times faster through the API than the current (CPU) engine, SLEIPNIR™.

By harnessing the power of high performance graphics cards, we are able to complete high resolution LiDAR plots in near real time, negating the need for a “go” button, or even a progress bar. This speed enables API integration with vehicles and robots which will need to model wireless propagation quickly to make better decisions, especially when they’re off the grid. It was designed around consumer grade cards like the GeForce series but supports enterprise Tesla grade cards due to our card agnostic design.

Economical

Our implementation is efficient by design. We want speed to model wireless coverage but not if it requires kilowatts of power. During testing we worked with older GeForce consumer cards and were able to model millions of points in several milliseconds with less than 50W of power. Or in other words, the same power as flicking a light bulb on and off.

Any fool can buy large cards and waste electricity, but we’re proud to have a solution which is fast and economical. This also means it can be run on a laptop as it’s available now as our SOOTHSAYER product.

Open API

The GPU engine is an “engine” parameter in our /area API so you can use it from any interface (or your own custom interface) by setting the engine option in the request body. The OpenAPI 3.0 compliant API returns JSON which contains a PNG image so for existing API integrations using our PNG layers there will be no code changes required to enable it.

Self-hosted GPU server

Instead of buying milk every month you can buy the cow. We also sell SOOTHSAYER, a self-hosted server with our GPU engine onboard. It supports NVIDIA GPU cards from the Maxwell architecture onwards and most enterprise hypervisors like ESXi and Proxmox. You get to use your existing LiDAR data too, so you’re not buying it twice.

To see how easy it is to set up a GPU card with SOOTHSAYER we’ve made a video:

SOOTHSAYER GPU setup

Accessible

Using GPU cards to model physics, including EM propagation, is an established concept dating back 20 years, despite businessmen claiming otherwise. What is novel here is making this exciting technology accessible to users priced out of premium tools.

Staying true to CloudRF’s accessible and affordable principles, we’ve included it in our service as an optional processing engine.

CloudRF is a member of the NVIDIA inception program


Enhancing accuracy with environment profiles

Clutter profile manager

In radio planning, accurate terrain data is only half the story.

The other data you need, if you want accurate results, is everything above the surface such as buildings and trees.

This is known as land cover or in radio engineering; clutter data.

Clutter data

Clutter manager with 18 bands

In October 2021, the European Space Agency released a global 10m land cover data set called WorldCover with 9 clutter bands.

In our opinion, the ESA data is a sharp improvement on a similar ESRI/Microsoft 10m land cover data set also published this year.

The land cover can be used to enhance coarse 30m data sets to distinguish between homes and gardens, or crops and rivers. It’s mapped from space so includes every recent substantial building, unlike community building datasets which can be patchy outside of Europe.

This data was previously very expensive; a price reflected in the eye-watering pricing of legacy Windows™ planning tools.

The data has 9 bands which have been mapped to 9 land cover codes in Cloud-RF™. Combined with our recent 9 custom clutter bands, we have 18 unique bands of clutter which you can use simultaneously.

Read more about the codes in the documentation here.

Explore the data we have on the ESA WorldCover viewer application here:

https://viewer.esa-worldcover.org/worldcover/

Custom clutter enrichment

We have integrated the 10m data into our SLEIPNIR™ propagation engine which, as of version 1.5, can work with third party and custom clutter tiles simultaneously, in different resolutions.

This is significant as it means you can have a 30m DSM base layer, enhanced with a 10m land cover layer, enriched further with a 2m building model which you created yourself. Effectively this gives you a 10m global base accuracy with the potential for 2m accuracy if you add custom obstacles. The interface will let you upload multiple items as a GeoJSON or KML file.

Demo 1 – The Jungle

Always a tricky environment to communicate in, and model accurately due to dense tree canopies. In this demo, a remote region of the Congo has been selected at random for a portable VHF radio on 75MHz with a 3km planning radius.

This area has 30m DSM which out of the box produces an unrealistic plot resembling undulating flat terrain. This is because the thick tree canopy is represented as hard ground and the signal is diffracting along as if it were bare earth. The result therefore is that 3km is possible in all directions.

By adding our “Tropical.clt” clutter profile, calibrated for medium height, dense trees, we get a very different view which shows the effective range through the trees to be closer to 1km, or less, with much better coverage down the river basin, due to the lack of obstructions.

Demo 2 – A region without LiDAR

Scotland has very poor public LiDAR compared with England which has good coverage at 1m and 2m.

For this demo, Stirling was chosen which has 30m DSM only. A cell tower on a hill serving the town produces an optimistic view of coverage by default but when enhanced with a “Temperate.clt” clutter profile, calibrated for solid and tall town houses and pine forests (eg. > 50N, Northern Europe, Northern USA) we get a much more conservative prediction. As a bonus, the base resolution has improved three fold to 10m.

Demo 3 – A region with 2m LiDAR

You might think that if you’ve got high resolution LiDAR data that’s enough. Wrong. Soft obstacles, especially trees, will produce excessive diffraction as if they were spiky terrain. This manifests itself as optimistic ‘great’ coverage due to diffraction over the canopy. By adding our “Temperate.clt” profile again we make trees absorb power and see where there are nulls in our coverage – beyond the houses and woods.

Despite our land cover being only 10m resolution, we are able to benefit from the full LiDAR resolution with 2m accuracy.

Inspecting a profile

The path profile tool will now show you colour coded land cover as well as custom clutter and 3D buildings. Crops are yellow, grass is green(!), trees are dark green, built-up areas are red, 3D buildings are grey, water is blue…

The most significant feature in this image isn’t the coloured land cover, or the custom building (as both are features we’ve done before), or the fact we know the tidal River Severn sits lower than the man-made canal beside it. It’s the fact that both are being used in the same model at the same time. They are different sources, different resolutions, different densities…

Path profile for 860m link showing 2m LiDAR, 10m Land cover and 2m custom building

Using and editing a profile

Clutter menu with 3D buildings enabled

Once you’ve got the hang of switching profiles you may find it needs optimising for your region. With the clutter manager in the web interface, premium customers can create their own profile based on field measurements for highly accurate predictions. After all no two forests or neighbourhoods are the same density.

Create your perfect profile and save it to your account. The system has 5 regional profiles ready for all users and you can add your own.

To use them, pick from the Clutter > Profile menu and ensure “Landcover” is set to “Enabled”.

If you have created custom clutter and want to use that, set Custom clutter to “Enabled” to blend it in.

For more see the web interface clutter section in the documentation.

Using clutter from the API

We played with a few designs before settling on this very simple template method where you set a profile within the environment menu as follows. This is a new value “clt” and you can still use the existing “cll” and “clm” values to manage the system clutter and custom clutter layers.

JSON request excerpt for a temperate “European” profile, with custom clutter, with 3D buildings and a 3D building density of 0.25dB/m

  "environment": {
        "clt": "Temperate.clt",
        "clm": 1,
        "cll": 2,
        "mat": 0.25
    },

Example for Jungle profile, without custom clutter, without 3D buildings.

  "environment": {
        "clt": "Jungle.clt",
        "clm": 0,
        "cll": 1,
        "mat": 0
    },

Further reading:

CloudRF API on Postman: https://docs.cloudrf.com/

OpenAPI reference: https://cloudrf.com/documentation/developer/swagger-ui/

What’s next?

Now that we have highly configurable environment profiles, it’s time to tune them with field testing. We’ve bought a heap of comms equipment and will be using it to optimise these profiles with real world measurements.


Modelling microwave links over the horizon

Parabolic antenna radiation

In this study we look at modelling long range microwave links and the key parameters which help get the best out of mobile microwave terminals. When sited properly, a low power microwave terminal can communicate over 100km. When sited badly the same terminal can fail to communicate 5km…

A brief history of microwave

Commercial terrestrial microwave links spread in the 1950s during post-war radio innovation and are used today as backhaul in many key public and commercial networks. A microwave station typically consists of a large tower on high ground with round parabolic dishes communicating in UHF (300MHz) bands and above. As Wi-Fi spread after the millennium, outdoor fixed wireless access (FWA) terminals for long-range (>2km) consumer wireless links became increasingly popular, especially around ISM bands, but only recently have portable tracking terminals like the AVwatch MTS become available, intended for mobile ground to air use at distances exceeding 150km.

The theory

A microwave link is designed to be high capacity and focused in order to carry a large amount of information from one point to another. For this reason they need a short wavelength so are found in UHF and SHF bands above 300MHz.

The signal has a Fresnel zone around it which is sensitive to obstructions. Achieving a line-of-sight link is not a guarantee of a good connection if the Fresnel zone is obstructed by trees or buildings. The size of this zone is inversely proportional to the frequency, so a higher frequency has a smaller zone, akin to a laser beam, while a lower frequency has a larger zone and so needs to be higher above the earth to clear obstructions.

Fresnel zone in 3D
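To put numbers on this, the radius of the first Fresnel zone at a point along the path can be estimated with the standard formula r = √(λ·d1·d2/d). A minimal sketch; the 17.32 constant is that formula rearranged for distances in km and frequency in GHz:

from math import sqrt

def fresnel_radius_m(d1_km, d2_km, freq_ghz):
    """First Fresnel zone radius in metres; d1/d2 are distances to each end of the link in km."""
    return 17.32 * sqrt((d1_km * d2_km) / (freq_ghz * (d1_km + d2_km)))

# Mid-point of a 40km link: the zone shrinks as frequency rises
print(round(fresnel_radius_m(20, 20, 5.8)))  # ~23m at 5.8GHz
print(round(fresnel_radius_m(20, 20, 0.5)))  # ~77m at 500MHz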

Radio horizon

The maximum distance a microwave link can cover over the earth has little to do with RF power and much more to do with the dish heights and the horizon, which limits how far a (short wavelength) signal can go. Whilst refraction can extend a link beyond the horizon, it is as variable as the weather, so is impractical to model accurately and in a timely fashion. A simple formula for the radio horizon in kilometres is 4.12 x (√h_tx + √h_rx), where the transmitter and receiver heights are in metres. This formula produces a table of horizons which shows that an improvement in height of several metres translates to a range improvement of several kilometres due to the earth’s curvature.

Transmitter height (m)   Receiver height (m)   Radio horizon (km)
1                        1                     6
1                        10                    14
1                        50                    29
1                        100                   41
1                        200                   58
1                        400                   82
1                        800                   116
1                        1600                  164
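As a quick sketch, the same rule of thumb in code; exact figures vary slightly with the refraction constant assumed, but the 300m drone example later in this post works out at roughly 75km:

from math import sqrt

def radio_horizon_km(tx_height_m, rx_height_m):
    """Approximate radio horizon in km (4/3 earth radius rule of thumb)."""
    return 4.12 * (sqrt(tx_height_m) + sqrt(rx_height_m))

print(round(radio_horizon_km(1, 300)))  # 1m terminal to a drone at 300m AGL, ~75km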

Parabolic antennas

As signal attenuation is substantial at these frequencies they require a highly directional antenna to improve forward gain and cancel noise from other angles. The larger the dish size the greater the gain and the smaller the beam.

A microwave dish antenna is easily recognised on a polar plot by its prominent main lobe, symmetrical side lobes and minimal back scatter. It has a very high front-to-back ratio, which describes the ratio of forward power to rear, in the order of +50dB. Due to its high directional gain it only needs to be driven with a modest amount of RF power to generate an effective radiated power of several hundred watts.

Using and creating a directional pattern

In CloudRF you can choose from thousands of crowd sourced patterns, upload your own in TIA/EIA-804-B / NSMA standards or create your own using a few parameters.

To select a template, open the Antennas menu in the web interface and click the database icon. This will open a search form. Search by manufacturer, eg. Cambium, or model. When you find a pattern you want click the green plus symbol to add it to your favourites list. You can now proceed to set the azimuth and tilt as if you were affixing it to a pole.

If the pattern does not exist, you can choose to use a “custom pattern” and define the horizontal and vertical beamwidths in degrees as well as the gain and front-to-back ratios in decibels to generate polar plots. These can be downloaded as a legacy .ant text file which you can upload in the service as a private pattern. A custom pattern is quick to self-generate but lacks side lobes and the full accuracy of a detailed pattern from a manufacturer.

An over the horizon link

For this demo, we’re simulating a link from the cliffs of Dover in England across the English Channel to Calais, France, a distance of 40km across the sea with no obstructions. The 18dBi terminal is 1m off the ground and is using only 3 Watts / 34.7dBm power for a total effective radiated power of 189W / 52.8dBm. A receiver threshold of -100dBm was used. This is too low for high speed waveforms but would be ok for a telemetry fallback waveform like QPSK.

A bad link

With a ground receiver on top of the cliff, the link just reaches Calais. It is obstructed on the radio horizon at ~25km, a full 10km before the coastline but the height advantage of the cliff makes line of sight just possible to some parts of the town. Despite just achieving line of sight, this link would still be unsuitable due to the majority of the fresnel zone being obstructed.

A good link

With the same cliff top terminal and RF parameters, the distant receiver is swapped for a drone 300m above the ground. The increase in height extends the link from ~25km to 75km, deep into France with good LOS.

An ugly link

This time the same terminal which just achieved 75km was misused down on the beach to communicate with a small boat in the channel. Its effective range was less than 6km due to the radio horizon. As you can see from the normalised path profile chart below, the curvature impact is substantial when both stations are close to the ground!

Thresholds and modulation

The simplest way to limit the modelling is with received power measured in decibel milliwatts. In this common scale, -100dBm is a sensible threshold for most digital systems. For planning purposes, a 10dB fade margin should be added for a -90dBm threshold. The actual thresholds needed will vary by systems and waveforms. Many commercial microwave links operate very high symbol rate modulation schemas which need received power above -70dBm to function.

You can also use Bit-Error-Rate (BER) as a threshold. This unit is used in conjunction with the noise floor and the desired signal-to-noise ratio (SNR) to derive a threshold. A modulation schema like QAM64 requires a relatively high 15dB SNR compared with 5dB for QPSK which can function on weak links. These thresholds are not absolute which is why we set the desired error rate. Errors are inevitable and the relationship between the BER and SNR is best visualised as curves. If you know you want QPSK for example but are not sure what error rate to use, use a mid-level error rate such as 10^-3 (one bad bit for every 1,000 bits) which will give you a 7dB SNR. If the local noise floor is -120dBm your equivalent receiver threshold is a pretty low -113dBm.

Conclusion

Forget RF power, height is everything in creating a successful microwave link. This might mean moving a terminal several kilometers away from the distant station in order to gain a few meters in height but the benefit will be many more kilometers in range.


Mapping mesh networks

Mesh network

Mobile ad-hoc networks (MANET) are an increasingly popular architecture in emergency services and Defence communications. Unlike classic repeater-based networks, MANET radios do not rely on fixed infrastructure so must form self-healing, self-routing networks.

MANET radio modules are well suited to working off-grid in remote areas or to providing resilience and independence in well served cities which may be suffering from power and/or network failure.

The bandwidth requirements and throughput of MANET networks varies substantially by waveform. Some are designed for range, others maximum throughput. For this reason, manufacturers offer a range of frequency modules.

Why RF planning tools don’t get used for emergency networks

RF planning software has evolved substantially in the 30 years it’s been used to build out fixed infrastructure networks. Time sensitive customers such as the emergency services have a difficult relationship with these tools. They need them, and often buy them, but don’t have the time to use them to their full capabilities. As a result they rarely get used on anything except training and exercises. Even then the numbers of staff directly interfacing with them will be very small, even in very large organisations of radio users.

The focus for most RF tools is planning with static sites. Whether that’s clicking on a map or uploading a spreadsheet of hundreds of locations it’s still static. MANET requires dynamic inputs and continuous computation which is where APIs come to the fore…

An API for MANET

Cloud-RF’s latest API has a function designed for ad-hoc networks called ‘points’. The points API functions like a point-to-point profile in terms of its input and output, except it accepts an array of transmitters. This means you can test 10, 50 or 500 transmitter nodes back to a single receiver in a single API call. It’s also fast, as you’d expect, and can model a link every millisecond, so the 870 distinct links demonstrated in the video were processed in under a second, every second.

For more information on the points API see our documentation here: https://docs.cloudrf.com

Radio mapping planning

In this video, we demonstrate the Cloud-RF points API to model a MANET network (Mobile Ad-hoc Network). For this demo 30 nodes were moved around a 16km track covering a variety of terrain. Each node was tested against 29 siblings for a total of 870 links per second.


Coloured links denote good (green), average (amber) and poor (red) links between the nodes and map to 5dB, 10dB and 20dB signal-to-noise ratios. Only links exceeding 5dB SNR are shown or it looks like a bad game of KerPlunk!

The radio settings used were L band (1-2GHz) with only 1 watt of power. This conservative start setting was chosen to show a dynamic range of links. Later in the video the template is switched at the database to demonstrate the impact or gain of using different bands such as 2.4GHz and 500MHz.

Integrating your data

The demo video used mock data and an unpublished script to present the results as a KML. The source of the data is irrelevant so long as it’s accurate and time sensitive. This could be a radio vendor’s dashboard or database. Many of the leading vendors such as DTC, Harris, Persistent Systems, Silvus and Trellisware have location aware GPS modules and software interfaces to display reported radio positions.

The required format for a point is WGS84 decimal degrees. The height is taken from the template, which is defined within the body of the points request. The new APIv2 makes defining a template easy as a JSON object, so you can keep a local archive of template .json files.

A suggested workflow for API integration for dynamic points is as follows (a sketch appears after the list):

  1. Fetch a list of all radio locations as decimal degrees
  2. Choose a template as a JSON object
  3. Make an API request using the data and a client script to https://api.cloudrf.com/points
  4. Parse the JSON response to extract the results for each node
  5. Put the results on a map as lines
  6. Style the lines based upon your own local rules for your equipment, QoS and waveform eg. < 5dB is red
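
For illustration only, a minimal Python sketch of steps 1 to 5. The authentication header and the field names used for the transmitter array and the returned links are assumptions here, so check the documentation and the GitHub client scripts for the real request schema:

import json
import requests

API_ENDPOINT = "https://api.cloudrf.com/points"
API_KEY = "YOUR-API-KEY"  # placeholder

with open("template.json") as f:  # step 2: a saved radio template as a JSON object
    payload = json.load(f)

# Steps 1 and 3: attach the radio locations (WGS84 decimal degrees) and send the request.
payload["transmitters"] = [            # assumed field name for the array of nodes
    {"lat": 51.500, "lon": -0.120},
    {"lat": 51.505, "lon": -0.125},
]
response = requests.post(API_ENDPOINT, json=payload, headers={"key": API_KEY})  # assumed auth header

# Steps 4 to 6: parse the response and style each link, e.g. < 5dB SNR is red.
for link in response.json().get("links", []):  # assumed field name
    print(link)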

Download example client scripts from our Github site: https://github.com/Cloud-RF/CloudRF-API-clients

For assistance with integration and hosting options email support@cloudrf.com

Autonomous vehicles

Where this points API will really add value is in mapping and assisting autonomous vehicles, which are invariably fitted with MANET radio modules. Whether it’s a drone or a UGV, this API can be used to rapidly exercise multiple routes to help make better decisions.


RF penetration demonstration

During infantry training, soldiers are shown firsthand the impact of different weapons upon different materials to help them make better decisions about good cover versus bad cover. Spoiler: The railway sleeper doesn’t make it 🙁

As tactical radios have moved several hundred megahertz up the spectrum from their cold-war VHF roots, material attenuation is a serious issue which needs demonstrating to enable better route selection and siting. Unlike shooting at building materials, it’s hard to visualise invisible radio signals, and therefore to teach good siting, but it is equally important as ground-based above-VHF signals are easily blocked in urban environments.

This blog provides a visual demonstration of the physical relationship between different wavelengths and attenuating obstacles only. It does not compare modulation schemas, multi-path, radios or technologies.

Bricks and wavelengths

Clutter data refers to obstacles above the ground such as trees and buildings. Cloud-RF has 9 classes of clutter data within the service which you can use and build with. Each class (Bricks +) has a different attenuation rate measured in decibels per metre (dB/m). This rate is a nominal value based upon the material density and derived from the ITU-R P.833-7 standard and empirical testing with broadcast signals in European homes.

A signal can only endure a limited amount of attenuation before it is lost into the noise floor. In free space attenuation is minimal but with obstacles it can be substantial. This is why a Wi-Fi router in a window can be hard to use within another room in the house but the same router is detectable from a hill a mile away.

The attenuation rate is an average based upon a hollow building with solid walls.

Common building materials attenuate signals by different amounts based on their density and the signal’s wavelength.

A higher frequency (shorter wavelength) signal such as L band (1-2GHz) will be attenuated more than VHF (30-300MHz) for example.

A long wavelength signal like HF will suffer minimal attenuation making it better suited to communicating through multiple brick walls.

The layer cake house

A brick house is not just brick. It’s bricks, concrete blocks, glass, insulation, stud walls, furniture and surfaces of varying absorption and reflection characteristics. Modelling every building material and multi-path ray precisely is possible given enough data and time, but the exponential complexity of multi-path makes it wholly impractical.

A trade-off for accurate urban modelling is to assign a local attenuation value. It’s local since building regulations vary by country and era so a 1930s brick house in the UK has different characteristics to a 1960s timber house in Germany. Taking the brick house we can identify the nominal value by adding up the materials and dividing it by the size.

For example, 2 x solid 10dB brick walls plus a 5dB margin for interior walls and furniture would be 25dB. Divide this by a 10m size and you have 2.5dB/m. Using some local empirical testing you can quickly refine this to a useful value for an entire city (assuming consistent architecture) but in reality the *precise* value will vary by each property, even on a street of the same design, due to interior layouts and furniture.
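As a trivial sketch of that arithmetic:

def clutter_rate_db_per_m(wall_losses_db, interior_margin_db, building_size_m):
    """Nominal attenuation rate in dB/m for a building type."""
    return (sum(wall_losses_db) + interior_margin_db) / building_size_m

# Two solid 10dB brick walls, 5dB for interior walls and furniture, across a 10m building
print(clutter_rate_db_per_m([10, 10], 5, 10))  # 2.5 dB/m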

Range setup

We created nine 4 metre tall targets, one for each of the 9 clutter classes in attenuation order from left to right, measuring 10x10m, and fired radio-bullets™ at them from a distance of 300m using the same RF power of 1W.

The following bands were compared: HF 20MHz, VHF 70MHz, UHF 700MHz, UHF 1200MHz, UHF 2.4GHz, SHF 5.8GHz.

The ITU-R P.525 model was used to provide a consistent reference.

Only the stronger direct-ray is modelled. Multipath effects mean that reflections will reach into some of the displayed null zones, with an inherent reflection loss for each bounce, but these are nearly impossible to model accurately and in a practical time.

Here are the results.

HF 20MHz

VHF 70MHz

UHF 700MHz

UHF 1200MHz

UHF 2.4GHz

SHF 5.8GHz

Findings

  • Dense materials, especially concrete, attenuate higher frequency signals more than natural materials like trees
  • Lower UHF signals perform much better than SHF with the same power
  • Higher frequencies with low power can be blocked by a single house, even after only 300m
  • HF eats bricks for breakfast!

Summary

Modern tactical UHF radios, and their software eco-systems, are unrecognisable from their cold-war VHF ‘voice only’ ancestors in terms of capabilities but have an Achilles heel in the form of material penetration. To get the best coverage the network density must be flexed to match the neighbourhood.

This is obvious when comparing rolling terrain with an urban environment, but the building materials and street sizes in the urban environment will make a significant difference too. Ground units which communicated effectively in a city in one country may find the same tactics and working ranges ineffective in another city with the same radios and settings. Understanding the impact of material penetration will help planning and communication.


Modelling the Bit Error Rate (BER)

When simulating radio propagation you can choose to model results in a variety of ways: path loss will show you the attenuation in decibels (dB), received power will show you the signal strength at the receiver in dBm, and field strength will show you the signal strength in decibel-microvolts per metre (dBuV/m). If you are using a digital modulation schema such as Quadrature Amplitude Modulation (QAM), your effective coverage will be dictated by the desired Bit Error Rate (BER) and the local noise floor. This blog will describe these concepts and show you how to apply them to model a given modulation schema.

Bit Error Rate (BER)

The Bit Error Rate (BER) is the number of acceptable errors you are prepared to tolerate. This is typically a number between 0.1 (every 10th bit is bad!) and 0.000001 (only one in a million is bad). This ratio is closely linked to the Signal-to-Noise Ratio (SNR) which is measured in decibels (dB). A high SNR is required for a low BER; a low SNR will have an increased BER. Put simply, a strong signal is better than a weak one and has less chance of errors. The reason error increases is noise: the closer you get to the noise floor for your band (about -100dBm at 2.4GHz), the more unstable and unpredictable things become.

Decimal     Exponential   Link quality
0.1         10^-1         Bad
0.01        10^-2         Not bad
0.001       10^-3         OK
0.0001      10^-4         Good
0.00001     10^-5         Very good
0.000001    10^-6         Excellent

The noise floor

The noise floor is the ambient power present in the RF spectrum for your location, frequency, temperature and bandwidth. Understanding the noise floor is important when modelling Bit Error Rate as it is subject to change and will determine your SNR. The SNR will determine your BER so if you want good coverage you need to know your noise floor so you can set your power accordingly. There are several factors that influence noise floor:

Location

A lot of noise is man-made, so the noise floor is higher in a city than in the mountains. The difference varies not just by city but by country, as countries have different spectrum authorities and regulate spectrum usage differently. The difference between a city and the countryside for a popular band like 2.4GHz is huge and can be over 6dB. Using a calibrated spectrum analyser with averaging is a good way to measure the noise floor. Ensure you set the bandwidth to your system’s bandwidth for best results. If you don’t own a spectrum analyser you can use Boltzmann’s Constant (see the bandwidth section) and add an arbitrary margin to it depending on your location. This table has some suggested generic values:

Location         Signal        Noise floor
Rural / Remote   WiFi 2.4GHz   -101dBm
Suburban         WiFi 2.4GHz   -98dBm
Urban city       WiFi 2.4GHz   -95dBm
Rural / Remote   WiFi 5.8GHz   -98dBm
Suburban         WiFi 5.8GHz   -95dBm
Urban city       WiFi 5.8GHz   -92dBm

Frequency

Thermal noise is spread uniformly over the entire frequency spectrum but man-made noise is not. The 2.4GHz ISM band is much busier than neighbouring bands for example due to its unlicensed nature. As a result the noise floor is several dB higher than a ‘quieter’ piece of the RF spectrum. Some of the quietest spectrum is co-incidentally the most tightly regulated, which keeps users down, which reduces noise, and improves performance.

Temperature

Thermal noise increases with temperature so in general you will get slightly more distance for your power in northern Scandinavia than in central Africa. The difference is about 1dB between a cold day and a hot day so can be considered negligible when compared with other factors. Budget for a hot day with an extra dB in your planning.

Bandwidth

Bandwidth has a direct influence on noise power because of Boltzmann’s Constant. This simple formula lets you calculate the absolute noise from the bandwidth. There are different ways to apply it but, if you work in dBm, the simplest form is:

Noise floor (dBm) = -114 + 10 log10(bandwidth in MHz)

Using this formula you get the following results.
Receiver bandwidth (MHz)   Noise floor   Equivalent system
0.1                        -124dBm       LPWAN
1.0                        -114dBm       Bluetooth
10                         -104dBm       WiFi 10MHz
20                         -101dBm       WiFi 20MHz
40                         -98dBm        WiFi 40MHz
100                        -94dBm        Spectrum analyser with 100MHz FFT

If in doubt, use a noise floor of -98dBm

Example

A low power, 20MHz wide, 64QAM signal is being simulated in a city. The noise power is computed from the bandwidth with Boltzmann’s Constant as -101dBm, to which we add 3dB for man-made noise, putting the noise floor at -98dBm. When selecting a BER of 0.1 / 10^-1 the SNR is 11dB, which equates to a receiver threshold of -87dBm.
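A minimal sketch of the arithmetic in this example, chaining the noise floor, a man-made noise margin and the required SNR into a receiver threshold:

from math import log10

def noise_floor_dbm(bandwidth_mhz, manmade_margin_db=0):
    """Thermal noise (-114dBm per MHz reference) plus a margin for man-made noise."""
    return -114 + 10 * log10(bandwidth_mhz) + manmade_margin_db

def receiver_threshold_dbm(bandwidth_mhz, required_snr_db, manmade_margin_db=0):
    return noise_floor_dbm(bandwidth_mhz, manmade_margin_db) + required_snr_db

# 20MHz 64QAM in a city: -101dBm thermal noise, +3dB man-made noise, 11dB SNR for a 0.1 BER
print(round(receiver_threshold_dbm(20, 11, 3)))  # -87 dBm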
The difference in propagation between the two error rates is noticeable with 64QAM, but what happens if you switch up the modulation to 1024QAM which carries a higher SNR requirement?

Uplink and Downlink

A popular question when modelling GSM / UMTS / TETRA / LTE networks is: how can I show the coverage from the mobile subscribers (uplink)? Showing a tower’s coverage (downlink) is easy, but how do you go the other way, back to the tower? If you know your equipment capabilities (tower and subscribers) you can calculate link budgets for both the uplink and downlink and then use those values to perform an area prediction. Here’s a simple example with the GSM900 band which, for the benefit of novices, leaves out some of the other gains and losses that can complicate this. You can always add in your own gains and losses where you like to suit your needs.

1. Calculate the total effective radiated power for the BTS tower by adding the power and antenna gain (Limited to 33dBm in the UK)

2. Repeat for handset (Limited to 23dBm in the UK)

3. Calculate the minimum receive level for the BTS by subtracting the receive antenna gain from the receiver sensitivity eg. -110 – 10 = -120

4. Repeat for handset

5. Calculate the maximum allowed path loss (MAPL) by subtracting the minimum receive level from the ERP.

6. Repeat for handset

A balanced network will have similar values. If your base station can radiate for miles but your handsets cannot you have an unbalanced and inefficient network.
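A minimal sketch of the six steps, using the UK power limits above and hypothetical antenna gains and sensitivities:

def erp_dbm(tx_power_dbm, antenna_gain_db):
    return tx_power_dbm + antenna_gain_db  # steps 1 and 2

def min_receive_level_dbm(sensitivity_dbm, antenna_gain_db):
    return sensitivity_dbm - antenna_gain_db  # steps 3 and 4, e.g. -110 - 10 = -120

bts_erp = erp_dbm(23, 10)        # 23dBm PA + 10dBi antenna = 33dBm ERP limit (hypothetical split)
ue_erp = erp_dbm(23, 0)          # 23dBm handset, no antenna gain
bts_min_rx = min_receive_level_dbm(-110, 10)  # -120dBm
ue_min_rx = min_receive_level_dbm(-104, 0)    # hypothetical handset sensitivity

downlink_mapl = bts_erp - ue_min_rx  # step 5: 33 - (-104) = 137dB
uplink_mapl = ue_erp - bts_min_rx    # step 6: 23 - (-120) = 143dB
print(downlink_mapl, uplink_mapl)    # similar values indicate a balanced network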

Finally, to see this on a map, use the ‘Path loss (dB)’ output mode in CloudRF along with the ‘Custom RGB’ colour schema. Enter the uplink value into the green box and the downlink value into the blue box and run the calculation. A typical cell site will have a greater reach (blue) than its subscribers (green). The system will automatically factor in the effect of terrain, ground absorption and antenna heights to give you an accurate prediction.