5G ‘Health Scares’ and the Anti-Tech Bandwagon (Part 2)

Firstly, I should probably explain a little of my background: a fair chunk of my work involves designing point-to-point wireless networks between buildings, and consulting on rural wireless broadband deployments for people in parts of the country which can’t get fibre broadband, and which BT have shown no interest in upgrading. In the past, I’ve also been involved in installing VSAT systems for British embassies abroad.

I’d like to think I have a reasonable grasp of how electromagnetic waves propagate through the atmosphere, though much of what I wrote in my earlier post is little more than A-Level physics. And before the inevitable accusations of bias come along, no, I don’t do any work for the mobile phone networks, nor do I have any interests (financial or otherwise) in their deployments.

Now that that’s out of the way, back to science :)

There’s little doubt that electromagnetic waves can have an effect on the human body, and no engineer would claim otherwise. But we need to look into it in rather more detail before we jump to “oh no, EM waves bad!!” scare stories.

There are two key elements to this: frequency and power. Different frequencies propagate in different ways depending on the material they’re trying to pass through, but in *very* general terms, higher frequencies allow you to send more data, but struggle to propagate through objects. As an example, 10-12GHz frequencies really struggle with rain, which makes them poor choices for long-range wireless communication, but excellent for weather radar, since you can measure the reflections coming back off the precipitation. Take your home wi-fi as another example: 2.4GHz will go through walls fairly well, but 5GHz will usually give you better performance.
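To put a rough number on that frequency effect, here’s a quick illustrative sketch (in Python – not something from my day-to-day work) using the standard free-space path loss formula. It deliberately ignores walls, rain and antenna gains, so treat the figures as indicative only; the point is simply that, all else being equal, loss rises with frequency.

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) - 147.55 (d in metres, f in Hz)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

# Same 50m link, two common Wi-Fi bands: the 5GHz signal arrives roughly
# 6.4dB (a bit over 4x) weaker than 2.4GHz before any walls get involved.
for freq in (2.4e9, 5.0e9):
    print(f"{freq / 1e9:.1f}GHz over 50m: {fspl_db(50, freq):.1f}dB of path loss")
```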

Now, let’s consider the human body: we’re big, squishy, watery masses. The power required to push a high-frequency signal through us is quite high, and the loss rate is huge. We’re far more at risk from very low frequencies: indeed, if you stand right in front of a misconfigured bass box at a gig with some seriously powerful gear, it can quite literally vibrate your internal organs into mush (the spleen seems to be particularly affected – I’ll leave that one for the biologists to answer!). It’s why very low frequencies are now being used in medical practice to shatter gallstones and kidney stones, for example, as a potential alternative to surgery for high-risk patients.

That brings us back to power: those bass boxes I’m referring to above are pushing something in the region of 1000W+, not dissimilar to your microwave oven, so it’s no surprise they can do damage. By contrast, even the most powerful wireless deployments I’m involved with rarely push more than 5W EIRP @ 5.8GHz – a minuscule fraction of the power levels I’ve quoted above.
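Engineers usually compare power levels in dBm rather than raw watts, because the differences get so large. Here’s a back-of-the-envelope conversion as a sketch – the 1000W and 5W figures are just the ballpark numbers above, nothing more precise than that:

```python
import math

def watts_to_dbm(power_w: float) -> float:
    """Convert power in watts to dBm (decibels relative to 1 milliwatt)."""
    return 10 * math.log10(power_w * 1000)

bass_rig_w = 1000.0   # ballpark figure for a serious bass rig or microwave oven
ptp_link_w = 5.0      # 5W EIRP point-to-point link

print(f"1000W ≈ {watts_to_dbm(bass_rig_w):.0f}dBm")   # ~60dBm
print(f"   5W ≈ {watts_to_dbm(ptp_link_w):.0f}dBm")   # ~37dBm
print(f"That's a {bass_rig_w / ptp_link_w:.0f}x difference in raw power")
```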

But I digress. Bringing it back to mobile networks, those are cellular by design. That means our devices connect to their nearest ‘cell’, then automatically hand over to the next ‘cell’ as we move around, and so on. This is quite different from a TV transmitter, for example, which involves one big transmission station per region, with a whopping great antenna and power supply, broadcasting at fairly high power.

EM propagation is subject to the inverse square law: received signal strength falls off with the square of the distance from the transmitter, so doubling the distance cuts the received power to a quarter. If I’m 100m from a transmitter and I increase my distance to 400m (4x further away), the signal I receive is 16x weaker than it was at 100m.
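Here’s the same arithmetic as a quick sketch, assuming pure inverse-square behaviour with no obstructions or other losses:

```python
def relative_power(ref_distance_m: float, new_distance_m: float) -> float:
    """Received power at the new distance, relative to the reference distance,
    assuming inverse-square falloff."""
    return (ref_distance_m / new_distance_m) ** 2

print(relative_power(100, 200))  # 0.25   - double the distance, a quarter of the power
print(relative_power(100, 400))  # 0.0625 - 4x the distance, 1/16th of the power
```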

Why is that important? Because the more densely packed the ‘cells’ are within a given area, the lower the power required to transmit and receive. Look back to that big TV transmitter: it needs to transmit out to a range of – in some cases – hundreds of miles through all manner of atmospheric conditions, including rain, sleet, snow, etc. – all of which will have a degrading effect on signal propagation.

In practical terms, that means most ‘cells’ in a mobile network are transmitting at very low power levels by comparison to TV transmitters.

So the argument most frequently peddled by 5G (and before them, 4G) detractors that “more power will be needed” is fundamentally inaccurate – with more ‘cells’ in a given area, each cell can operate at a reduced power level and still achieve the same level of coverage.
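A rough sketch of why that’s the case, again assuming nothing more sophisticated than the inverse-square law (the 20W reference figure below is purely illustrative): to deliver the same signal level at the edge of a cell, the transmit power needed scales with the square of the cell radius, so halving the radius quarters the power.

```python
def tx_power_needed_w(cell_radius_m: float,
                      ref_radius_m: float = 1000.0,
                      ref_power_w: float = 20.0) -> float:
    """Transmit power needed to hit the same edge-of-cell signal level as a
    reference cell, assuming inverse-square falloff only (illustrative figures)."""
    return ref_power_w * (cell_radius_m / ref_radius_m) ** 2

# Denser cells, lower power: halve the radius and each cell needs a quarter of the power.
for radius in (1000, 500, 250):
    print(f"cell radius {radius:4d}m -> {tx_power_needed_w(radius):6.2f}W")
```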

Don’t believe me? You can test it yourself. Next time you go somewhere with little or no coverage on your mobile device, make a note of how much more quickly battery life drops: your device is having to increase its own transmit power in an attempt to either find a signal, or hang onto the weak signal it has.

So, now that we’ve addressed the fundamental misunderstandings about power, let’s look at frequencies. As I said at the outset, different frequencies can indeed affect the human body in different ways.

In the UK, the spectrum allocated for 5G deployments is the old analogue TV space around 700MHz. The data layer – 2.3GHz (recovered from the MoD) and 3.4-3.8GHz (recovered from amateur radio repeaters) – is also spectrum that’s been extensively used for decades.

So we’re not actually talking about anything that’s not been used before. If there were significant health concerns as a result of these frequencies being used, we’d have seen them long ago – remember, TV has been around for close to 70 years now, and TV transmissions were several orders of magnitude more powerful than any of the modern cell-based transmission systems.

Hopefully I’ve demonstrated that on both power and frequency, if there were legitimate health concerns, we’d have uncovered them long before now. 5G is just the latest target of the anti-tech bandwagon, driven by those who don’t understand (or have chosen to deliberately misunderstand) how EM propagation works in reality. And as I said at the outset, none of this is really much beyond A-Level physics.

Chris

IT Consultant, Network Engineer, Photographer, Audiophile.
