Radiation Monitoring - Everton Hills, Queensland Australia - FAQ


FAQ

Last updated: 22nd Nov 2017


Why did you build a radiation monitoring station?

Owing to recent (2017) political events, the chance of some sort of nuclear war is higher than it has ever been in my lifetime—and I am in my 50s and have lived through most of the Cold War. I believe that, while still unlikely, the chance of a bomb going off somewhere, probably through accident or stupidity, is non-negligible. Were this to happen, I believe it would be useful to have some form of fallout monitoring in place. There is not, to my knowledge, any official monitoring network in Australia, and I could easily imagine a deluge of misinformation and panic drowning out any useful information were something unfortunate to occur.

By building my own monitoring station I can at least get information that, if not as accurate as that from a professionally built station, is at least trustworthy (in the sense of being under my personal control) and relevant to my exact location.

This station uses Geiger tubes. A scintillation counter would be better, as these tell you what's actually in the fallout. However, a scintillation counter is much more expensive (thousands of dollars rather than a few hundred) and complex to set up. This station is, however, still fairly sensitive. In principle, with the default 1 hour filter, it can pick up changes in background level above +/- 1.5%, so it should give at least some idea of what's going on.
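
As a sanity check on that figure: with Poisson counting statistics, the relative scatter on N counts is 1/sqrt(N), so a 3-sigma detection threshold of 1.5% implies a particular total count rate. The numbers below are my own inference from the stated sensitivity, not a published specification for the station:

    # Back-of-envelope check of the +/- 1.5% claim, assuming Poisson
    # statistics and a 3-sigma detection threshold over a 1-hour filter.
    threshold = 0.015                    # +/- 1.5% at 3 sigma
    n_counts = (3 / threshold) ** 2      # counts needed per 1-hour period
    print(n_counts)                      # 40000.0 counts/hour
    print(round(n_counts / 3600, 1))     # ~11.1 counts/s across all tubes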

What do you mean by "Radiation"?

Throughout this page I have used the term "Radiation" for the purposes of brevity. Technically I am referring to ionising radiation, and more specifically gamma radiation above perhaps 20 keV. The Geiger tubes in this station will not detect alpha radiation and have an unknown (and probably low) sensitivity to beta radiation.

Ionising radiation is the unequivocally bad stuff you get from nuclear fallout. This station is not concerned with non-ionising radiation, such as one gets from WiFi, mobile phones etc.

What does the monitoring station consist of?

This station was custom designed and built. It currently consists of two (nominally) identical Geiger counters, in order to provide some level of redundancy.

[Photo: Radiation Monitoring Station]

Each counter uses a total of five SI-22G tubes and one SI-3BG. The pulses from each tube are recorded separately; this allows failed tubes to be detected and also provides the opportunity to cross-check readings: a sudden increase on one tube is most likely not real (i.e. a faulty tube or a statistical artifact), but an increase on all tubes simultaneously is most likely real. This is part of the reason why a total of ten SI-22Gs were chosen rather than a single, more sensitive, pancake-type tube. Another reason is that pancake tubes are more delicate and expensive, and there is a greater risk of them being damaged in transit when being shipped from the other side of the world.
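
As a rough illustration of the cross-checking idea (the function, names and threshold below are my own, not the station's actual firmware):

    # Hypothetical sketch: an excursion on a single tube is suspect,
    # an excursion on all tubes at once is probably real.
    def classify(counts, baselines, threshold=1.5):
        """counts/baselines: per-tube counts for the current and a
        reference period. Returns 'real', 'faulty-tube' or 'normal'."""
        elevated = [c > threshold * b for c, b in zip(counts, baselines)]
        if all(elevated):
            return "real"          # simultaneous rise on every tube
        if any(elevated):
            return "faulty-tube"   # isolated rise: fault or statistical fluke
        return "normal"

    print(classify([52, 49, 55, 50, 51], [50, 50, 50, 50, 50]))   # normal
    print(classify([150, 48, 52, 51, 49], [50, 50, 50, 50, 50]))  # faulty-tube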

The SI-3BG is a high-range tube used in Soviet-era military Geiger counters. This tube is useless at background levels—you get one click every five minutes or so. The reason for including it is just in case there is heavy fallout. One problem with Geiger tubes is that they suffer from an effect known as fold-back; beyond a certain level of radiation, the higher the radiation level, the lower the count rate. This can make them dangerously inaccurate at higher than expected rates. As the SI-22Gs are quite sensitive, this fold-back happens at a relatively low level, and while this level is still way above background levels, it could potentially be exceeded during heavy fallout.

Now the chance of heavy fallout in this part of the world is extremely low, but there are still scenarios where it may occur. For example, the nuclear-powered aircraft carrier USS Ronald Reagan recently visited Brisbane. Were its reactor to melt down with the wind blowing the wrong way, things could get very unpleasant here. Mainly though, I didn't want to knowingly design a monitoring station that was going to fall over just when it was needed the most.

The Geiger counter circuit is based on the Theremino Geiger adaptor, which is a simple yet stable and efficient design.

In addition, the monitoring station uses an Arduino Nano clone as a brain and an ENC28J60-based Ethernet board to communicate with this server. This is very cheap hardware, but seems so far to be adequate for the job. Data is logged in a database on this server and is retrieved by a custom-written program which displays it in graphical form on this site.
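
The logging side is conceptually very simple. A minimal sketch of what such a program might look like (this is an illustration of mine using SQLite, not the actual custom-written code):

    # Hypothetical server-side logger: store per-tube pulse counts
    # with a timestamp, one row per tube per logging interval.
    import sqlite3, time

    db = sqlite3.connect("radiation.db")
    db.execute("""CREATE TABLE IF NOT EXISTS counts
                  (ts INTEGER, tube INTEGER, pulses INTEGER)""")

    def log_counts(per_tube_pulses):
        ts = int(time.time())
        db.executemany("INSERT INTO counts VALUES (?, ?, ?)",
                       [(ts, i, n) for i, n in enumerate(per_tube_pulses)])
        db.commit()

    log_counts([51, 49, 53, 50, 52, 1])   # five SI-22Gs plus one SI-3BG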

Where did you get the Geiger tubes from?

The tubes are "New (old stock)", purchased from a Ukrainian seller (any-devices) on eBay. They were manufactured by the Soviet Union in the 1970s and 80s but are (supposedly) unused. Everything I have purchased from this seller has worked, although often shipping takes some time.

How do I interpret this data?

Currently, the normal background rate for this area is 0.12 +/- 0.01 μSv/h. Note that this is μSv/h, i.e. millionths of a sievert per hour.

Normally this rate will remain steady, with fluctuations mainly caused by statistical noise. You can get an idea of how much statistical noise is present by turning on the Error Level indications.

The Error Level indications consist of upper and lower lines on the graph. It may look as if they indicate maximum and minimum values, but in fact they are set at three standard deviations above and below the average value for the period. In English, this means that any change going outside these lines has a 99.7% chance of being a real change rather than statistical noise. The nature of the statistical noise on a Geiger tube means you can never be 100% sure that a change is real; however, by sampling a lot of data (which is what we try to do here), we can get very close.
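
For the curious, bands like this fall straight out of Poisson statistics. A sketch (my own illustration with made-up counts; the site's actual calculation may differ in detail):

    # Compute mean +/- 3 standard deviations for a series of counts,
    # assuming each period's count is Poisson distributed.
    import math

    def error_band(counts_per_period):
        mean = sum(counts_per_period) / len(counts_per_period)
        sigma = math.sqrt(mean)            # Poisson: variance = mean
        return mean - 3 * sigma, mean + 3 * sigma

    lo, hi = error_band([40210, 39980, 40105, 39890, 40300])
    print(round(lo), round(hi))   # ~39496 to ~40698, i.e. about +/- 1.5%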

The Error Level indications are helpful for periods up to a few days. Beyond that, long term drift makes them less useful. It would probably be possible to make these error level indications track the average to some degree so that they would be more useful over a longer period, however this has not been done (and needs to be approached with caution so as not to be misleading).

Under natural conditions, there are two main things I am aware of which may cause changes in the radiation level: changes in cosmic radiation and "radon washout events". The Bureau of Meteorology have a Guide to Space Radiation, which explains cosmic radiation (and provides a very useful summary of radiation in general, worth reading for this alone); however, it is not clear to what extent cosmic radiation will influence the readings at this site. It would seem likely that a large solar flare would be detectable.

Radon washout events are caused by radon and its daughter products, which are gradually released from the earth into the atmosphere, being washed out of the air by rain. Apparently this is more pronounced if there is heavy rain after a long dry spell. I believe a minor radon washout event may have occurred at about 8pm on 22nd Sept 2017 (only a couple of days after the station was first switched on), when the background rate increased by about 10% for a period of about an hour.

This is boring! Why is nothing happening?

Nearly all of the time, there will be nothing happening; the radiation level will look like a boring, virtually flat line, constant to within a couple of percent (and much of this variation will be statistical artifacts). This is normal, and it is a good thing.

If I knew that things would remain like this—generally flat with the exception of an occasional small peak due to radon washout—I would probably not bother with a monitoring station. There's not enough to see to be worth the effort involved. However, if something does happen, it is important to have a good amount of data from before the event for comparison purposes.

So currently the monitoring station can be thought of as performing the unglamorous but necessary role of establishing a baseline.

The level is high! Should I panic?

In terms of actual dose, even a quite substantial increase in this background level is (in my opinion at least) not in itself harmful. Elsewhere in the world, people live with apparent impunity in areas where the naturally occurring background levels are up to ten times as high as the level here. However, the point is that if the level is trending upwards significantly, there must be some cause.

Most likely the cause will be a radon washout event. Radon washout events have been part of the natural environment pretty much forever and have not been shown to be harmful. Some of the radon daughter products involved in such an event are alpha emitters, which are not picked up by this station, however alpha emitters are only a problem when taken internally, so getting rainwater on your skin during a washout event isn't a problem.

I suspect that radon washout events actually decrease a person's overall radiation dose, because they wash out of the air radon daughter products like polonium which would otherwise be breathed in. The main way alpha emitters like polonium hurt you is through being breathed into the lungs and in fact this is a significant component of the background dose to which we are all subject.

But what if it's not radon washout?

If the level rises more than, say, 20% above the usual value (i.e. more than about 0.15 μSv/h), there is something unusual happening. You will need to use your judgement with this; if there are storms about in the Brisbane area it could still be a radon washout (I don't as yet have enough data to determine how far these things can go). If there are solar flares mentioned on the news, this could be the cause, and of course if there is reason to believe that fallout from some event may be on the way, then this might be what you're seeing.

Bear in mind that the chance of a nuclear disaster affecting this part of the world is very low and the chance of fallout reaching us before such a disaster was in the news is even lower. Consequently it is more likely to simply be a fault in the counter, or some localised event (maybe someone who has been treated with medical radioisotopes is standing near the detector or the guy next door has done some landscaping with rocks with monazite in them etc. etc.). I will try to post a notification if something anomalous happens. If you are concerned, your priority should be to check other monitoring stations; this should give a better idea if the event is real, and if so, what areas are being affected.

OK, but what levels are actually dangerous?

If you are genuinely interested in knowing about the dangers of ionising radiation, the web is one of the worst possible places to find this information. You have every extreme from those claiming that almost undetectably small doses of radiation have extraordinarily devastating effects, to those who claim that doses which are on the borderline of causing acute radiation poisoning are actually health-giving. And of course everyone claims that everyone else is either a fanatical tree-hugger seeking to have us all live in caves or an evil member of the military/industrial complex bent on destroying the world.

However, it turns out there is actually objective information out there. You should try to find a copy of ICRP Publication 103. ICRP stands for the International Commission on Radiological Protection, an independent (non-government, non-industry) commission of medical professionals that has been around since 1928. That's not a typo; by 1928 people had been burning themselves with X-ray machines and getting poisoned by radium long enough that things needed to be done. They view things from a medical perspective and their purpose is to weigh up the positives (X-rays, radiotherapy) against the negatives. This is in contrast to organisations such as the IAEA (International Atomic Energy Agency) and ANSTO (Australian Nuclear Science and Technology Organisation), whose stated aims include the promotion of nuclear technology and which therefore have something of a conflict of interest when it comes to safety.

ICRP Publication 103 is not distributed free of charge, but with some searching it is possible to find copies on the web (it's best I don't link them owing to copyright issues). If you can't find ICRP Publication 103, one document that is freely available is BEIR VII Phase 2. This is useful and objective and in fact the ICRP publication itself uses information from this report.

A problem here is that these publications are both lengthy (300-400 pages) and contain a lot of detail, and it's super annoying to ask the question "is X dangerous?" and have to spend an hour reading only to find out the answer is "it depends".

A rough guide:

The range of 0.1 to 0.2 μSv/h is normal for Australia. Some places might go to 0.25 μSv/h (different figures are quoted by different people and I have been unable to find any definitive data). A few specific places (generally near mines and mineral outcrops) will be higher; e.g. if you visit a place like Radium Hill in South Australia, you can assume the level will be somewhat higher than average.

Some places in the world have natural background levels of up to 1 μSv/h. Such places are permanently habitable without apparent ill effect. My understanding is that anything under about 1.5 μSv/h is suitable for permanent habitation; this is based on ICRP data, although they do not make any specific recommendations on this.

Above this level, it starts getting complex because the duration of exposure comes into play. For instance, the 1.5 μSv/h figure above is based on the total dose over a lifetime. So a 40-year-old person (who is unlikely to live more than another 40 years) could probably tolerate a dose rate of closer to 3 μSv/h for the rest of their life. However, you would also have to factor in what that person had been exposed to in the past.
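
The arithmetic behind this is straightforward (my own back-of-envelope calculation, assuming an 80-year lifetime; this is not an ICRP figure):

    # The same lifetime total can come from different rates and durations.
    HOURS_PER_YEAR = 24 * 365.25

    def lifetime_dose_sv(rate_usv_per_h, years):
        return rate_usv_per_h * HOURS_PER_YEAR * years / 1e6

    print(round(lifetime_dose_sv(1.5, 80), 2))   # ~1.05 Sv over a full lifetime
    print(round(lifetime_dose_sv(3.0, 40), 2))   # ~1.05 Sv over 40 remaining years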

Another factor when considering long-term issues is that radioisotopes decay over time. Most naturally occurring radioisotopes (uranium, thorium and potassium-40) have extraordinarily long half-lives and don't decay noticeably over a human lifetime. Even short-lived natural radioisotopes (radon and its daughter products) are created by the decay of uranium and thorium and so are maintained at relatively constant levels. Man-made contamination, however, often consists of radioisotopes with comparatively short half-lives.

So moving into a contaminated area with a high background level, with the intent of living there permanently, might be OK if it's known that the level will decay quickly enough that the total dose over time remains low.

I think it would be fair to say that levels between 1 and 10 μSv/h are not especially dangerous. We are exposed to levels in this range when we fly (I have measured up to 3.5 μSv/h in an aircraft in flight). You could live in an area with such a rate for an extended period (although ideally not permanently) and your main concern would not be the background level itself, but more if you were consuming contaminated food and water.

I probably need to stress this point. Except for a very few locations, a background level above 1 μSv/h would be due to man-made contamination. In this case, the external background radiation (i.e. what we are measuring at this monitoring site) may be less important than the effects of taking this contamination into your body; either through food or water, or (particularly importantly for some contaminants) breathing it in. To make sensible decisions on safety, you need to know exactly what contamination is present. A private individual would be unlikely to have sufficient information to make this judgement (and most certainly this monitoring site does not provide this information).

For what it is worth, I do not believe that an isolated nuclear disaster in the Northern hemisphere, such as a reactor meltdown or an accidental bomb detonation, would send the levels in this part of the world even close to 1 μSv/h. Quite possibly it would be barely detectable by this monitoring station. Why did I build the station if this is the case? Because (a) I might be wrong and (b) a larger disaster, or one closer to home, is not impossible.

From 10 to perhaps 100 μSv/h, you start to get into the territory of chronic radiation poisoning if you live there long enough. At the low end of this range, this takes a period of decades, but as the rate increases the bad effects occur proportionately sooner. At these rates, both internal dose, and decreased exposure due to radioactive decay over time, would be important considerations to factor in. This can get complicated, because higher radiation levels are often associated with short-lived radionuclides, so an area with initially higher radiation levels might actually be less dangerous in terms of chronic radiation poisoning if the levels are going to decay quickly. You really need to know what type of contamination is present to make a judgement.
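
A toy comparison makes the point (the isotopes and numbers here are my own choices for illustration; real contamination is rarely a single isotope):

    # Integrated dose over 10 years of residence: a hotter but
    # short-lived caesium-134 area versus a cooler but long-lived
    # caesium-137 area.
    import math

    def integrated_dose_msv(rate0_usv_h, half_life_years, years):
        lam = math.log(2) / half_life_years    # decay constant, per year
        hours = 24 * 365.25
        # integral of rate0 * exp(-lam * t) dt, from 0 to `years`
        return rate0_usv_h * hours * (1 - math.exp(-lam * years)) / lam / 1000

    print(round(integrated_dose_msv(40, 30.2, 10)))   # Cs-137: ~3130 mSv
    print(round(integrated_dose_msv(80, 2.06, 10)))   # Cs-134: ~2010 mSv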

Apparently South Australian regulations require that areas accessible to the public do not exceed 25 μSv/h. [It is not clear whether any places in South Australia exceed this level; I believe that even the former nuclear test sites at Maralinga are now below it.] Roughly speaking, hanging around in an area with this rate for a fortnight gives you the equivalent of one whole-body CT scan, so spending long periods in such a place is inadvisable, but a short visit shouldn't be a big deal.

Above these levels (100+ μSv/h) it's a case of definitely keeping away unless you know exactly what's going on. You can assume the cause is man-made contamination spread about the place and you would need to take precautions against this (masks, protective clothes etc.). And, by the way, protective clothing prevents you from becoming contaminated but cannot block gamma rays.

When we start looking at levels of 1 mSv/h (1,000 μSv/h) and above, we're talking about places where you cannot hang around for any significant length of time; you need to either retreat into a shelter or evacuate.

According to Nuclear War Survival Skills [C H Kearny 1987]: "6 R per day [2,500 μSv/h or 2.5 mSv/h] can be tolerated for up to two months without losing the ability to work". As the name of the work suggests, this applies to war-time conditions where there's the imminent possibility of death, and I suppose in that case you do what has to be done and hope for the best. But this is what you'd have to call a heroic dose of radiation, appropriate only in the most extreme of emergencies. The total dose you would receive after 2 months exposure to 6 R/day is around 3.6 Sv, which stands a significant chance of being fatal.

In a civilian context, you probably wouldn't want to remain in an area with a rate of 2.5 mSv/h for more than a few hours (you would be getting the equivalent of a CT scan every 3 hours).

Much higher levels than this are possible. If you're in the direct path of heavy fallout from a nuclear weapon, the level could theoretically reach over a thousand times this amount (10+ Sv/h) during the first few hours. This would be enough to overload even the high-range SI-3BG tube in the monitoring station (quite probably to the extent of physically damaging it), although this is an academic problem since it's inconceivable there would be power and Internet available under these conditions.

A (fairly meagre) silver lining to fallout like this is that the radioisotopes that make it so active are short-lived (in fact, these two things go together). The theory is that if fallout starts off at 10 Sv/h, it will drop to 1 Sv/h after only 7 hours, to 0.1 Sv/h after 48 hours and to 0.01 Sv/h (10 mSv/h) after a fortnight. Unfortunately, I believe this only applies to fallout from a bomb; fallout from a reactor meltdown will not reduce so quickly because it contains a greater proportion of long-lived isotopes, and nuclear waste is pretty much all long-lived. A nuclear waste spill that starts out at 10 Sv/h will probably still measure close to 10 Sv/h a year later, unless the waste gets covered over or physically removed by rain etc.
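
These figures follow the classic "7:10 rule" for bomb fallout (the dose rate drops tenfold for every sevenfold increase in time), which is equivalent to the rate falling as roughly t^-1.2. A sketch, using the 10 Sv/h starting point from the example above:

    # The 7:10 rule as a power law: rate(t) ~ t^-1.2 (bomb fallout only).
    def fallout_rate(t_hours, rate_at_1h=10.0):
        return rate_at_1h * t_hours ** -1.2

    for t in (1, 7, 48, 14 * 24):
        print(t, round(fallout_rate(t), 4))
    # 1 h: 10.0, 7 h: ~0.97, 48 h: ~0.096, 336 h: ~0.0093 Sv/h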

What about Smart Meters?

The idea that smart electricity meters emit some sort of harmful radiation is a myth put out by anti-environmentalists to frustrate energy conservation measures.

What other radiation monitoring sites are there?

Some relevant sites (from nearest to furthest away) are:

What limitations does this monitoring have?

There are no standards for radiation monitoring stations that I am aware of, except that they are often located 1 m above ground level. This station is somewhat lower than 1 m; this is due to physical limitations at the site, including the need to keep it out of direct sunlight as the SI-22G tubes have a relatively low temperature rating. The type and quantity of tubes is also probably unique (although I believe the site in WA mentioned above may have used a single SI-22G). This means that the readings at this site are not directly comparable to other sites.

There are some other limitations to this monitoring which it is worth mentioning:

Energy Compensation

Geiger tubes are over-sensitive to low energy gamma radiation. To clarify: "low energy" means that the gamma-ray photons are individually of low energy (e.g. 50 keV* rather than 500 keV), not that the overall dose rate is low. You can think of it this way: different radioisotopes emit gamma rays of different energies, and a Geiger tube over-counts some of these.

[* = As this is the Internet, someone may try to score points by claiming that gamma rays with an energy less than 100 keV should really be referred to as X-rays. I am aware of this definition but consider it to be an unhelpful artificial distinction which I choose to ignore.]

The recognised way to fix this is to partially shield the tube with a substance that blocks the lower energy gamma rays. The best shielding material is apparently tin foil(!); but this has to be the actual metal tin—not kitchen-type aluminium foil—and it has to have a very specific thickness, which depends on the characteristics of the tube. Unfortunately, I don't have sufficient information to determine what thickness is required and foil of this type is expensive and difficult to obtain. Therefore I have not installed any energy compensation on these tubes.

Some would say that this means it's invalid to display a dose rate on this site. This is technically true, but may not be much of a problem in practice. The natural radioisotopes that produce the bulk of the background radiation have relatively high energy gamma rays and will be correctly accounted for. With fallout, it's more complicated because different things may be present depending on its origin and how old it is. Some of the more notorious components of fallout—iodine-131 and caesium-137—have high energy gamma rays, so will also be correctly accounted for (in fact, many Geiger tubes are calibrated against caesium-137). If there is a slight tendency to over-count due to low energy gamma emitters like americium-241 (which is fairly nasty and might be present if a reactor melts down), I don't think that under the circumstances you should complain.

I display an estimated dose rate by default simply in order to have a figure that means something. Displaying a count rate (and I have provided this as an option for the purists) is technically correct, but isn't very meaningful.

Alpha Radiation

Alpha radiation is not very penetrating and is easily blocked by most materials. In order to detect it with a Geiger tube, it is necessary for this tube to incorporate a thin "alpha window" to allow the alpha particles into the interior of the tube. This alpha window is usually made from an extremely thin and fragile piece of mica, which renders the tube vulnerable to damage. I have heard that it is thin enough to be punctured by a blade of grass, although I cannot vouch for this personally.

Owing to the expense and fragility of alpha-sensitive tubes, I have not used them in this station.

I believe it's legitimate not to use an alpha-sensitive tube. Alpha radiation really needs to be treated differently to beta and gamma since it does not contribute to a person's external dose (which is what this station is reporting) and only becomes an issue when breathed in or swallowed. Logically, you would need to quantify alpha emitters by concentration in the air (e.g. Bq/m³) and report this separately.

Such monitoring is feasible (it's how radon detectors work) and I may set up such monitoring in future.

Dead-time compensation and fold-back

Geiger tubes suffer a phenomenon known as fold-back at high levels of radiation. In simple terms, the tube is pulsing so fast that often one pulse has not finished before the next starts. The electronics cannot detect this as a separate pulse, and hence pulses are lost. This means that as the radiation level increases, the pulse rate of the tube increases less than in proportion. Beyond a certain level, the pulse rate actually decreases as the radiation level increases; this is known as fold-back.

So we have two issues; the reduction in pulse rate as radiation level increases, which needs to be compensated for or it will lead to inaccuracy, and actual fold-back, which can't be compensated for and needs to be detected.

The former is known as "dead-time compensation", because you are compensating for the time the tube is "dead" (i.e. already producing a pulse and unable to produce another one). There is a standard formula for dead-time compensation that gets bandied about on the web; however, it turns out that this formula is a simplification and only works at low count rates. It compensates only for "non-paralyzable" dead-time, whereas a Geiger tube has both paralyzable and non-paralyzable dead-time. There is apparently no analytical solution for compensating for paralyzable dead-time, so you have to solve it using iterative numerical methods. This is bad enough (although doable), but it also requires knowing tube parameters that can only be derived by complicated experiments that it's not possible for a private individual to perform.
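
To make this concrete, here is what the two models look like side by side. The dead-time value is an assumed figure of mine for illustration, not a measured SI-22G parameter:

    # Non-paralyzable correction has a closed form; the paralyzable
    # model must be inverted numerically (here by fixed-point iteration).
    import math

    TAU = 150e-6   # assumed dead time in seconds (illustrative only)

    def nonparalyzable(measured_cps, tau=TAU):
        """The standard textbook formula: true = m / (1 - m * tau)."""
        return measured_cps / (1 - measured_cps * tau)

    def paralyzable(measured_cps, tau=TAU, iterations=50):
        """Invert m = n * exp(-n * tau); converges to the lower
        (pre-fold-back) root."""
        n = measured_cps
        for _ in range(iterations):
            n = measured_cps * math.exp(n * tau)
        return n

    print(round(nonparalyzable(1000)))   # ~1176 cps true rate
    print(round(paralyzable(1000)))      # ~1197 cps true rate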

My intent is to compensate based on a (somewhat dodgy) graph of count versus radiation level that appears to apply to this tube, probably using simple linear interpolation. At the time of writing this has not been done.

I have approached the fold-back problem by installing the SI-3BG tubes; at high rates we switch to using the SI-3BGs and ignore the SI-22Gs. Of course an SI-3BG will experience fold-back eventually, but at such a high level that it is highly unlikely that the rest of the system would still be working.
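
In outline, the switch-over logic amounts to something like this (both constants below are placeholders of mine, not values from the actual station):

    # Use the sensitive SI-22Gs at normal rates; fall back to the
    # high-range SI-3BG once the SI-22Gs approach fold-back.
    SWITCH_CPS = 500.0     # assumed rate where SI-22G readings become suspect
    SI3BG_SCALE = 1500.0   # assumed SI-22G : SI-3BG sensitivity ratio

    def best_estimate(si22g_cps, si3bg_cps):
        if si22g_cps < SWITCH_CPS:
            return si22g_cps, "SI-22G"
        return si3bg_cps * SI3BG_SCALE, "SI-3BG"

    print(best_estimate(12.0, 0.005))   # background: trust the SI-22Gs
    print(best_estimate(800.0, 2.0))    # heavy fallout: trust the SI-3BG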

Calibration

This is quite problematic. There is little data available on these Geiger tubes, and some of it is misleading. A sensitivity figure of 540 counts/μR is often seen for the SI-22G; it appears that everyone is quoting a single (as yet unidentified) source for this, and the figure is wrong by about a factor of two. There is a data sheet for what is supposedly a modern equivalent of this tube (the Gamma-11C) which quotes 205-221 counts/μR (specified as count/s per μR/h, but actually count/s per μR/s), but it is not clear at what rate this ratio was measured. There is a graph, but it only starts at a point considerably above background levels.

There is also the separate can of worms involved in converting R (roentgens) to Sv (sieverts), which, strictly speaking, can't be done precisely. In addition, any Soviet-era documents might be using the GOST definition of the roentgen, which is about 5% different from that used in the West. I have used the conversion 100 R = 1 Sv, which is a horrible oversimplification, but one which is commonly used elsewhere.

In addition, regardless of the data-sheet values, the actual tubes vary in sensitivity by up to about 2% from each other.

The obvious solution in this case would be to calibrate each tube against a known source or meter. The problem is that I have neither. Consumer-grade Geiger counters use low-sensitivity tubes, so the statistical error can be significant even if the counter has been properly calibrated; and getting a counter which has a reasonable assurance of actually having been calibrated at all requires purchasing one of the more expensive types (> $1000).

As for obtaining a radioactive source, I have absolutely no desire to obtain such a thing. My problem is not with the radiation, but with the politics: the security forces in this country are not renowned for their technical knowledge (and are led by a guy who is frankly a malicious idiot), and I can easily imagine that possession of even a tiny, harmless source might lead to some sort of extreme overreaction. I'm probably already on a list somewhere for purchasing the Geiger tubes.

So what I've done is to take the measurements made by Radu Motisan on his blog PocketMagic and use these to derive a conversion factor that is valid at background levels. This gives results which tally closely with the Caloundra monitoring station. Later, I will use the graphical data referred to above to extend this to higher levels.
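
The conversion itself is then a simple multiplication. A sketch (the factor below is a placeholder of roughly the right magnitude, not the value actually derived from those measurements):

    # Convert an averaged count rate to an estimated dose rate; the
    # factor embodies the crude 100 R = 1 Sv equivalence discussed above.
    USV_H_PER_CPM = 0.00277   # hypothetical per-tube conversion factor

    def dose_rate(total_cpm, n_tubes=5):
        return (total_cpm / n_tubes) * USV_H_PER_CPM

    print(round(dose_rate(215), 3))   # ~0.119 uSv/h, around local background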