Energy Compensation and Counting Statistics

#1716
    jwet
    Participant

First of all, I think this is a nice project and I applaud what you’ve done. My intention here is to provide some constructive, if not entirely positive, feedback.

As a radiation monitoring instrument developer holding several patents, I can tell you that the sensor part of this instrument is pretty poorly thought through. The networking aspects and reporting infrastructure are great, but if you don’t start with solid data, your results won’t be accurate.

There are two major problems that I see with the front end of this instrument, and some others that I will set aside. The biggest problem is that the Geiger-Mueller tube is not energy compensated. It over-responds strongly to low-energy radiation; you can look at the spec sheet for the GM tube and see this easily. Though the tube is biased and measuring radiation, its raw output cannot be assigned any accepted units. If the entire network is built out of these uncompensated detectors, an argument could be made that the relative readings are still meaningful. But this assumes that the “source term” (energy mix) of the radiation doesn’t change, and that is not a good assumption.

The other issue is counting statistics: radiation measurement is affected by quantum effects and statistics, especially at low rates. The basis for this is that the standard deviation of a given count is equal to the square root of the total counts. For example, if your sample is 100 counts, the 1-sigma uncertainty is 10 counts (10 cts) and the 2-sigma uncertainty is 20 cts. For low levels of radiation, these counting statistics give rise to enormous errors. Radiation instruments need to account for this in the readings they report.
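To make the arithmetic concrete, here is a minimal C sketch (an illustration of the point only; the sample counts are arbitrary) that prints the 1-sigma and 2-sigma uncertainties for a few total counts:

```c
/* Poisson counting uncertainty: for a total of N counts, the one-sigma
 * standard deviation is sqrt(N), so the relative error shrinks only as
 * counts accumulate. Compile with: cc sigma.c -lm */
#include <math.h>
#include <stdio.h>

int main(void) {
    const double samples[] = { 5.0, 100.0, 1024.0 };
    for (int i = 0; i < 3; i++) {
        double n = samples[i];
        double sigma = sqrt(n); /* one-sigma absolute uncertainty */
        printf("N = %6.0f   1-sigma = %5.1f (%5.1f%%)   2-sigma = %5.1f (%5.1f%%)\n",
               n, sigma, 100.0 * sigma / n, 2.0 * sigma, 200.0 * sigma / n);
    }
    return 0;
}
```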

I would recommend some basic study of radiation measurement. The standard college text, “Radiation Detection and Measurement” by Glenn F. Knoll, covers the above and other more subtle topics.

    Laudable effort, great infrastructure, I’m just not sure what you’re reading.

Respectfully,

    #1717
    uRADMonitor
    Keymaster

    Hello John,

Your feedback is appreciated; anyone is free and welcome to express their thoughts and concerns. To serve those interested, and to save some of my time, I’ve presented most of the construction and technical details on the blog page, which might address some of your concerns: https://www.uradmonitor.com/blog

You will probably agree that the network’s main contribution comes from its ability to pinpoint trend changes, and this matters more than isolated absolute measurements, which would otherwise require scientific accuracy. The procedure is automated, uses little power, and rules out most possible human error.

Energy compensation is achieved by the thick metallic enclosure, which attenuates most incident low-energy radiation in the band where the non-linearity peaks. But if we are to be truly rigorous about this scenario, we will quickly conclude that natural sources of radiation sit well above the non-linearity interval. Allow me to quote from a different reply (I’ve received this question several times):
“The context for what we do here is extremely important: remember that uRADMonitor measures background radiation. This background is composed of cosmic radiation, of very high energy, and three main terrestrial components of background gammas: K-40 at 1461 keV, Bi-214 at 1764 keV from the U decay series, and Tl-208 at 2614 keV from the Th decay series. Smaller contributions to the total background come from Pb-212 at 239 keV, Bi-214 at 609 keV, Ac-228 at 911 keV and Bi-214 at 1120 keV. The most serious non-linearity of energy response in uncompensated GM tubes occurs below 150 keV – and most of that below 100 keV – so none of the background emissions listed above will be seriously affected, nor will the majority of cosmic rays.”
But regardless of such details, in the event of nuclear contamination there will be a consistent rising trend, clearly visible on multiple nodes in the affected locations, and that is exactly what is needed to inform us all that something is happening. I have stated several times that I make no claim of absolute precision for the readings, so all those interested are free to seek out more information or ask for a second opinion.

Finally, I’m not sure I understood your point about the standard deviation; specifically, what “quantum effects” are you referring to? I’d point out that a Geiger detector (like most other man-made tools) remains a very simple device, and it doesn’t make much sense to cover it with a veil of impenetrable complexity just to put it out of our reach.

    #1719
    jwet
    Participant

As I said, my intent is only to offer constructive help to improve your design, and I won’t make any further comments after this clarification. What you’ve done is very laudable.

On energy compensation: your packaging, a few mm of aluminum, does help with some of the low-energy over-response, but not very much. You listed a number of high-energy isotopes which are generally associated with background. However, if there were an accident at a nuclear power plant, for example, you would have a completely different mix of isotopes (the source term I mentioned before). Some important isotopes would be Kr-85, Xe-133, I-131, Cs-137 and perhaps Co-60; if you look at the spectra of these isotopes, you’ll find a lot of lower-energy contributions. The idea of energy compensation is to make a detector that will read in accepted units regardless of the source term.
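To put rough numbers on that aluminum wall, here is a back-of-the-envelope C sketch. The 2 mm thickness is assumed, and the mass attenuation coefficients are approximate values in the spirit of the NIST XCOM tables; treat it as an illustration, not reference data:

```c
/* Narrow-beam gamma transmission T = exp(-(mu/rho) * rho * x) through an
 * assumed 2 mm aluminum wall. The mu/rho values are approximate and for
 * illustration only. Compile with: cc atten.c -lm */
#include <math.h>
#include <stdio.h>

int main(void) {
    const double rho = 2.70;  /* aluminum density, g/cm^3 */
    const double x   = 0.2;   /* assumed wall thickness: 2 mm = 0.2 cm */
    struct { const char *line; double kev; double mu_rho; } lines[] = {
        { "Xe-133",  81.0, 0.200 },  /* approximate mu/rho for Al, cm^2/g */
        { "I-131",  364.0, 0.098 },
        { "Cs-137", 662.0, 0.075 },
    };
    for (int i = 0; i < 3; i++) {
        double t = exp(-lines[i].mu_rho * rho * x);
        printf("%-6s %5.0f keV: transmission through 2 mm Al ~ %3.0f%%\n",
               lines[i].line, lines[i].kev, 100.0 * t);
    }
    return 0;
}
```

Even at 81 keV, roughly 90% of the photons pass straight through, which is why a few mm of aluminum by itself cannot flatten the energy response.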

On counting statistics: I’m not trying to obfuscate or make things complex by using terms like “quantum effects”. All this means is that at low rates like background, radiation is quantized; it is not continuous. Imagine measuring very small quantities of rainfall: at some point you get down to “drops”, and these are the same as counts in radiation. If you measure 100 raindrops in some interval, the statistical standard deviation (one sigma) is the square root of this total count, or 10 drops. This means that about 68% of the time, if you measure 100 drops, the actual rainfall could be anywhere between 90 and 110 drops. At very low counts, like 5 counts, the one-sigma interval is the square root of 5, about 2.2, which is roughly a ±45% error. The standard for radiation detection is usually 2 sigma, a 95% confidence interval. This is not complicated, but it is real.
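If it helps, the raindrop picture is easy to check numerically. Here is a small C simulation (illustration only; the 100-count mean and the number of trials are arbitrary) that draws many Poisson counting intervals and confirms the scatter is close to sqrt(100) = 10:

```c
/* Monte Carlo check of the raindrop argument: draw many Poisson counting
 * intervals with a true mean of 100 counts and confirm that the scatter of
 * the measurements is close to sqrt(100) = 10. Compile: cc poisson.c -lm */
#include <math.h>
#include <stdio.h>
#include <stdlib.h>

/* Knuth's method: multiply uniforms until the product drops below e^-lambda.
 * Simple and adequate for modest lambda; illustration only. */
static int poisson_sample(double lambda) {
    double limit = exp(-lambda), p = 1.0;
    int k = 0;
    do {
        k++;
        p *= (double)rand() / ((double)RAND_MAX + 1.0);
    } while (p > limit);
    return k - 1;
}

int main(void) {
    const int trials = 100000;
    double sum = 0.0, sum_sq = 0.0;

    srand(42);
    for (int i = 0; i < trials; i++) {
        int n = poisson_sample(100.0);
        sum += n;
        sum_sq += (double)n * n;
    }
    double mean = sum / trials;
    double var = sum_sq / trials - mean * mean;
    printf("empirical mean  = %6.2f (true value 100)\n", mean);
    printf("empirical sigma = %6.2f (sqrt(N) predicts 10)\n", sqrt(var));
    return 0;
}
```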

A very simple way to implement some control over counting statistics is to wait for a given total number of counts and let the time vary. A common way to do this is to time the overflow of a 10-bit counter. For 1024 counts, the 2-sigma uncertainty would be about 6% – a reasonable measurement in finite time. At very low count rates it could take a long time to produce a reading, but any sooner reading would be statistically inaccurate. This is the tradeoff.
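A sketch of what that fixed-count scheme looks like, with the GM pulse train simulated as exponential inter-arrival times (the rates here are made up for illustration):

```c
/* Fixed-count method: instead of counting for a fixed time, wait for a
 * preset number of pulses (1024, i.e. a 10-bit counter overflow) and let
 * the elapsed time float. The relative 2-sigma uncertainty is then locked
 * at 2/sqrt(1024) ~ 6.25% regardless of the rate. Pulse arrivals are
 * simulated; all rates are illustrative. Compile: cc fixedcount.c -lm */
#include <math.h>
#include <stdio.h>
#include <stdlib.h>

/* Draw one exponential inter-arrival time for a Poisson pulse train. */
static double next_interval(double rate_cps) {
    double u = ((double)rand() + 1.0) / ((double)RAND_MAX + 2.0); /* (0,1) */
    return -log(u) / rate_cps;
}

int main(void) {
    const int preset = 1024;                         /* 10-bit overflow */
    const double true_rates[] = { 0.5, 5.0, 50.0 };  /* counts per second */

    srand(42);
    for (int i = 0; i < 3; i++) {
        double elapsed = 0.0;
        for (int n = 0; n < preset; n++)
            elapsed += next_interval(true_rates[i]);
        double rate = preset / elapsed;
        double two_sigma_pct = 200.0 / sqrt((double)preset); /* ~6.25% */
        printf("true %5.1f cps: measured %6.2f cps +/- %.2f%% (2-sigma), took %7.1f s\n",
               true_rates[i], rate, two_sigma_pct, elapsed);
    }
    return 0;
}
```

At 0.5 counts per second the 1024-count reading takes over half an hour: constant statistical quality, variable response time, which is exactly the tradeoff.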

Good luck with your project. I really like it. I think it could be improved with a few changes, and I don’t want to detract from what you’ve done; it’s really quite good.

    Constructively,
    John
