The recent atypical measurements raised a few questions about the nature of the events: a malfunction of the detector (specifically of the Geiger tube, due to its age) or a legitimate radiation phenomenon: http://www.pocketmagic.net/2013/01/urad ... n-reports/
Some additional information, even if partly redundant, should shed some light on the Geiger detector as used in the uRADMonitor system. For a start, here is the current circuit diagram:
The R7 10M anode resistor is a quality component, mounted correctly without fingerprints, dust or humidity. Should this resistor malfunction in any way, one result could be avalanche discharges in the Geiger tube. This resistor has been checked recently and it meets the requirements.
Resistor R5 is used in a voltage divider that lets the microcontroller continuously measure the voltage across the Geiger tube on one of its ADC (analog-to-digital) ports. In short, for 400V set on the tube, the voltage divider returns 3.96V ( 400V x 47/(4700+47) ). The microcontroller uses a reference voltage of 5V (Vref). The ADC port that measures the tube voltage has a resolution of 10 bits, so 5V on PC3/ADC3 would produce the maximum reading of 2^10 - 1 = 1023; for 3.96V we get a proportionally lower value. This makes the anode voltage measurement quite precise: one ADC step corresponds to roughly 0.5V on the anode.
Should the voltage on the tube be lower than the preset threshold (400V), the duty cycle of the inverter's PWM is increased in small steps until we reach the target voltage (within a given tolerance, initially set to 5V and now tightened to 2V). If the voltage on the tube is too high, we do the opposite and decrease the duty cycle. In short, this is a well-known regulation mechanism that works well, a fact confirmed by the logs, which show several months of constant tube voltage values.
Some time ago there was an issue with R5: after some use it "burned" and went open-circuit. As a result, the voltage measurement could no longer be performed, and the software saw the tube voltage as 0. The duty cycle therefore kept rising, resulting in uncontrolled high-voltage generation, way above the tube's safety limits. Luckily, this high voltage also caused a temporary microcontroller failure, which stopped the PWM generation and in turn kept the dangerous voltage from being applied to the tube (except for the first few milliseconds). R5 has since been replaced with a better-quality resistor.
I was recommended the following useful resources:
1. An investigation into the causes of short lifetimes of geiger-muller tubes used in aircraft oil gauging systems
2. Test Procedure for Geiger-Mueller Radiation Detectors
3. Geiger Mueller Counting
These valuable resources are a must-read, proving their utility especially for understanding failure causes that can result in avalanche discharges and erroneous readings. Because of the recent atypical radiation measurements recorded by uRADMonitor, I need to evaluate every possibility, including a malfunction of the tube.
"Test Procedure for Geiger-Mueller Radiation Detectors" (2) proposes a 15% slope as the cut-off between good and bad tubes: "In general, the value of the slope must be less than 15 % to consider that the detector is in good conditions." This seems a very permissive figure when you consider that really good new GM tubes have plateau slopes of less than 3%.
I ran a few measurements with the current setup. The SBM-19's operating interval is 350-475V. For my particular tube, I got unsatisfactory readings at 350V, but good performance in the 375-450V interval, despite its age.
The measurements were performed over several hours, to acquire sufficient data for computing average values.
Voltage on tube (V) / CPM (with natural background radiation)
349.33 / 65.88
375.64 / 78.62
399.65 / 79.98
424.71 / 80.18
449.51 / 79.68
Complete measurement details in attached PDF document:
To test the tube's performance indicator as presented in "Test Procedure for Geiger-Mueller Radiation Detectors" (2),
I set N1 = 78.62, V1 = 375.64V, N2 = 79.68 and V2 = 449.51V.
For P = 100 * ((N2 - N1)/(V2 - V1)) * (100 / ((N1 + N2)/2)) we get P ≈ 1.81%, well under the 15% limit proposed in the paper.
The device has been set to 375V (in software), as compared to the 400V used from October 2012 until now.