
doi: 10.1063/1.3275658
The development of high-speed, high-performance gamma-ray spectroscopy algorithms is critical to the success of many automated threat detection systems. In response to this need, a proliferation of such algorithms has taken place, and with this proliferation comes the necessary and non-trivial task of validation. There is (and always will be) insufficient experimental data to determine the performance of spectroscopy algorithms over the relevant factor space with any reasonable precision. In the case of gamma-ray spectroscopy, there are hundreds of radioisotopes of interest, which may occur in arbitrary admixtures; many materials of unknown quantity may lie in the intervening space between the source and the detection system; and the detector systems themselves exhibit irregular variations. All of these factors and more should be explored to determine algorithm/system performance. This paper describes a statistical framework for the performance estimation and comparison of gamma-ray spectroscopy algorithms. The framework relies heavily on data of increasing levels of artificiality to sufficiently cover the factor space, and at each level rigorous statistical methods are employed to validate performance estimates.
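The abstract does not give an implementation, but the kind of per-cell performance estimation it describes can be illustrated with a minimal Python sketch: sweep a small factor grid (isotope × shielding thickness), run many synthetic trials per cell, and attach a rigorous binomial confidence interval to each estimated detection probability. Everything here is hypothetical; `algorithm_detects`, the detection rates, and the attenuation constant are invented stand-ins for an actual algorithm evaluated on simulated or measured spectra.

```python
import math
import random
from itertools import product

Z_95 = 1.959963984540054  # two-sided 95% normal quantile

def wilson_interval(successes, trials, z=Z_95):
    """Wilson score confidence interval for a binomial proportion."""
    if trials == 0:
        return (0.0, 1.0)
    p = successes / trials
    denom = 1 + z**2 / trials
    center = (p + z**2 / (2 * trials)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / trials + z**2 / (4 * trials**2))
    return (max(0.0, center - half), min(1.0, center + half))

def algorithm_detects(isotope, shielding_cm, rng):
    """Hypothetical stand-in for a spectroscopy algorithm run on one
    synthetic spectrum; a real study would inject simulated or measured
    spectra for each factor combination."""
    true_rate = {"Cs-137": 0.95, "Co-60": 0.90, "HEU": 0.70}[isotope]
    attenuated = true_rate * math.exp(-0.05 * shielding_cm)  # toy attenuation
    return rng.random() < attenuated

rng = random.Random(42)
isotopes = ["Cs-137", "Co-60", "HEU"]
shielding = [0.0, 2.0, 5.0]  # cm of intervening material
trials_per_cell = 500

# Estimate P(detect) per factor-space cell with a 95% confidence interval.
for iso, shield in product(isotopes, shielding):
    hits = sum(algorithm_detects(iso, shield, rng) for _ in range(trials_per_cell))
    lo, hi = wilson_interval(hits, trials_per_cell)
    print(f"{iso:7s} shield={shield:4.1f} cm  "
          f"P(detect)={hits / trials_per_cell:.3f}  95% CI=({lo:.3f}, {hi:.3f})")
```

In a full validation campaign of the sort the paper outlines, the factor grid would also span admixtures, source strengths, and detector variations, and cells evaluated on highly artificial data would be cross-checked against the sparser experimental data at each level of realism.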
