Radiation image computation
An overview of the data pipeline from detector interactions to MR rendering is shown in Figure 2. The H2DPI system was set up on an optical breadboard to ensure repeatability; a more detailed description of the detector architecture can be found in the Methods section.

Overview of the data processing from particle interactions to the MR display. (a) Radiation interactions are observed in the H2DPI detector system via SiPM arrays connected to a digitizer. (b) Acquired waveforms are sent to the acquisition computer at regular intervals via an optical cable, where they are filtered and transmitted over WiFi to the HoloLens2. (c) The HoloLens2 displays the data in the user's field of view. Single events (d) are used to estimate the total count rates in each detector (i.e., including both neutrons and gamma rays), which exhibit a characteristic shape depending on the azimuthal position of the source (e). The average count rates of each detector are given as input to a pre-trained neural network (f) that predicts the azimuth of the source, which is shown to the user as an arrow. Double scatter events in two different detector volumes (g) allow the reconstruction of the incidence cone of the detected particle with opening angle \(\alpha\). Particle classification of individual events in the organic glass scintillators (OGS) via pulse shape discrimination (h) enables the differentiation of neutron-neutron and gamma-neutron incidence cones. A separation line between the pulse shape populations (white) is found by fitting a double Gaussian to each light-output slice and taking the valley as the decision threshold. Using a back projection algorithm, the cones are superimposed to create the radiation image (i) in angular space (azimuth \(\theta\), altitude \(\phi\)) for either gamma rays or fast neutrons. The image is then shown to the user as a 3D projection in MR.
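For illustration, the valley-finding step described in panel (h) can be sketched as follows. This is a minimal reconstruction, assuming a one-dimensional PSD metric (e.g., a tail-to-total ratio, which is an assumption, not the authors' stated metric) histogrammed within one light-output slice, with scipy performing the double Gaussian fit; it is not the authors' production code.

```python
import numpy as np
from scipy.optimize import curve_fit

def double_gaussian(x, a1, mu1, s1, a2, mu2, s2):
    """Sum of two Gaussians: the gamma-ray and neutron PSD populations."""
    return (a1 * np.exp(-0.5 * ((x - mu1) / s1) ** 2)
            + a2 * np.exp(-0.5 * ((x - mu2) / s2) ** 2))

def psd_valley(psd_values, bins=100):
    """Fit a double Gaussian to the PSD histogram of one light-output
    slice and return the valley between the peaks as the threshold."""
    counts, edges = np.histogram(psd_values, bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    # Initial guess: one peak in the lower and one in the upper range.
    p0 = [counts.max(), np.percentile(psd_values, 25), np.std(psd_values) / 2,
          counts.max() / 2, np.percentile(psd_values, 75), np.std(psd_values) / 2]
    popt, _ = curve_fit(double_gaussian, centers, counts, p0=p0, maxfev=10000)
    # The valley is the minimum of the fitted curve between the two means.
    lo, hi = sorted((popt[1], popt[4]))
    x = np.linspace(lo, hi, 1000)
    return x[np.argmin(double_gaussian(x, *popt))]
```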
The H2DPI consists of a suite of detectors optimized for imaging, in particular a lattice arrangement of 12 organic glass scintillator rods for fast neutron and gamma-ray detection and 8 CeBr3 inorganic scintillators for photoelectric gamma-ray detection. Radiation images are computed from double scatter events in the H2DPI (Fig. 2g), i.e., all waveforms are filtered into event pairs that occur within 30 ns of each other in two different detector volumes. These double events are then classified via their single-event waveforms into gamma-ray or fast neutron events4 (Fig. 2h) and filtered to retain only the neutron-neutron and gamma-neutron doubles. Each double scatter event has an associated estimated incidence cone; the cones are superimposed to create what is called a simple back projection image3. Using neutron-neutron or gamma-neutron events, a neutron and/or gamma-ray image can be cumulatively computed over time in angular space (Fig. 2i).
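A minimal sketch of the coincidence filtering step is shown below. It assumes events arrive as flat arrays of timestamps and detector IDs (a hypothetical event format); the actual H2DPI pipeline may handle pile-up and triple scatters differently.

```python
import numpy as np

def find_doubles(times_ns, det_ids, window_ns=30.0):
    """Pair time-sorted single events into double scatter candidates:
    two interactions within `window_ns` of each other occurring in two
    *different* detector volumes. Simplification: only consecutive
    events are paired, i.e., triples and pile-up are ignored."""
    order = np.argsort(times_ns)
    t = np.asarray(times_ns)[order]
    d = np.asarray(det_ids)[order]
    doubles = []
    for i in range(len(t) - 1):
        if t[i + 1] - t[i] <= window_ns and d[i] != d[i + 1]:
            doubles.append((order[i], order[i + 1]))  # original indices
    return doubles
```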
Digital waveforms are transmitted from the detector system (Fig. 2a) in varying batch sizes (usually set to write a batch approximately every 2 seconds) to the acquisition computer (Fig. 2b). Here, the events are filtered for double scatters and the corresponding cones are computed. The individual cones are then made available on a locally hosted server that the HoloLens2 can access over the wireless network (Fig. 2c). Sending individual cones instead of images was found to reduce overall network strain, allowing a consistent data flow from the detector to the HoloLens.
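The locally hosted cone server could look roughly like the following sketch, here using only the Python standard library and a hypothetical `/cones?since=N` endpoint so the HoloLens fetches only cones it has not yet seen; the text does not specify the authors' actual server implementation.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical in-memory buffer the acquisition pipeline appends to;
# each entry: {"apex": [x,y,z], "axis": [x,y,z], "alpha": ..., "kind": "nn"|"gn"}
CONE_BUFFER = []

class ConeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve all cones accumulated since the index in the query string,
        # e.g. GET /cones?since=42 returns cones 42 onward as JSON.
        since = int(self.path.split("since=")[-1]) if "since=" in self.path else 0
        body = json.dumps(CONE_BUFFER[since:]).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8000), ConeHandler).serve_forever()
```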
Viewing real-time radiation data in MR
MR rendering on the HoloLens2 is achieved through several steps using an application written in the Unity game engine25 that runs on the HoloLens. First, the HoloLens must be provided with the detector location. 3D spatial mapping is a core feature of the HoloLens, and its API provides the spatial map of the surroundings to the developer. We also use the HoloLens API that allows QR codes to be scanned in space to locate specific 3D coordinates within a given spatial map. A QR code is affixed to the H2DPI surface at a known position relative to the center of the detector system (see Fig. 3a,d), enabling the computation of the reference frame location.
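The reference frame computation amounts to a rigid transform from the tracked QR pose to the detector center. The visualizer itself is written in Unity/C#; the following numpy sketch, with an assumed offset vector, only illustrates the arithmetic.

```python
import numpy as np

# Known offset of the detector-system center relative to the QR code,
# expressed in the QR code's local frame (hypothetical values, meters).
QR_TO_DETECTOR_OFFSET = np.array([0.00, -0.15, 0.10])

def detector_center_world(qr_position, qr_rotation):
    """Given the QR code pose reported by the HoloLens QR tracking
    (world position and a 3x3 rotation matrix of the code's local axes),
    return the detector-system center in world coordinates."""
    return np.asarray(qr_position) + np.asarray(qr_rotation) @ QR_TO_DETECTOR_OFFSET
```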
In parallel, the HoloLens uses the local WiFi network to query the server hosted on the acquisition computer to retrieve detector data. After receiving the cones, the visualizer performs a simple back projection to generate the radiation image texture in the H2DPI frame. Pixel shaders are then used to render the image onto the spatial mesh. For each pixel on the spatial mesh, the screen pixel position is converted into reference frame coordinates, then into H2DPI-centered spherical coordinates, which are used to sample the radiation image. A color map is applied to convey intensity, creating a colored overlay on the spatial mesh (which is otherwise transparent). More intense coloring of the mesh indicates a more intense radiation image in that spatial direction.
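The per-pixel sampling logic can be sketched as follows, in Python rather than shader code and with an assumed equirectangular layout of the radiation image texture.

```python
import numpy as np

def sample_radiation_image(world_point, detector_center, image):
    """Mimic the pixel shader: convert a spatial-mesh point into
    detector-centered spherical coordinates and sample the back
    projected image (indexed as [altitude_bin, azimuth_bin])."""
    v = np.asarray(world_point) - np.asarray(detector_center)
    azimuth = np.arctan2(v[1], v[0])                 # theta in [-pi, pi]
    altitude = np.arcsin(v[2] / np.linalg.norm(v))   # phi in [-pi/2, pi/2]
    n_alt, n_az = image.shape
    i = int((altitude + np.pi / 2) / np.pi * (n_alt - 1))
    j = int((azimuth + np.pi) / (2 * np.pi) * (n_az - 1))
    return image[i, j]  # intensity, then passed through the color map
```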
In addition to the spatial mesh coloring, we take each pixel in the back projected texture and cast a ray from the H2DPI to illustrate the angular nature of the information in the image. If the ray intersects the spatial mesh and the pixel value crosses a certain threshold, a line is drawn between the H2DPI and the spatial mesh. All of the processing steps mentioned usually require less than 1 second, providing a visually smooth experience. Switching between ray rendering and spatial mesh coloring is available to the user through a virtual menu located on their wrist (i.e., when the user's wrist is in the HoloLens' field of view, a menu with virtual buttons for settings appears).
As an illustration of the resulting visual experience, we show images captured with the HoloLens depicting two experiments in Figure 3. In the first example, we hid a 137Cs source under one of three covers (Fig. 3a). After about 30 seconds of acquisition time, the HoloLens displays the colored spatial mesh (Fig. 3b) or rays (Fig. 3c) according to the back projection image. At any time, the user can switch between viewing the gamma-ray or the neutron image. Examples of neutron and gamma-ray images are shown in Fig. 3e,f, where we show an MR view of an experimental setup with a 252Cf source placed in front of the detection system.
An important aspect of hotspot color shading is the choice of color map. We have followed the rule of using perceptually uniform sequential color maps26. We implemented the standard perceptually uniform sequential color maps used in Matplotlib (viridis, inferno, magma, plasma, cividis)27 as options the user can select from a list. The user can thus choose an appropriate color scheme to increase the contrast against a particular background, or to accommodate their ability to distinguish colors. For example, a color map containing red (inferno) on a green background (such as grass) may not be helpful to a user with deuteranomaly, who may choose to switch to a yellow/blue-based map (cividis).
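Generating color lookup tables from the Matplotlib maps is straightforward; a sketch, assuming 256-entry 1D textures (the texture size is an assumption):

```python
import numpy as np
import matplotlib.pyplot as plt

def colormap_lut(name="viridis", size=256):
    """Build an RGB lookup table (size x 3, uint8) from a Matplotlib
    perceptually uniform colormap, suitable as a 1D shader texture."""
    rgba = plt.get_cmap(name)(np.linspace(0.0, 1.0, size))
    return (rgba[:, :3] * 255).astype(np.uint8)

# The five options offered to the user in the visualizer:
LUTS = {n: colormap_lut(n)
        for n in ("viridis", "inferno", "magma", "plasma", "cividis")}
```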

Examples of mixed reality images as seen by a user wearing a HoloLens2. (a) A 100 \(\mu\)Ci 137Cs source is placed under one of three covers in front of the detector system. (b) In MR, the user is shown the spatial mesh coloring approximately 30 seconds after acquisition. (c) The user may also choose to optionally display the rays indicating the hotspot. (d) A 1 mCi 252Cf source is placed in front of the system. (e) A gamma-ray image forms within about 20 seconds. (f) The user can switch to the neutron image (acquired at the same time) via the wrist menu. A converged neutron image intersecting the expected source location is seen after about 1 minute.
Rapid source localization via neural network
Double scatter events are rare compared to single scatter events, and in many scenarios the image may not converge sufficiently within a given measurement time. To optimally guide the detector operator in moving the detector system to a better location, we propose using the count rates of individual interactions to quickly predict the direction of the source (Fig. 2d), since count rates can be estimated at any point in the measurement. Due to the arrangement of the detectors, we expect a gradient in count rates (Fig. 2e): detectors closer to the source exhibit a higher count rate than those farther away, enabling a statistical learning method to predict the angle of incidence of the radiation.
Count rates for each detector are estimated by a rolling average over the waveforms in a 1–2 second measurement window and are presented as inputs to a trained neural network that produces an estimated azimuthal location of the source. We describe the neural network (training data and hyperparameters) in the Methods section. In MR, the user is shown an arrow indicating the predicted azimuthal direction.
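Since the architecture and hyperparameters are deferred to the Methods section, the following PyTorch sketch only illustrates the interface: a small multilayer perceptron mapping per-detector count rates to an azimuth. The 20-channel input (12 OGS rods + 8 CeBr3 detectors), the layer sizes, and the (sin, cos) output parameterization are assumptions, not the authors' reported design.

```python
import torch
import torch.nn as nn

N_DETECTORS = 20  # 12 OGS + 8 CeBr3 (assumed input dimensionality)

class AzimuthNet(nn.Module):
    """Small MLP mapping per-detector count rates to a source azimuth.
    Predicting (sin, cos) rather than the raw angle avoids the wrap
    discontinuity at 0/2pi."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_DETECTORS, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, 2),  # (sin theta, cos theta)
        )

    def forward(self, rates):
        s, c = self.net(rates).unbind(-1)
        return torch.atan2(s, c)  # azimuth in radians
```

In use, the rolling-average count rates would be normalized (e.g., by their sum, so the prediction depends on the rate gradient rather than the absolute source strength) before being passed to the network.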
Figure 4 shows examples of the MR visualization for source orientation estimation for both the 137Cs and 252Cf sources. The prediction, owing to the small size of the neural network, takes less than 1 second and delivers the expected result. Estimating the uncertainty of the neural network predictions is beyond the scope of this study, and we therefore present this result qualitatively as a proof of concept. Note that the neural network was trained on 252Cf simulations, yet the predictions were correct for both the 252Cf and 137Cs experiments. This shows promise for future investigations to elucidate the performance of the neural network, particularly with regard to training on multiple sources, multiple distances, and elevation angles.

Examples of mixed reality visuals as seen by a user wearing a HoloLens2 for rapid source orientation estimation using a neural network. (a) A 100 \(\mu\)Ci 137Cs source is held at three different locations around the detector; the MR display shows a yellow arrow representing the neural network's prediction. (b) A 1 mCi 252Cf source is held at various locations around the detector. Predicting and displaying the estimated source location took less than 1 second in each case.
Detection limitations of the current H2DPI system
Double scatter events are relatively rare, and their rate constitutes the main limitation for achieving a converged radiation image within a practical measurement time. In Figure 5 we illustrate this by showing the double scatter event rates for the 252Cf and 137Cs sources at increasing distances from the system. As the distance increases, we observe a decrease in the double scatter rates, mirroring the decrease in the single event rate, which follows approximately the inverse square law. We also indicate the approximate event rates that would produce a convergent image (arbitrarily defined as 1000 cones, based on the authors' experience), showing that the sources used can be imaged to convergence in less than 1 hour only within a radius of about 1 meter around the system. The H2DPI system will in the future be upgraded to contain up to 64 organic glass scintillator rods, boosting the detection efficiency nearly ten times for neutrons and twenty times for gamma rays, and correspondingly reducing the time to convergence4.
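The inverse square fit and the resulting convergence-time estimate can be reproduced in a few lines; the rates below are placeholders for illustration, not the measured values from Figure 5.

```python
import numpy as np
from scipy.optimize import curve_fit

def inv_square(d, a):
    """Double scatter rate model: rate = a / d**2 (d in meters)."""
    return a / d ** 2

# Hypothetical double scatter rates (per minute) vs distance (meters).
d = np.array([0.5, 1.0, 1.5, 2.0])
r = np.array([120.0, 31.0, 13.0, 7.8])

(a_fit,), _ = curve_fit(inv_square, d, r)
# Time (minutes) to a convergent image (1000 cones) at each distance:
time_min = 1000 / inv_square(d, a_fit)
```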
However, the sources we used are relatively weak. The rate of double events is expected to scale linearly with source activity, allowing extrapolation to specific scenarios. For example, most radiation incidents due to loss or theft of sources in the late 20th century involved sources with activities >100 GBq28, roughly a thousand times more active than the sources used in this work. Assuming the fitted model we obtained for the gamma-ray doubles of 252Cf, a converged image would then form within about 1 minute even at distances beyond 10 m, allowing a reliable assessment of the source location with the existing system.
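The extrapolation combines linear scaling with activity and inverse square scaling with distance; a worked sketch with assumed numbers follows.

```python
def time_to_converge_minutes(rate_1m_per_min, distance_m, activity_scale,
                             n_cones=1000):
    """Extrapolate convergence time: the double scatter rate scales
    linearly with activity and approximately as 1/distance**2."""
    rate = rate_1m_per_min * activity_scale / distance_m ** 2
    return n_cones / rate

# Assumed example: if gamma-ray doubles from the 1 mCi 252Cf source arrive
# at ~100 per minute at 1 m, a ~1000x more active source at 10 m yields:
print(time_to_converge_minutes(100.0, 10.0, 1000.0))  # -> 1.0 minute
```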

Double scatter event rates (per minute) as a function of source-to-detector-system distance for the 100 \(\mu\)Ci 137Cs and 1 mCi 252Cf sources. Inverse square law model fits are shown for illustration, along with two lines indicating the time required to achieve a convergent image (1000 double scatter events). Error bars are often smaller than the markers.