Description
This work explores the coupling of computer vision algorithms for real-time scene
reconstruction with mobile gamma-ray imaging platforms. These algorithms, solutions to a
class of problems known as simultaneous localization and mapping (SLAM), provide
estimates of the location and orientation (i.e., pose) of the system, as well as a point-cloud
model of the surrounding scene as the sensor traverses the environment. The SLAM
package used in this work is RGBDSLAM, an open-source SLAM solver that uses data
from RGB-D cameras, the most notable example of which is the Microsoft Kinect. A
Microsoft Kinect and the RGBDSLAM software have been integrated into multiple
gamma-ray imaging platforms: one based on HPGe and another based on CZT. An iterative
algorithm based on Compton imaging has been developed to reconstruct 3D distributions
of gamma-ray sources from the pose estimates and the interaction information from the
position-sensitive gamma-ray detectors. The 3D model is also incorporated into the image
reconstruction to decrease reconstruction time and improve image quality.
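To illustrate how the pose estimates, the Compton interaction data, and the scene model fit together, the following is a minimal sketch in Python of a list-mode ML-EM-style Compton reconstruction over scene voxels, where each event's interaction positions are transformed into the world frame using the SLAM pose. The function names, the Gaussian cone-weighting model, the toy geometry, and all numerical values are illustrative assumptions, not the implementation used in this work.

    # Minimal sketch (not the actual implementation): SLAM poses + Compton
    # events combined in a list-mode ML-EM reconstruction over scene voxels.
    import numpy as np

    def cone_backprojection(voxels, apex, axis, cos_theta, sigma=0.05):
        """Weight each voxel by its proximity to the Compton cone surface.

        voxels    : (N, 3) world-frame voxel centres (e.g. from the SLAM point cloud)
        apex      : (3,)  world-frame position of the first interaction
        axis      : (3,)  unit vector from the second interaction toward the first
        cos_theta : cosine of the scattering angle inferred from the energy deposits
        sigma     : assumed angular uncertainty (in cosine units) of the cone
        """
        d = voxels - apex
        d /= np.linalg.norm(d, axis=1, keepdims=True)
        cos_v = d @ axis  # cosine of angle between voxel direction and cone axis
        return np.exp(-0.5 * ((cos_v - cos_theta) / sigma) ** 2)

    def mlem(system, n_iter=20):
        """List-mode ML-EM update; system is an (events x voxels) matrix of cone weights."""
        lam = np.ones(system.shape[1])        # flat initial image
        sens = system.sum(axis=0) + 1e-12     # sensitivity (normalization) term
        for _ in range(n_iter):
            fwd = system @ lam + 1e-12        # forward projection for each event
            lam *= (system.T @ (1.0 / fwd)) / sens
        return lam

    # --- toy example ---------------------------------------------------------
    rng = np.random.default_rng(0)

    # Scene voxels: in practice these would come from the SLAM point-cloud model,
    # restricting the reconstruction to occupied space and reducing its cost.
    voxels = rng.uniform(-1.0, 1.0, size=(500, 3))

    # One Compton event recorded in the detector frame (positions in metres).
    first_int_det = np.array([0.00, 0.00, 0.00])    # first interaction position
    second_int_det = np.array([0.00, 0.00, -0.02])  # second interaction position
    cos_scatter = 0.6                               # from the measured energy deposits

    # SLAM pose at the time of the event: rotation R and translation t mapping
    # detector coordinates into the world frame (identity/offset here for brevity).
    R = np.eye(3)
    t = np.array([0.5, -0.2, 0.1])

    apex = R @ first_int_det + t
    axis = R @ (first_int_det - second_int_det)
    axis /= np.linalg.norm(axis)

    system = np.vstack([cone_backprojection(voxels, apex, axis, cos_scatter)])
    image = mlem(system, n_iter=10)
    print("hottest voxel:", voxels[np.argmax(image)])

With many events gathered from different poses, the cones intersect on the source locations, and restricting the voxel set to the SLAM point cloud is what allows the reconstruction time to drop while constraining the image to physically occupied space.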