Namvar, Morteza, Ivan, Lucian and McDonald, James G. 2025. "Augmented reality-based radiation visualization for nuclear-facility decommissioning". Paper presented at the CSME-CFDSC-CSR 2025 International Congress (Montreal, QC, Canada, May 25-28, 2025).
Abstract
The decommissioning of nuclear power plants is a complex process that requires precise knowledge of radiation fields to identify high-risk zones, satisfy safety requirements for the protection of workers and equipment, and improve the efficiency and cost-effectiveness of operations. Augmented Reality (AR) offers a novel approach to radiation visualization by overlaying digital radiation maps onto the real-world environment. This work presents an AR-based, three-dimensional (3D) radiation visualization system that provides an intuitive and immersive representation of radiation fields within nuclear facilities, thereby enhancing occupational radiation safety. The process comprises five steps:

1) 3D environment capture: a LiDAR-based point cloud is generated to reconstruct the facility's geometric structure;
2) Digital model generation: a CAD-based digital model integrates the facility layout with radiation field data;
3) Radiation field computation: precomputed radiation data is used to estimate the radiation distribution within the facility, which is further refined in real time as additional measurements are made;
4) AR integration: the digital model is projected into an AR environment using a hologram-capable device, currently a Microsoft HoloLens 2; and
5) Real-time interaction: users interact with the AR interface to analyze radiation intensity, inspect different sections of the facility, and filter data layers to focus on specific areas of interest.

The application is developed using the Unity framework as the AR development platform, leveraging Microsoft's Mixed Reality Toolkit (MRTK 3) for user interaction. The back-end computations are implemented in Python, with VTK for radiation field rendering and Open3D for aligning AR-generated point clouds with precomputed models (see the registration sketch below). The system is deployed in a two-tier architecture in which the AR device serves as the front end and a remote server handles the computational processing.

The developed application overlays radiation data on physical boundaries: the radiation intensity field is displayed as a transparent 3D "heat map" draped over facility surfaces. Users can adjust intensity thresholds through an intuitive AR user interface to isolate critical zones. Radiation fields can also be viewed in specific slices of the facility by dynamically adjusting a slicing plane, which reveals hidden radiation hot spots and enables targeted analysis. Moreover, to help identify the spatial distribution of radiation sources, the application renders isosurfaces representing constant radiation intensity levels (an illustrative VTK pipeline is sketched below). The application significantly enhances situational awareness and supports real-time decision-making, making it a valuable tool for decommissioning efforts, radiation safety training, and emergency response scenarios.
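For readers unfamiliar with the alignment step, the following is a minimal sketch of how an AR-captured point cloud could be registered against a precomputed facility model with Open3D, assuming a point-to-plane ICP refinement of a rough initial pose. The file names, voxel size, and identity initial transform are illustrative assumptions, not details from the paper.

```python
import numpy as np
import open3d as o3d

# Hypothetical inputs: "hololens_scan.ply" is the AR-captured point cloud,
# "facility_model.ply" is the precomputed LiDAR/CAD reference model.
source = o3d.io.read_point_cloud("hololens_scan.ply")
target = o3d.io.read_point_cloud("facility_model.ply")

# Downsample and estimate normals, as required for point-to-plane ICP.
voxel = 0.05  # assumed resolution in metres
source_down = source.voxel_down_sample(voxel)
target_down = target.voxel_down_sample(voxel)
for pcd in (source_down, target_down):
    pcd.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=2 * voxel, max_nn=30))

# Refine an initial guess (identity here) to align the scan with the model.
result = o3d.pipelines.registration.registration_icp(
    source_down, target_down,
    max_correspondence_distance=2 * voxel,
    init=np.eye(4),
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPlane())

print("fitness:", result.fitness)
print("transformation:\n", result.transformation)
```

The resulting 4x4 transformation could then be applied to place the precomputed radiation field in the coordinate frame of the AR device.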
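On the rendering side, a minimal VTK sketch of the two visualization modes described above, isosurfaces at constant dose-rate levels and a movable slicing plane, is given below. The input file name, dose-rate thresholds, and slice position are hypothetical placeholders chosen for illustration.

```python
import vtk

# Hypothetical input: "dose_field.vti", a regular grid of precomputed dose-rate values.
reader = vtk.vtkXMLImageDataReader()
reader.SetFileName("dose_field.vti")
reader.Update()

# Isosurfaces at assumed dose-rate levels (placeholder values).
contour = vtk.vtkContourFilter()
contour.SetInputConnection(reader.GetOutputPort())
for i, level in enumerate([0.1, 1.0, 10.0]):
    contour.SetValue(i, level)

# A slicing plane through the field, analogous to the AR slice view.
plane = vtk.vtkPlane()
plane.SetOrigin(0.0, 0.0, 1.5)   # assumed slice height in metres
plane.SetNormal(0.0, 0.0, 1.0)
cutter = vtk.vtkCutter()
cutter.SetCutFunction(plane)
cutter.SetInputConnection(reader.GetOutputPort())

# Render both outputs, with semi-transparent isosurfaces as a "heat map" style overlay.
renderer = vtk.vtkRenderer()
for algorithm, opacity in [(contour, 0.4), (cutter, 1.0)]:
    mapper = vtk.vtkPolyDataMapper()
    mapper.SetInputConnection(algorithm.GetOutputPort())
    mapper.SetScalarRange(0.1, 10.0)
    actor = vtk.vtkActor()
    actor.SetMapper(mapper)
    actor.GetProperty().SetOpacity(opacity)
    renderer.AddActor(actor)

window = vtk.vtkRenderWindow()
window.AddRenderer(renderer)
interactor = vtk.vtkRenderWindowInteractor()
interactor.SetRenderWindow(window)
window.Render()
interactor.Start()
```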
| Document type: | Conference paper (Communication) |
|---|---|
| Additional information: | Progress in Canadian Mechanical Engineering, Volume 8. Co-chairs: Lucas A. Hof, Giuseppe Di Labbio, Antoine Tahan, Marlène Sanjosé, Sébastien Lalonde and Nicole R. Demarquette. |
| Deposited on: | 18 Dec. 2025 14:19 |
| Last modified: | 18 Dec. 2025 14:19 |
| URI: | https://espace2.etsmtl.ca/id/eprint/31972 |