Human-Machine Collaborative Structural Assessment Through Mixed Reality and 5G

Problem Statement

Much of North America’s critical infrastructure, built in the middle of the 20th century, is rapidly approaching the end of its design lifecycle and is in urgent need of repair. Over the last decade, novel Artificial Intelligence (AI)-based structural assessment methods and tools have been proposed to identify and quantify structural damage indicators (e.g., cracking, spalling, corrosion) from visual data gathered by advanced platforms. However, widespread field adoption of these technologies has been limited by the lack of real-time interaction and collaboration between the inspector and the technology during an inspection, and by the scarcity of on-site computational resources available to the inspector for real-time data processing.


Approach

My research integrates Mixed Reality (MR) technology with 5G-enabled AI edge computing for collaborative infrastructure assessment. First, robotic platforms equipped with powerful visual sensors (color, lidar, infrared, etc.) scan the physical structure to create a digital twin of the site. The data are then uploaded to a 5G-enabled Multi-access Edge Computing (MEC) server, where AI models analyze the visual data to detect geometric changes and visual defects and return results in real time. Inspectors wearing commercial MR devices (e.g., Microsoft’s HoloLens 2) can then view a detailed 3D scanned map of the structure annotated with the locations of structural defects and other AI-processed information, alongside data from previous inspections or existing BIM models. MR-equipped inspectors and data-collection platforms can therefore collaborate in real time over this network, with readily available edge computing handling the AI image-processing tasks.
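To make the edge-side analysis step concrete, the sketch below simulates defect quantification on a scanned image: candidate defect pixels are flagged and converted to a physical damage area using the scan resolution. This is a minimal illustration only; the intensity threshold stands in for the AI model running on the MEC server, and all names (`detect_defects`, `mm_per_pixel`) are illustrative assumptions, not the project’s actual API.

```python
import numpy as np

def detect_defects(gray: np.ndarray, threshold: int = 60,
                   mm_per_pixel: float = 0.5) -> dict:
    """Flag dark pixels as defect candidates and report damage area.

    In the actual pipeline this step would be an AI model on the
    5G MEC server; a fixed intensity threshold is a stand-in here.
    """
    mask = gray < threshold                    # candidate defect pixels
    area_mm2 = mask.sum() * mm_per_pixel ** 2  # pixel count -> physical area
    return {"mask": mask, "area_mm2": float(area_mm2)}

# Synthetic 100x100 "scan": bright concrete with a dark 10x4 crack.
scan = np.full((100, 100), 200, dtype=np.uint8)
scan[45:55, 48:52] = 30                        # 40 dark pixels
result = detect_defects(scan)
print(result["area_mm2"])                      # 40 px * 0.25 mm^2 = 10.0
```

In the full system, the returned mask and area would be rendered as holographic annotations on the inspector’s MR view of the digital twin.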


Ongoing Work

Through funding from Rogers, Canada’s largest telecommunications company, we are integrating AR/VR collaboration for remote inspections. We are also partnering with the Ministry of Transportation of Ontario (MTO) to develop 5G-enabled solutions that rapidly digitize infrastructure assets.


Key Results and Visuals

Figure 1: Interactive defect quantification through MR device for damage area measurement using AI


Figure 2: Human-robot collaboration through image-based localization

Figure 3: Real-time AR/VR collaboration for remote inspections


Relevant Publications

  1. Al-Sabbag, Z. A., Yeum, C. M., & Narasimhan, S. (2022). Enabling Human–Machine Collaboration in Infrastructure Inspections through Mixed Reality. Advanced Engineering Informatics. (doi:10.1016/j.aei.2022.101709).
  2. Al-Sabbag, Z. A., Yeum, C. M., & Narasimhan, S. (2022). Interactive Defect Quantification using Extended Reality. Advanced Engineering Informatics. (doi:10.1016/j.aei.2021.101473).
  3. Al-Sabbag, Z. A., Connelly, J. P., Yeum, C. M., & Narasimhan, S. (2020). Real-time Quantitative Visual Inspection using Extended Reality. Journal of Computational Vision and Imaging Systems, 6(1), 1-3. (doi:10.15353/jcvis.v6i1.3557).

See Publications for a complete list.

 

Students

Zaid Abbas Al-Sabbag, BASc (2019, University of Waterloo)


Sunwoong Choi