
Remote control through a virtual reality (VR) environment using NeRF and 3D Gaussian Splatting (3DGS) technology and Unreal Engine 5.2 for rescue tasks.


Robcib-GIT/3DGS-in-SAR-Robotics

 
 



DIGITAL TWIN, TELEMATIC CONTROL AND VR GENERATED BY NERF AND 3DGS IN RESCUE TASKS

In the digital age, digital twins have emerged as key tools for the simulation and real-time control of complex physical environments. This final degree project (TFG) presents an immersive telematic control system that combines digital twins with virtual reality (VR) in Unreal Engine, employing advanced technologies such as Neural Radiance Fields (NeRF) and 3D Gaussian Splatting (3DGS). The research demonstrates the feasibility of integrating these 3D reconstruction methods into rescue operations, providing an immersive and detailed visualization of the affected environment that enables a faster response and reduces the risk for rescue teams. The main contributions include the identification of optimal tools for generating 3D models compatible with digital twin simulation, the evaluation of robustness and deployment times in realistic scenarios, and compatibility with the simulation and telematic control research lines of the Robotics and Cybernetics Laboratory (ROBCIB) of the CAR ETSII UPM-CSIC. The chosen combination of tools allows effective deployment in approximately one and a half hours, demonstrating its applicability in simulated rescue situations.

   


3DGS robustness tests [1]. From left to right and top to bottom: thermal camera, underwater, smoke, flowers (complex geometry). Images inside the Unreal Engine simulation. Source: Author

   


Deployment of technology in a simulated rescue situation. Source: Author

   

About this repository

In this repository you will find:

  • The source code of the nodes and the launcher with the ROS parameters necessary for communication between the robot and Unreal Engine.
  • The complete Unreal Engine 5.2 project used for the development of the work, including the plugins with the modifications indicated in the report.
  • The installation and use manuals for the tools used.
  • The compilation of basic ROS concepts and commands necessary for the project.
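The ROS-to-Unreal Engine communication mentioned above runs over rosbridge, which exchanges JSON operations across a WebSocket. As a rough illustration of that protocol (the topic name and message values below are placeholders for this sketch, not the ones used in this project), an advertise/publish pair can be built like this:

```python
import json

def advertise(topic: str, msg_type: str) -> str:
    """Build a rosbridge v2 'advertise' operation as a JSON string."""
    return json.dumps({"op": "advertise", "topic": topic, "type": msg_type})

def publish(topic: str, msg: dict) -> str:
    """Build a rosbridge v2 'publish' operation as a JSON string."""
    return json.dumps({"op": "publish", "topic": topic, "msg": msg})

# Hypothetical velocity command for the robot's digital twin;
# the topic and values are illustrative only.
adv = advertise("/cmd_vel", "geometry_msgs/Twist")
cmd = publish("/cmd_vel", {"linear": {"x": 0.2, "y": 0.0, "z": 0.0},
                           "angular": {"x": 0.0, "y": 0.0, "z": 0.5}})
```

In practice a client such as the ROSIntegration plugin on the Unreal Engine side sends frames like these to the rosbridge server (by default on port 9090), so the simulation and the real robot see the same topics.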

Accessible information

The work report can be consulted in the UPM Digital Archive.

Additionally, a YouTube channel is available with the first tests carried out with instant-ngp, the robustness tests of the capacity to generate 3D models of environments considered difficult, and the demonstration video of the technology deployment.

The list of videos found on the YouTube channel is as follows:

  1. Demonstration of the complete workflow
  2. instant-ngp - Yellow Flowers Training
  3. 3DGS Robustness Tests - Environments with repetitive elements
  4. 3DGS Robustness Tests - Small objects
  5. 3DGS Robustness Tests - Elements with complex geometries
  6. 3DGS Robustness Tests - Scenes with smoke or fog
  7. 3DGS Robustness Tests - Night scenes
  8. 3DGS Robustness Tests - Thermal camera recordings
  9. 3DGS Robustness Tests - Scenes with reflections and refraction

General description

In the digital era, digital twins have emerged as fundamental tools for the simulation and real-time control of complex physical environments. This work presents the development of an immersive telematic control system that combines digital twin technology with virtual reality (VR), generated from Neural Radiance Fields (NeRF) and later developments derived from it, such as 3D Gaussian Splatting (3DGS).

To illustrate the starting point, Figure 0.1 presents a typical training scenario for digital twins (0.1a) and the state of the art in teleoperation of robotic systems with virtual reality interfaces at the Robotics and Cybernetics Laboratory (ROBCIB) of CAR ETSII - UPM (0.1b); Figure 0.2 presents examples of 3D models of the complex physical environments in which these digital twins are intended to be deployed.

   


Figure 0.1: Examples of digital twins in artificial or unrealistic environments. Sources: [2] and [3].

   


Figure 0.2: Examples of 3D models of complex environments generated with 3DGS. Source: [1]

   

This research seeks to demonstrate that it is possible to integrate these 3D model reconstruction technologies with rescue operations in emergency situations, providing operators with an immersive and detailed visualization of the affected environment.

The precise and realistic reconstruction of three-dimensional environments from two-dimensional images, in which the digital twins of the deployed robots can operate, also allows the simulation of possible catastrophe scenarios.

In this way, it not only facilitates a faster and more effective response, but also reduces the risk for rescue teams.

The main contributions of this project are the following:

  1. The optimal combination of tools to obtain a 3D model compatible with the simulation of the digital twins of the robots.
  2. The robustness and deployment time of the technology in realistic application situations.
  3. Compatibility with the current lines of research of the Robotics and Cybernetics Laboratory (ROBCIB) of the CAR ETSII UPM-CSIC, in particular the simulation and telematic control of robots through the use of virtual reality for closer and safer collaboration between operators and robots.

The combination of the tools used in this project allows the deployment of the technology in approximately one and a half hours for a simulated rescue situation.

After discarding instant-ngp [4] and Unity due to incompatibilities, the tools listed below were chosen. Figure 0.3 shows the complete workflow diagram of these tools, and Table 0.1 lists their deployment times.

  • FFmpeg [5], for preprocessing, and COLMAP [6] with the addition of hloc [7], for obtaining the photogrammetry.
  • Splatfacto [8] (the Nerfstudio [8] method for 3DGS [1]) and volinga-model [9], for generating the 3D model of the scene.
  • Volinga Suite [9], for compatibility with Unreal Engine (UE).
  • UE together with the Volinga plugin [9], for its representation in a virtual reality environment compatible with the simulation of the digital twins of the robots.
  • Rosbridge [10] and ROSIntegration [11], for compatibility with current lines of research that use the ROS standard.
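The capture-to-3DGS part of this toolchain is command-line driven, so the first stages can be sketched as a command sequence. The flags below are illustrative, taken from the public documentation of FFmpeg and Nerfstudio rather than from this project's configuration, and the later Volinga export/import steps are omitted:

```python
from pathlib import Path

def build_pipeline(video: str, workdir: str) -> list[list[str]]:
    """Sketch of the capture-to-3DGS command sequence; the frame rate,
    directory names and flags are assumptions, not the project's settings."""
    frames_dir = Path(workdir) / "frames"
    poses_dir = Path(workdir) / "poses"
    return [
        # 1. FFmpeg: extract still frames from the capture video (2 fps here).
        ["ffmpeg", "-i", video, "-vf", "fps=2", str(frames_dir / "%04d.png")],
        # 2. Nerfstudio wraps COLMAP to recover camera poses (photogrammetry).
        ["ns-process-data", "images", "--data", str(frames_dir),
         "--output-dir", str(poses_dir)],
        # 3. Train the Splatfacto (3DGS) model on the posed images.
        ["ns-train", "splatfacto", "--data", str(poses_dir)],
    ]

for cmd in build_pipeline("scene.mp4", "work"):
    print(" ".join(cmd))
```

Each command would normally be run in order (e.g. via `subprocess.run`), with the output of one stage feeding the next; the resulting Splatfacto model is then converted for Unreal Engine through Volinga Suite.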

   


Figure 0.3: Complete workflow. Source: Author

   


Table 0.1: Deployment times for the complete workflow. Source: Author

   

The appearance of the technology deployed during the demonstration can be seen in the images in Figure 0.4.

   


Figure 0.4: Demonstration of the complete workflow. Source: Author

   

Keywords

Digital Twin, telematic control, immersive simulation, NeRF, Gaussian Splatting, Unreal Engine, virtual reality for rescues.

References

Footnotes

  1. Kerbl, B., Kopanas, G., Leimkühler, T. and Drettakis, G. (Jul. 2023). 3D Gaussian Splatting for Real-Time Radiance Field Rendering. In: ACM Transactions on Graphics 42.4. URL: https://repo-sam.inria.fr/fungraph/3d-gaussian-splatting/

  2. Rudin, N., Hoeller, D., Reist, P. and Hutter, M. (2021). Learning to Walk in Minutes Using Massively Parallel Deep Reinforcement Learning. In: ArXiv abs/2109.11978. URL: https://api.semanticscholar.org/CorpusID:237635100

  3. Cruz Ulloa, C. (Mar. 2024). Quadrupedal Robots in Search and Rescue: Perception and Teleoperation. DOI: 10.20868/UPM.thesis.81769. URL: https://oa.upm.es/81769/

  4. Müller, T., Evans, A., Schied, C. and Keller, A. (Jul. 2022). Instant Neural Graphics Primitives with a Multiresolution Hash Encoding. In: ACM Trans. Graph. 41.4, 102:1-102:15. DOI: 10.1145/3528223.3530127. URL: https://doi.org/10.1145/3528223.3530127 and https://nvlabs.github.io/instant-ngp/

  5. Tomar, S. (2006). Converting video formats with FFmpeg. In: Linux Journal 2006.146, p. 10. URL: https://www.ffmpeg.org/

  6. Schönberger, J. L., Zheng, E., Pollefeys, M. and Frahm, J.-M. (2016). Pixelwise View Selection for Unstructured Multi-View Stereo. In: European Conference on Computer Vision (ECCV). URL: https://colmap.github.io/

  7. Sarlin, P.-E., Cadena, C., Siegwart, R. and Dymczyk, M. (2019). From Coarse to Fine: Robust Hierarchical Localization at Large Scale. In: CVPR. URL: https://github.com/cvg/Hierarchical-Localization

  8. Tancik, M. et al. (2023). Nerfstudio: A Modular Framework for Neural Radiance Field Development. In: ACM SIGGRAPH 2023 Conference Proceedings. SIGGRAPH '23. URL: https://docs.nerf.studio/

  9. Volinga development team (2023). Volinga Suite, volinga-model and Volinga plugin. URL: https://volinga.ai/ and https://github.com/Volinga/volinga-model

  10. Crick, C., Jay, G., Osentoski, S., Pitzer, B. and Jenkins, O. C. (2011). ROSbridge: ROS for Non-ROS Users. In: Proceedings of the Robotics: Science and Systems Conference (RSS). URL: http://rosbridge.org

  11. Schmeisser, M. and Suero, M. (2018). ROSIntegration: Connecting Unreal Engine with ROS for Realistic Simulations. In: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). URL: https://github.com/code-iai/ROSIntegration
