~~NOTOC~~

====== Simultaneous Localization and Mapping (SLAM) for Indoor Navigation ======

Authors: [[https://github.com/harleylara|Harley Lara]], Rolf Becker

===== Introduction =====
  
When we humans enter and discover an unknown environment, we are implicitly performing SLAM.
**SLAM in Robotics:**
This simplified behavioral model can be transferred to autonomous mobile robots. To make them truly autonomous, they have to do something like SLAM. It is a standard problem in robotics.
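
As a brief aside (this is the standard textbook formulation, not something specific to our setup): SLAM is usually stated as estimating the joint posterior over the robot trajectory and the map, given all sensor measurements and control inputs:

<code latex>
p(x_{1:t}, m \mid z_{1:t}, u_{1:t})
</code>

Here x_{1:t} is the sequence of robot poses, m the map, z_{1:t} the sensor measurements and u_{1:t} the control inputs (e.g. odometry). Mapping and localization cannot be separated: the map is needed to localize the robot, and the robot's pose is needed to build the map.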

===== Methods and Material =====

**Flight Controller and Mission Control Computer:**
 **"Brain":** The misson control computer (in our case NVIDIA Jetson Xavier NX) is the brain of the flying robot. It collects all the data from the connected sensors and performs the sensor fusion needed for SLAM. We use the Robot Operating System ROS 2 to solve these problems. ROS 2 is a modular multitasking real-time operating system designed for robotic applications. "ROS 2 is a middleware based on a strongly-typed, anonymous publish/subscribe mechanism that allows for message passing between different processes." (from [[https://docs.ros.org/en/rolling/Concepts/Basic.html|ROS 2 Basic Concepts]]). Simply speaking: Sensors, actuators and algorithms are realized as independent nodes (like processes) communicating with each other by exchanging messages. A sensor node represents a real electronic sensing element like a digital twin. The same holds for actuators, e.g. motors, servos, etc. The system is highly flexible and extensible. A new sensor or actuator can be integrated by  deploying new nodes in the ROS 2 communication graph representing their hardware elements. With this modular software architecture interoprability is easy to achieve.  **"Brain":** The misson control computer (in our case NVIDIA Jetson Xavier NX) is the brain of the flying robot. It collects all the data from the connected sensors and performs the sensor fusion needed for SLAM. We use the Robot Operating System ROS 2 to solve these problems. ROS 2 is a modular multitasking real-time operating system designed for robotic applications. "ROS 2 is a middleware based on a strongly-typed, anonymous publish/subscribe mechanism that allows for message passing between different processes." (from [[https://docs.ros.org/en/rolling/Concepts/Basic.html|ROS 2 Basic Concepts]]). Simply speaking: Sensors, actuators and algorithms are realized as independent nodes (like processes) communicating with each other by exchanging messages. A sensor node represents a real electronic sensing element like a digital twin. The same holds for actuators, e.g. motors, servos, etc. The system is highly flexible and extensible. A new sensor or actuator can be integrated by  deploying new nodes in the ROS 2 communication graph representing their hardware elements. With this modular software architecture interoprability is easy to achieve. 
  
===== Results =====

The video shows the first experimental results. The top two video streams show two external observation cameras filming the drone flight in the university's drone cage. The bottom video stream shows the slowly emerging 3D world reconstruction resulting from SLAM in real time. You can see the walls and their elements emerging. It is a dense point cloud, overlaid with color information from the cameras. This is the mapping part. The localization of the drone, i.e. its flight track, is indicated by a thin blue line (hardly visible!) running from the drone's starting point to its current position. The drone creates the map and sets itself in relation to it.

BTW: The video and audio were also produced and edited by Harley. He even played the piano.
    
  