~~NOTOC~~

====== Simultaneous Localization and Mapping (SLAM) for Indoor Navigation ======

Authors: [[https://github.com/harleylara|Harley Lara]], Rolf Becker

===== Introduction =====
  
When we humans enter and discover an unknown environment, we are implicitly performing SLAM.
**SLAM in Robotics:**
This simplified behavioral model can be transferred to autonomous mobile robots. To make them truly autonomous, they have to do something like SLAM. It is a standard problem in robotics.

===== Methods and Material =====

  
**Flight Controller and Mission Control Computer:**
Our research assistant Harley Lara built our team's first flying robot (aka drone) capable of SLAM. The flight attitude and stability of the drone are controlled by the standard open-source flight controller PX4, which is responsible for the core functionality of the flying platform: it makes it flyable. The SLAM algorithm runs on a companion computer (here an NVIDIA Jetson Xavier NX), which at the final stage of development will be responsible for the mission, i.e. autonomous navigation and path decisions, as well as higher-level mission intentions such as search and rescue (SAR) of people in disaster areas. This mission control computer tells the flight controller where to go.
    
**Perception:**
In this first experiment we use a combination of two special cameras which cooperate closely: a tracking camera (Intel T265) and a depth camera (Intel D435i). Together they form the visual perception system. It allows the robot to measure distances to "obstacles" such as walls, and to determine its linear as well as rotational motion. This perception system is also equipped with an inertial measurement unit (IMU) on chip. Such an IMU usually consists of an accelerometer measuring linear acceleration (force) along three spatial axes as well as a gyroscope sensing the rotational speed (angular velocity) about the three spatial axes. Sometimes a magnetometer (compass) is also included in the integrated IMU chip. All of these sensor outputs have to be combined. This is called sensor fusion. The fusion is partly done already in the cameras; the rest has to be done externally.
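To illustrate the fusion idea, here is a minimal complementary filter in Python. This is not the cameras' internal algorithm, and the sample values and the 0.98 weight are illustrative assumptions; the point is how a fast-but-drifting gyro integration and a noisy-but-drift-free accelerometer angle are blended into one attitude estimate.

```python
import math

def accel_to_pitch(ax, az):
    """Pitch angle (rad) implied by the gravity direction in the
    accelerometer reading (noisy, but does not drift)."""
    return math.atan2(ax, az)

def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """Blend the integrated gyro rate (smooth, but drifts over time)
    with the accelerometer angle (noisy, but drift-free)."""
    return alpha * (pitch + gyro_rate * dt) + (1 - alpha) * accel_pitch

# Hovering level: the true pitch is 0 rad, but the gyro has a small bias.
pitch = 0.0
for _ in range(1000):
    pitch = complementary_filter(
        pitch,
        gyro_rate=0.01,                          # biased gyro (made up)
        accel_pitch=accel_to_pitch(0.0, 9.81),   # gravity says: level
        dt=0.01,
    )

# Pure gyro integration would have drifted to 0.01 * 0.01 * 1000 = 0.1 rad;
# the accelerometer term keeps the fused estimate close to zero.
```

Real fusion stacks (e.g. inside the T265 or in PX4's estimator) use more elaborate filters such as extended Kalman filters, but the trade-off being balanced is the same.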

**"Brain":** The mission control computer (in our case the NVIDIA Jetson Xavier NX) is the brain of the flying robot. It collects all the data from the connected sensors and performs the sensor fusion needed for SLAM. We use the Robot Operating System ROS 2 to solve these problems. ROS 2 is a modular multitasking real-time operating system designed for robotic applications. "ROS 2 is a middleware based on a strongly-typed, anonymous publish/subscribe mechanism that allows for message passing between different processes." (from [[https://docs.ros.org/en/rolling/Concepts/Basic.html|ROS 2 Basic Concepts]]). Simply speaking: sensors, actuators, and algorithms are realized as independent nodes (like processes) communicating with each other by exchanging messages. A sensor node represents a real electronic sensing element, like a digital twin. The same holds for actuators, e.g. motors, servos, etc. The system is highly flexible and extensible: a new sensor or actuator can be integrated by deploying new nodes representing the hardware elements in the ROS 2 communication graph. With this modular software architecture, interoperability is easy to achieve.
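The anonymous publish/subscribe idea can be sketched without ROS 2 itself. The toy Python bus below (topic name and range values are made up) mimics how nodes exchange messages on named topics without ever knowing about each other; in real ROS 2 the transport is DDS and messages are strongly typed.

```python
from collections import defaultdict

class MessageBus:
    """Toy anonymous publish/subscribe bus. A topic is just a name;
    publishers and subscribers only share the topic, not references."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback; the publisher never learns who listens."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        """Deliver the message to every subscriber of the topic."""
        for callback in self._subscribers[topic]:
            callback(message)

# A "depth camera node" publishes ranges; a "mapper node" consumes them.
bus = MessageBus()
ranges = []
bus.subscribe("/depth/range", ranges.append)
bus.publish("/depth/range", 1.42)   # meters to the nearest wall (invented)
bus.publish("/depth/range", 1.38)
```

Swapping the depth camera for another sensor only means publishing on a new topic; no existing node has to change. That is the extensibility argument made above, in miniature.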

===== Results =====
  
The video shows the first experimental results. The top two video streams show two external observation cameras filming the drone flight in the university's drone cage. The bottom video stream shows the slowly emerging 3D world reconstruction resulting from SLAM in real time. You see the walls and their elements emerging. It is a dense point cloud, overlaid with color information from the cameras. This is the mapping. The localization of the drone, i.e. its flight track, is indicated by a thin blue line (hardly visible!) running from the drone's starting point to its current position. The drone creates the map and sets itself in relation to it.
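The thin blue line is simply the trace of the estimated poses over time. A minimal sketch of how such a track accumulates from per-step motion estimates (the numbers are invented, and real visual SLAM estimates motion from camera features rather than this plain dead reckoning):

```python
import math

def integrate_track(start, motions, dt=0.1):
    """Accumulate a 2D track from per-step forward speed v (m/s) and
    yaw rate w (rad/s). Returns the list of (x, y) positions visited."""
    x, y, theta = start
    track = [(x, y)]
    for v, w in motions:
        theta += w * dt                 # update heading first
        x += v * math.cos(theta) * dt   # then advance along it
        y += v * math.sin(theta) * dt
        track.append((x, y))
    return track

# One second of straight flight at 1 m/s from the origin (made-up input).
track = integrate_track((0.0, 0.0, 0.0), [(1.0, 0.0)] * 10, dt=0.1)
```

Because each step builds on the previous estimate, small per-step errors accumulate; the map built simultaneously is what lets SLAM correct this drift when known places are revisited.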
  
BTW: Video and audio were also produced and cut by Harley. He also played the piano.
    
  
drones/slam/start.1694948674.txt.gz · Last modified: 2023/09/17 13:04 by rolf.becker