    
**Perception:**
In this first experiment we use a combination of two special cameras that cooperate closely: a tracking camera (Intel T265) and a depth camera (Intel D435i). Together they form the visual perception system, which measures distances to "obstacles" such as walls and determines the linear as well as the rotational motion of the robot. The perception system is also equipped with an inertial measurement unit (IMU) on chip. Such an IMU usually consists of an accelerometer measuring linear acceleration (force) along three spatial axes and a gyroscope sensing the rotational speed (angular velocity) about the three spatial axes. Sometimes a magnetometer (compass) is also included in the integrated IMU chip. All of these sensor outputs have to be combined; this is called sensor fusion. The fusion is done partly in the cameras themselves and partly externally.
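To make the idea of sensor fusion concrete, the following is a minimal sketch of a complementary filter, one of the simplest fusion techniques: the gyroscope is integrated for a smooth short-term angle estimate, and the accelerometer's gravity reading corrects the long-term drift. The function name and the blend factor are illustrative assumptions, not the filter actually running on the cameras.

<code python>
import math

def fuse_pitch(pitch_prev, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Complementary filter for a single pitch angle (illustrative sketch).

    pitch_prev : previous pitch estimate in radians
    gyro_rate  : angular velocity about the pitch axis in rad/s (gyroscope)
    accel_x/z  : accelerometer readings in m/s^2 (gravity direction)
    alpha      : blend factor; 0.98 is an assumed, typical value
    """
    # Gyroscope integration: accurate over short intervals but drifts over time.
    pitch_gyro = pitch_prev + gyro_rate * dt
    # Accelerometer tilt: absolute (gravity-referenced) but noisy.
    pitch_accel = math.atan2(accel_x, accel_z)
    # Blend: trust the gyro short-term, let the accelerometer pull back the drift.
    return alpha * pitch_gyro + (1.0 - alpha) * pitch_accel
</code>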
  
 **"Brain":** The misson control computer (in our case NVIDIA Jetson Xavier NX) is the brain of the flying robot. It collects all the data from the connected sensors and performs the sensor fusion needed for SLAM. We use the Robot Operating System ROS 2 to solve these problems. ROS 2 is a modular multitasking real-time operating system designed for robotic applications. "ROS 2 is a middleware based on a strongly-typed, anonymous publish/subscribe mechanism that allows for message passing between different processes." (from [[https://docs.ros.org/en/rolling/Concepts/Basic.html|ROS 2 Basic Concepts]]). Simply speaking: Sensors, actuators and algorithms are realized as independent nodes (like processes) communicating with each other by exchanging messages. A sensor node represents a real electronic sensing element like a digital twin. The same holds for actuators, e.g. motors, servos, etc. The system is highly flexible and extensible. A new sensor or actuator can be integrated by  deploying new nodes in the ROS 2 communication graph representing their hardware elements. With this modular software architecture interoprability is easy to achieve.  **"Brain":** The misson control computer (in our case NVIDIA Jetson Xavier NX) is the brain of the flying robot. It collects all the data from the connected sensors and performs the sensor fusion needed for SLAM. We use the Robot Operating System ROS 2 to solve these problems. ROS 2 is a modular multitasking real-time operating system designed for robotic applications. "ROS 2 is a middleware based on a strongly-typed, anonymous publish/subscribe mechanism that allows for message passing between different processes." (from [[https://docs.ros.org/en/rolling/Concepts/Basic.html|ROS 2 Basic Concepts]]). Simply speaking: Sensors, actuators and algorithms are realized as independent nodes (like processes) communicating with each other by exchanging messages. A sensor node represents a real electronic sensing element like a digital twin. The same holds for actuators, e.g. motors, servos, etc. The system is highly flexible and extensible. A new sensor or actuator can be integrated by  deploying new nodes in the ROS 2 communication graph representing their hardware elements. With this modular software architecture interoprability is easy to achieve. 