Abstract

Autonomous systems require substantial data and computing power to navigate an unknown environment. Most autonomous systems, such as autonomous robots, use active sensors like Lidar or depth cameras to reconstruct unknown scenes and navigate autonomously. Traditional simultaneous localisation and mapping (SLAM) is no longer sufficient for autonomous navigation in some scenarios. Multi-sensor fusion algorithms combine data from different sensors to improve autonomous navigation precision. This study investigates to what extent Sensor-Fusion SLAM is better than Active SLAM algorithms at autonomous navigation. The investigation was carried out in segments: collecting data from various sensors through ROS to measure an evaluation point of the given scene, then examining each algorithm in terms of pose, yaw, and pose error. By comparing these algorithms, the robot's autonomous navigation accuracy is determined. The Extended Kalman Filter (EKF) is a Sensor-Fusion SLAM algorithm that is more robust at autonomous navigation: EKF reduces the robot's pose error compared to Hector, ORB, and ZEDfu SLAM.
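The abstract evaluates each algorithm in terms of pose, yaw, and pose error, with EKF-based sensor fusion as the central technique. The sketch below is a minimal illustration of how such a filter fuses wheel-odometry predictions with an absolute pose measurement (such as one produced by a SLAM front end), together with the pose and yaw error metrics. The state layout, motion model, and noise values are illustrative assumptions, not the essay's actual ROS configuration.

```python
# Minimal EKF sketch: fuse odometry with a pose measurement for a
# planar robot with state [x, y, yaw]. All parameters are assumed.
import numpy as np

def wrap_angle(a):
    """Normalise an angle to the range [-pi, pi)."""
    return (a + np.pi) % (2 * np.pi) - np.pi

def ekf_predict(x, P, v, w, dt, Q):
    """Propagate the state with a unicycle motion model."""
    theta = x[2]
    x_pred = x + np.array([v * dt * np.cos(theta),
                           v * dt * np.sin(theta),
                           w * dt])
    x_pred[2] = wrap_angle(x_pred[2])
    # Jacobian of the motion model with respect to the state.
    F = np.array([[1.0, 0.0, -v * dt * np.sin(theta)],
                  [0.0, 1.0,  v * dt * np.cos(theta)],
                  [0.0, 0.0,  1.0]])
    return x_pred, F @ P @ F.T + Q

def ekf_update(x, P, z, R):
    """Correct the prediction with a direct pose measurement z = [x, y, yaw]."""
    H = np.eye(3)                    # the measurement observes the full state
    y = z - H @ x                    # innovation
    y[2] = wrap_angle(y[2])
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x + K @ y
    x_new[2] = wrap_angle(x_new[2])
    return x_new, (np.eye(3) - K @ H) @ P

def pose_error(est, gt):
    """Euclidean position error and absolute yaw error between two poses."""
    return (np.hypot(est[0] - gt[0], est[1] - gt[1]),
            abs(wrap_angle(est[2] - gt[2])))

if __name__ == "__main__":
    x = np.zeros(3)                       # initial pose [x, y, yaw]
    P = np.eye(3) * 0.1                   # initial covariance
    Q = np.diag([0.01, 0.01, 0.005])      # process noise (assumed)
    R = np.diag([0.05, 0.05, 0.02])       # measurement noise (assumed)
    x, P = ekf_predict(x, P, v=0.5, w=0.1, dt=0.1, Q=Q)
    x, P = ekf_update(x, P, z=np.array([0.05, 0.0, 0.01]), R=R)
    print("fused pose:", x, "error:", pose_error(x, np.array([0.05, 0.0, 0.01])))
```

The same predict/update loop generalises to any sensor whose measurement can be expressed as a function of the state; only the measurement matrix H and noise R change per sensor.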
Introduction

We have looked to robots to solve many of the world's problems: taking over manufacturing, business solutions, agriculture, and national defense. Recently, robots have been employed in autonomous cars. The race to build the most reliable and safe autonomous cars has driven growth in this field [1]. The biggest growth has been in autonomous navigation, and perception of the external environment is one of the fundamental problems in developing it. Autonomous navigation implies the ability for a system to navigate an unknown environment without human intervention.