
Hypothesis

The Extended Kalman Filter will provide more robust navigation for AGVs than passive SLAM algorithms such as ORB-SLAM, ZEDfu and Hector SLAM. This is because the EKF corrects for fast changes in acceleration reported by the IMU, reducing unnecessary rotation. Furthermore, applying ORB-SLAM and using depth values of the scene from ORB-SLAM and Hector SLAM reduces the cumulative error, giving a more accurate prediction of the pose measurement and reducing scale drift in the path. ZEDfu provides a 3D pose for the robot and further corrects depth values, providing better loop closure for visual SLAM. Similarly, using an active depth sensor such as the laser in the LiDAR will improve the pose estimate.
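To make the fusion concrete, the following is a minimal sketch, in Python, of the kind of EKF loop described above: IMU readings drive the prediction step, and a pose estimate from a SLAM algorithm (visual or LiDAR) corrects it. The state layout, the noise covariances and the helper names (PoseEKF, predict, update_with_slam_pose) are illustrative assumptions, not the exact filter used in the experiment.

# Minimal sketch of an EKF fusing IMU readings with a SLAM pose measurement.
# State layout, noise values and method names are assumed for illustration.
import numpy as np

class PoseEKF:
    def __init__(self):
        # State: [x, y, yaw, vx, vy] in the odometry frame (assumed layout).
        self.x = np.zeros(5)
        self.P = np.eye(5) * 0.1                            # state covariance
        self.Q = np.diag([0.01, 0.01, 0.005, 0.1, 0.1])     # process noise (assumed)
        self.R = np.diag([0.05, 0.05, 0.02])                # SLAM pose noise (assumed)

    def predict(self, ax, ay, gyro_z, dt):
        """Propagate the state with body-frame IMU accelerations and yaw rate."""
        x, y, yaw, vx, vy = self.x
        # Rotate body-frame acceleration into the odometry frame.
        c, s = np.cos(yaw), np.sin(yaw)
        ax_w = c * ax - s * ay
        ay_w = s * ax + c * ay
        self.x = np.array([
            x + vx * dt,
            y + vy * dt,
            yaw + gyro_z * dt,
            vx + ax_w * dt,
            vy + ay_w * dt,
        ])
        # Jacobian of the motion model with respect to the state.
        F = np.eye(5)
        F[0, 3] = dt
        F[1, 4] = dt
        F[3, 2] = (-s * ax - c * ay) * dt
        F[4, 2] = (c * ax - s * ay) * dt
        self.P = F @ self.P @ F.T + self.Q

    def update_with_slam_pose(self, z):
        """Correct the prediction with an [x, y, yaw] pose from visual or LiDAR SLAM."""
        H = np.zeros((3, 5))
        H[0, 0] = H[1, 1] = H[2, 2] = 1.0                   # measurement observes x, y, yaw
        innovation = z - H @ self.x
        innovation[2] = (innovation[2] + np.pi) % (2 * np.pi) - np.pi   # wrap yaw residual
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)                 # Kalman gain
        self.x = self.x + K @ innovation
        self.P = (np.eye(5) - K @ H) @ self.P

In use, predict would be called at the IMU rate and update_with_slam_pose whenever a new SLAM pose arrives, which is how fast acceleration changes can be smoothed while the SLAM correction limits the cumulative drift.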

Methodology

ROS Robot

In contrast to the robot constructed for this experiment, a similar ROS AGV robot is shown [19]. It supports Ackermann steering through a front axle rotated by a servo. Like the robot constructed, it carries a ZED camera, as well as a Hokuyo UTM-30LX LiDAR, which has a much higher resolution than the LiDAR used in this experiment. This robot also has faster onboard computation from an Nvidia Jetson TX1 and a Razor 9DoF SEN-10736, a much higher resolution IMU that collects less noisy data from its gyroscope, accelerometer and magnetometer. This makes such a robot more suitable for this experiment, as the apparatus contributes less to the overall cumulative error. However, the same apparatus could not be used for this experiment due to cost.

The robot constructed for this investigation is presented. It is an AGV intended to imitate an autonomous car. While it has Mecanum wheels for manoeuvrability, servos rotate the front axle; as such, the extra controllable degrees of freedom of the omni wheels are removed and the robot can no longer rotate about its yaw axis on the spot. The robot therefore follows an Ackermann steering model, with the front axle turning the front wheels. It carries an Nvidia Jetson Nano (4 GB) for onboard computation, an RPLIDAR as the main sensor, 3 x 128 RPM motors, 4 x L298N H-bridge motor drivers, a ZED camera as a secondary sensor and an MPU-6050 as the IMU.
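As an illustration of how IMU data might be collected on this hardware, the sketch below reads raw accelerometer and gyroscope values from the MPU-6050 over I2C on the Jetson Nano. The bus number, the use of the smbus2 library and the absence of any calibration are assumptions for illustration and may differ from the driver actually used in the experiment.

# Minimal sketch of reading the MPU-6050 over I2C on the Jetson Nano with smbus2.
# Bus number and scale factors assume the sensor's default configuration.
from smbus2 import SMBus

MPU_ADDR = 0x68          # default MPU-6050 I2C address
PWR_MGMT_1 = 0x6B        # power management register
ACCEL_XOUT_H = 0x3B      # first of 14 data registers (accel, temp, gyro)

def to_signed(high, low):
    """Combine two bytes into a signed 16-bit value."""
    val = (high << 8) | low
    return val - 65536 if val > 32767 else val

with SMBus(1) as bus:                                  # I2C bus 1 (assumed wiring)
    bus.write_byte_data(MPU_ADDR, PWR_MGMT_1, 0)       # wake the sensor from sleep
    raw = bus.read_i2c_block_data(MPU_ADDR, ACCEL_XOUT_H, 14)
    ax = to_signed(raw[0], raw[1]) / 16384.0           # g, at the default +/-2 g range
    ay = to_signed(raw[2], raw[3]) / 16384.0
    az = to_signed(raw[4], raw[5]) / 16384.0
    gz = to_signed(raw[12], raw[13]) / 131.0           # deg/s, at the default +/-250 deg/s range
    print(f"accel: {ax:.2f} {ay:.2f} {az:.2f} g   yaw rate: {gz:.1f} deg/s")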
