2022 IB Diploma Extended Essays

A buffer containing the robot's pose values was fed back into the prediction loop: if the robot deviates from the planned path, the buffer allows its new position to be found and the path to be corrected. The planner then sends PWM signals to the motor driver and servo motor so the robot can drive autonomously.
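A minimal sketch of such a correction loop is shown below. The names (`PoseBuffer`, `steering_command`) and the proportional-correction scheme are illustrative assumptions, not the essay's actual implementation; a real planner would convert the steering angle into servo PWM duty cycles.

```python
import math
from collections import deque

class PoseBuffer:
    """Fixed-length buffer of recent (x, y, heading) pose estimates."""
    def __init__(self, maxlen=50):
        self.poses = deque(maxlen=maxlen)

    def push(self, x, y, heading):
        self.poses.append((x, y, heading))

    def latest(self):
        return self.poses[-1]

def cross_track_error(pose, path_y=0.0):
    """Deviation of the robot from a straight reference path y = path_y."""
    _, y, _ = pose
    return y - path_y

def steering_command(error, gain=0.8, max_deg=30.0):
    """Proportional steering correction in degrees, clamped to the
    servo's assumed mechanical range of +/- max_deg."""
    cmd = -gain * math.degrees(math.atan(error))
    return max(-max_deg, min(max_deg, cmd))

buf = PoseBuffer()
buf.push(0.0, 0.5, 0.0)              # robot has drifted 0.5 m off the path
err = cross_track_error(buf.latest())
cmd = steering_command(err)          # negative command steers back toward the path
```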

Criteria for Benchmarks (source code provided in the appendix)

Criterion: Pose of each SLAM algorithm

In computer vision and robotics, position and orientation are determined relative to a coordinate system; together they are called an object's pose. The robot's pose determines its trajectory []. Pose estimation is the process of determining an object's pose from an image (or a stereo image or image sequence), and different image-sensor configurations and methodologies can be used to solve it. Assuming the camera is calibrated, the correspondence between 3D points and their 2D image points is known, and the object's geometry is known, the object's projected image on the camera is a function of its pose. Once a set of control points on the object, typically corners or other feature points, is identified, the pose transformation can be solved from a set of equations relating the points' 3D coordinates to their 2D image coordinates.

Position estimation error increases for a robot whose global position and orientation cannot be directly measured. The robot's sensors measure only its relative pose between two coordinate frames at different times; they cannot measure its absolute pose in global coordinates. The robot here carries IMUs and cameras but no compass or GPS, so its absolute position must be estimated from noisy relative measurements. This can produce poor location estimates: when individual sensor measurements are concatenated to estimate the robot's position in the global frame, errors accumulate [29][30], and long runs can badly degrade the location estimate. Different SLAM algorithms will exhibit different pose errors, allowing a valid comparison of their robustness.

Yaw can be defined as a rotation about the robot's yaw (vertical) axis, i.e. a change in the direction the robot is pointing. The yaw rate, or yaw velocity, is an important measurement in robotics: the higher the yaw rate, the slower the reconstruction of the scene and the greater the pose error, so the robot changes its yaw to correct its pose.
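The accumulation of error from concatenated relative measurements can be illustrated with a short dead-reckoning simulation. This is a sketch, not the benchmark code from the appendix; the step size and yaw noise level are arbitrary assumptions chosen only to show that lateral drift grows with run length.

```python
import math
import random

def dead_reckon(n_steps, step=0.1, yaw_noise=0.02):
    """Integrate noisy relative yaw measurements. The true robot drives
    straight along +x, so any y displacement is accumulated error."""
    x = y = yaw = 0.0
    for _ in range(n_steps):
        yaw += random.gauss(0.0, yaw_noise)  # noisy relative rotation
        x += step * math.cos(yaw)
        y += step * math.sin(yaw)
    return x, y

def avg_drift(n_steps, trials=200, step=0.1, yaw_noise=0.02):
    """Average absolute lateral drift over many seeded runs."""
    total = 0.0
    for t in range(trials):
        random.seed(t)
        _, y = dead_reckon(n_steps, step, yaw_noise)
        total += abs(y)
    return total / trials

short_run_drift = avg_drift(10)     # short run: small accumulated error
long_run_drift = avg_drift(1000)    # long run: much larger accumulated error
```

Because each relative measurement is noisy, the heading performs a random walk and the position error compounds with every step, which is exactly why long runs degrade the location estimate and why pose error serves as a benchmark criterion.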

Pose error of each SLAM algorithm

Changes in Yaw

Discussion

Data discussing and comparing Hector SLAM, ORB-SLAM2 and ZEDfu is attached in the appendix.
