Hi everyone,
I’m doing the “ROS2 Basics in 5 Days” course project.
My full system works correctly in the provided Gazebo simulation: the wall_finder finds the nearest wall and aligns to it, wall_following follows the right-hand wall at ~0.2–0.3 m, and the behavior is smooth and stable in sim.
However, when I connect to FastBot and run the exact same code, the robot behaves poorly.
Two issues:
- The LiDAR scan appears reversed compared to simulation (I can compensate in code, but I’m not sure if there’s an official fix/config).
- The bigger problem: the robot’s motion is very “bursty” / jerky. It seems to move in short pulses rather than smoothly. Because of that:
- it's hard to precisely face the nearest wall in find_wall, because the distance readings change too abruptly
- the wall following controller over-corrects and loses the right-hand wall
In simulation the same controller logic is stable, but on the real robot the velocity command doesn’t seem to result in continuous motion.
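One mitigation I've started experimenting with is ramping the commanded velocity toward the target instead of stepping it, so each control cycle changes speed by at most a fixed amount. A minimal sketch in plain Python (the 0.05 m/s-per-cycle limit is just my guess, not a documented value for this robot):

```python
def ramp_toward(current: float, target: float, max_delta: float) -> float:
    """Move `current` toward `target`, changing by at most `max_delta` per call."""
    if target > current + max_delta:
        return current + max_delta
    if target < current - max_delta:
        return current - max_delta
    return target

# Called once per control cycle before publishing the Twist, e.g.:
# self.cmd.linear.x = ramp_toward(self.cmd.linear.x, 0.25, 0.05)
```

Is something like this the right direction for the real robot, or does the bringup already apply its own smoothing?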
Questions:
- Is this expected behavior in the remote lab (e.g., network rate limiting, smoothing, safety controller, cmd_vel timeout, friction differences)?
- Is there a recommended way to publish /cmd_vel on the real robot (rate/QoS, max speeds, ramping)?
- Any known parameters to tune (e.g., acceleration limits, velocity smoothing, TurtleBot3 bringup settings)?
- Is there an official way to handle the “reversed LiDAR” issue (frame/config) instead of manually remapping indices in code?
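For reference, this is roughly the manual compensation I'm doing now for the reversed scan: I mirror the ranges array before indexing, on the assumption that only the sweep direction differs between sim and the real LiDAR (and that angle_min/angle_max are symmetric). Plain-Python sketch:

```python
def mirror_ranges(ranges):
    """Reverse a LaserScan ranges array so that index i points at the same
    physical direction as in simulation (assumes only the sweep direction
    is flipped and the angular limits are symmetric)."""
    return list(reversed(ranges))

def index_for_angle(angle, angle_min, angle_increment, num_readings):
    """Nearest ranges index for a given beam angle (radians), clamped."""
    i = round((angle - angle_min) / angle_increment)
    return max(0, min(num_readings - 1, i))
```

It works, but it feels like the wrong layer to fix this at, hence the question about a frame/config solution.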
Any guidance on how to adapt a wall-following controller for this real robot setup would be appreciated. Thanks!