Wall follower works in Gazebo, but real TurtleBot motion is “bursty” and breaks wall tracking

Hi everyone,
I’m doing the “ROS2 Basics in 5 Days” course project.

My full system works correctly in the provided Gazebo simulation: the wall_finder finds the nearest wall and aligns to it, wall_following tracks the right-hand wall at ~0.2–0.3 m, and the behavior is smooth and stable in sim.

However, when I connect to FastBot and run the exact same code, the robot behaves poorly.

Two issues:

  1. The LiDAR scan appears reversed compared to simulation (I can compensate in code, but I’m not sure if there’s an official fix/config).
  2. The bigger problem: the robot’s motion is very “bursty” / jerky. It seems to move in short pulses rather than smoothly. Because of that:
  • it’s hard to precisely face the nearest wall in find_wall
  • the distance readings change too abruptly
  • the wall following controller over-corrects and loses the right-hand wall
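For issue 1, the compensation I have in mind is roughly the following (a minimal sketch with my own helper names, assuming the real scan is a simple mirror of the simulated one, which is why I'd rather have an official frame/config fix):

```python
# Hypothetical index remap, assuming the real scan is a mirror image
# of the simulated one. Instead of scattering "reversed" logic through
# the controller, all reads go through one helper.

def remap_beam(i: int, n: int, mirrored: bool = True) -> int:
    """Return the index of the beam pointing at the same physical angle
    as simulated beam i, for a scan of n beams."""
    return (n - 1 - i) if mirrored else i

def range_at(ranges, sim_index: int, mirrored: bool = True) -> float:
    """Read a range measurement using simulation-style indexing."""
    return ranges[remap_beam(sim_index, len(ranges), mirrored)]
```

For example, with a mirrored 3-beam scan `[1.0, 2.0, 3.0]`, `range_at(ranges, 0)` returns `3.0`, i.e. the beam that simulation indexing calls 0.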

In simulation the same controller logic is stable, but on the real robot the velocity command doesn’t seem to result in continuous motion.

Questions:

  • Is this expected behavior in the remote lab (e.g., network rate limiting, smoothing, safety controller, cmd_vel timeout, friction differences)?
  • Is there a recommended way to publish /cmd_vel on the real robot (rate/QoS, max speeds, ramping)?
  • Any known parameters to tune (e.g., acceleration limits, velocity smoothing, TurtleBot3 bringup settings)?
  • Is there an official way to handle the “reversed LiDAR” issue (frame/config) instead of manually remapping indices in code?

Any guidance on how to adapt a wall-following controller for this real robot setup would be appreciated. Thanks!

Hi @Vortex
This should answer some of your questions about the lidar.

The real robot does not support “latching” (continuously applying the last velocity command until a new one arrives) because that’s impractical on real hardware. You must keep the “pedal” pressed: publish /cmd_vel at a steady rate, or the robot stops between messages, which produces exactly the bursty motion you describe.
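The “keep the pedal pressed” pattern looks roughly like this (a sketch with ROS stripped out so the shape is clear; in an rclpy node, `tick()` would be a `create_timer` callback publishing a `geometry_msgs/Twist` on /cmd_vel, and the rate is my assumption):

```python
# Sketch of a keep-alive command publisher, without ROS for brevity.
# The controller sets a target once; a fixed-rate tick re-emits the
# latest target every cycle so the robot never sees a gap in commands.

class CmdVelKeepAlive:
    def __init__(self):
        self.linear = 0.0
        self.angular = 0.0
        self.sent = []          # stands in for the /cmd_vel publisher

    def set_target(self, linear: float, angular: float) -> None:
        """Called whenever the controller decides on a new command."""
        self.linear = linear
        self.angular = angular

    def tick(self) -> None:
        """Called at a fixed rate (e.g. 10 Hz, an assumption): always
        re-send the current command, even if it hasn't changed."""
        self.sent.append((self.linear, self.angular))
```

The point is that `tick()` runs unconditionally at the timer rate; the controller only changes the target, it never decides when to publish.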

Additionally, the real robot has a speed limit, just as real vehicles do. Let me confirm the number… it’s 0.25 m/s.
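Putting the two points together, one simple way to smooth the over-corrections is to clamp commands to the 0.25 limit and slew-rate-limit the change per control cycle. A minimal sketch (the 0.25 figure is from above; the per-cycle step size is my own guess and would need tuning):

```python
# Clamp-plus-ramp sketch for linear velocity commands.
MAX_LINEAR = 0.25   # m/s, reported limit on the real robot
MAX_STEP = 0.05     # m/s change allowed per control cycle (assumption)

def clamp(v: float, lo: float, hi: float) -> float:
    return max(lo, min(hi, v))

def next_linear(current: float, target: float) -> float:
    """Move current toward target, limited to MAX_STEP per cycle and
    clamped to the robot's speed limit."""
    target = clamp(target, -MAX_LINEAR, MAX_LINEAR)
    step = clamp(target - current, -MAX_STEP, MAX_STEP)
    return current + step
```

Calling `next_linear` once per control cycle gives a ramp instead of a jump: from a standstill, a request for full speed is reached over several cycles rather than in one burst.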