LaserScan not matching the map layout when the robot is moving

Hi,

I am working with ROS and using the Navigation Stack for autonomous navigation. I am running this on a custom-built autonomous mobile robot.

I am running the following packages before moving the robot:

  1. map_server to serve the map
  2. AMCL to localise the robot

After the above 2 packages are online, I localise the robot using rviz’s “2D Pose Estimate” tool, but when I start moving the robot, the laser scan does not align with the map. The same is shown in the attached video.

In the attached video, you can see that when the robot rotates, the laser scan no longer matches the map’s layout.

Note: My encoder values are true to the physical world, that is, when the robot moves 1 m in the real world, the same distance is reflected in rviz.

What could be the issue, and how can it be resolved?

Hi @Joseph1001 ,

The “drifting” of the sensed scan against the map that you observe is usually due to the odometry or the IMU.
If you are really sure that the odometry values are true to the real world, then you probably have a faulty IMU. Even the “best” IMU can misbehave if it has a bad filter, or no filter at all.

There could also be external disturbances, such as a high-EMI environment, but I highly doubt that.
Another possible reason is that the IMU is not sufficiently shielded from the electromagnetic effects of the motors, if the motors are mounted close to the IMU on your robot.

So here are your possible fixes:

  1. Apply a filter (Kalman, particle, or LMS) to the IMU values.
  2. Check for odometer/encoder drift.
  3. Shield your IMU from the motors’ electromagnetic field, and from the RF field if you have any radio on your robot.
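For the first fix, here is a minimal sketch of a one-dimensional Kalman filter for smoothing a noisy yaw-rate signal. The noise variances `q` and `r` are made-up values that you would need to tune for your actual IMU:

```python
class ScalarKalman:
    """Minimal 1-D Kalman filter for smoothing a noisy scalar signal,
    e.g. an IMU yaw-rate reading. q and r are assumed/tunable values."""

    def __init__(self, q=1e-4, r=0.05, x0=0.0, p0=1.0):
        self.q = q    # process noise variance (how fast the true rate changes)
        self.r = r    # measurement noise variance (how noisy the sensor is)
        self.x = x0   # current state estimate
        self.p = p0   # current estimate variance

    def update(self, z):
        # Predict step: constant-rate model, uncertainty grows by q
        self.p += self.q
        # Update step: blend prediction with measurement z
        k = self.p / (self.p + self.r)   # Kalman gain
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x

# Example: smooth a few noisy yaw-rate samples (rad/s)
f = ScalarKalman()
smoothed = [f.update(z) for z in (0.9, 1.1, 1.0, 0.95)]
```

The same idea extends to a full pose/velocity state vector; in ROS, packages such as robot_localization implement this properly.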

The thing to note here is that your scanner works correctly: even after the robot drifts, the shape of the room is preserved in the scan. So I would say the problem is most likely your IMU.



Hi @girishkumar.kannan , thanks for looking into this.

Our system doesn’t have an IMU, and we have already tried an EKF on the odom data, but there was no improvement.

Let me know how to approach this.


Hi @Joseph1001 ,

If you are not using an IMU, then are you doing “dead reckoning” with just the encoders?
If so, I have to say that dead reckoning alone is not a good method to calculate odometry - the errors accumulate without bound.

The best you can do for scan matching in a system without an IMU is to move the robot slowly.
Fast movements create drift and jitter in the encoder ticks, especially during turns (which can be observed in your video).

If you can provide further info on how your robot system is designed, perhaps I can help you identify the cause of the erratic behavior.



We are using only encoder ticks for calculating odometry (note: my robot is a two-wheel differential drive),
moving at around 0.25 m/s linear and 0.5 rad/s angular.
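For reference, our odometry is essentially the textbook differential-drive dead-reckoning update. The sketch below shows the idea; the wheel radius, wheel base, and ticks-per-revolution are placeholder values, not our real parameters:

```python
import math

# Assumed robot parameters -- placeholders, not real values
WHEEL_RADIUS  = 0.05    # m
WHEEL_BASE    = 0.30    # m, distance between the two drive wheels
TICKS_PER_REV = 1024    # encoder ticks per wheel revolution

def update_pose(x, y, theta, d_ticks_left, d_ticks_right):
    """Dead-reckoning pose update from encoder tick increments."""
    dist_per_tick = 2.0 * math.pi * WHEEL_RADIUS / TICKS_PER_REV
    d_left  = d_ticks_left  * dist_per_tick
    d_right = d_ticks_right * dist_per_tick
    d_center = (d_left + d_right) / 2.0          # forward travel
    d_theta  = (d_right - d_left) / WHEEL_BASE   # heading change
    # Midpoint approximation for the heading during the motion
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta += d_theta
    return x, y, theta
```

Note that any tick slip during a turn feeds directly into d_theta, which is why rotation errors dominate.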

As you suggested, I will try moving it at several lower speeds to check whether that improves things, and will also add an IMU to the system.

Will let you know the results after testing.


Hi @Joseph1001 ,

Just as I suspected, you are having encoder jitter, which is magnifying your localization errors.
As you already know, a differential-drive turn spins the wheels in opposite directions to rotate toward a specific side. If your encoder resolution is coarse (few ticks per revolution, with a large gap between two consecutive ticks), the system will fail to detect tick slips during turns. This amplifies your encoder error.
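To put a number on it, here is a quick back-of-the-envelope calculation with assumed example parameters (a 5 cm wheel radius, 30 cm wheel base, and a coarse 360-tick encoder - substitute your robot's real values):

```python
import math

# Assumed example values -- substitute your robot's real numbers
WHEEL_RADIUS  = 0.05   # m
WHEEL_BASE    = 0.30   # m
TICKS_PER_REV = 360    # a coarse encoder

dist_per_tick = 2.0 * math.pi * WHEEL_RADIUS / TICKS_PER_REV
# Missing one tick on one wheel during a spin shifts the heading estimate by:
heading_err_per_tick = dist_per_tick / WHEEL_BASE
print(math.degrees(heading_err_per_tick))  # ~0.17 degrees per slipped tick
```

A few dozen slipped ticks during one fast turn therefore adds up to several degrees of heading error, which is consistent with the misalignment in your video.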

Yes, you should add an IMU, or at least an accelerometer, and do some sensor fusion (for example with an EKF) to correct the encoder odometry with the IMU/accelerometer measurements.


This topic was automatically closed 5 days after the last reply. New replies are no longer allowed.