Saturday, February 15, 2014

Arduino + Matlab SLAM

Purpose:

    Explore the subject of simultaneous localization and mapping (SLAM) robots. I have separated this venture into several phases, each designed to build upon the previous one. Here is a short description of my plan:

Phase 1: Build a simple Arduino robot that gathers environment data from several ultrasonic distance sensors, and begin trying to map the robot's surroundings

Phase 2: Replace the Arduino with a Raspberry Pi and the ultrasonic sensors with an Xbox Kinect sensor, and begin implementing some of the more robust algorithms to make the mapping more accurate

Phase 3: Replace the Raspberry Pi with an on-board computer and the Kinect sensor with LIDAR, and fully implement SLAM.

    This is a tentative plan and will involve quite a few steps in between each phase, but should be a good template for later revisions.

What is SLAM?

    From the name, you can guess that I am trying to map an environment and, at the same time, pinpoint the robot's location within said map. The difficulty with SLAM is that it produces something of a catch-22. In order to know where you are within an environment, you must first have a map of that environment to compare your sensor data against. However, you do not have a map and must build one, and in order to build one you must first know where you are within the map you are building [1].

    4 Steps of SLAM [1] (a small Matlab sketch of the data-association step follows this list):

  1. Acquire sensor data
  2. Discern from the data any landmarks (walls, chair legs, etc.)
  3. Compare found landmarks with known ones via data association
  4. Update location and map
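
    To get a feel for what step 3 looks like in code, below is a minimal Matlab sketch of one simple way to do data association: nearest-neighbor matching with a distance gate. The landmark coordinates and the 0.1 m gate are made-up example values, not data from the robot.

    % Minimal sketch of data association (step 3) using nearest-neighbor
    % matching with a distance gate. All numbers are made-up examples.
    knownLandmarks = [1.0 0.5; 2.0 1.5; 0.5 2.0];  % previously mapped [x y] (m)
    newLandmarks   = [1.05 0.48; 3.0 3.0];         % extracted from the latest scan
    gate = 0.1;                                    % max distance to accept a match (m)

    matches = zeros(size(newLandmarks, 1), 1);     % 0 = brand new landmark
    for i = 1:size(newLandmarks, 1)
        d = bsxfun(@minus, knownLandmarks, newLandmarks(i, :));
        [dmin, j] = min(sqrt(sum(d.^2, 2)));
        if dmin < gate
            matches(i) = j;        % re-observed landmark j -> use it in step 4
        end                        % otherwise it gets added to the map in step 4
    end
    disp(matches)                  % first point matches landmark 1, second is new

    In the real thing the gate would be set from the sensor's noise, but the structure of the matching loop stays the same.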

Phase 1 Hardware

The system consists of two parts:
    (1) An Arduino-controlled robot
    (2) A laptop running Matlab


    The robot is controlled by an Arduino Mega 2560. I chose this over an Uno or just using a PIC micro because at this stage I need a controller that is easy to set up, easy to program, and has enough pins for the robot's sensor payload and other hardware. An Arduino Motor Shield sits on the Mega to interface with the DC drive motors. The chassis is the DFRobot 2WD Mobile Platform, with a Prototyping Plate (also from DFRobot) added to make room for more hardware. So, the robot is a 2WD, 3-tier robot with the prototyping plate in the middle. The prototyping plate is home to the Mega + Motor Shield, the XBee, a breadboard to distribute power to the sensors, and a 9V battery to power the Mega. Figures 1, 2, and 3 show the robot from the front, left, and right sides, respectively (note: ultrasonic array not depicted). The top of the robot is home to the ultrasonic sensor assembly, which includes the sensor array from Figures 4 and 5, as well as the servo motor that swings it around during data acquisition.


Figure 1: Front side of robot

Figure 2: Left side of robot



Figure 3: Right side of robot
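
    Since the mapping work happens in Matlab on the laptop, the robot streams its readings over the XBee link mentioned above. Below is a minimal sketch of the Matlab side, assuming the Arduino prints one comma-separated line (servo angle plus three distances) per measurement; the COM port, baud rate, and line format are assumptions for illustration, not the final protocol.

    % Minimal sketch of reading the XBee serial stream in Matlab.
    % COM port, baud rate, and the "angle,d1,d2,d3" format are assumptions.
    s = serial('COM3', 'BaudRate', 9600);    % XBee shows up as a normal serial port
    fopen(s);

    scan = [];
    for i = 1:19                             % e.g. one line every 10 degrees of sweep
        line = fgetl(s);                     % e.g. "90,34.2,51.0,47.7"
        scan(end+1, :) = sscanf(line, '%f,%f,%f,%f')';  %#ok<SAGROW>
    end

    fclose(s);
    delete(s);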

4 Steps of SLAM: Phase 1 Implementation:


  Acquire sensor data

    The sensors I am using are Parallax PING))) ultrasonic distance sensors. I have 3 sensors mounted on a fixture that is attached to a servo motor, which sweeps the array around 180 degrees. Figures 4 and 5 show the configuration.

Figure 4: Top of sensor array. Sensors are angled 45 degrees apart.
Figure 5: Front of sensor array.

    Since the outer sensors are angled 45 degrees to either side of the center sensor and the servo can rotate 180 degrees, a single scan gives me a 270-degree view of the robot's surroundings.
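
    To sanity check that geometry, here is a minimal Matlab sketch of turning one sweep into (x, y) points in the robot frame. The distance values are made up; the -45/0/+45 degree sensor offsets come from the fixture in Figures 4 and 5.

    % Minimal sketch: convert one sweep into (x, y) points in the robot frame.
    % The 'ranges' matrix is made-up example data; real values come from the PING)))s.
    servoAngles  = 0:10:180;                 % servo positions during the sweep (deg)
    sensorOffset = [-45 0 45];               % sensor angles on the fixture (deg)
    ranges       = 0.5 + 0.2*rand(numel(servoAngles), 3);   % fake distances (m)

    pts = [];
    for i = 1:numel(servoAngles)
        for k = 1:3
            theta = (servoAngles(i) + sensorOffset(k)) * pi/180;    % beam angle
            pts(end+1, :) = ranges(i, k) * [cos(theta) sin(theta)]; %#ok<SAGROW>
        end
    end
    plot(pts(:,1), pts(:,2), '.')            % beams span -45 to 225 deg, i.e. 270 deg
    axis equal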

    Before beginning this post, I did some validation testing to see whether these sensors were robust and reliable enough to produce good-looking data. During this testing I only used the sensor at the 90-degree position to gather data, so only 180 degrees will be shown. Figure 6 shows an overlay of the data I got onto the surface I scanned (my workbench).


Figure 6: Distance readings overlay on workbench. 

    This is the raw output of the ultrasonic sensor. I was pretty happy with the results of this test, considering it's only a $30 sensor giving me only 2D data. I am hoping that with the addition of the other 2 sensors I will get better definition of objects.

    Before I move on with more data acquisition, I did some simulations of how my dead-reckoning algorithms would hold up on the current platform. Unfortunately, I went the low-cost route for my motor encoders and they only have 10 ticks/rev (I know, it's horrid). So, I developed a small Matlab simulation that imports a true travel path and exports both a measured path that includes measurement error and a basic Kalman-filter-adjusted path that tries to better predict the robot's position. Figure 7 shows a single run of the simulation and Figure 8 shows the error that accumulated during the simulation.


Figure 7: Single run of dead-reckoning simulation. Estimated path is Kalman filtered.

Figure 8: Resultant error between the estimated position and the true position.


    From Figure 8, it is clear that since the encoders have such low resolution, the largest impact is going to be on the robot's heading.
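
    To put a rough number on that, here is a small Matlab back-of-the-envelope check. Only the 10 ticks/rev figure comes from the actual encoders; the wheel diameter and wheel base below are assumed example values.

    % Why coarse encoders hurt heading the most: with 10 ticks/rev, a single
    % tick of mismatch between the two wheels already looks like a large turn.
    % Wheel diameter and wheel base are assumed example values.
    ticksPerRev = 10;
    wheelDia    = 0.065;                     % m (assumed)
    wheelBase   = 0.15;                      % m (assumed)

    mPerTick  = pi*wheelDia / ticksPerRev;   % distance resolution per tick
    dThetaMin = mPerTick / wheelBase;        % heading change from a 1-tick mismatch

    fprintf('Distance resolution: %.1f mm per tick\n', 1000*mPerTick)
    fprintf('Heading resolution:  %.1f deg per tick of wheel mismatch\n', dThetaMin*180/pi)
    % With these example numbers, one tick of difference between the wheels
    % reads as a ~7.8 degree turn, so heading error piles up quickly.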


[1] http://ocw.mit.edu/courses/aeronautics-and-astronautics/16-412j-cognitive-robotics-spring-2005/projects/1aslam_blas_repo.pdf