Front end of Ferret: the 2D LIDAR in front, followed by the camera and the two large planar LEDs.
The business end of the robot is equipped with an oscillating mechanism that houses a 2D SICK LIDAR and a CMOS Blackfly camera with lighting. The basic con-ops of the device involves slowly rotating the mechanism to gather 3D laser data along with visual imagery, then running a second scan that stops at set positions to gather sequences of images for HDR image creation and 360-degree panoramic image stitching. The robot is controlled via ROS, with computing handled by an onboard ODROID XU4. An onboard Xsens MTi-20 VRU IMU provides an attitude estimate so that the individual laser scans and imagery can be properly combined despite any movement while the robot is gathering data.
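To give a rough sense of how the mechanism angle and the IMU attitude estimate come together, the sketch below projects a single 2D scan line into 3D. The geometry (a nod about the x-axis, a scan plane at z = 0) and all function names here are illustrative assumptions, not the actual Ferret calibration or code.

```python
# Minimal sketch (assumed geometry): project one 2D LIDAR scan line into 3D
# using the nodding-mechanism angle and the IMU attitude estimate.
import numpy as np

def rot_x(a):
    """Rotation matrix about the x-axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def scan_to_points(ranges, beam_angles, nod_angle, R_imu):
    """ranges, beam_angles: arrays from one 2D scan (meters, radians).
    nod_angle: current rotation of the oscillating mechanism (radians).
    R_imu: 3x3 attitude estimate from the VRU (body frame -> world frame)."""
    # Points in the LIDAR's own scan plane (z = 0).
    pts = np.stack([ranges * np.cos(beam_angles),
                    ranges * np.sin(beam_angles),
                    np.zeros_like(ranges)], axis=0)
    # Tilt the scan plane by the mechanism angle, then level it with the IMU.
    return (R_imu @ rot_x(nod_angle) @ pts).T

# Example: a flat 1 m scan, mechanism tilted 10 degrees, robot held level.
angles = np.linspace(-np.pi / 2, np.pi / 2, 181)
cloud = scan_to_points(np.ones_like(angles), angles,
                       np.deg2rad(10.0), np.eye(3))
print(cloud.shape)  # (181, 3)
```

Accumulating these per-scan clouds over a full sweep of the mechanism yields the 3D scan.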
Example scan of the FRC Highbay
I lead the full system development, from device driver software and electrical design to the post-processing and auto-reporting software. The onboard software is a mixture of C++ and Python, built on ROS and OpenCV. Post-processing is currently prototyped in Matlab, but may be moved to C++/OpenCV/PCL for speed. I am currently deploying the robot for the city of Pittsburgh to map subterranean voids near high-traffic areas.
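As one concrete piece of that image pipeline, the snippet below shows one way the bracketed exposure sequences captured at each stop position could be combined using OpenCV's exposure-fusion (Mertens) routine. The file names are placeholders and the choice of fusion method is an assumption for illustration; the actual onboard code may merge exposures differently.

```python
# Hypothetical example: fuse a bracketed exposure sequence from one stop
# position into a single well-exposed frame with exposure fusion.
import cv2
import numpy as np

# Placeholder file names for one stop position (not real paths).
paths = ["stop_03_exp_short.png", "stop_03_exp_mid.png", "stop_03_exp_long.png"]
images = [cv2.imread(p) for p in paths]

# Mertens exposure fusion blends the bracketed frames without needing
# exposure times or a camera response curve.
fused = cv2.createMergeMertens().process(images)

# The result is float32, roughly in [0, 1]; scale back to 8-bit for saving.
out = np.clip(fused * 255, 0, 255).astype("uint8")
cv2.imwrite("stop_03_fused.png", out)
```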