MXET 300 Final Project
Personal Videographer SCUTTLE Robot with Obstacle Avoidance
The purpose of this robot is to track a person at a set distance at all times while simultaneously avoiding dynamic obstacles and stairs. The project uses a BeagleBone Blue board to control the primary sensors: a camera, two LiDAR sensors, and an ultrasonic sensor. The robot is built on the SCUTTLE Robotics platform.
Team: Aaron Luna and Luke Baber
Physical Structure
The entire robot was modeled in Creo 6 and Inventor Professional 2022, and the custom parts were 3D printed in PLA filament.

Toward the front of the SCUTTLE is the SICK LiDAR scanner, which detects obstacles within a 270-degree field of view so the robot can steer around them.

At the front of the SCUTTLE is an ultrasonic sensor pointed at the floor for drop-off protection.

On the back of the SCUTTLE is a tall vertical stand that houses the facial tracking module.

Wiring Diagram
The project wiring diagram is shown to the left. The motor encoders connect to the BeagleBone over I2C. The Luna LiDAR range finder and the ultrasonic sensor communicate over UART. The camera and the SICK LiDAR scanner connect to the BeagleBone's USB port through a USB splitter.
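As an illustration of the UART link, below is a minimal sketch of reading the Luna range finder with pyserial. It assumes the sensor's standard 9-byte frame format (two 0x59 header bytes, a little-endian distance in centimeters, signal strength, temperature, and a checksum) and a hypothetical device name /dev/ttyS1; the actual port depends on which BeagleBone UART header the sensor is wired to.

```python
import serial  # pyserial

PORT = "/dev/ttyS1"  # hypothetical; depends on the UART header used

def read_distance_cm(ser):
    """Read one 9-byte range frame and return the distance in centimeters."""
    while True:
        # Resynchronize on the two 0x59 header bytes.
        if ser.read(1) != b"\x59" or ser.read(1) != b"\x59":
            continue
        body = ser.read(7)  # dist_L, dist_H, strength_L/H, temp_L/H, checksum
        if len(body) != 7:
            continue
        # The checksum is the low byte of the sum of the first eight bytes.
        if (0x59 + 0x59 + sum(body[:6])) & 0xFF != body[6]:
            continue  # corrupted frame; keep searching
        return body[0] | (body[1] << 8)

with serial.Serial(PORT, 115200, timeout=1) as ser:
    print("distance:", read_distance_cm(ser), "cm")
```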

Primary Sensors
The Luna LiDAR range finder is a compact sensor capable of detecting objects within a range of 0.1 to 27 feet. It was used to maintain a fixed distance between the robot and the tracked subject.

For image tracking of an individual, we employed the DEPSTECH 1080p webcam. Although any HD webcam could have served this purpose, we chose this model for its low cost and high resolution.

With its 270-degree field of view, the SICK LiDAR scanner enabled us to identify objects obstructing the robot's trajectory and navigate around them.
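The avoidance logic itself can be illustrated independently of the SICK driver. Below is a minimal sketch, not the project's exact algorithm: it assumes the scan has already been parsed into (angle, range) pairs spanning the 270-degree field of view, and it simply steers toward the clear angle nearest the subject's heading.

```python
def pick_heading(scan, goal_deg, clearance_m=0.5):
    """Pick a steering angle from a 270-degree scan.

    scan: list of (angle_deg, range_m) pairs, angles -135 to +135.
    goal_deg: heading toward the tracked subject.
    Returns the clear angle closest to the goal, or None if all are blocked.
    """
    clear = [angle for angle, rng in scan if rng > clearance_m]
    if not clear:
        return None  # nowhere safe to go; stop the robot
    return min(clear, key=lambda angle: abs(angle - goal_deg))

# Example: an obstacle dead ahead forces a small detour around it.
scan = [(a, 0.3 if -10 <= a <= 10 else 2.0) for a in range(-135, 136, 5)]
print(pick_heading(scan, goal_deg=0.0))  # -> -15 (the nearest clear angle)
```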

Subject Tracking
To track the subject, we initially employed OpenCV's Haar cascade face detector. However, the BeagleBone could not run it at a satisfactory frame rate.

We maintained a high frame rate instead by using an HSV color filter calibrated to detect a red shirt. This technique let the camera track a person wearing a red shirt swiftly and accurately.
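A minimal sketch of the HSV approach with OpenCV is shown below. Red straddles the ends of the hue axis, so two ranges are combined; the exact bounds here are illustrative calibration values, not the project's.

```python
import cv2

cap = cv2.VideoCapture(0)  # the USB webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Mask both ends of the hue axis, since red wraps around hue 0.
    mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))
    m = cv2.moments(mask)
    if m["m00"] > 0:  # some red pixels were found
        cx = m["m10"] / m["m00"]  # horizontal centroid of the red blob
        # The offset from image center becomes the steering error.
        print("steering error (px):", cx - frame.shape[1] / 2)
```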

Programming
The project was programmed in Python 3.7, whose wide range of open-source libraries sped up development of the robot.

Furthermore, the subject tracking and robot movement routines were carried out on two distinct threads, so that slow image processing never blocked the motion control loop.
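A minimal sketch of that two-thread structure is shown below. The get_subject() and drive() helpers are hypothetical placeholders for the real camera/LiDAR pipeline and the SCUTTLE motor commands; the point is that the slow vision loop never stalls the motion loop.

```python
import threading
import time

state = {"offset_px": 0.0, "distance_cm": 150.0}
lock = threading.Lock()

def get_subject():
    # Placeholder for the real pipeline: HSV centroid + Luna range reading.
    return 0.0, 150.0

def drive(forward, turn):
    # Placeholder for the real SCUTTLE wheel-speed command.
    pass

def tracking_loop():
    """Continuously update the subject's position in shared state."""
    while True:
        offset, distance = get_subject()
        with lock:
            state["offset_px"], state["distance_cm"] = offset, distance

def movement_loop():
    """Steer toward the subject and hold the set following distance."""
    TARGET_CM, KP_TURN, KP_DRIVE = 150.0, 0.002, 0.01
    while True:
        with lock:
            offset, distance = state["offset_px"], state["distance_cm"]
        turn = KP_TURN * offset                      # center the subject
        forward = KP_DRIVE * (distance - TARGET_CM)  # hold the set distance
        drive(forward, turn)
        time.sleep(0.05)  # ~20 Hz motion updates

threading.Thread(target=tracking_loop, daemon=True).start()
threading.Thread(target=movement_loop, daemon=True).start()
time.sleep(2)  # let the demo run briefly
```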

Complete Physical Model
