Ball Following Lego Robot

1. Introduction

After being inspired by the Computer Vision and Mobile Robotics modules at university, I decided to build a Lego robot using the Mindstorms kit and a webcam that would be able to chase a ball. This was done for a bit of fun and in all took me a couple of weeks to get working. This included writing all the program code (both the code to analyse the images from the webcam to locate the ball and the code to actually control the robot) as well as constructing the robot itself. Java was used to write both the vision processing application and the robot control code, which runs on the leJOS firmware.

Lego Robot

The design of the robot itself is very basic, and so is the design of the program code (well, relatively!), but it all works reasonably well!

2. System Architecture

Obviously the microprocessor inside the Lego RCX brick isn’t up to image processing, so this has to be done on the PC. Mind you, it’s not like the RCX has a USB port for the camera anyway! To keep things modular, a second program runs on the PC which receives the coordinate data from the vision program and then sends the appropriate command (left, right, forward, etc.) to the robot via the IR tower. A third program, running on the Lego robot itself, is a very simple remote control server which listens for instructions from the IR tower and carries them out. In other words, all the processing and hard work is done on the PC and the robot is remote controlled from the PC.

System Architecture

3. Image Processing Application

The first problem to solve is how to recognise a ball. A webcam is attached to the front of the robot and using the Java Media Framework it is possible to grab images from the camera.

Now that we have an image from the camera we need to identify the ball. From my computer vision module there are two main approaches: one is to look for a specific shape using edge detection, etc.; the other is to look for regions of a particular colour. By looking for a specific colour the robot doesn’t necessarily have to chase only balls, but anything of that colour. This approach is also much simpler to program.
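The colour test boils down to converting each pixel to HSI and checking it against a user-selected range. Here is a minimal Java sketch of that idea, using the standard RGB-to-HSI conversion; this is an illustration only, not the actual JColourTracker code, and the class and method names are made up.

```java
// Hypothetical sketch: convert an RGB pixel to HSI and test it against
// a user-selected range, as the tracker does for every pixel.
public class HsiThreshold {

    // Returns {hue in degrees 0-360, saturation 0-1, intensity 0-1}.
    static double[] rgbToHsi(int r, int g, int b) {
        double rn = r / 255.0, gn = g / 255.0, bn = b / 255.0;
        double intensity = (rn + gn + bn) / 3.0;
        double min = Math.min(rn, Math.min(gn, bn));
        double saturation = intensity == 0 ? 0 : 1.0 - min / intensity;
        double hue = 0;
        double denom = Math.sqrt((rn - gn) * (rn - gn) + (rn - bn) * (gn - bn));
        if (denom != 0) {
            double theta = Math.toDegrees(
                Math.acos(0.5 * ((rn - gn) + (rn - bn)) / denom));
            hue = (bn <= gn) ? theta : 360.0 - theta;
        }
        return new double[] { hue, saturation, intensity };
    }

    // True if each HSI component lies within [min, max].
    static boolean inRange(double[] hsi, double[] min, double[] max) {
        for (int i = 0; i < 3; i++) {
            if (hsi[i] < min[i] || hsi[i] > max[i]) return false;
        }
        return true;
    }
}
```

In the real program the min/max values come from the sliders (or from the colours inside a selection box drawn on the live stream), and every pixel passing the test ends up in the processed image.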

JColourTracker

The screenshot above shows the program I developed to detect regions of a particular colour. You can read about all the gory details in my Java Colour Tracker project. Firstly the user selects the range of colour they wish to detect, either by dragging the sliders for the minimum and maximum HSI or by drawing a selection box around an area of that colour on the live stream. The processed image then shows only those parts of the image that fall within the selected range. The blob colouring algorithm is then used to detect regions of colour. The largest region is assumed to be the object we wish to track. The top-left and bottom-right coordinates are then output and a box is drawn around the object being tracked.
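The blob colouring step can be sketched as a flood fill over the thresholded mask: label each connected region, keep the largest, and report its bounding box. This is an illustration of the technique rather than the actual JColourTracker implementation, and the names are made up.

```java
import java.util.ArrayDeque;

// Hypothetical sketch of the blob colouring step: flood-fill each
// 4-connected region of "object" pixels, keep the largest one, and
// return its bounding box as {left, top, right, bottom}.
public class BlobTracker {

    static int[] largestRegionBounds(boolean[][] mask) {
        int h = mask.length, w = mask[0].length;
        boolean[][] seen = new boolean[h][w];
        int bestSize = 0;
        int[] best = null;

        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                if (!mask[y][x] || seen[y][x]) continue;
                int size = 0;
                int[] box = { x, y, x, y };        // left, top, right, bottom
                ArrayDeque<int[]> stack = new ArrayDeque<>();
                stack.push(new int[] { x, y });
                seen[y][x] = true;
                while (!stack.isEmpty()) {
                    int[] p = stack.pop();
                    size++;
                    box[0] = Math.min(box[0], p[0]);
                    box[1] = Math.min(box[1], p[1]);
                    box[2] = Math.max(box[2], p[0]);
                    box[3] = Math.max(box[3], p[1]);
                    int[][] nbrs = { { p[0] - 1, p[1] }, { p[0] + 1, p[1] },
                                     { p[0], p[1] - 1 }, { p[0], p[1] + 1 } };
                    for (int[] n : nbrs) {
                        if (n[0] >= 0 && n[0] < w && n[1] >= 0 && n[1] < h
                                && mask[n[1]][n[0]] && !seen[n[1]][n[0]]) {
                            seen[n[1]][n[0]] = true;
                            stack.push(n);
                        }
                    }
                }
                if (size > bestSize) { bestSize = size; best = box; }
            }
        }
        return best;   // null if no object pixels at all
    }
}
```

Taking the largest region makes the tracker fairly robust to small patches of similarly coloured noise elsewhere in the frame.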

To allow other programs to get the coordinates of the object being tracked the vision program implements a multi-threaded internet server allowing clients to connect locally or over the network. They can then query the vision program and get the coordinates of the object.
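A minimal sketch of that server idea in Java follows: one thread per client, each request answered with the latest bounding-box coordinates. The request/response protocol shown here ("GET" answered with a comma-separated box) and all the names are assumptions for illustration, not the real program's protocol.

```java
import java.io.*;
import java.net.*;

// Hypothetical sketch of the coordinate server: a daemon thread accepts
// clients, and each client gets its own handler thread. The vision code
// calls update() whenever the tracker finds the object.
public class CoordServer {
    private volatile int[] box = { 0, 0, 0, 0 };   // left, top, right, bottom
    private final ServerSocket server;

    CoordServer(int port) throws IOException {
        server = new ServerSocket(port);           // port 0 = pick a free port
        Thread acceptor = new Thread(() -> {
            try {
                while (true) {
                    Socket client = server.accept();
                    new Thread(() -> handle(client)).start();
                }
            } catch (IOException e) { /* server closed */ }
        });
        acceptor.setDaemon(true);
        acceptor.start();
    }

    int port() { return server.getLocalPort(); }

    // Called by the vision thread each time the tracker updates.
    void update(int left, int top, int right, int bottom) {
        box = new int[] { left, top, right, bottom };
    }

    private void handle(Socket client) {
        try (BufferedReader in = new BufferedReader(
                 new InputStreamReader(client.getInputStream()));
             PrintWriter out = new PrintWriter(client.getOutputStream(), true)) {
            String line;
            while ((line = in.readLine()) != null) {
                if (line.equals("GET")) {
                    int[] b = box;
                    out.println(b[0] + "," + b[1] + "," + b[2] + "," + b[3]);
                }
            }
        } catch (IOException ignored) {
        }
    }
}
```

Because the server speaks over a plain TCP socket, the client querying it can run on the same machine or anywhere on the network.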

4. RCX Remote Control Server

As stated earlier, the RCX brick runs a remote control server which allows any program running on a PC to send commands to the robot via the IR tower. The program I wrote is relatively simple, allowing the robot to be driven forwards or backwards, turned left or right, stopped, and made to play system sounds. As well as all that, the robot can also be instructed to carry out a movement for a specified amount of time.
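The commands themselves can be as simple as a single opcode byte, optionally followed by a duration. The sketch below is purely illustrative: the opcode values, packet layout and time units are assumptions, not the actual protocol used by RCXServer.java.

```java
// Hypothetical sketch of a PC-side command encoding: one opcode byte,
// optionally followed by a duration in tenths of a second. These byte
// values and the layout are made up for illustration.
public class RcxCommands {
    static final byte FORWARD = 1, BACKWARD = 2, LEFT = 3, RIGHT = 4,
                      STOP = 5, BEEP = 6;

    // Simple one-byte command: carry on until told otherwise.
    static byte[] encode(byte opcode) {
        return new byte[] { opcode };
    }

    // Timed command: carry out the movement for 'tenths' tenths of a second.
    static byte[] encodeTimed(byte opcode, int tenths) {
        return new byte[] { opcode, (byte) tenths };
    }
}
```

Keeping the packets tiny matters here, since the IR link between the tower and the brick is slow and fairly unreliable.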

RCX Remote Control

To test the server I wrote a very simple client to allow the robot to be remotely driven forwards and backwards and turned left and right. The robot can be controlled by clicking on the buttons or using the W, S, A and D keys.

The code for the RCX remote control server and the simple client is available below.

RCXServer.java (2.87 Kb)
SimplePCClient.java (5.19 Kb)
VisionPCClient.java (5.35 Kb)

5. The Result

As said earlier, this project took roughly two weeks from conception to having a fully working demo. As it’s not very easy to show pictures of it working, I have produced a short selection of videos of it going after a ball, as well as a video of what is seen on the computer screen.

When the robot starts it turns clockwise until it detects an interesting object. Once detected, it tries to centre the object in its field of view. Once centred, it will drive forward until the middle of the object is no longer in the centre of the image. At this point the robot will turn either left or right as necessary, and then drive forward again once the object is centred.
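The behaviour above amounts to a simple decision function run on each processed frame. Here is a minimal Java sketch of that step; the image width, the "centred" tolerance and the command names are assumptions, not values from the actual program.

```java
// Hypothetical sketch of the chase loop's decision step: given the
// horizontal centre of the tracked object (or -1 if nothing is being
// tracked), pick the next command to send to the robot.
public class ChaseLogic {
    static final int IMAGE_WIDTH = 320;   // assumed camera resolution
    static final int TOLERANCE = 30;      // pixels either side of centre

    // Returns "SEARCH", "LEFT", "RIGHT" or "FORWARD".
    static String nextCommand(int objectCentreX) {
        if (objectCentreX < 0) return "SEARCH";   // keep turning clockwise
        int offset = objectCentreX - IMAGE_WIDTH / 2;
        if (offset < -TOLERANCE) return "LEFT";
        if (offset > TOLERANCE) return "RIGHT";
        return "FORWARD";                         // object is centred
    }
}
```

The tolerance band stops the robot endlessly oscillating left and right when the object is already near enough to the centre of the frame.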

If you watch the video you will notice that the robot moves in small increments, pausing for a second before carrying out the next command. This is to give time for the image to steady so that it is not blurry. Also, there is a slight delay between sending a command to the robot and the robot carrying it out.

lego-robot.wmv (7,406.28 Kb)