This is a video of our completed project demonstration. In the video, Steve moves around while the LynxMotion arm tracks him. Because of the limitations of Processing, the computer's processing power, and the frame rate of the laptop's webcam, the robotic arm has a slightly delayed response time. However, motion tracking was still achieved, and it provides a solid framework for further development and improvement.
In previous posts, different methods of motion tracking were discussed, and we said that the ultimate goal would be blob detection rather than color detection. Blob detection was achieved; however, it did not provide the results we were hoping for: the motion detection was far too jumpy and not accurate enough for our purposes. Color detection was also ruled out because we wanted the program to track people, not just objects of a known color. We therefore decided to use a more advanced OpenCV feature, face detection. This works very well, but it is still not ideal; if blob detection could be further refined, it would be the motion-tracking method of choice.
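As a rough illustration of the face-detection approach, the following Processing sketch uses the OpenCV for Processing library (gab.opencv) to find faces in the webcam feed and map the first face's horizontal position to a 0-180 servo angle. This is a minimal sketch of the general technique, not our exact code; the webcam resolution and the serial hookup to the arm's controller are assumptions.

```processing
import gab.opencv.*;
import processing.video.*;
import java.awt.Rectangle;

Capture cam;
OpenCV opencv;

void setup() {
  size(640, 480);
  cam = new Capture(this, 640, 480);
  opencv = new OpenCV(this, 640, 480);
  // Load the frontal-face Haar cascade bundled with the library
  opencv.loadCascade(OpenCV.CASCADE_FRONTALFACE);
  cam.start();
}

void draw() {
  if (cam.available()) cam.read();
  opencv.loadImage(cam);
  image(cam, 0, 0);

  // Detect faces in the current frame and outline them
  Rectangle[] faces = opencv.detect();
  noFill();
  stroke(0, 255, 0);
  for (Rectangle face : faces) {
    rect(face.x, face.y, face.width, face.height);
  }

  if (faces.length > 0) {
    // Map the first face's horizontal center to a servo angle;
    // in a full sketch this value would be written over serial
    // to the arm's servo controller.
    float panAngle = map(faces[0].x + faces[0].width / 2.0, 0, width, 0, 180);
  }
}
```

Because the cascade runs on every frame, detection speed depends heavily on the machine, which is one source of the delayed response visible in the video.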
In the video it may have been evident that the laser was simply taped on. Figures from Pro/E of a model servo casing designed to house the laser have been posted on the blog. This casing was to be printed out of plastic, but that idea became less practical because of the time it would take to print the model. Instead, we took a servo, removed its internals, and drilled out the casing. We then slid the laser through it and mounted it, as seen in the picture below.
Laser mounted in servo casing