Wednesday, June 6, 2012




This is a video of our completed project demonstration.  In the video, Steve moves as the LynxMotion arm tracks him.  Because of the limitations of the Processing language, the computer's processing power, and the laptop webcam's frame rate, the robotic arm's response is slightly delayed.  However, motion tracking was still achieved, and the system provides a solid framework for further development and improvement.
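The core of the tracking loop is turning the detected target's pixel position into servo motion. As a minimal sketch of the idea (not our actual Processing code; the function name, gain value, and frame size are illustrative), each frame we nudge a servo proportionally to how far the target sits from the center of the image:

```python
def track_step(current_deg, target_px, frame_size, gain=0.05):
    """One control update: nudge a servo angle toward the target's pixel position.
    The step is proportional to the target's offset from frame center, so the
    arm slows and settles as the target approaches the middle of the image."""
    error_px = target_px - frame_size / 2          # signed pixel offset from center
    new_deg = current_deg + gain * error_px        # proportional correction
    return max(0.0, min(180.0, new_deg))           # keep within the servo's range
```

With a small gain the arm lags slightly behind a fast-moving person, which matches the delayed response visible in the video; raising the gain speeds up the response at the cost of overshoot.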

In previous posts, different methods of motion tracking were discussed, and we said that the ultimate goal would be blob detection rather than color detection.  Blob detection was achieved; however, it did not provide the results we were hoping for.  The motion detection was far too jumpy and not accurate enough for our purposes.  Also, because we wanted the program to track people, not just objects, color detection was not an option.  We decided instead to use a more advanced OpenCV component, face detection.  This works very well, but is still not ideal; if blob detection could be further refined, it would be the motion detection method of choice.

In the video it may have been evident that the laser was simply taped on.  Earlier on this blog, Pro/E figures were posted of a model servo casing meant to house the laser.  The casing was to be printed in plastic, but that idea became impractical because of the time it would take to print the model.  Instead, we drilled out a servo with its internals removed, slid the laser through it, and mounted it.  This is seen in the picture below.

Laser mounted in servo casing

Wednesday, May 30, 2012

We have just over a week left until our project due date and presentation.  We have programs written for face detection, color detection, and basic blob detection.  Blob detection is the best method for motion tracking because it does not require the tracked object to be of a certain color, size, or orientation.
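The essence of blob detection is grouping connected "changed" pixels into regions and taking each region's center. A minimal stdlib sketch of that idea, assuming the motion detector has already produced a binary mask (this is an illustration of the technique, not our OpenCV code):

```python
from collections import deque

def find_blobs(mask, min_size=3):
    """Label 4-connected regions of 1s in a binary mask (list of lists).
    Returns a list of (size, (row_centroid, col_centroid)), one per blob;
    min_size discards tiny noise blobs."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                # Flood-fill to collect this connected region
                queue, pixels = deque([(r, c)]), []
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if len(pixels) >= min_size:
                    cy = sum(p[0] for p in pixels) / len(pixels)
                    cx = sum(p[1] for p in pixels) / len(pixels)
                    blobs.append((len(pixels), (cy, cx)))
    return blobs
```

The jumpiness we saw comes from this centroid moving whenever the mask flickers; raising `min_size` or smoothing the mask between frames are the usual refinements.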

What we have left to do is integrate the Arduino with the finished OpenCV program on the computer.  This is proving to be challenging: the Arduino board is not communicating properly with the computer, so many bugs must still be ironed out.
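The integration comes down to agreeing on a message format over the USB serial link. As a sketch of one workable framing (the `X…Y…` protocol here is hypothetical, not the one we settled on; in practice the bytes would travel over a serial port rather than stay in memory):

```python
def encode_command(x_deg, y_deg):
    """Frame a pan/tilt command as fixed-width ASCII, e.g. b'X090Y045\n'.
    Fixed widths make the Arduino-side parser trivial."""
    for d in (x_deg, y_deg):
        if not 0 <= d <= 180:
            raise ValueError("servo angle must be 0-180 degrees")
    return f"X{x_deg:03d}Y{y_deg:03d}\n".encode("ascii")

def decode_command(frame):
    """Inverse of encode_command: what the Arduino sketch would do
    after reading up to the newline."""
    text = frame.decode("ascii").strip()
    if text[0] != "X" or text[4] != "Y":
        raise ValueError("malformed command frame")
    return int(text[1:4]), int(text[5:8])
```

Having both ends agree on an explicit frame (with a newline terminator) is one way to rule out the half-read garbage that makes serial debugging so painful.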

The laser mount mentioned in previous posts proved too difficult to print in plastic and too expensive to machine out of aluminum.  Instead, we have decided to use a flat piece of sheet metal bent to shape.  The shape allows the laser to slip in through two holes, and the bracket mounts in the same holes as the original servo motor.

Finally, we built a simple wooden base to mount the robotic arm and camera to.  It is essential that neither moves once they are calibrated.  The base also allows us to conceal many of the wires and components underneath.  This is not purely cosmetic: it eliminates the possibility that the wires will catch on the arm in motion and alter its position.

Wednesday, May 16, 2012


Previously we were able to track motion using color detection.  That tracking works best with bright, vivid colors such as red, which makes it of limited use: it can only track objects of a very specific shade, and those are easily lost in a busy background.  Over the past couple of days, generic motion tracking against a static image was programmed and operated successfully.

This new motion tracking is achieved by first taking a picture of the environment the camera is placed in.  The program then recognizes any difference between the video being recorded and the reference image and identifies that difference as motion.  This works far better than the color tracking program and was a big achievement.

To make the motion tracking run more smoothly, we added more image noise.  This allowed smaller amounts of motion to be detected, and at the same time the image processing became much smoother.
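The background-subtraction step itself is simple: compare each pixel of the live frame against the reference picture and keep only differences larger than some threshold, which is the knob that trades noise rejection against sensitivity. A minimal sketch over grayscale values (illustrative only; our program operated on the webcam feed via OpenCV):

```python
def motion_mask(reference, frame, threshold=30):
    """Mark pixels whose grayscale value differs from the reference
    image by more than `threshold`. Raising the threshold ignores more
    sensor noise; lowering it detects smaller movements."""
    return [[1 if abs(f - r) > threshold else 0
             for f, r in zip(frow, rrow)]
            for frow, rrow in zip(frame, reference)]
```

The mask of 1s is what a blob detector would then group into moving regions.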

Finally, a final version of the Pro/E model of the laser mount was created.  Two holes were added on the side to allow screws to tighten and fasten the laser in the mount.  This model will shortly be sent to the rapid prototyping machine to create the plastic piece the laser will sit in.  A screenshot of this is directly below.

Final laser mount model

Sunday, May 6, 2012

Recently we have been building a model of the servo motor in Pro/E.  This will be printed in plastic and used to house the laser pointer.  The plastic housing will have a circular hole that the pen-style laser pointer slides through.  Figure 1 shows the rough computer drawing of the servo that will eventually be sent to a 3-D plastic printer.  Figure 2 shows the actual servo motor used for the model.
Figure 1
Figure 2




Wednesday, April 18, 2012


An easy-to-use interface for adjusting the position of the robotic arm was created.  This computer servo controller makes adjusting the arm's position extremely simple: enter a number of degrees of rotation for each of the x and y servos, and the arm moves to the exact position designated.  A screenshot of this interface is found below.
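Under the hood, a degree value entered in such an interface ultimately becomes a pulse width on the servo's signal wire. As a sketch of that mapping (the 1–2 ms range is typical for hobby servos but varies by model, so the defaults here are assumptions):

```python
def angle_to_pulse_us(deg, min_us=1000, max_us=2000):
    """Map a 0-180 degree command to a servo pulse width in microseconds.
    Hobby servos typically expect roughly 1-2 ms pulses repeated at ~50 Hz;
    out-of-range entries are clamped rather than rejected."""
    deg = max(0, min(180, deg))                       # clamp user input
    return round(min_us + (max_us - min_us) * deg / 180)
```

Clamping at the endpoints is a deliberate choice for an interactive controller: a typo like 190 moves the arm to its limit instead of crashing the program.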



The first steps have been taken to achieve color detection using the camera.  Written with OpenCV, the program displays the camera feed and picks out vivid colors.  Clicking an object with a distinct color on the computer display tells the program which color to track.  It then continually follows the object, placing a red dot at its approximate center.  That red dot will be the ideal location to point the laser once the robotic arm is integrated with the color tracking program.  Some problems still exist: very good lighting is needed to pick up the different colors, and overly reflective surfaces are seen by the program not as objects but as light sources.  These problems must be solved before the program can be used effectively.  A screenshot of this program is found below.
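The click-to-track idea reduces to two steps: find all pixels close to the clicked color, then average their positions to place the red dot. A stdlib sketch of that logic (the tolerance value and plain RGB distance are assumptions; lighting sensitivity is exactly why RGB distance struggles in practice):

```python
def track_color(pixels, target, tol=60):
    """Approximate center of the region matching a clicked color.
    `pixels` is a 2-D grid of (r, g, b) tuples, `target` the clicked color,
    `tol` the allowed Euclidean distance in RGB space.
    Returns (row, col) of the matching pixels' centroid, or None."""
    hits = [(y, x)
            for y, row in enumerate(pixels)
            for x, (r, g, b) in enumerate(row)
            if ((r - target[0]) ** 2 + (g - target[1]) ** 2
                + (b - target[2]) ** 2) ** 0.5 <= tol]
    if not hits:
        return None                                   # color lost this frame
    return (sum(y for y, _ in hits) / len(hits),
            sum(x for _, x in hits) / len(hits))
```

Because shadows and glare shift RGB values wildly, a common refinement is comparing hues instead of raw RGB, which is one route to fixing the lighting problems described above.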


Tuesday, April 10, 2012



We picked up a LynxMotion robotic arm from the Drexel autonomous lab yesterday, as well as an Arduino board.  The Arduino board is responsible for receiving signals from the computer and translating them into motion in the servos.  Please see reference 1 for more information on the Arduino board.  Initially, the robotic arm had additional servos, joints, and a gripper on the end that were not necessary.  We removed the unnecessary components, as seen in the pictures below.
Before Removal
After Removal

Removing the additional parts also greatly decreased the strain on the remaining servos.  The arm moves in an x direction and a y direction: the x direction rotates about the black base seen in the photos above, and the y direction is driven by the top joint and its visible servo.

The servos were then connected to the Arduino board after simple cutting and stripping of a few wires, and the Arduino board was connected to the computer with a USB cable.  C++ will be used for coding in this project.  Using example code from the Arduino website (found on the references page), a simple program was created to rotate the arm along the x axis.  This was then expanded to move the arm on both the x and y axes.

Achieving motion in the arm after one day was a huge accomplishment.  Below is a short video of the arm's motion at the end of day 1.