Some photos of the final prototype.
All the servos and connections handle the movement adequately.
The lamp head is based on a flower head to make the lamp feel more like a living, natural object. To avoid the use of an additional servo, the upward movement of the lamp head is mechanically transmitted to a system that opens the leaves of the head.
When the lamp looks up, the flower opens; when it looks down, it closes.
This not only mimics the actual movements of a flower, it also widens the beam to light up more space as you move further away, focuses on small areas as you move closer, and lets the lamp act as a beautiful mood light when the light is on but tracking is off.
Testing Proto 5 shows that the servo is up to the task and that the reach is increased considerably.
The images below show the furthest forward and backward positions the lamp reaches with this setup.
The video below shows the lamp quickly adjusting its position. Slowing this action down will lower the stress on the servo; the current code only tested the maximum reach of the servo without it failing.
Video: Proto 5
After further research, we noticed that the motor is often mounted on the middle joint of the lamp. This maximizes the perpendicular length and allows us to connect it directly to the rod. It also improves the visual unity of the lamp: in the previous prototypes the motor was clearly visible, while with this method we can easily blend it into the design. The images below show how and where the connection was made.
After testing the strength of the motor, we minimized play by making the link out of aluminium plating. The metal wire bends and therefore gives no accurate measurement of the possible reach. With the aluminium link we can clearly see a stiffer, more controlled movement.
Video: Proto 4: Aluminium link
To see whether the video processing works, I let the program write left/right according to the hand's position. Below you see the result.
After that, I concluded that I needed a delay; otherwise the servo would receive a constant rotation signal. I chose to implement this delay in the Arduino code so the video processing can continue uninterrupted.
To keep the code simple, we chose to use six signals (two per servo). Since we use an Arduino Uno, we have enough pins to do it this way. The Arduino then only has to check which signal is high and change the position of the corresponding servo accordingly.
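As a sketch of this scheme (the pin numbers and helper name here are illustrative, not taken from our actual code): each servo gets two dedicated lines, one per direction, and exactly one line is raised per command.

```python
# Hypothetical mapping of the six signals: two lines per servo,
# one for each direction. Pin numbers are illustrative only.
SIGNALS = {
    ("base", "left"): 2, ("base", "right"): 3,
    ("middle", "up"): 4, ("middle", "down"): 5,
    ("head", "up"): 6, ("head", "down"): 7,
}

def signal_for(servo, direction):
    """Return which of the six lines should go HIGH for a command."""
    return SIGNALS[(servo, direction)]

# The Arduino loop then only checks which line is HIGH, steps the
# matching servo, and waits a short delay so the servo is not
# flooded with rotation commands.
print(signal_for("base", "left"))   # 2
print(signal_for("head", "down"))   # 7
```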
When increasing the perpendicular length, the wire bends too easily.
We strengthen it with glue.
Video: Proto 3: Strengthened wire
We vary the perpendicular length. Connecting the motor closer to the pivot point requires more force but also increases the pivot angle, increasing the forward reach.
Video: Proto 2: lower connection
To test whether the strength of the servo motor is sufficient to deal with the weight of the lamp, we test it in the most demanding situation: moving the lamp back and forth.
To minimize the load, the motor is moved forward and connected with a bendable wire. This way the motor won't break if the lamp is too heavy: the wire will bend instead of stripping the gears.
Furthermore, holes are added to the lamp at varying distances from the pivot point. This way we can test different perpendicular lengths (and thus different moments) to vary the force needed to pivot the lamp.
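The effect of the perpendicular length can be checked with a quick moment balance. A minimal sketch (the weight and distances below are illustrative, not measured values):

```python
def required_force(lamp_weight_n, lamp_arm_mm, connection_arm_mm):
    """Force the wire must deliver so the moments about the pivot
    balance: F * connection_arm = W * lamp_arm."""
    return lamp_weight_n * lamp_arm_mm / connection_arm_mm

# Moving the connection hole closer to the pivot (a smaller
# perpendicular length) raises the force the servo must deliver:
print(required_force(5.0, 100, 50))   # 10.0 N
print(required_force(5.0, 100, 25))   # 20.0 N
```

This is why the holes closer to the pivot demand more strength from the servo while allowing a larger pivot angle.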
Video: Proto 1
The code for tracking is almost complete; it works on the PC, so now I have to convert it for testing on the Raspberry Pi. There is also the option to quantize the video images to suppress the noise, but for now I left that out to spare the Raspberry Pi the extra processing. Below you find an example of how it works, with a brief explanation.
I used the OpenCV libraries, which were a big help.
First, I removed every color apart from the skin color. I did this by converting the image from RGB (red, green, blue) to HSV (hue, saturation, value). With certain upper and lower limits, it is possible to remove every color except skin tones. A disadvantage is that skin-colored objects will not be removed either. I also blurred the image a little (3×3 pixels) to suppress noise.
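The idea behind the HSV window can be shown with the standard library alone. This is only a per-pixel sketch of what OpenCV's `inRange` does on whole frames, and the bounds below are placeholders, not the ones we actually tuned:

```python
import colorsys

def is_skin(r, g, b, lower=(0.0, 0.2, 0.3), upper=(0.14, 0.7, 1.0)):
    """Keep a pixel only if its HSV value falls inside a skin-tone
    window; everything else gets masked to black. Bounds are
    illustrative, not the tuned values."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    return (lower[0] <= h <= upper[0]
            and lower[1] <= s <= upper[1]
            and lower[2] <= v <= upper[2])

# A warm skin-like pixel passes; a blue background pixel does not:
print(is_skin(210, 160, 120))  # True
print(is_skin(40, 60, 200))    # False
```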
After that, I use background subtraction. OpenCV has a standard function called BackgroundSubtractorMOG2. This allows me to recognize moving objects, and since those objects are only skin-colored, it is effectively detecting hands and arms. It also means that if a hand stays still for a certain time, it becomes part of the background.
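MOG2 is far more sophisticated (it keeps a mixture of Gaussians per pixel), but the core idea — a slowly updated background model where anything that differs enough counts as foreground — can be sketched in a few lines, here on a flat list of pixel intensities instead of real frames:

```python
def update(background, frame, learning_rate=0.05, threshold=30):
    """Running-average background model. A pixel is foreground when
    it differs enough from the background; the background slowly
    absorbs whatever it sees, so a hand that stops moving eventually
    blends in, just as with BackgroundSubtractorMOG2."""
    new_bg = [b + learning_rate * (f - b) for b, f in zip(background, frame)]
    mask = [abs(f - b) > threshold for f, b in zip(frame, background)]
    return new_bg, mask

bg = [10.0, 10.0, 10.0]
bg, mask = update(bg, [10, 200, 12])
print(mask)  # [False, True, False] — only the changed pixel is foreground
```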
When the moving hand/arm is detected, I draw a green box around it, based on its contour. Finding contours is also a standard function in OpenCV and lets me know where the moving hand/arm is on the screen. After that, I only need to send a signal to the Arduino when the box crosses certain limits, to turn the lamp.
A possible problem with the background subtraction was that when the camera turns, there is a whole new background to "learn", so it would stop following the hand. This was solved by removing all colors except skin tones: when the camera turns, most of the background stays black, so the subtractor doesn't need to learn a new background and can keep following the hand/arm.
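The decision step itself is simple. A sketch of the limit check (the margin and frame width are illustrative; the real limits were tuned by hand):

```python
def turn_command(box_x, box_w, frame_w, margin=0.2):
    """Decide whether to send 'left', 'right' or nothing, based on
    where the bounding-box centre sits relative to the frame edges.
    'margin' is the illustrative fraction of the frame that counts
    as an edge zone."""
    centre = box_x + box_w / 2
    if centre < margin * frame_w:
        return "left"
    if centre > (1 - margin) * frame_w:
        return "right"
    return None   # hand near the centre: no signal to the Arduino

print(turn_command(10, 40, 640))    # left  (centre = 30)
print(turn_command(560, 60, 640))   # right (centre = 590)
print(turn_command(300, 40, 640))   # None  (centre = 320)
```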
A simple lamp with the same dimensions (length- and weight-wise) is modelled in Siemens NX.
We decided on some design elements:
- 3 motors:
1 in the head
1 in the middle joint
1 for rotation around the z-axis
A motion simulation is made to find the joint/motor that will need the most power. The results are shown below.
The torque result is 1.06 × 10³ N·mm, which equates to roughly 1 N·m. We can achieve this by using gears: the fast but weak motors can be geared down to slower, stronger output.
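The gearing argument in numbers. The load torque comes from the simulation; the servo torque below is an assumed value, not a datasheet figure:

```python
def gear_ratio_needed(load_torque_nm, servo_torque_nm):
    """An ideal gear train multiplies torque and divides speed by the
    same ratio, so the minimum ratio is load torque / servo torque."""
    return load_torque_nm / servo_torque_nm

load = 1.06e3 / 1000    # 1.06e3 N*mm from the simulation = 1.06 N*m
servo = 0.25            # assumed motor torque in N*m (illustrative)
ratio = gear_ratio_needed(load, servo)
print(round(ratio, 2))  # 4.24 — the output also turns ~4.24x slower
```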