The Albert Einstein time travel machine (under construction)
This year's theme for the annual village feast is "tijdmachine" (time machine). Thinking about the theme, Mr. Einstein popped into my mind: one of the most important scientists who ever lived, and a man who believed that travelling through time is possible. So this year's control project will have something to do with Mr. Einstein; let's say a time travel machine. Not big enough to hold a whole person, but big enough to let insects or little creatures like a mouse travel in time. Let's say a time travel machine about the size of a human head.
The story
Visualize the following picture: all newly built machines nowadays carry a certificate of conformity (CE). This time travel machine is a prototype, built before CE marking existed, and a RI&E (risk assessment) has never been performed... so this machine is not "safe". OK, we have placed a warning sticker on the machine... but there is always someone sticking his nose into things... he had better not do that: while his head is in the machine, there is a big flash and only his head travels along with the machine to the accidentally chosen year and place of "15.08.2019.WYTGAARD".
So, just a head with some frills and frippery around the neck is left there, in the time travel machine, doing funny things... (?!).
The technical design 
The design I have in mind is partially ready-made and working. The hardware consists of an Arduino Mega2560, a Raspberry Pi and a handful of servos, colour LEDs and relays. For the control of the head (turning up and down, looking left and right) I will use a Raspberry Pi 3B with object detection. A camera watches the surroundings in front of the travel machine. With the object detection result "person is on camera", it is possible to determine the position of the head of the person nearest to the machine. With this information the head can be moved in the direction of a passer-by, look him or her right in the eyes, and do something with the eyes and mouth of the doll's head... Maybe it is also a funny detail to let some fluid (sweat) drip from the hairline over the face... ah, ideas enough...
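The flow described above (pick the nearest person, take the middle of the detection box, report it to the Mega) can be sketched in Python. This is only an illustration of the idea: the box format, the "biggest box = nearest person" heuristic and the message layout are my assumptions, not the project's actual protocol.

```python
def box_center(box):
    """Centre (x, y) of an (xmin, ymin, xmax, ymax) bounding box."""
    xmin, ymin, xmax, ymax = box
    return ((xmin + xmax) / 2.0, (ymin + ymax) / 2.0)

def nearest_person(boxes):
    """Assume the biggest box belongs to the person closest to the camera."""
    if not boxes:
        return None
    return max(boxes, key=lambda b: (b[2] - b[0]) * (b[3] - b[1]))

def center_message(box):
    """Hypothetical line-based message for the Mega board."""
    cx, cy = box_center(box)
    return f"C,{cx:.2f},{cy:.2f}\n".encode("ascii")

if __name__ == "__main__":
    # Two detected "person" boxes in normalised image coordinates
    detections = [(0.05, 0.30, 0.20, 0.70), (0.40, 0.20, 0.90, 0.80)]
    target = nearest_person(detections)
    msg = center_message(target)
    # In the real setup this would go out over a serial link, e.g. pyserial:
    #   serial.Serial("/dev/ttyACM0", 115200).write(msg)
    print(msg)  # b'C,0.65,0.50\n'
```

The Mega can then translate the reported centre into a servo angle for the left/right movement of the head.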
March 24, Week 12 (20 weeks remaining)
  • Received the Pi camera last week. The FPS with this camera is higher than with the USB cam; it is now around 1.6. The Python program has been slightly changed (made more readable). Development is going slowly at the moment because of some new priorities.
Feb 22, Week 9 (23 weeks remaining...)
  • The RPI3b Python program was not stable: "Segmentation fault", "Illegal instruction", hanging... I could not find any lead on the internet, so it had to be the hardware of the RPI3b. I swapped the Raspberry hardware; the new Raspberry is stable and kept working overnight. I only changed the hardware and am still using the same software/SD card. I am still looking for ways to increase the FPS. Watch out for overheating the Raspberry: without forced ventilation the CPU easily reaches 80 degrees while running Python3/TensorFlow object detection. With a fan the temperature sticks at 50 degrees.

  • The movement of the head (left to right) is now going smoothly. When no objects are around for 20 seconds (no person in front of the camera), the head goes to the neutral position and rests on the "chest". Thanks to the multitask FSM, the actions for going to the neutral position and bending the head to the chest are separate sequences running in parallel; sometimes the neutral position is reached earlier than the bending of the head, or vice versa (nice...). Also, when the actions start, the head turns and looks up simultaneously.

  • First pan and tilt actions implemented (servo no. 2 is active) 
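The parallel "neutral position" and "head to chest" sequences mentioned above can be illustrated with a small cooperative state-machine sketch. The real FSM runs on the Mega in Arduino code; this Python version only shows the pattern, and all angles and step sizes are invented for the example.

```python
# Two head movements run as independent state machines, each advanced one
# small step per loop tick, so pan and tilt happen simultaneously and each
# finishes in its own time (just like the neutral/chest sequences in the log).

class MoveTask:
    """Drive one servo angle toward a target, one step per tick."""
    def __init__(self, name, current, target, step):
        self.name, self.angle, self.target, self.step = name, current, target, step

    @property
    def done(self):
        return self.angle == self.target

    def tick(self):
        if not self.done:
            delta = self.target - self.angle
            # clamp the movement to one step per tick
            self.angle += max(-self.step, min(self.step, delta))
            # here the real code would write self.angle to the servo

def run_parallel(tasks):
    """Tick all tasks every loop pass until every sequence has finished."""
    ticks = 0
    while not all(t.done for t in tasks):
        for t in tasks:
            t.tick()
        ticks += 1
    return ticks

if __name__ == "__main__":
    pan = MoveTask("pan-to-neutral", current=40, target=90, step=5)   # 10 ticks
    tilt = MoveTask("tilt-to-chest", current=90, target=150, step=4)  # 15 ticks
    print(run_parallel([pan, tilt]))  # prints 15: the slowest task decides
```

Because neither task blocks the loop, one sequence can finish before the other, which is exactly the "sometimes the neutral position is reached earlier than the bending, or vice versa" behaviour.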
Start of the project, week 8 (24 weeks remaining...)
Feb 9, Week 7 (25 weeks remaining...)
  • RPI3b. Object detection on the RPI3b
    Working with a webcam for now. The software currently reads 1.3 FPS, which is not quick enough for a "smooth" control of the head. I ordered an RPI3 camera and I am examining some methods to increase the number of FPS.
  • RPI3b/MEGA. Messages from the RPI3b to the MEGA board for the control of the head.
    The concept is operational. After the detection of a person on screen, Python sends the middle of the detected box to the Mega board. Later on, an estimate of the position of the person's head in front of the camera will be implemented: the smaller the box, the higher the estimated position of the person's head. The Mega calculates the right angle for moving the head and controls the left/right servo.
  • MEGA. Pan and Tilt actions
  • MEGA. Light effects not yet made
  • MEGA. Sweating effect not yet made
  • MEGA. Sound effects not yet designed and made.
  • Mechanic. Hardware for the head (the head/mask) not yet made
  • Mechanic. Hardware for the time machine not yet made
  • Commissioning. Testing of the whole control is planned for week 31.
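Since the FPS number keeps coming back in these log entries (1.3 with the webcam, around 1.6 with the Pi camera), here is a minimal sketch of how such a figure can be measured. The timing helper is self-contained; the actual frame grabbing and TensorFlow detection are only indicated by a stand-in sleep.

```python
import time

class FpsMeter:
    """Average FPS over the frames processed since start()."""
    def __init__(self):
        self.frames = 0
        self.t0 = None

    def start(self):
        self.t0 = time.perf_counter()
        self.frames = 0

    def tick(self):
        """Call once per processed frame."""
        self.frames += 1

    def fps(self):
        elapsed = time.perf_counter() - self.t0
        return self.frames / elapsed if elapsed > 0 else 0.0

if __name__ == "__main__":
    meter = FpsMeter()
    meter.start()
    for _ in range(5):
        # stand-in for: grab a frame from the camera, run object detection
        time.sleep(0.05)
        meter.tick()
    print(f"{meter.fps():.1f} FPS")
```

Wrapping the capture-plus-detection loop like this makes it easy to see whether a change (different camera, smaller input resolution, another model) actually raises the FPS.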