This robot has been the subject of a number of theses at the University of Twente – I designed the first version together with Takao Watanabe (from Waseda University, Tokyo) in 2013. In the meantime it has been used as the ‘head’ of the R3D3 robot, as well as being the subject of our ‘from puppet to actor’ research theme: can you offload certain basic behaviour control (breathing, blinking, looking at points of interest) to a robot agent, instead of doing all ‘deliberative’ control from a central machine? Mixing deliberative control with reactive control for this specific case has been addressed in the paper ‘Things that make robots go HMMM, Heterogeneous Multi-Modal Mixing’.

Four of the faces of eyePi

EyePi is controlled by a Raspberry Pi and uses an Arduino-controlled LED matrix display as eyes (the camera is fixed in the version shown in the picture). The neck joints are powered by Dynamixel servos. The robot can be controlled both via MIDI (a Korg nanoKONTROL panel in this case) and via its camera.
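
As a rough illustration of how such a MIDI-to-servo bridge might look, here is a minimal Python sketch that maps controller knobs to neck joint positions. This is not the actual eyePi code: the libraries (mido for MIDI, the official dynamixel_sdk), the serial device, the servo IDs, the Protocol 1.0 register address, and the CC-number mapping are all assumptions for the sake of the example.

```python
# Minimal sketch (assumed setup, not the eyePi implementation):
# - Dynamixel servos on /dev/ttyUSB0, Protocol 1.0 (e.g. AX-12),
#   ID 1 = pan, ID 2 = tilt
# - a Korg nanoKONTROL sending control change messages on CC 0 and CC 1
import mido
from dynamixel_sdk import PortHandler, PacketHandler

ADDR_GOAL_POSITION = 30        # goal position register in Protocol 1.0
PROTOCOL_VERSION = 1.0
CC_TO_SERVO = {0: 1, 1: 2}     # MIDI CC number -> Dynamixel ID (assumed mapping)

port = PortHandler('/dev/ttyUSB0')
port.openPort()
port.setBaudRate(1000000)
packet = PacketHandler(PROTOCOL_VERSION)

def cc_to_position(value):
    """Scale a 7-bit MIDI value (0..127) to the AX-12 position range (0..1023)."""
    return value * 1023 // 127

# Pick the first MIDI input whose name mentions the nanoKONTROL.
name = next(n for n in mido.get_input_names() if 'nanoKONTROL' in n)
with mido.open_input(name) as midi_in:
    for msg in midi_in:        # blocks until the next MIDI message arrives
        if msg.type == 'control_change' and msg.control in CC_TO_SERVO:
            packet.write2ByteTxRx(port, CC_TO_SERVO[msg.control],
                                  ADDR_GOAL_POSITION,
                                  cc_to_position(msg.value))
```

MIDI is a convenient control channel for this kind of puppeteering: each knob or fader is a continuous controller that maps directly onto a joint or expression parameter, with no custom protocol needed.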

eyePi