Poppy in a musical setup, please share your ideas

We finally produced something :slight_smile: Here is a demo video:


And a description of what is happening:

This is a demo of the Poppy robot performing electronic music from the movements of its own body, through interaction with a human. Drum samples are triggered by the movements of the left foot (kick) and the head (snare). The movements of the left shoulder modulate the pitch of a synthesized sound as well as the cutoff frequency of a low-pass filter applied to it. The grunt is triggered by the right hand movements.
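
For those curious how such a mapping could be wired up, here is a minimal sketch in Python, assuming pypot (Poppy's control library) and python-osc, with a synth listening for OSC messages (e.g. a Pure Data patch). The motor names, thresholds, and OSC addresses are illustrative guesses, not necessarily the actual setup used in the demo:

```python
import time
from pypot.creatures import PoppyHumanoid  # older pypot: from poppy.creatures import ...
from pythonosc.udp_client import SimpleUDPClient

poppy = PoppyHumanoid()
synth = SimpleUDPClient("127.0.0.1", 9000)  # e.g. a Pure Data patch listening on port 9000

DT = 0.02                # polling period (~50 Hz)
SPEED_THRESHOLD = 20.0   # deg/s, hypothetical trigger sensitivity

prev_foot = poppy.l_ankle_y.present_position
prev_head = poppy.head_y.present_position

while True:
    time.sleep(DT)
    foot = poppy.l_ankle_y.present_position
    head = poppy.head_y.present_position
    # Trigger the kick when the left foot moves fast enough.
    if abs(foot - prev_foot) / DT > SPEED_THRESHOLD:
        synth.send_message("/drum/kick", 1)
    # Trigger the snare on fast head movement.
    if abs(head - prev_head) / DT > SPEED_THRESHOLD:
        synth.send_message("/drum/snare", 1)
    prev_foot, prev_head = foot, head
```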

An interesting point is that the pitch and cutoff modulations produced when the human moves the robot's left arm would be more difficult to achieve in one shot with a classical MIDI controller: it would require synchronizing the movements of two faders. Using a robotic arm, where the two shoulder motors are mapped to the modulated parameters, makes it possible to generate synchronized and quite complex modulations simply by grasping and moving the robot's hand.
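
Continuing the sketch above, the two-parameters-from-one-gesture idea could look like this: both left-shoulder joints are read together, so a single hand movement modulates pitch and cutoff in a naturally synchronized way. The joint ranges and frequency mappings are again assumptions, not the demo's actual values:

```python
def normalize(angle, lo=-90.0, hi=90.0):
    """Map a joint angle (degrees) to [0, 1], clipped to the given range."""
    return min(max((angle - lo) / (hi - lo), 0.0), 1.0)

while True:
    time.sleep(DT)
    # One grasp of the robot hand drives both shoulder joints at once...
    y = normalize(poppy.l_shoulder_y.present_position)
    x = normalize(poppy.l_shoulder_x.present_position)
    # ...so the two synth parameters are modulated together by a single gesture.
    synth.send_message("/synth/pitch", 110.0 * 2 ** (2 * y))   # 110-440 Hz, two octaves
    synth.send_message("/synth/cutoff", 200.0 + 4800.0 * x)    # 200-5000 Hz low-pass cutoff
```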

Of course this is just a preliminary demo: there is no learning or exploration here, and the human-robot interaction is quite limited. The good news is that we can now build on it to go further on this topic and ask real musicians whether they can do something musically convincing with such a robotic interface.
