Starting with the Poppy Robot


Hi Everyone, I wanted to take the opportunity to introduce myself to the forums.

My name is Hadi Esper and I live in Dubai. I just purchased a full Poppy Humanoid from Gen. Robotics and expect delivery in the next two weeks.

I am starting up a personal project as a hobbyist in Robotics and I am not associated with any university or company.

I am rewriting all the software from scratch and will be using my own control system (no plans to use pypot). I also plan to add natural language processing, speech recognition, and touch sensors to Poppy.

I look forward to being part of this community, and I am happy to join it.


Welcome to the Poppy community!
Your project seems to be really interesting. Please feel free to share your experiments here!


Thank you, Nicolas. Here is a short video of what I am working on. I am still waiting for the Poppy to be delivered so I can implement this software on it. I have already prepared the entire robot structure and Dynamixel integration.

Poppy will be making hand and body gestures while talking and will also maintain eye contact. This has already been coded and tested with some AX-12 motors I have. I can't wait to try it on a real-life Poppy.

Note that this is only my first iteration and I still have a very long way to go to improve it. I have only been working on this for about a month.

Poppy with AI

That’s really cool!
Do you plan to make it open source? We could use it in the standard version of Poppy!
If you want to associate your code with the robot's movement, you can try the V-REP simulator…


Thanks Nicolas, but at the moment I do not have any plans to open source it. Most of what I am working on is still in extremely early development, and it would be a nightmare to document everything properly for open sourcing, which I currently don't have time for, especially while handling my day job. But with time, maybe yes.

I tried the V-REP simulator but couldn't get it to work. Honestly, I gave up easily, mostly because I don't know anything about Python, notebooks, or V-REP. Having Poppy run in the simulator is such a cool concept, but for now I decided not to spend too much time learning the setup and to just focus on my current development while waiting for the Poppy Humanoid shipment.

I'll post a showcase when I get it, and I would really appreciate any comments and suggestions.


Hey Guys,

Here is a quick update with a lot of improvement.

All the foundation for controlling the Dynamixel servos has been coded. I even made an interesting function that takes the TTS speech as input, works out the duration of the speech and whether it contains words related to positivity, negation, love, hate, anger, etc., and the robot automatically moves its hands and body accordingly while it is talking.
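The keyword-and-duration idea described above could be sketched roughly like this. This is only an illustrative guess at the approach, not the author's actual code: the keyword sets, the `WORDS_PER_SECOND` rate, and the function names are all assumptions.

```python
# Hypothetical sketch: estimate speech duration from word count and pick a
# gesture category from sentiment keywords found in the TTS text.

SENTIMENT_KEYWORDS = {
    "positive": {"great", "love", "happy", "yes", "wonderful"},
    "negative": {"hate", "no", "never", "angry", "bad"},
}

WORDS_PER_SECOND = 2.5  # rough average speaking-rate assumption


def plan_gestures(tts_text):
    """Return (estimated duration in seconds, gesture category)."""
    words = tts_text.lower().split()
    duration = len(words) / WORDS_PER_SECOND

    for category, keywords in SENTIMENT_KEYWORDS.items():
        if any(w.strip(".,!?") in keywords for w in words):
            return duration, category
    return duration, "neutral"


print(plan_gestures("I love this robot!"))  # (1.6, 'positive')
```

The duration estimate lets gesture playback be stretched or compressed so the hands stop moving roughly when the speech ends.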

Still waiting for the Poppy parts to be shipped so I can assemble the robot and test my code.


Hey guys, here is another update.

I finally got all the Poppy parts and was able to assemble it in just two days!

So my first tests were to create gestures and match them with speech. To achieve this, I created my own custom software to record and edit gesture scenes.
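A "gesture scene" recorder like the one described could look something like the sketch below: a scene is a list of timestamped joint-position keyframes that can be replayed at an arbitrary speed. The class and the `set_positions` callback are hypothetical stand-ins, since the actual software is the author's own.

```python
import time


class GestureScene:
    """Minimal sketch of a recordable/replayable gesture scene."""

    def __init__(self):
        self.keyframes = []  # list of (timestamp, {joint: position})

    def record(self, t, positions):
        """Store a keyframe of joint positions at time t (seconds)."""
        self.keyframes.append((t, dict(positions)))

    def play(self, set_positions, speed=1.0):
        """Replay keyframes, calling set_positions(joints) at each one."""
        start = time.monotonic()
        for t, positions in self.keyframes:
            # wait until this keyframe's (speed-scaled) timestamp
            delay = t / speed - (time.monotonic() - start)
            if delay > 0:
                time.sleep(delay)
            set_positions(positions)


scene = GestureScene()
scene.record(0.0, {"l_shoulder": 10, "head_z": 0})
scene.record(0.5, {"l_shoulder": 45, "head_z": 15})
scene.play(print, speed=2.0)  # replays in half the recorded time
```

Scaling playback speed is what makes it possible to fit a recorded gesture to the duration of a spoken sentence.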

I also added an LED in the head that flashes when the robot is talking. Here is a quickly put-together demo (it will need a lot of improvement, though!)
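The LED-while-talking behavior can be approximated by blinking on a background thread for the duration of the speech. This is a sketch under assumptions: `set_led` is a hypothetical stand-in for whatever GPIO or controller call actually drives the head LED.

```python
import threading
import time


def flash_while_talking(set_led, duration_s, interval_s=0.15):
    """Blink the LED for duration_s seconds, then leave it off."""

    def _blink():
        end = time.monotonic() + duration_s
        state = False
        while time.monotonic() < end:
            state = not state
            set_led(state)
            time.sleep(interval_s)
        set_led(False)  # always finish with the LED off

    t = threading.Thread(target=_blink, daemon=True)
    t.start()
    return t  # call join() to wait for blinking to finish


states = []
t = flash_while_talking(states.append, duration_s=0.5, interval_s=0.1)
t.join()
print(states[-1])  # False: the LED always ends up off
```

Running it on a thread keeps the blinking from blocking the TTS playback or the gesture loop.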

I think the robot is in dire need of the manga screen to give it a bit of life and make it look more friendly.


You got up to speed with Poppy really fast; it is really impressive!

I'm eager to see what you will be able to build in the coming weeks!


Thanks Matthieu, that is nice to hear from someone whose work I admire!

For now I want to keep focusing on making interaction with Poppy more natural and friendly (speech, vision, hand/body gestures, NLP, etc.)

My next big project is going to be installing an accelerometer and a gyroscope and working on a balance algorithm. I think this will be a huge and complex project, though; it will probably take me months of reading and research before I write the first line of code. If you have any interesting reading material, please point me to it!

Thanks again!


Well, balance is very complex; there is no single good way to do it. Everyone has a different approach that works in a really specific case, and if you change a parameter by 5% it no longer works…

Maybe you could look at the work done on the Darwin-OP, which has a similar structure. There are also robots from RoboCup; maybe a few have open-source software.
If you are not afraid of research papers, you should find some state-of-the-art papers on Google Scholar.