Poppy and live Motion Capture


Hey Sonel

That’s really cool. I guess the dongles sticking out are appropriate, seeing as you are sending neural activity to the Poppy. Did you use the install script to set up the Raspberry Pi? What distro did you use? Cheers.



I just installed the stock Raspbian Jessie, updated it, and then installed pypot and poppy-humanoid via pip (I think that was all).
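For anyone wanting to replicate this, the steps above boil down to something like the following (a sketch, not an exact transcript: the package names are from the post, the update commands are the usual Raspbian ones):

```shell
# On a fresh Raspbian Jessie image: update the system first,
# then install the Poppy software stack via pip.
sudo apt-get update && sudo apt-get upgrade -y

sudo pip install pypot            # motor-control library used by Poppy robots
sudo pip install poppy-humanoid   # the Poppy Humanoid creature package
```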

To make things easier, we used a Raspberry Pi 3 (built-in WiFi) plus a second USB dongle, and I set it up in access point mode (the built-in WiFi being the broadcaster and the second WiFi connecting to the campus’ network). The details about setting this up are in this tutorial (you will need to swap eth0 to wlan1). This way the built-in WiFi is always creating a network you can connect to and then SSH into the robot, no matter where you are. If you’re in a place where you also have WiFi access with the second dongle, the Raspberry Pi acts as a router and gives both you and itself access to the internet. Very convenient for when you are away.
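For reference, the resulting network configuration on Jessie looks roughly like this. The interface names, addresses, and the hostapd/iptables details are assumptions on my part (they follow the usual Raspbian access-point tutorials, not the exact files on this robot):

```
# /etc/network/interfaces (sketch)

allow-hotplug wlan0
iface wlan0 inet static              # built-in WiFi: the access-point side
    address 192.168.4.1              # assumed AP address
    netmask 255.255.255.0
    # wlan0 itself is managed by hostapd, which broadcasts the SSID

allow-hotplug wlan1
iface wlan1 inet dhcp                # USB dongle: joins the campus network
    wpa-conf /etc/wpa_supplicant/wpa_supplicant.conf

# To make the Pi act as a router, forward AP traffic out through wlan1:
#   sysctl -w net.ipv4.ip_forward=1
#   iptables -t nat -A POSTROUTING -o wlan1 -j MASQUERADE
```

With this in place you can always SSH into the robot over the wlan0 network, and when wlan1 has internet, everything connected to the AP gets internet too.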


Thanks Sonel,
I installed Raspbian Jessie and the Pi 3 is working well.
I actually installed the Ergo Jr first, then poppy-humanoid afterwards.
It seemed an easy way to get Jupyter, notebooks, etc.

The movement does seem much smoother.
I haven’t tested a lot yet, but it’s looking promising.
Fitting it in the head is another matter :wink:
I could be imagining it, but the motors may be running cooler as well.
I don’t have the temperature monitor running yet, so I’m not positive about that.
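For the temperature monitoring, a minimal sketch with pypot could look like this. It assumes pypot and poppy-humanoid are installed and the robot is connected (pypot’s Dynamixel motors expose a `present_temperature` register); the `limit` threshold and import path are my assumptions, and the script just exits cleanly if the libraries aren’t there:

```python
# Hedged sketch: poll each motor's present_temperature via pypot.
import time


def log_temperatures(robot, limit=55, period=5.0):
    """Print every motor's temperature, flagging any at or above `limit` degrees C."""
    while True:
        for m in robot.motors:
            temp = m.present_temperature  # pypot Dynamixel register
            flag = "  <-- HOT" if temp >= limit else ""
            print("{}: {} C{}".format(m.name, temp, flag))
        time.sleep(period)


if __name__ == "__main__":
    try:
        # Jessie-era import path for the creature packages.
        from poppy.creatures import PoppyHumanoid
    except ImportError:
        print("pypot / poppy-humanoid not installed; nothing to monitor.")
    else:
        log_temperatures(PoppyHumanoid())
```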

I’m waiting for a V2 Pi Camera to test some vision out.



@sonel Oh, the WiFi network is a great idea!
I’ve been carrying a wireless router around everywhere.