
Only released in EOL distros:  

Motivation

Functionality

How To Control the Robot

Video

Inside the Package

How to Run the Code

  1. If you don't have them already, download the packages listed in the Dependencies section, along with their dependencies, using:
     rosdep install PACKAGE_NAME
  2. Copy nao_openni/nao_ni_walker.py to nao_ctrl/scripts/

  3. Launch the Microsoft Kinect nodes:
     roslaunch openni_camera openni_kinect.launch
  4. If you'd like to see yourself being tracked, which is useful while controlling the robot, run:
     rosrun openni Sample-NiUserTracker
  5. Turn on the Nao, make sure it's connected to the network, check its IP address, and run the walker script:
     roscd nao_ctrl/scripts
     ./nao_ni_walker.py --pip="YOUR_NAOS_IP_HERE" --pport=9559
  6. Build and run nao_ni:
     rosmake nao_openni
     rosrun nao_openni teleop_nao_ni
  7. Stand in the standard Psi pose and wait for the nao_ni code to print "Calibration complete, start tracking user".
  8. For a short tutorial and to get familiar with the commands, see the video.
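The connection options used in step 5 can be sketched as follows. The --pip and --pport flags come from the steps above; the parsing code itself is a hypothetical illustration, not the actual nao_ctrl script:

```python
from argparse import ArgumentParser

def parse_nao_options(argv):
    """Parse the Nao connection options used in step 5.

    --pip  : IP address of the robot on the network
    --pport: NAOqi port (9559 is the usual default)
    """
    parser = ArgumentParser(description="Connect to a Nao robot")
    parser.add_argument("--pip", required=True, help="Nao's IP address")
    parser.add_argument("--pport", type=int, default=9559, help="NAOqi port")
    return parser.parse_args(argv)

# Example: the same invocation as in step 5, with a placeholder IP.
opts = parse_nao_options(["--pip", "192.168.1.10", "--pport", "9559"])
print(opts.pip, opts.pport)
```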


IMPORTANT NOTE: Calibration in nao_ni currently takes much longer than in Sample-NiUserTracker, because the message publishing rate for Nao is relatively low. Don't be confused if you're watching yourself in Sample-NiUserTracker: wait until you see the "Calibration Complete" message in the terminal where you're actually running nao_ni. The plan for the next release is to make calibration independent of the publishing rate, and thus faster.
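The decoupling planned in the note above can be sketched as a simple rate split: the tracker's frames are all consumed (so calibration runs at full speed) while commands to the robot are throttled separately. This is an illustrative simulation with made-up rates, not the actual nao_ni code:

```python
def split_rates(frame_times_ms, publish_period_ms):
    """Given tracker frame timestamps (ms), return the subset that
    would be published if robot commands are throttled to one every
    publish_period_ms. Every frame remains available for calibration;
    only publishing to the robot is rate-limited."""
    published = []
    last_pub = None
    for t in frame_times_ms:
        if last_pub is None or t - last_pub >= publish_period_ms:
            published.append(t)
            last_pub = t
    return published

# Tracker delivers frames at roughly 30 Hz; commands go out at 5 Hz.
frames = list(range(0, 1000, 33))      # ~1 second of frames, in ms
pub = split_rates(frames, 200)
# Calibration sees all 31 frames; only 5 are published to the robot.
print(len(frames), len(pub))
```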

More information

Dependencies

The gesture interpretation code is written in C++, and the following ROS packages are required:


2024-11-02 14:35