Recently, I created a human URDF model which is used to:
- filter out 3D points that lie on the human partner (for the real iCub and the simulated PR2),
- obtain the human partner's kinematic body state during an interaction episode (for the real iCub and the simulated PR2),
- make changes in a simulated robot's (e.g. PR2's) virtual environment by interacting with both the objects and the robot.
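The first item is essentially a self-filtering problem: points in the cloud that fall inside (or very near) the human model's links should be discarded before further processing. As an illustrative sketch only, here is a minimal NumPy version of that idea, approximating each body link as a sphere around its center; the function name `filter_human_points`, the link-sphere approximation, and the radius value are my own assumptions, not the actual implementation in the stack.

```python
import numpy as np

def filter_human_points(points, link_centers, radius=0.15):
    """Remove 3D points within `radius` of any human body link center.

    Hypothetical sketch: each URDF link is crudely approximated as a
    sphere of the given radius around its center. The real stack would
    use the actual link geometries and transforms.
    """
    points = np.asarray(points, dtype=float)         # (N, 3) cloud
    centers = np.asarray(link_centers, dtype=float)  # (M, 3) link centers
    # Pairwise distances between every point and every link center.
    d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
    keep = (d > radius).all(axis=1)  # keep points far from every link
    return points[keep]

# Toy example: two "link centers" and four cloud points.
links = [[0.0, 0.0, 1.0], [0.0, 0.0, 1.5]]
cloud = [[0.05, 0.0, 1.0],   # on the human -> filtered out
         [2.0, 0.0, 1.0],    # background  -> kept
         [0.0, 0.1, 1.5],    # on the human -> filtered out
         [0.0, 2.0, 0.5]]    # background  -> kept
filtered = filter_human_points(cloud, links)
print(len(filtered))  # -> 2
```

In the real setup, the link centers would come from the model's forward kinematics at each frame, so the filter tracks the human as they move.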
The video below shows preliminary tests of this setup. In the video, I change the position and/or orientation of objects in the PR2's environment while, at the same time, extracting relevant features and visualizing them in RViz or MATLAB.
All the source code is open source and available on the project's Google Code page. Please note that the code changes frequently, so using the current version of the stack is not recommended. I hope to move the close-to-stable components into the trunk directory of the repository in a week or so; after that, the packages can be used more comfortably.