
Welcome to the fifth lab session of CoTeSys-ROS Fall School!

The lab session today is divided into two parts. In Part 1 we will look into SBPL planning, and in Part 2 we will have the robot perceive objects on a table that has been incompletely set for breakfast and then infer the possible missing objects.

Part 1: SBPL Planning

Instructions here.

Part 2: Installation Instructions

* Update your ROS overlay:

rosinstall $HOME/ros https://svn.code.sf.net/p/tum-ros-pkg/code/rosinstall/fall_school2010.rosinstall $HOME/ros/sandbox

* Make sure your CLASSPATH and ROSJAVA_AUX_CLASSPATH are NOT set:

env | grep CLASS

* Should they be set, you can unset them with:

unset CLASSPATH
unset ROSJAVA_AUX_CLASSPATH

* Make JPL package:

rosmake jpl

* Export JAVA variables:

echo 'export CLASSPATH=`rospack find rosjava_deps`/rosjava_msgs.jar' >> ~/.bashrc
echo 'export ROSJAVA_AUX_CLASSPATH=`rospack find rosjava_deps`/rosjava_msgs.jar' >> ~/.bashrc
source ~/.bashrc

3D Perception: Recognition of Objects using VFH

Download and play PCD files (one by one):

wget http://fallschool2010.informatik.tu-muenchen.de/public/scenes/milk-cereal.pcd
wget http://fallschool2010.informatik.tu-muenchen.de/public/scenes/bowl-cereal.pcd
wget http://fallschool2010.informatik.tu-muenchen.de/public/scenes/milk-bowl.pcd
rosrun pcl_ros pcd_to_pointcloud <path_to_your_pcd_file> 1

One way to visualize the point cloud from a PCD file is to run:

rosrun pcl_visualization pcd_viewer <path_to_your_pcd_file> 1

Hint: Press "l" and then "5" to get nicer display colors.
Hint: You might need to rosmake the pcl_visualization package first.

Run the recognition pipeline (the same one as on Day 2), now with an added VFH cluster classification service:

roslaunch table_objects recognition_pipeline_vfh_offline.launch

Run rviz to inspect the segmentation:

roslaunch table_objects recognition_pipeline_rviz.launch

Now create a new package in your repository named missing_objects:

roscreate-pkg missing_objects sensor_msgs visualization_msgs pcl table_objects roscpp json_prolog comp_cop comp_missingobj vision_msgs

and implement the node with the following functionality:

Start off with the following template: missing_objects.cpp

Bonus: also publish a visualization marker for each object, based on the returned CollisionObject list (see the mapping_msgs/CollisionObject message documentation).
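The marker messages themselves require ROS, but the per-object bookkeeping can be sketched without it. A minimal, ROS-free sketch (all names hypothetical, palette arbitrary) that gives each object in the list a stable marker id and a distinguishable color:

```cpp
#include <array>
#include <cstddef>

// A color as it would go into a visualization marker (r, g, b in [0, 1]).
struct Color { double r, g, b; };

// Cycle through a small palette so that neighboring markers are
// easy to tell apart in rviz.
Color markerColor(std::size_t objectIndex) {
  static const std::array<Color, 4> palette = {{
    {1.0, 0.0, 0.0},  // red
    {0.0, 1.0, 0.0},  // green
    {0.0, 0.0, 1.0},  // blue
    {1.0, 1.0, 0.0},  // yellow
  }};
  return palette[objectIndex % palette.size()];
}

// Use the position in the CollisionObject list as the marker id, so that
// re-publishing updates existing markers instead of piling up new ones.
int markerId(std::size_t objectIndex) {
  return static_cast<int>(objectIndex);
}
```

Reusing ids on every publish is the design point here: with fresh ids each cycle, stale markers accumulate in rviz.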

Echo and observe the object types on the topic (do not mind the funny names; they are imported from OpenCyc):

rostopic echo /synthetic_percepts/tabletop_percepts
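The type identifiers you will see are fully qualified. For logging or debugging it can help to strip the namespace part; a small sketch, assuming identifiers of the (illustrative) form namespace#LocalName:

```cpp
#include <string>

// Strip a namespace prefix from a type identifier, e.g.
// "http://ias.cs.tum.edu/kb/knowrob.owl#Cup" -> "Cup".
// The IRI format here is an assumption for illustration.
std::string localName(const std::string& iri) {
  const std::string::size_type hash = iri.rfind('#');
  if (hash != std::string::npos)
    return iri.substr(hash + 1);
  return iri;  // no namespace separator found; return unchanged
}
```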

Inference of Missing Objects

Assume that the table has been partially set for one person. The person who was setting the table got interrupted and then asked the robot to complete the task. Based on the objects already present, the robot needs to infer which objects are most likely to be used along with the ones that are observed. To this end, we issue a query to the probabilistic logical knowledge base in the ProbCog system via KnowRob.

Background: In the ProbCog system, we have trained a model that captures the usage of utensils and the consumption of food and drinks during meals (for a known family of six that the robot usually serves). Given an arbitrary set of evidence, we can infer the probabilities of any propositions that may be relevant. In our case, we are interested in the probabilities of uses and consumes atoms, given that other such atoms are known to be true. The construction of the respective evidence atoms goes on behind the scenes and is encapsulated in a Prolog use-case file that specifies how evidence in the system is mapped to the ProbCog functors.
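Conceptually, the inference returns a probability for each candidate uses/consumes atom, and downstream the robot only acts on the types whose probability clears a threshold. A toy sketch of that last step (the names and numbers are made up for illustration, not ProbCog output):

```cpp
#include <string>
#include <utility>
#include <vector>

// Keep only the object types whose inferred probability exceeds a threshold.
// The (type, probability) pairs stand in for the scored uses/consumes atoms.
std::vector<std::string> likelyMissing(
    const std::vector<std::pair<std::string, double> >& inferred,
    double threshold) {
  std::vector<std::string> result;
  for (std::size_t i = 0; i < inferred.size(); ++i)
    if (inferred[i].second > threshold)
      result.push_back(inferred[i].first);
  return result;
}
```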

All you need to do is issue a Prolog query using the predicate

comp_missingobj:comp_missingObjectTypes(PerceivedObj, MissingObj, MissingTypes)

where PerceivedObj is the list of objects that were observed, MissingTypes is a list of the types/classes of objects that are required, and MissingObj is a list of (dummy) instances of these types (mainly for visualization purposes). Write a C++ client that issues such a query and prints the results. Observe how varying the observed scene changes the inference results.
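In the client, the query is handed to json_prolog as a plain string. A minimal sketch of assembling that string from the perceived object identifiers (quoting each identifier as a Prolog atom is an assumption here; check the template for the exact form the knowledge base expects):

```cpp
#include <cstddef>
#include <sstream>
#include <string>
#include <vector>

// Assemble the comp_missingObjectTypes query from a list of perceived
// object identifiers, leaving MissingObj and MissingTypes unbound.
std::string buildQuery(const std::vector<std::string>& perceived) {
  std::ostringstream q;
  q << "comp_missingobj:comp_missingObjectTypes([";
  for (std::size_t i = 0; i < perceived.size(); ++i) {
    if (i > 0) q << ", ";
    q << "'" << perceived[i] << "'";  // quote as a Prolog atom
  }
  q << "], MissingObj, MissingTypes)";
  return q.str();
}
```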

Start off with the following template: missing_objects_query.cpp

To test the client, first run in one terminal:

roslaunch missing_objects_tutorial knowrob.launch

Run the executable of missing_objects.cpp in a second terminal and, finally, execute your query in a third terminal.

End-of-Day Run Check List

Terminal 1:

rosrun pcl_ros pcd_to_pointcloud <path_to_your_pcd_file> 1

Terminal 2:

roslaunch table_objects recognition_pipeline_vfh_offline.launch

Terminal 3:

roslaunch missing_objects_tutorial knowrob.launch

Note: This needs re-launching every time you play a new PCD file.

Terminal 4:

Run the executable of your missing_objects.cpp

Terminal 5:

Run the executable of your missing_objects_query.cpp

Troubleshooting

Solutions

missing_objects_solution.cpp

missing_objects_query_solution.cpp

