Running Instructions for Virtual Reality demo

Initial State

1. Go to the robot lab and look for the desktop computer to which a headset and motion controllers are connected (it is in a corner of the lab).
2. If that computer is not turned on, turn it on.
3. Log in with the user account “RobCog” and the password “robcog”.
4. Find Steam and start it. In the upper-right corner of the window there is a small icon which starts the detection of the motion devices.
5. Wait until the headset and each of the motion controllers have been detected (the icons stop blinking and turn green).
6. Start the Unreal environment by clicking the VR-demo icon on the desktop.
7. To navigate in the Unreal editor, constantly hold the right mouse button and use the WASD keys to move (Q and E for going down and up).
8. In the upper right there is a camera symbol with which you can set the camera navigation speed.
9. Pressing “Play preview” starts the virtual environment.
  * If someone complains about a blurred view, the headset is not positioned correctly over the eyes.
  * Try to keep the cable of the headset behind you.
10. Close all programs.
11. Log out of the Windows session.
12. If asked about more videos, show the bookmarked Vimeo videos in the internet browser (Firefox).

Put some oil into the pot and the pot onto the bottom level:

Align it between the black tape markers, such that the pot touches the markers. The handles should be parallel to the edge.

Put the lid and the bowl into the top right drawer. Put some popcorn into the bowl, just enough to cover the bottom of the bowl.

The lid should be in the corner, touching the drawer walls. The bowl should have some space in front of it, as the robot is going to apply a front grasp to it. This means: don't put the bowl too close to the front wall of the drawer, or the robot's hand won't fit in.

Put the plate into the top left drawer:

The robot is going to apply a diagonal side grasp to the plate, so leave some space to the right of and in front of the plate, or the robot's hand won't fit in.

Put the salt cellar onto the table on the left back side:

There is a small “+” drawn on the table. It is best to set the salt cellar right on it. The robot will look for the salt cellar in the whole region left of the black plate. Danger: if the salt cellar is too far away, the robot might be unable to reach it. So just put it on the plus to stay on the safe side.

Running Steps

Login into pr2a machine (password is “popcorndemo”):

ssh -X popcorndemo@pr2a

You will find all the steps in the home directory of the popcorndemo user on PR2:

/home/popcorndemo/popcorn-demo-running-instructions.md

Now please follow the steps in that file.

RViz Setup

You can find a good RViz configuration in the home directory of the popcorndemo user on the PR2:

/home/popcorndemo/.rviz/default.rviz

Feel free to copy it over to your local machine or use VirtualGL or “ssh -X” to run RViz directly on the robot.
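The copy step can be sketched as follows (a sketch: it assumes RViz on your local machine looks for `default.rviz` in `~/.rviz`, as Indigo-era RViz does):

```shell
# Copy the prepared RViz configuration from the robot to your local machine
mkdir -p ~/.rviz
scp popcorndemo@pr2a:/home/popcorndemo/.rviz/default.rviz ~/.rviz/default.rviz

# RViz picks up ~/.rviz/default.rviz automatically, or load it explicitly:
rosrun rviz rviz -d ~/.rviz/default.rviz
```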

You will need the following things visualized in RViz:

  • Navigation map: Add → Map, Topic: /map
  • Base laser sensor data: Add → LaserScan, Topic: /base_scan
  • Kinect v.1 sensor data: Add → PointCloud2, Topic: /kinect_head/depth_registered/points
  • Localization hypotheses: Add → PoseArray, Topic: /particlecloud
  • Robot state: Add → RobotModel, Robot Description: robot_description

When localizing the robot right after startup, you need to click the “2D Pose Estimate” button. Keep in mind that to use this button from your local machine, you need to have your ROS_IP exported correctly:

$ export ROS_IP=192.168.xxx.xxx

If you don't do this, you will only have one-sided communication with the ROS master: you will hear what the robot is saying, but the robot won't hear you. To make sure, listen to the “clicked_point” topic on the robot to see if RViz clicks reach their destination.
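The check can be sketched like this (the IP address is only an example; the `pr2a` hostname is taken from the login step above):

```shell
# On your local machine, before starting RViz:
export ROS_IP=192.168.1.42   # example; use your machine's actual IP on the robot network

# On the robot (ssh popcorndemo@pr2a), watch for clicks coming from RViz:
rostopic echo /clicked_point
# Now use the "Publish Point" tool in RViz; if nothing is printed here,
# ROS_IP is probably set wrongly on your local machine.
```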

Possible failures

If you need to stop the demo:

pkill popcorn_cooking

and then you can restart the demonstration at the desired step.
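To make sure everything is really down before restarting, a small sketch (it assumes all demo processes match the name `popcorn_cooking`, as in the `pkill` pattern above):

```shell
# Stop the demo; pkill returns non-zero when nothing matched, hence || true
pkill popcorn_cooking || true
# Verify that nothing is left running
pgrep -l popcorn_cooking || echo "no popcorn_cooking processes left"
```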

If roscore is already running, run this command to list the running processes:

robot plist

If necessary, to stop the processes run the command:

robot stop

Troubleshooting

The localization is not precise enough

If people have updated the PR2 recently, then the package ros-indigo-pr2-navigation-global overwrites our local AMCL configuration.

To fix it:

roscd pr2_navigation_global
sudo cp amcl_node.xml amcl_node.xml.orig
sudo cp amcl_node_iai.xml amcl_node.xml

That overwrites the AMCL configuration with the parameters that Alexis chose on 2017-03-09, when Gheorghe was having weird localization issues. The parameters make AMCL trust the odometry much less (and the laser more), and switch the odometry model to “omni-corrected”, which fixes a long-standing bug that made the localization much worse than it needed to be. The number of laser rays evaluated is higher, and the distance travelled before recalculating the localization was reduced. Both of those increase CPU usage, but with the new PCs it is not significant.
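To confirm that the IAI parameters are actually the ones loaded, a sketch (the parameter names are assumptions based on the standard AMCL node; the node may be namespaced differently on the PR2, so check with `rosparam list | grep amcl` first):

```shell
# After restarting the navigation stack, inspect the live AMCL parameters
rosparam get /amcl/odom_model_type   # should print: omni-corrected
rosparam get /amcl/laser_max_beams   # should be higher than the stock value
```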

The zero of the joints is off

We often have the wrist angle on either arm wrong by about 30 degrees. You can spot this in RViz: the visualized wrist angles don't match the real robot.

Re-doing the startup calibration fixes this. You can do that by turning the robot completely off and restarting, or by following the instructions here. Also check the pr2_controller_manager Troubleshooting page.

Basically:

_Turn motors off_ and force the hand into a straight position
robot start
rosrun pr2_controller_manager pr2_controller_manager stop r_arm_controller
roscd pr2_controller_configuration
rosparam load pr2_calibration_controllers.yaml
rosparam set cal_r_wrist/force_calibration true
rosrun pr2_controller_manager pr2_controller_manager spawn cal_r_wrist
_Turn motors on_: the robot should start moving its wrist in calibration; wait for the movement to complete
robot start
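After the wrist movement finishes, you can sanity-check that the startup calibration completed (a sketch: on a stock PR2 bringup a latched “/calibrated” Bool topic is published, but verify that the topic exists on this robot with `rostopic list` first):

```shell
# Should print "data: True" once the startup calibration is done
rostopic echo -n1 /calibrated
```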
virtual.1527782398.txt.gz · Last modified: 2018/05/31 15:59 by awich
