Dear all,
I'm trying to simulate the Multisense SL device in Gazebo.
I've started by modifying the plugin available here:
https://bitbucket.org/osrf/drcsim/src/194be8500fef81593f79607a21ee2badd9700a0e/drcsim_gazebo_ros_plugins/src/?at=default
The Gazebo-related URDF is the following:
(The XML tags of this block were stripped when the post was archived; only the parameter values survive. They appear to correspond to: laser — visualize false, update rate 40 Hz, 1081 samples, resolution 1, scan angles -2.356194 to 2.356194 rad, range 0.10–30.0 m at 0.01 m resolution, Gaussian noise (mean 0.0, stddev 0.0), topic `${prefix}/lidar_scan`, frame `${prefix}/head_hokuyo_frame`; stereo cameras — update rate 10.0 Hz, horizontal FOV 1.3962634 rad, 1024x1024 R8G8B8, clip 0.02–300 m, Gaussian noise (0.0, 0.007), right camera offset 0.07 m, camera name `${prefix}/camera`, image topic `image_color`, camera info topic `camera_info`, frames `${prefix}/left_camera_optical_frame` and `${prefix}/right_camera_optical_frame`, baseline 0.07 m, distortion coefficients 0.0; IMU — update rate 1000.0 Hz, Gaussian noise (rate stddev 2e-4, bias 0.0000075/0.0000008; accel stddev 1.7e-2, bias 0.1/0.001), namespace `${robot_name}`.)
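Since the markup of that block did not survive, here is a sketch of how the laser and stereo sections would read with the values above filled back in. The tag and plugin names follow the standard `gazebo_ros_laser` and `gazebo_ros_multicamera` URDF conventions, not necessarily my exact file:

```xml
<!-- Laser: values recovered from the flattened block above -->
<sensor type="ray" name="head_hokuyo_sensor">
  <visualize>false</visualize>
  <update_rate>40</update_rate>
  <ray>
    <scan>
      <horizontal>
        <samples>1081</samples>
        <resolution>1</resolution>
        <min_angle>-2.356194</min_angle>
        <max_angle>2.356194</max_angle>
      </horizontal>
    </scan>
    <range>
      <min>0.10</min>
      <max>30.0</max>
      <resolution>0.01</resolution>
    </range>
    <noise>
      <type>gaussian</type>
      <mean>0.0</mean>
      <stddev>0.0</stddev>
    </noise>
  </ray>
  <plugin name="head_hokuyo_controller" filename="libgazebo_ros_laser.so">
    <topicName>${prefix}/lidar_scan</topicName>
    <frameName>${prefix}/head_hokuyo_frame</frameName>
  </plugin>
</sensor>

<!-- Stereo pair: 10 Hz, 1024x1024 R8G8B8, publishing image_color -->
<sensor type="multicamera" name="stereo_camera">
  <update_rate>10.0</update_rate>
  <camera name="left">
    <horizontal_fov>1.3962634</horizontal_fov>
    <image><width>1024</width><height>1024</height><format>R8G8B8</format></image>
    <clip><near>0.02</near><far>300</far></clip>
    <noise><type>gaussian</type><mean>0.0</mean><stddev>0.007</stddev></noise>
  </camera>
  <camera name="right">
    <pose>0 -0.07 0 0 0 0</pose>
    <horizontal_fov>1.3962634</horizontal_fov>
    <image><width>1024</width><height>1024</height><format>R8G8B8</format></image>
    <clip><near>0.02</near><far>300</far></clip>
    <noise><type>gaussian</type><mean>0.0</mean><stddev>0.007</stddev></noise>
  </camera>
  <plugin name="stereo_camera_controller" filename="libgazebo_ros_multicamera.so">
    <alwaysOn>true</alwaysOn>
    <updateRate>10.0</updateRate>
    <cameraName>${prefix}/camera</cameraName>
    <imageTopicName>image_color</imageTopicName>
    <cameraInfoTopicName>camera_info</cameraInfoTopicName>
    <frameName>${prefix}/left_camera_optical_frame</frameName>
    <hackBaseline>0.07</hackBaseline>
  </plugin>
</sensor>
```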
To simulate the stereo pipeline, I need to run ROS's `stereo_image_proc` node, which requires remapping the topic name "image_raw" (the node's default) to "image_color" (the standard name on the real Multisense hardware).
The remap is done in a launch file (the original snippet did not survive archiving).
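A minimal sketch of such a remap, assuming the namespace and topic names from the description above, could look like:

```xml
<launch>
  <!-- Run stereo_image_proc in the camera namespace and remap its
       expected "image_raw" inputs to the "image_color" topics that the
       simulated Multisense publishes. -->
  <group ns="multisense/camera">
    <node pkg="stereo_image_proc" type="stereo_image_proc" name="stereo_image_proc">
      <remap from="left/image_raw" to="left/image_color"/>
      <remap from="right/image_raw" to="right/image_color"/>
    </node>
  </group>
</launch>
```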
When I run the two together, I measure more than 600 Hz on the `/multisense/camera/left/image_color` topic (even though the update rate is set to 10 Hz), while `/multisense/camera/points2` runs at only 0.2 Hz.
If I put `image_raw` in both files instead, both topics correctly run at 10 Hz.
Any suggestions?
I'm running the simulation on Gazebo 2.2 with Indigo and Ubuntu 14.04.