As far as I can see, ROS Hydro has the following packages for the iRobot Create:
ros-hydro-create-dashboard
ros-hydro-create-description
ros-hydro-create-driver
ros-hydro-create-gazebo-plugins
ros-hydro-create-node
But I couldn't find a tutorial on how to run the Create in the Gazebo simulator.
↧
How do I use the iRobot Create in Gazebo with ROS Hydro?
↧
Explore the Gazebo World (turtlebot)
I'm having a couple of inter-related problems working through the Explore the Gazebo World Turtlebot tutorial (http://wiki.ros.org/turtlebot_gazebo/Tutorials/indigo/Explore%20the%20Gazebo%20world). The main problem is #2 below.
1. Step 3 says to perform the steps from the Gazebo Bringup Guide (http://wiki.ros.org/turtlebot_gazebo/Tutorials/indigo/Gazebo%20Bringup%20Guide). Then, in Step 4, there's a contradiction: Step 4 says that the world is empty, whereas, after the Gazebo Bringup Guide, the world has five objects in it. Step 4 then suggests adding objects to the world.
2. Step 4 shows RViz displaying the objects that appear in Gazebo. That's not what I'm experiencing. I don't see any objects in RViz: neither the 5 objects that were present when I brought up Gazebo in Step 3, nor any objects that I manually add in Gazebo.
So, why isn't RViz showing the objects that are showing in Gazebo? How can I get RViz to do that?
BTW, I'm using ROS Indigo on Ubuntu 14.04 running on a Mac with OS X 10.9.4 and Parallels 11.
↧
↧
Load SDF data with PCL
I'm searching for a fast way to create a point cloud in PCL representing the room in which our robot moves. We already have a representation of the room in Gazebo, so one thought was that there might be an easy way to load the SDF file with PCL, but I didn't find anything in the PCL API documentation. Another thought was that there might be a quick way to convert the SDF file to a PCD file, but so far I haven't found anything in that direction either.
So, does anyone know of a quick way to load data from an SDF file with PCL?
↧
How do I avoid this error: Error [gazebo_motor_model.cpp:144] Aliasing on motor [0] might occur. Consider making smaller simulation time steps or raising the rotor_velocity_slowdown_sim_ param.?
I have added a two-link arm to a quadrotor, and now I want to add an effort controller to the arm. When I apply the controller, I get the following error:
Error [gazebo_motor_model.cpp:144] Aliasing on motor [0] might occur. Consider making smaller simulation time steps or raising the rotor_velocity_slowdown_sim_ param.
I tried changing the rotor_velocity_slowdown_sim_ param, but the same error appeared.
↧
How do I set the default namespace for the effort_controller?
I am trying to apply an effort controller to an arm, and the effort_controller expects robot_description to be present in the root namespace. When I call the effort_controller in the launch file, I specify the namespace of the controller, yet it still expects the robot_description parameter to be in the root namespace.
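For context, the kind of launch setup I mean looks roughly like this (a hedged sketch; the package, file, and controller names are placeholders, not my actual ones):

```xml
<launch>
  <group ns="my_robot">
    <!-- robot_description is loaded INSIDE the namespace here,
         but the spawner still seems to look for it at the root. -->
    <param name="robot_description"
           command="$(find xacro)/xacro $(find my_robot_description)/urdf/arm.urdf.xacro" />
    <rosparam file="$(find my_robot_control)/config/effort_controllers.yaml" command="load" />
    <node name="controller_spawner" pkg="controller_manager" type="spawner"
          args="joint1_effort_controller joint2_effort_controller" />
  </group>
</launch>
```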
↧
↧
My custom-made mobile manipulator using Husky doesn't move.
I have been stuck on this issue for the past week. I mounted a custom-made arm on top of a Husky. Independently, the arm and the Husky work fine: I am able to move the arm using MoveIt! and move the Husky using the Husky control package.
My problem starts when I attach the arm on top of the Husky. I noticed the following:
/autoarm
When I include this in my URDF, I can move my arm. But if I change this code to:
/
I can use the husky_control package to move my Husky. So my problem is that I can't move both my Husky and my arm at the same time. This might be a silly question, but I am totally lost.
↧
Robot distance/angle positioning with sensors
Hello guys,
In my project, which I successfully simulated with Gazebo ROS, my robot needs to position itself exactly perpendicular to a metallic surface, specifically 3 meters from the side door of a car, so that the front of the robot points in the same direction as the car.
I successfully solved the issue in Gazebo ROS using a lidar sensor.

The algorithm works the following way: I calculated the distance/angle to the target surface using 3 different scan ranges from the lidar output. Using a few trigonometric functions, I was able to calculate the angle correction that had to be applied to the robot's base_link to make it perfectly perpendicular to the target.
So everything is perfect in simulation!
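I didn't post my code, but the kind of trigonometry involved can be sketched like this (a minimal sketch, not my exact implementation; the half-angle between the chosen beams and the way you pick ranges out of the LaserScan are hypothetical):

```python
import math

def wall_yaw_correction(r_left, r_right, half_angle):
    """Yaw correction (rad) to make the robot perpendicular to a flat wall.

    r_left / r_right: lidar ranges at +half_angle / -half_angle around the
    centre beam (hypothetical choice; pick the indices from your scan msg).
    """
    # Convert the two polar readings into Cartesian points in the robot frame.
    p_left = (r_left * math.cos(half_angle), r_left * math.sin(half_angle))
    p_right = (r_right * math.cos(half_angle), -r_right * math.sin(half_angle))
    # The wall direction is the vector between the two hit points.
    wall_angle = math.atan2(p_left[1] - p_right[1], p_left[0] - p_right[0])
    # A perpendicular robot sees the wall at +90 degrees in its own frame.
    return wall_angle - math.pi / 2

# Symmetric readings -> already perpendicular, correction is 0.
print(round(wall_yaw_correction(2.0, 2.0, 0.1), 6))  # → 0.0
```

With noisy real-world ranges you would of course filter or average several beams before feeding them in.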

The pain now comes in the real world...
I tested my align-base function in the real world using an RPLidar (RoboPeak lidar).
I noticed that the lidar scan output on a black metal surface is very unreliable.
Consider that black metal is very common on a car body (I haven't tested other colors yet).
**Now my question is: did I use the wrong type of sensor for this sort of application, or is there any workaround you can suggest to solve the issue?**
Can't wait to hear your opinions!
Thanks!
↧
turtlebot with namespace in gazebo and visualization in rviz
Hi, I need to bring up multiple turtlebots in Gazebo, so I gave each of them a namespace. I did this by modifying the turtlebot_world.launch file in the turtlebot_gazebo pkg. If I just bring up one turtlebot without a namespace, I can use `roslaunch turtlebot_gazebo gmapping_demo.launch` and `roslaunch turtlebot_rviz_launchers view_navigation.launch` to open rviz successfully. What I did is this:
1. bring up one turtlebot with namespace robot0,
2. change the arg "scan_topic" from /scan to robot0/scan in gmapping.launch.xml,
3. create an agent.rviz file by adding robot0/ before several topics in navigation.rviz, e.g. from Topic: /mobile_base/sensors/bumper_pointcloud to Topic: robot0/mobile_base/sensors/bumper_pointcloud.
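For reference, the namespacing in step 1 can be sketched like this in a launch file (a hedged sketch; `robot0` and the comment placeholders stand in for whatever my modified turtlebot_world.launch actually contains):

```xml
<launch>
  <!-- Everything inside this group is pushed into the robot0 namespace,
       so topics become /robot0/scan, /robot0/mobile_base/..., etc. -->
  <group ns="robot0">
    <param name="tf_prefix" value="robot0" />
    <!-- model spawner, robot_state_publisher, mobile base nodes go here -->
  </group>
</launch>
```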
But all I can achieve is bringing up multiple turtlebots with different namespaces; rviz cannot find these turtlebots. I don't know what I should do to make things work.
Thanks
↧
Diff_drive Plugin
Hi, I have written a URDF file for a 2-wheeled robot, but I can't seem to get it to drive properly using the diff_drive_controller Gazebo plugin. Instead of driving normally, the robot somersaults in the direction I want it to move in. Is there anything I'm doing wrong when setting the parameters of the plugin? I'm new to ROS. false false false true true 100.0 base_lwheel base_rwheel 0.26 0.065 1 20 cmd_vel odom odom base_link
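The parameter values above lost their XML tags in formatting; for comparison, a typical `libgazebo_ros_diff_drive.so` block looks roughly like this (a hedged sketch — the tag-to-value mapping is my best guess at how the values line up, so double-check it against the actual URDF):

```xml
<gazebo>
  <plugin name="differential_drive_controller" filename="libgazebo_ros_diff_drive.so">
    <updateRate>100.0</updateRate>
    <leftJoint>base_lwheel</leftJoint>
    <rightJoint>base_rwheel</rightJoint>
    <wheelSeparation>0.26</wheelSeparation>
    <wheelDiameter>0.065</wheelDiameter>
    <torque>20</torque>
    <commandTopic>cmd_vel</commandTopic>
    <odometryTopic>odom</odometryTopic>
    <odometryFrame>odom</odometryFrame>
    <robotBaseFrame>base_link</robotBaseFrame>
  </plugin>
</gazebo>
```

A somersaulting robot is often caused by wrong wheel joint axes or bad inertia values rather than the plugin parameters themselves, so those are worth checking too.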
↧
↧
How to realize mecanum wheels in Gazebo?
Hello, I am developing a gazebo application for a mecanum robot. Could anyone tell me how to realize mecanum wheels in gazebo?
↧
Gazebo depth sensor not working properly in RViz?
So I'm busy setting up a proper simulation environment for my robot. In the process I added depth sensors to the robot's URDF file, and the images come out great. However, the point clouds show me planes instead of points, so there is a problem somewhere.
To verify it wasn't just my robot, I opened up the turtlebot in Gazebo and rviz, and it shows me the same thing as the robot I'm working on. So it is a general problem on my side, but I've no idea what it can be. Maybe I've failed to install an important part?
Does anyone have an idea? Thanks in advance.
For info, I'm on 14.04 and running Indigo.

↧
Spawning and controlling two different robots in Gazebo
Hi!
Currently there's a UR5 robot, which is working and doing its stuff (with MoveIt).
My task is to have a ball (which in reality is a Sphero ball) that goes around in circles near the UR5 (so that later the UR5 can do stuff with it).
I have already made a Sphero-like ball that can go in circles. It has its own URDF, gazebo_ros_control, launch file, etc.
And of course the UR5 has these too.
I tried launching both of these with a launch file, which of course didn't work. I saw all kinds of tutorials about spawning the same robot multiple times, which isn't what I want. I want to spawn two **different** robots.
Spawning the complete UR5 and the sphero model (without a controller, just the model) is easy, but when it comes to controlling the sphero too (spawning a gazebo_ros_control, or somehow using the UR5's), that's where things get nasty.
One of the problems (I think) is that both URDFs have a gazebo_ros_control plugin section, and I think spawning 2 ros controllers won't really work.
Deleting this plugin from the ball's URDF doesn't help; then I can't control it (I can't remember exactly, but I think even though the .yaml is loaded, it can't find the controllers or something like that).
The other problem is probably namespace. Making a correct namespace for my simple ball isn't hard, but for the UR5... I think that's a bit complex.
I tried making a merged URDF, so the ball is part of the UR5's system (so there won't be problem with namespaces and stuff), but the problem is that I need to connect the ball to the UR5.
Simple: there's already a world link for the UR5, so I just connect the ball to the world with a floating joint... well, it seems floating joints are not supported, or I don't know, but when I start the program it says:
[ WARN] [1469109391.706616642]: Converting unknown joint type of joint 'world_ball_joint' into a fixed joint
Connecting the ball to the world with a fixed joint seems to be a bit of a problem.
I tried searching everywhere on the internet, but there's nothing on spawning different robots, just the usual multi-robot system where there's one URDF, one robot, spawned multiple times in different namespaces.
I found a post about two different things (an arm and a hand) being put together, but that uses a fixed joint (and a system where there's an arm with a tool at the end of it is quite common).
Is there a way to do this?
*It would be great to have a ball moving around the UR5 (without us moving it through Gazebo's GUI), so that the UR5 (with a kinect) can find it and point at the ball (or grab it).*
Thank you.
↧
Gazebo ROS Collision Detection
Hello all,
I am trying to get a topic with the collisions by using the gazebo_ros_bumper plugin, but none of the resources I have found online have worked for me.
Here is my xacro code for the robot arm.
/robarm
Here is the code for my kuka arm:
true Gazebo/${material} transmission_interface/SimpleTransmission PositionJointInterface 1
Here is the code for the SDH Hand:
true Gazebo/${material} transmission_interface/SimpleTransmission PositionJointInterface 1
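For reference, the usual way to get contact data out is a `contact` sensor wrapped around `libgazebo_ros_bumper.so`, roughly like this (a hedged sketch; `arm_link` and the topic name are placeholders, and the `<collision>` value must match the collision name Gazebo actually generates for the link, which is often `<link>_collision`):

```xml
<gazebo reference="arm_link">
  <sensor name="arm_contact_sensor" type="contact">
    <always_on>true</always_on>
    <update_rate>50.0</update_rate>
    <contact>
      <!-- Must match the auto-generated collision name of the link -->
      <collision>arm_link_collision</collision>
    </contact>
    <plugin name="arm_bumper" filename="libgazebo_ros_bumper.so">
      <bumperTopicName>arm_bumper</bumperTopicName>
      <frameName>world</frameName>
    </plugin>
  </sensor>
</gazebo>
```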
↧
↧
Move my robot-Tutorial (General Question)
Hello,
I am new to ROS (I am using ROS Kinetic) and Gazebo and so far what I have managed to do are the following:
- Create a URDF model of a 4-wheel robot with a camera on top, a base_link, and 4 continuous joints (connecting the wheels to the base_link) with controllers, transmissions and actuators on each one. I also created the appropriate configuration YAML file for the control of my robot.
- Create launch files for rviz and Gazebo and successfully display my robot in these environments.
- Get images from the camera plugin I added, in Gazebo.
My next step is to move my robot. I found this [tutorial](http://gazebosim.org/tutorials/?tut=ros_comm), but it is not what I am looking for, and I am kind of lost right now. I also checked the [MoveIt](http://docs.ros.org/indigo/api/pr2_moveit_tutorials/html/planning/src/doc/move_group_interface_tutorial.html) C++ beginner tutorial, but it refers to the PR2 robot and doesn't help me start something on my own and build from it.
To be more specific, what I am thinking of doing next is to create nodes (for the motors of the robot and the camera) and, depending on the image I get from the camera, send an appropriate command to the motors to either stop or keep moving, but I cannot find a tutorial that could help me with that. Even a tutorial with simple movement of a robot would help, i.e. telling a motor node to move forward and simulating this movement in Gazebo.
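The stop-or-go logic I have in mind can be prototyped independently of ROS; a minimal sketch (the threshold and the idea of reducing the camera image to a single "obstacle fraction" number are hypothetical):

```python
def motor_command(obstacle_fraction, cruise_speed=0.5):
    """Map a camera-derived obstacle measure to a (linear, angular) command.

    obstacle_fraction: share of the image centre occupied by an obstacle,
    in [0, 1] (hypothetical; it would be computed from the camera image).
    Returns a (linear_x, angular_z) pair that a node would publish as a
    geometry_msgs/Twist on the robot's cmd_vel topic.
    """
    if obstacle_fraction > 0.3:   # hypothetical threshold: something is close
        return (0.0, 0.0)         # stop
    return (cruise_speed, 0.0)    # keep moving forward

print(motor_command(0.05))  # → (0.5, 0.0): path clear, drive on
print(motor_command(0.9))   # → (0.0, 0.0): obstacle, stop
```

In an actual node this function would sit in the camera subscriber's callback, with the returned pair filled into a Twist message.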
If you know a tutorial that could be useful for me to start from, I would appreciate your help.
Thanks for your answers and for your time in advance.
↧
Is the LiftDrag plugin applicable to an ornithopter?
I am currently working on an ornithopter model in Gazebo 7, integrated with ROS Indigo via the gazebo_ros wrapper. I am not good at aerodynamics. I used the LiftDrag plugin on the wings of my model, and it could achieve reasonable lift at a certain flapping frequency. But the LiftDrag plugin was, as far as I can tell, created solely for aircraft. Is it correct to apply the plugin to ornithopters? [Note: the model lifts only when the LiftDrag plugin is applied]
(LiftDrag plugin : https://bitbucket.org/osrf/gazebo/src/ca2a71952a3823494afe2aafe9b6996c40e31539/plugins/LiftDragPlugin.cc?at=default&fileviewer=file-view-default)
↧
How to add two separate diff-driven motors to the robot
Hello,
I am using ROS Kinetic and I have built a 4-wheel robot, which I want to control. The robot can be seen in the following picture:
[Screenshot from 2016-07-25 18-49-27.png](/upfiles/14694617997897263.png)
To this robot I have added 2 motors (at the two front wheels) and also differential_drive_controller plugins:
true 100 joint_f_r_wheel 0.4 20 motor1/cmd_vel odom odom base_footprint labrob true true 100 joint_f_l_wheel 0.4 20 motor2/cmd_vel odom odom base_footprint labrob true
at the wheels that I want to control. But when I run it in gazebo and then do:
rostopic list
I cannot see the topics motor1/cmd_vel and motor2/cmd_vel.
If, however, I add only one plugin controlling both motors at the same time (in which case the robot doesn't turn),
everything works fine, and after `rostopic list` I can see the labrob/cmd_vel topic and publish messages on it.
Could you please tell me what I should change so that I can control the motors separately?
Thank you for your answers and your time in advance.
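The flattened parameters above suggest two `libgazebo_ros_diff_drive.so` instances differing in joint name and command topic; spelled out as XML, one of them would look roughly like this (a hedged reconstruction — the tag-to-value mapping is my guess, and note that this plugin expects a left and a right joint per instance, which may itself be the problem with one plugin per wheel):

```xml
<gazebo>
  <plugin name="motor1_controller" filename="libgazebo_ros_diff_drive.so">
    <alwaysOn>true</alwaysOn>
    <updateRate>100</updateRate>
    <!-- The plugin drives a PAIR of joints; a single wheel joint per
         instance will not give two independent cmd_vel topics. -->
    <leftJoint>joint_f_l_wheel</leftJoint>
    <rightJoint>joint_f_r_wheel</rightJoint>
    <wheelSeparation>0.4</wheelSeparation>
    <torque>20</torque>
    <commandTopic>motor1/cmd_vel</commandTopic>
    <odometryTopic>odom</odometryTopic>
    <odometryFrame>odom</odometryFrame>
    <robotBaseFrame>base_footprint</robotBaseFrame>
    <robotNamespace>labrob</robotNamespace>
  </plugin>
</gazebo>
```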
↧
ROS/Gazebo Wide-angle Camera (360º)
Hi, I need to make a 360º camera with Gazebo and ROS to do a mapping of a place. I know it is easier to do with a normal unidirectional camera, but for my purpose it should be a wide-angle camera. I have been looking for information on how to change the FOV or the other parameters of the camera, but I'm finding it nearly impossible. I tried to understand the libraries of the cameras in ROS and also tried to change the FOV from a .world file in Gazebo. I also tried to follow this tutorial: http://gazebosim.org/tutorials?tut=wide_angle_camera&branch=wideanglecamera, but I get nothing but errors. Maybe because I'm using Gazebo 2? But I found that I need to use this version with ROS Indigo.
Please, any help? Maybe a tutorial on developing a camera sensor from scratch? It would help me understand how a simulated camera works; maybe then I could make my own. I would appreciate any help, I'm getting quite desperate with this.
↧
↧
add AR tag in gazebo
Hello everyone,
I'm wondering how to add an `AR tag` to the Gazebo world. I have not found the right way to solve this problem. Or could I add an image to the world?
Thanks in advance!
↧
Problem using a lidar sensor in gazebo
Hello,
I am new to ROS and I am using ROS Kinetic. So far I have found URDF code for a Hokuyo laser sensor, which I would like to use for simulations, and I have put it on my robot:
0 0 0 0 0 0 false 40 720 1 -1.570796 1.570796 0.10 30.0 0.01 gaussian 0.0 0.01 /labrob/laser/scan hokuyo_link
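The sensor block above lost its XML tags in formatting; reconstructed from the values, it presumably looked roughly like the standard Gazebo Hokuyo snippet (a hedged reconstruction — `hokuyo_link` comes from the frame_id, the rest from the flattened numbers):

```xml
<gazebo reference="hokuyo_link">
  <sensor type="ray" name="hokuyo_sensor">
    <pose>0 0 0 0 0 0</pose>
    <visualize>false</visualize>
    <update_rate>40</update_rate>
    <ray>
      <scan>
        <horizontal>
          <samples>720</samples>
          <resolution>1</resolution>
          <min_angle>-1.570796</min_angle>
          <max_angle>1.570796</max_angle>
        </horizontal>
      </scan>
      <range>
        <min>0.10</min>
        <max>30.0</max>
        <resolution>0.01</resolution>
      </range>
      <noise>
        <type>gaussian</type>
        <mean>0.0</mean>
        <stddev>0.01</stddev>
      </noise>
    </ray>
    <plugin name="gazebo_ros_hokuyo" filename="libgazebo_ros_laser.so">
      <topicName>/labrob/laser/scan</topicName>
      <frameName>hokuyo_link</frameName>
    </plugin>
  </sensor>
</gazebo>
```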
I can successfully roslaunch the robot with the sensor in Gazebo and see that the `/labrob/laser/scan` topic is being used by the sensor. I have tried adding objects really close to the sensor (tables, for example), but I get the same data as when the sensor is in an empty world. The data being published on the topic is the following:
header:
  seq: 37775
  stamp:
    secs: 1156
    nsecs: 736000000
  frame_id: hokuyo_link
angle_min: -1.57079994678
angle_max: 1.57079994678
angle_increment: 0.00436940183863
time_increment: 0.0
scan_time: 0.0
range_min: 0.10000000149
range_max: 30.0
ranges: [-inf, -inf, -inf, ...]       (all 720 values are -inf)
intensities: [1.0, 1.0, 1.0, ...]     (all 720 values are 1.0)
Has anyone experienced the same problem?
↧
range_sensor_layer marks max sonar range as obstacle?
I'm using Gazebo to simulate a 4-wheeled differential drive robot. The robot has a forward sonar sensor, so I added a [simulated sonar sensor](http://wiki.ros.org/hector_gazebo_plugins#GazeboRosSonar).
The sonar sensor appears to work; it detects obstacles and the Range messages look correct (e.g. max range value when no obstacles). The visualization in Gazebo also looks correct.
I use the [range_sensor_layer package](http://wiki.ros.org/range_sensor_layer) as a plugin for `costmap_2d`.
My issue is that for some reason, when there is no obstacle and the sonar sensor is at max range, the cost map registers an obstacle.
Below is a screenshot of Gazebo (left), Rviz (top right), and the echo'd Range message (bottom right). I rotated the vehicle in a circle without any obstacles, yet the costmap shows that the robot is surrounded by obstacles.

Now, there's a parameter in `range_sensor_layer` called `clear_on_max_reading`, which will remove obstacles when a reading is at max. However, I've found that this does more harm than good, because it accidentally clears away actual obstacles.
For example, when it's navigating it runs along the wall and starts creating a wall of obstacles. Eventually it decides to turn, and as the range value maxes out, it clears a whole chunk of actual obstacle. Now there's a hole in the wall, so it tries to navigate towards the hole, and relearns that it's indeed an obstacle. This **repeats forever**. It's both funny and infuriating.
Here are the YAML files I'm using for my costmap:
### costmap_common_params.yaml
map_type: costmap
origin_z: 0.0
z_resolution: 1
z_voxels: 2
obstacle_range: 0.5
raytrace_range: 0.5
footprint: [[-0.21, -0.165], [-0.21, 0.165], [0.21, 0.165], [0.21, -0.165]]
footprint_padding: 0.1
plugins:
  - {name: sonar_layer, type: "range_sensor_layer::RangeSensorLayer"}
  - {name: inflater_layer, type: "costmap_2d::InflationLayer"}
sonar_layer:
  ns: ""
  topics: ["/sonar"]
  no_readings_timeout: 1.0
  clear_threshold: 0.2
  mark_threshold: 0.80
  clear_on_max_reading: false
inflater_layer:
  inflation_radius: 0.3
### local_costmap_params.yaml
local_costmap:
  global_frame: odom
  robot_base_frame: base_link
  update_frequency: 20.0
  publish_frequency: 5.0
  width: 10.0
  height: 10.0
  resolution: 0.05
  static_map: false
  rolling_window: true
### global_costmap_params.yaml
global_costmap:
  global_frame: odom
  robot_base_frame: base_link
  update_frequency: 20
  publish_frequency: 5
  width: 40.0
  height: 40.0
  resolution: 0.05
  origin_x: -20.0
  origin_y: -20.0
  static_map: true
  rolling_window: false
In my robot URDF I include the [sonar_sensor macro](https://github.com/tu-darmstadt-ros-pkg/hector_models/blob/indigo-devel/hector_sensors_description/urdf/sonar_sensor.urdf.xacro) and instantiate my sonar sensor like so:
I'm not sure what's going on here. I'd appreciate any help.
↧