Hi!
Currently I have a diff_wheeled_robot and a kinect inside Gazebo (these are separate "robots", not connected by any joint). The diff_wheeled_robot has its own scanner, has its own AMCL launch file, etc. I can navigate it around, and while it is moving, if I put an obstacle in front of it, it will go around it.
The kinect has a depthimage_to_laserscan node, so it's basically a laser scanner.
My goal is to make the diff_wheeled_robot use the kinect's scan data to navigate instead of its own Hokuyo scanner, which is attached to it.
(Basically I want to navigate the robot around, but with a scanner that is in a fixed position. In reality I can't put the kinect on my robot, but I still want the robot to avoid (dynamic) obstacles based on what the kinect sees.)
AMCL wants scan data from the /scan topic, and that part is alright: the kinect publishes on that topic.
The problem is that AMCL wants a base_frame_id and an odom_frame_id so it can connect the robot frame to the odom frame and do its computations, but my kinect isn't connected to either of these.
(In the tf tree, the diff_wheeled_robot and the kinect are separate. The diff_wheeled_robot's base_footprint connects to odom, and that connects to map. The kinect's tree ends at its own base_footprint_kinect.)
If I disable the diff_wheeled_robot's Hokuyo, then AMCL throws this:
[ WARN] [1469782994.436635986, 36.386000000]: Timed out waiting for transform from base_footprint to map to become available before running costmap, tf error: . canTransform returned after 0.103 timeout was 0.1.
My question:
Is it possible to use AMCL/Gmapping and navigation with a mobile robot and a detached/fixed kinect (not fixed to the mobile robot)?
If yes, how?
If no, am I able to use the ROS move_base navigation without AMCL? (That is, use navigation and collision avoidance with the same setup, on the fly; the obstacles are not static.)
Thank you.
Update #1: Okay, it seems it is not AMCL that throws the warning, but move_base. Still, I have no idea how to solve it. Maybe the better question is: can I use move_base with my setup?
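For clarity, here is a minimal sketch of one way the kinect's tree could be tied into the rest of the TF tree, assuming its fixed pose in the map frame is known. This is only an illustration of what I mean by "connecting" the frames, not something from my current setup; the pose values are placeholders.
// Minimal sketch: broadcast a constant map -> base_footprint_kinect transform
// for a kinect that is fixed in the world. Pose values below are placeholders.
#include <ros/ros.h>
#include <tf/transform_broadcaster.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "kinect_static_tf");
  ros::NodeHandle nh;
  tf::TransformBroadcaster br;

  tf::Transform t;
  t.setOrigin(tf::Vector3(1.0, 2.0, 0.0));          // hypothetical kinect position in the map
  t.setRotation(tf::createQuaternionFromYaw(0.0));  // hypothetical orientation

  ros::Rate rate(20.0);
  while (ros::ok())
  {
    br.sendTransform(tf::StampedTransform(t, ros::Time::now(),
                                          "map", "base_footprint_kinect"));
    rate.sleep();
  }
  return 0;
}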
↧
Mobile robot with fixed kinect - Navigation with/without AMCL?
↧
quaternion.proto: File not found while installing gazebo
I get the following error while trying to run the PX4 Firmware simulation using make posix_sitl_default gazebo:
make[5]: warning: jobserver unavailable: using -j1. Add `+' to parent make rule.
[ 2%] Running C++ protocol buffer compiler on msgs/SensorImu.proto
[ 4%] Running C++ protocol buffer compiler on msgs/CommandAttitudeThrust.proto
[ 6%] Built target sdf
[ 8%] Running C++ protocol buffer compiler on msgs/CommandRateThrust.proto
quaternion.proto: File not found.
vector3d.proto: File not found.
SensorImu.proto: Import "quaternion.proto" was not found or had errors.
SensorImu.proto: Import "vector3d.proto" was not found or had errors.
[ 10%] Running C++ protocol buffer compiler on msgs/CommandMotorSpeed.proto
SensorImu.proto:8:12: "gazebo.msgs.Quaternion" is not defined.
SensorImu.proto:10:12: "gazebo.msgs.Vector3d" is not defined.
SensorImu.proto:12:12: "gazebo.msgs.Vector3d" is not defined.
make[8]: *** [SensorImu.pb.cc] Error 1
make[8]: *** Waiting for unfinished jobs....
[ 12%] Running C++ protocol buffer compiler on msgs/CommandRollPitchYawrateThrust.proto
make[7]: *** [CMakeFiles/mav_msgs.dir/all] Error 2
make[6]: *** [all] Error 2
make[5]: *** [gazebo_build] Error 2
make[4]: *** [src/firmware/posix/CMakeFiles/gazebo] Error 2
make[3]: *** [src/firmware/posix/CMakeFiles/gazebo.dir/all] Error 2
make[2]: *** [src/firmware/posix/CMakeFiles/gazebo.dir/rule] Error 2
make[1]: *** [gazebo] Error 2
make: *** [posix_sitl_default] Error 2
It looks like it cannot find quaternion.proto and vector3d.proto. How can I get those files?
↧
How to create a caster wheel (Urdf model)
I have built a 2-wheel robot and I have been trying for hours to put a correct caster wheel on it. I followed the logic of this [code](https://github.com/qboticslabs/mastering_ros/blob/master/chapter_2_codes/mastering_ros_robot_description_pkg/urdf/diff_wheeled_robot.urdf), in which a sphere with a fixed joint is used, and I managed to make it work (or so I thought) until I commanded my robot to move. The movement was totally wrong: with the fixed-joint sphere, the robot was bumping like crazy. Is there another way to create a caster wheel?
Thanks for answering and for your time in advance.
↧
Spawn urdf-model in gazebo
Hello,
I am trying to spawn my URDF model in Gazebo. I have created the URDF file in ~/catkin_ws/src/robot_description/src/urdf and the model.config file in ~/catkin_ws/src/robot_description/src, with the line: urdf/robot.urdf .
I added "export GAZEBO_MODEL_PATH=/home/timo/catkin_ws/src/" to my .bashrc.
When I start Gazebo and go to Insert, it only shows me the preinstalled model database and the entry /home/timo/catkin_ws/src/, but without any pull-down list.
Have I forgotten anything?
↧
turtlebot not moving with teleop in indigo
I did a fresh install of Ubuntu 14 and ROS Indigo today so I could get going with Turtlebot. I'm using Gazebo 2.2.3.
I am working through the TurtleBot tutorial, and so far everything has been fine up to this point:
http://wiki.ros.org/turtlebot_gazebo/Tutorials/indigo/Explore%20the%20Gazebo%20world
For instance, the Stage Simulator worked great.
For the world simulation, I have entered the following:
roslaunch turtlebot_gazebo turtlebot_world.launch
roslaunch turtlebot_teleop keyboard_teleop.launch
The nodes I get from rosnode list:
/bumper2pointcloud
/cmd_vel_mux
/depthimage_to_laserscan
/gazebo
/laserscan_nodelet_manager
/mobile_base_nodelet_manager
/robot_state_publisher
/rosout
/turtlebot_teleop_keyboard
The Gazebo screen is up, I see the turtlebot in its little room, and I get the following output on the teleop screen when I press different buttons:

I expect the little guy to be moving by default, or when I enter q/w/e. The problem is that the turtle is not moving. Unlike the problem at the following link, I do not have multiple ROS distributions on my system:
http://answers.ros.org/question/29983/turtlebot_teleop-not-working-properly/
The turtle is not up against an object in the world; it is sitting there in open space.
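For what it's worth, here is a minimal sketch of a direct velocity publisher that could be used to rule out the teleop node. The topic name /cmd_vel_mux/input/navi is an assumption based on the standard TurtleBot mux configuration; it should be verified with rostopic list first.
// Minimal sketch: bypass the teleop node and publish a forward velocity directly.
// The topic name is an assumption about the standard TurtleBot mux inputs;
// verify with `rostopic list` before relying on it.
#include <ros/ros.h>
#include <geometry_msgs/Twist.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "turtlebot_cmd_test");
  ros::NodeHandle nh;
  ros::Publisher pub = nh.advertise<geometry_msgs::Twist>("/cmd_vel_mux/input/navi", 1);
  ros::Rate rate(10);

  geometry_msgs::Twist cmd;
  cmd.linear.x = 0.2;  // slow forward motion

  while (ros::ok())
  {
    pub.publish(cmd);
    rate.sleep();
  }
  return 0;
}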
↧
Problem publishing at cmd_vel
Hello,
I have written the following code so I can send commands to my robot's motors, which are listening on the `/labrob/cmd_vel` topic:
#include "ros/ros.h"
#include "geometry_msgs/Twist.h"

int main(int argc, char **argv)
{
  ros::init(argc, argv, "vel");
  ros::NodeHandle vel;
  ros::Publisher vel_pub = vel.advertise<geometry_msgs::Twist>("/labrob/cmd_vel", 100);
  ros::Rate loop_rate(10);
  geometry_msgs::Twist msg;
  msg.linear.x = 0.1;
  msg.linear.y = 0;
  msg.linear.z = 0;
  msg.angular.x = 0;
  msg.angular.y = 0;
  msg.angular.z = 0;
  vel_pub.publish(msg);
  ros::spin();
  return 0;
}
With the `rostopic pub` command from the terminal I can move the robot fine, but with the code I posted, the robot does not respond. What am I doing wrong?
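For reference, here is a minimal sketch of a looping variant of the same publisher (same topic and message type; the only difference is that the command is published repeatedly at 10 Hz instead of a single time before ros::spin()):
// Minimal sketch: same publisher, but the Twist is sent repeatedly at 10 Hz.
#include "ros/ros.h"
#include "geometry_msgs/Twist.h"

int main(int argc, char **argv)
{
  ros::init(argc, argv, "vel_loop");
  ros::NodeHandle nh;
  ros::Publisher vel_pub = nh.advertise<geometry_msgs::Twist>("/labrob/cmd_vel", 100);
  ros::Rate loop_rate(10);

  geometry_msgs::Twist msg;
  msg.linear.x = 0.1;  // constant forward velocity

  while (ros::ok())
  {
    vel_pub.publish(msg);  // keep publishing so late-connecting subscribers still get commands
    ros::spinOnce();
    loop_rate.sleep();
  }
  return 0;
}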
↧
Get depth from Kinect sensor in gazebo simulator
I'm trying to find a specific pixel's depth from the `/depth/image_raw` topic that is published by the Kinect sensor mounted on the **Turtlebot** robot.
This is what I get: www.uppic.com/do.php?img=97090 (sorry, I don't have enough karma to upload the image).
This is what I see in **rviz**: www.uppic.com/do.php?img=97092
How can I fix this so I can get the depth of the pixels?
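For context, this is a minimal sketch of the kind of access I am trying to get: reading the value of one pixel from the depth image via cv_bridge. It assumes the image is encoded as 32FC1 (depth in metres); the pixel coordinates are placeholders.
// Minimal sketch: read the depth of a single pixel from the depth image topic.
// Assumes 32FC1 encoding (metres); pixel coordinates are placeholders.
#include <ros/ros.h>
#include <sensor_msgs/Image.h>
#include <sensor_msgs/image_encodings.h>
#include <cv_bridge/cv_bridge.h>

void depthCallback(const sensor_msgs::ImageConstPtr& msg)
{
  cv_bridge::CvImageConstPtr cv_ptr;
  try
  {
    cv_ptr = cv_bridge::toCvShare(msg, sensor_msgs::image_encodings::TYPE_32FC1);
  }
  catch (cv_bridge::Exception& e)
  {
    ROS_ERROR("cv_bridge exception: %s", e.what());
    return;
  }

  const int u = 320, v = 240;  // example pixel (column, row)
  float depth = cv_ptr->image.at<float>(v, u);
  ROS_INFO("Depth at (%d, %d): %f m", u, v, depth);
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "depth_pixel_reader");
  ros::NodeHandle nh;
  ros::Subscriber sub = nh.subscribe("/depth/image_raw", 1, depthCallback);
  ros::spin();
  return 0;
}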
↧
Question about odometry
Hello,
I want to use odometry on my virtual 3-wheel robot, and I have managed, through this code:
Debug True 1 true true 100.0 joint_l_wheel joint_r_wheel 0.20 0.16 1 30 /labrob/cmd_vel odom odom base_footprint true labrob
to make Gazebo publish on the `/labrob/odom` topic. But when I `rostopic echo` it, I see inaccurate results, for example:
header:
  seq: 25109
  stamp:
    secs: 251
    nsecs: 195000000
  frame_id: labrob/odom
child_frame_id: labrob/base_footprint
pose:
  pose:
    position:
      x: -0.00832692377833
      y: 1.20849063755e-05
      z: 0.039996042105
    orientation:
      x: -2.18465180186e-06
      y: -1.97231901816e-05
      z: 1.37248233329e-05
      w: 0.999999999709
  covariance: [1e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000000000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000000000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000000000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.001]
twist:
  twist:
    linear:
      x: 1.11357837453e-05
      y: -1.22072532961e-07
      z: 0.0
    angular:
      x: 0.0
      y: 0.0
      z: 7.37337779565e-06
  covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
---
Regarding `pose.pose.position`: my robot spawns at the origin of the Gazebo world, which I believe defaults to (0,0,0).
Also, the `pose.pose.position.y` value keeps increasing as time passes, while the other two oscillate around the values above. **My first question is: why is the `pose.pose.position.y` value increasing?**
Also, the velocity data in `twist.twist`, even though the echoed velocities are close to zero (which is the real result, since the robot is not moving), are inaccurate. **Is this normal, or am I doing something wrong?**
↧
Gazebo topic publication issue within namespaced group tag
ROS Kinetic
Gazebo 7.0
RedHat Linux
I'm running into an issue with a roslaunch file that attempts to start a Gazebo node within a group tag that has the namespace attribute set. The Gazebo node appears to start, but other nodes that wait on Gazebo services hang. Specifically, when using rospy, the following calls never complete:
rospy.wait_for_service('/gazebo/get_world_properties')
rospy.wait_for_service('/gazebo/reset_world')
rospy.wait_for_service('/gazebo/reset_simulation')
rospy.wait_for_service('/gazebo/pause_physics')
rospy.wait_for_service('/gazebo/unpause_physics')
The launch file is as follows:
I've tried the following while troubleshooting the problem.
1. Remove group tag completely. Behavior: Simulation runs fine, nodes initialize and execute after the initial spool up.
2. Add group tag without ns attribute. Behavior: Simulation again runs fine.
3. Add ns attribute. Behavior: Gazebo node appears to launch, but the previously mentioned services do not appear to be available to other nodes inside the group tag. rostopic list and gz topic -l do not show the topics being published either.
As far as I can tell, adding the ns attribute breaks communication/publication of the topics somewhere in the pipeline, but I have not been able to isolate where. Does anyone have experience with this issue, or thoughts on how to proceed? I am reasonably sure that I need the namespace attribute, as it will allow me to run multiple Gazebo simulations in parallel. Perhaps there is an alternative that I have missed?
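One detail that may be relevant (this is an assumption on my part, not something I have verified): with the group ns set, the Gazebo services should resolve under that namespace, e.g. /<ns>/gazebo/reset_world, so a wait on the absolute /gazebo/... name would never return. Below is a minimal roscpp sketch of a wait that uses namespace-relative names instead; the namespace itself is whatever the group tag applies.
// Minimal sketch: wait for the Gazebo services using namespace-relative names,
// so they resolve under the node's namespace (e.g. /sim1/gazebo/... inside
// <group ns="sim1">). "sim1" is only an example value.
#include <ros/ros.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "gazebo_service_waiter");
  ros::NodeHandle nh;  // relative names resolve under this node's namespace

  ros::service::waitForService(nh.resolveName("gazebo/reset_world"));
  ros::service::waitForService(nh.resolveName("gazebo/pause_physics"));
  ros::service::waitForService(nh.resolveName("gazebo/unpause_physics"));

  ROS_INFO("Gazebo services are available under this node's namespace.");
  return 0;
}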
Thank you in advance!
↧
Adding noise to P3DX model
Hi ROS users,
I am trying to add Gaussian noise to the velocities of the P3DX. The model I am using is from this source: https://github.com/RafBerkvens/ua_ros_p3dx
I have modified this part of the code:
true 100.0 base_link ${ns}/base_pose_ground_truth 0.01 map 0 0 0 0 0 0 10 10 10
I changed the parts where it says gaussianNoise and velocityGaussianNoise, but it behaves the same way.
Does anybody know how I can add Gaussian noise to the P3DX model?
Thank you so much.
↧
How to create a waypoint in a simulation generated with the tum_simulator package
The tum_simulator package allows simulating an AR.Drone in Gazebo.
I want to create a control package that lets me set waypoints in this simulation and see the drone moving between them, possibly also implementing collision avoidance.
However, I'm facing some difficulties in understanding the best approach to this problem.
According to the ROS wiki page (http://wiki.ros.org/tum_simulator), tum_simulator offers the position (published on the /navdata topic). However, the /ardrone/navdata topic is of type ardrone_autonomy/Navdata, which is:
std_msgs/Header header
uint32 seq
time stamp
string frame_id
float32 batteryPercent
uint32 state
int32 magX
int32 magY
int32 magZ
int32 pressure
int32 temp
float32 wind_speed
float32 wind_angle
float32 wind_comp_angle
float32 rotX
float32 rotY
float32 rotZ
int32 altd
float32 vx
float32 vy
float32 vz
float32 ax
float32 ay
float32 az
uint8 motor1
uint8 motor2
uint8 motor3
uint8 motor4
uint32 tags_count
uint32[] tags_type
uint32[] tags_xc
uint32[] tags_yc
uint32[] tags_width
uint32[] tags_height
float32[] tags_orientation
float32[] tags_distance
float32 tm
I can't find the position of the drone in it. Without it, I would have to implement my own odometry estimation.
Am I missing something?
Did I somehow install the wrong package, or is the wiki page wrong?
Do you know if a control package for this simulator has already been implemented?
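For reference, this is the kind of access I am after: a minimal sketch that subscribes to a pose/odometry topic and reads the drone position. The topic name /ground_truth/state is an assumption on my part (I have seen it mentioned for Gazebo quadrotor simulations, but it should be checked with rostopic list).
// Minimal sketch: read the drone position from a nav_msgs/Odometry topic.
// The topic name is an assumption; check `rostopic list` in the running simulation.
#include <ros/ros.h>
#include <nav_msgs/Odometry.h>

void odomCallback(const nav_msgs::Odometry::ConstPtr& msg)
{
  ROS_INFO("Drone position: x=%.2f y=%.2f z=%.2f",
           msg->pose.pose.position.x,
           msg->pose.pose.position.y,
           msg->pose.pose.position.z);
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "drone_position_listener");
  ros::NodeHandle nh;
  ros::Subscriber sub = nh.subscribe("/ground_truth/state", 10, odomCallback);
  ros::spin();
  return 0;
}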
↧
run c++ code on Gazebo simulator
So I have this C++ code: https://github.com/erlerobot/ros_erle_takeoff_land/blob/master/src/main.cpp#L21-L24
and I want to run and test it in the Gazebo simulator. Any idea how I can do this? I have ROS Indigo and Gazebo 4 installed on my laptop, all under Ubuntu 14.04. I'm a beginner, so if you can provide me with the steps, that would be great :)
Thank you
↧
TF tutorials question
Hello,
I am new to tf and to ROS in general, so some follow-up questions may arise. My main questions are marked in **bold**.
I have built a 3-wheel robot on which I have mounted a lidar (hokuyo_link), and I am trying to set up the tf for my robot. I have managed to manipulate the lidar data for some simple obstacle avoidance movements, but my aim is to use the navigation stack, and a main prerequisite for that is tf. So what I am trying to implement, if I have understood the logic correctly, is an odom -> base_link -> hokuyo_link tf tree.
To start, I want to create a tf between my lidar and my base_link. For this purpose, I am following this [tutorial](http://wiki.ros.org/navigation/Tutorials/RobotSetup/TF) and I have some questions.
I have written a broadcaster (**do I need this broadcaster, or is it provided automatically by Gazebo during simulation?**), copied from the tutorial but changed with my geometric data, and I am trying to write a listener:
#include <ros/ros.h>
#include <geometry_msgs/PointStamped.h>
#include <tf/transform_listener.h>

void transformPoint(const tf::TransformListener& listener){
  //we'll create a point in the base_laser frame that we'd like to transform to the base_link frame
  geometry_msgs::PointStamped laser_point;
  laser_point.header.frame_id = "hokuyo_link";
  //we'll just use the most recent transform available for our simple example
  laser_point.header.stamp = ros::Time();
  //just an arbitrary point in space
  laser_point.point.x = 1.0;
  laser_point.point.y = 0.2;
  laser_point.point.z = 0.0;
  try{
    geometry_msgs::PointStamped base_point;
    listener.transformPoint("base_link", laser_point, base_point);
    ROS_INFO("base_laser: (%.2f, %.2f, %.2f) -----> base_link: (%.2f, %.2f, %.2f) at time %.2f",
             laser_point.point.x, laser_point.point.y, laser_point.point.z,
             base_point.point.x, base_point.point.y, base_point.point.z, base_point.header.stamp.toSec());
  }
  catch(tf::TransformException& ex){
    ROS_ERROR("Received an exception trying to transform a point from \"hokuyo_link\" to \"base_link\": %s", ex.what());
  }
}

int main(int argc, char** argv){
  ros::init(argc, argv, "robot_tf_listener");
  ros::NodeHandle n;
  tf::TransformListener listener(ros::Duration(10));
  //we'll transform a point once every second
  ros::Timer timer = n.createTimer(ros::Duration(1.0), boost::bind(&transformPoint, boost::ref(listener)));
  ros::spin();
}
The only thing I changed is the name used for the laser link. **Is the arbitrary point used in the tutorial, in my case, supposed to be my lidar data?** And, if my previous question has a positive answer, does this mean that **the node that listens to the tf channel needs to subscribe to the /laser/scan topic to get the data?**
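To make the second question concrete, here is a minimal sketch of what I imagine (my assumption, not the tutorial's code): a node that subscribes to the scan topic and transforms one measured point from hokuyo_link into base_link using the listener. The /laser/scan topic name is the one from my setup.
// Minimal sketch (an assumption, not from the tutorial): subscribe to the scan
// topic and transform one measured point from hokuyo_link into base_link.
#include <ros/ros.h>
#include <sensor_msgs/LaserScan.h>
#include <geometry_msgs/PointStamped.h>
#include <tf/transform_listener.h>
#include <cmath>

tf::TransformListener* listener;

void scanCallback(const sensor_msgs::LaserScan::ConstPtr& scan)
{
  if (scan->ranges.empty())
    return;

  // Use the middle ray of the scan as an example point.
  size_t i = scan->ranges.size() / 2;
  double r = scan->ranges[i];
  if (!std::isfinite(r))
    return;
  double angle = scan->angle_min + i * scan->angle_increment;

  geometry_msgs::PointStamped laser_point;
  laser_point.header.frame_id = "hokuyo_link";
  laser_point.header.stamp = ros::Time();
  laser_point.point.x = r * cos(angle);
  laser_point.point.y = r * sin(angle);
  laser_point.point.z = 0.0;

  try
  {
    geometry_msgs::PointStamped base_point;
    listener->transformPoint("base_link", laser_point, base_point);
    ROS_INFO("hokuyo_link (%.2f, %.2f) -----> base_link (%.2f, %.2f)",
             laser_point.point.x, laser_point.point.y,
             base_point.point.x, base_point.point.y);
  }
  catch (tf::TransformException& ex)
  {
    ROS_ERROR("Transform failed: %s", ex.what());
  }
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "scan_tf_listener");
  ros::NodeHandle n;
  tf::TransformListener tf_listener(ros::Duration(10));
  listener = &tf_listener;
  ros::Subscriber sub = n.subscribe("/laser/scan", 1, scanCallback);
  ros::spin();
  return 0;
}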
Thanks for your answers and for your time in advance,
Chris
↧
Using Gazebo 5 to simulate a robot, how do I add more noise/drift to the transforms/odometry?
I'm trying to measure the performance of AMCL under changes in lidar specs, such as max range, update frequency, FOV, and other things. But it turns out that in the Gazebo simulation the odometry has so little noise/drift that AMCL performs very well whether I use a simulated 30-meter laser or a 1-meter laser. This happens even in large, featureless rooms. For example, over the course of 60 seconds of driving, the 1-meter laser only saw a wall for half a second, yet AMCL's performance on that run compares favorably to the run with the 30-meter laser.
Any help would be greatly appreciated. I'm relatively new to ROS (I started using it less than two months ago), so please bear with me if I have any follow-up questions. Thank you very much.
EDIT:
I don't know if it helps, but I am using the Gazebo plugin for differential drive robots. I don't know if it's possible to alter it somehow to add more drift/noise.
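To make the question concrete, here is a minimal sketch of a possible workaround (not the diff-drive plugin's own mechanism): a relay node that reads the plugin's clean odometry and republishes it with Gaussian noise added to the pose. Topic names and noise levels are placeholders, and independent per-message noise is only a crude stand-in for real drift.
// Minimal sketch (a workaround, not a diff-drive plugin option): republish the
// clean odometry with Gaussian noise added to the pose. Names are placeholders.
#include <ros/ros.h>
#include <nav_msgs/Odometry.h>
#include <random>

ros::Publisher noisy_pub;
std::default_random_engine gen;
std::normal_distribution<double> pos_noise(0.0, 0.02);  // 2 cm standard deviation

void odomCallback(const nav_msgs::Odometry::ConstPtr& msg)
{
  nav_msgs::Odometry noisy = *msg;
  noisy.pose.pose.position.x += pos_noise(gen);  // i.i.d. noise; a random-walk bias would model drift better
  noisy.pose.pose.position.y += pos_noise(gen);
  noisy_pub.publish(noisy);
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "odom_noise_relay");
  ros::NodeHandle nh;
  noisy_pub = nh.advertise<nav_msgs::Odometry>("odom_noisy", 10);
  ros::Subscriber sub = nh.subscribe("odom", 10, odomCallback);
  ros::spin();
  return 0;
}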
↧
Could not start Gazebo or RViz on a preinstalled ROS virtual machine
I downloaded the Nootrix preinstalled ROS Indigo Igloo virtual machine. Everything works fine apart from Gazebo and RViz. The preinstall has Gazebo 2 by default and I am willing to work with that version. The error is below:
OpenGL Warning: crPixelCopy3D: simply crMemcpy'ing from srcPtr to dstPtr
OpenGL Warning: crPixelCopy3D: simply crMemcpy'ing from srcPtr to dstPtr
OpenGL Warning: crPixelCopy3D: simply crMemcpy'ing from srcPtr to dstPtr
OpenGL Warning: crPixelCopy3D: simply crMemcpy'ing from srcPtr to dstPtr
OpenGL Warning: No pincher, please call crStateSetCurrentPointers() in your SPU
Segmentation fault (core dumped)
↧
Hector_Slam (Autonomous Navigation)
Hello,
I am new to navigation and to ROS in general, and I am trying to use the hector_slam metapackage for autonomous navigation. I am using ROS Kinetic.
I have a 3-wheel robot with a lidar, which I have created as a URDF model, and I am using Gazebo for simulations.
Currently, I have managed to use the hector_mapping node to create a map and visualize it through rviz. To build the map, I used teleoperation to move the robot.
Now I am lost, because I am trying to implement autonomy on my robot. Which packages of the hector_slam metapackage will help me with that, and generally, where should I start?
Thank you for your time and for your answers in advance,
Chris
↧
Robot fixed in centre, changed shape.
Hey, I've been using 25 robots of the same type in a swarm simulation in Gazebo 2.2.3 on ROS Indigo.
After about an hour of simulation, at least half the robots are stuck in the centre and I cannot move them with the hand tool. They do not respond to their subscribed topics, and they actually change shape.
The attached picture (http://imgur.com/a/YKoYN) shows the normal robots, and in the bottom right are about 10 robots fixed/stuck in the same place.
I can't find anything like this online, and there are no errors printed to the console.
I have no idea how to address this and would much appreciate any ideas.
↧
Gazebo plugin to subscribe to nav_msgs/Odometry messages
Hi guys
I have written a publisher that sends Odometry messages to my topic /robot_poses. The goal is to be able to control the movement of my robot in Gazebo. As such, I'm looking for a plugin I can use in my robot's SDF file to subscribe to Odometry messages from that topic.
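For reference, here is a minimal sketch of the publisher side I am describing (the pose values and frame names are placeholders):
// Minimal sketch of the publisher described above: send nav_msgs/Odometry
// messages on /robot_poses. Pose values and frame names are placeholders.
#include <ros/ros.h>
#include <nav_msgs/Odometry.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "robot_pose_publisher");
  ros::NodeHandle nh;
  ros::Publisher pub = nh.advertise<nav_msgs::Odometry>("/robot_poses", 10);
  ros::Rate rate(10);

  nav_msgs::Odometry odom;
  odom.header.frame_id = "odom";
  odom.child_frame_id = "base_link";

  while (ros::ok())
  {
    odom.header.stamp = ros::Time::now();
    odom.pose.pose.position.x = 1.0;    // placeholder target pose
    odom.pose.pose.orientation.w = 1.0;
    pub.publish(odom);
    rate.sleep();
  }
  return 0;
}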
Any help is appreciated, thanks.
↧
Where can I find urdf and plugin code to simulate mecanum wheels in gazebo
Where can I find urdf and plugin code to simulate mecanum wheels in gazebo?
↧
problem with gazebo_urdf_spawner
Hey,
I am trying to build an Ackermann model based on the CAT Vehicle package. Since today I get the following error when I try to spawn the model via a launch file, even though it worked before and I haven't changed anything in the files:
[INFO] [WallTime: 1471962070.626981] [0.000000] Controller Spawner: Waiting for service controller_manager/load_controller
Service call failed: transport error completing service call: unable to receive data from sender, check sender's logs for details
Segmentation fault (core dumped)
[gazebo-1] process has died [pid 16208, exit code 139, cmd /home/gleb/slam_simulation/src/gazebo_ros_pkgs/gazebo_ros/scripts/gzserver -e ode /home/gleb/slam_simulation/src/catvehicle-1.1.0pre/src/azcar_sim/worlds/skidpan.world __name:=gazebo __log:=/home/gleb/.ros/log/874b8914-6939-11e6-9099-90004e6484d5/gazebo-1.log].
log file: /home/gleb/.ros/log/874b8914-6939-11e6-9099-90004e6484d5/gazebo-1*.log
[azcar_sim/urdf_spawnerazcar_sim-2] process has finished cleanly
log file: /home/gleb/.ros/log/874b8914-6939-11e6-9099-90004e6484d5/azcar_sim-urdf_spawnerazcar_sim-2*.log
[WARN] [WallTime: 1471962100.880121] [0.000000] Controller Spawner couldn't find the expected controller_manager ROS interface.
[azcar_sim/controller_spawnerazcar_sim-3] process has finished cleanly
log file: /home/gleb/.ros/log/874b8914-6939-11e6-9099-90004e6484d5/azcar_sim-controller_spawnerazcar_sim-3*.log
I don't have any idea what could be causing the problem and don't know how to resolve it. Could anyone give me some advice?
Thank you in advance!
↧