Hello, I am using V-REP to simulate a scenario. I have a robot publishing laser scanner data and odometry to the corresponding topics. I run the slam_gmapping node through ROS and visualise the resulting map in rviz.
Even though the map is produced perfectly fine, the problem is with the base_link and base_scan frames. When I add TF in rviz in order to see the frames, I get the warnings:
* No transform from [base_link] to frame [map]
* No transform from [base_scan] to frame [map]
However, I publish the transforms from base_scan to base_link and from base_link to odom, so the above should work, given that the slam_gmapping node publishes the odom-to-map transform. Nevertheless, when I generate my TF tree it looks well connected. I have tried the use_sim_time parameter set to true, set to false, and not set at all, but I still cannot get anything. I also made the V-REP script publish to the /clock topic in case there was a problem with the timestamps, but nothing changed (or rather, the slam_gmapping node never subscribed to it; any idea how I can make that happen?).
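For reference, a minimal way to check whether the map → base_link chain ever becomes available is a small listener like this sketch (frame names taken from the warnings above):

    // Diagnostic sketch: repeatedly try to look up map -> base_link and report
    // whether the tf chain is complete. Frame names are taken from the warnings above.
    #include <ros/ros.h>
    #include <tf/transform_listener.h>

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "tf_chain_check");
      ros::NodeHandle nh;
      tf::TransformListener listener;

      ros::Rate rate(1.0);
      while (nh.ok())
      {
        tf::StampedTransform transform;
        try
        {
          // Wait briefly so buffered transforms (and sim time, if enabled) can arrive.
          listener.waitForTransform("map", "base_link", ros::Time(0), ros::Duration(1.0));
          listener.lookupTransform("map", "base_link", ros::Time(0), transform);
          ROS_INFO("map -> base_link OK: x=%.2f y=%.2f",
                   transform.getOrigin().x(), transform.getOrigin().y());
        }
        catch (tf::TransformException& ex)
        {
          ROS_WARN("map -> base_link not available yet: %s", ex.what());
        }
        rate.sleep();
      }
      return 0;
    }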
If anybody could help, I would really appreciate it.
↧
TF problem using vrep and ros
↧
why can't I use tf::Quaternion and tf::createQuaternionMsgFromYaw on Arduino
I am trying to use **tf::Quaternion and tf::createQuaternionMsgFromYaw on Arduino**.
I have **included** ros.h and tf/transform_broadcaster.h,
but when I build the code in the Arduino IDE, it shows errors:
> error: 'createQuaternionMsgFromYaw' is not a member of 'tf'
> error: 'Quaternion' is not a member of 'tf'
I want to **send a transform from the Arduino**, following the official tutorials: [Writing a tf broadcaster](http://wiki.ros.org/tf/Tutorials/Writing%20a%20tf%20broadcaster%20%28C%2B%2B%29), [using urdf with robot_state_publisher](http://wiki.ros.org/urdf/Tutorials/Using%20urdf%20with%20robot_state_publisher), and [using time and tf on the arduino](http://wiki.ros.org/rosserial_arduino/Tutorials/Time%20and%20TF).
I get data from the robot, stored in the struct ArmData on the Arduino, including **absolute positions and joint angles**, and I want to **publish a joint_state and send a transform** to ROS through rosserial_arduino.
When updating the joint_state, I find that I cannot use the resize() method the way the tutorial [using urdf with robot_state_publisher](http://wiki.ros.org/urdf/Tutorials/Using%20urdf%20with%20robot_state_publisher) does, since the **string library** is not installed in my Arduino IDE. Is it necessary to install it, or is there another way to do this with Arduino code?
Are there any suggestions on how to code what I want to do on the Arduino? Thanks!
Here is the part of my code that updates the joint_state and the transform:
    geometry_msgs::TransformStamped ts;
    const double d2r = 1000.0 / 57296.0;  // degrees to radians (~pi/180); note: 1000/57296 with integer literals evaluates to 0

    // update the transform from odom to base_link
    ts.header.frame_id = odom;
    ts.child_frame_id = base_link;
    ts.transform.translation.x = 0;
    ts.transform.translation.y = 0;
    ts.transform.translation.z = 0.0115;
    //q.setRPY(0, 0, ArmDataUnion.ArmData.Baseangle*d2r + d2r/2);
    //ts.transform.rotation.w = ;
    ts.transform.rotation = tf::createQuaternionMsgFromYaw(ArmDataUnion.ArmData.Baseangle*d2r + d2r/2);  // this is the call the IDE rejects
    ts.header.stamp = nh.now();

    // update the joint_state
    joint_state.header.stamp = nh.now();
    //joint_state.name.resize(3);      // resize() is not available on rosserial messages
    //joint_state.position.resize(3);
    joint_state.name[0] = "base_joint";
    joint_state.position[0] = ArmDataUnion.ArmData.Baseangle * d2r;
    joint_state.name[1] = "long_joint";
    joint_state.position[1] = ArmDataUnion.ArmData.LongArmangle * d2r;
    joint_state.name[2] = "short_joint";
    joint_state.position[2] = ArmDataUnion.ArmData.ShortArmangle * d2r;

    // publish the joint state on /joint_states and send the transform
    joint_pub.publish(&joint_state);
    broadcaster.sendTransform(ts);
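For anyone hitting the same errors, here is the direction I am experimenting with, as a sketch only: rosserial's tf/transform_broadcaster.h provides the broadcaster but not tf::Quaternion or tf::createQuaternionMsgFromYaw (as the errors above show), so the yaw-only quaternion is built by hand, and rosserial messages replace resize() with raw arrays plus *_length fields.

    // Sketch: build a yaw-only quaternion without the full tf library,
    // which rosserial_arduino does not ship.
    #include <math.h>
    #include <geometry_msgs/Quaternion.h>

    geometry_msgs::Quaternion quaternionFromYaw(double yaw)
    {
      // Rotation of `yaw` radians about the Z axis (what createQuaternionMsgFromYaw returns).
      geometry_msgs::Quaternion q;
      q.x = 0.0;
      q.y = 0.0;
      q.z = sin(yaw * 0.5);
      q.w = cos(yaw * 0.5);
      return q;
    }

    // Usage in the code above:
    //   ts.transform.rotation = quaternionFromYaw(ArmDataUnion.ArmData.Baseangle*d2r + d2r/2);
    //
    // For the joint_state, rosserial messages have no resize(); they expose a raw
    // pointer plus a *_length field, so the arrays are declared once and reused
    // (check the generated JointState.h for the exact element type, float vs. double):
    //   char* joint_names[3] = {"base_joint", "long_joint", "short_joint"};
    //   float joint_positions[3];
    //   joint_state.name = joint_names;
    //   joint_state.name_length = 3;
    //   joint_state.position = joint_positions;
    //   joint_state.position_length = 3;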
↧
↧
No transform between base_footprint and base_link
I have a URDF file with a fixed joint between base_footprint and base_link, but robot_state_publisher isn't publishing a tf transform between those two links. My understanding is that robot_state_publisher publishes and latches a static transform for fixed joints, but I'm not seeing it in any of my tests.
Details:
My base is an iRobot Create 2, using the create_autonomy package; source here: https://github.com/AutonomyLab/create_autonomy
The URDF file has this snippet:
I can't [yet] upload pictures, but my rqt_tf_tree shows: odom => base_footprint; and base_link => [ left_wheel_link, camera_link, right_wheel_link ]
(I added the base_link => camera_link transform using a static_transform_publisher; it's not in the create_autonomy URDF file.)
I can work around this, but I'm puzzled, and would like to find & fix the bug.
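For completeness, the workaround I have in mind is a hand-rolled latched static transform along these lines (a sketch; the identity offset below is a placeholder, not the values from my URDF):

    // Sketch of a workaround: latch a base_footprint -> base_link transform on
    // /tf_static, mimicking what robot_state_publisher should do for the fixed joint.
    #include <ros/ros.h>
    #include <tf2_ros/static_transform_broadcaster.h>
    #include <geometry_msgs/TransformStamped.h>

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "footprint_to_base_static_tf");
      ros::NodeHandle nh;
      tf2_ros::StaticTransformBroadcaster broadcaster;

      geometry_msgs::TransformStamped t;
      t.header.stamp = ros::Time::now();
      t.header.frame_id = "base_footprint";
      t.child_frame_id = "base_link";
      t.transform.rotation.w = 1.0;  // identity rotation; translation defaults to zero

      broadcaster.sendTransform(t);  // published once, latched on /tf_static
      ros::spin();
      return 0;
    }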
Thanks in advance!
↧
static frame visible in tf_tree but not in RViz
I am playing a bag file which has the following tf tree structure
odom --> base_link --> (lidar, gps, imu)
I want to publish a frame 'base_footprint' which is a child of 'base_link' with a constant transformation: 0 0 -0.3 0 0 0
I have the following in my launch file
I can see the frame being published using `rosrun tf view_frames`, and also by running `rostopic echo /tf`.
But when I run RViz, base_footprint doesn't show up in the tree. It does show up in the list of frames, but the parent frame field is blank, and position is 0 0 0.
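For comparison, the sketch below is a rate-driven broadcaster publishing the same constant base_link → base_footprint offset with freshly stamped transforms, which is roughly what static_transform_publisher does internally; if the bag is being played with use_sim_time, a wall-clock versus sim-clock timestamp mismatch could explain RViz showing a blank parent frame:

    // Sketch: periodically re-broadcast the constant base_link -> base_footprint
    // offset (0, 0, -0.3) with current timestamps. Frame names and offset are taken
    // from the description above; everything else is an assumption.
    #include <ros/ros.h>
    #include <tf/transform_broadcaster.h>

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "base_footprint_broadcaster");
      ros::NodeHandle nh;
      tf::TransformBroadcaster br;

      tf::Transform t(tf::Quaternion(0, 0, 0, 1),   // no rotation
                      tf::Vector3(0.0, 0.0, -0.3)); // the 0 0 -0.3 offset

      ros::Rate rate(20.0);
      while (nh.ok())
      {
        // ros::Time::now() follows /clock when use_sim_time is true,
        // so the stamps stay consistent with the bag.
        br.sendTransform(tf::StampedTransform(t, ros::Time::now(),
                                              "base_link", "base_footprint"));
        rate.sleep();
      }
      return 0;
    }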
Any help is appreciated. Thanks.
↧
Can a subscriber/listener using transforms (tf) be implemented in C++ code on a Windows machine?
My requirement is that I need to run a **tf listener** on a **Windows machine**,
either through a Python script or C++ code.
I was able to successfully run the simple publisher and subscriber (Python) on a Windows machine.
I also tried the following:
http://servicerobotics.eu/en/other-projects/ros-fuer-windows/
Please advise.
↧
↧
[Fixed] Gmapping and tf
Hi all
My robot in rviz has some problems when I run the gmapping launch file: the robot model and its wheels come apart. How can I fix it? Sorry for my bad English. Thanks in advance.
Here is the image

My launch file
My urdf model
↧
Support for implementing the tf functionality of ROS on a Windows platform
Hi,
I want to get confirmation on whether the **tf functionality of ROS** is supported on a **Windows platform**.
I have tried and tested the simple publisher and listener on the Windows platform.
But now we are looking for more advanced functionality, i.e. transforms (tf). Nowhere could I find a proper solution or documentation for implementing tf on the Windows platform; all the support targets the Ubuntu platform.
I have posted the same concern on the ROS Answers site and got several options, like codenotes/ros_cygwin, uml-robotics/ROS.NET, rosjava, etc., but they are not fully helpful.
Please support.
↧
issue while building a map
Hi,
I am trying to build a map using SLAM. I am using an Arduino with ROS: an Arduino Due publishes IMU and encoder data and subscribes to cmd_vel. Everything is good and all the odometry data is good, but sometimes while building the map the robot moves a few cm further to the left or right on the map than in real life; sometimes it corrects its position on the map and sometimes it does not. The drive method is differential drive. I tested the encoders: if I move 1 meter there is no error at all, and the IMU data is correct without any error. A problem that occurs from time to time is:
[WARN] [WallTime: 1485180236.120524] Serial Port read returned short (expected 312 bytes, received 147 instead).
[WARN] [WallTime: 1485180236.120981] Serial Port read failure:
[INFO] [WallTime: 1485180236.121363] Packet Failed : Failed to read msg data
[INFO] [WallTime: 1485180236.121683] msg len is 312
This does not happen all the time, just sometimes.
But this happens a lot:
scan Matching Failed, using odometry. Likelihood=-754.114
lp:-0.826714 -0.201986 -0.497758
op:-0.564566 -0.17674 -2.21354
Scan Matching Failed, using odometry. Likelihood=-4266.67
lp:-0.826714 -0.201986 -0.497758
op:-0.564566 -0.17674 -2.21354
Scan Matching Failed, using odometry. Likelihood=-4266.67
Now I have a question about the following line in my main launch file:

    <node pkg="tf" type="static_transform_publisher" name="base_to_kinect_broadcaster" args="-0.17 0.04 0.1975 0 0 0 /base_link /camera_link 100" />

Is this line important or not? If it is important, how exactly do I adjust the parameters after args=? I have tried to search for how to adjust them, but I still don't know. Could these parameters be causing the problem above?
For information: sometimes I can build the map with a relatively small, unnoticeable error, but that does not happen every time.
I need a way or a solution that helps me build the map without issues like this.
I use ROS Indigo.
Thanks, I hope someone can help.
↧
static_transform_publisher question
Hi
I am trying to build a map using SLAM. I am using an Arduino Due and ROS Indigo, and I am able to build a map with few errors.
I have this line in my launch file:
node pkg="tf" type="static_transform_publisher" name="base_to_kinect_broadcaster" args="-0.17 0.04 0.1975 0 0 0 \/base_link \/camera_link 100" />
I need to know whether this line is necessary, what the parameters after args= mean, and how to tune them.
This is my main question.
I hope someone can reply with a good answer; I just need an explanation of what exactly these parameters mean and how to tune them.
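According to the tf wiki, the arguments are x y z yaw pitch roll parent_frame child_frame period_in_ms; written out as a plain broadcaster (just a sketch to show what each number does), that line is roughly equivalent to:

    // Sketch of what the static_transform_publisher line above does:
    // a translation of (-0.17, 0.04, 0.1975) metres from base_link to camera_link,
    // zero rotation, re-published every 100 ms.
    #include <ros/ros.h>
    #include <tf/transform_broadcaster.h>

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "base_to_kinect_broadcaster");
      ros::NodeHandle nh;
      tf::TransformBroadcaster br;

      tf::Transform t;
      t.setOrigin(tf::Vector3(-0.17, 0.04, 0.1975));  // x y z in metres

      tf::Quaternion q;
      q.setRPY(0.0, 0.0, 0.0);  // roll pitch yaw in radians (the launch args list them as yaw pitch roll)
      t.setRotation(q);

      ros::Rate rate(10.0);  // 100 ms period
      while (nh.ok())
      {
        br.sendTransform(tf::StampedTransform(t, ros::Time::now(),
                                              "base_link", "camera_link"));
        rate.sleep();
      }
      return 0;
    }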
↧
↧
how to use gmapping based on laser_scan_matcher?
I currently only have a .pcap file collected by my Velodyne, and I want to test gmapping without a moving robot.
Here are the steps I took:
roslaunch velodyne_pointcloud VLP16_points.launch pcap:=/home/soowon/Documents/County_Fair.pcap
...
rosrun pointcloud_to_laserscan pointcloud_to_laserscan_node cloud_in:=/velodyne_points
At this point, I can see the laser scan in RViz with the frame "velodyne". Then I run this to get a fake /tf:
rosrun laser_scan_matcher laser_scan_matcher_node scan:=/scan
..
rosrun gmapping slam_gmapping scan:=/scan
No error occurs, but I cannot see anything new in RViz. I added the map topic, but it says "no map received". I tried different fixed frames in RViz, like velodyne, map, and world, but it does not work.
Can anybody tell me what the reason is? Thank you very much!
↧
Laser_Ortho_Projector Transformation
Hi,
I am having tf issues with the scan tool in this link -> http://wiki.ros.org/laser_ortho_projector
Under the package summary, it states that the world → base_ortho tf transformation is already provided. Does this mean I still need to manually create a static transform between the world frame and the base_ortho frame in my launch file? If I don't, rviz warns me that the base_ortho frame does not exist. So I am pretty confused about what I actually need to do, because at the moment the projected point cloud does not match the laser scans from my RPLIDAR sensor.
Please help me out. My project is due really soon :(
Thanks in advance!
↧
ar_track_alvar causes TF_NAN_INPUT:
I am running ar_track_alvar successfully, but it causes tf transform nodes to output errors such as:
TF_NAN_INPUT: Ignoring transform for child_frame_id "ar_marker_111"
I'm pretty sure the errors are irrelevant, as I am detecting all of the expected markers fine and getting good transforms from them. However, these errors make a mess of the terminal screens of other nodes. Is there a way to either suppress the errors or (better) get ar_track_alvar to only send out valid transforms?
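To illustrate what I mean by "only send out valid transforms", a guard like the following sketch (a hypothetical helper, not ar_track_alvar's actual code) would drop the NaN poses before they ever reach sendTransform:

    // Sketch of a NaN guard before broadcasting (hypothetical helper; the node that
    // broadcasts the marker transforms would need a similar check).
    #include <cmath>
    #include <ros/ros.h>
    #include <tf/transform_broadcaster.h>

    bool isFiniteTransform(const tf::Transform& t)
    {
      const tf::Vector3& o = t.getOrigin();
      tf::Quaternion q = t.getRotation();
      return std::isfinite(o.x()) && std::isfinite(o.y()) && std::isfinite(o.z()) &&
             std::isfinite(q.x()) && std::isfinite(q.y()) && std::isfinite(q.z()) &&
             std::isfinite(q.w());
    }

    void sendIfValid(tf::TransformBroadcaster& br, const tf::Transform& t,
                     const ros::Time& stamp, const std::string& parent,
                     const std::string& child)
    {
      if (isFiniteTransform(t))
        br.sendTransform(tf::StampedTransform(t, stamp, parent, child));
      else
        ROS_WARN_THROTTLE(5.0, "Skipping non-finite transform %s -> %s",
                          parent.c_str(), child.c_str());
    }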
↧
Configure frame IDs for mavros local_position/odom
Hi,
I have a system of 4 UAVs with 4 Pixhawks and 4 on-board computers. A ground station runs the ROS master. I have a problem managing the UAV poses.
Mavros publishes local_position/odom with child frame ID 'base_link'.
Since I have 4 UAVs operating in the same ROS space, this is a problem.
I looked through the documentation but did not see how to set a parameter for the child frame ID. Is there such a parameter that I have missed?
If there isn't, is running a node which listens to the topics (they are namespaced with /uav1, /uav2, etc.) and republishes to a new topic with new frame IDs (naive, I know) the best way to approach this?
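To make the naive option concrete, this is the kind of relay I have in mind (a sketch; the topic names and the uav1/ prefix are assumptions based on my namespacing):

    // Sketch of a per-UAV relay: rewrite the frame IDs of nav_msgs/Odometry and
    // republish. Topic names and the "uav1/" prefix are assumptions.
    #include <ros/ros.h>
    #include <nav_msgs/Odometry.h>

    ros::Publisher pub;
    std::string prefix;

    void odomCallback(const nav_msgs::Odometry::ConstPtr& msg)
    {
      nav_msgs::Odometry out = *msg;
      out.header.frame_id = prefix + msg->header.frame_id;  // e.g. uav1/odom
      out.child_frame_id  = prefix + msg->child_frame_id;   // e.g. uav1/base_link
      pub.publish(out);
    }

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "odom_frame_relay");
      ros::NodeHandle nh, pnh("~");
      pnh.param<std::string>("frame_prefix", prefix, "uav1/");

      pub = nh.advertise<nav_msgs::Odometry>("local_position/odom_relabelled", 10);
      ros::Subscriber sub = nh.subscribe("local_position/odom", 10, odomCallback);
      ros::spin();
      return 0;
    }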
↧
↧
"use_sim_time" parameter disabled tf
Hello everyone,
After I added the `use_sim_time` parameter in my launch file, rviz cannot show the tf tree; there was an error in the RobotModel display: `No transform from [base_laser] to [odom]`, and a warning in Global Status: `No tf data. Actual error: Fixed Frame [odom] does not exist`.
When I entered `rostopic echo /tf`, there was a warning: `no messages received and simulated time is active. Is /clock being published?`
So the root cause is that the clock message is not being published. How can I publish /clock messages?
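My understanding is that /clock normally comes from whatever provides the simulated time, for example `rosbag play --clock` or the simulator itself; just to show the shape of the message, a bare-bones publisher driven by wall time would look like this sketch (for illustration only):

    // Sketch: publish /clock from wall time, purely for illustration. In a real
    // setup the bag player (rosbag play --clock) or the simulator should own /clock.
    #include <ros/ros.h>
    #include <rosgraph_msgs/Clock.h>

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "clock_publisher");
      ros::NodeHandle nh;
      ros::Publisher clock_pub = nh.advertise<rosgraph_msgs::Clock>("/clock", 10);

      // WallRate, because ros::Rate would itself wait on /clock when use_sim_time is true.
      ros::WallRate rate(100.0);
      while (nh.ok())
      {
        rosgraph_msgs::Clock msg;
        ros::WallTime now = ros::WallTime::now();
        msg.clock = ros::Time(now.sec, now.nsec);
        clock_pub.publish(msg);
        rate.sleep();
      }
      return 0;
    }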
Thanks in advance!
↧
Having trouble making openni_depth_frame a visible child of Baxter's head in RViz
I'm using a Kinect with Baxter. I am having a problem getting Baxter to move one of its hands to a human's hand to grab something from it.
In order to track people I am using
openni.launch from the package openni_launch
openni_tracker from the package openni_tracker.
After I add a tf display to Rviz:
I can successfully view the skeleton of a user in rviz when the fixed frame is 'openni_depth_frame'.
I can see the model of Baxter if the fixed frame is one of the frames that make up Baxter's tf model.
I wrote a script making 'openni_depth_frame' a child of 'head' (since the kinect is attached to the top of Baxter's head).
I have verified that this works by seeing that 'openni_depth_frame' is a child of 'head' in rqt_tf_tree after running this script.
However, I expected to see both the 'openni_depth_frame' frame and the frames that make up the user openni_tracker is tracking at the same time as I can see Baxter's model.
Am I incorrect in thinking that? Or does tf not work that way?
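For reference, my attachment script boils down to something like the sketch below (the mounting offset is a placeholder): it keeps rebroadcasting head → openni_depth_frame with current stamps so that lookups between Baxter's tree and the tracker's frames can succeed at recent times:

    // Sketch of the attachment broadcaster (placeholder offset): rebroadcast
    // head -> openni_depth_frame continuously with current stamps.
    #include <ros/ros.h>
    #include <tf/transform_broadcaster.h>

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "kinect_on_head_broadcaster");
      ros::NodeHandle nh;
      tf::TransformBroadcaster br;

      tf::Transform t(tf::Quaternion(0, 0, 0, 1),
                      tf::Vector3(0.0, 0.0, 0.15));  // placeholder mounting offset

      ros::Rate rate(30.0);
      while (nh.ok())
      {
        br.sendTransform(tf::StampedTransform(t, ros::Time::now(),
                                              "head", "openni_depth_frame"));
        rate.sleep();
      }
      return 0;
    }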
↧
What's the difference between "laser" and "scan" in tf?
↧
tf_prefix and tf2_web_republisher (URDF visualization in browser)
Hello everybody!
I have managed to display my custom URDF file in my browser by using [this tutorial](http://wiki.ros.org/urdf/Tutorials/Using%20urdf%20with%20robot_state_publisher).
However, when I run what I've done alongside my actual robot (or simulation), the visualization does not display properly (several parts of the robot are displayed at the same point). I am guessing the problem comes from interference between the tf tree of my robot and the tf tree of my visualization. (These two trees have some links in common but are different.)
I tried adding a "tf_prefix" parameter to my robot_state_publisher, with the value "visualisation".
But when I launch everything I get the following error:
[ERROR] [1468848677.857789028]: "Link1" passed to lookupTransform argument source_frame does not exist.
Where Link1 is a link listed in my URDF file.
Am I using the tf_prefix wrong?
Did I not configure something?
Should I load my URDF file in a special way?
I also read that tf_prefix doesn't work with all the packages. Is this the case with tf2_web_republisher?
↧
↧
Frame Transformation using Laser_Geometry package
Hi,
I am just wondering if it is possible to use the `laser_geometry` package with a fixed-tilt RPLidar A2 2D sensor. If I tilt it at a fixed angle (30 deg), is there a way to project that tilted laser scan vertically down into the world frame? I am trying to achieve something like this video: https://www.youtube.com/watch?v=jHsdwQbqc8Y . I tried the `laser_ortho_projector` package but it doesn't seem to work, so I am trying out `laser_geometry` to see whether it lets me do this too. I do not have an IMU.
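What I have in mind is roughly the sketch below, assuming a static transform between the tilted laser frame and a level `world` (or base) frame already describes the 30 deg mount; topic and frame names are placeholders:

    // Sketch: transform each tilted scan into a level "world" frame using
    // laser_geometry + tf. Assumes the world -> laser transform (30 deg tilt)
    // is already being published by something else.
    #include <ros/ros.h>
    #include <tf/transform_listener.h>
    #include <laser_geometry/laser_geometry.h>
    #include <sensor_msgs/LaserScan.h>
    #include <sensor_msgs/PointCloud2.h>

    laser_geometry::LaserProjection projector;
    tf::TransformListener* listener;
    ros::Publisher cloud_pub;

    void scanCallback(const sensor_msgs::LaserScan::ConstPtr& scan)
    {
      // Wait until tf can transform the end of the scan into the world frame.
      ros::Time end_time = scan->header.stamp +
          ros::Duration(scan->ranges.size() * scan->time_increment);
      if (!listener->waitForTransform(scan->header.frame_id, "world",
                                      end_time, ros::Duration(0.5)))
        return;

      sensor_msgs::PointCloud2 cloud;
      projector.transformLaserScanToPointCloud("world", *scan, cloud, *listener);
      cloud_pub.publish(cloud);  // points are now expressed in the level frame
    }

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "tilted_scan_projector");
      ros::NodeHandle nh;
      listener = new tf::TransformListener();
      cloud_pub = nh.advertise<sensor_msgs::PointCloud2>("cloud_world", 10);
      ros::Subscriber sub = nh.subscribe("scan", 10, scanCallback);
      ros::spin();
      return 0;
    }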
Thanks in advance!
↧
Translating between a left-handed coordinate system and a right-handed coordinate system?
I have a rosbag that publishes IMU information in a left-handed coordinate system, but as I read in [REP 103](http://www.ros.org/reps/rep-0103), the coordinate systems in ROS are all right-handed.
How do I accomplish this conversion between the two? Do I change the values published by the IMU, or do I publish a transform? I strongly feel that I should publish a certain transform. What does that transform look like, and where can I read more about it? What are the best practices for doing this? Also, can one sensor be in a different coordinate frame than another, and how do you fix that?
**Note:** I am using tf2
I also found an article about this on [a webpage by Unity](http://answers.unity3d.com/storage/temp/12048-lefthandedtorighthanded.pdf).
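For what it's worth, my current understanding (corrections welcome) is that a handedness flip is a mirror rather than a rotation, so it cannot be expressed as a single tf transform; the IMU values themselves have to be remapped. A sketch, assuming the left-handed frame differs from the REP 103 frame only by a mirrored Y axis (a different mirrored axis moves the signs):

    // Sketch: remap an IMU sample from a left-handed frame to REP 103's right-handed
    // frame, ASSUMING the only difference is a mirrored Y axis. The signs follow from
    // conjugating the rotation by diag(1, -1, 1).
    #include <sensor_msgs/Imu.h>

    sensor_msgs::Imu toRightHanded(const sensor_msgs::Imu& in)
    {
      sensor_msgs::Imu out = in;

      // Orientation: the axis components along the non-mirrored axes flip sign.
      out.orientation.x = -in.orientation.x;
      out.orientation.y =  in.orientation.y;
      out.orientation.z = -in.orientation.z;
      out.orientation.w =  in.orientation.w;

      // Angular velocity is an axial vector: same sign pattern as the rotation axis.
      out.angular_velocity.x = -in.angular_velocity.x;
      out.angular_velocity.y =  in.angular_velocity.y;
      out.angular_velocity.z = -in.angular_velocity.z;

      // Linear acceleration is an ordinary vector: only the mirrored axis flips.
      out.linear_acceleration.x =  in.linear_acceleration.x;
      out.linear_acceleration.y = -in.linear_acceleration.y;
      out.linear_acceleration.z =  in.linear_acceleration.z;

      // (Covariance matrices would need the corresponding row/column sign flips;
      //  omitted in this sketch.)
      return out;
    }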
↧
Rotate rtabmap's local_costmap?
We're using rtabmap, and only our local costmap is incorrectly rotated 90 degrees :(. We've tried using tf to rotate a few things with static_transform_publisher but haven't been successful. Do you have any suggestions? See the screenshots:
Incorrect: (the local cost map)

Correct: (the global cost map)

Thank you!
↧