Hi, everyone
I am using gmapping to build a map of an office room with a Hokuyo laser scanner (UTM-30LX-EW), following the MappingFromLoggedData tutorial. The map I got seems incorrect, as shown below: the obstacles are placed in the wrong positions in the map. Is there something wrong with the tf for my robot? And how can I check for the problem? Thanks.

↧
problem with gmapping
↧
how to broadcast a tf frame?
Hello,
I have a tf frame named /base_link which is part of the turtlebot 2. I am trying to take the information that the /base_link frame contains, such as its position (xyz) and its orientation (rpy), and broadcast this info as a new tf frame. The reason I am doing this is that I could not get tf_remap to work, so I am thinking the next step is to create a broadcaster that sends out this info under a new name. I am highly confused about what info goes where. Below is the code that I have so far; I grabbed it from the tf tutorials available on the wiki page.
#include <ros/ros.h>
#include <tf/transform_broadcaster.h>
void poseCallback( ){ //don't know what to pass through
static tf::TransformBroadcaster br;
tf::Transform transform;
transform.setOrigin( tf::Vector3(msg->x, msg->y, 0.0) );
transform.setRotation( tf::Quaternion(msg->theta, 0, 0) );
br.sendTransform(tf::StampedTransform(transform, ros::Time::now(), "base_footprint", "base_link")); //don't know what to pass through
}
int main(int argc, char** argv){
ros::init(argc, argv, "my_tf_broadcaster");
ros::NodeHandle node;
ros::Subscriber sub = node.subscribe("move_base/pose", 10, &poseCallback);
ros::spin();
return 0;
};
↧
↧
ar_tag_alvar frame not found
Dear ROS users,
I'm using ar_tag_alvar, starting from the example launch file already in the package. When launching, I get
*Frame id / does not exist!*, even though I set it in the launch file.
Thank you very much!
↧
TF Lookup would require extrapolation into the future
I would appreciate some guidance to understand what may be going on here.
I am trying to use the very nice robot_localization package to integrate GPS, IMU, Optical Flow and a few other sensors to localize a drone.
When running my configuration with sensor data coming from a recorded bag file, I get the following error from the ekf_localization_node node:
[ WARN] [1406560584.464395417, 1406223582.586091240]: Could not obtain transform from utm to nav. Error was Lookup would require extrapolation into the future. Requested time 1406223583.213000000 but the latest data is at time 1406223582.576022606, when looking up transform from frame [utm] to frame [nav]
I've been trying to understand why this is happening and I've gathered the following information:
- The CPU usage is pretty low. I don't think this could be happening because of lack of processing capacity
- What the code seems to be trying to do is using lookupTransform to calculate TF using the timestamp of the GPS message received
- The transformation between [utm] and [nav] is being published at 50Hz. GPS messages are being published at 1Hz.
- I can see GPS messages being published with timestamps matching the warning messages. I can also see TF publishing transformations matching the timestamp of "the latest data is at time ..." but also many more later messages:
    stamp:
      secs: 1406223582
      nsecs: 576022606
    frame_id: nav
    child_frame_id: utm
    transform:
      translation:
        x: -5918718.00919
        y: 1789119.24381
    ---
    (identical transforms follow at nsecs 596210609, 616362896, 636493509, ...)
So if there are other, more recent transforms available, published by TF, why is lookupTransform complaining that the latest data is so old, when in fact there is much newer data being published?
For now, I am using the workaround of requesting the transformation for ros::Time(0) instead of the timestamp for the GPS message, but I am not sure that is the correct thing to do.
Any ideas?
↧
RGBDSLAM - Slow transform publishing
Hi there,
I have the RGBD SLAM v2 package (http://felixendres.github.io/rgbdslam_v2/) running on a Core i7 computer with a Nvidia GTX 780 graphics card. The computer runs Ubuntu 12.04.5 and ROS Hydro.
I want to use the package for robot localization, but the transform seems to be published at too low a rate.
In my launch file I have:
which are transforms that I need to exist in order to send the pose estimate for the robot. I also have:
where 'vision' is considered the base of the robot. Until now everything is OK. The problem comes when I run the package and start processing. Using tf view_frames, I can see the following in the tf tree:
Broadcaster: /rgbdslam
Average rate: 10000.000 Hz
Most recent transform: 1409946186.276
Buffer length: 0.000 sec
which surely is not right, given that, according to http://answers.ros.org/question/54240/using-robot_pose_ekf-and-rgbdslam-for-path-planning-with-octomaps/?answer=54312#post-id-54312, the tf transform is supposedly sent at a 10 Hz rate. I know it can be less, but not as little as the following (issuing 'rosrun tf tf_monitor'):
Node: /rgbdslam 0.202653 Hz, Average Delay: 1.97535 Max Delay: 11.3705
Which is extremely slow! Does someone have a tip on what is happening, or why it is so slow?
Thanks in advance!
↧
↧
make tf from raw rosbag data
Hello,
I am trying to build a tf tree from a rosbag file and raw topics like /imu/data and /scan, without a tf topic. At the time we recorded the bag file, we did not know the tf yet, and now we want to build the appropriate tf tree.
We are doing static transforms with static_transform_publisher nodes via a launch file:
In the imu msg topic, the header is the following:
header:
seq: 49958
stamp:
secs: 1410520339
nsecs: 565862711
frame_id: imu
The same for the laser data:
header:
seq: 1831
stamp:
secs: 1410520215
nsecs: 426117012
frame_id: laser
angle_min: -0.872664630413
angle_max: 0.872664630413
angle_increment: 0.00436332309619
time_increment: 3.70370362361e-05
scan_time: 0.0533333346248
range_min: 0.0
range_max: 81.0
Notice that the published frame_ids are "imu" and "laser".
Now we run the launch file to generate the tree, but when we try to view it in rviz, the following messages appear:
In imu frame:
Transform [sender=unknown_publisher]
Message removed because it is too old (frame=[imu], stamp=[1410520316.775747058])
In laser frame.
Transform [sender=unknown_publisher]
Message removed because it is too old (frame=[laser], stamp=[1410520194.940736519])
The problem is that the messages have a different timestamp than the tf publisher.
What is the best strategy to publish a new tf with the old timestamps, so we can use the raw data from the rosbag file?
The aim is to watch the robot pose and the map/odometry using just the rosbag sensors. I read that hector_mapping can do this, but it needs tf in order to compute correct odometry.
↧
tf::TransformListener::transformPose [exception] target_frame does not exist
Hi,
I am trying to transform a `geometry_msgs::PoseStamped` from one frame to another. I am using `tf::TransformListener::transformPose`; here is my code snippet:
tf::TransformListener listener;
geometry_msgs::PoseStamped pose_map;
try{
listener.transformPose("map",pose_world,pose_map); // pose_world is in world frame
}
catch( tf::TransformException ex)
{
ROS_ERROR("transfrom exception : %s",ex.what());
}
I am getting the exception below:
[ERROR] [1410527449.677789054, 1410185755.142016150]: transfrom exception : "map" passed to lookupTransform argument target_frame does not exist.
But I can see the tf between /world and /map from `rosrun tf view_frames` and also from `rosrun tf tf_echo world map`.
**Edit:**
After adding waitForTransform, I am getting the exception below. But /map to /world is being published at 25 Hz from the launch file. I am using a bag file for the tf data, with --clock and use_sim_time set to true.
[ERROR] [1410539110.856303453, 1410185780.612246601]: transfrompose exception : Lookup would require extrapolation into the past. Requested time 1410185779.600078575 but the earliest data is at time 1410185779.862480216, when looking up transform from frame [world] to frame [map]
Thank you.
↧
Could not transform imu message from imu_link to base_footprint
I am migrating handover code that runs under Diamondback so it runs under Groovy. I have changed the code according to the migration log and it compiles successfully. But when I run it on the real PR2 robot, all the realtime controllers suddenly die. There is an error message that keeps popping up:
Node: /robot_pose_ekf
Time: 1410839002.827758490
Severity: Error
Location: /tmp/buildd/ros-groovy-navigation-1.10.3/debian/ros-groovy-navigation/opt/ros/groovy/stacks/navigation/robot_pose_ekf/src/odom_estimation_node.cpp: OdomEstimationNode::imuCallback:214
Published Topics: /rosout, /tf, /robot_pose_ekf/odom_combined
Could not transform imu message from imu_link to base_footprint
I think it may have something related to the urdf of pr2 as [this link](https://code.ros.org/trac/ros-pkg/ticket/3814) suggests. The link gives a solution:
This problem is caused by an old urdf on prg. You should generate a new one from pr2_defs
But I don't understand what it means by "generate a new one from pr2_defs".
In addition, I notice that in Diamondback my code depends on the package "move_arm_msgs", but for the code to run in Groovy I have to change all those "move_arm_msgs" references to the package "arm_navigation_msgs". I think my error may be related to the change between these two packages. Can you give me any suggestions?
↧
[TF] Extrapolation into future - Weird Chain?
Hey everyone,
I have the issue, that while running moveIt, I get transforms errors:
[ERROR] [1410959502.463512661]: Transform error: Lookup would require extrapolation into the future. Requested time 1410959502.460805976 but the latest data is at time 1410959502.410336017, when looking up transform from frame [finger_1_link_0] to frame [camera_top/camera_top_depth_optical_frame]
Now, following [this](http://wiki.ros.org/tf/Tutorials/Debugging%20tf%20problems) guide, I debugged the transforms; I appended the results below. What I was wondering is: why is the chain so long and looped? Might this be the root of the issue?
RESULTS: for camera_top/camera_top_depth_optical_frame to finger_1_link_0
Chain is: segment_0 -> camera_top/camera_top_link -> camera_top/camera_top_rgb_frame -> camera_top/camera_top_link -> camera_front/camera_front_link -> segment_0 -> camera_top/camera_top_depth_frame -> camera_front/camera_front_link -> palm -> segment_7 -> camera_front/camera_front_depth_frame -> camera_front/camera_front_rgb_frame -> finger_1_link_1 -> finger_1_link_0 -> finger_1_link_2 -> finger_2_link_1 -> finger_2_link_0 -> finger_2_link_2 -> finger_middle_link_1 -> finger_middle_link_0 -> finger_middle_link_2 -> palm -> palm -> palm -> segment_6 -> segment_0 -> segment_1 -> segment_2 -> segment_3 -> segment_4 -> segment_5
Net delay avg = 0.055249: max = 0.117458
Frames:
Frame: camera_front/camera_front_depth_frame published by /camera_front_base_link Average Delay: -0.00940285 Max Delay: 0
Frame: camera_front/camera_front_link published by /second_kinect_broadcaster Average Delay: -0.0993219 Max Delay: 0
Frame: camera_front/camera_front_link published by /second_kinect_broadcaster Average Delay: -0.0993219 Max Delay: 0
Frame: camera_front/camera_front_rgb_frame published by /camera_front_base_link_1 Average Delay: -0.00940964 Max Delay: 0
Frame: camera_top/camera_top_depth_frame published by /camera_top_base_link Average Delay: -0.00945783 Max Delay: 0
Frame: camera_top/camera_top_link published by /camera_top_link_broadcaster Average Delay: -0.00942012 Max Delay: 0
Frame: camera_top/camera_top_link published by /camera_top_link_broadcaster Average Delay: -0.00942012 Max Delay: 0
Frame: camera_top/camera_top_rgb_frame published by /camera_top_base_link_1 Average Delay: -0.00942155 Max Delay: 0
Frame: finger_1_link_0 published by /robot_tf_state_publisher Average Delay: -0.496956 Max Delay: 0
Frame: finger_1_link_1 published by /robot_tf_state_publisher Average Delay: 0.00512914 Max Delay: 0.0639241
Frame: finger_1_link_2 published by /robot_tf_state_publisher Average Delay: -0.496976 Max Delay: 0
Frame: finger_2_link_0 published by /robot_tf_state_publisher Average Delay: -0.496954 Max Delay: 0
Frame: finger_2_link_1 published by /robot_tf_state_publisher Average Delay: 0.00513452 Max Delay: 0.0639276
Frame: finger_2_link_2 published by /robot_tf_state_publisher Average Delay: -0.496969 Max Delay: 0
Frame: finger_middle_link_0 published by /robot_tf_state_publisher Average Delay: -0.496951 Max Delay: 0
Frame: finger_middle_link_1 published by /robot_tf_state_publisher Average Delay: 0.00513811 Max Delay: 0.0639306
Frame: finger_middle_link_2 published by /robot_tf_state_publisher Average Delay: -0.496963 Max Delay: 0
Frame: palm published by /robot_tf_state_publisher Average Delay: -0.496948 Max Delay: 0
Frame: palm published by /robot_tf_state_publisher Average Delay: -0.496948 Max Delay: 0
Frame: palm published by /robot_tf_state_publisher Average Delay: -0.496948 Max Delay: 0
Frame: palm published by /robot_tf_state_publisher Average Delay: -0.496948 Max Delay: 0
Frame: segment_1 published by /robot_tf_state_publisher Average Delay: 0.00514182 Max Delay: 0.0639336
Frame: segment_2 published by /robot_tf_state_publisher Average Delay: 0.00514508 Max Delay: 0.0639368
Frame: segment_3 published by /robot_tf_state_publisher Average Delay: 0.00514861 Max Delay: 0.0639399
Frame: segment_4 published by /robot_tf_state_publisher Average Delay: 0.00515163 Max Delay: 0.063943
Frame: segment_5 published by /robot_tf_state_publisher Average Delay: 0.00515438 Max Delay: 0.0639463
Frame: segment_6 published by /robot_tf_state_publisher Average Delay: 0.00515731 Max Delay: 0.0639497
Frame: segment_7 published by /robot_tf_state_publisher Average Delay: 0.00515987 Max Delay: 0.0639524
All Broadcasters:
Node: /camera_front_base_link 99.3663 Hz, Average Delay: -0.00940285 Max Delay: 0
Node: /camera_front_base_link_1 99.4127 Hz, Average Delay: -0.00940964 Max Delay: 0
Node: /camera_front_base_link_2 99.3969 Hz, Average Delay: -0.00941993 Max Delay: 0
Node: /camera_front_base_link_3 99.4064 Hz, Average Delay: -0.00940085 Max Delay: 0
Node: /camera_top_base_link 99.4075 Hz, Average Delay: -0.00945783 Max Delay: 0
Node: /camera_top_base_link_1 99.4135 Hz, Average Delay: -0.00942155 Max Delay: 0
Node: /camera_top_base_link_2 99.4092 Hz, Average Delay: -0.00940796 Max Delay: 0
Node: /camera_top_base_link_3 99.4122 Hz, Average Delay: -0.00936636 Max Delay: 0
Node: /camera_top_link_broadcaster 99.404 Hz, Average Delay: -0.00942012 Max Delay: 0
Node: /pinch_frame_broadcaster 99.4173 Hz, Average Delay: -0.00934003 Max Delay: 0
Node: /robot_tf_state_publisher 110.242 Hz, Average Delay: -0.451271 Max Delay: 0.012499
Node: /second_kinect_broadcaster 9.94165 Hz, Average Delay: -0.0993219 Max Delay: 0
Here is my transform tree:


I'm thankful for any hints,
Rabe
↧
↧
How do I use ROS tf to capture an object's position based on colour and calculate relative distance?
Hi, I'm currently doing position detection. Using OpenCV colour (red) detection, I detect the red object; now I want to publish it in tf and show where it is and the distance between the object and the camera.
↧
TF2 - static_transform_publisher - Extrapolation into the future
Hello,
I have a couple of static_transform_publishers from the TF2 package running. How is it possible that a node complains about extrapolating into the future when it tries to look up a transformation from a random element to the static one? In my [understanding](http://wiki.ros.org/tf2/Migration), static transforms should be valid for all eternity, so why would it complain about timing issues?
A further explanation about my setup:
I have 2 Kinects and I am using those in my moveIt setup to create the Octomap.
As explained [here](http://answers.ros.org/question/192871/tf-extrapolation-into-future-wierd-chain/), moveIt complains about extrapolation into the future while looking up a transform from my robot to the cameras.
My first solution was to write a Python node which would handle all the publishing instead of separate static_transform_publishers from the old TF. There, I could date the transformations back in time to avoid those issues.
Now I read about the static transforms in TF2 and implemented all the camera transformations like this:
Still, moveIt is complaining:
[ERROR] [1411130239.665581579]: Transform error: Lookup would require extrapolation into the future. Requested time 1411130239.663295803 but the latest data is at time 1411130239.541714906, when looking up transform from frame [finger_2_link_0] to frame [camera_top/camera_top_depth_optical_frame]
[ERROR] [1411130239.997648183]: Transform cache was not updated. Self-filtering may fail
But running `rosrun tf tf_echo /finger_2_link_0 camera_top/camera_top_depth_optical_frame` has no problems and prints the transform moveIt couldn't find.
Thanks in advance,
Rabe
↧
reading a specific tf frame
Hello,
Does anyone know if there is a function that lets you read a specific tf frame such as:
.readFrame("/base_link" , time(latest info published), //frame reference);
I need to be able to read a specific frame. I know you can do:
tf::StampedTransform transform;
.lookUpTransform("target frame", "original frame" , time , transform);
The reason I don't want lookupTransform is that I am not actually trying to take the transform of anything here.
Unless I am misunderstanding something, I would appreciate some feedback.
This is what I have so far. Please give me some feedback on the frames that I am using, because to my understanding the "base_link" frame should be transformed to the "map" frame if I have a .yaml map file running on top of the navigation stack, right?
#include <ros/ros.h>
#include <tf/transform_listener.h>
#include <tf/transform_broadcaster.h>
#include <tf/transform_datatypes.h>
#include <geometry_msgs/TransformStamped.h>
#include <iostream>
using namespace std;
int main(int argc, char** argv) {
ros::init(argc, argv, "readTFframe");
ros::NodeHandle nh;
tf::Transform leader_base_link;
tf::Quaternion q;
tf::Vector3 v;
tf::TransformListener listener;
geometry_msgs::TransformStamped transformConverter;
ros::Time current_time, last_time;
current_time = ros::Time::now();
last_time = ros::Time::now();
ros::Publisher pub = nh.advertise<geometry_msgs::TransformStamped>("/leader/tf", 1);
static tf::TransformBroadcaster br;
double yaw;
ros::Rate loopRate(10);
while(ros::ok()){
tf::StampedTransform transform;
current_time = ros::Time::now();
try {
listener.waitForTransform("base_link", "base_footprint", ros::Time(0), ros::Duration(10.0));
listener.lookupTransform("base_link", "base_footprint", ros::Time(0), transform);
}
catch (tf::TransformException &ex) {
ROS_ERROR("%s",ex.what());
}
yaw = tf::getYaw(transform.getRotation());
v.setValue(transform.getOrigin().x() , transform.getOrigin().y() , transform.getOrigin().z());
leader_base_link.setOrigin(v);
q.setRPY(0.0, 0.0, yaw);
leader_base_link.setRotation(q );
br.sendTransform(tf::StampedTransform(leader_base_link, ros::Time::now(), "base_footprint", "leader_base_link"));
/*
ROS_INFO_STREAM("X pose " << transform.getOrigin().x() );
ROS_INFO_STREAM("Y pose " << transform.getOrigin().y() );
ROS_INFO_STREAM("Z pose " << transform.getOrigin().z() );
ROS_INFO_STREAM("Yaw "<< yaw );
*/
double th = yaw;
geometry_msgs::Quaternion odom_quat = tf::createQuaternionMsgFromYaw(th);
transformConverter.header.stamp = current_time;
transformConverter.header.frame_id = "base_footprint";
transformConverter.child_frame_id = "base_link";
transformConverter.transform.translation.x = transform.getOrigin().x();
transformConverter.transform.translation.y = transform.getOrigin().y();
transformConverter.transform.translation.z = transform.getOrigin().z();
transformConverter.transform.rotation = odom_quat;
ROS_INFO_STREAM("X pose " << transformConverter.transform.translation.x );
ROS_INFO_STREAM("Y pose " << transformConverter.transform.translation.y);
ROS_INFO_STREAM("Z pose " << transformConverter.transform.translation.z );
ROS_INFO_STREAM("Yaw "<< yaw );
loopRate.sleep();
}
return 0;
}
This is my current code. When I run it, I can see that the values are being passed to the transformConverter transform, which converts the desired transform from tf::StampedTransform to geometry_msgs::TransformStamped so it can be published on the topic /leader/tf. I can see the values via ROS_INFO, but they are not being published on the topic.
↧
tf tree is invalid because it contains a loop
I am trying to display my laserscan on rviz but its status flickers between Status:Ok and Status:Error because of Transform.
Transform [sender=/depthimage_to_laserscan]
For frame [camera_depth_frame]: No transform to fixed frame [base_footprint].
TF error: [The tf tree is invalid because it contains a loop.
Frame camera_rgb_optical_frame exists with parent camera_rgb_frame.
Frame camera_rgb_frame exists with parent base_link.
Frame base_footprint exists with parent odom.
Frame base_link exists with parent base_footprint.
Frame left_cliff_sensor_link exists with parent base_link.
Frame leftfront_cliff_sensor_link exists with parent base_link.
Frame right_cliff_sensor_link exists with parent base_link.
Frame rightfront_cliff_sensor_link exists with parent base_link.
Frame wall_sensor_link exists with parent base_link.
Frame camera_depth_frame exists with parent camera_rgb_frame.
Frame camera_depth_optical_frame exists with parent camera_depth_frame.
Frame camera_link exists with parent camera_rgb_frame.
Frame front_wheel_link exists with parent base_link.
Frame gyro_link exists with parent base_link.
Frame base_laser_link exists with parent base_link.
Frame laser exists with parent base_link.
Frame plate_0_link exists with parent base_link.
Frame plate_1_link exists with parent plate_0_link.
Frame plate_2_link exists with parent plate_1_link.
Frame plate_3_link exists with parent plate_2_link.
Frame rear_wheel_link exists with parent base_link.
Frame spacer_0_link exists with parent base_link.
Frame spacer_1_link exists with parent base_link.
Frame spacer_2_link exists with parent base_link.
Frame spacer_3_link exists with parent base_link.
Frame standoff_2in_0_link exists with parent base_link.
Frame standoff_2in_1_link exists with parent base_link.
Frame standoff_2in_2_link exists with parent base_link.
Frame standoff_2in_3_link exists with parent base_link.
Frame standoff_2in_4_link exists with parent standoff_2in_0_link.
Frame standoff_2in_5_link exists with parent standoff_2in_1_link.
Frame standoff_2in_6_link exists with parent standoff_2in_2_link.
Frame standoff_2in_7_link exists with parent standoff_2in_3_link.
Frame standoff_8in_0_link exists with parent standoff_2in_4_link.
Frame standoff_8in_1_link exists with parent standoff_2in_5_link.
Frame standoff_8in_2_link exists with parent standoff_2in_6_link.
Frame standoff_8in_3_link exists with parent standoff_2in_7_link.
Frame standoff_kinect_0_link exists with parent plate_2_link.
Frame standoff_kinect_1_link exists with parent plate_2_link.
Frame left_wheel_link exists with parent base_link.
Frame right_wheel_link exists with parent base_link.
Frame scan1 exists with parent cart_frame.
Frame scan2 exists with parent cart_frame. ]
How do I get rid of the error?
EDIT:
tf frames

↧
↧
tf message_filters and child_frame_id
Message filters are handy when you want to only fire callbacks if a transform is available for the data contained therein. The user sets the target frame_id, and presumably the source frame_id is derived from the message's header.
However, nav_msgs/Odometry messages contain both pose and twist data, with the former being in the message header's frame_id, and the latter being in the message's child_frame_id. Even if no transform exists for the pose data, there may be a transform for the twist data. I can't just set up another filter with the target child_frame_id, as the message filter will then look for a transform from the odometry message's frame_id (i.e., pose data frame) to the target child_frame_id, which will probably not make any sense.
Do I have any recourse here? Clearly, I can just fire all the callbacks and check for the transform availability myself, but I'm guessing I'm going to lose some efficiency that way.
↧
transform broadcaster orientation problem
Hello,
I am using a turtlebot 2 with ros-groovy. I have successfully written a publisher that takes the base_link and broadcasts that info, and I am now writing a broadcaster for the base_footprint. My base_link broadcaster works perfectly, which is why I followed the same format here. But I don't understand why this behavior is happening.
The base_footprint broadcaster seems to be messing up, though. The broadcaster keeps the correct distance, but when I start turning the robot things go bad. What goes wrong is that as the robot starts turning, the base_footprint frame starts revolving around the fixed map frame (0,0,0). What I would expect is that whenever the turtlebot turns about the z axis, the base_footprint stays with the robot model in rviz, as it is supposed to. If anyone could help, it would be appreciated.


I have 2 examples there of what is going on so you can get an idea.
while(ros::ok()){
tf::StampedTransform transform; //for the base_link
tf::StampedTransform transformFP; //for the base_footprint
current_time = ros::Time::now();
//This is for the base_link frame which works great this block of code just gets the transform and that transform is
//broadcasted as leader_base_link to the tf tree.
try {
listener.waitForTransform("base_link", "base_footprint", ros::Time(0), ros::Duration(10.0));
listener.lookupTransform("base_link", "base_footprint", ros::Time(0), transform);
}
catch (tf::TransformException &ex) {
ROS_ERROR("%s",ex.what());
}
yaw = tf::getYaw(transform.getRotation());
v.setValue(transform.getOrigin().x() , transform.getOrigin().y() , transform.getOrigin().z());
leader_base_link.setOrigin(v);
q.setRPY(0.0, 0.0, yaw);
leader_base_link.setRotation(q );
br.sendTransform(tf::StampedTransform(leader_base_link, ros::Time::now(), "base_footprint", "leader_base_link"));
double th = yaw;
geometry_msgs::Quaternion odom_quat = tf::createQuaternionMsgFromYaw(th);
transformConverter.header.stamp = current_time;
transformConverter.header.frame_id = "base_footprint";
transformConverter.child_frame_id = "base_link";
transformConverter.transform.translation.x = transform.getOrigin().x();
transformConverter.transform.translation.y = transform.getOrigin().y();
transformConverter.transform.translation.z = transform.getOrigin().z();
transformConverter.transform.rotation = odom_quat;
/*This is where the base_footprint is being created I also believe this is where the code is messing up.
As the robot turns in the yaw angle the position of the base_footprint starts translating into a + or - direction but the distance is kept . As you can see in the picture that is what happens when the robot turns in a 180 degrees. The base_footprint gets shifted in 180 degrees This is following the same exact code as the base_link that works fine as well*/
try {
listener.waitForTransform("base_footprint", "odom", ros::Time(0), ros::Duration(10.0));
listener.lookupTransform("base_footprint", "odom", ros::Time(0), transformFP);
}
catch (tf::TransformException &ex) {
ROS_ERROR("%s",ex.what());
}
// double tryYaw = atan2( (2*transformFP.getRotation().w()*transformFP.getRotation().z() ), (transformFP.getRotation().w()*transformFP.getRotation().w() - transformFP.getRotation().z()*transformFP.getRotation().z())); // another equation I tried, but it does basically the same thing
//this is where the yaw angle gets calculated.
yaw2 = tf::getYaw(transformFP.getRotation());
v.setValue(transformFP.getOrigin().x() , transformFP.getOrigin().y() , transformFP.getOrigin().z());
leader_footprint.setOrigin(v);
q.setRPY(0.0, 0.0, yaw2);
leader_footprint.setRotation(q );
//this is where the leader_footprint gets added to the transform tree
br.sendTransform(tf::StampedTransform(leader_footprint, ros::Time::now(), "odom", "leader_base_footprint"));
//The rest of the code just used to be able to send the StampedTransform over a topic Don't worry too much about this part
double th2 = yaw2;
geometry_msgs::Quaternion odom2_quat = tf::createQuaternionMsgFromYaw(th2);
transformConverterFP.header.stamp = current_time;
transformConverterFP.header.frame_id = "odom";
transformConverterFP.child_frame_id = "base_footprint";
transformConverterFP.transform.translation.x = transformFP.getOrigin().x();
transformConverterFP.transform.translation.y = transformFP.getOrigin().y();
transformConverterFP.transform.translation.z = transformFP.getOrigin().z();
transformConverterFP.transform.rotation = odom2_quat;
ROS_INFO_STREAM("X pose " << transformConverterFP.transform.translation.x );
ROS_INFO_STREAM("Y pose " << transformConverterFP.transform.translation.y);
ROS_INFO_STREAM("Z pose " << transformConverterFP.transform.translation.z );
ROS_INFO_STREAM("Yaw Interposed : "<< yaw2 );
ROS_INFO_STREAM("Yaw Z "<< transformFP.getRotation().w() );
ROS_INFO_STREAM("Yaw Z "<< transformFP.getRotation().z() );
pub.publish(transformConverter);
pubFoot.publish(transformConverterFP);
loopRate.sleep();
}
↧
The problem of Transformer::lookupTransform
Recently, I followed the tf tutorial and tried to understand frame transformations in ROS. But I'm confused about the API `Transformer::lookupTransform` and `tf_echo`.
First, the usage of tf_echo is shown below.
Usage: tf_echo source_frame target_frame
This will echo the transform from the coordinate frame of the source_frame to the coordinate frame of the target_frame.
So it is clear: I will get the transform from source_frame to target_frame. I also confirmed it in turtlesim. However, the [tutorial for tf_echo](http://wiki.ros.org/tf/Tutorials/Introduction%20to%20tf) says `tf_echo` will return "the transform of the turtle2 frame with respect to turtle1 frame". In other words, I should get the transform from target frame to source frame. That does not seem correct, since I verified it in turtlesim.
I went to see how the transform is obtained. It is clear in [tf_echo.cpp](https://github.com/ros/geometry/blob/indigo-devel/tf/src/tf_echo.cpp#L89). Everything is fine except this line:
`echoListener.tf.lookupTransform(source_frameid, target_frameid, ros::Time(), echo_transform);`
I compare this line with the API document.
void Transformer::lookupTransform(const std::string& target_frame,
                                  const std::string& source_frame,
                                  const ros::Time& time,
                                  StampedTransform& transform)
From my inspection, `Transformer::lookupTransform` will get the transform from target_frame to source_frame (API naming). However, for some reason, tf_echo swaps these frames. My questions are:
1) What is the meaning of source frame and target frame?
2) Why is the usage of tf_echo different from the [tutorial](http://wiki.ros.org/tf/Tutorials/Introduction%20to%20tf)?
3) Why does tf_echo swap the arguments?
↧
tf rotation before translation?
I cannot find any documentation on the convention used by tf.
$ rosrun tf tf_echo /map /odom
At time 1263248513.809
- Translation: [2.398, 6.783, 0.000]
- Rotation: in Quaternion [0.000, 0.000, -0.707, 0.707]
in RPY [0.000, -0.000, -1.570]
1. For the translation value and rotation quaternion value, does the translation happen in the original frame and then the rotation, or does it rotate first and then translate? What is the convention used by the tf system?
2. Which RPY convention is tf using?
Thanks.
↧
↧
Why is a transform not available after waitForTransform?
Hello ros-users,
I have a strange issue with tf transforms. I use laser_geometry::LaserProjection to project laser scans to point clouds, but I assume the problem is related to tf. This code snippet is from the scan callback function:
    if (!gTfListener->waitForTransform(scan->header.frame_id, "camera",
                                       scan->header.stamp,
                                       ros::Duration(2.0),
                                       ros::Duration(0.01), &error_msg))
    {
      ROS_WARN("Warning 1: %s", error_msg.c_str());
      return;
    }
    try
    {
      gProjector->transformLaserScanToPointCloud("camera", *scan, cloud, *gTfListener);
    }
    catch (tf::TransformException& e)
    {
      ROS_WARN("Warning 2: %s", e.what());
      return;
    }
After a few seconds of running, Warning 2 is raised on the console for like 1 out of 4 scans with this error message:
> Warning 2: Lookup would require extrapolation into the future. Requested time 1412757571.359567610 but the latest data is at time 1412757571.357117891, when looking up transform from frame [laser_link] to frame [camera]
How can this happen, after waitForTransform obviously succeeded (returned true)?
Thanks in advance,
Sebastian
↧
transform listener can not look up /map frame
Dear all, I have a weird question about the transform listener.
I have tried the two snippets below: one looks up base_link to odom, the other base_link to map. The result is that I can get the transform to odom, but not the transform to map, which fails with the error:
    Unable to lookup transform, cache is empty, when looking up transform from frame [/base_link] to frame [/map]
For base_link to odom:
    try {
      listener.waitForTransform("odom", "base_link", ros::Time(0), ros::Duration(10.0));
      listener.lookupTransform("odom", "base_link", ros::Time(0), transform);
    } catch (tf::TransformException& ex) {
      ROS_ERROR("%s", ex.what());
    }
For base_link to map:
    try {
      listener.waitForTransform("map", "base_link", ros::Time(0), ros::Duration(10.0));
      listener.lookupTransform("map", "base_link", ros::Time(0), transform);
    } catch (tf::TransformException& ex) {
      ROS_ERROR("%s", ex.what());
    }
I have checked using tf view_frames, and the /map frame does exist and is connected to base_link. I have also tried another method, from costmap_2d_ros, which can successfully look up the map frame. See below: the getRobotPose function does the same thing as above, only without the waitForTransform, but I don't think that is the issue.
    tf::Stamped<tf::Pose> robot_pose;
    tf::TransformListener tf;
    costmap_2d::Costmap2DROS lcr("costmap", tf);
    lcr.getRobotPose(robot_pose);
Can anyone advise on why I cannot use my code to look up the transform to the /map frame? I am using Groovy under Ubuntu 12.04.4 64-bit.
Thanks.
Edit: Attached code and tf tree
↧
Error with boost while building tf package in catkin ws
I'm building the imu_filter_madgwick package from the [imu_tools](https://github.com/ccny-ros-pkg/imu_tools) repository using the command
    catkin_make_isolated --install --from-pkg-with-deps imu_filter_madgwick
All goes well in the installation until it hits the second-to-last package in the package build topology, which is tf. I receive the following CMake error:
    /root/ros_catkin_test_ws/src/geometry/tf/src/tf.cpp: In member function ‘boost::signals::connection tf::Transformer::addTransformsChangedListener(boost::function
    /root/ros_catkin_test_ws/src/geometry/tf/src/tf.cpp:575:60: error: could not convert ‘tf2::BufferCore::_addTransformsChangedListener(boost::function)(boost::function((*(const self_type*)(& callback))))’ from ‘boost::signals2::connection’ to ‘boost::signals::connection’
    /root/ros_catkin_test_ws/src/geometry/tf/src/tf.cpp: In member function ‘void tf::Transformer::removeTransformsChangedListener(boost::signals::connection)’:
    /root/ros_catkin_test_ws/src/geometry/tf/src/tf.cpp:580:49: error: no matching function for call to ‘tf2_ros::Buffer::_removeTransformsChangedListener(boost::signals::connection&)’
    /root/ros_catkin_test_ws/src/geometry/tf/src/tf.cpp:580:49: note: candidate is:
    /root/ros_catkin_ws/install_isolated/include/tf2/buffer_core.h:257:8: note: void tf2::BufferCore::_removeTransformsChangedListener(boost::signals2::connection)
    /root/ros_catkin_ws/install_isolated/include/tf2/buffer_core.h:257:8: note: no known conversion for argument 1 from ‘boost::signals::connection’ to ‘boost::signals2::connection’
    make[2]: *** [CMakeFiles/pytf_py.dir/src/tf.cpp.o] Error 1
    make[1]: *** [CMakeFiles/pytf_py.dir/all] Error 2
    make: *** [all] Error 2
**Is there any way to resolve this?** I'm using a catkin workspace environment on a Debian OS. Please let me know if any additional logs are required to debug this. Thank you in advance for your help!
↧