Howdy folks,
I have a question about the following transform tree:
**[/nav_frame]->[/base_link]->[/sensor_platform] -> [/IMU_AHRS]**
The sensor platform is literally a platform that has sensors mounted on it (GPS antenna, IMU-AHRS, etc.) and is attached to the robotic platform (/base_link).
The nav_frame is a local ENU frame in which I do my navigation computations: it's placed on the WGS84 ellipsoid at the same location as the base_link, and it always points north.
I got an IMU-AHRS system (with magnetometer and world magnetic map) mounted on the sensor_platform that I will use to resolve the transformation between nav_frame and base_link.
So, I have the following situation:
[/nav_frame]->[/base_link] UNKNOWN, this is what I need to resolve
[/base_link] ->[/sensor_platform] is a known and fixed transform
[/sensor_platform] -> [/IMU_AHRS] is a known and fixed transform
**My question: Can tf (or tf2) somehow automatically (internally) resolve the hidden transform [/nav_frame]->[/base_link] when it is given the measured transform [/nav_frame] -> [/IMU_AHRS] produced by the IMU-AHRS system, or do I (as the programmer) have to compute this transform explicitly (which is easy enough using the IMU-AHRS measurements and the fixed transforms relating the IMU_AHRS frame to the base_link) and publish it explicitly as the [/nav_frame] -> [/base_link] transform?**
If tf/tf2 can do this "internally", then how?
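For reference, the explicit computation described above (which the question calls easy enough) is plain composition of rigid transforms: T_nav_base = T_nav_imu composed with the inverse of T_base_imu. A minimal pure-Python sketch with 4x4 homogeneous matrices; all numbers and helper names below are illustrative assumptions, not values from the question:

```python
# T_nav_base = T_nav_imu * inverse(T_base_imu), with 4x4 homogeneous
# matrices as plain nested lists. All values below are made up.

def mat_mul(a, b):
    """Multiply two 4x4 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def rigid_inverse(t):
    """Invert a rigid transform: rotation -> transpose, translation -> -R^T p."""
    r = [[t[j][i] for j in range(3)] for i in range(3)]
    p = [-sum(r[i][j] * t[j][3] for j in range(3)) for i in range(3)]
    return [r[0] + [p[0]], r[1] + [p[1]], r[2] + [p[2]], [0, 0, 0, 1]]

def translation(x, y, z):
    """Identity rotation with the given translation."""
    return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

# Fixed mounting of the IMU relative to base_link (example: 1 m forward).
T_base_imu = translation(1.0, 0.0, 0.0)
# Measured pose of the IMU in the nav (ENU) frame (example value).
T_nav_imu = translation(5.0, 0.0, 0.0)

# The transform one would publish as nav_frame -> base_link:
T_nav_base = mat_mul(T_nav_imu, rigid_inverse(T_base_imu))
print(T_nav_base[0][3])  # 4.0
```

In a live node the resulting transform would be broadcast as nav_frame -> base_link on each IMU update.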
Thanks in advance
Best regards,
Galto
↧
Can tf/tf2 resolve "hidden transforms" ?
↧
TransformListener has unconnected trees
I have multiple tf frames, some of which are dynamically published by the robot_state_publisher and some of which are static transforms.
When using tf.TransformListener I get an error message saying that there are two or more unconnected trees.
Running rosrun tf view_frames also shows the unconnected trees.
However, rosrun rqt_tf_tree rqt_tf_tree shows the correct tf tree, rviz can show the correct tf tree and even rosrun tf tf_echo source_frame destination_frame returns the transformation values.
What am I doing wrong?
EDIT1:
rosrun tf tf_monitor source_frame target_frame returns the correct chain, however every frame is published by "unknown publisher" and the only broadcaster node is called "unknown". Could this be the reason why view_frames or the TransformListener are failing?
EDIT2:
TF2 can do the transformation with the following piece of code:
import rospy
import tf2_ros
tf_buffer = tf2_ros.Buffer()
tf2_listener = tf2_ros.TransformListener(tf_buffer)
tf_buffer.lookup_transform('world', 'halcon_camera', rospy.Time())
↧
what are quaternions and how can I use them?
I have seen quaternions in tf and bullet but I don't know what they are or how to use them.
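For context: a quaternion is a four-number representation (x, y, z, w) of a 3D rotation; tf and bullet use it because it composes cheaply and avoids gimbal lock. The usual way to build one is from roll/pitch/yaw angles. The snippet below is a standalone sketch of that conversion, following the same fixed-axis convention as tf.transformations.quaternion_from_euler (the function here is a stand-in, not the ROS helper itself):

```python
import math

def quaternion_from_euler(roll, pitch, yaw):
    """Build a unit quaternion (x, y, z, w) from fixed-axis roll/pitch/yaw,
    following the common convention used by tf."""
    cr, sr = math.cos(roll / 2.0), math.sin(roll / 2.0)
    cp, sp = math.cos(pitch / 2.0), math.sin(pitch / 2.0)
    cy, sy = math.cos(yaw / 2.0), math.sin(yaw / 2.0)
    return (
        sr * cp * cy - cr * sp * sy,  # x
        cr * sp * cy + sr * cp * sy,  # y
        cr * cp * sy - sr * sp * cy,  # z
        cr * cp * cy + sr * sp * sy,  # w
    )

# Zero roll/pitch/yaw gives the identity rotation:
print(quaternion_from_euler(0.0, 0.0, 0.0))  # (0.0, 0.0, 0.0, 1.0)
```

You would typically stuff the four numbers into the orientation field of a pose or transform message.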
↧
tf time out error : hector slam
I'm suffering from a tf timeout error.
I want to use hector_slam on a UAV (quadcopter) with a lidar.
First, the error output on the terminal looks like this:
[ INFO] [1502340821.590928774]: lookupTransform base_stabilized to laser timed out. Could not transform laser scan into base_frame.
[ INFO] [1502340822.096555365]: lookupTransform base_stabilized to laser timed out. Could not transform laser scan into base_frame.
[ WARN] [1502340822.584208719]: No transform between frames /map and scanmatcher_frame available after 20.002080 seconds of waiting. This warning only prints once.
[ INFO] [1502340822.601611575]: lookupTransform base_stabilized to laser timed out. Could not transform laser scan into base_frame.
I set up tf as follows:
map -> (scanmatcher_frame , static) -> base_stabilized (odom)
base_stabilized -> base_link (imu_attitude node in order to describe roll,pitch motion)
base_link -> laser (static tf publisher)
I think my tf setup is correct,
and all topics have proper time stamps.
I'll attach the **launch file** that I use and the **tf tree**.
The tf tree is given
[here](https://drive.google.com/open?id=0B1HgCksxRA2ldnR3YXAzR19uYWM)
thanks in advance!
↧
how to Transform Pointcloud2 with tf
Hi,
I tried to run the Python code from this link http://answers.ros.org/question/9103/how-to-transform-pointcloud2-with-tf/ and got the error below:
Traceback (most recent call last):
File "/home/ahmet/catkin_ws/src/laserTF/src/laserTransform.py", line 26, in <module>
listener.waitForTransform("/base_link", "/odom", rospy.Time(0), rospy.Duration(1.0))
tf.Exception: . canTransform returned after 1.00039 timeout was 1.
How can I solve this problem? Please help me this time. @jayess
↧
Two 6 DOF robots on scene
Hello everybody! I'm new to ROS, and possibly my question is very simple, but I've read a huge amount of tutorials and forum posts and still have questions. I'm really stuck on my task of setting up two 6-DOF robots in a scene: I can't figure out how to publish the transform between each robot's base and the world when each robot has its own namespace and tf_prefix.
What I have: two 6-DOF robots set up on a table, a camera, and a box. I want to detect the box with the camera and publish the box's coordinates in the table/world coordinate system. Each robot listens to its coordinates topic, transforms the box's coordinates into its own frame, and then touches the box.
What I did: I can get the position of the box in the camera frame and transform it into a robot frame, but the transform behaves as if the robot base were at [0,0,0] in the world frame. So I tried to publish a static transformation in TF from the robot base to the zero point; I ran a static_transform_publisher node for that, but in the tf tree I can't see the root node.
So my questions are:
- Is there a recommended way to use namespaces with MoveIt and RViz?
- How can I build and set a "world" for the scene from several frames?
Thanks a lot in advance, and sorry if this is a common question. Any information on the topic would be much appreciated.
↧
Best practices for tf in kinetic?
I see other similar questions here, but nothing that answered my specific inquiries about tf:
1. Is it recommended to use a separate tf server (as in its own node)?
2. Should URDF be used for statically-linked robots like quadcopters with a few attached sensors?
3. When is a tf_state_publisher used (from a launch file, say), as opposed to a tf broadcaster in code?
↧
How to transform the whole /tf frame?
Given an occupancy grid map, how do I transform it by a certain [tx, ty, theta] transformation? The tf tree is posted below, along with the rviz view that contains the grid map. My ultimate goal is map merging: I will have two occupancy grid maps and need to apply a transformation to one of them so that they can be overlapped. I'm looking for a mechanism in ROS to transform the map, and particularly the whole /tf frame, so that even the robot's pose is transformed into the new coordinate frame.
http://postimg.org/image/d4xd3d39p/
http://postimg.org/image/vrdqce6x9/
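As background for the [tx, ty, theta] part: re-expressing a 2D pose in a frame offset by such a transform is a rotation followed by a translation. A small pure-Python sketch with made-up numbers:

```python
import math

def apply_2d_transform(tx, ty, theta, x, y, yaw):
    """Re-express a 2D pose (x, y, yaw) under a [tx, ty, theta] transform:
    rotate the position by theta, translate by (tx, ty), add theta to yaw."""
    c, s = math.cos(theta), math.sin(theta)
    return (tx + c * x - s * y, ty + s * x + c * y, yaw + theta)

# Example: shift a pose 2 m in x and rotate it by 90 degrees.
print(apply_2d_transform(2.0, 0.0, math.pi / 2, 1.0, 0.0, 0.0))
```

Applying this to every cell origin of one grid (or, equivalently, publishing it as a static transform between the two map frames) is the math behind overlapping the two maps.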
↧
Unable to lookup tf transform
Hi,
I'm unable to look up tf transforms in my C++ node for some reason, unless the parent and child frames are the same. I have done this many times before and checked against the tutorials and wikis, but I'm not having any luck. The tf tree displays fine in RVIZ as well as on the command line with tf_echo. I've tried multiple different rosbags that worked fine with other ROS nodes, but nothing works with this node.
What is different this time is that I am working with a ROS package that has no classes; everything is in functions, but I don't think that should matter, because that is how the tutorial is done.
Here is a code snippet:
void laserCloudHandler(const sensor_msgs::PointCloud2ConstPtr &laserCloudMsg) {
  tf::TransformListener tf_listener;
  if (tf_listener.waitForTransform("world", "velodyne_frame", laserCloudMsg->header.stamp, ros::Duration(0.2))) {
    // Get the tf transform.
    tf::StampedTransform tf_transform;
    tf_listener.lookupTransform(world_frame, velo_frame, laserCloudMsg->header.stamp, tf_transform);
  } else {
    ROS_FATAL("UNABLE TO LOOKUP INITIAL TRANSFORM!");
  }
}
Does anyone have any idea what may be wrong?
Thanks!
↧
nav_msgs/Odometry to tf2_msgs/TFMessage
I am following this tutorial [here](http://wiki.ros.org/laser_pipeline/Tutorials/IntroductionToWorkingWithLaserScannerData) for understanding how to use laserscanner data.
There is a bag file which publishes laser scan data alongside odometry data, so it doesn't publish anything on the /tf topic but gives out Odometry messages.
Is there a way to convert that odometry info to tf?
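For what it's worth, the conversion itself is a field-for-field copy: the Odometry pose becomes a transform whose parent is the message's header frame and whose child is its child_frame_id (a live node would fill a geometry_msgs/TransformStamped and send it with a tf2_ros.TransformBroadcaster). A plain-Python sketch of the mapping, using dicts as stand-ins for the messages:

```python
def odom_to_tf(odom):
    """Map nav_msgs/Odometry-like fields onto a tf-transform-like dict:
    translation <- pose.position, rotation <- pose.orientation."""
    return {
        "frame_id": odom["header_frame_id"],       # parent, e.g. "odom"
        "child_frame_id": odom["child_frame_id"],  # e.g. "base_link"
        "translation": odom["position"],
        "rotation": odom["orientation"],
    }

# Hypothetical odometry sample (frame names and numbers are made up).
odom = {
    "header_frame_id": "odom",
    "child_frame_id": "base_link",
    "position": (1.0, 2.0, 0.0),
    "orientation": (0.0, 0.0, 0.0, 1.0),
}
print(odom_to_tf(odom)["frame_id"])  # odom
```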
↧
Sample rosbag file for laser scanner data
I need a bag file for laser scanner data. The bag file should have a node publishing tf2_msgs/TFMessage and sensor_msgs/LaserScan type messages to topics.
I started out with this tutorial - [Introduction to Working With Laser Scanner Data](http://wiki.ros.org/laser_pipeline/Tutorials/IntroductionToWorkingWithLaserScannerData#Converting_a_Laser_Scan_to_a_Point_Cloud) but the bag file provided here has odometry data (nav_msgs/Odometry type message). I just don't understand how to transform odometry data to tf data.
Any help is appreciated.
↧
tf cannot find boost thread library
I am trying to build ROS Hydro from source on Ubuntu 12.04. It requires Boost 1.46.1, but the version of Boost installed on my machine is 1.48, so I downloaded Boost 1.46.1 and installed it in a local folder. I also added the skip-keys flag in rosdep to stop it from installing Boost (because it would remove the newer version). After installing these dependencies I ran ./src/catkin/bin/catkin_make_isolated --install -DCMAKE_BUILD_TYPE=Release as described here: http://wiki.ros.org/hydro/Installation/Source
This started building the plain CMake and catkin packages, but gave an error while building tf:
CMakeFiles/tf_monitor.dir/src/tf_monitor.cpp.o: In function `main':
tf_monitor.cpp:(.text.startup+0x3cf): undefined reference to `boost::thread::start_thread_noexcept()'
tf_monitor.cpp:(.text.startup+0x46f): undefined reference to `boost::thread::join_noexcept()'
I specified the Boost library path using the CMAKE_LIBRARY_PATH variable,
and CMake is able to find the local Boost installation with it.
I also tried changing the CMakeCache.txt file in the tf package by specifying the location of libboost_thread.so, but still no luck. Can someone please help?
↧
↧
Dear all,
If I use robot_localization with a [9 Degrees of Freedom - Razor IMU](https://www.sparkfun.com/products/retired/10736), do I need to provide a TF between the base link and the IMU link? Is the TF between base_link and imu_link important for robot_localization's internal processing?
Since this IMU uses three different chips for the accelerometer, gyroscope, and magnetometer, which sensor's location determines the TF value?
Based on this [Q &A](https://answers.ros.org/question/9957/what-frame_id-to-put-in-a-sensor_msgsimu-message/), we may need to mount IMU to be aligned with base_link. In that case, how should I align this IMU due to three separate chips locations in the PCB?
↧
what is tf
Hey guys, I am pretty new to computer science in general and am learning ROS now. I came across some tutorials on tf after completing the beginner tutorials. I want to use rososc, but that's another story, since it is not updated for kinetic and catkin and I'm still trying to work out how to make it work for me... Anyway:
- Explain what tf is used for as if I were a 5-year-old.
- Why would I use it in developing robots?
- What types of applications is it useful for?
↧
robot_state_publisher design: multiple publishers on /joint_states ok?
Is robot_state_publisher designed to handle multiple sources of /joint_state information? For example, let's say I have the R2D2-like robot from this [tutorial](http://wiki.ros.org/urdf/Tutorials/Using%20urdf%20with%20robot_state_publisher).
If I have one node publishing the state of the 'periscope' on `/joint_states`, and another node publishing the rest of the joints, also on `/joint_states`, will `robot_state_publisher` do the right thing? (i.e., 'merge' the `/joint_state` messages?).
I'm currently studying the source code to definitively figure this out, but I see that others have asked this in the past, so I figured I'd raise the issue again.
↧
tf tree details
After running gmapping on intel.bag, while the map is being generated, if I run rosrun tf view_frames it gives me frames.pdf. In this case the transform chain is map -> odom -> base_link, plus odom -> laser. The broadcasting node for map -> odom is /slam_gmapping; the broadcasting node for odom -> laser is the /play_1479118047019813325 node; and the broadcasting node for odom -> base_link is the /static_transform_publisher_1479118028670611586 node. Please explain the tree.
↧
how to change the transform according to a transform tree
I have a bag file containing a transform tree and want to make a map with gmapping and hector_slam. I want to know what changes I have to make to the transforms in order to get the result.
↧
Extract robot position from rosbag
I recorded tf messages (while driving the robot) into a bag file and wrote a ROS node that should extract the positions (base_link frame) from that bag file.
My idea was to use a tf::Transformer and feed it all the transforms stored in the tf messages:
rosbag::Bag bag;
bag.open("tf.bag", rosbag::bagmode::Read);
std::vector<std::string> topics;
topics.push_back("/tf");
rosbag::View view(bag, rosbag::TopicQuery(topics));
tf::Transformer transformer;
BOOST_FOREACH(rosbag::MessageInstance const m, view)
{
  // instantiate bag message
  tf::tfMessage::ConstPtr tfPtr = m.instantiate<tf::tfMessage>();
  BOOST_FOREACH(geometry_msgs::TransformStamped const tfs, tfPtr->transforms)
  {
    tf::StampedTransform stampedTf;
    tf::transformStampedMsgToTF(tfs, stampedTf);
    // setTransform returns true!
    transformer.setTransform(stampedTf);
    ...
  }
}
The method setTransform(...) always returns true, so I thought it was working...
Each time I get a transform with child_frame == /robot_1/base_link, I want to look up the map-to-base_link transform at the time of that last transform. But the transformer returns false:
if (transformer.canTransform(baseLink, mapFrame, stampedTf.stamp_) == true) {
  // lookup transform
  transformer.lookupTransform(baseLink, mapFrame, stampedTf.stamp_, temp);
}
A few additional facts:
The transformer's latest common time is always = 0:
transformer.getLatestCommonTime(baseLink, mapFrame, time, NULL)
And printing all tf messages of the bag shows that the tf tree should be consistent (I'm using Stage and used view_frames):
/map -> /robot_1/odom -> /robot_1/base_footprint -> /robot_1/base_link -> /robot_1/base_laser_link
Is something wrong with my code or idea?
Any suggestions for a better solution?
Thanks in advance!
↧
using tf data from bag files
I'm wondering if there is a good way to get transforms (rotation and translation) from tf data recorded in bag files. I'm working in Python, and I don't want to replay the bag file as part of the solution. Instead, I want to read other messages from the bag file and use tf to find the Euclidean distance between two frames on the robot at the same time instant as my message of interest. Any thoughts?
I could look at parent-child relationships in a single tf message and string the transforms together, but then I'd be recreating the functionality of tf. Is there a utility or tool for getting a transformation from a single tf message (i.e., without using ROS to wait for a transform or having to replay the bag file)? Thanks a lot for any ideas.
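Stringing the parent-child transforms of a single tf message together, as described above, is a short exercise in composing translation/quaternion pairs. A hedged pure-Python sketch (the list of dicts is a hand-made stand-in for the TransformStamped entries in one tf message):

```python
def quat_mul(a, b):
    """Hamilton product of two quaternions in (x, y, z, w) order."""
    ax, ay, az, aw = a
    bx, by, bz, bw = b
    return (aw * bx + ax * bw + ay * bz - az * by,
            aw * by - ax * bz + ay * bw + az * bx,
            aw * bz + ax * by - ay * bx + az * bw,
            aw * bw - ax * bx - ay * by - az * bz)

def rotate(q, v):
    """Rotate vector v by unit quaternion q: q * v * q^-1."""
    vq = (v[0], v[1], v[2], 0.0)
    conj = (-q[0], -q[1], -q[2], q[3])
    r = quat_mul(quat_mul(q, vq), conj)
    return (r[0], r[1], r[2])

def chain(transforms, root, leaf):
    """Compose parent->child transforms from a tf-message-like list,
    walking from `leaf` up to `root`; returns (translation, rotation)."""
    by_child = {t["child"]: t for t in transforms}
    hops = []
    frame = leaf
    while frame != root:          # collect the chain leaf -> root
        hops.append(by_child[frame])
        frame = by_child[frame]["parent"]
    trans, rot = (0.0, 0.0, 0.0), (0.0, 0.0, 0.0, 1.0)
    for t in reversed(hops):      # compose from the root-side hop down
        rv = rotate(rot, t["translation"])
        trans = (trans[0] + rv[0], trans[1] + rv[1], trans[2] + rv[2])
        rot = quat_mul(rot, t["rotation"])
    return trans, rot

# Toy tree: map -> odom (1 m in x), odom -> base_link (0.5 m in x).
msg = [
    {"parent": "map", "child": "odom",
     "translation": (1.0, 0.0, 0.0), "rotation": (0.0, 0.0, 0.0, 1.0)},
    {"parent": "odom", "child": "base_link",
     "translation": (0.5, 0.0, 0.0), "rotation": (0.0, 0.0, 0.0, 1.0)},
]
print(chain(msg, "map", "base_link")[0])  # (1.5, 0.0, 0.0)
```

This is essentially what tf's buffer does internally, minus time interpolation, so it only answers the "single tf message" case the question raises.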
↧