I am currently using multimaster_fkie to communicate between two robots.
I can set a namespace for all nodes and topics, but I cannot set a namespace for the tf trees, so the tf frames conflict with each other.
Is there any solution?
↧
How to set namespace for tf?
↧
tf works only in one way
Using ROS kinetic on Ubuntu 16.04.
I have a URDF description of an industrial robot which works just fine and has a fixed frame called `base_link` (which is linked to `world`).
I also use a realsense camera which has its own tf tree. The base link of this tree is `camera_link`.
I want to display in rviz the rgbd point-cloud from the realsense together with the robot, and of course this point-cloud should be in the correct location in respect to the robot.
Currently, the camera is static (not mounted on the robot), so I created a node whose only job is to publish a constant transform between `base_link` (parent) and `camera_link` (child). When I check rqt_tf_tree, everything looks just fine.
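For reference, a minimal sketch of such a node (an editor's example, not the poster's actual code; the offsets are placeholders for whatever your setup measures):

    #include <ros/ros.h>
    #include <tf2_ros/static_transform_broadcaster.h>
    #include <geometry_msgs/TransformStamped.h>

    int main(int argc, char** argv) {
      ros::init(argc, argv, "camera_tf_broadcaster");
      tf2_ros::StaticTransformBroadcaster br;
      geometry_msgs::TransformStamped t;
      t.header.stamp = ros::Time::now();
      t.header.frame_id = "base_link";    // parent
      t.child_frame_id = "camera_link";   // child
      t.transform.translation.x = 0.5;    // placeholder offsets
      t.transform.translation.y = 0.0;
      t.transform.translation.z = 0.3;
      t.transform.rotation.w = 1.0;       // identity rotation
      br.sendTransform(t);                // static transform, sent once and latched
      ros::spin();
      return 0;
    }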
However, when displaying both the robot model and the PointCloud2 together in rviz (with `world` set as the fixed frame), I get an error on the PointCloud2 display saying there is no transform between `base_link` and `camera_link`. The really weird thing, though, is that when I set `camera_link` as the fixed frame, there is no error and rviz displays everything correctly. The only problem is that I don't want rviz to use `camera_link` as the fixed frame, because then everything is "shifted" in the view.
Does anyone have an idea why this happens and how I can fix it?
Thanks a lot!
↧
truncating ros time
Hi,
Is there any way I could truncate the ROS time value to a certain length? Let's say I need the time value only up to a certain number of decimal places.
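One way to do this (an editor's sketch; `truncateTime` is a made-up helper, not a ROS API) is to floor the fractional digits of `toSec()`:

    #include <cmath>
    #include <ros/ros.h>

    // keep only `digits` decimal places of a ros::Time, dropping the rest
    double truncateTime(const ros::Time& t, int digits) {
      const double factor = std::pow(10.0, digits);
      return std::floor(t.toSec() * factor) / factor;
    }
    // e.g. truncateTime(ros::Time::now(), 3) keeps millisecond resolution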
↧
How to map with hector_mapping and just a Lidar
Hi, I just got a lidar (ydlidar x4) and would like to use it to map (just holding the lidar and Raspberry Pi myself and walking around the room, without odometry). I'm very new to ROS. I have read quite a lot of posts, but still couldn't fix the errors.
The lidar is connected to a Raspberry Pi 3, and the scanning data is successfully shown in rviz. But when I tried hector_slam, I got an error:
[ INFO] [1539142958.281154496]: lookupTransform base_link to laser_frame timed out. Could not transform laser scan into base_frame.
[ WARN] [1539142958.855603123]: No transform between frames /map and /base_link available after 20.009138 seconds of waiting. This warning only prints once.
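An editor's note for context: the first error means nothing is publishing a base_link -> laser_frame transform. When the lidar is rigidly held, a static publisher is the common fix (the zero offsets here are assumptions):

    rosrun tf static_transform_publisher 0 0 0 0 0 0 base_link laser_frame 100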
To start with hector_slam, I roslaunch the launch file (from another tutorial) below:
While for the mapping default.launch:
For your information, here's my tf tree:
(https://drive.google.com/file/d/1ZJHK-i4tBK2gD2UlJOu_MoGUOTtzqxl6/view?usp=sharing)
Please, does anyone know how to solve this?
↧
Newton-Euler inverse dynamics.
I need to implement the Newton-Euler inverse dynamics algorithm, which requires forward transformations between the COMs of my robot's links. Unfortunately, I cannot find a way to do it with tf. I would do it myself, but I also do not see any way to build the kinematic chain other than parsing the URDF on the parameter server manually, which looks like the same work tf already does. I also need the masses and inertial properties of the robot's links, and again I cannot find a way to get them.
So the questions are:
1. How can I retrieve inertial properties of my links without writing a parser for URDF?
2. Is it possible to do forward transformations between COMs with TF?
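On question 1, the `urdf` package's C++ API can already parse the model from the parameter server, so no hand-written parser is needed; a hedged sketch (the link name is a placeholder):

    #include <ros/ros.h>
    #include <urdf/model.h>

    int main(int argc, char** argv) {
      ros::init(argc, argv, "inertia_reader");
      urdf::Model model;
      if (!model.initParam("robot_description"))
        return 1;
      // "link_1" is a placeholder; iterate model.links_ for all links
      auto link = model.getLink("link_1");
      if (link && link->inertial) {
        ROS_INFO("mass: %f  ixx: %f", link->inertial->mass, link->inertial->ixx);
        // link->inertial->origin is the COM pose relative to the link frame
      }
      return 0;
    }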
P.S. I want to do it by hand the first time, but if you know good libraries that can do everything for me, I would like to hear about them. Certainly I can google them, but having some kind of rating is useful.
Thanks!
↧
transformPose: Lookup would require extrapolation at time
Hello,
I have a simple problem: I cannot transform a pose into another frame.
This
listener = tf.TransformListener()
target = '/link_6'
listener.waitForTransform(pose_curr.header.frame_id, target, rospy.Time(), rospy.Duration(4.0))
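# note (editor): waitForTransform above waited for rospy.Time(), i.e. the
# latest available transform, but transformPose below looks up the pose's
# own stamp (pose_curr.header.stamp), which may not be in the buffer yet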
transd_pose = listener.transformPose(target, pose_curr)
throws
tf2.ExtrapolationException: Lookup would require extrapolation at time 1539962723.299428940, but only time 1539962723.398022890 is in the buffer, when looking up transform from frame [base_link] to frame [link_6]
However, this works:
listener = tf.TransformListener()
target = '/link_6'
listener.waitForTransform(pose_curr.header.frame_id, target, rospy.Time(), rospy.Duration(4.0))
(trans,rot) = listener.lookupTransform(pose_curr.header.frame_id, target, rospy.Time(0))
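# note (editor): rospy.Time(0) asks for the latest available transform,
# which is why this version succeeds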
What might be the issue here?
↧
Denavit-Hartenberg (DH) to URDF/SDF or /tf /tf2?
Hi!
Can anybody advise whether there is any package to build a manipulator from Denavit-Hartenberg parameters and then convert it to URDF/SDF, or just use the outputs on /tf or /tf2?
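An editor's note: I can't vouch for a ready-made converter, but building the frames yourself is straightforward; a hedged sketch of the standard (non-modified) DH convention as a tf transform, which you could then broadcast or dump into a URDF:

    #include <tf/transform_datatypes.h>

    // standard DH: rotate theta about z, translate d along z,
    // translate a along x, rotate alpha about x
    tf::Transform dhToTf(double theta, double d, double a, double alpha) {
      tf::Transform rot_z(tf::createQuaternionFromYaw(theta));
      tf::Transform trans_z(tf::Quaternion::getIdentity(), tf::Vector3(0, 0, d));
      tf::Transform trans_x(tf::Quaternion::getIdentity(), tf::Vector3(a, 0, 0));
      tf::Transform rot_x(tf::Quaternion(tf::Vector3(1, 0, 0), alpha));
      return rot_z * trans_z * trans_x * rot_x;
    }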
↧
Where can I get TransformStamped values?
I am quite new to ROS. I am creating a new frame "map", as I am using map_server to publish a map, and also creating a tf2 transform from map to odom, which is required for localization. So the first step is to create a TransformStamped object with the details filled in. The header.frame_id and child_frame_id are obvious, map and odom, but where can I obtain the transform values? When I include the amcl package and run `rosrun tf tf_echo map odom` I get these results, and I wonder where they come from:
At time 13.558
- Translation: [0.013, -0.002, 0.000]
- Rotation: in Quaternion [-0.000, 0.000, 0.000, 1.000]
in RPY (radian) [-0.000, 0.000, 0.000]
in RPY (degree) [-0.000, 0.001, 0.010]
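For context (an editor's note): those numbers are whatever the localizer, here amcl, estimates as the map -> odom correction; they are not something you look up. If you publish the frame yourself before any correction exists, identity is the usual starting value, which matches the near-zero output above. A minimal sketch:

    #include <ros/ros.h>
    #include <tf2_ros/transform_broadcaster.h>
    #include <geometry_msgs/TransformStamped.h>

    int main(int argc, char** argv) {
      ros::init(argc, argv, "map_to_odom_broadcaster");
      tf2_ros::TransformBroadcaster br;
      ros::Rate rate(10);
      while (ros::ok()) {
        geometry_msgs::TransformStamped t;
        t.header.stamp = ros::Time::now();
        t.header.frame_id = "map";
        t.child_frame_id = "odom";
        t.transform.rotation.w = 1.0;  // identity until a correction is available
        br.sendTransform(t);
        rate.sleep();
      }
      return 0;
    }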
↧
tf view_frame missing, how to reinstall
Hello, I was trying to install tf2 in my catkin_ws, so to avoid any conflict I deleted tf. However, tf2 seemed to be having problems, so I tried reinstalling tf by git-cloning the repository into my catkin_ws/src and running catkin_make. However, catkin_make kept failing. So I moved on and installed tf by typing:
sudo apt-get install ros-kinetic-tf
It worked, but I am missing view_frames. These are the executables shown when I type rosrun tf [tab][tab]:
static_transform_publisher, tf_echo,
testBroadcaster, tf_empty_listener,
testListener, tf_monitor,
tf_change_notifier, transform_listener_unittest
When I type view_frames, it gives me an error saying:
Traceback (most recent call last):
File "/home/user-linux/catkin_ws/devel/bin/view_frames", line 7, in
with open(python_script, 'r') as fh:
IOError: [Errno 2] No such file or directory: '/home/user-linux/catkin_ws/src/geometry/tf/scripts/groovy_compatibility/view_frames'
I am using ROS Kinetic. I am using Ubuntu 16.04. Can anyone help?
↧
Why is inverse transform tf so wrong?
While trying to get the base_footprint of the robot in UTM coordinates, we discovered huge jumps in the UTM position, even though in simulation the real change was only in the millimeter range.
We finally found the issue to be that we asked for the inverse transform instead of the transform as it was published. Why is the inverse transform so wrong and imprecise?
Below are the two transforms
$ rosrun tf tf_echo utm base_footprint
At time 6573.400
- Translation: [-691212.171, -5333943.315, -0.000]
- Rotation: in Quaternion [-0.000, -0.000, 0.036, 0.999]
in RPY (radian) [-0.000, -0.000, 0.071]
in RPY (degree) [-0.000, -0.000, 4.076]
At time 6574.400
- Translation: [-691212.169, -5333943.315, 0.000]
- Rotation: in Quaternion [-0.000, -0.000, 0.036, 0.999]
in RPY (radian) [-0.000, -0.000, 0.071]
in RPY (degree) [-0.000, -0.000, 4.076]
At time 6575.400
- Translation: [-691212.168, -5333943.315, 0.000]
- Rotation: in Quaternion [-0.000, -0.000, 0.036, 0.999]
in RPY (radian) [-0.000, -0.000, 0.071]
in RPY (degree) [-0.000, -0.000, 4.076]
$ rosrun tf tf_echo base_footprint utm
At time 6578.400
- Translation: [1068621.327, 5271316.154, 0.000]
- Rotation: in Quaternion [0.000, -0.000, -0.036, 0.999]
in RPY (radian) [0.000, 0.000, -0.071]
in RPY (degree) [0.000, 0.000, -4.076]
At time 6579.400
- Translation: [1068617.849, 5271316.859, -0.000]
- Rotation: in Quaternion [-0.000, 0.000, -0.036, 0.999]
in RPY (radian) [-0.000, 0.000, -0.071]
in RPY (degree) [-0.000, 0.000, -4.076]
At time 6580.400
- Translation: [1068614.338, 5271317.570, -0.000]
- Rotation: in Quaternion [-0.000, 0.000, -0.036, 0.999]
in RPY (radian) [-0.000, 0.000, -0.071]
in RPY (degree) [-0.000, 0.000, -4.076]
The same issue persists when using tf2_tools echo.py.
Any ideas how to fix this?
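An editor's hedged explanation of why the numbers are plausible rather than wrong: the inverse translation is expressed in base_footprint coordinates, t' = -R^T t, so a yaw change d_theta that is invisible at tf_echo's three printed decimals gets amplified by the huge lever arm |t|:

    |delta t'| ≈ |t| · d_theta,  e.g.  3.5 m ≈ 5.4e6 m × 6.5e-7 rad

so the meter-level jumps in base_footprint -> utm above correspond to sub-microradian heading jitter in utm -> base_footprint.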
↧
callback / trigger on tf
Hi
I would like to trigger something whenever a transformation changes.
For example map -> odom -> base_link.
One solution would be to subscribe to the tf messages and do all the work myself, but is there a better way? In the case of map -> odom -> base_link I could listen to the odom msg (motor controller) and the pose msg (localization), but that is not what I want: I also want an update even if the transformation has not changed, meaning I want a trigger whenever one of the tf's in the tf chain is re-published. Any suggestions?
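One lightweight option (an editor's sketch; frame names taken from the example above) is to subscribe to /tf directly and fire whenever any link of the chain is re-published, changed or not:

    #include <ros/ros.h>
    #include <tf2_msgs/TFMessage.h>

    void tfCallback(const tf2_msgs::TFMessage::ConstPtr& msg) {
      for (const auto& t : msg->transforms) {
        if ((t.header.frame_id == "map"  && t.child_frame_id == "odom") ||
            (t.header.frame_id == "odom" && t.child_frame_id == "base_link")) {
          // part of the chain was (re)published; trigger the work here
        }
      }
    }

    int main(int argc, char** argv) {
      ros::init(argc, argv, "tf_trigger");
      ros::NodeHandle nh;
      ros::Subscriber sub = nh.subscribe("/tf", 100, tfCallback);
      ros::spin();
      return 0;
    }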
↧
tf::TransformListener gets wrong tf
Hello,
TL;DR tf::TransformListener gets a different transform than shown in RVIZ
I am having quite a strange problem. I am using a rosbag file to play back recorded data, and while doing that I am adding and using new transforms:
rosparam set use_sim_time true
rosbag play 2016-09-12-14-51-41.bag --clock
The transform is published by a Python node, where the origin is static (defined with cfg values) and the rotation comes from the values of an IMU. The interesting part, how the tf is published, is this (in the callback of the IMU subscription):
br = tf.TransformBroadcaster()
br.sendTransform((frame_offset_x, frame_offset_y, frame_offset_z),
quaternion,
rospy.Time.now(),
frame_child,
frame_parent
)
With the following values:
frame_offset_x = 0
frame_offset_y = 0
frame_offset_z = 1.2
quaternion = rotation of the IMU (only around the x and y axis)
frame_child = level_laser_2
frame_parent = base_footprint
The created transform looks as expected in RVIZ: it's 1.2 m above base_footprint with some rotation on top.
In the picture (not reproduced here), the Fixed Frame is set to "base_footprint".
However, if I look up the transform in a ROS node using tf::TransformListener, I get a different result:
#include <ros/ros.h>
#include <tf/transform_listener.h>
#include <sensor_msgs/PointCloud2.h>
...
tf_listener_ = new tf::TransformListener( ros::Duration(10) ); // in the constructor
...
void callback( const sensor_msgs::PointCloud2Ptr& cloud_in ) {
...
if ( tf_listener_->waitForTransform(tf_target_,
cloud_in->header.frame_id,
cloud_in->header.stamp,
ros::Duration( scan_time_ )) ) {
tf::StampedTransform transform;
tf_listener_->lookupTransform(tf_target_,
cloud_in->header.frame_id,
cloud_in->header.stamp,
transform);
ROS_INFO("%s %s -> %s\tat%lf", node_name_.c_str(),
cloud_in->header.frame_id.c_str(),
tf_target_.c_str(),
cloud_in->header.stamp.toSec());
tfScalar tf[16];
transform.getOpenGLMatrix(tf);
ROS_INFO("%s\n"
"/ %lf\t%lf\t%lf\t%lf \\\n"
"| %lf\t%lf\t%lf\t%lf |\n"
"| %lf\t%lf\t%lf\t%lf |\n"
"\\ %lf\t%lf\t%lf\t%lf /", node_name_.c_str(),
tf[0], tf[4], tf[8], tf[12],
tf[1], tf[5], tf[9], tf[13],
tf[2], tf[6], tf[10], tf[14],
tf[3], tf[7], tf[11], tf[15]);
} else {
// display error here
}
...
}
The output of this is:
[ INFO] [1473756806.521699009, 1473684710.202103154]: /level_laser_velodyne: base_footprint -> level_laser_2 at1473684709.395050
[ INFO] [1473756806.521764836, 1473684710.202103154]: /level_laser_velodyne:
/ 0.997645 0.001908 -0.068567 0.002741 \
| 0.001908 0.998454 0.055557 -0.002221 |
| 0.068567 -0.055557 0.996098 -0.039822 |
\ 0.000000 0.000000 0.000000 1.000000 /
But it should be something like:
[ INFO] [1473756806.521764836, 1473684710.202103154]: /level_laser_velodyne:
/ R R R 0 \
| R R R 0 |
| R R R 1.2 |
\ 0 0 0 1 /
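One API detail worth checking here (an editor's note, not a confirmed diagnosis): lookupTransform(target, source, ...) returns the transform that maps data from source into target, i.e. the pose of source expressed in target. The expected matrix above (translation z = 1.2) is the pose of level_laser_2 in base_footprint, which would come from:

    tf_listener_->lookupTransform("base_footprint", "level_laser_2",
                                  cloud_in->header.stamp, transform);

while swapping the two frame arguments yields the inverse.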
Does anybody know what I am doing wrong?
Thank you for your help,
Tobias
↧
Get one(pitch) euler angle between two orientations (quaternions)
Hi,
I want to get the angle between two quaternions (as a Euler angle), but the angle always needs to be measured from q1 to q2 (not the shortest angle).
Since there are two ways to get an angle, I need to find the one I need. The following code works, but I'm looking for a nicer solution.
// calculate difference
ori_diff = q1.inverse() * q2;
//just use the pitch part
ori_diff.setValue(0., ori_diff.y(), 0., ori_diff.w());
ori_diff.normalize();
//get euler angle
tf::Matrix3x3 m;
m.setRotation(ori_diff);
tfScalar r1, r2, p1, p2, y1, y2;
m.getRPY(r1, p1, y1, 1);
m.getRPY(r2, p2, y2, 2);
if ( r1 == 0 && y1 == 0 ) {
euler_angle = p1;
} else if ( r2 == 0 && y2 == 0) {
euler_angle = p2;
} else {
ROS_ERROR( ... );
}
↧
Does robot_localization require tf2? what if all odom is measured from centre?
Hi everyone. Bear with me if I get some things wrong here; I am confused about the implementation.
I have my EKFs for continuous data and discrete data based on the demonstrations by Clearpath. What I am confused about is the diagram they display in their 2015 seminar on the robot_localization package.
https://roscon.ros.org/2015/presentations/robot_localization.pdf
In this it shows the continuous EKF publishing an Odometry message to move_base, and then the discrete EKF, the continuous EKF, and the 'sensor transforms' converging into a tfMessage. This is where I am confused: how are all three of these messages converged into a single tfMessage (the message that interfaces with gmapping, I assume)? Also, my IMU is mounted in the centre of my robot (both x and y) and the wheel encoders read from a mid point on my robot (one motor per side, mounted at the centre of the unit in one dimension and an equal distance from each other, so the total linear and angular velocity are read from the mid point). What sort of tf implementation is required to transform these sensors if they are already located at the middle of the robot?
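An editor's note on the last point: even when a sensor sits at base_link's origin, tf lookups for its frame_id still need a (zero) static transform to exist; a hedged example with an assumed imu_link frame name:

    rosrun tf static_transform_publisher 0 0 0 0 0 0 base_link imu_link 100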
Thanks in advance for any help.
↧
robot_localization transform functions for move_base?
Hi,
I am curious about the coordinate frame my odometry data is in after using the robot_localization EKF as a localisation estimator. I am using wheel odometry and an IMU in a continuous EKF, and navsat data (from a GPS) plus the previous two sensors in a discrete EKF.
ekf_cont:
  frequency: 30
  sensor_timeout: 0.1
  two_d_mode: true
  transform_time_offset: 0.0
  transform_timeout: 0.0
  print_diagnostics: true
  debug: false
  map_frame: map
  odom_frame: odom
  base_link_frame: base_link
  world_frame: odom
ekf_disc:
  frequency: 30
  sensor_timeout: 0.1
  two_d_mode: true
  transform_time_offset: 0.0
  transform_timeout: 0.0
  print_diagnostics: true
  debug: false
  map_frame: map
  odom_frame: odom
  base_link_frame: base_link
  world_frame: map
My solution needs to use this data in conjunction with move_base and gmapping. As I understand it, I need to provide a transform for the discrete data. My IMU is mounted at the exact centre, and my drive shaft is mounted at the exact centre in one dimension (at an equal distance from the centre in the other dimension), so the speed is measured from the centre in both dimensions.
As I understand it, my continuous data has to be presented to move_base in the odometry frame and my discrete data has to be presented as a tf/tfMessage ([Clearpath Husky demo](https://roscon.ros.org/2015/presentations/robot_localization.pdf)). If I understand correctly, my continuous data is in the odometry frame and my discrete data is in the map frame, but how do I produce this as a tf/tfMessage for my move_base node? Will
↧
tf broadcasting and listening in the same node
I am currently trying to build a perception component using tf.
My robot's position is broadcast to tf, and the robot tries to read obstacles using tf.
I want to put both in the same node, but then the message ""test1" passed to lookupTransform argument target_frame does not exist." is generated.
Why does it work when the broadcaster and listener are in two separate nodes, but not when one node does both?
I thought this problem was about time, so I tried various timestamps: ros::Time(), ros::Time(0), ros::Time::now(), ros::Time::now() - ros::Duration(0.1), etc., but none of them worked.
I also tested with two threads in one node, which was a success. What is the problem?
#include "tf/transform_listener.h"
#include "tf/transform_broadcaster.h"
#include "ros/ros.h"
int main(int argc, char **argv) {
ros::init(argc, argv, "tf_b");
tf::TransformBroadcaster br;
tf::Transform transform;
tf::Quaternion q;
ros::NodeHandle nh;
transform.setOrigin(tf::Vector3(1.0, 1.0, 1.0));
q.setRPY(0, 0, 0);
ros::Rate loop_rate(10);
while (ros::ok()) {
ros::Time tGetTime = ros::Time::now();
transform.setRotation(q);
br.sendTransform(
tf::StampedTransform(transform, tGetTime, "world", "test1"));
tf::TransformListener listener;
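// note (editor): this listener is constructed fresh on every loop
// iteration, so its buffer starts out empty each time; tf listeners
// normally need to outlive many broadcasts to accumulate transforms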
try {
listener.waitForTransform("/test1", "/world", tGetTime,
ros::Duration(0.1));
tf::StampedTransform transform;
listener.lookupTransform("/test1", "/world", tGetTime, transform);
ROS_INFO_STREAM(transform.getOrigin().getX());
} catch (tf::TransformException &ex) {
ROS_ERROR("%s", ex.what());
}
ros::spinOnce();
loop_rate.sleep();
}
return 0;
}
Note: I have simplified the code above to make it easier to understand.
↧
TF frames order, which should be fixed and which should be the parent and child?
A mobile robot is moving around a static, known map. The map was built with hector SLAM.
Is this the right order for the frame tree?
Do I need base_footprint?
Fixed frame: odom
Tree:
odom --> base_link --> laser --> map
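For comparison (an editor's reference, echoed by a later post in this digest), the usual REP 105 ordering puts map at the root:

    map --> odom --> base_link --> laser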
↧
tf::Transform getOrigin returning different value from rviz
Hi there!
I'm trying to see the pose and rotation of a UR5 using tf, comparing with rviz and the robot's GUI itself. In rviz I get the same result as the GUI, but when I listen to the transform and print the values, they are different from what I expect.
#include <ros/ros.h>
//tf
#include <tf/transform_listener.h>
#include <tf/transform_datatypes.h>
int main(int argc, char** argcv){
ros::init(argc, argcv, "world_to_robot_pose");
tf::StampedTransform transform;
static tf::TransformListener listener;
while(ros::ok()){
try{
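// note (editor): with ("/tool0_controller", "/world") the target frame is
// the tool, so the call below returns the pose of world expressed in the
// tool frame, the inverse of what rviz and the GUI display for the tool;
// compare with lookupTransform("/world", "/tool0_controller", ...)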
listener.lookupTransform("/tool0_controller","/world",ros::Time(0), transform);
ROS_INFO_STREAM(" Transform: " << transform.getOrigin().x() << ", " << transform.getOrigin().y() << ", " <
↧
Strange behavior between move_base Navigation Stack and Rviz
This is my graduation project from college and I can't figure out the problem. I have an environment built specifically for this project, and I built a map of it with hector SLAM. The goal is for the mobile robot to move around the static map to a specific goal set in Rviz.
`UPDATE: THE OLD 2 BEHAVIORS ARE NOT HAPPENING ANYMORE. I ADJUSTED THE LINOROBOT COSTMAPS TO WORK FOR ME AND THAT FIXED THOSE 2 ERRORS. HOWEVER, THIS SOLUTION INTRODUCED:`
> DWA planner failed to produce path
> Off Map 1.098, -0.050
**Old behaviors:**
> Rviz behavior video: [Google drive Rviz localization video](https://drive.google.com/file/d/1AKhtZoMIMrkTengJJV6RVGA-c4bFaosM/view?usp=sharing)
> Rviz behavior when trying to estimate pose: [Google drive estimate pose behavior](https://drive.google.com/file/d/1Qr6w6JOIF65OAzy3w22B0fuvbzO3LuPg/view?usp=sharing)
> **NOTE: the robot was not in the designed environment when taking this video; this video is just for demonstration**
**The errors I'm getting:** `UPDATED`
> roswtf:
WARNING The following node subscriptions are unconnected:
* /amcl:
* /tf_static
* /rviz_1542954558890249147:
* /move_base/TrajectoryPlannerROS/global_plan
* /move_base/NavfnROS/plan
* /tf_static
* /map_updates
* /move_base/TrajectoryPlannerROS/local_plan
* /rqt_gui_py_node_27505:
* /statistics
* /move_base:
* /tf_static
* /move_base/cancel
WARNING Received out-of-date/future transforms:
* receiving transform from [/amcl] that differed from ROS time by 1.176933497s
Found 1 error(s).
ERROR The following nodes should be connected but aren't:
* /move_base->/move_base (/move_base/global_costmap/footprint)
* /move_base->/move_base (/move_base/local_costmap/footprint)
> AMCL warning: **[ WARN] [1542907954.448988739]: Frame_id of map received:'/odom' doesn't match global_frame_id:'map'. This could cause issues with reading published topics**
`FIXED BY MAKING MAP_SERVER FRAME ID TO BE MAP`
> Rviz Global option's fixed frame: **map**
`FIXED BY THE COSTMAP CONFIG`
> Old TF configuration: **Tree --> odom --> base_link --> laser --> map**
> Current `updated` TF configuration: map --> odom --> base_link --> laser
> rqt_graph: [rqt graph picture](https://drive.google.com/file/d/1Q5SWaZAXKYHpMi4VTCKD22pr-TfieJ_Q/view?usp=sharing)
**Packages used:**
- Rplidar (for the lidar I have)
- motor_hat (for the motor control)
- robot_setup_tf (for publishing the tf information)
- robot_drive (for publishing commands for the motor_hat)
- differential_drive (for publishing odom, twist commands and the PID controller)
- move_base (costmaps)
- amcl (for localization)
- map_server (to publish the map)
**robot launch file:**
**move_base launch file:**
**robot_setup_tf:**
#include <ros/ros.h>
#include <tf/transform_broadcaster.h>
int main(int argc, char** argv)
{
ros::init(argc, argv, "robot_tf_publisher");
ros::NodeHandle n;
ros::Rate r(100);
tf::TransformBroadcaster broadcaster;
while(n.ok()){
//broadcaster.sendTransform(
//tf::StampedTransform(
//tf::Transform(tf::Quaternion(0, 0, 0, 1), tf::Vector3(0.0, 0.0, 0.0)),
//ros::Time::now(), "map", "odom"));
broadcaster.sendTransform(
tf::StampedTransform(
tf::Transform(tf::Quaternion(0, 0, 0, 1), tf::Vector3(0.0, 0.0, 0.2)),
ros::Time::now(),"base_link", "laser"));
r.sleep();
}
}
**Costmaps:**
base_local_planner_params.yaml:
TrajectoryPlannerROS:
  max_vel_x: 0.45
  min_vel_x: 0.1
  max_vel_theta: 1.0
  min_vel_theta: -1.0
  min_in_place_vel_theta: 0.4
  escape_vel: -0.1
  acc_lim_theta: 3.2
  acc_lim_x: 2.5
  acc_lim_y: 2.5
  holonomic_robot: false
  yaw_goal_tolerance: 0.05
  xy_goal_tolerance: 0.10
  latch_xy_goal_tolerance: false
  sim_time: 2
  sim_granularity: 0.10
  angular_sim_granularity: 0.10
  vx_samples: 3
  vtheta_samples: 20
  meter_scoring: true
  pdist_scale: 0.6
  gdist_scale: 1.0
  occdist_scale: 0.1
  heading_lookahead: 0.325
  heading_scoring: false
  dwa: true
  global_frame_id: odom
  oscillation_reset_dist: 0.30
  prune_plan: true
costmap_common_params.yaml:
footprint: [[-0.15,-0.15],[-0.15,0.15], [0.15, 0.15], [0.15,-0.15]]
obstacle_range: 2.0
raytrace_range: 2.5
inflation_radius: 0.3
observation_sources: laser_scan_sensor
laser_scan_sensor: {sensor_frame: laser, data_type: LaserScan, topic: scan, marking: true, clearing: true}
global_costmap_params.yaml:
global_costmap:
  global_frame: map
  robot_base_frame: base_link
  update_frequency: 5.0
  static_map: true
  transform_tolerance: 0.5
local_costmap_params.yaml:
local_costmap:
  global_frame: odom
  robot_base_frame: base_link
  update_frequency: 5.0
  publish_frequency: 2.0
  static_map: false
  rolling_window: true
  width: 2.0
  height: 2.0
  resolution: 0.05
  transform_tolerance: 0.5
**amcl configuration launch file:**
↧
Calculate the transformation from map to odom
Hi all,
For the last few days, I have been struggling with the correct way to calculate the transformation from map to odom. I have a robot system which publishes an odom message. Since odom drifts, I wrote my own algorithm, which can compute the location and orientation of the robot in the world (map) frame. Now I would like to publish a "map" frame and the transformation from map to odom.
I am working with a 2D robot. Suppose `geometry_msgs::Point position1` is the robot location in odom, `geometry_msgs/Quaternion orientation1` is the robot orientation in odom, and `x, y, theta` are the real location and orientation of the robot in the world frame computed by my algorithm, where `theta` uses the x axis as 0 degrees. What is the correct way to calculate the transformation from map to odom and publish it on tf?
Below is the code I am using to publish tf:
tf::StampedTransform tf_map_to_odom;
// set up parent and child frames
tf_map_to_odom.frame_id_ = string("map");
tf_map_to_odom.child_frame_id_ = string("odom");
// set up publish rate
ros::Rate loop_rate(100);
// set the time
tf_map_to_odom.stamp_ = ros::Time::now();
//set the transformation
tf_map_to_odom.setOrigin(***How to calculate this?***);
tf_map_to_odom.setRotation(***How to calculate this?***);
// broadcast transform
_tf_br.sendTransform(tf_map_to_odom);
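A hedged sketch of the standard composition (the transform you want satisfies map->odom = map->base * (odom->base)^-1, using the variables named above):

    // build odom -> base from the odometry pose
    tf::Transform odom_to_base(
        tf::Quaternion(orientation1.x, orientation1.y,
                       orientation1.z, orientation1.w),
        tf::Vector3(position1.x, position1.y, position1.z));
    // build map -> base from your algorithm's world estimate
    tf::Transform map_to_base(tf::createQuaternionFromYaw(theta),
                              tf::Vector3(x, y, 0.0));
    // compose: map -> odom = (map -> base) * (base -> odom)
    tf::Transform map_to_odom = map_to_base * odom_to_base.inverse();
    tf_map_to_odom.setOrigin(map_to_odom.getOrigin());
    tf_map_to_odom.setRotation(map_to_odom.getRotation());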
Thank you very much for your help.
↧