I'm working through the `python` [tf tutorial](http://wiki.ros.org/tf/Tutorials/Time%20travel%20with%20tf%20%28Python%29) and am having an issue with an extrapolation into the past. Here's the section:
>So let's go back to where we ended in the previous tutorial. Go to your package for the tutorial:
$ roscd learning_tf
>Now, instead of making the second turtle go to where the first turtle is now, make the second turtle go to where the first turtle was 5 seconds ago. Edit nodes/turtle_tf_listener.py:
    try:
        now = rospy.Time.now() - rospy.Duration(5.0)
        listener.waitForTransform("/turtle2", "/carrot1", now, rospy.Duration(1.0))
        (trans, rot) = listener.lookupTransform("/turtle2", "/carrot1", now)
    except (tf.Exception, tf.LookupException, tf.ConnectivityException):
The referenced "previous tutorial" taught about `waitForTransform()` and fixed a future extrapolation like so:
    listener.waitForTransform("/turtle2", "/carrot1", rospy.Time(), rospy.Duration(4.0))
    while not rospy.is_shutdown():
        try:
            now = rospy.Time.now()
            listener.waitForTransform("/turtle2", "/carrot1", now, rospy.Duration(4.0))
            (trans, rot) = listener.lookupTransform("/turtle2", "/carrot1", now)
Based on that (which worked) and the changes shown above, here is my full code:
#!/usr/bin/env python
import rospy
import tf
import turtlesim.srv
import geometry_msgs.msg
import math
if __name__ == '__main__':
    rospy.init_node('tf_turtle')
    listener = tf.TransformListener()

    rospy.wait_for_service('spawn')
    spawner = rospy.ServiceProxy('spawn', turtlesim.srv.Spawn)
    spawner(4, 2, 0, 'turtle2')

    turtle_vel = rospy.Publisher('turtle2/cmd_vel',
                                 geometry_msgs.msg.Twist,
                                 queue_size=1)

    rate = rospy.Rate(10)
    listener.waitForTransform('/turtle2', '/carrot1',
                              rospy.Time().now(),
                              rospy.Duration(4.0))
    while not rospy.is_shutdown():
        try:
            now = rospy.Time.now() - rospy.Duration(5.0)
            listener.waitForTransform('/turtle2', '/carrot1', now, rospy.Duration(1.0))
            (trans, rot) = listener.lookupTransform('/turtle2',
                                                    '/carrot1',
                                                    now)
        except (tf.LookupException, tf.ConnectivityException, tf.ExtrapolationException):
            continue

        angular = 4 * math.atan2(trans[1], trans[0])
        linear = 0.5 * math.sqrt(trans[0] ** 2 + trans[1] ** 2)
        cmd = geometry_msgs.msg.Twist()
        cmd.linear.x = linear
        cmd.angular.z = angular
        turtle_vel.publish(cmd)

        rate.sleep()
I'm getting these types of errors about past extrapolation when I run the above:
listener.waitForTransform('/turtle2', '/carrot1', now, rospy.Duration(1.0))
tf.Exception: Lookup would require extrapolation into the past. Requested time 1498858934.730669022 but the earliest data is at time 1498858934.911530972, when looking up transform from frame [carrot1] to frame [turtle2]. canTransform returned after 1.00266 timeout was 1.
I've tried inserting a `rospy.sleep(5)` above the `while` loop to ensure there are 5 seconds of buffered data, as well as having the first call to `waitForTransform` use `rospy.Time.now() - rospy.Duration(5.0)` to match the second one. I'm not really sure what's going on. The tutorial goes on to suggest this isn't really what you want anyway, but it still presents it as though it should run, so I'm confused by my results!
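Here is a minimal sketch of that attempt, in case it clarifies what I tried (the sleep length and the extra tf.Exception in the except clause are my own additions, not something the tutorial shows):

import rospy
import tf

rospy.init_node('tf_turtle')
listener = tf.TransformListener()
rospy.sleep(5.0)  # my guess: give the buffer time to accumulate >5 s of transforms

while not rospy.is_shutdown():
    try:
        past = rospy.Time.now() - rospy.Duration(5.0)
        listener.waitForTransform('/turtle2', '/carrot1', past, rospy.Duration(1.0))
        (trans, rot) = listener.lookupTransform('/turtle2', '/carrot1', past)
    except (tf.Exception, tf.LookupException,
            tf.ConnectivityException, tf.ExtrapolationException):
        continue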
tf transform tutorial issue (past extrapolation)
tf broadcasting rate
Hello,
I'm trying to increase the rate at which a tf transform is being published, but it seems that no matter what I do the rate stays at 1 Hz.
Even though the `cout` prints at 10 Hz, when I `tf_echo` the transform it still appears to be published at 1 Hz.
I didn't find any documentation about this on ROS Answers.
Thanks!
#include <ros/ros.h>
#include <tf/transform_broadcaster.h>
#include <iostream>

int main(int argc, char** argv){
  ros::init(argc, argv, "tfpub");
  ros::NodeHandle n;
  ros::Rate r(10.0);

  tf::TransformBroadcaster broadcaster;

  while(n.ok()){
    broadcaster.sendTransform(
      tf::StampedTransform(
        tf::Transform(tf::Quaternion(0, 0, 0, 1), tf::Vector3(0.1, 0.0, 0.2)),
        ros::Time::now(), "map", "odom"));
    std::cout << ros::Time::now() << std::endl;  // prints at 10 Hz
    r.sleep();
  }
  return 0;
}
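One thing I want to rule out (just a guess on my part): I believe `tf_echo` itself only prints at about 1 Hz by default, so it may not reflect the actual publish rate. Measuring the topic directly should settle it:

rostopic hz /tf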
tf_echo is correct, yet no result from lookuptranform
Hey guys,
I'm using URDF with rviz to work on a forward-kinematics-related problem. My question is: can we get the results of tf_echo in a script? Also, I'm looking for suggestions as to why this problem happened.
I could use
rosrun tf tf_echo /base /tool0
to get the correct result:
At time 1499215724.972
- Translation: [0.410, 0.191, 0.947]
- Rotation: in Quaternion [0.707, 0.000, -0.000, 0.707]
in RPY (radian) [1.571, 0.000, 0.000]
in RPY (degree) [89.997, 0.008, 0.004]
Yet, I couldn't use
    (trans, rot) = listener.lookupTransform('base', 'tool0', rospy.Time(0))
(following [the tf listener tutorial](http://wiki.ros.org/tf/Tutorials/Writing%20a%20tf%20listener%20%28Python%29)) to get the transform; it throws an exception.
I also tried
    now = rospy.Time.now()
    listener.waitForTransform("base", "tool0", now, rospy.Duration(10.0))
which gives:
    tf.Exception: . canTransform returned after 10.0023 timeout was 10.
I changed the 10 here to 0.01, 1, etc., and it always gives a similar exception.
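For completeness, this is the minimal listener pattern I have been trying to get working (the frame names come from my URDF; the structure is just the standard listener sketch, so treat it as an illustration rather than a known fix):

import rospy
import tf

rospy.init_node('tool0_listener')
listener = tf.TransformListener()

# Ask for the latest available transform (time 0) instead of "now",
# and give the listener a moment to fill its buffer first.
listener.waitForTransform('base', 'tool0', rospy.Time(0), rospy.Duration(4.0))
(trans, rot) = listener.lookupTransform('base', 'tool0', rospy.Time(0))
print(trans, rot)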
Fixed frame does not exist
I use rviz to visualize some moving objects with rviz Markers. All of the Markers' frame_id is "/my_frame". I also use map_server to load a map with frame_id "/map". I use static_transform_publisher to build up the relation between /map and /my_frame.
Here is the Launch file:
In simulator.rviz, I set the fixed frame in global options as /map.
But rviz gives a warning that the fixed frame does not exist, and there are no objects showing in rviz.
I am new to ROS. I would appreciate someone's kind help!
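For reference, the static transform I am publishing between the two frames looks roughly like this (the zero offsets and the 100 ms period are placeholders, not my real values):

rosrun tf static_transform_publisher 0 0 0 0 0 0 /map /my_frame 100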
Is there a way to change the /tf in a rosbag?
I currently have a rosbag with a /tf topic containing only static transforms. Is there a way I can alter one of the transforms?
I have tried creating a new filtered rosbag not containing the /tf and generating the /tf from a launch file. But I get issues when trying to display the pointclouds in rviz.
I get the error: "Message removed because it is too old" when trying to display PointCloud2 topics.
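What I had in mind is something along these lines, using the rosbag Python API to rewrite the bag with one transform modified (the file names, the frame I am editing, and the offset are placeholders for my setup):

import rosbag

with rosbag.Bag('output.bag', 'w') as outbag:
    for topic, msg, t in rosbag.Bag('input.bag').read_messages():
        if topic == '/tf':
            for transform in msg.transforms:
                # Example: shift one static transform along x (placeholder edit)
                if transform.child_frame_id == 'laser':
                    transform.transform.translation.x += 0.1
        outbag.write(topic, msg, t)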
How to transform PointCloud2 with TF?
Hi,
I'm writing a small node aggregating data from a Kinect sensor (PointCloud2) taken from different angles of view. I therefore use the tf library to transform all the clouds into the /world frame before aggregating them with the pcl lib.
After looking through the tf API doc, I found the TransformListener::transformPointCloud function to transform PointCloud messages but nothing to transform PointCloud2 messages. I am therefore using code like this:
    sensor_msgs::convertPointCloud2ToPointCloud(pc2SensorFrame, pcSensorFrame);
    TfListener.transformPointCloud("/world", pcSensorFrame, pcWorldFrame);
    sensor_msgs::convertPointCloudToPointCloud2(pcWorldFrame, pc2WorldFrame);
    pcl::fromROSMsg(pc2WorldFrame, pclCloud);
It seems to work, but I was wondering if there was a more direct way to transform PointCloud2 with tf, avoiding the conversion to PointCloud. Thanks
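For comparison, the tf2 stack has a helper that operates on PointCloud2 directly; a minimal Python sketch of that route (the "world" frame name and the one-second timeout are just the values from my setup, and I haven't verified availability on every distro):

import rospy
import tf2_ros
from tf2_sensor_msgs.tf2_sensor_msgs import do_transform_cloud

tf_buffer = tf2_ros.Buffer()
tf_listener = tf2_ros.TransformListener(tf_buffer)

def transform_cloud_to_world(cloud):
    # Look up sensor_frame -> world at the cloud's own timestamp.
    transform = tf_buffer.lookup_transform('world', cloud.header.frame_id,
                                           cloud.header.stamp, rospy.Duration(1.0))
    return do_transform_cloud(cloud, transform)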
Quaternion to Rotate Rigid Frame
I have arbitrary planes defined by coefficients A,B,C,D such that Ax + By + Cz + D = 0.
I want to fit a solid box over each plane such that one short edge of the box is perpendicular to the normal of the plane, with the other two edge lengths being longer, and with one of the other two edges parallel to the horizontal XY plane. I've been able to construct the position of the box but making the orientation has been tricky.
The coefficients given above for the plane make it really easy to get the normal vector to the plane N = (A,B,C). The way I've been trying to compose the orientation of the box is:
1. Create a vector V normal to the global X axis and to the plane's normal vector by taking their cross product
2. Using a dot product determine the angle theta between the global X axis and the plane normal N
3. Create a quaternion using theta and V as the angle and rotation axis.
Using that quaternion for the object orientation gets me a box with its X-axis normal to the plane, but doesn't account for rotation of the box about its own X-axis. To get one of the edges (here its Y-axis) of the box parallel to the ground XY plane, I've been trying:
1. Rotate the global Y-axis by the previously made quaternion using tf::quatRotate to produce the intermediate local Y-axis Y'
2. Take the cross product of N and the global Z-axis to get a vector E parallel to the target plane and to the XY plane (E is the desired local box Y-axis)
3. Determine the angle between the E and Y' vectors
4. Use that angle and N to make a second quaternion, and multiply it with the first quaternion to get the final box orientation
I haven't actually been able to get the second part to completely work yet, although I think the logic makes sense? **But I was wondering if there were any utilities in TF that I've overlooked which might make this whole process easier / anywhere this is already implemented.**
Fundamentally I just want to find the quaternion necessary to rotate a pair of perpendicular axes to a desired orientation, from a given start orientation, where the start and end axes are all defined as vectors.
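The most direct route I have found so far is to build the rotation matrix column-by-column from the desired axes and convert it to a quaternion; here is a rough Python sketch of the idea (assuming the plane normal is not vertical, and using tf.transformations rather than the C++ API):

import numpy as np
from tf.transformations import quaternion_from_matrix

def box_orientation(normal):
    """Quaternion whose x-axis is the plane normal and whose y-axis is horizontal."""
    x_axis = np.asarray(normal, dtype=float)
    x_axis /= np.linalg.norm(x_axis)
    # y-axis: horizontal direction lying in the plane (cross of global Z and the normal)
    y_axis = np.cross([0.0, 0.0, 1.0], x_axis)
    y_axis /= np.linalg.norm(y_axis)          # fails if the normal is vertical
    z_axis = np.cross(x_axis, y_axis)
    m = np.identity(4)
    m[:3, 0] = x_axis                          # columns are the box axes in world coordinates
    m[:3, 1] = y_axis
    m[:3, 2] = z_axis
    return quaternion_from_matrix(m)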
robot_localization not generating odom to robot's sensor_frame transform
Hello all,
I'm trying to fuse an IMU and a DVL (measures linear velocity on x, y, z). I have the following axis configurations.
- base_link : robot's frame
- base_imu : IMU's frame
- base_dvl : DVL's frame
I have a transform broadcaster which broadcasts base_imu->base_link and base_dvl->base_link. The picture attached shows the frame poses. I have set the fixed frame as base_link here (rviz). 
This is the TF tree:

But when I run robot_localization, the localization is perfect but it doesn't show my sensor frames. There are also some warnings generated in rviz (check left pane), saying there isn't any transform to odom frame. But there is a transform as per the tf_tree. Here is a picture:

How do I fix this issue?
ROS Navigation - AMCL not transforming map to odom
I am running a turtlebot 2 with an RPLidar instead of a Kinect, trying to get the navigation stack working.
I am unable to get AMCL to transform map to odom. My current tf tree looks like this: https://drive.google.com/file/d/0B-2f_ZBhV2PKVkhCZW1vaVY3em8/view?usp=sharing
I am running two transform scripts between base_link and base_footprint, and base_link and base_laser. When I execute move_base.launch, I get the following error: `[ WARN] [1500318096.546547244]: Timed out waiting for transform from base_link to map to become available before running costmap, tf error: . canTransform returned after 0.100719 timeout was 0.1.`
In my launch file, I have: ` ` which references: https://www.pastiebin.com/596d0bc02173e
I'm not sure what I'm doing wrong and why AMCL is not transforming map to odom.
Use imu in Hector mapping with low cost laser distance sensor
We are using a low cost laser distance sensor [rplidar A2](http://wiki.ros.org/rplidar) on a [turtlebot2 robot](http://wiki.ros.org/Robots/TurtleBot/). Because in some cases (probably due to similar-looking environments) there is false scan matching leading to an inconsistent map, we are considering using IMU data from the turtlebot2 to solve this problem. We are using [robot_pose_ekf](http://wiki.ros.org/robot_pose_ekf) to fuse the information from wheel odometry with the IMU. This way, the IMU data is fused into the tf tree.
We found that in hector mapping, if the parameter **"use_tf_pose_start_estimate"** is set to true (of course, the other parameter **"use_tf_scan_transformation"** is set to true beforehand), the tf data can be used during the mapping process.
However, it always outputs an error:
[ERROR] [1500360203.142398818, 11186.930000000]: Transform from map to base_link failed with Lookup would require extrapolation into the future. Requested time 11185.970000000 but the latest data is at time 11185.370000000, when looking up transform from frame [base_link] to frame [map]
We find this error is generated [here](https://github.com/tu-darmstadt-ros-pkg/hector_slam/blob/catkin/hector_mapping/src/HectorMappingRos.cpp#L275) in the source code.
The above error is generated not only in experiments on the real robot but also in simulated environments created by Gazebo. Even if we increase the duration in `waitForTransform` to 2.0, the error still exists.
@Stefan Kohlbrecher If possible, we would appreciate advice from one of the authors of hector mapping. Thank you!
TF inverse of a pose
I am trying to understand the [pose_follower](https://github.com/ros-planning/navigation_experimental/blob/hydro-devel/pose_follower/src/pose_follower.cpp) package.
It simply tries to follow the waypoints sent from the global planner (through move_base). In a function called diff2D, there is this line:
tf::Pose diff = pose2.inverse() * pose1;
where pose2 is the robot's current pose and pose1 is the waypoint pose. It seems like this line returns the difference between the two poses. My questions are:
1) How does this line work internally? What is the mathematical concept behind it?
2) Does this line also calculate the difference between the two quaternions of the poses?
3) Another question: what is the difference between tf::Pose and tf::Transform?
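To make question 1) concrete, my current understanding (which may be wrong) is that the poses are treated as homogeneous transforms and the line computes pose1 expressed in pose2's frame; a small numpy sketch of that interpretation (the example translations and yaws are made up):

import numpy as np
from tf import transformations as tft

def pose_to_matrix(translation, quaternion):
    # Homogeneous transform combining a translation and a quaternion rotation.
    return np.dot(tft.translation_matrix(translation),
                  tft.quaternion_matrix(quaternion))

T1 = pose_to_matrix((1.0, 2.0, 0.0), tft.quaternion_from_euler(0, 0, 0.5))  # waypoint
T2 = pose_to_matrix((0.5, 1.0, 0.0), tft.quaternion_from_euler(0, 0, 0.2))  # robot

diff = np.dot(tft.inverse_matrix(T2), T1)             # pose2.inverse() * pose1
rel_translation = tft.translation_from_matrix(diff)   # waypoint position in the robot frame
rel_quaternion = tft.quaternion_from_matrix(diff)     # relative rotation (so yes, question 2)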
How to find source of TF_NAN_INPUT error
While using `amcl` and `move_base` with a `tf_prefix` applied, I'm getting the errors below. Everything worked without the prefix, but now that I'm using it, I get:
remote[red1-0]: Error: TF_NAN_INPUT: Ignoring transform for child_frame_id "robot_tf/odom" from authority "unknown_publisher" because of a nan value in the transform (nan nan nan) (nan nan nan nan)
at line 240 in /tmp/binarydeb/ros-indigo-tf2-0.5.15/src/buffer_core.cpp
Error: TF_DENORMALIZED_QUATERNION: Ignoring transform for child_frame_id "robot_tf/odom" from authority "unknown_publisher" because of an invalid quaternion in the transform (nan nan nan nan)
at line 253 in /tmp/binarydeb/ros-indigo-tf2-0.5.15/src/buffer_core.cpp
From the error listed above, what transform is being ignored? To the `robot_tf/odom` frame? What is the `authority` (I know that it says `"unknown_publisher"` but what is an `authority` in this context)?
At this point, I'm just trying to figure out how to read this error so that I can figure out where the issue lies. I know that `TF_NAN_INPUT` means that somewhere `NaN` values are being set in a transform and `TF_DENORMALIZED_QUATERNION` means that somewhere a denormalized quaternion is being set.
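To narrow down which node the transform is coming from, I have been using the standard tf introspection tools (listed here for completeness; nothing specific to my setup):

rostopic info /tf      # lists every node publishing on /tf
rosrun tf tf_monitor   # per-authority publishing statistics
rosrun tf view_frames  # writes frames.pdf with the broadcaster of each frame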
In case it is useful, here is the error that `roswtf` reports:
ERROR TF multiple authority contention:
* node [/robot/laser_broadcaster_NGNeuromorphic1_18892_5059838883645381072] publishing transform [robot_tf/base_laser] with parent [robot_tf/base_link] already published by node [/robot/robot_tf/pioneer_transform_node_NGNeuromorphic1_18892_3963024023289368467]
* node [/robot/robot_tf/pioneer_transform_node_NGNeuromorphic1_18892_3963024023289368467] publishing transform [robot_tf/base_laser] with parent [robot_tf/base_link] already published by node [/robot/laser_broadcaster_NGNeuromorphic1_18892_5059838883645381072]
and the output from `rqt_tf_tree`:

fuse imu and lidar in hector slam
Hi, I ran hector slam successfully and got the map using the bag file. But I do not know how to use it on my robot, because I found that there is a `/tf` topic in the bag file. Where does the `/tf` topic come from? And what is the relation between this `/tf` topic and the IMU data? Any idea? A tutorial would be great, thanks.
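So far I have only inspected the bag itself; the commands I used, for reference (the bag name is a placeholder, and these are just the standard tools):

rosbag info mapping_run.bag         # lists the topics in the bag, including /tf
rosbag play --clock mapping_run.bag &
rosrun tf view_frames               # writes frames.pdf showing which frames the bag's /tf provides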
Is there a standard way to transform a pose between coordinate frames?
I have a pose message that I want to transform between two (non-static) coordinate frames. Currently I am doing the following in Python:

def rotate_orientation(ori, q):
    rot_mat = tf.transformations.quaternion_matrix(q)
    pose_rot = rot_mat.dot([ori.x, ori.y, ori.z, ori.w])
    ori.x = pose_rot[0]
    ori.y = pose_rot[1]
    ori.z = pose_rot[2]
    ori.w = pose_rot[3]

def translate_position(pos, t):
    pos.x += t[0]
    pos.y += t[1]
    pos.z += t[2]

def odom_to_utm(odom):
    global utm_pub
    trans, rot = listener.lookupTransform('utm', 'odom', rospy.Time(0))
    translate_position(odom.pose.pose.position, trans)
    rotate_orientation(odom.pose.pose.orientation, rot)
    utm_pub.publish(odom)  # publish the transformed odometry message
But it feels a bit clunky. Is there a more standard way to switch between coordinate frames? Something like a `tf.transformPose(pose, frame1, frame2)` that returns `pose` in frame 2.
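The closest built-in thing I have found so far is `TransformListener.transformPose`, which takes a target frame and a stamped pose; a small sketch of how I think it would be used here (untested in my pipeline, and the frame names are just the ones from my code above):

import rospy
import tf
from geometry_msgs.msg import PoseStamped

listener = tf.TransformListener()

def odom_pose_to_utm(odom):
    ps = PoseStamped()
    ps.header.frame_id = 'odom'
    ps.header.stamp = rospy.Time(0)   # time 0 means "latest available" transform
    ps.pose = odom.pose.pose
    return listener.transformPose('utm', ps)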
Georeferencing map coordinates
I was wondering how ROS handles georeferencing of map coordinates. My robot currently sets its position to 0.0 when it starts up. However, this doesn't give any information about where I am on Earth.
How do I convert map coordinates to GPS coordinates and back again? That is, how do typical ROS-powered robots do it? TF doesn't seem suitable for this because converting GPS to map coordinates isn't as simple as a matrix transformation, if I understand correctly.
I'm also trying to figure out how to do this with map_server. As I understand, the map_server is initialized with a yaml file containing map coordinates of the underlying map. However, my robot could start up at any point on earth, so the yaml file would always be incorrect, if I set it to a particular map coordinate. How can I set up the map_server based on a GPS coordinate, rather than a map coordinate?
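The approach I have been considering (not sure it is the standard one) is to project GPS fixes into a metric UTM frame first, and then treat the offset between UTM and my map origin as something captured once at startup, assuming the map x/y axes are aligned east/north. A sketch using the plain Python `utm` package (a non-ROS dependency, so this is just an assumption about tooling; the coordinates are placeholders):

import utm

# Captured once at startup: where on Earth the map origin (0, 0) actually is.
origin_easting, origin_northing, zone, letter = utm.from_latlon(48.1374, 11.5755)

def gps_to_map(lat, lon):
    easting, northing, _, _ = utm.from_latlon(lat, lon)
    return easting - origin_easting, northing - origin_northing

def map_to_gps(x, y):
    return utm.to_latlon(origin_easting + x, origin_northing + y, zone, letter)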
hector slam launch with odom and hokuyo
Hi,
I'm trying to use hector mapping on my custom robot to produce a map; this works fine without any odometry information from my odom node.
Here's the launch file (I didn't write it, but it works pretty well).
The problem is when I try to use it with my odometry information, published from my odom node (which has the same framework as the one from the odom tutorial, but is customised to take my encoder data and use my equations) -
#include <ros/ros.h>
#include <tf/transform_broadcaster.h>
#include <nav_msgs/Odometry.h>
#include <std_msgs/Float32.h>
#include <cmath>

#define RovWid 0.25
#define Pi 3.141592865358979
#define TicksPerRev 600
#define WheelDia 0.13

float left, right; // global variables to hold incoming encoder tick counts

void CallbackLeft(const std_msgs::Float32::ConstPtr& msg)
{
  //ROS_INFO("Left ticks [%f]", msg->data);
  left = msg->data;
}

void CallbackRight(const std_msgs::Float32::ConstPtr& msg)
{
  //ROS_INFO("Right ticks [%f]", msg->data);
  right = msg->data;
}

int main(int argc, char** argv){
  ros::init(argc, argv, "odometry_publisher");
  ros::NodeHandle n;
  ros::Publisher odom_pub = n.advertise<nav_msgs::Odometry>("odom", 50);
  ros::Subscriber sub1 = n.subscribe("ticks_left", 100, CallbackLeft);
  ros::Subscriber sub2 = n.subscribe("ticks_right", 100, CallbackRight);
  tf::TransformBroadcaster odom_broadcaster;

  double vx = 0.0;
  double vy = 0.0;
  double vth = 0.0;
  float DeltaLeft = 0;
  float DeltaRight = 0;
  float PreviousRight = 0;
  float PreviousLeft = 0;
  float Theta = 0;
  float X = 0;
  float Y = 0;
  float Circum, DisPerTick, expr1, right_minus_left;

  ros::Time current_time, last_time;
  current_time = ros::Time::now();
  last_time = ros::Time::now();

  ros::Rate r(5);
  ROS_INFO("Node started");

  Circum = Pi * WheelDia;
  DisPerTick = Circum / TicksPerRev;

  while(n.ok()){
    ros::spinOnce();  // check for incoming messages
    current_time = ros::Time::now();

    DeltaRight = (right - PreviousRight) * DisPerTick;
    DeltaLeft = (left - PreviousLeft) * DisPerTick;
    PreviousRight = right;
    PreviousLeft = left;

    if (DeltaLeft == DeltaRight){
      X += DeltaLeft * cos(Theta);
      Y += DeltaLeft * sin(Theta);
    }
    else{
      expr1 = RovWid * 2 * (DeltaRight + DeltaLeft) / 2.0 / (DeltaRight - DeltaLeft);
      right_minus_left = DeltaRight - DeltaLeft;
      X += expr1 * (sin(right_minus_left / (RovWid * 2) + Theta) - sin(Theta));
      Y -= expr1 * (cos(right_minus_left / (RovWid * 2) + Theta) - cos(Theta));
      Theta += right_minus_left / (RovWid * 2);
      Theta = Theta - ((2 * Pi) * floor(Theta / (2 * Pi)));
    }

    ROS_INFO("X [%f]", X);
    ROS_INFO("Y [%f]", Y);
    ROS_INFO("Theta [%f]", Theta);
    ROS_INFO("\n");

    geometry_msgs::Quaternion odom_quat = tf::createQuaternionMsgFromYaw(Theta);

    // publish the odom -> base_link transform
    geometry_msgs::TransformStamped odom_trans;
    odom_trans.header.stamp = current_time;
    odom_trans.header.frame_id = "odom";
    odom_trans.child_frame_id = "base_link";
    odom_trans.transform.translation.x = X;
    odom_trans.transform.translation.y = Y;
    odom_trans.transform.translation.z = 0.0;
    odom_trans.transform.rotation = odom_quat;
    odom_broadcaster.sendTransform(odom_trans);

    // publish the odometry message
    nav_msgs::Odometry odom;
    odom.header.stamp = current_time;
    odom.header.frame_id = "odom";
    odom.pose.pose.position.x = X;
    odom.pose.pose.position.y = Y;
    odom.pose.pose.position.z = 0.0;
    odom.pose.pose.orientation = odom_quat;
    odom.child_frame_id = "base_link";
    odom.twist.twist.linear.x = 0;
    odom.twist.twist.linear.y = 0;
    odom.twist.twist.angular.z = 0;
    odom_pub.publish(odom);

    last_time = current_time;
    r.sleep();
  }
}
This publishes odometry information just fine. The problem is when I try to incorporate the two.
Following the hector slam robot setup, I change the base frame and odom frame too:
but I get the following timeout:
[ INFO] [1490208346.381940938]: lookupTransform base_frame to laser timed out. Could not transform laser scan into base_frame.
This stops the slam node from creating anything.
When I change the base frame to base_link as it was in the original, I get the following error:
[ERROR] [1490208025.858563554]: Transform failed during publishing of map_odom transform: Lookup would require extrapolation at time 1490208024.946150226, but only time 1490208024.845273926 is in the buffer, when looking up transform from frame [base_link] to frame [odom]
This error still allows for mapping, but makes no use of my odometry info.
No doubt the problem is me not setting up the required transform, but I'm genuinely at a loss as to what to do about it. Help would be greatly appreciated. At the very least I hope someone finds my code useful.
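For what it's worth, the only workaround I can think of for the base_frame to laser timeout is to publish a static transform between those two frames by hand; something like the line below is what I mean (the zero offsets and frame names are guesses for my setup, not values I know to be right):

rosrun tf static_transform_publisher 0 0 0 0 0 0 base_link laser 100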
How to add odometry into hector slam?
Hi, we are using [hector slam](http://wiki.ros.org/hector_slam) to build a 2D map with an [rplidar laser](http://wiki.ros.org/rplidar). To solve the problem of false mapping in environments similar to a long corridor, we want to add odometry into hector slam.
We read the source code of [hector_mapping](https://github.com/tu-darmstadt-ros-pkg/hector_slam/tree/catkin/hector_mapping). We think that we can achieve our goal by setting the parameter "[p_use_tf_pose_start_estimate_](https://github.com/tu-darmstadt-ros-pkg/hector_slam/blob/catkin/hector_mapping/src/HectorMappingRos.cpp#L268)" to true.
In this way, the mapping process uses the odometry information stored in the tf tree. The parameter "p_use_tf_pose_start_estimate_" may not be set to true in the beginning. When the robot is about to walk through a corridor, "p_use_tf_pose_start_estimate_" is set to true.
However, in our tests, whenever "p_use_tf_pose_start_estimate_" is set to true, it always outputs the following error:
[ERROR] [1501720267.512793847]: Transform from map to base_link failed with Lookup would require extrapolation into the future. Requested time 84.940000000 but the latest data is at time 84.740000000, when looking up transform from frame [base_link] to frame [map]
To solve this error, we launched move_base as [this link](http://answers.ros.org/question/99736/navigate-using-hector_slam/) suggests. But it didn't work.
So we have two questions:
1. How do we solve the above error? Why is it difficult to read transforms from the tf tree in hector slam?
2. What are other ways of adding odometry information into hector slam?
Looking for any valuable advice! Many thanks!
Best regards!
How does mrpt_icp_slam use odometry?
We want to know how a scan-matching based method uses odometry for building a map. We are studying [mrpt_icp_slam_2d](http://wiki.ros.org/mrpt_icp_slam_2d), which is an ICP matching based method. Its ROS wiki claims that "Odometry is optional". To our understanding, there should be a function that fuses the odometry information stored in the tf tree.
However, when we read the [source code here](https://github.com/mrpt-ros-pkg/mrpt_slam/blob/master/mrpt_icp_slam_2d/src/mrpt_icp_slam_2d_wrapper.cpp#L369), there is only a function **ICPslamWrapper::updateSensorPose(std::string _frame_id)**, which updates the transform between the laser sensor frame and the base frame.
The odom frame is not involved. There is also no ros::Subscriber to deal with the odom topic.
So how does mrpt_icp_slam use odometry?
ros::moveit adding rotation to xyz-coordinates
Hello people,
I am currently programming a 6 DOF robotic arm using ROS and MoveIt for the navigation. My goal is to send x,y,z coordinates as well as a fixed orientation of my gripper in order to successfully grab certain objects.
Following the tutorial, I can send x,y,z coordinates like this just fine:
geometry_msgs::Pose target_pose1;
target_pose1.orientation.w = 1;
target_pose1.position.x = 0;
target_pose1.position.y = 0;
target_pose1.position.z = 0.55;
move_group.setPoseTarget(target_pose1);
Using this method, my end effector stays in its start orientation.
I can also send an end effector orientation like this:
move_group.setRPYTarget(1,1,0);
However, now the orientation changes to the given values, but the x,y,z coordinates seem rather random.
Using both commands in a row, the second one overwrites the first one, leading to an unsatisfactory result.
Right now I am stuck messing around with quaternion transformations using the tf package, not knowing whether this is the right way at all. It seems like this package is unable to combine both position and orientation either.
Does somebody know a way to send x,y,z coordinates AND rpy orientations (C++)?
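One idea I am considering (not sure it is the right approach) is to pack both the position and an orientation derived from RPY into a single pose target instead of issuing two separate commands; in C++ the quaternion could come from tf::createQuaternionMsgFromRollPitchYaw, and here is a minimal Python sketch of the same idea (the RPY and position values are just the ones from my snippets above):

from geometry_msgs.msg import Pose
from tf.transformations import quaternion_from_euler

target_pose = Pose()
# Orientation from roll, pitch, yaw (the values I used with setRPYTarget above).
q = quaternion_from_euler(1.0, 1.0, 0.0)
target_pose.orientation.x = q[0]
target_pose.orientation.y = q[1]
target_pose.orientation.z = q[2]
target_pose.orientation.w = q[3]
# Position, as in my first snippet.
target_pose.position.x = 0.0
target_pose.position.y = 0.0
target_pose.position.z = 0.55
# This single pose target carries both; e.g. with moveit_commander:
#   move_group.set_pose_target(target_pose)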
Thank you in advance,
Felix