
Is this tf frequency between `/base_link` and `/laser` normal? It is 10000 Hz!
↧
Tf frequency is too large
↧
tf transformations Euler angles and rotation matrix not unique
I have written a little test to learn the inner workings of tf.transformations. I make a rotation matrix from Euler angles (euler_matrix) and then extract the angles from this matrix (euler_from_matrix); many times the result is not equal to the initial input. What am I doing wrong?
Here is my code:
import math
import numpy as np
from tf.transformations import euler_matrix, euler_from_matrix

# Sample two random Euler angle triples in [0, pi)
euler1 = (np.random.rand(1, 3) * math.pi).tolist()[0]
R1 = euler_matrix(*euler1)
euler2 = (np.random.rand(1, 3) * math.pi).tolist()[0]
R2 = euler_matrix(*euler2)
# Recover the angles from the rotation matrices
eu1 = euler_from_matrix(R1)
eu2 = euler_from_matrix(R2)
print np.isclose(eu1, euler1)
print np.isclose(eu2, euler2)
print np.isclose(np.tan(eu1), np.tan(euler1))
print np.isclose(np.tan(eu2), np.tan(euler2))
↧
↧
tf transformations Euler angles and rotation matrix not unique
I have written a little test to learn the inner workings of tf.transformations. I make a rotation matrix from Euler angles (euler_matrix) and then extract the angles from this matrix (euler_from_matrix); many times the result is not equal to the initial input. I am sampling Euler angles from the 0–pi range and I also compare tangents (also not equal). What am I doing wrong?
Here is my code:
import math
import numpy as np
from tf.transformations import euler_matrix, euler_from_matrix

# Build a rotation matrix from random angles in [0, pi) and recover them
euler1 = (np.random.rand(1, 3) * math.pi).tolist()[0]
R1 = euler_matrix(*euler1)
eu1 = euler_from_matrix(R1)
print np.isclose(eu1, euler1)
print np.isclose(np.tan(eu1), np.tan(euler1))
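A minimal extra check (assuming euler_matrix's default 'sxyz' axes): the recovered triple often differs from the sampled one, yet both describe the same rotation, so comparing the matrices instead of the angles succeeds.
import math
import numpy as np
from tf.transformations import euler_matrix, euler_from_matrix

euler1 = (np.random.rand(1, 3) * math.pi).tolist()[0]
R1 = euler_matrix(*euler1)
eu1 = euler_from_matrix(R1)
# The angle triples may disagree...
print(np.isclose(eu1, euler1))
# ...but they encode the same rotation matrix
print(np.allclose(R1, euler_matrix(*eu1)))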
↧
robot_localization producing 180° rotation jumps only when statically publishing map->odom
I have configured `world_frame=odom`. When I statically publish the map->odom TF, the output of ekf_localization_node jumps a lot. I've looked all over for the problem, but I am stuck!
Here's an 8 second streamable video of the problem seen in rviz:
https://drive.google.com/file/d/0B6qINUvvDwDfMzdWUG1CWWNlMm8/view?usp=sharing
When I comment out...
...the problem goes away and I see good, stable, filtered odom.
My data bag of the problem
https://drive.google.com/file/d/0B6qINUvvDwDfOGxYdVFvUlFMNzg/view?usp=sharing
My ekf_localization_node.launch
[false, false, false,
 true, true, true,
 false, false, false,
 false, false, false,
 false, false, false]

[true, true, false,
 false, false, false,
 false, false, false,
 false, false, true,
 false, false, false]
My rtabmap rgbd_odometry.launch
My roswtf
dan@ubuntu:~/URC$ roswtf
Loaded plugin tf.tfwtf
No package or stack in context
================================================================================
Static checks summary:
No errors or warnings
================================================================================
Beginning tests of your ROS graph. These may take awhile...
analyzing graph...
... done analyzing graph
running graph rules...
... done running graph rules
running tf checks, this will take a second...
... tf checks complete
Online checks summary:
Found 2 warning(s).
Warnings are things that may be just fine, but are sometimes at fault
WARNING The following node subscriptions are unconnected:
* /ekf_localization_local:
* /set_pose
* /rqt_gui_py_node_8452:
* /statistics
WARNING The following nodes are unexpectedly connected:
* /rgbd_odometry/zed_odom->/rviz_1492655609198267406 (/tf)
* /rgbd_odometry/zed_odom->/roswtf_9931_1492660614427 (/tf)
* /robot_base_to_camera_link->/rviz_1492655609198267406 (/tf_static)
* /map_to_odom->/rviz_1492655609198267406 (/tf_static)
* /camera_link_to_zed_actual_frame->/rviz_1492655609198267406 (/tf_static)
My bad TF (when publishing static map->odom transform)
$ rosrun tf tf_echo robot_base odom
At time 1492662864.848
- Translation: [0.001, -0.017, -0.038]
- Rotation: in Quaternion [0.639, -0.283, -0.638, 0.324]
in RPY (radian) [1.541, 0.684, -1.535]
in RPY (degree) [88.289, 39.178, -87.969]
My good TF (when NOT publishing static map->odom transform)
$ rosrun tf tf_echo robot_base odom
At time 1492661532.036
- Translation: [0.016, -0.004, 0.000]
- Rotation: in Quaternion [-0.000, -0.000, 0.001, 1.000]
in RPY (radian) [-0.000, -0.000, 0.003]
in RPY (degree) [-0.000, -0.000, 0.154]
I'm using Ubuntu 16.04 on a TX1.
Here's my version of robot_localization
commit 3c7521fb109fcd0fe6090ae8ae407b031838e80d
Merge: fe2f10e cd86690
Author: Tom Moore
Date: Mon Mar 27 19:45:55 2017 +0100
Merge pull request #355 from clearpathrobotics/fix_gains_defaults
Fix acceleration and deceleration gains default logic
I appreciate any help! Thanks!
Fixed typo: my problem transform IS map -> odom
↧
Access tf Vector3 values
Hello,
I would like to get the z position value of my robot in Gazebo. My robot uses a skid-steering plugin that publishes on the "/player0/odom" topic and on the "/tf" topic, with "player0/frame" as its frame id.
I do know how to get the x and y values from the odom topic, but to get the z value it seems I have to use tf.
However, the tf data confuses me.
If I echo the /tf topic, I get:
transforms:
  header:
    seq: 0
    stamp:
      secs: 8
      nsecs: 402000000
    frame_id: player0/frame
  child_frame_id: root
  transform:
    translation:
      x: -2.00000384203
      y: -0.000426774126042
      z: 0.400000377264
    rotation:
      x: -4.57302626635e-07
      y: 9.54536038539e-08
      z: -0.00161574482528
      w: 0.999998694683
How can I get the z: 0.4... value and store it in a variable?
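Here is as far as I got with a tf listener (frame names taken from the echo above; I am not sure the parent/child order is right):
#!/usr/bin/env python
import rospy
import tf

rospy.init_node('get_z')
listener = tf.TransformListener()
# Wait until the transform between the two frames is available
listener.waitForTransform('player0/frame', 'root', rospy.Time(0), rospy.Duration(4.0))
(trans, rot) = listener.lookupTransform('player0/frame', 'root', rospy.Time(0))
z = trans[2]   # trans is an (x, y, z) tuple, so index 2 is the z value
print(z)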
↧
↧
utm points drifting on rviz over time
I have a set of gps navsatfix points that are recorded within a visible physical boundary when viewed on a google satellite image. I am using rviz_satellite to view the utm points on the satellite image, but I noticed that my trajectory is drifting south over time. Based on the answer given below regarding rviz float32 precision:
http://answers.ros.org/question/33624/how-to-view-map-regions-with-large-coordinate-values-using-rviz/
I changed my tf transform to:
tf::Transform transform;
transform.setOrigin( tf::Vector3( (utmpoint.easting-easting_ref), (utmpoint.northing-northing_ref), utmpoint.altitude ));
tf::Quaternion q;
q.setRPY(0, 0, 0);
transform.setRotation(q);
br.sendTransform(tf::StampedTransform(transform, msg->header.stamp, frame_fix_origin, frame_robot));
where easting_ref and northing_ref are UTM values captured from my first incoming gps message, so my first tf x,y coordinate now starts at (0,0). In rviz, the Fixed Frame is set to frame_fix_origin and the Robot frame (from the rviz_satellite package) is set to frame_robot, but I still see the drift happening.
If I use another mapping visualization tool called mapviz (which has its own lat/long to local x,y coordinate conversion) and feed it my gps coordinates (using the same map tile URI source), everything is displayed correctly and there is no drift. So I think my raw /fix coordinates are fine. Either I'm doing the transformation wrong when giving it to rviz, or rviz_satellite is not projecting the satellite images correctly, or it's something else.
I guess my question is has anyone used rviz_satellite and been able to plot their utm points without seeing this drift?
↧
Difference in CPU% between static_transform_publisher and tf::TransformBroadcaster?
Hi everyone! When broadcasting tfs, is there a difference between running static_transform_publisher and broadcasting via tf::TransformBroadcaster in terms of computational resources? When running static_transform_publisher directly, the period parameter defines how often the transform is sent, but I don't think something similar is possible when using the sendTransform() method in code.
I have a launch file that uses three static_transform_publishers. I have tried broadcasting them via code instead (roughly as in the sketch below), but when doing this my CPU usage spikes quite noticeably.
For practical reasons I would like to do this in code rather than in the launch file, though if the same behaviour is not achievable I would of course settle for the latter.
PS. I am using rospy.
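Here is roughly what my in-code version looks like (just a sketch; the frame names and transform values are placeholders), broadcasting from a loop at a fixed rate, similar to the period argument of static_transform_publisher:
#!/usr/bin/env python
import rospy
import tf

rospy.init_node('my_static_broadcaster')
br = tf.TransformBroadcaster()
rate = rospy.Rate(10)  # 10 Hz, i.e. a 100 ms period
while not rospy.is_shutdown():
    br.sendTransform((1.0, 0.0, 0.5),        # translation (placeholder values)
                     (0.0, 0.0, 0.0, 1.0),   # quaternion (x, y, z, w)
                     rospy.Time.now(),
                     'child_frame',
                     'parent_frame')
    rate.sleep()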
Thank you very much in advance!
Diego
↧
tf2 reversing coordinate frame tree
The [tf2 documentation](http://wiki.ros.org/tf2) suggests that the implementation somewhat breaks down in cases where coordinate frames require re-parenting. Is there a reasonable solution for completely reversing the child-parent tree?
A simple example: a serial manipulator has 3 joints, 'joint1', 'joint2', 'joint3', and the tf2 tree is configured to reflect this (joint1 --> joint2 --> joint3). At any point, I'd like to reverse the configuration such that the 'joint3' frame is now effectively treated as the 'joint1' frame (i.e. joint3 --> joint2 --> joint1).
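To clarify what I mean: I know a tf2 lookup can already be asked in the reverse direction without re-parenting anything, roughly as in the sketch below (joint names from the example above); what I am asking about is actually re-rooting the published tree itself.
#!/usr/bin/env python
import rospy
import tf2_ros

rospy.init_node('reverse_lookup')
buf = tf2_ros.Buffer()
listener = tf2_ros.TransformListener(buf)
# Pose of joint1 expressed in the joint3 frame (the "reversed" chain),
# computed by tf2 even though joint1 is the parent in the published tree
t = buf.lookup_transform('joint3', 'joint1', rospy.Time(0), rospy.Duration(4.0))
print(t.transform)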
↧
ros::moveit adding rotation to xyz-coordinates
Hello people,
I am currently programming a 6 DOF robotic arm using ROS and MoveIt for the navigation. My goal is to send x,y,z coordinates as well as a fixed orientation of my gripper in order to successfully grab certain objects.
Following the tutorial, I can send x,y,z coordinates like this just fine:
geometry_msgs::Pose target_pose1;
target_pose1.orientation.w = 1;
target_pose1.position.x = 0;
target_pose1.position.y = 0;
target_pose1.position.z = 0.55;
move_group.setPoseTarget(target_pose1);
Using this method, my end effector stays in its start orientation.
I can also send an end-effector orientation like this:
move_group.setRPYTarget(1,1,0);
However, now the orientation changes to the given values, but the x,y,z coordinates seem rather random.
Using both commands in a row, the second one overwrites the first, leading to an unsatisfactory result.
Right now I am stuck messing around with quaternion transformations using the tf package, not knowing whether this is the right way at all. It seems like this package is unable to combine both position and orientation as well.
Does somebody know a way to send x,y,z-coordinates AND rpy-orientations (c++)?
Thank you in advance,
Felix
↧
↧
Does the robot_state_publisher support sensor_msgs/MultiDOFJointState messages?
Hi,
I found the message definition sensor_msgs/MultiDOFJointState and was expecting robot_state_publisher to also support it:
https://github.com/ros/common_msgs/blob/jade-devel/sensor_msgs/msg/MultiDOFJointState.msg
What I had in mind was that the node would populate TF with the current transforms taking the respective frame_ids from the URDF. When searching the repository, however, I could not find any usage of this message definition:
https://github.com/ros/robot_state_publisher/blob/kinetic-devel/include/robot_state_publisher/joint_state_listener.h
Do I see this correctly that robot_state_publisher does not support this message definition?
If it is indeed not supported,
1. is there an established good practice for using multi-DOF joints with robot_state_publisher? (The workaround I currently have in mind is sketched after this list.)
2. is there a good reason that speaks against adding support for sensor_msgs/MultiDOFJointState to robot_state_publisher?
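For context, the workaround I currently have in mind (a sketch only; joint and frame names are placeholders) is to keep robot_state_publisher for the single-DOF joints and broadcast the multi-DOF (e.g. floating) joint's transform myself:
#!/usr/bin/env python
import rospy
import tf

rospy.init_node('floating_joint_broadcaster')
br = tf.TransformBroadcaster()
rate = rospy.Rate(30)
while not rospy.is_shutdown():
    # Pose of the child link relative to the parent link of the multi-DOF
    # joint; in a real setup this would come from the corresponding
    # MultiDOFJointState (or odometry) source instead of being constant
    br.sendTransform((0.0, 0.0, 0.0),
                     (0.0, 0.0, 0.0, 1.0),
                     rospy.Time.now(),
                     'floating_child_link',
                     'parent_link')
    rate.sleep()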
Best,
Georg.
↧
What is the status of the rosjava client library?
What is the usability of rosjava? It seems to still be in development and has been updated to kinetic, but much of the documentation and discussion around it seems to be a couple of years old. Would anyone recommend it for projects that don't involve android development? Is there a substantial performance impact from using rosjava? Also does it support tf2?
↧
ROS_NAMESPACE , tf , turtlebot_bringup and turtlebot_navigation
Hi All,
I'm looking to provide a namespace for my turtlebot.
In my .bashrc file, I've set:
`export TURTLEBOT_NAME=Dave` and
`export ROS_NAMESPACE=$TURTLEBOT_NAME`
After launching `turtlebot_bringup minimal.launch` and `turtlebot_navigation amcl_demo.launch`, most of the topics are in the `/Dave` namespace; however, the following are either not namespaced or have the wrong namespace (`dave` instead of `Dave`):
/dave/incompatible_rapp_list
/dave/rapp_list
/dave/status
/diagnostics
/diagnostics_agg
/diagnostics_toplevel_state
/rosout
/rosout_agg
/scan
/tf
/tf_static
Most importantly, I'm looking to solve the `/tf` and `/tf_static` case, because those are essential for actual navigation. I was looking at [this](http://answers.ros.org/question/41433/multiple-robots-simulation-and-navigation/?answer=41434#post-id-41434) ROS Answers post but couldn't solve my problem.
Any thoughts?
Thanks :)
↧
editing leg_detector for relative position to turtlebot
hey guys,
Although it seems to be pretty quiet on answers.ros.org, I will try to get some help anyway. Maybe I can fix the problem myself soon and post the answer, which might help other people.
I'm still trying to implement person following on my TurtleBot 2 with ROS Indigo. I'm now getting the position data of persons through the leg_detector package and I can use it to generate motor commands for my TurtleBot.
My problem right now is that the position of the human seems to be in a "global frame" and doesn't change as my TurtleBot moves towards them. My goal is to just get the position of the human relative to the TurtleBot, so I can easily calculate my motor commands (with simple P-controllers for angle and linear distance).
I'm not really into transformations right now (sadly), so I just followed this tutorial: https://github.com/jstnhuang/cse481c_tutorials/wiki/How-to-run-the-leg-detector-on-the-Turtlebot
I'm pretty sure I need to edit the leg_detector package so it doesn't transform the relative position of the legs to a global position, but I really don't know where to start :/ I'm also more focused on getting into Python, so the C++ syntax is quite irritating for me. Maybe I don't even need the Kalman filter mentioned in the tutorial.
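For what it's worth, here is a rough sketch of what I think I need, in Python rather than by editing the C++ (the topic and message names are my guesses based on the tutorial, and 'base_link' is a placeholder for the robot frame): subscribe to the detections and use tf to express each position in the robot frame instead of the fixed frame.
#!/usr/bin/env python
import rospy
import tf
from geometry_msgs.msg import PointStamped
from people_msgs.msg import PositionMeasurementArray

def callback(msg):
    if not msg.people:
        return
    p = PointStamped()
    p.header.frame_id = msg.people[0].header.frame_id
    p.header.stamp = rospy.Time(0)   # use the latest available transform
    p.point = msg.people[0].pos
    # Position of the person relative to the robot, ready for simple
    # P-controllers on angle and distance
    rel = listener.transformPoint('base_link', p)
    rospy.loginfo('person at x=%.2f y=%.2f in base_link', rel.point.x, rel.point.y)

rospy.init_node('person_relative')
listener = tf.TransformListener()
rospy.Subscriber('people_tracker_measurements', PositionMeasurementArray, callback)
rospy.spin()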
Help is much appreciated!
best regards, stoevi
↧
↧
Is there any equivalent method in python for pcl_ros::transformPointCloud() and if not, how to efficiently implement it?
I am working with the python-pcl library, and there isn't any equivalent function to transform a point cloud from one frame to another and output in the format of PointCloud2.
Are there any alternatives to this in python/rospy? Or I will have to implement one myself?
I was considering using lookupTransform() from tf to obtain the translation and rotation values and create the transformation myself, roughly as in the sketch below. I'm not sure whether that would be an efficient way to do it.
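A rough sketch of that manual approach (frame names are placeholders and error handling is omitted):
import numpy as np
import rospy
import tf
from tf import transformations
import sensor_msgs.point_cloud2 as pc2

listener = tf.TransformListener()
# ... after waiting for the transform to become available ...
(trans, rot) = listener.lookupTransform('target_frame', 'source_frame', rospy.Time(0))
T = transformations.quaternion_matrix(rot)   # 4x4 homogeneous rotation
T[0:3, 3] = trans                            # insert the translation part

def transform_cloud(cloud_msg):
    # Read the xyz points, apply the matrix, rebuild a PointCloud2
    pts = np.array(list(pc2.read_points(cloud_msg, field_names=('x', 'y', 'z'), skip_nans=True)))
    pts_h = np.hstack([pts, np.ones((pts.shape[0], 1))])
    out = T.dot(pts_h.T).T[:, 0:3]
    header = cloud_msg.header
    header.frame_id = 'target_frame'
    return pc2.create_cloud_xyz32(header, out)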
Surprised that there isn't much work for PCL+ROS in Python even now.
↧
robot_localization map to odom transform seems to be incorrect
I'm using two robot_localization nodes, one for continuous data (wheels' odom + imu) and the other including gps for pose correction.
For instance, ekf_localization and gps_ekf_localization give the exact same output when seen from the odom frame, which is normal. But when looking at their output in the map frame I am supposed to see a drift between them; instead they are still the same, and the whole trajectory has the same shape with a slight deformation, rotated by a significant angle and also translated. This is an example of the published transforms:
transform:
  translation:
    x: 1.62237404993
    y: -1.66250429176
    z: 0.0
  rotation:
    x: 0.0
    y: 0.0
    z: 0.864963620982
    w: 0.501834568736
This made me think that the map to odom transform is wrong for some reason. It also made me question whether the gps data is really used for pose correction: with the same configuration, I eliminated the gps data and still got the same output.
The weird thing is that everything seems to work sometimes: the robot maintains its trajectory and seems to be very aware of its position and orientation. But with a more complex trajectory, and in some different areas, I'm getting the same behavior as the one described in this [paper](http://docs.ros.org/indigo/api/robot_localization/html/_downloads/robot_localization_ias13_revised.pdf) when fusing only odom and imu.
The launch file I'm using is very similar to the one provided as an answer to the following [question](http://answers.ros.org/question/215904/robot_localization-failed-to-transform-warning/?comment=252396#post-id-252396).
Does this have something to do with the imu parameters?
I've been reading the documentation and almost every published issue and question about the package, but I still have many questions bothering me, since I'm new to all this magic.
I will try to provide some bag files as soon as possible. But hopefully this little description will make you able to give me some points to check or to correct.
Thank you in advance.
EDIT in response to Tom's comment:
This is the launch file I'm using:
[false, false, false,
 false, false, false,
 true, true, false,
 false, false, false,
 false, false, false]

[false, false, false,
 true, true, false,
 false, false, false,
 false, false, true,
 false, false, false]

[false, false, false,
 false, false, false,
 true, true, false,
 false, false, false,
 false, false, false]

[true, true, false,
 false, false, false,
 false, false, false,
 false, false, false,
 false, false, false]

[false, false, false,
 true, true, false,
 false, false, false,
 false, false, true,
 false, false, false]
This [bag file](https://drive.google.com/file/d/0B79wfrneBrniUmVPY3hQaFp1NWM/view?usp=sharing) contains some samples of the input data.
↧
rospy: how to set a transform on a TransformerROS?
After initializing a node, I declare a TransformerROS:
transformer = tf.TransformerROS(True,rospy.Duration(10.0))
but if I want to perform a transform between two frames (in my case, I use a turtlebot and I want to transform between odom and base_link), this does not work:
transformed_point = transformer.transformPoint("odom",point)
with error:
[...]
File "/opt/ros/indigo/lib/python2.7/dist-packages/tf/listener.py", line 71, in asMatrix
translation,rotation = self.lookupTransform(target_frame, hdr.frame_id, hdr.stamp)
[...]
LookupException: "odom" passed to lookupTransform argument target_frame does not exist
*note*: I have the turtlebot bringup nodes running, and rostopic tells me odom is published at 50 Hz. The odom frame also shows up in rviz.
Also, I notice :
transformer.getFrameStrings()
prints []
What am I missing?
Example from here: http://docs.ros.org/jade/api/tf/html/python/tf_python.html
seems to imply I should set the transform on the transformer first, but according to the error I get, the transformer seems to be searching for the transform itself ("... self.lookupTransform(target_frame, hdr.frame_id, hdr.stamp) ...").
edit: I tried with an instance of TransformListener rather than TransformerROS, with the same result.
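For reference, this is the pattern I am now trying with TransformListener, which (as I understand it) fills its buffer from /tf by itself, whereas a bare TransformerROS only knows about transforms passed to it via setTransform(). The point values and frames below are just an example:
import rospy
import tf
from geometry_msgs.msg import PointStamped

rospy.init_node('transform_test')
listener = tf.TransformListener()

point = PointStamped()
point.header.frame_id = 'base_link'
point.header.stamp = rospy.Time(0)   # ask for the latest available transform
point.point.x = 1.0

# Make sure odom <- base_link is actually in the buffer before transforming
listener.waitForTransform('odom', 'base_link', rospy.Time(0), rospy.Duration(5.0))
transformed_point = listener.transformPoint('odom', point)
print(transformed_point)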
↧
Setting initialpose for AMCL using difference between initial TFs
I'm working on bag files containing odometry and laser scan data. I have a launch file that runs simultaneously the bag, AMCL and a map node. The TF from odom to base_link is different in each bag.
When I run AMCL with the initial pose not set (default values), it takes a long time to estimate the robot's position on the map, and sometimes it doesn't even find it properly.
I'm trying to initialize AMCL using the first odom -> base_link transform value in the bag file, so that the laser scan matches the map with the highest probability. Calling the /global_localization service helps, but not much, and it also takes too much time. The bags are only 30 seconds long.
Question: What is the best way to pass these transform values as the initial pose for AMCL? Should I write a publisher that reads the first tf in the bag and then publishes on the /initialpose topic, or is there an easier way to do it (params, services, bash files, extracting data from the bag to csv and reading it somehow, etc.)?
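In case it helps to be concrete, this is roughly what that publisher would look like (a sketch only; the bag path and frame names are placeholders, and it assumes map and odom roughly coincide at the start of the bag):
#!/usr/bin/env python
import rospy
import rosbag
from geometry_msgs.msg import PoseWithCovarianceStamped

def first_odom_tf(bag_path):
    # Return the first odom -> base_link transform found on /tf in the bag
    with rosbag.Bag(bag_path) as bag:
        for _, msg, _ in bag.read_messages(topics=['/tf']):
            for t in msg.transforms:
                if t.header.frame_id == 'odom' and t.child_frame_id == 'base_link':
                    return t
    return None

rospy.init_node('set_initialpose')
t = first_odom_tf('my_recording.bag')
assert t is not None, 'no odom -> base_link transform found in the bag'

pose = PoseWithCovarianceStamped()
pose.header.frame_id = 'map'
pose.header.stamp = rospy.Time.now()
pose.pose.pose.position.x = t.transform.translation.x
pose.pose.pose.position.y = t.transform.translation.y
pose.pose.pose.orientation = t.transform.rotation

pub = rospy.Publisher('/initialpose', PoseWithCovarianceStamped, queue_size=1, latch=True)
pub.publish(pose)
rospy.sleep(1.0)   # keep the node alive long enough for amcl to receive the latched message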
↧
↧
How to resolve waitForTransform timeout exception regardless of Duration value in ROS Kinetic (python)?
I have gone through similar questions on this website, but none have helped resolve the problem for me.
listener.waitForTransform("world", "camera_link", rospy.Time(0), rospy.Duration(4.0))
- My TF tree is well defined and shows the connection between the two frames (world and a camera frame). It's World -> Link 1 -> Link 2 -> Camera Link; I check this based on rqt-graph.
- If I replace "world" above with "Link 2" I don't get the timeout exception, which is interesting.
- As I mentioned, changing Duration doesn't help.
- The `use_sim_time` parameter is also defined and set to True.
Not sure how to proceed on this at all.
Something which might be important -
I am working through VMware. Due to an issue with VMware tools, I had to run the `sudo apt-get autoremove` command, which unfortunately ended up removing some of the ROS packages. So I am thinking that might be related to this problem, but I can't be sure. I have tried reinstalling the tf, tf2, and geometry packages so far, because some posts suggested the problem might be related to those. It didn't help.
Any idea on how to debug/solve this? I can provide more information if required.
***Update:***
Just after posting this, I tried to run `tf_echo` for those two frames, and I get this error:
`Exception thrown: Could not find a connection between 'camera_link' and 'world' because they are not part of the same tree. Tf has two or more unconnected trees.`
Which is weird and interesting because this seems to contradict the first point I have above.
↧
Setting rviz topic (publish)
I'm trying to view a .pcd file with rviz.
I wrote a little script to read the file and then publish it. (Yes, I included way too many headers.)
// Headers needed for reading the PCD file and publishing it to ROS
#include <iostream>
#include <string>
#include <ros/ros.h>
#include <sensor_msgs/PointCloud2.h>
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <pcl_conversions/pcl_conversions.h>
pcl::PointCloud<pcl::PointXYZ>::Ptr loadPCDfile() // Load a PCD file
{
  // Ask for the file path
  std::string pcdpath;
  std::cout << "Please enter path+name.pcd: ";
  // Read the input
  std::getline(std::cin, pcdpath);
  // Check whether the input is a valid file (http://pointclouds.org/documentation/tutorials/reading_pcd.php)
  pcl::PointCloud<pcl::PointXYZ>::Ptr pcdcloud(new pcl::PointCloud<pcl::PointXYZ>);
  if (pcl::io::loadPCDFile<pcl::PointXYZ>(pcdpath, *pcdcloud) == -1) //* load the file
  {
    std::cout << "No pcd file found at: " << pcdpath << std::endl;
  }
  return pcdcloud;
}
int main(int argc, char **argv) {
  pcl::PointCloud<pcl::PointXYZ>::Ptr pcdcloud = loadPCDfile(); // Load our PCD file
  /*for (size_t i = 0; i < pcdcloud->points.size (); ++i)
    std::cout << " " << pcdcloud->points[i].x
              << " " << pcdcloud->points[i].y
              << " " << pcdcloud->points[i].z << std::endl;
  */
  // Initialize ROS
  ros::init(argc, argv, "pcdcloud");
  ros::NodeHandle nh;
  // Create a ROS publisher for the output point cloud
  ros::Publisher pubrviz = nh.advertise<sensor_msgs::PointCloud2>("pcdcloud", 1);
  // Convert the PCL cloud to a ROS message and stamp it with the frame
  // used by the static_transform_publisher below (my_frame)
  sensor_msgs::PointCloud2 cloud_msg;
  pcl::toROSMsg(*pcdcloud, cloud_msg);
  cloud_msg.header.frame_id = "my_frame";
  // Publish repeatedly so rviz still receives the cloud if it subscribes late
  ros::Rate loop_rate(10);
  while (ros::ok()) {
    cloud_msg.header.stamp = ros::Time::now();
    pubrviz.publish(cloud_msg);
    loop_rate.sleep();
  }
  return 0;
}
The reading part should be fine, since I get a lot of points printed when I don't comment out the for loop.
When I build it, I get no errors.
But I can't get the rviz GUI to show the cloud.
I used `rosrun tf static_transform_publisher 0 0 0 0 0 0 1 map my_frame 10` to set up the tf frame, following [this answer](http://answers.ros.org/question/195962/rviz-fixed-frame-world-does-not-exist/?answer=232726#post-id-232726).
Probably I'm doing something wrong with the topic.

↧
tf Python API transform points
Hi
The tf documentation mentions that there are methods to convert points from one frame to another. I am currently using the Python tf API.
I have a list of [x,y,z] points in the sensor frame and I want to transform these points into the map frame.
calling this
(tran, rot) = self.tf_listener.lookupTransform('/body', '/map', rospy.Time(0.0))
gives a translation vector and a quaternion for the rotation. I am not really familiar with quaternions... but I heard they are hard...
Anyway, is there a method in the Python API to convert my vector of points to another frame?
I noticed the C++ tf API has a function called transformPoint. Is this what I need?
My other idea is to use the transformations.py module to convert the quaternion and the translation vector into a homogeneous transformation matrix and transform the vector of points manually: http://wiki.ros.org/geometry/RotationMethods#transformations.py
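This is roughly what I have in mind for that idea (a sketch; point_list is a placeholder for my list of [x, y, z] points, and I believe lookupTransform takes (target, source), so for sensor-to-map I would look up ('/map', '/body')):
import numpy as np
import rospy
import tf
from tf import transformations

rospy.init_node('points_to_map')
listener = tf.TransformListener()
listener.waitForTransform('/map', '/body', rospy.Time(0), rospy.Duration(4.0))
(tran, rot) = listener.lookupTransform('/map', '/body', rospy.Time(0))

T = transformations.quaternion_matrix(rot)   # 4x4 homogeneous rotation matrix
T[0:3, 3] = tran                             # insert the translation part

point_list = [[1.0, 0.0, 0.0]]                              # placeholder sensor-frame points
points = np.array(point_list)                               # N x 3, in the body frame
points_h = np.hstack([points, np.ones((len(points), 1))])   # N x 4 homogeneous
points_map = T.dot(points_h.T).T[:, 0:3]                    # N x 3, in the map frame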
↧