ROS Answers: Open Source Q&A Forum

How to set pose goal negative to the origin

I use the following code to transform 2D points given relative to the origin (/robot) into the world coordinate system:

    tf::Pose pose;
    pose.setOrigin(tf::Vector3(x, y, 0));
    pose.setRotation(tf::createQuaternionFromYaw(atan2(y, x)));
    lst.lookupTransform("/world", "/robot", ros::Time(0), transform);
    pose = transform * pose;

This works fine when I set goals in front of the robot or to its left (x, y > 0), but not for points behind the robot or to its right (i.e. x or y negative). How can I do the transformation correctly, without first having to rotate so that the pose has positive x and y?
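For comparison, here is a minimal Python sketch of the same point transform; it works for any sign of x and y. The /world and /robot frame names are taken from the question, everything else is standard rospy/tf usage:

    #!/usr/bin/env python
    import rospy
    import tf
    from geometry_msgs.msg import PointStamped

    rospy.init_node("goal_transformer")
    listener = tf.TransformListener()
    listener.waitForTransform("/world", "/robot", rospy.Time(0), rospy.Duration(4.0))

    p = PointStamped()
    p.header.frame_id = "/robot"
    p.header.stamp = rospy.Time(0)      # use the latest available transform
    p.point.x, p.point.y = -1.0, -0.5   # a point behind and to the right of the robot

    world_point = listener.transformPoint("/world", p)
    print(world_point.point)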

Recode in Python

I'm having trouble with this code because I only know Python. Could anyone help me rewrite it in Python? I'd be very grateful!

    #include <ros/ros.h>
    #include <geometry_msgs/PointStamped.h>
    #include <tf/transform_listener.h>

    void transformPoint(const tf::TransformListener& listener){
      //we'll create a point in the base_laser frame that we'd like to transform to the base_link frame
      geometry_msgs::PointStamped laser_point;
      laser_point.header.frame_id = "base_laser";

      //we'll just use the most recent transform available for our simple example
      laser_point.header.stamp = ros::Time();

      //just an arbitrary point in space
      laser_point.point.x = 1.0;
      laser_point.point.y = 0.2;
      laser_point.point.z = 0.0;

      try{
        geometry_msgs::PointStamped base_point;
        listener.transformPoint("base_link", laser_point, base_point);

        ROS_INFO("base_laser: (%.2f, %.2f. %.2f) -----> base_link: (%.2f, %.2f, %.2f) at time %.2f",
            laser_point.point.x, laser_point.point.y, laser_point.point.z,
            base_point.point.x, base_point.point.y, base_point.point.z, base_point.header.stamp.toSec());
      }
      catch(tf::TransformException& ex){
        ROS_ERROR("Received an exception trying to transform a point from \"base_laser\" to \"base_link\": %s", ex.what());
      }
    }

    int main(int argc, char** argv){
      ros::init(argc, argv, "robot_tf_listener");
      ros::NodeHandle n;

      tf::TransformListener listener(ros::Duration(10));

      //we'll transform a point once every second
      ros::Timer timer = n.createTimer(ros::Duration(1.0), boost::bind(&transformPoint, boost::ref(listener)));

      ros::spin();
    }
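For what it's worth, a line-for-line Python translation might look like the sketch below. It keeps the node, frame, and topic names from the C++ version; treat it as a starting point rather than a verified drop-in:

    #!/usr/bin/env python
    import rospy
    import tf
    from geometry_msgs.msg import PointStamped

    def transform_point(listener):
        # a point in the base_laser frame that we'd like to transform to base_link
        laser_point = PointStamped()
        laser_point.header.frame_id = "base_laser"
        laser_point.header.stamp = rospy.Time(0)  # most recent available transform
        laser_point.point.x = 1.0
        laser_point.point.y = 0.2
        laser_point.point.z = 0.0
        try:
            base_point = listener.transformPoint("base_link", laser_point)
            rospy.loginfo("base_laser: (%.2f, %.2f, %.2f) -----> base_link: (%.2f, %.2f, %.2f) at time %.2f",
                          laser_point.point.x, laser_point.point.y, laser_point.point.z,
                          base_point.point.x, base_point.point.y, base_point.point.z,
                          base_point.header.stamp.to_sec())
        except tf.Exception as ex:
            rospy.logerr('Received an exception trying to transform a point from "base_laser" to "base_link": %s', ex)

    if __name__ == "__main__":
        rospy.init_node("robot_tf_listener")
        listener = tf.TransformListener()
        # transform a point once every second
        rospy.Timer(rospy.Duration(1.0), lambda event: transform_point(listener))
        rospy.spin()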

Identity of message publisher

How can I find out which node published a given message, especially on the /tf topic, where many nodes publish different transforms?

TF_NAN_INPUT errors from use of ekf_localization_node

UPDATE: I have verified that simply setting odom0_differential to false also solves the problem. It is in fact probably not a problem with the ARM system I'm using, since I eventually got the same error on an x86_64 laptop that I mounted on the robot. It seems to work fine for a while, and then the error shows up continuously. Is there perhaps some sort of caching that keeps erroneous values and is reset by changing the settings in the launch file? ORIGINAL QUESTION: Whenever I use ekf_localization_node in my project, I get this error:
Error:   TF_NAN_INPUT: Ignoring transform for child_frame_id "base_footprint" from authority "unknown_publisher" because of a nan value in the transform (-nan -nan -nan) (-nan -nan -nan -nan)
         at line 240 in /tmp/buildd/ros-indigo-tf2-0.5.11-0trusty-20150522-1835/src/buffer_core.cpp
I don't get the error when I set all values of the node's (currently only) input config to false, but as soon as I set (e.g.) the x entry to true, it shows up again. I am using the wheel odometry of a Kobuki for this input. Here is a sample odometry message:
header: 
  seq: 20227
  stamp: 
    secs: 1435652980
    nsecs: 60858344
  frame_id: odom
child_frame_id: base_footprint
pose: 
  pose: 
    position: 
      x: 0.0
      y: 0.0
      z: 0.0
    orientation: 
      x: 0.0
      y: 0.0
      z: 0.0
      w: 1.0
  covariance: [0.1, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.1, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.7976931348623157e+308, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.7976931348623157e+308, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.7976931348623157e+308, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.05]
twist: 
  twist: 
    linear: 
      x: 0.0
      y: 0.0
      z: 0.0
    angular: 
      x: 0.0
      y: 0.0
      z: 0.0
  covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
---
This is the launch file I am currently trying to use for ekf_localization_node: http://pastebin.com/SPrXDD53
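As a debugging aid (not a fix), here is a small Python sketch that watches the odometry input and warns about NaNs before they reach the filter; the /odom topic name is an assumption:

    #!/usr/bin/env python
    import math
    import rospy
    from nav_msgs.msg import Odometry

    def cb(msg):
        p = msg.pose.pose.position
        q = msg.pose.pose.orientation
        values = [p.x, p.y, p.z, q.x, q.y, q.z, q.w]
        if any(math.isnan(v) for v in values):
            rospy.logwarn("NaN in odometry pose at seq %d", msg.header.seq)

    rospy.init_node("odom_nan_watchdog")
    rospy.Subscriber("/odom", Odometry, cb)
    rospy.spin()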

Couldn't find executable named tf_brodcaster below

Hi all, I am new to ROS. My question is similar to a [previously asked question](http://answers.ros.org/question/211427/stackpackage-robot_setup_tf-not-found/). I am following the tutorial http://wiki.ros.org/navigation/Tutorials/RobotSetup/TF and I am stuck at the robot_setup_tf part, where I run the command:

    rosrun robot_setup_tf tf_broadcaster

Initially I got:

    package robot_setup_tf not found

I tried sourcing the `setup.bash` file in my `catkin_ws/devel` (from the first link):

    source /path/to/your/catkin_ws/devel/setup.bash

That solves the previous error; however, now I get a new one:

    Couldn't find executable named tf_brodcaster below /home/user1/catkin_ws/src/robot_setup_tf

I am running ROS Indigo. I would appreciate any assistance. Thanks a lot in advance!

Rosserial_Arduino Time and TF Tutorial: Buffer Underfill Error

I am attempting [tutorial 4](http://wiki.ros.org/rosserial_arduino/Tutorials/Time%20and%20TF) of the [rosserial Arduino tutorials](http://wiki.ros.org/rosserial_arduino/Tutorials) and I keep getting an error message. I am running ROS Jade on my Ubuntu computer and have my Arduino Uno connected. Whenever I run the rosserial client application to connect to the Arduino:

    rosrun rosserial_python serial_node.py /dev/ttyACM0

I get this error message:

    [INFO] [WallTime: 1440542068.506962] ROS Serial Python Node
    [INFO] [WallTime: 1440542068.514658] Connecting to /dev/ttyACM0 at 57600 baud
    [INFO] [WallTime: 1440542071.163000] Note: publish buffer size is 280 bytes
    [INFO] [WallTime: 1440542071.163488] Setup publisher on tf [tf/tfMessage]
    Traceback (most recent call last):
      File "/opt/ros/jade/lib/rosserial_python/serial_node.py", line 82, in <module>
        client.run()
      File "/opt/ros/jade/lib/python2.7/dist-packages/rosserial_python/SerialClient.py", line 482, in run
        self.callbacks[topic_id](msg)
      File "/opt/ros/jade/lib/python2.7/dist-packages/rosserial_python/SerialClient.py", line 105, in handlePacket
        m.deserialize(data)
      File "/opt/ros/jade/lib/python2.7/dist-packages/tf/msg/_tfMessage.py", line 200, in deserialize
        raise genpy.DeserializationError(e) #most likely buffer underfill
    genpy.message.DeserializationError: unpack requires a string argument of length 4

I've looked around for answers, and all I've found is how to increase the buffer size in case of overflow; this, however, is listed as a possible buffer underfill. I've also hit the same error when attempting [tutorial 9](http://wiki.ros.org/rosserial_arduino/Tutorials/IR%20Ranger). Please forgive me if this is an easy fix; I still consider myself fairly new to ROS, but any help that can be given would be greatly appreciated.

transformed coordinates negated and in wrong order

I'm trying to get the example code here: http://wiki.ros.org/image_geometry/Tutorials/ProjectTfFrameToImage to project object positions from Gazebo into an image frame. To get it to work correctly, I had to introduce a horrible hack, negating and rearranging the order of the coordinates on line 63:

    - cv::Point3d pt_cv(pt.x(), pt.y(), pt.z());
    + cv::Point3d pt_cv(-pt.y(), -pt.z(), pt.x());

Obviously, this is less than satisfactory, and I'm sure I'm doing something wrong upstream. Any suggestions as to how I might identify the source of the problem would be very much appreciated. Cheers!
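An observation, offered as a guess rather than a diagnosis: the permutation (x, y, z) -> (-y, -z, x) is exactly the rotation between a ROS body frame (x forward, y left, z up) and a camera optical frame (z forward, x right, y down), which suggests the transform is being looked up into the camera's body frame instead of its optical frame. A Python sketch of the idea, with the hypothetical frame names target_object and camera_rgb_optical_frame standing in for whatever the setup publishes:

    import rospy
    import tf
    from geometry_msgs.msg import PointStamped

    rospy.init_node("project_point")
    listener = tf.TransformListener()

    p = PointStamped()
    p.header.frame_id = "target_object"  # hypothetical frame of the object to project
    p.header.stamp = rospy.Time(0)

    # Transforming into the *optical* frame yields z-forward coordinates,
    # which is what the pinhole projection in the tutorial expects.
    listener.waitForTransform("camera_rgb_optical_frame", p.header.frame_id,
                              rospy.Time(0), rospy.Duration(4.0))
    pt_optical = listener.transformPoint("camera_rgb_optical_frame", p)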

Publishing tf with kinect and watch it in rviz

Hi. I am trying to publish a tf for a Kinect, but I cannot visualize it with rviz. I have written a tf_broadcaster similar to the one in the ROS tutorials, but I don't know how to link the last child of the tf tree to the Kinect. This is the code of the last child:

    #include <ros/ros.h>
    #include <tf/transform_broadcaster.h>

    int main(int argc, char** argv){
      ros::init(argc, argv, "robot_tf_publisher_vertical_kinect");
      ros::NodeHandle n;
      ros::Rate r(100);
      tf::TransformBroadcaster broadcaster;
      while(n.ok())
      {
        broadcaster.sendTransform(
          tf::StampedTransform(
            tf::Transform(tf::Quaternion(0, 0, 0, 1), tf::Vector3(0.4, 0, 1.25)),
            ros::Time::now(), "base_link", "camera_link"));
        r.sleep();
      }
    }

And this is the tf view_frames: ![frames](/upfiles/1440668464975876.png) I don't know whether it works, because I don't know how to view the tf and the point cloud from the Kinect in rviz. Can somebody please help me? Thanks!

tf canTransform doesn't update return

I have this code:

    dist.is_calibrated = listener.canTransform('/openni', '/torso_1', rospy.Time(0))
    if dist.is_calibrated:
        (trans, rot) = listener.lookupTransform('/openni', '/torso_1', rospy.Time(0))

Once the canTransform function returns True, it never becomes False again, even if /torso_1 no longer exists. Does anyone know why this happens? Best wishes.
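A plausible explanation, assuming standard tf behaviour: rospy.Time(0) asks for the latest transform in the listener's buffer, and the buffer keeps old transforms around (10 seconds by default), so the check keeps succeeding for a while after the frame stops being published. A sketch that instead tests whether the transform is recent, with a one-second freshness threshold picked arbitrarily:

    import rospy
    import tf

    rospy.init_node("torso_monitor")
    listener = tf.TransformListener()

    def torso_is_fresh(max_age=rospy.Duration(1.0)):
        """True only if /torso_1 has been seen recently, not just ever."""
        try:
            latest = listener.getLatestCommonTime('/openni', '/torso_1')
        except tf.Exception:
            return False
        return (rospy.Time.now() - latest) < max_age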

Unable to move the second robot using nav2d exploration

Hi, I am trying to explore an area autonomously using multiple robots, and I have created a launch file for this. I am able to move the first robot for mapping and exploration, but the second robot does not move: after I enter the command, nothing happens. The first robot moves, but I get response: 0 in the terminal. Can anyone please help me figure out what is wrong with my setup?

tf target frame does not exist

Hi, I'm working with ROS Indigo. I was trying to set up two new transforms in a system that already has many, and found that ROS tells me these two recently added frames don't exist when I ask for them. I decided to try it out in a simpler way, so I've attached the code of two very simple nodes, one that broadcasts and one that looks up the transform, in both C++ and Python versions. Basic code:

C++ broadcaster:

    tf::TransformBroadcaster br;
    tf::Transform transform;
    transform.setOrigin(tf::Vector3(0.5, 0.0, 0.5));
    transform.setRotation(tf::Quaternion(0.0, 0.0, 0.0, 1.0));
    while(1){
        br.sendTransform(tf::StampedTransform(transform, ros::Time::now(), "base_camera", "base_arm"));
        //br.sendTransform(tf::StampedTransform(proximal_tf[i], ros::Time::now(), s1, s2));
        ros::Duration(0.5).sleep();
    }

Python broadcaster:

    tf_br = tf.TransformBroadcaster()
    while (1):
        tf_br.sendTransform((0.5, 0.0, 0.5), (0.0, 0.0, 0.0, 1.0), rospy.Time.now(), "base_camera", "base_arm")
        rospy.sleep(0.5)

C++ listener:

    tf::TransformListener listener;
    tf::StampedTransform transform;
    char s1[50];
    char s2[50];
    sprintf(s1, "base_camera");
    sprintf(s2, "base_arm");
    try{
        listener.lookupTransform(s1, s2, ros::Time(0), transform);
    }catch (tf::TransformException ex){
        ROS_ERROR("%s", ex.what());
    }

Python listener:

    tf_frames = tf.TransformListener()
    #object_in_arm = tf_frames.transformPose("Base_Camera", "Base_Arm", object_in_cam)
    (object_in_arm.pose.position, object_in_arm.pose.orientation) = tf_frames.lookupTransform("base_camera", "base_arm", rospy.Time(0))

This is what I get when executing the C++ version:

    rosrun control_node tf_helper
    [ERROR] [1445342805.139442730]: "base_camera" passed to lookupTransform argument target_frame does not exist.
    [ INFO] [1445342805.139562218]: Traslation: -0.000000/t0.000000/t-0.000000/nOrientation: -0.000000/t-0.000000/t0.506066/t3.165568/n

Similarly in Python:

    rosrun control_node tf_read.py
    Object in camera frame: [0.1, 0.1, 0.1] [0.0, 0.0, 0.0, 1.0]
    Traceback (most recent call last):
      File "/home/alva_da/catkin_src/src/control_node/src/tf_read.py", line 25, in <module>
        (object_in_arm.pose.position, object_in_arm.pose.orientation) = tf_frames.lookupTransform("base_camera", "base_arm", rospy.Time(0))
    tf.LookupException: "Base_Camera" passed to lookupTransform argument target_frame does not exist.

So I thought the frames were not being published for some reason, but I actually get normal results from the tf tools:

    rosrun tf tf_echo "/base_camera" "/base_arm"
    At time 1445343041.156
    - Translation: [-0.500, 0.000, -0.500]
    - Rotation: in Quaternion [0.000, 0.000, 0.000, 1.000]
                in RPY (radian) [0.000, -0.000, 0.000]
                in RPY (degree) [0.000, -0.000, 0.000]

Also, tf_monitor shows:

    rosrun tf tf_monitor "base_camera" "base_arm"
    Waiting for transform chain to become available between base_camera and base_arm
    RESULTS: for base_camera to base_arm
    Chain is:
    Net delay    avg = 0.0223074: max = 0.215266
    Frames:
    All Broadcasters:

Also, when executing view_frames, the two frames are listed as parent and child.
I came back to this; the only new clue I found is that roswtf tells me this:

    WARNING The following node subscriptions are unconnected:
     * /data_analyzer_node:
       * /tf_static
     * /reflex_tf_broadcaster:
       * /object_pose
     * /rviz:
       * /finger1_sensor9_array
       * /finger1_sensor7_array
       * /finger1_sensor8_array
       * /finger1_sensor6_array
       * /finger1_sensor3_array
       * /finger1_sensor5_array
       * /finger1_proximal_pad_array
       * /finger1_sensor2_array
       * /particles_array
       * /object_array
       * /tf_static

Although I'm not sure what this means. Thanks in advance!
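One common cause of exactly this symptom, offered as a guess: the listener is queried immediately after construction, before it has received any /tf messages, so from its point of view the frames "do not exist" even though tf_echo (which waits internally) sees them. A minimal sketch that waits for the transform first, using the frame names from the question:

    #!/usr/bin/env python
    import rospy
    import tf

    rospy.init_node("tf_read")
    listener = tf.TransformListener()

    # Give the listener time to fill its buffer before the first lookup.
    listener.waitForTransform("base_camera", "base_arm", rospy.Time(0), rospy.Duration(4.0))
    (trans, rot) = listener.lookupTransform("base_camera", "base_arm", rospy.Time(0))
    print(trans, rot)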

Get a callback on a frame appearance

Hi everyone. I would like to get a callback when a specific frame appears. I have already tested this method, but it requires a default topic that I don't need:

    point_sub_.subscribe(nh_, "scan", 10);
    // Note: the template argument was lost in the original post; sensor_msgs::LaserScan is an assumption.
    tf_filter_ = new tf::MessageFilter<sensor_msgs::LaserScan>(point_sub_, tf_, "laser", 10);
    tf_filter_->registerCallback(boost::bind(&TeleopRoboteq::msgCallback, this, _1));

Can I subscribe to tf and trigger the message filter like a normal topic (instead of "laser" in my example)?

How to substitute TF2 prefix

Hi, I have seen that tf_prefix support is deprecated in tf2. What is the proper way now of getting the same functionality? That is, I have two robots, robot1 and robot2, and I want the TF frames to carry these names as prefixes on their corresponding `base_link`s (e.g. robot1/base_link). How should I properly do it? EDIT: Also, what is the meaning then of the `tf_prefix` parameter of the `robot_state_publisher` node? Is that the way to proceed with all the other nodes using TF frames (parameters plus manually concatenating in the nodes)? Thank you!

How to publish transform from odom to base_link?

I am running a p3at in Gazebo with ROS plugins. I need to subscribe to the odometry pose and broadcast the tf between odom and base_link. How do I do it? Should I define an "odom" link in the .xacro file used for simulation? If so, what joint type should it have?
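As far as I know, no extra link in the .xacro should be needed for this; a small node can republish the odometry as a transform. A minimal Python sketch, assuming the plugin publishes nav_msgs/Odometry on /odom with base_link as the child frame (topic and frame names are guesses):

    #!/usr/bin/env python
    import rospy
    import tf
    from nav_msgs.msg import Odometry

    def odom_cb(msg):
        p = msg.pose.pose.position
        q = msg.pose.pose.orientation
        # Broadcast odom -> base_link with the pose and stamp from the message.
        br.sendTransform((p.x, p.y, p.z), (q.x, q.y, q.z, q.w),
                         msg.header.stamp, "base_link", "odom")

    rospy.init_node("odom_tf_broadcaster")
    br = tf.TransformBroadcaster()
    rospy.Subscriber("/odom", Odometry, odom_cb)
    rospy.spin()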

Why does a marker still have roll, pitch and yaw when using tf::createQuaternionMsgFromYaw?

Hello. From a downloaded project I have access to a marker in rviz which is defined by a position and an orientation (quaternion). I would like to create a marker at the same position but with only the yaw as orientation. Roll and pitch should be zero, so that even if the original marker has roll and pitch, the second marker only rotates when there is a variation in yaw. This way, the marker I create should only move in a 2D plane. I used the following function to create the quaternion for my marker:

    q = tf::createQuaternionMsgFromYaw(yaw);

where q is a geometry_msgs::Quaternion. Then I pass q.x, q.y, q.z and q.w to my marker:

    tf::Matrix3x3(transform.getRotation()).getRPY(pitch, yaw, roll);
    geometry_msgs::Quaternion q;
    q = tf::createQuaternionMsgFromYaw(yaw);
    oPose_msg.orientation.x = q.x;
    oPose_msg.orientation.y = q.y;
    oPose_msg.orientation.z = q.z;
    oPose_msg.orientation.w = q.w;

However, when observing it in rviz, my marker still tilts in roll and pitch as the original marker does. Why is this, and what could I change to achieve the output I desire? Note: I obtain roll, pitch and yaw from a rotation matrix. These are in this order to be consistent with the designated angles. I have the same problem when using

    q = tf::createQuaternionMsgFromRollPitchYaw(0.0, yaw, 0.0);

instead of

    q = tf::createQuaternionMsgFromYaw(yaw);
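One thing worth double-checking, offered as an observation rather than a confirmed diagnosis: tf's getRPY and createQuaternionMsgFromRollPitchYaw both take their angles in roll, pitch, yaw order, so calls like getRPY(pitch, yaw, roll) and createQuaternionMsgFromRollPitchYaw(0.0, yaw, 0.0) end up shuffling the angles. A Python sketch of extracting yaw and building a yaw-only quaternion with the arguments in the expected order:

    import tf.transformations as tft

    # Hypothetical input: the original marker orientation as an (x, y, z, w) tuple.
    q_in = (0.1, 0.2, 0.3, 0.927)

    roll, pitch, yaw = tft.euler_from_quaternion(q_in)     # returned in roll, pitch, yaw order
    q_yaw_only = tft.quaternion_from_euler(0.0, 0.0, yaw)  # zero roll and pitch, keep yaw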

Use rosparam get in launch file

Hi, I am trying to launch a static_transform_publisher from the tf package. It has the following syntax:

    static_transform_publisher x y z yaw pitch roll frame_id child_frame_id period_in_ms

and it is normally started from a launch file. What I am trying to achieve is to load the parameters `x y z yaw pitch roll` from a calibration (.yaml) file, just as I can load rosparams using the `rosparam load` syntax. But I can't find a way to take the values from the parameter server and pass them via the `args=` attribute. I am hoping to find a launchfile-based solution instead of writing a node that basically does the same as the static_transform_publisher node. Does anyone have a solution? PS: I do not want to hardcode the values into the launch file, because they might change; not during runtime, but between runs.
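In case a launchfile-only solution proves impossible, here is a minimal sketch of the fallback node, reading assumed parameter names (x, y, z, roll, pitch, yaw, e.g. loaded from the calibration .yaml via rosparam load) and rebroadcasting the transform periodically the way static_transform_publisher does:

    #!/usr/bin/env python
    import rospy
    import tf

    rospy.init_node("calibrated_static_tf")
    br = tf.TransformBroadcaster()

    # Assumed private-parameter layout; adjust to the calibration file's keys.
    xyz = [rospy.get_param("~x", 0.0), rospy.get_param("~y", 0.0), rospy.get_param("~z", 0.0)]
    rpy = [rospy.get_param("~roll", 0.0), rospy.get_param("~pitch", 0.0), rospy.get_param("~yaw", 0.0)]
    quat = tf.transformations.quaternion_from_euler(*rpy)

    rate = rospy.Rate(10)  # periodic rebroadcast, like static_transform_publisher
    while not rospy.is_shutdown():
        # "child_frame" and "parent_frame" are placeholder frame names.
        br.sendTransform(xyz, quat, rospy.Time.now(), "child_frame", "parent_frame")
        rate.sleep()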

tf2 listener: python working example (hydro)

Is there a working example of the tf2 listener under Hydro? I have tried to follow the tutorials on the wiki, but they seem to be outdated. Just to check whether I followed the [tutorial](http://wiki.ros.org/tf2/Tutorials/Migration/TransformListener) right: I have a simple example with `tf`:

    #!/usr/bin/env python
    import roslib; roslib.load_manifest('cvut_sandbox')
    import rospy
    import tf

    if __name__ == '__main__':
        rospy.init_node("test_tf_listener")
        tfl = tf.TransformListener()
        (trans, rot) = tfl.waitForTransform('r1_link_1', 'r2_link_2', rospy.Time(0), rospy.Duration(0))

This fails under Hydro due to the bug reported [here](https://github.com/ros/geometry/issues/30) but works under Groovy. Trying to follow the tutorial, I ended up with a version for `tf2`:

    #!/usr/bin/env python
    import roslib; roslib.load_manifest('cvut_sandbox')
    import rospy
    import tf2_ros

    if __name__ == '__main__':
        rospy.init_node("test_tf_listener")
        buffer = tf2_ros.Buffer()
        tfl = tf2_ros.TransformListener(buffer)
        (trans, rot) = tfl.waitForTransform('r1_link_1', 'r2_link_2', rospy.Time(0), rospy.Duration(0))

However, it seems that it won't be so easy:

    $ rosrun cvut_sandbox test_tf2_listener.py
    Traceback (most recent call last):
      File "/home/wagnelib/Source/hydro/clopema_ws/src/clopema_cvut/cvut_sandbox/scripts/test_tf2_listener.py", line 11, in <module>
        (trans, rot) = tfl.waitForTransform('r1_link_1', 'r2_link_2', rospy.Time(0), rospy.Duration(0))
    AttributeError: TransformListener instance has no attribute 'waitForTransform'

Following the [Writing a tf2 listener (Python)](http://wiki.ros.org/tf2/Tutorials/Writing%20a%20tf2%20listener%20%28Python%29) tutorial, I ended up with this:

    #!/usr/bin/env python
    import roslib; roslib.load_manifest('cvut_sandbox')
    import rospy
    import tf2

    if __name__ == '__main__':
        rospy.init_node("test_tf_listener")
        tfl = tf2.TransformListener()
        (trans, rot) = tfl.waitForTransform('r1_link_1', 'r2_link_2', rospy.Time(0), rospy.Duration(0))

But it also fails:

    $ rosrun cvut_sandbox test_tf2_listener_2.py
    Traceback (most recent call last):
      File "/home/wagnelib/Source/hydro/clopema_ws/src/clopema_cvut/cvut_sandbox/scripts/test_tf2_listener.py", line 5, in <module>
        import tf2
    ImportError: No module named tf2

So is there a working example?
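For reference, here is a sketch of what a working tf2 listener could look like, using the frame names from the question. In tf2 the lookup goes through the Buffer, whose lookup_transform takes an optional timeout and so replaces the waitForTransform/lookupTransform pair (this is my reading of the tf2_ros API, not a tested Hydro example):

    #!/usr/bin/env python
    import rospy
    import tf2_ros

    if __name__ == '__main__':
        rospy.init_node("test_tf2_listener")
        buf = tf2_ros.Buffer()
        listener = tf2_ros.TransformListener(buf)

        # lookup_transform(target, source, time, timeout) waits up to the timeout.
        t = buf.lookup_transform('r1_link_1', 'r2_link_2', rospy.Time(0), rospy.Duration(4.0))
        print(t.transform.translation, t.transform.rotation)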

rviz relative laser TF orientation seems wrong

It is likely I am not understanding something fundamental, but when I show laser scan data relative to a laser TF, RViz shows it in a strange orientation with respect to my base_laser frame. The "front" (that is, the scan points ahead of me) becomes the joint where the X and Y axes meet, so I see the scan points "ahead". But this is not how the laser is mounted; forward is the "X" direction, along the x axis of the TF. What we want is for the scan points to be ahead of the X axis, but this is never the case. I tried to upload a screen capture, but apparently I need "5 points" in the forums to do this, so let me try it in text. I get this:

    X<

(X is the scan point(s), and '<' is the X and Y axes of the TF as shown in RViz, with Y on top). We want this:

    X_|

where '_' would be my "forward" X axis of the TF and Y would be perpendicular to it. I'd appreciate any guidance; I am really not sure what I am missing.

Problems visualizing robotmodel and tf's of urdf model in rviz

Hi! I have built my own robot in SketchUp and created a URDF file for it with plugins for the laser (libgazebo_ros_laser.so) and for the odometry (libgazebo_ros_p3d.so), and I'm simulating navigation with it in Gazebo. Up to here it's almost 100% correct. I run rviz to visualize what is happening with the sensors, tf's, etc. I just run rviz on the command line and then add what I need in the left bar (tf's, LaserScan, RobotModel and map). Problems: 1 - I can't correctly see the tf's of my robot that are "continuous"; 2 - I can't see the robot model. I think I'm having this problem due to the joints that are of type "continuous", but I have to use those for the odometry plugin for Gazebo. Maybe it is some problem with the joint_state_publisher, I don't know. Do I have to create some kind of launch file to launch rviz together with joint_state_publisher, just like for Gazebo? And why can't I see the RobotModel in rviz for my model, even when the joints are fixed? Maybe because it was designed in SketchUp? Here is my model in URDF (the XML tags did not survive archiving; the remaining values show a differential-drive plugin with left_wheel_joint/right_wheel_joint, cmd_vel and odom topics, an odom to base_link p3d plugin, and a Hokuyo laser publishing on /scan from hokuyo_link).

Stage navigation multiple robots

Hello, I want to have multiple robots in Stage navigation. I used the Stage navigation tutorial code. The problem is that I can run gmapping with one robot, but if I include a second robot, the TF doesn't work and gmapping stops working. My laser.world file:

    define obstacle model
    (
      size [0.5 0.5 0.5]
      gui_nose 0
      obstacle_return 1
    )

    define hokuyo ranger
    (
      sensor(
        range [ 0.02 4.0 ]
        fov 180
        samples 682
      )
      block(
        points 4
        point[0] [0 1]
        point[1] [1 1]
        point[2] [1 0]
        point[3] [0 0]
        z [0 1]
      )
      color "black"
      size [ 0.05 0.05 0.1 ]
    )

    define roomba position
    (
      block(
        points 4
        point[0] [ 0.700 0.705 ]
        point[1] [ 0.700 0.0 ]
        point[2] [ 0.0 0.0 ]
        point[3] [ 0.0 0.705 ]
        z [0 1]
      )
      size [0.7 0.705 0.3]
      origin [-0.125 0 0 0]
      color "gray50"
      gui_nose 1
      obstacle_return 1
      drive "diff"
      localization "gps"
    )

    define roomba_hokuyo roomba(
      color "gray90"
      hokuyo(pose [ 0.1 0.0 0.0 0.0 ])
    )

    define turtlebot roomba(
      color "gray90"
      origin [-0.100 0 0 0]
    )

    define floorplan model
    (
      color "gray30"
      boundary 1
      gui_nose 0
      gui_grid 0
      gui_outline 0
      gui_move 0
      gripper_return 0
      fiducial_return 0
      obstacle_return 1
      ranger_return 0.8
    )

    define zone model
    (
      color "orange"
      size [ 4 4 0.01 ]
      gui_nose 0
      gui_grid 0
      gui_move 1
      gui_outline 0
      obstacle_return 0
      ranger_return -1
    )

    window
    (
      size [ 500.000 400.000 ]
      scale 20
      center [ 0.0 0.0 ]
      rotate [ 0 0 ]
      show_data 1
      show_trailarrows 0
    )

    floorplan
    (
      name "map"
      size [20.000 15.000 0.800]
      pose [0 1 0 0]
      bitmap "map.png"
    )

    roomba_hokuyo
    (
      name "roomba_hokuyo_0"
      pose [ 0.00 0.00 0 0.00 ]
    )

    roomba_hokuyo
    (
      name "roomba_hokuyo_1"
      pose [ 1.00 0.00 0 0.00 ]
    )

Laser_gmapping.launch: (the launch XML was stripped from the post). Tell me if you need more files.