Hi, I want to know if it is possible to add a new frame by code.
For example, a node that publishes a frame and modifies its position and orientation. Something like static_transform_publisher, but done in code.
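Roughly, I imagine something like this sketch (frame names and the offset are placeholders I made up):
#include <ros/ros.h>
#include <tf/transform_broadcaster.h>
int main(int argc, char** argv){
  ros::init(argc, argv, "my_frame_publisher");
  ros::NodeHandle node;
  tf::TransformBroadcaster br;
  tf::Transform transform;
  ros::Rate rate(10.0);
  while (node.ok()){
    // Position and orientation could be recomputed on every iteration.
    transform.setOrigin(tf::Vector3(1.0, 0.0, 0.0));
    transform.setRotation(tf::Quaternion(0, 0, 0, 1));
    br.sendTransform(tf::StampedTransform(transform, ros::Time::now(), "base_link", "my_new_frame"));
    rate.sleep();
  }
  return 0;
}
Is this the right approach?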
Thanks.
↧
Add frame by code
↧
How to subscribe to the map-to-odom tf transform from gmapping and apply it to odom-to-base_link?
Hi, I have odometry obtained from encoders, but it is not accurate enough: over long periods the accumulated error makes the robot's position inaccurate. I know that gmapping publishes a tf transform from map to odom. Can I subscribe to this transform and fold it into my odometry (the transform from odom to base_link) so that my robot's position is more accurate?
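Something like this sketch is roughly what I have in mind (a minimal sketch, assuming gmapping is already publishing map -> odom):
#include <ros/ros.h>
#include <tf/transform_listener.h>
int main(int argc, char** argv){
  ros::init(argc, argv, "map_odom_listener");
  ros::NodeHandle node;
  tf::TransformListener listener;
  ros::Rate rate(10.0);
  while (node.ok()){
    tf::StampedTransform correction;
    try {
      // map -> odom from gmapping is the correction between the SLAM
      // estimate and the drifting encoder odometry.
      listener.lookupTransform("map", "odom", ros::Time(0), correction);
      // ... compose 'correction' with my encoder-based odom -> base_link pose here ...
    }
    catch (tf::TransformException &ex) {
      ROS_WARN("%s", ex.what());
    }
    rate.sleep();
  }
  return 0;
}
Is composing the two transforms like this a sensible approach?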
↧
↧
Laser_ortho_projector transformation problem
I'm currently using ROS Hydro. I've been trying without success to run laser_ortho_projector without a base/world tf. According to the wiki (http://wiki.ros.org/laser_ortho_projector#Required_tf_Transforms), this is fine so long as an IMU topic is provided, since the node can determine the attitude of the laser frame relative to the fixed frame.
(Note that while the wiki article referenced is for Fuerte, I used the Hydro package: https://github.com/ccny-ros-pkg/scan_tools/tree/hydro)
Currently, I have a laser topic (/scan) and IMU topic (/imu/data) successfully publishing. Also, I have a laser > base_link static transformation being broadcast (included in launch file below).
I continue to get the following error:
[ WARN] [ros timestamp]: Skipping scan Could not find a connection between 'world' and 'base_link' because they are not part of the same tree.Tf has two or more unconnected trees.
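If I understand the error, the two trees could at least be joined for testing with an identity transform, e.g.:
rosrun tf static_transform_publisher 0 0 0 0 0 0 world base_link 100
but according to the wiki, that should not be necessary when an IMU topic is provided.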
Here is my launch file:
Thanks for any assistance!
↧
Use Robotino in ArbotiX and rviz
Hey,
I am currently working with the book [ROS By Example](http://www.lulu.com/shop/r-patrick-goebel/ros-by-example-hydro-volume-1/ebook/product-21393108.html). In it, the author uses the ArbotiX simulator to run the programs with a TurtleBot, which is a differential-drive robot.
I have a Robotino (an omnidirectional robot) and want to do the same things with it, but I get stuck launching it. When I open rviz, the robot model is broken ("No Transform from [...] to [odom]"). I know these are tf problems, but I really have no idea where to look or what to do.
In my launch file I have the following:
And the YAML file looks like this (I took it from the TurtleBot example and chose omni_controller instead of diff_controller, but that doesn't work either):
port: /dev/ttyUSB0
baud: 115200
rate: 20
sync_write: True
sync_read: True
read_rate: 20
write_rate: 20
controllers: {
base_controller: {type: omni_controller, base_frame_id: base_footprint, base_width: 0.26, ticks_meter: 4100, Kp: 12, Ki: 0, Ko: 50, accel_limit: 1.0 }
}
Any suggestions for me?
I added the tf trees as an image; as you can see, there are two separate trees, which is the problem...
[tf trees](/upfiles/14236504422359831.jpg)
Best regards
Daniel
↧
ROS Matlab I/O and /tf
Hello everybody,
I am trying to use the /tf topic to obtain the pose of the PR2's end effector. Unfortunately I can't subscribe to the topic; I get this error:
SubTf = rosmatlab.subscriber('/tf', 'tf/tfMessage', 100, node);
Error using rosmatlab.node/addSubscriber (line 661)
Java exception occurred:
org.ros.exception.RosMessageRuntimeException: java.lang.ClassNotFoundException: tf.tfMessage
at org.ros.internal.message.definition.MessageDefinitionReflectionProvider.get(MessageDefinitionReflectionProvider.java:58)
at org.ros.internal.message.Md5Generator.generate(Md5Generator.java:44)
at org.ros.internal.message.topic.TopicDescriptionFactory.newFromType(TopicDescriptionFactory.java:36)
at org.ros.internal.node.DefaultNode.newSubscriber(DefaultNode.java:286)
at org.ros.internal.node.DefaultNode.newSubscriber(DefaultNode.java:297)
Caused by: java.lang.ClassNotFoundException: tf.tfMessage
at java.net.URLClassLoader$1.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at org.ros.internal.message.definition.MessageDefinitionReflectionProvider.get(MessageDefinitionReflectionProvider.java:54)
... 4 more
Error in rosmatlab.subscriber (line 38)
sub = node.addSubscriber(topicName,topicMessageType,bufferLimit);
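One thing I plan to try: if this is Hydro or later, the /tf topic is typed tf2_msgs/TFMessage rather than tf/tfMessage (as far as I understand), so perhaps:
SubTf = rosmatlab.subscriber('/tf', 'tf2_msgs/TFMessage', 100, node);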
If you have any idea how to resolve it, that would be great.
↧
↧
Odom Axes not in line with base_link
Hi,
I am using a Powerbot to build a map with the gmapping algorithm. To set up my robot, I am using the ROSARIA package to control the motors, get pose estimates from odometry, etc. This is a ROS wrapper for the ARIA library provided by ActivMedia MobileRobots.
I am aware that I need some tf configuration in order to align the odometry frame with base_link and to align the laser frame with base_link. I have followed the tutorials and understood the concept. I came across [this tutorial](http://www.ist.tugraz.at/robotics/bin/view/Main/Poineer_mapping) and followed it to create my transforms while using ROSARIA. However, in doing so, I have noticed that the odometry axes are not aligned with the base_link axes. The laser axes are aligned perfectly, as can be seen in [this screenshot](/upfiles/14238183324878828.png). The odometry axes are the ones far off from the other two.
I am aware that ROSARIA publishes its own tf as can be seen from [rosgraph.png](/upfiles/14238185341661154.png). The current transform tree according to my code is [this](/upfiles/14238187038679438.png). The code that I am using to build the transformations is the following:
#include <ros/ros.h>
#include <tf/transform_broadcaster.h>
#include <nav_msgs/Odometry.h>
void poseCallback(const nav_msgs::Odometry::ConstPtr& odomsg)
{
//TF odom=> base_link
static tf::TransformBroadcaster odom_broadcaster;
odom_broadcaster.sendTransform(
tf::StampedTransform(
tf::Transform(tf::Quaternion(odomsg->pose.pose.orientation.x,
odomsg->pose.pose.orientation.y,
odomsg->pose.pose.orientation.z,
odomsg->pose.pose.orientation.w),
tf::Vector3(odomsg->pose.pose.position.x/1000.0,
odomsg->pose.pose.position.y/1000.0,
odomsg->pose.pose.position.z/1000.0)),
odomsg->header.stamp, "/odom", "/base_link"));
ROS_DEBUG("odometry frame sent");
ROS_INFO("odometry frame sent: y=[%f]", odomsg->pose.pose.position.y/1000.0);
}
int main(int argc, char** argv){
ros::init(argc, argv, "pioneer_tf");
ros::NodeHandle n;
ros::Rate r(100);
tf::TransformBroadcaster broadcaster;
//subscribe to pose info
ros::Subscriber pose_sub = n.subscribe("RosAria/pose", 100, poseCallback);
while(n.ok()){
//base_link => laser
broadcaster.sendTransform(
tf::StampedTransform(
tf::Transform(tf::Quaternion(0, 0, 0, 1), tf::Vector3(/*0.034, 0.0, 0.250*/0.385, 0, 0.17)),
ros::Time::now(), "/base_link", "/laser"));
ros::spinOnce();
r.sleep();
}
}
Of course, the maps created using this setup are a mess. This happens after I move the robot with the joystick for a bit. Initially, on starting up ROSARIA and the transform node, the axes are aligned. It is only after the robot moves that they lose their alignment. Can someone help me understand what is wrong with my transform tree and how I can fix it? Thanks!
EDIT
[This](/upfiles/1423820617525082.png) is a typical example of what happens to my robot's odometry when I drive it around a little bit. Basically, it moved forwards, then turned around a desk and then moved forwards some more. I do not expect such bad odometry.
↧
Obtaining joint angles from joint TFs?
Hi, I am currently using openni_tracker to tele-op a robot arm using an RGBD sensor. I am able to obtain the joint positions and orientations from openni_tracker, which publishes a TF with respect to the camera frame for every joint. I am getting confused about how to convert this into the relative orientation of each joint, which I can then pass to the robot control. For each joint, I am interested in two angles: the up-down motion angle and the sideways angle, and I feel that TF is giving me far more data than is really relevant to me. Any suggestions would be very helpful.
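What I have so far is along these lines: a sketch that expresses one joint frame relative to its parent joint and extracts roll/pitch/yaw (the openni_tracker frame names with the user-id suffix are my assumption):
#include <ros/ros.h>
#include <tf/transform_listener.h>
int main(int argc, char** argv){
  ros::init(argc, argv, "joint_angle_extractor");
  ros::NodeHandle node;
  tf::TransformListener listener;
  ros::Rate rate(10.0);
  while (node.ok()){
    try {
      // Look up the elbow frame relative to the shoulder frame instead of
      // relative to the camera frame (frame names assume user id 1).
      tf::StampedTransform rel;
      listener.lookupTransform("left_shoulder_1", "left_elbow_1", ros::Time(0), rel);
      double roll, pitch, yaw;
      tf::Matrix3x3(rel.getRotation()).getRPY(roll, pitch, yaw);
      // I am guessing pitch/yaw correspond to my up-down and sideways angles.
      ROS_INFO("pitch: %f, yaw: %f", pitch, yaw);
    }
    catch (tf::TransformException &ex) {
      ROS_WARN("%s", ex.what());
    }
    rate.sleep();
  }
  return 0;
}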
Thanks,
Sai
↧
TF updating very slowly
Hey everyone,
I am trying to publish Pose messages based on distances I get from ar_track_alvar. I have static transforms which relate the markers to the map; these broadcast at 100Hz. I take the position and orientation data from the marker message and create a transform from it. I then broadcast that transform (I can't use the one supplied by ar_track_alvar, so I create my own). I then listen for a transform from the map to my robot in order to get my position and orientation with respect to the map, and publish that as a Pose message. This works, but the poses are published with an intolerable delay. After moving my robot, it can take up to a minute for the pose to update. To be clear, poses continue to be published, but they reflect the robot's old position. I used `rosrun tf tf_echo` to track the tfs, and found that the tf is reflecting old data as well.
Additionally, I see a lot of errors saying:
> Lookup would require extrapolation into the past. Requested time XXXXXX but the earliest data is at time YYYYYYY, when looking up transform from frame [thor] to frame [map].
This happens despite my constant broadcasting and the static transforms.
Things I have tried:
I have gotten rid of the waitForTransform call in case that was adding some latency.
I have reduced the tag subscribers buffer to 1 to ensure processing of only the most recent message.
I have added extra AsyncSpinners.
These had no effect. Does anyone have any idea why the tfs take so long to propagate?
Relevant code:
//Figure out which tag we are looking at.
int tagID = msg->markers[0].id;
std::stringstream ss;
ss << "hsmrs/marker_" << tagID;
std::string markerFrameID = ss.str();
//Manually create transform from robot to tag
tf::Transform transform;
double x, y;
x = msg->markers[0].pose.pose.position.x;
y = msg->markers[0].pose.pose.position.y;
tf::Quaternion q;
tf::quaternionMsgToTF(msg->markers[0].pose.pose.orientation, q);
transform.setOrigin( tf::Vector3(x, y, 0.0) );
transform.setRotation(q);
//Broadcast transform
ROS_INFO("Broadcasting transform!");
br.sendTransform(tf::StampedTransform(transform, ros::Time::now(), markerFrameID, "thor"));
ROS_INFO("Transform broadcasted!");
//Listen for transform from map to robot
ROS_INFO("Listening for transform!");
tf::StampedTransform mapTransform;
try{
listener.waitForTransform("map", "thor", ros::Time(0), ros::Duration(0.1));
listener.lookupTransform("map", "thor",
ros::Time(0), mapTransform);
ROS_INFO("Transform heard");
}
catch (tf::TransformException &ex) {
ROS_ERROR("%s",ex.what());
return;
}
//Unpack position and orientation
double map_x, map_y;
map_x = mapTransform.getOrigin().x();
map_y = mapTransform.getOrigin().y();
geometry_msgs::Quaternion quat;
tf::quaternionTFToMsg(mapTransform.getRotation(), quat);
//Publish pose as geometry message
geometry_msgs::Pose pose;
pose.position.x = map_x;
pose.position.y = map_y;
pose.orientation = quat;
pose_pub.publish(pose);
EDIT:
The static publisher was running on a separate computer. Running it on the robot instead made the problem disappear. But I need it to work on the workstation computer. How can I make that work?
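My current guess: since tf stamps are compared across machines, unsynchronized clocks would explain both the extrapolation warnings and the apparent delay, so synchronizing the workstation's clock with the robot (the hostname below is a placeholder) might be the fix:
sudo ntpdate my-robot-hostname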
↧
frame passed to lookupTransform does not exist
I have encountered a peculiar error with tf. The Python `lookupTransform` function throws an exception saying that the requested frame does not exist, whereas tools like tf_echo report no problems. Since the setup I am trying to run is extremely simple, I have no idea what could be wrong. Either I am missing something very simple which I can't see right now, or this is a bug in tf.
I am running stage with the willow-erratic.world file:
`rosrun stage_ros stageros -g willow-erratic.world `
together with the following node:
#!/usr/bin/env python
import rospy
import tf
from sensor_msgs.msg import LaserScan
listener = tf.TransformListener()
def scanCb(msg):
print 'laserscan received'
print msg.angle_max
# lookup base_link -> laser frame transformation
(trans, rot) = listener.lookupTransform('base_link', 'base_laser_link', rospy.Time())
rospy.init_node('console_interface')
rospy.Subscriber('/base_scan', LaserScan, scanCb)
rospy.spin()
The output is:
laserscan received
2.35837626457
[ERROR] [WallTime: 1424179111.916555] [4010.900000] bad callback:
Traceback (most recent call last):
File "/opt/ros/indigo/lib/python2.7/dist-packages/rospy/topics.py", line 688, in _invoke_callback
cb(msg)
File "/home/vagrant/test_ws/src/scout_nav/scripts/console_iface", line 14, in scanCb
(trans, rot) = listener.lookupTransform('base_link', 'base_laser_link', rospy.Time())
LookupException: "base_link" passed to lookupTransform argument target_frame does not exist.
tf_echo does not seem to have problems with looking up the transform:
rosrun tf tf_echo base_link base_laser_link
At time 4147.200
- Translation: [0.050, 0.000, 0.250]
- Rotation: in Quaternion [0.000, 0.000, 0.000, 1.000]
in RPY [0.000, -0.000, 0.000]
Does anyone see what is wrong with my approach?
Thanks
↧
↧
tf2 buffer_server timeout and buffer size
I'm starting the buffer_server with
rosrun tf2_ros buffer_server
I've seen in [its source code on github](https://github.com/ros/geometry_experimental/blob/indigo-devel/tf2_ros/src/buffer_server_main.cpp) that one can specify the buffer size through the parameter server.
My questions are: what unit is the `buffer_size` parameter measured in, what happens when the buffer is full, and is the buffer size the same as the duration after which a transform times out?
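For context, I am currently starting it blind, like this (assuming the parameter is read in seconds):
rosrun tf2_ros buffer_server _buffer_size:=120.0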
Thanks.
↧
Transform base_link to base_laser, map, odom
I know the meaning of the transform from base_link to base_laser.
But what is the meaning of the transform from base_link to map?
And what is the meaning of the transform from base_link to odom?
I ask because I need to understand them to set the correct values in my navigation program.
Thank you~
↧
How to force hector slam to set the robot position?
Hi,
we have a Hokuyo laser, and in the first seconds of running hector_slam, especially when the robot's laser data is sparse (for example in small spaces), the robot position jumps around badly, but after a while it behaves well.
Therefore I want to force hector_slam to hold the robot's position at (0,0) for the first few seconds (for example, 5 seconds). How can I do this?
I have tried publishing the robot's position to the tf topic, but it didn't work.
I have also launched the hector pose estimation, but it didn't solve the problem.
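One idea I have not verified yet: hector_mapping is supposed to listen on the syscommand topic, and sending the string "reset" should reset the map and pose to the origin, so something like this after startup might do it:
rostopic pub -1 /syscommand std_msgs/String "reset"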
↧
Debugging OctoMap
I am running gmapping and octomap_server with the TurtleBot.
Fixed frame in rviz is "map"
The launch file frame id is also "map"
But no voxels appear in rviz. A quick change from map to base_footprint in both rviz and the launch file fixed that, but the resulting 3D map is not good.
[I've learned that octomap_server needs a fixed frame id and a tf transform of the input PointCloud2 into this frame.](http://answers.ros.org/question/12320/octomap-server-rviz-mapping/)
and [the frame needs to be "map" to produce a good 3D map.](http://answers.ros.org/question/203463/how-to-improve-maps-from-octomap/)
But something is wrong because nothing is coming out.
So I ran rqt_console to debug.
It shows:
MessageFilter [target=map ]: The majority of dropped messages were due to messages growing older than the TF cache time. The last message's timestamp was: 1425390282.550442, and the last frame_id was: /camera_rgb_optical_frame
Where should I look first to correct this?
↧
↧
Transform Quaternion
Hi!
Is there a package which converts quaternion coordinates automatically into another representation (axis-angle, Cardan angles, Euler angles)?
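For the record, this is what I am doing by hand right now; I am hoping a package does it automatically (a sketch using the tf types):
#include <tf/transform_datatypes.h>
// Quaternion -> fixed-axis roll/pitch/yaw, and axis-angle.
tf::Quaternion q(0.0, 0.0, 0.7071, 0.7071); // example values
double roll, pitch, yaw;
tf::Matrix3x3(q).getRPY(roll, pitch, yaw);
tf::Vector3 axis = q.getAxis();
double angle = q.getAngle();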
Thanks.
↧
tf lookupTransform
Hello,
I'm trying to move my robot to a given position. So I made a frame at some arbitrary position named /posicao_objectivo, and I wrote a simple broadcaster to set this position relative to the /odom frame that is set by Gazebo.
#include <ros/ros.h>
#include <tf/transform_broadcaster.h>
int main(int argc, char** argv){
ros::init(argc, argv, "tf_broadcaster_posicaoRobot");
ros::NodeHandle node;
tf::TransformBroadcaster br;
tf::Transform transform;
ros::Rate rate(1.0);
while(node.ok()){
transform.setOrigin( tf::Vector3(5, 0.0, 0.0) );
transform.setRotation(tf::Quaternion(0, 0, 0, 1) );
br.sendTransform(tf::StampedTransform(transform, ros::Time::now(),"/odom", "/posicao_objectivo"));
rate.sleep();
}
return 0;
};
After this I made a listener that uses the lookupTransform function to compute the transform from the robot's /base_link to my target position /posicao_objectivo created by the broadcaster.
#include <ros/ros.h>
#include <tf/transform_listener.h>
#include <geometry_msgs/Twist.h>
int main(int argc, char** argv){
ros::init(argc, argv, "my_tf_listener_mexeRobot");
ros::NodeHandle node;
ros::Publisher robot_vel = node.advertise<geometry_msgs::Twist>("p3dx/cmd_vel", 10);
tf::TransformListener listener;
ros::Rate rate(1.0);
while (node.ok()){
tf::StampedTransform transform;
try{
listener.waitForTransform("/base_link", "/posicao_objectivo",
ros::Time(0), ros::Duration(3.0));
listener.lookupTransform("/base_link", "/posicao_objectivo",
ros::Time(0), transform);
}
catch (tf::TransformException &ex) {
ROS_ERROR("%s",ex.what());
ros::Duration(1.0).sleep();
continue;
}
geometry_msgs::Twist vel_msg;
vel_msg.angular.z = 0.2 * atan2(transform.getOrigin().y(),
transform.getOrigin().x());
vel_msg.linear.x = 0.5 * sqrt(pow(transform.getOrigin().x(), 2) +
pow(transform.getOrigin().y(), 2));
robot_vel.publish(vel_msg);
rate.sleep();
}
return 0;
};
The problem is that when I run the listener I get this error:
[ERROR] [1425495472.170621190, 132.992000000]: "posicao_objectivo" passed to lookupTransform argument source_frame does not exist.
I tried using the command: rosrun tf tf_echo odom posicao_objectivo
And it says:
Exception thrown:"posicao_objectivo" passed to lookupTransform argument source_frame does not exist.
The current list of frames is:
Frame base_link exists with parent odom.
Frame front_sonar exists with parent base_link.
Frame p3dx_left_hubcap exists with parent p3dx_left_wheel.
Frame p3dx_right_hubcap exists with parent p3dx_right_wheel.
Frame top_plate exists with parent base_link.
Frame camera_link exists with parent Arucos.
Frame Arucos exists with parent top_plate.
Frame center_hubcap exists with parent center_wheel.
Frame lms100 exists with parent base_link.
What is confusing me is that when I start broadcasting my /posicao_objectivo frame, it appears in the RVIZ tree and in rqt_tf_tree. So the frame does exist; I don't understand why it does not work with rosrun tf tf_echo odom posicao_objectivo.
↧
Again a Migration Question From TF to TF2
Hi all,
First of all, I am sorry if this question is a duplicate; however, I didn't find a clear answer while searching this forum.
So I decided to migrate my code from TF to TF2. I ran into a lot of confusion, since I am now missing some helper functions, datatypes have obviously changed from tf:: to tf2::, and so forth.
In a nutshell, what I am trying to do is look up some transforms, compute my kinematic-chain relation, and publish a new frame.
1.) Look up a transform with my tf2 buffer, which returns a geometry_msgs::TransformStamped:
geometry_msgs::TransformStamped geo_msg = buffer.lookupTransform( "frame1", "frame2", ros::Time(0) );
2.) Transform this geometry_msgs::TransformStamped into a tf2::Transform in order to perform ordinary frame operations such as multiplication, inverse(), etc.
For this to work I have to convert my geometry_msgs::TransformStamped into a tf2::Transform, and this is where the confusion starts.
in tf1, I'd use any helper functions listed in [ tf/transform_datatypes.h](http://docs.ros.org/indigo/api/tf/html/c++/transform__datatypes_8h.html)
static void tf::transformStampedMsgToTF (const geometry_msgs::TransformStamped &msg, StampedTransform &bt)
Since tf::StampedTransform and tf2::Transform are not compatible, I googled for the tf2:: equivalent and found this [tf2/transform_datatypes.h](http://code.google.com/p/ros-geometry/source/browse/tf2/include/tf2/transform_datatypes.h?spec=svne4426c32e5a41e71d34192317a41586b4f082c91&r=e4426c32e5a41e71d34192317a41586b4f082c91):
static inline void pointStampedMsgToTF2(const geometry_msgs::PointStamped & msg, Stamped& bt)
This leads to a compilation error, since the link points to an old revision and those functions were apparently replaced by a templated convert function. So I tried:
tf2::Transform tf_msg;
tf2::convert( geo_msg, tf_msg );
This fails to compile with:
.../ros/include/tf2/convert.h:89:5: error: 'toMsg' was not declared in this scope
.../ros/include/tf2/convert.h:89:5: error: 'fromMsg' was not declared in this scope
Then I thought maybe I had a mismatch between stamped and non-stamped objects, so I tried further:
tf2::Transform tf_msg;
tf2::convert( geo_msg.transform, tf_msg );
This fails with the same error messages ( I tried to include basically everything I have in tf2 :) )
So my only option for converting between those two objects was manually copying all the translations and rotations from one into the other, which bloats my code ridiculously.
tf2::Quaternion q( geo_msg.transform.rotation.x,
geo_msg.transform.rotation.y,
geo_msg.transform.rotation.z,
geo_msg.transform.rotation.w
);
tf2::Vector3 r( geo_msg.transform.translation.x,
geo_msg.transform.translation.y,
geo_msg.transform.translation.z
);
tf2::Transform tf_msg( q,r);
So with this I could finally do my transform operations.
tf2::Transform computed_tf_msg = tf_msg.inverse() *
3.) Publishing a geometry_message from a transform
In tf1 I could simply trigger the tf_broadcaster like this:
tf_broadcaster.sendTransform(tf::StampedTransform( computed_tf_msg, time, "new_frame", "frame2") );
However, this doesn't work in tf2, so I ended up doing the conversion from tf2::Transform to geometry_msgs::TransformStamped manually as well.
To conclude:
Is this really the way to go, or am I missing something here? Compared to TF1, I miss a lot of the simple conversion functions. This pipeline of looking up transforms, computing a new frame relation, and publishing the result is nothing fancy.
Sorry for the long post.
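P.S. For reference, here is the shape of what I expected to work, assuming the toMsg/fromMsg overloads for transforms live in tf2_geometry_msgs (which is my current understanding):
#include <tf2_geometry_msgs/tf2_geometry_msgs.h> // should provide toMsg()/fromMsg()
#include <tf2_ros/transform_broadcaster.h>
// Convert the looked-up message into a tf2::Transform ...
tf2::Transform tf_msg;
tf2::fromMsg(geo_msg.transform, tf_msg);
// ... do the frame math ...
tf2::Transform computed_tf = tf_msg.inverse();
// ... and convert back for broadcasting.
geometry_msgs::TransformStamped out;
out.transform = tf2::toMsg(computed_tf);
out.header.stamp = ros::Time::now();
out.header.frame_id = "frame2"; // parent frame (assumed)
out.child_frame_id = "new_frame"; // child frame (assumed)
tf2_ros::TransformBroadcaster tf_broadcaster;
tf_broadcaster.sendTransform(out);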
↧
Viewing PointCloud2 in a frame from URDF in Rviz vice openni_depth_optical_frame
I am utilizing ROS Hydro on Ubuntu 12.04 with a Pioneer3-DX and Microsoft Kinect.
I am using p2os, openni_launch, and openni_tracker.
I have just added the Kinect to the p2os_urdf by adding the code below to the `pioneer3dx_body.xacro`. I would like to have data from /PointCloud2 in the `kinect_depth_optical_frame` of the URDF when I use Rviz, but it is currently in the `openni_depth_optical_frame`. Therefore, I can only view the /PointCloud2 or the robot model and tf's, but not both at the same time. The `openni_depth_optical_frame` is displayed in the wrong plane relative to the robot model.
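One approach I am considering, assuming this version of openni.launch exposes the rgb_frame_id/depth_frame_id arguments, is to make the driver stamp its data with the URDF frames directly:
<include file="$(find openni_launch)/launch/openni.launch">
  <arg name="depth_frame_id" value="kinect_depth_optical_frame"/>
  <arg name="rgb_frame_id" value="kinect_rgb_optical_frame"/>
</include>
(The rgb frame name is a guess based on my URDF naming.)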
Right now, I am utilizing a launch file to start everything. You can find the code of the launch file below.
----------
Kinect code added to `pioneer3dx_body.xacro`:
Launch file code:
↧
↧
Configuration of pointcloud_to_laserscan with stereo camera.
Hello,
I am trying to generate a laser scan of my environment by converting a point cloud generated by a stereo camera, using the pointcloud_to_laserscan package, and I'm facing some issues. The details:
I am running the minoru stereo camera to generate the pointcloud, using uvc_camera's stereo node. (stereo_node.launch in uvc_camera/launch). I'm able to successfully access and manipulate the pointcloud received (/points as well as /points2). I also remapped the point cloud to /cloud_in, as required by pointcloud_to_laserscan.
To use pointcloud_to_laserscan, I installed it pre-built (apt-get install) on ROS Hydro.
I modified the sample_node.launch file into my own launch file, minoru_laser.launch:
Since I'm using a stereo camera, I have removed the OpenNI device node that was previously called, and I am not running the remaps.
I am also running a few nodes from the navigation stack robot setup: a tf_broadcaster giving me the transformation between the base_link and camera frames. The camera frame is the default frame of the camera feeds as well as of the input point cloud.
The issues I'm facing are as follows:
1) While the point cloud itself is fine, when I echo /scan it contains ONLY inf values (when use_inf is set to true), or range_max + 1 (when use_inf is false). This occurs irrespective of what the point cloud contains, whether it is within the maximum range or not.
2) Even given the range_max + 1 scan, I should still be able to view it in RVIZ, which I am unable to do. The screen is just blank.
Some other details:
I think it is a transformation frame issue, between the camera frame and the frame of the laser, but I am unable to figure out how to solve this.
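If so, one thing I am considering (assuming the Hydro node supports these parameters) is setting target_frame so the node transforms the cloud itself before converting; a sketch, with placeholder height limits:
<node pkg="pointcloud_to_laserscan" type="pointcloud_to_laserscan_node" name="pointcloud_to_laserscan">
  <remap from="cloud_in" to="/points2"/>
  <param name="target_frame" value="base_link"/>
  <param name="min_height" value="-0.5"/>
  <param name="max_height" value="0.5"/>
</node>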
I ran rqt_console, changed the logger level of pointcloud_to_laserscan to debug, and saw this message:
Node: /pointcloud_to_laserscan
Time: 17:17:52.694141586 (2015-03-13) Severity: Debug
Published Topics: /rosout, /scan
Connection::drop(2)
Location:
/tmp/buildd/ros-hydro-roscpp-1.10.2-0precise-20140304-0105/src/libros/connection.cpp:Connection::drop:324
As well as:
Node: /pointcloud_to_laserscan
Time: 17:17:52.694023811 (2015-03-13)
Severity: Debug
Published Topics: /rosout, /scan
TCP socket [9] closed
Location:
/tmp/buildd/ros-hydro-roscpp-1.10.2-0precise-20140304-0105/src/libros/transport/transport_tcp.cpp:TransportTCP::close:421
----------------------------------------------------------------------------------------------------
My rqt_graph is here. 
Can anyone suggest a method to solve this?
↧
ImportError: No module named tf
I am writing a node that calculates the distance from the PR2's left wrist to a point in space and publishes that distance. The point is in the base frame, so to make calculations easier I'm attempting to change the frame of the point. However, when I run it, nothing is published (when I use rostopic echo, it indicates that the topic is not being published; essentially it doesn't know that it exists). I tried using rosrun on the file and it errors, saying:
ImportError: No module named tf
Here's what the beginning of my code looks like:
#!/usr/bin/env python
# This publisher will calculate the distance of an object
# from PR2
import roslib
roslib.load_manifest('PR2_assignment0')
import rospy
import sys
from std_msgs.msg import String
import tf
from std_msgs.msg import Float64
from geometry_msgs.msg import PointStamped
I've made many modifications but it still does not seem to work (I've checked threads with similar problems but their solutions aren't working). Other files with a similar format work, but mine doesn't...
I haven't tried the code without the tf frame, so I'll try that out for now and see if it sheds light on some things.
Any suggestions?
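One more thing I will check: since this is a rosbuild-style package (it calls roslib.load_manifest), my understanding is that tf must be declared in the package's manifest.xml, roughly like this:
<depend package="tf"/>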
↧
RGBD Slam Error - Missing resource TF
Hello,
I'm using 12.04 - Fuerte.
When I run the command below I get the following error. Any ideas on how to resolve this? I would normally just install the package, but it says it cannot find the tf package.
patrick@patrick-33:~/fuerte_workspace$ rosdep install rgbdslam_freiburg
ERROR: the following packages/stacks could not have their rosdep keys resolved
to system dependencies:
rgbdslam: Missing resource tf
ROS path [0]=/opt/ros/fuerte/share/ros
ROS path [1]=/home/user/ros/ros-pkg
ROS path [2]=/home/patrick/fuerte_workspace/
↧