How does rtabmap fuse wheel odometry with visual odometry?
I set up my robot with a TurtleBot and a Kinect and executed rtabmap.
-- http://wiki.ros.org/rtabmap_ros/Tutorials/SetupOnYourRobot
I covered the Kinect camera and moved the TurtleBot; I also moved the TurtleBot so that the travel distance was computed only from the Kinect image.
I think the rtabmap algorithm is internally fusing the image information and the wheel odometry.
How is this fusion done?
And is there a paper of yours about fusing wheel odometry and visual odometry?
Thank you ^ ^
↧
Maximum mapping distance rtabmap
Hi,
I am using a Kinect.
Is there a way to set the maximum mapping distance to, say, 3.5 meters, to prevent noisy long-range depth readings from ending up in the map?
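In case it helps frame the question, this is the kind of thing I imagine, assuming the `Grid/RangeMax` rtabmap parameter exists in my version and limits the range used when assembling the map (3.5 is just my target value):

    roslaunch rtabmap_ros rgbd_mapping.launch \
        rtabmap_args:="--delete_db_on_start --Grid/RangeMax 3.5"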
↧
rtabmap_localization issue
hi,
I'm using rtabmap and I'm pretty new to it. I'm noticing unexpected behaviour where the algorithm runs continuously even when the robot is not moving, and it updates the map->odom tf with differences under 10 cm, but often over 5-6 cm.
I've attached a video that illustrates this behaviour; you can also view it here: https://drive.google.com/open?id=0B5G6Ct8c7jq7NDNkbGcxMTI4Tlk
↧
Persistent map frame in RTAB-Map localization mode
I have a map generated using a Kinect and RTAB-Map. I want to run RTAB-Map again in localization mode using the same database, and have repeatable coordinates within the map, regardless of my starting position.
From what I understand, the `map` frame origin is set to the starting point when odometry starts, so I don't know my position within the saved map. I'd like a canonical set of coordinates that is the same for the same physical location across sessions. If there were a tf frame that is fixed within the saved map, that would work.
Does RTAB-Map have this feature? What should I do to achieve this?
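If it helps, here is a minimal sketch of the localization launch I have in mind, assuming `Mem/IncrementalMemory=false` is the documented localization mode and that `RGBD/OptimizeFromGraphEnd=false` keeps the graph (and so the map origin) anchored to the first node of the saved database:

    <node pkg="rtabmap_ros" type="rtabmap" name="rtabmap">
      <param name="database_path" value="~/.ros/rtabmap.db"/>
      <!-- localization mode: reuse the saved database without adding to it -->
      <param name="Mem/IncrementalMemory" type="string" value="false"/>
      <!-- anchor graph optimization to the first node so map coordinates stay fixed -->
      <param name="RGBD/OptimizeFromGraphEnd" type="string" value="false"/>
    </node>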
Thanks
↧
Help 3D navigation using a known map
Hello, I'm using RTAB-Map and a Kinect to generate a map of a corridor. I was wondering whether it is possible to fly a UAV to a known point within the generated map.
My project statement requires me to:
- Manually map and generate a pointcloud of an environment.
- Using the generated map, fly a quadcopter/ UAV autonomously to a point within the generated map.
I'm able to use RTAB-Map to generate a map of an environment; however, I'm not sure where to start on the second point, and I can't seem to find anything relating to it. Could someone please point me in the right direction?
↧
Flag/Mark Map
Hello,
I am doing a project with a Kinect sensor for mapping, and so far I have managed to make it work, but I would also like to mark specific areas where the robot moves and export the map with those areas highlighted.
I tried flagging the zones by clicking in RViz, but the flags are way too big.
Are there other options to mark certain areas where the robot moves and export the map?
Best Regards
Screenshot: https://postimg.org/image/cz8k1h9ld/
↧
rtabmap localization issue
Hi,
I am facing an issue during localization in rtabmap. When there is a dynamic obstacle in front of the robot, and even during navigation, the map data sometimes disappears in rviz and the robot gets delocalized. Why does this happen? What is the solution, so that the map from rtabmap is updated continuously? Please suggest a solution. Thanks.
↧
Astra Pro with ROS & rtabmap
Hello,
Has anyone been able to use the Astra Pro camera with ROS Kinetic and rtabmap?
I was able to get the Astra Pro camera working with rviz (RGB image stream + Depth) using the usb_cam node and the Astra SDK.
However, I keep getting a strange error when using rtabmap_ros.
I described the error in the following link:
https://github.com/introlab/rtabmap_ros/issues/147
Has anyone been able to do so, and if so, would you mind providing some guidance on the topic?
Thank you!
EDIT: Attached important information from the described link:
I am using Ubuntu 16.04 and Kinetic ROS.
I am using an Astra Pro camera. Since RGB is not supported through OpenNI for the Pro version, I am using the Astra SDK for the depth and the usb_cam node for the RGB camera:
https://github.com/orbbec/ros_astra_camera
https://github.com/bosch-ros-pkg/usb_cam
I installed rtabmap first from the binaries, and it did not work. Then I deleted everything and installed from source, and I still get the same error. I made sure to change the topics for the rtabmap launch, but I am getting the following error:
> /rtabmap/rgbd_odometry subscribed to (approx sync): /usb_cam/image_raw, /camera/depth_registered/image_raw, /usb_cam/camera_info
> terminate called after throwing an instance of 'UException'
> what(): [FATAL] (2017-02-06 16:37:20.365) CameraModel.cpp:77::CameraModel() Condition (fx > 0.0) not met! [fx=0.000000]
Followed by the following:
> [ WARN] [1486366640.733577972]: odometry: Could not get transform from camera_link to head_camera (stamp=1486366640.510289) after 0.200000 seconds ("wait_for_transform_duration"=0.200000)! Error=". canTransform returned after 0.201911 timeout was 0.2."
I tried creating a calibration file, but it seems that if I go to Window->Preferences->Source it doesn't allow me to change any of the parameters (everything is greyed out).
My launch commands are as follows (none of them work):
> roslaunch rtabmap_ros rgbd_mapping.launch rtabmap_args:="--delete_db_on_start" rviz:=true rtabmapviz:=false rgb_topic:=/usb_cam/image_raw depth_topic:=/camera/depth_registered/image_raw camera_info_topic:=/usb_cam/camera_info
> roslaunch rtabmap_ros rgbd_mapping.launch rtabmap_args:="--delete_db_on_start" rgb_topic:=/usb_cam/image_raw depth_topic:=/camera/depth_registered/image_raw camera_info_topic:=/usb_cam/camera_info
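My guess is that the `fx=0` condition means usb_cam is publishing an empty, uncalibrated `camera_info`. What I plan to try, assuming the standard camera_calibration workflow applies here (the board size and the calibration file path are placeholders for my setup):

    # calibrate the RGB camera with a printed chessboard
    rosrun camera_calibration cameracalibrator.py --size 8x6 --square 0.108 \
        image:=/usb_cam/image_raw camera:=/usb_cam
    # then point usb_cam at the committed calibration file
    rosrun usb_cam usb_cam_node \
        _camera_info_url:=file:///home/me/.ros/camera_info/head_camera.yaml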
↧
rtabmap visual odometry
hi everyone,
I'm seeing an error, or at least strange behaviour, when rotating my camera about the yaw axis. I'm running the handheld tutorial (http://wiki.ros.org/rtabmap_ros/Tutorials/HandHeldMapping) with an Orbbec Astra (using the astra_camera ROS package).
I put the camera on a tripod to be sure not to translate it while rotating it; however, the odometry shows an unexpected translation on the y axis (as you can see in the attached pictures).
Any help would be very much appreciated.
PS:
I'm not sure the browser attached my file; in any case you can find it here: https://drive.google.com/open?id=0B5G6Ct8c7jq7V3lMcWwzMThEMms
↧
Robot Navigation using kinect and 3D mapping for humanoid robot
I have been working on this project, trying to establish a 3D mapping environment. My main question is whether I can use the 3D map from the Kinect v2 to do navigation for the humanoid robot. I was able to successfully build a 3D map using rtabmap and rviz.
Currently I am using Ubuntu 14.04 LTS, an Intel® Core™ i5-3210M CPU @ 2.50GHz × 4, and a Kinect v2. Kindly guide me on how I am supposed to implement navigation on the humanoid robot.
↧
Rotate rtabmap's local_costmap?
We're using rtabmap and only our local costmap is incorrectly rotated 90 degrees :(. We've tried using TF to rotate a few things with the static_transform_publisher but haven't been successful. Do you have any suggestions? See screenshots:
Incorrect: (the local cost map)

Correct: (the global cost map)

Thank you!
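For reference, this is the kind of command we tried (the frame names and the 90° yaw value are placeholders, not our exact transform):

    # x y z yaw pitch roll parent_frame child_frame period_ms
    rosrun tf static_transform_publisher 0 0 0 1.5708 0 0 base_link camera_link 100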
↧
rtabmap rgbd_odometry: 0 features extracted from RealSense rgbd bag file
I tried to run the `rtabmap/rgbd_odometry` node against a bag file captured from a RealSense. The `pose` part of the `/rtabmap/odom` messages is all zeros (position and orientation). Meanwhile `rtabmap/rgbd_odometry` keeps printing this warning:
**[ WARN] (2017-03-22 03:05:04.366) OdometryF2M.cpp:819::computeTransform() 20 visual features required to initialize the odometry (only 0 extracted).**
I ran it with these commands:
`$ roslaunch rtabmap_ros my.launch rtabmap_args:="--delete_db_on_start" depth_topic:=/camera/depth/image_raw camera_info_topic:=/camera/color/camera_info rgb_topic:=/camera/color/image_raw`
`$ rosbag play -l --clock ./2017-03-11-12-11-06.bagfile`
Here are the [launch file](https://octoprint.blob.core.windows.net/vids/my.launch) and [bag file](https://octoprint.blob.core.windows.net/vids/2017-03-11-12-11-06.bagfile)
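A few sanity checks I am running (topic names copied from the launch command above; `use_sim_time` matters because the bag is played with `--clock`):

    # must be set before starting the nodes when replaying with --clock
    rosparam set use_sim_time true
    # verify that RGB frames actually arrive and are not black/empty
    rostopic hz /camera/color/image_raw
    rosrun image_view image_view image:=/camera/color/image_raw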
↧
Localization problem: GPS drift when using visual odomery and INS
So, I have been configuring the robot_localization package to use GPS and IMU together with visual odometry from rtabmap, but I am seeing weird behavior. I use an Orbbec Astra as the source of visual odometry, which is sent to two instances of ekf_localization_node: the first fuses visual odometry + IMU, and the second fuses visual odometry + IMU + navsat odometry. I also have a navsat_transform node which takes the output of the second ekf instance (odom_gps) and the gps/imu data. I then use the launch file from the rtabmap_ros package called sensor_fusion, from which I removed the robot_localization node part, because I want to use the configuration described above.
The first instance works fine, and the visual odometry also seems reasonably stable; however, the second instance and the navsat_transform node produce unstable output. At the start the position drifts by around ~20 cm in x and y, and if a small, slow movement is made with the base, it starts drifting in the positive and negative directions (sometimes one, sometimes both). I tried this outside in a more open space to let the GPS catch a reasonable number of satellites, and the problem persists.
I have read a similar question before, but that poster's problem was that he used the wrong output (from the first ekf instance). I double-checked and I am sure I use the output of the second ekf instance. I would really appreciate hearing from anyone who has had a similar problem and managed to overcome it. I have posted the launch files for both rtabmap and robot_localization (https://www.dropbox.com/sh/57zaiqweulrxslf/AAB3DxMIABsECWNLPr1P7H4ia?dl=0) as well as an image of the drift. The hardware used is an Xsens MTi-G-700 and an Orbbec Astra.
**Robot_localization.launch**
(Only the five sensor `_config` matrices survived here; each is 15 booleans over [x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az].)

    [true,  true,  false,
     false, false, false,
     true,  true,  false,
     false, false, false,
     false, false, false]

    [false, false, false,
     false, false, true,
     false, false, false,
     false, false, true,
     false, false, false]

    [false, false, false,
     false, false, false,
     true,  true,  true,
     false, false, false,
     false, false, false]

    [true,  true,  false,
     false, false, false,
     false, false, false,
     false, false, false,
     false, false, false]

    [false, false, false,
     false, false, true,
     false, false, false,
     false, false, true,
     false, false, false]
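For reference, this is roughly how one of these matrices is attached to a sensor in an ekf_localization_node launch (the topic and values here are illustrative, not my exact configuration):

    <node pkg="robot_localization" type="ekf_localization_node" name="ekf_local">
      <param name="odom0" value="/rtabmap/odom"/>
      <!-- 15 booleans: x y z, roll pitch yaw, vx vy vz, vroll vpitch vyaw, ax ay az -->
      <rosparam param="odom0_config">
        [true,  true,  false,
         false, false, false,
         true,  true,  false,
         false, false, false,
         false, false, false]
      </rosparam>
    </node>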
**Sensor_fusion.launch (from rtabmap_ros)**
↧
Rtabmap deletes part of map
I am running Ubuntu MATE 16.04 on a Raspberry Pi 3 and have tried to build a map with rtabmap. I am using a Kinect, a fake laser scan from the Kinect, odometry from the wheel encoders, and an IMU. rtabmap runs on the Raspberry Pi and the visualisation with Rviz on a remote computer.
When the robot drives around, it builds the map quite well, but it deletes the parts of the map where it is not currently located.
Launchfile:
Warning:
[ WARN] (2017-04-15 22:28:21.616) Rtabmap.cpp:1786::process() Rejected loop closure 803 -> 819: Cannot compute transform (cor=90 corrRatio=0.140625/0.500000)
[ WARN] (2017-04-15 22:28:21.626) Memory.cpp:1580::forget() Less signatures transferred (0) than added (3)! The working memory cannot decrease in size.
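If I understand the `forget()` warning, RTAB-Map's memory management is moving older nodes out of the working memory, which would explain why parts of the map away from the robot disappear. A sketch of what I want to try, assuming `Rtabmap/TimeThr` and `Rtabmap/MemoryThr` are the parameters that control this (0 should disable the transfer, at the cost of growing CPU and memory use on the Pi):

    <!-- inside the rtabmap node of the launch file -->
    <param name="Rtabmap/TimeThr"   type="string" value="0"/>
    <param name="Rtabmap/MemoryThr" type="string" value="0"/>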
↧
Hardware Setup for Slam and OpenCV
Hi,
currently I am running ROS Kinetic on a Raspberry Pi 3 and using rtabmap with a Kinect v1 for mapping and localization.
I am also trying to run face and human recognition with OpenCV. For that I want to buy a new computing board and camera.
Could anybody recommend a good setup? I am thinking about the NVIDIA Jetson TX1 and the Intel RealSense SR300. My budget for both is around $500.
↧
robot_localization producing 180° rotation jumps only when static publishing map->odom
I have configured `world_frame=odom`. When I statically publish TF map->odom my output of ekf_localization_node is jumping a lot. I've looked all over for the problem but I am stuck!
Here's an 8 second streamable video of the problem seen in rviz:
https://drive.google.com/file/d/0B6qINUvvDwDfMzdWUG1CWWNlMm8/view?usp=sharing
When I comment out...
...the problem goes away and I see good, stable, filtered odom.
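The commented-out element is presumably equivalent to this (the `/map_to_odom` node in the roswtf output below publishes on `/tf_static`, so I assume a tf2_ros static publisher with an identity transform; the values are placeholders):

    <node pkg="tf2_ros" type="static_transform_publisher" name="map_to_odom"
          args="0 0 0 0 0 0 map odom"/>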
My data bag of the problem
https://drive.google.com/file/d/0B6qINUvvDwDfOGxYdVFvUlFMNzg/view?usp=sharing
My ekf_localization_node.launch
(Only the two sensor `_config` matrices survived here; each is 15 booleans over [x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az].)

    [false, false, false,
     true,  true,  true,
     false, false, false,
     false, false, false,
     false, false, false]

    [true,  true,  false,
     false, false, false,
     false, false, false,
     false, false, true,
     false, false, false]
My rtabmap rgbd_odometry.launch
My roswtf
dan@ubuntu:~/URC$ roswtf
Loaded plugin tf.tfwtf
No package or stack in context
================================================================================
Static checks summary:
No errors or warnings
================================================================================
Beginning tests of your ROS graph. These may take awhile...
analyzing graph...
... done analyzing graph
running graph rules...
... done running graph rules
running tf checks, this will take a second...
... tf checks complete
Online checks summary:
Found 2 warning(s).
Warnings are things that may be just fine, but are sometimes at fault
WARNING The following node subscriptions are unconnected:
* /ekf_localization_local:
* /set_pose
* /rqt_gui_py_node_8452:
* /statistics
WARNING The following nodes are unexpectedly connected:
* /rgbd_odometry/zed_odom->/rviz_1492655609198267406 (/tf)
* /rgbd_odometry/zed_odom->/roswtf_9931_1492660614427 (/tf)
* /robot_base_to_camera_link->/rviz_1492655609198267406 (/tf_static)
* /map_to_odom->/rviz_1492655609198267406 (/tf_static)
* /camera_link_to_zed_actual_frame->/rviz_1492655609198267406 (/tf_static)
My bad TF (when publishing static map->odom transform)
$ rosrun tf tf_echo robot_base odom
At time 1492662864.848
- Translation: [0.001, -0.017, -0.038]
- Rotation: in Quaternion [0.639, -0.283, -0.638, 0.324]
in RPY (radian) [1.541, 0.684, -1.535]
in RPY (degree) [88.289, 39.178, -87.969]
My good TF (when NOT publishing static map->odom transform)
$ rosrun tf tf_echo robot_base odom
At time 1492661532.036
- Translation: [0.016, -0.004, 0.000]
- Rotation: in Quaternion [-0.000, -0.000, 0.001, 1.000]
in RPY (radian) [-0.000, -0.000, 0.003]
in RPY (degree) [-0.000, -0.000, 0.154]
I'm using Ubuntu 16.04 on a TX1.
Here's my version of robot_localization
commit 3c7521fb109fcd0fe6090ae8ae407b031838e80d
Merge: fe2f10e cd86690
Author: Tom Moore
Date: Mon Mar 27 19:45:55 2017 +0100
Merge pull request #355 from clearpathrobotics/fix_gains_defaults
Fix acceleration and deceleration gains default logic
I appreciate any help! Thanks!
Fixed typo: my problem transform IS map -> odom
↧
rtabmap frontier exploration
Hello community,
has somebody tried to combine RTABMAP and frontier_exploration ?
How can I solve the problem of exploring an unknown area to build a map first?
Idea:
1. discover the area with frontier_exploration and build a map.
2. navigation in a known environment with discovered map.
Does this make sense?
Thanks for any hints or feedback.
Cheers
Christian
PS: Is Google Cartographer a complete solution?
↧
Where does RTABMAP_Cloud save the Downloaded map and graph from RVIZ?
Hi,
I used rtabmap on my TurtleBot with RVIZ. From the left panel, when I choose Download Map or Graph, the download runs, but I cannot figure out which path the map and graph have been downloaded to. I checked rtabmap.db and it shows it was last updated 3 days ago, even though I did the "Download" today.
Can someone point out where the map and graph from the RTABMAP is saved to when done through RVIZ?
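For reference, the database itself can be inspected directly, assuming the default path:

    rtabmap-databaseViewer ~/.ros/rtabmap.db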
↧
Transform from map to odom for robot using rtabmap
I built a custom robot using URDF and I'm trying to view the position of the robot using the "odom" published from rtabmap. I managed to create a transform from odom to base_footprint but when I run rtabmap I get the following warning:
**Could not get transform from base_footprint to kinect2_rgb_optical_frame**
I've tried to create the transforms within rgbd_mapping_kinect2.launch by changing my **fixed frame** to **base_footprint**
and adding the following transforms:
My Kinect input for my robot model looks like this:
and my tf Tree looks like :http://imgur.com/a/6hTf1
As you can see, **kinect2_base_link** isn't connected to **kinect2_link**. Could you please help with this?
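A sketch of the static transform I think is missing, connecting the two subtrees (the zero offsets are placeholders for the real mounting offset):

    <node pkg="tf" type="static_transform_publisher" name="kinect2_base_to_link"
          args="0 0 0 0 0 0 kinect2_base_link kinect2_link 100"/>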
↧
rtabmap depth_registration ?
I am using rtabmap. rtabmap requires depth/image (sensor_msgs/Image). Does it need a rectified registered depth image or just the raw registered depth image?
I am guessing the registered depth image needs to be in meters.
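For what it's worth, a quick way to check what the driver actually publishes (my understanding is that rtabmap accepts registered depth either as 16UC1 in millimeters or 32FC1 in meters; the topic name here is a guess):

    rostopic echo -n 1 --noarr /camera/depth_registered/image_raw | grep encoding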
↧