ROS Answers: Open Source Q&A Forum

How to install RTAB-Map in ROS Indigo

Hi all, I am using ROS Indigo. I want to create a 3D map of my lab using a hand-held Kinect, and I read that RTAB-Map is the best option, but my ROS installation does not have the RTAB-Map package. Could someone please help me install RTAB-Map?
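
For reference, on Indigo the binary packages can normally be installed with apt-get (assuming the standard ROS package repositories are already configured):

    sudo apt-get update
    sudo apt-get install ros-indigo-rtabmap-ros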

Invalid roslaunch XML syntax

Hi guys, I am unable to use the launch file from the tutorial below without getting "Invalid roslaunch XML syntax". Any suggestions? (I'm a novice.) Thanks in advance. http://wiki.ros.org/rtabmap_ros/Tutorials/SetupOnYourRobot#Remote_mapping
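
For comparison, a minimal file that the roslaunch XML parser accepts looks like the sketch below (the node and parameter shown are only placeholders). An unclosed tag, a stray character pasted from the wiki page, or curly "smart quotes" around attribute values will all produce the same error:

    <launch>
      <!-- every tag closed, plain double quotes around every attribute value -->
      <node pkg="rtabmap_ros" type="rtabmap" name="rtabmap" output="screen">
        <param name="frame_id" type="string" value="base_link"/>
      </node>
    </launch>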

data_recorder node missing from rtabmap_ros package

Hello, I recently installed rtabmap_ros and rtabmap on Ubuntu 14.04 using apt-get:

    sudo apt-get install ros-indigo-rtabmap-ros
    sudo apt-get install ros-indigo-rtabmap

I would like to record stereo images to a database, as is done in this sample [launch file](https://github.com/introlab/rtabmap_ros/blob/master/launch/tests/test_stereo_data_recorder.launch). However, I noticed that the node "[data_recorder](http://wiki.ros.org/rtabmap_ros#data_recorder)" is missing:

    rosrun rtabmap_ros data_recorder
    [rosrun] Couldn't find executable named data_recorder below /opt/ros/indigo/share/rtabmap_ros

The only nodes available from rtabmap_ros are the following:

    rosrun rtabmap_ros
    camera       grid_map_assembler  map_optimizer  rtabmap     stereo_odometry
    data_player  map_assembler       rgbd_odometry  rtabmapviz

Am I missing a library needed to build this node? I did not receive any error or warning when installing the packages from apt-get, and my system is up-to-date. Alternatively, is there another package that can build a database of stereo images to use with the rtabmap odometry viewer tool? Best Regards, Audren
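
In case it helps, one workaround while the binary release lacks the node is to build rtabmap_ros from source in a catkin workspace. A rough sketch, assuming the standalone rtabmap library is already installed and the workspace is ~/catkin_ws:

    cd ~/catkin_ws/src
    git clone https://github.com/introlab/rtabmap_ros.git
    cd ~/catkin_ws
    catkin_make
    source devel/setup.bash
    rosrun rtabmap_ros data_recorder   # should now be found under the workspace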

RTAB-Map remote mapping issue, unable to get data fast enough over WiFi

Hi guys, I am having issues using RTAB-Map on the client side (the base station, my laptop). I have a Kinect on my robot, and the machine it is connected to is set as the ROS master. I have no problem accessing the robot's topics from the client side: when I view the topics individually in rviz on the client, everything works fine. But when I launch rtabmapviz, it lags to the point of failure: it starts to build the map, but it is too slow and eventually fails. I am guessing the issue is getting the data from the robot to the base station over WiFi. I have tried the remote mapping launch file example, but there is no data when I rostopic echo the topics. Any help would be much appreciated. Thanks!
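
For reference, a common way to lighten the WiFi load is to throttle and/or compress the image topics on the robot and let the base station subscribe to the reduced streams. A rough sketch with standard tools (the topic names assume a typical openni2 camera namespace):

    # on the robot: limit the RGB stream to about 5 Hz
    rosrun topic_tools throttle messages /camera/rgb/image_rect_color 5.0 /camera/rgb/image_rect_color_throttled

    # on the base station: rebuild a raw image topic from the compressed transport
    rosrun image_transport republish compressed in:=/camera/rgb/image_rect_color_throttled raw out:=/camera/rgb/image_rect_color_relay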

How to use the SGBM algorithm in stereo_image_proc

stereo_image_proc uses the **StereoBM** algorithm by default. How can we change this to **SGBM**?
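
I believe newer image_pipeline releases expose a stereo algorithm choice through dynamic_reconfigure; if the version you have does not, one workaround is to compute the SGBM disparity yourself from the rectified images that stereo_image_proc already publishes. A rough rospy sketch (topic names, OpenCV 3, and the SGBM settings are all assumptions to be tuned for your camera):

    #!/usr/bin/env python
    # Sketch: compute an SGBM disparity from the rectified stereo pair
    # published by stereo_image_proc. Assumes OpenCV 3 (cv2.StereoSGBM_create).
    import rospy
    import cv2
    import message_filters
    from sensor_msgs.msg import Image
    from cv_bridge import CvBridge

    bridge = CvBridge()
    sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=9)

    def callback(left_msg, right_msg):
        left = bridge.imgmsg_to_cv2(left_msg, desired_encoding='mono8')
        right = bridge.imgmsg_to_cv2(right_msg, desired_encoding='mono8')
        # compute() returns fixed-point values scaled by 16
        disparity = sgbm.compute(left, right).astype('float32') / 16.0
        rospy.loginfo("disparity range: %.1f .. %.1f", disparity.min(), disparity.max())

    rospy.init_node('sgbm_disparity')
    left_sub = message_filters.Subscriber('/stereo_camera/left/image_rect', Image)
    right_sub = message_filters.Subscriber('/stereo_camera/right/image_rect', Image)
    sync = message_filters.ApproximateTimeSynchronizer([left_sub, right_sub], 5, 0.02)
    sync.registerCallback(callback)
    rospy.spin()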

Interfacing Rtabmap and Robot_Localization

Hello, I am working to use robot_localization to take in information from a Piksi GPS, an RPLIDAR 2D lidar, and a UM7 IMU, and give me a good estimate of where I am in space. I want to report this to rtabmap so I can relate this position to the 3D environment of my robot. When I run rtabmap without [robot_localization](http://wiki.ros.org/robot_localization) everything works; however, when I run [robot_localization](http://wiki.ros.org/robot_localization) alongside rtabmap, the odom frame drifts like crazy. I didn't know if this was an issue you have seen before or not. Let me know what you need from me! Here is my launch file folder: https://github.com/zastrix/ROS-Launch-Files.git The rtabmap launch file is called [robo_loco.launch](https://github.com/zastrix/ROS-Launch-Files/blob/master/robo_loco.launch) and the robot_localization launch file is called [rgbd_mapping.launch](https://github.com/zastrix/ROS-Launch-Files/blob/master/rgbd_mapping.launch). Thanks for all the help!
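
Not an answer, but for reference: this symptom is often tied to the frame configuration of ekf_localization_node, so here is a minimal sketch of the frame-related parameters only (the values are assumptions for a setup where the EKF publishes odom->base_link and rtabmap keeps publishing map->odom; it is not the asker's file):

    # ekf_localization_node frame settings (illustration only)
    frequency: 30
    two_d_mode: true            # assumption: roughly planar robot
    map_frame: map
    odom_frame: odom
    base_link_frame: base_link
    world_frame: odom           # fuse continuous sensors here; leave map->odom to rtabmap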

rtabmap_ros mapping without loop closure detection

Hi, I'm using rtabmap_ros with a Kinect in a simulation environment (Gazebo), on Ubuntu 14.04 with ROS Indigo. I'm trying to build a 3D map of a very big structure model (an aircraft model) placed in a Gazebo world. The 3D mapping is done with a Kinect mounted on a UAV that autonomously navigates around the structure. The map starts to be created successfully and incrementally, but after several hours the first mapped parts disappear. Covering the structure takes a lot of time: the path that the UAV follows is very long and does not include loop closures, so I increased RGBD/LocalImmunizationRatio from 0.025 to 0.5 to handle longer paths and set RGBD/LocalLoopDetectionSpace to false, but I still have the same problem. What could be the problem in my case? Here is the part of the launch file with the parameters I used:
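
Since the parameter snippet did not survive above, here is a generic illustration (not the asker's actual launch file) of how these RTAB-Map parameters are typically passed to the rtabmap node as string parameters in roslaunch XML:

    <node pkg="rtabmap_ros" type="rtabmap" name="rtabmap" output="screen">
      <!-- values taken from the question; remappings and other parameters omitted -->
      <param name="RGBD/LocalImmunizationRatio"  type="string" value="0.5"/>
      <param name="RGBD/LocalLoopDetectionSpace" type="string" value="false"/>
    </node>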

use of rtabmap with 3D lidars

I need to scan and get an accurate 3D model of a small building and its surroundings (trees, etc.). I plan on mounting my sensors on a small drone and flying around the building to scan it. The resolution should be good enough to see details like pipes, chimneys, conduits, etc. My main question is whether rtabmap can be used to achieve this. My second question is what the best sensor setup to use with rtabmap for this goal would be. I have a 3D LiDAR unit (Velodyne VLP-16). Can it be used by RTAB-Map to produce a 3D model? Do I need any additional sensors, and why? I can use the velodyne_pointcloud node to get sensor_msgs/PointCloud2 point clouds from the lidar. Can rtabmap produce an accurate 3D model from them? Any insight is greatly appreciated. Thank you.

rtabmap_ros loop closure rejection when doing tutorial

I am using **ROS Indigo** on **Ubuntu 14.04**. When I run the stereo outdoor tutorial for `rtabmap_ros` with the `Stereo_outdoorB.bag` file, as described in this [link](http://wiki.ros.org/rtabmap_ros/Tutorials/StereoOutdoorMapping), I get a `loop closure rejection` error at the end of the bag file. I have attached screenshots here: ![image description](/upfiles/1456494605606321.png) ![image description](/upfiles/14564946327794336.png)

How to use good parameters in rtabmap_ros (stereo_mapping.launch)

I found good parameters with rqt_reconfigure, and with them the cloud does not have speckle. But when I run: roslaunch rtabmap_ros stereo_mapping.launch stereo_namespace:="/stereo_camera" rtabmap_args:="--delete_db_on_start" approximate_sync:=true, my map has speckle. How can I use those parameters when mapping?
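
One way to carry tuned values into the launch (a sketch; --RGBD/AngularUpdate 0.05 below is only a placeholder, substitute the parameters you actually tuned in rqt_reconfigure) is to append them to rtabmap_args, since the rtabmap node accepts --Group/Name value pairs on its command line:

    roslaunch rtabmap_ros stereo_mapping.launch stereo_namespace:="/stereo_camera" approximate_sync:=true rtabmap_args:="--delete_db_on_start --RGBD/AngularUpdate 0.05"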

rtabmap map data (get_map)

Hi, I have a problem loading the map generated with RTAB-Map. I already used the rtabmap/get_map service, but I don't know how to proceed to export the .ply mesh using the command line or rviz. I want to do it from the command line, since rtabmapviz crashes when I load the cloud and then export the mesh because my map is really too big. When I call the get_map service from the command line, I get a lot of information displayed (I saved it to a file); snapshots of this information are available at the following link: [images link](https://www.dropbox.com/sh/7nb0dtha87o2zhk/AACzfkR7aym6iMC_rmK5W2dya?dl=0) Could this information be used to generate a mesh file (.ply), or is there another way besides rtabmapviz to generate a mesh file? Note: I use Ubuntu 14.04 and ROS Indigo, and I performed the mapping with the rtabmap_ros package, writing a launch file to set RTAB-Map up on my robot. Thanks
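
One command-line route (a sketch; the /rtabmap/cloud_map topic name is an assumption, and it yields a point-cloud .ply rather than a reconstructed mesh) is to dump the assembled cloud to .pcd files and convert them with the PCL tools:

    # each incoming cloud is written as a time-stamped .pcd in the current directory
    rosrun pcl_ros pointcloud_to_pcd input:=/rtabmap/cloud_map

    # convert a saved file with the pcl-tools utility
    pcl_pcd2ply <saved_cloud>.pcd cloud_map.ply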

Move_base Parameters and lag in rviz

We are having an interesting problem with move_base interconnecting with our system. We are using RTAB-Map and robot_localization (stereo vision camera, IMU, 2D lidar) to perceive where we are in space and to tell us where obstacles are around us. RTAB-Map and robot_localization work well together when we don't run the move_base node (we have move_base running on an NVIDIA Jetson TX1, RTAB-Map running on a separate computer, and all the sensor nodes running on a little ODROID): whenever we take the sensor information and look at it in rviz, everything works fine with no real errors in localization and mapping.

However, when we run move_base and give the robot a 2D Nav Goal, it seems to drop updates to rviz while the robot is moving, and the robot tends to overshoot its goal and then try to correct itself while moving around like crazy. It keeps overshooting, then trying to correct, then overshooting again. move_base isn't throwing any errors, but this behavior is very strange. Any help will be greatly appreciated. I will attach our param files below so you can see if there is something in there; let me know if you need anything else.

base_local_planner.yaml
DWAPlannerROS:
#
# Robot Configuration Parameters
#
  acc_lim_x: 0.5 # The x acceleration limit of the robot in meters/sec^2
  acc_lim_y: 0 # The y acceleration limit of the robot in meters/sec^2
  acc_lim_th: 1.5 # The rotational acceleration limit of the robot in radians/sec^2

  max_trans_vel: 0.25 # The absolute value of the maximum translational velocity for the robot in m/s
  min_trans_vel: 0.0 # The absolute value of the minimum translational velocity for the robot in m/s

  max_vel_x: 0.25 # The maximum x velocity for the robot in m/s.
  min_vel_x: -0.25 # The minimum x velocity for the robot in m/s, negative for backwards motion.

  max_vel_y: 0.0 # The maximum y velocity for the robot in m/s
  min_vel_y: 0.0 # The minimum y velocity for the robot in m/s

  max_rot_vel: 2.0 # The absolute value of the maximum rotational velocity for the robot in rad/s
  min_rot_vel: 0.5 # The absolute value of the minimum rotational velocity for the robot in rad/s
  
# WARNING:
# These parameters may only be for TrajectoryPlannerROS... they may not work for DWAPlannerROS..
  max_vel_theta: 2.0 # The maximum rotational velocity allowed for the base in radians/sec
  min_vel_theta: -2.0 # The minimum rotational velocity allowed for the base in radians/sec
  min_in_place_vel_theta: 0.6 # The minimum rotational velocity allowed for the base while performing in-place rotations in radians/sec
  holonomic_robot: false # Determines whether velocity commands are generated for a holonomic or non-holonomic robot.
  

#
# Goal Tolerance Parameters
#
  yaw_goal_tolerance: 0.20 # (11 degrees) The tolerance in radians for the controller in yaw/rotation when achieving its goal
  xy_goal_tolerance: 0.30 # (30cm) The tolerance in meters for the controller in the x & y distance when achieving a goal
  latch_xy_goal_tolerance: false #If goal tolerance is latched, if the robot ever reaches the goal xy location it will simply rotate in place, even if it ends up outside the goal tolerance while it is doing so.

#
# Forward Simulation Parameters
#
  sim_time: 1.7 # The amount of time to forward-simulate trajectories in seconds
  sim_granularity: 0.025 # The step size, in meters, to take between points on a given trajectory

  vx_samples: 12 # The number of samples to use when exploring the x velocity space
  vy_samples: 0 # The number of samples to use when exploring the y velocity space
  vtheta_samples: 24 # The number of samples to use when exploring the theta velocity space

  penalize_negative_x: false # Whether to penalize trajectories that have negative x velocities.

#
# Trajectory Scoring Parameters
#
  path_distance_bias: 5.0 # The weighting for how much the controller should stay close to the path it was given
  goal_distance_bias: 9.0 # The weighting for how much the controller should attempt to reach its local goal, also controls speed

  occdist_scale: 0.01 # The weighting for how much the controller should attempt to avoid obstacles
  forward_point_distance: 0.325 # The distance from the center point of the robot to place an additional scoring point, in meters
  stop_time_buffer: 0.2 # The amount of time that the robot must stop before a collision in order for a trajectory to be considered valid in seconds
  scaling_speed: 0.25 # The absolute value of the velocity at which to start scaling the robot's footprint, in m/s
  max_scaling_factor: 0.2 # The maximum factor to scale the robot's footprint by

#
# Global Plan Parameters
#
  prune_plan: true # Defines whether or not to eat up the plan as the robot moves along the path. If set to true, points will fall off the end of the plan once the robot moves 1 meter past them.
  oscillation_reset_dist: 0.05 # How far the robot must travel in meters before oscillation flags are reset
  meter_scoring: true
 

costmap_common_params.yaml

#
# Common Configuration (local_costmap) & (global_costmap)
#
robot_base_frame: base_link
plugins:
  - {name: obstacles_layer, type: "costmap_2d::ObstacleLayer"}
  - {name: inflater_layer, type: "costmap_2d::InflationLayer"}
# Robot Specific
footprint: [[ 0.765, 0.765], [ -0.765, 0.765], [ -0.765, -0.765], [ 0.765, -0.765]] # corrected to include tires in the footprint
footprint_padding: 0.10 # 10cm buffer for safety. maybe change for more precision
inflation_radius: 1.5
transform_tolerance: 1.0 # Specifies the delay in transform (tf) data that is tolerable in seconds.
controller_patience: 2.0 # How long the controller will wait in seconds without receiving a valid control before space-clearing operations are performed.


#added
meter_scoring: true

# base global planner
NavfnROS:
  allow_unknown: true # Specifies whether or not to allow navfn to create plans that traverse unknown space.

inflater_layer:
  inflation_radius: 1.5

obstacles_layer:
  observation_sources: laser zed_obstacles

  laser: {
    observation_persistence: 0.0,
    sensor_frame: lidar_link,
    data_type: LaserScan,
    topic: /scan,
    marking: true,
    clearing: true,
    inf_is_valid: true,
    raytrace_range: 5.0,
    obstacle_range: 5.0
  }
  
  zed_obstacles: {
    data_type: PointCloud2,
    topic: /obstacles,
    marking: true,
    clearing: true,
    max_obstacle_height: 1.5,
    min_obstacle_height: 0.0
  }

global_costmap_params.yaml
global_costmap:
  global_frame: map
  robot_base_frame: base_link
  update_frequency: 1
  publish_frequency: 1
  static_map: true
  rolling_window: false
  width: 40.0
  height: 40.0
  resolution: 0.125
  origin_x: -20
  origin_y: -20

plugins:
  - {name: obstacles_layer, type: "costmap_2d::ObstacleLayer"}
  - {name: inflater_layer, type: "costmap_2d::InflationLayer"}

local_costmap_params.yaml
local_costmap:
  global_frame: odom
  robot_base_frame: base_link
  update_frequency: 5.0
  publish_frequency: 8.0
  static_map: false
  rolling_window: true
  width: 7.0
  height: 7.0
  resolution: 0.125
  origin_x: -3.5
  origin_y: -3.5

#plugins:
#  - {name: obstacles_layer, type: "costmap_2d::ObstacleLayer"}
#  - {name: inflater_layer, type: "costmap_2d::InflationLayer"}

We are pretty sure this sporadic behavior in rviz has something to do with move_base; however, if you think otherwise, we are open to suggestions. We can also supply any other setup files needed! Let me know if you need any other information from me. We are at a standstill with this project until we figure out this issue. We are continuing testing to try to solve it, but we have no idea what we are looking for!

Can't run rtabmap from roslaunch

I installed **rtabmap** with `$ sudo apt-get install ros-indigo-rtabmap-ros` but I'm not able to launch it as expected:

    $ roslaunch rtabmap_ros rtabmap.launch
    [rtabmap.launch] is neither a launch file in package [rtabmap_ros] nor is [rtabmap_ros] a launch file name
    The traceback for the exception was written to the log file

Because I installed a premade binary, do I need to create my own launch files? If so, where/how do I do that?

I've tried to build from source, but I'm running into a number of PCL/OpenNI issues. Both are installed and working, but I believe there's some sort of conflict with how I built one or both of them. Unfortunately the error output is so long that it fills up the terminal so far that I can't even scroll up to see the beginning. I'm recompiling PCL right now with all related options enabled to see if that will help (it's taking some time). Suggestions?

** EDIT **

I'm trying to compile on both `armv7` and `x86_64`. Here's the much smaller error output on the `armv7`:

    $ make
    [  6%] Built target rtabmap_utilite
    [  7%] Built target res_tool
    [ 42%] Built target rtabmap_core
    [ 87%] Built target rtabmap_gui
    Linking CXX executable ../../../bin/rtabmap
    ../../../bin/librtabmap_core.so.0.11.4: undefined reference to `pcl::OrganizedFastMesh::performReconstruction(pcl::PolygonMesh&)'
    ../../../bin/librtabmap_core.so.0.11.4: undefined reference to `pcl::OrganizedFastMesh::performReconstruction(std::vector<pcl::Vertices>&)'
    collect2: error: ld returned 1 exit status
    make[2]: *** [../bin/rtabmap] Error 1
    make[1]: *** [app/src/CMakeFiles/rtabmap.dir/all] Error 2
    make: *** [all] Error 2

Do I need to add the `OrganizedFastMesh` file somewhere?
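
On the armv7 link error: the missing OrganizedFastMesh symbols come from PCL's surface module, so one thing worth checking (a sketch; the source path and options are assumptions) is that your PCL build has that module enabled and installed before rtabmap is relinked:

    cd ~/pcl/build
    cmake .. -DCMAKE_BUILD_TYPE=Release -DBUILD_surface=ON
    make -j2
    sudo make install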

Using RTABMap and Kinect, how do you get distance at a certain point?

Hi. I appreciate all the efforts and amazing work in the ROS community. In my situation, I am trying to estimate the distance to an object using RTAB-Map, ROS, and a Kinect 360. For my implementation, we are using ROS Indigo and libfreenect. In our system, we are able to display the robot's field of view on a web page using roslibjs. When the user clicks on a certain (x, y) coordinate in that video stream, we want to obtain an estimated distance to the object at that point. Is there an API in RTAB-Map/ROS that can help me obtain this information easily? It looks like the information should be available in the following topic: /camera/depth_registered/image_raw (the raw image from the device, containing uint16 depths in mm), but I'm not sure how I might extract/read it. I do appreciate your help and pointers. All the best!
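
As far as I know there is no single RTAB-Map service for this, but the depth image itself is easy to read. A minimal rospy sketch (it assumes the uint16 millimetre encoding mentioned above and uses hypothetical pixel coordinates u, v in place of the clicked point):

    #!/usr/bin/env python
    # Sketch: report the depth (in metres) at one pixel of the registered depth image.
    # Assumes 16UC1 depth in millimetres on /camera/depth_registered/image_raw.
    import rospy
    from sensor_msgs.msg import Image
    from cv_bridge import CvBridge

    bridge = CvBridge()
    u, v = 320, 240   # hypothetical clicked pixel (column, row)

    def depth_callback(msg):
        depth = bridge.imgmsg_to_cv2(msg, desired_encoding='passthrough')  # uint16 array, mm
        d_mm = depth[v, u]   # numpy indexing: row first, then column
        if d_mm == 0:
            rospy.loginfo("no depth at (%d, %d)", u, v)
        else:
            rospy.loginfo("distance at (%d, %d): %.3f m", u, v, d_mm / 1000.0)

    rospy.init_node('depth_at_pixel')
    rospy.Subscriber('/camera/depth_registered/image_raw', Image, depth_callback)
    rospy.spin()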

How to offset costmap in rviz?

Hi there! I'm running the examples found here: http://wiki.ros.org/rtabmap_ros/Tutorials/MappingAndNavigationOnTurtlebot Now, I don't have a TurtleBot, so my Kinect is placed a lot higher up. This makes the map visualize pretty weirdly in rviz (it's floating in mid-air). How do I offset it to, say, 1 m below its current position on the z-axis? Do I make a new frame with /base_link or similar as the parent and edit a few launch files, and if yes, which launch files? If not, how do I move the map? (The map position is not editable in rviz.) Let me know if anything was unclear, thank you!
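
If the problem is only that TF has no link describing how high the camera is mounted, one option (a sketch; the frame names and the 1.0 m height are assumptions for your setup) is to publish a static transform between the robot base frame and the camera frame instead of editing the map itself:

    <!-- base_link -> camera_link: camera mounted 1.0 m above the base (adjust to your robot) -->
    <node pkg="tf" type="static_transform_publisher" name="camera_base_link"
          args="0 0 1.0 0 0 0 base_link camera_link 100"/>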

rtabmap point cloud unit (m, mm, inch?)

I was wondering how to set the unit that is used in the point cloud?

Does rtabmap deal with moving objects?

I was reading this guide: http://wiki.ros.org/rtabmap_ros/Tutorials/HandHeldMapping I am unsure whether rtabmap handles moving objects. For example, after generating a map, if a chair moves, or if the mapping captured a person who was walking through, do these things throw off rtabmap's localization?

rtabmap_ros how to set odometry and camera?

Hello, I am trying to use an Xtion with a Kobuki (a 2-wheel mobile robot). I read http://wiki.ros.org/rtabmap_ros/Tutorials/SetupOnYourRobot and applied it to my own launch file: I modified [rgbd_mapping.launch](https://github.com/introlab/rtabmap_ros/blob/master/launch/rgbd_mapping.launch) and the program runs normally, but the result is strange. First, I placed my Xtion at (0.11 cm, 0, 0.16 cm) from the robot base, so I added a tf broadcaster. The result is at the link below: https://plus.google.com/110617758145783258429/posts/RCTNymgamWp 1. The point cloud and the map cloud are different. Is that result normal? 2. The map is bad when the robot turns around and comes back to the origin: https://plus.google.com/110617758145783258429/posts/KcxDVQqxNyg Full launch file: ****
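
For reference, a fixed camera offset like this can also be published with a one-line static transform instead of a custom broadcaster. A sketch, assuming the offsets above were meant in metres and the usual base_link/camera_link frame names:

    <node pkg="tf" type="static_transform_publisher" name="xtion_base_link"
          args="0.11 0 0.16 0 0 0 base_link camera_link 100"/>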

From 3D to 2D - where to start?

I need to convert some 3D maps made with rgbdslam/rtabmap into 2D grid maps, but I really don't know where to start. Can someone please answer a few questions so I can get this done? 1.) rgbdslam: it can save octomaps via the GUI, but the file extension of the saved files is .pcd instead of .bt/.ot. Why? 2.) I read a lot about saving maps while processing with "rosrun map_server map_saver -f filename". I tried that while rgbdslam/rtabmap were running; the response is always "waiting for map". Do I have to install octomap_server into the same workspace as both SLAM packages? And if that is the solution, will I finally have a 2D grid map by running this command? As you can see from my questions, I'm a newbie. I know this question has been asked and answered multiple times before, but I was not able to make it work. I would really appreciate it if someone could help me out. THANK YOU
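
One command-line sketch for the 2D route is to point map_saver at the occupancy-grid topic that rtabmap publishes instead of the default /map (the /rtabmap/proj_map name is an assumption; check rostopic list for the grid topic your version actually advertises):

    rosrun map_server map_saver -f my_2d_map map:=/rtabmap/proj_map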

How do you use the iRobot Create 2, a ROS laptop running Kinetic and a Kinect v1 together?

If you already know about TurtleBot, scroll down to see my errors when trying to run the TurtleBot software. How do you use the iRobot Create 2, an Ubuntu laptop running ROS Kinetic, and a Kinect v1 together? I have tried so many different methods and they all failed! My knowledge of ROS is very, very limited because I only started a few weeks ago. Can someone please help me ASAP? I also want to run RViz and navigate around the house. Thanks! (By the way, I also have rtabmap_ros for the Kinect and that works really well, but the Create 2 does not!)
Edit: I have also tried the TurtleBot software and some other packages, and they all failed.
Edit 2: I have tried it on a Windows desktop not running ROS and it works perfectly!
Edit 3: I think most of the errors I get are from catkin_make, where I run into CMake errors and things like "Package not found: ros/time.h". I also have a Kinect 2 that I can use if my Kinect 1 fails, but the iRobot Create is still not working!
Edit 4: I tried running `catkin_make` on the TurtleBot packages (I don't know if they are compatible with a Create 2), but all I get is:

    This workspace contains non-catkin packages in it, and catkin cannot build a non-homogeneous workspace without isolation. Try the 'catkin_make_isolated' command instead.

Then, when I run catkin_make_isolated, I get:

    error: #error This file requires compiler and library support for the ISO C++ 2011 standard. This support must be enabled with the -std=c++11 or -std=gnu++11 compiler options.

followed by tons more errors. I tried running it with both of the options above, but all I get is:

    catkin_make_isolated -std=gnu++11
    usage: catkin_make_isolated [-h] [-C WORKSPACE] [--source SOURCE] [--build BUILD] [--devel DEVEL] [--merge] [--install-space INSTALL_SPACE] [--use-ninja] [--install] [--force-cmake] [--no-color] [--pkg PKGNAME [PKGNAME ...] | --from-pkg PKGNAME] [--only-pkg-with-deps ONLY_PKG_WITH_DEPS [ONLY_PKG_WITH_DEPS ...]] [-q] [--cmake-args [CMAKE_ARGS [CMAKE_ARGS ...]]] [--make-args [MAKE_ARGS [MAKE_ARGS ...]]] [--catkin-make-args [CATKIN_MAKE_ARGS [CATKIN_MAKE_ARGS ...]]] [--override-build-tool-check]
    catkin_make_isolated: error: unrecognized arguments: -std=gnu++11

Edit 5: I have ROS Kinetic Kame, so that's probably why nothing at all is supported. I would really like some updates on whether TurtleBot can work with Kinetic Kame, because it is not working for me!
Help please?!??!!??? -Ayan
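
Regarding Edit 4: compiler flags have to be passed through --cmake-args rather than given to catkin_make_isolated directly. A sketch:

    catkin_make_isolated --cmake-args -DCMAKE_CXX_FLAGS=-std=c++11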