Demos on Gazebo

Before running the demos, get acquainted with the setup section of the documentation. Make sure you read: Using only the simulated robot

Launching the Simulation

  1. To launch the simulator:

    roslaunch robotont_gazebo gazebo.launch
    

The launch file has three arguments:

  • model - selects between a model with an Intel NUC and a RealSense camera and a model without them

    • default: robotont_gazebo_nuc

    • options: robotont_gazebo_nuc, robotont_gazebo_basic

  • world - chooses which world to use

    • default: empty.world

    • options: empty.world, minimaze.world, bangbang.world, between.world, colors.world

  • x_pos - sets the x coordinate at which the robot spawns, default: 0

For example, the following command spawns the robotont_gazebo_nuc model into the bangbang.world world at position x=2:

roslaunch robotont_gazebo gazebo.launch world:=$(rospack find robotont_gazebo)/worlds/bangbang.world model:=robotont_gazebo_nuc x_pos:=2

Worlds

  1. minimaze.world

    _images/maze.png

    To run

    roslaunch robotont_gazebo world_minimaze.launch
    
  2. bangbang.world

    _images/bangbang.png

    To run

    roslaunch robotont_gazebo world_bangbang.launch
    
  3. between.world

    _images/between.png

    To run

    roslaunch robotont_gazebo world_between.launch
    
  4. colors.world

    _images/colors.png

    To run

    roslaunch robotont_gazebo world_colors.launch
    

2D Mapping and Localization

Installation

The following packages are needed to run the 2D mapping demo:

sudo apt update
sudo apt install ros-noetic-depthimage-to-laserscan
sudo apt install ros-noetic-move-base
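The depthimage_to_laserscan package converts a horizontal slice of the depth camera image into a planar laser scan that the SLAM nodes can consume. The sketch below shows the per-pixel geometry in simplified form; the real node uses the camera intrinsics and handles invalid pixels, whereas the bare field-of-view model here is only an approximation for illustration.

```python
import math

def depth_row_to_scan(depths, h_fov):
    """Convert one row of a depth image (metres) into laser-scan ranges.

    Each pixel's ray angle is spread evenly across the horizontal field
    of view; the reported range is the distance along that ray.
    """
    n = len(depths)
    ranges = []
    for i, d in enumerate(depths):
        # Angle of this pixel's ray, 0 at the image centre.
        angle = h_fov * (i / (n - 1) - 0.5)
        # The depth value is measured along the optical axis; divide by
        # cos(angle) to get the distance along the pixel's own ray.
        ranges.append(d / math.cos(angle))
    return ranges
```

For a flat wall one metre ahead, the centre pixel reports a range of exactly 1 m while the edge pixels report slightly longer ranges along their slanted rays.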

To run the 2D mapping demo, you need to clone the base package:

git clone https://github.com/robotont-demos/demo_slam.git

and choose a mapping method from the following:

  1. Cartographer

  2. Gmapping

  3. Hector SLAM
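Whichever method you pick, the output is an occupancy grid built from the laser scans. As background, here is a minimal sketch of the log-odds cell update that grid-based SLAM methods build on; the update values are illustrative, not the actual parameters of any of these packages.

```python
import math

def update_cell(log_odds, hit, l_occ=0.85, l_free=-0.4):
    """One log-odds update for an occupancy-grid cell.

    hit=True  -> the laser endpoint fell in this cell (evidence: occupied)
    hit=False -> the beam passed through this cell (evidence: free)
    """
    return log_odds + (l_occ if hit else l_free)

def probability(log_odds):
    """Convert log-odds back to an occupancy probability in [0, 1]."""
    return 1.0 - 1.0 / (1.0 + math.exp(log_odds))
```

A cell starts at log-odds 0 (probability 0.5, unknown); repeated hits push it towards 1, repeated pass-throughs towards 0.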

Gmapping and AMCL

Installation

Clone the packages for the Gmapping method and the teleop keyboard:

git clone https://github.com/robotont-demos/demo_slam_gmapping.git
git clone https://github.com/robotont-demos/demo_teleop_keyboard.git

Running the demo

  1. Launch the simulator

    roslaunch robotont_gazebo world_minimaze.launch
    
  2. Launch teleop keyboard

    roslaunch demo_teleop teleop_keyboard.launch
    
  3. Launch 2d_slam.launch

    roslaunch demo_slam_gmapping 2d_slam.launch
    
  4. Display the map on RViz

    roslaunch demo_slam 2d_slam_display.launch
    

Cartographer

Installation

Clone the packages for the Cartographer method and the teleop keyboard:

git clone https://github.com/robotont-demos/demo_slam_cartographer.git
git clone https://github.com/robotont-demos/demo_teleop_keyboard.git

Running the demo

  1. Launch the simulator

    roslaunch robotont_gazebo world_minimaze.launch
    
  2. Launch teleop keyboard

    roslaunch demo_teleop teleop_keyboard.launch
    
  3. Launch 2d_slam.launch

    roslaunch demo_slam_cartographer 2d_slam.launch
    
  4. Display the map on RViz

    roslaunch demo_slam 2d_slam_display.launch
    

Hector SLAM

Installation

Clone the packages for the Hector SLAM method and the teleop keyboard:

git clone https://github.com/robotont-demos/demo_slam_hector.git
git clone https://github.com/robotont-demos/demo_teleop_keyboard.git

Running the demo

  1. Launch the simulator

    roslaunch robotont_gazebo world_minimaze.launch
    
  2. Launch teleop keyboard

    roslaunch demo_teleop teleop_keyboard.launch
    
  3. Launch 2d_slam.launch

    roslaunch demo_slam_hector 2d_slam.launch
    
  4. Display the map on RViz

    roslaunch demo_slam 2d_slam_display.launch
    

Setting 2D navigation goals

  1. Using ROS Navigation to make the robot move autonomously is straightforward. RViz provides two GUI buttons: one to tell the robot where it is located (if it fails to localize accurately at startup) and one to tell it where to go.

  2. To set the initial pose, click on 2D Pose Estimate and drag the arrow to match the robot's actual position and orientation.

    _images/poseestimatearrow.png
  3. To tell the robot where to go, click on 2D Nav Goal and drag the arrow to the target position, pointing it in the direction the robot should face.

_images/2dnavgoalarrow.png
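Under the hood, both buttons publish a pose message: 2D Pose Estimate on /initialpose and 2D Nav Goal on /move_base_simple/goal. The sketch below uses a plain dict standing in for the geometry_msgs pose fields to show how the dragged arrow's position and yaw become that message; a real node would build the actual message types with rospy.

```python
import math

def nav_goal(x, y, yaw):
    """Build the pose published when the 2D Nav Goal arrow is released.

    The click point fixes (x, y) and the drag direction fixes the yaw;
    the yaw is encoded as a quaternion rotated about the z axis.
    """
    return {
        "position": {"x": x, "y": y, "z": 0.0},
        "orientation": {
            "x": 0.0,
            "y": 0.0,
            "z": math.sin(yaw / 2.0),
            "w": math.cos(yaw / 2.0),
        },
    }
```

For example, a goal two metres ahead facing left is nav_goal(2.0, 0.0, math.pi / 2).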

3D mapping

Creates a 3D map of the robot’s surroundings.

Installation

  1. For 3D mapping:

    sudo apt install ros-noetic-rtabmap-ros
    

and clone the following packages:

git clone https://github.com/robotont-demos/demo_mapping_3d.git
git clone https://github.com/robotont-demos/demo_teleop_keyboard.git

Running the demo

  1. Launch the simulator

    roslaunch robotont_gazebo world_colors.launch
    
  2. Launch mapping_3d.launch

    roslaunch demo_mapping_3d mapping_3d.launch
    
  3. Launch mapping_3d_display.launch to visualize the result

    roslaunch demo_mapping_3d mapping_3d_display.launch
    
  4. To move the robot, open another terminal window and launch the teleop keyboard

    roslaunch demo_teleop teleop_keyboard.launch
    
    

    Hint

    Notice that the teleop node only receives keypresses when the terminal window is active.
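A teleop node of this kind reads single keypresses from the active terminal and turns each one into a velocity command on cmd_vel. A minimal sketch of that mapping follows; the key bindings and speed scales are illustrative and may differ from demo_teleop's actual bindings.

```python
# Illustrative key-to-velocity map in the spirit of teleop_twist_keyboard.
# Each entry is (linear direction, angular direction).
KEY_BINDINGS = {
    "i": (1.0, 0.0),   # forward
    ",": (-1.0, 0.0),  # backward
    "j": (0.0, 1.0),   # rotate left
    "l": (0.0, -1.0),  # rotate right
    "k": (0.0, 0.0),   # stop
}

def key_to_twist(key, linear_scale=0.5, angular_scale=1.0):
    """Translate one keypress into (linear x, angular z) velocities.

    Unknown keys stop the robot, which is the safe default.
    """
    lin, ang = KEY_BINDINGS.get(key, (0.0, 0.0))
    return lin * linear_scale, ang * angular_scale
```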

_images/3d_mapping_gazebo.png

The following AR demos identify and track the pose of a provided AR tag and act accordingly.

Follow the leader

The follow-the-leader demo shows the capability of the Robotont platform to detect and follow an AR tag.

Installation

  1. For AR tracking:

    git clone https://github.com/machinekoder/ar_track_alvar.git -b noetic-devel
    git clone https://github.com/robotont-demos/demo_ar_follow_the_leader.git
    

Running the demo

In the works.
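While the demo itself is still in the works, follow-the-leader behaviour amounts to a feedback loop on the tag pose reported by ar_track_alvar. Below is a hypothetical proportional controller for that loop, a sketch rather than the demo's actual code; the gains and target distance are made-up values.

```python
import math

def follow_cmd(tag_x, tag_y, target_dist=0.5, k_lin=0.8, k_ang=1.5):
    """Proportional velocities to keep the robot target_dist from the tag.

    tag_x: distance to the tag straight ahead of the robot (metres)
    tag_y: lateral offset of the tag, positive to the left (metres)
    Returns (linear x, angular z) for a cmd_vel-style command.
    """
    linear = k_lin * (tag_x - target_dist)       # close the distance gap
    angular = k_ang * math.atan2(tag_y, tag_x)   # turn towards the tag
    return linear, angular
```

At the target distance with the tag dead ahead, both commands are zero; the robot holds position until the leader moves again.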

AR steering

The AR steering demo shows the capability of the Robotont platform to detect an AR tag and steer accordingly.

Installation

  1. For AR tracking:

    git clone https://github.com/machinekoder/ar_track_alvar.git -b noetic-devel
    git clone https://github.com/robotont-demos/demo_ar_steering.git
    
Running the demo

  1. Launch ar_steering.launch (replace tag_nr with your AR tag number)

    roslaunch demo_ar_steering ar_steering.launch marker_id:=tag_nr
    
  2. Launch the simulator

    roslaunch robotont_gazebo world_colors.launch
    

AR maze

The Gazebo AR maze demo shows the capability of the Robotont platform to detect and follow AR tags in a maze world.

Installation

  1. For AR tracking:

    git clone https://github.com/machinekoder/ar_track_alvar.git -b noetic-devel
    git clone https://github.com/robotont-demos/demo_ar_maze.git
    
Running the demo

  1. Launch gazebo_ar_maze.launch

    roslaunch demo_ar_maze gazebo_ar_maze.launch
    
  2. Launch the simulator

    roslaunch robotont_gazebo world_minimaze_ar.launch