Demos on Gazebo

Before running the demos, get acquainted with the setup section of the documentation. In particular, make sure you have read: Using only the simulated robot

Launching the Simulation

  1. Navigate to the src directory of your colcon workspace and clone the robotont_gazebo package:

    cd ~/<your_colcon_workspace>/src
    git clone https://github.com/robotont/robotont_gazebo.git
    
  2. Build and source the newly added package from the root of your workspace:

    cd ~/<your_colcon_workspace>
    colcon build --packages-select robotont_gazebo
    source install/setup.bash
    
  3. Launch the simulator using the launch file:

    ros2 launch robotont_gazebo gazebo.launch.py
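
Once Gazebo is up, you can sanity-check the simulation from another terminal in which the workspace has been sourced. These are generic ROS 2 CLI commands; the exact node and topic names depend on the model that was loaded:

ros2 node list
ros2 topic list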
    

Launch file arguments

  generation   Generation of the robotont model to be loaded. Options: 2.1, 3 (default).
  model        Model to be loaded into the world. Options: robotont_gazebo_basic, robotont_gazebo_lidar, robotont_gazebo_nuc (default).
  world        World the robot is spawned in. Options: bangbang.sdf, between.sdf, colors.sdf, mapping.sdf, maze.sdf, minimaze.sdf, minimaze_ar.sdf, empty_world.sdf (default).
  x, y, z      The robot’s spawn pose. Options: any number, 0 (default).

Tip

For example, to load the default generation 3 model into the colors.sdf world at pose (-2, 1, 0):

ros2 launch robotont_gazebo gazebo.launch.py world:=colors.sdf x:=-2 y:=1
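
The other arguments are passed in the same way. For example, to load the lidar model of the robot into the maze world:

ros2 launch robotont_gazebo gazebo.launch.py model:=robotont_gazebo_lidar world:=maze.sdf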

Worlds

  minimaze.sdf (example: _images/minimaze_world_example.png)

    ros2 launch robotont_gazebo gazebo.launch.py world:=minimaze.sdf

  bangbang.sdf (example: _images/bangbang_world_example.png)

    ros2 launch robotont_gazebo gazebo.launch.py world:=bangbang.sdf

  between.sdf (example: _images/between_world_example.png)

    ros2 launch robotont_gazebo gazebo.launch.py world:=between.sdf

  colors.sdf (example: _images/colors_world_example.png)

    ros2 launch robotont_gazebo gazebo.launch.py world:=colors.sdf

  mapping.sdf (example: _images/mapping_world_example.png)

    ros2 launch robotont_gazebo gazebo.launch.py world:=mapping.sdf

  maze.sdf (example: _images/maze_world_example.png)

    ros2 launch robotont_gazebo gazebo.launch.py world:=maze.sdf

  minimaze_ar.sdf (example: _images/minimaze_ar_world_example.png)

    ros2 launch robotont_gazebo gazebo.launch.py world:=minimaze_ar.sdf

2D Mapping and Localization

Setup

Hint

Before installing any packages from apt, make sure existing packages are up-to-date:

sudo apt update && sudo apt upgrade -y

Hint

ROS packages installed from apt are only available in terminals where the ROS environment has been sourced. To use these packages, you must first source the general ROS 2 environment:

source /opt/ros/jazzy/setup.bash

  1. Install Nav2 from apt:

    sudo apt install ros-jazzy-navigation2
    
  2. Navigate to the src directory of your colcon workspace:

    cd ~/<your_colcon_workspace>/src
    
  3. Clone the depthimage_to_laserscan package:

    git clone https://github.com/ros-perception/depthimage_to_laserscan.git --branch ros2
    
  4. Build the package from the root of your workspace:

    cd ~/<your_colcon_workspace>
    colcon build --packages-select depthimage_to_laserscan
    

The demo for 2D SLAM-based navigation is available from this repository.

  1. Navigate to the src directory of your colcon workspace:

    cd ~/<your_colcon_workspace>/src
    
  2. Clone the 2d_slam package:

    git clone https://github.com/robotont-demos/2d_slam.git
    
  3. Build the package from the root of your workspace:

    cd ~/<your_colcon_workspace>
    colcon build --packages-select 2d_slam
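
After building, source the workspace overlay in every new terminal you use for the demo:

source ~/<your_colcon_workspace>/install/setup.bash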
    

Running the demo

The demo can be run on a Robotont equipped with either a LIDAR or the standard Intel RealSense D435i camera.

  1. Spawn the LIDAR Robotont in a Gazebo world:

    ros2 launch robotont_gazebo gazebo.launch.py model:=robotont_gazebo_lidar world:=<world_name>.sdf
    
  2. Launch the navigation stack and SLAM:

    ros2 launch 2d_slam nav2_lidar_slam.launch.py
    
  3. (Optional) Visualize costmaps and the robot’s model in RViz2:

    ros2 launch 2d_slam rviz2_visualize_costmaps.launch.py
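
To actually build a map, drive the robot around until the area of interest has been covered. One common option, assuming the teleop_twist_keyboard package is installed and the robot listens on the default /cmd_vel topic, is:

ros2 run teleop_twist_keyboard teleop_twist_keyboard

Once the map looks complete, it can be saved with Nav2’s map saver (my_map is just an example file name):

ros2 run nav2_map_server map_saver_cli -f my_map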
    

Setting 2D navigation goals

Using the ROS 2 navigation stack (Nav2) to make the robot move autonomously is straightforward. In RViz2 there are two main GUI tools: one to set the robot’s current location (if it does not localize itself accurately at startup) and one to set its navigation goal.

  1. To set the initial pose:

    Click on “2D Pose Estimate” in the RViz toolbar, then click and drag the arrow to indicate where the robot is located and which way it is facing.

    _images/pose_estimate.gif
  2. To set a navigation goal:

    Click on “2D Goal Pose” in the RViz toolbar, then click and drag the arrow to the desired destination and orientation for the robot.

    _images/nav_goal.gif
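
Goals can also be sent without the RViz2 GUI. A minimal command-line sketch, assuming Nav2 exposes its default navigate_to_pose action and that the map frame is named map:

ros2 action send_goal /navigate_to_pose nav2_msgs/action/NavigateToPose "{pose: {header: {frame_id: map}, pose: {position: {x: 1.0, y: 0.5, z: 0.0}, orientation: {w: 1.0}}}}"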

3D mapping

Creates a 3D map of the robot’s surroundings.

_images/wip.gif

Follow the leader

The follow the leader demo shows the Robotont platform’s ability to detect and follow an AR tag.

_images/wip.gif

AR steering

The AR steering demo shows how the Robotont platform can be steered using an AR tag.

_images/wip.gif

AR maze

The AR maze demo shows the Robotont platform’s ability to detect and follow AR tags.

_images/wip.gif