👀🎮 Visualization and Teleop

The goal of this section is to run the robot simulation using Gazebo, visualise the SMB robot in RViz, and control the robot.

First of all, make sure that the smb_gazebo package has already been built successfully.

```bash
# In the host pc, build the smb_gazebo package if you haven't already
smb_build_packages_up_to meta_smb_sim
```

👀 Visualisation

Run the following command on the host PC to visualise the robot.

```bash
# In the host pc
ros2 launch smb_gazebo gazebo.launch.py
```

You do not need a connection to the SMB to run the simulation.

Wait until Gazebo launches.
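
If you want to confirm that the simulation is up before moving on, you can inspect the ROS 2 topics from another terminal. This is only a sanity-check sketch; whether a `/clock` topic is bridged depends on the launch configuration, so treat that topic name as an assumption.

```bash
# List the active ROS 2 topics once Gazebo is running.
ros2 topic list

# If a simulated clock is bridged (an assumption about this launch file),
# a steady publishing rate confirms that simulation time is advancing.
ros2 topic hz /clock
```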

To run state estimation and mapping, use the following command:

```bash
ros2 launch smb_bringup smb_sim_se.launch.py
```
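
To verify that state estimation is producing output, you can look for odometry-like topics. The exact topic names are not specified here, so the grep pattern and the placeholder below are only a guess.

```bash
# Look for odometry/state-estimation topics (names are an assumption).
ros2 topic list | grep -i odom

# Check that one of them is publishing at a steady rate.
# Replace <odometry_topic> with a topic found by the command above.
ros2 topic hz <odometry_topic>
```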

🎮 Teleoperation

You can drive the robot with the keyboard. While pressing the keys, make sure that the terminal in which the teleop_twist_keyboard node is running (i.e. the one where you run the alias below) is in focus. To use the keyboard, run the following alias:

```bash
# In the host pc
smb_teleop_twist_keyboard
```
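
If the alias is not available in your shell, the stock ROS 2 keyboard teleop node can usually be run directly. The command-velocity topic used by the SMB stack is an assumption here, so adjust the remapping if the robot listens on a different topic.

```bash
# Run the standard keyboard teleop node and remap its output.
# The /cmd_vel target topic is assumed; check `ros2 topic list` if unsure.
ros2 run teleop_twist_keyboard teleop_twist_keyboard --ros-args -r cmd_vel:=/cmd_vel
```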

- Use the following keys to move:

    | `u` | `i` | `o` |
    |-----|-----|-----|
    | `j` | `k` | `l` |
    | `m` | `,` | `.` |

- Use the following keys to control speed:

    - `q/z`: Increase/decrease max speeds by 10%
    - `w/x`: Increase/decrease only linear speed by 10%
    - `e/c`: Increase/decrease only angular speed by 10%

- Use any other key to stop.
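
As an alternative to the keyboard, you can publish velocity commands directly from the command line. The `/cmd_vel` topic name below is an assumption; check `ros2 topic list` for the topic the robot actually subscribes to.

```bash
# Drive forward at 0.2 m/s, publishing at 10 Hz; stop with Ctrl+C.
# The /cmd_vel topic name is an assumption for this stack.
ros2 topic pub -r 10 /cmd_vel geometry_msgs/msg/Twist \
  "{linear: {x: 0.2, y: 0.0, z: 0.0}, angular: {x: 0.0, y: 0.0, z: 0.0}}"
```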

🧭 Navigation

The goal of this section is to run the robot simulation using Gazebo, visualise the SMB robot in RViz, and set waypoints and goal points to demonstrate the autonomous navigation capabilities of the robot.

```bash
# In the host pc
ros2 launch smb_bringup smb_sim_navigation.launch.py
```
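
Waypoints and goal points are normally set through the RViz GUI, but for a quick test a goal pose can also be published from the command line. This is only a sketch: the `/goal_pose` topic name and the `map` frame are assumptions about this launch configuration.

```bash
# Publish a single goal pose 2 m ahead of the map origin.
# Topic name (/goal_pose) and frame_id ('map') are assumptions; check
# `ros2 topic list` for the topic the navigation stack subscribes to.
ros2 topic pub --once /goal_pose geometry_msgs/msg/PoseStamped \
  "{header: {frame_id: 'map'}, pose: {position: {x: 2.0, y: 0.0, z: 0.0}, orientation: {w: 1.0}}}"
```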

🔍 Exploration

In this section we will see a fully autonomous robot exploring the environment, working with the FAR planner and ground truth. 🥳

```bash
# In the host pc
ros2 launch smb_bringup smb_sim_exploration.launch.py
```
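
Once exploration is running, the robot should start moving on its own. A simple way to confirm this from a terminal is to watch the velocity commands being produced; the `/cmd_vel` topic name is again an assumption.

```bash
# Report the publishing rate of the velocity command topic while the
# exploration planner is driving the robot (topic name is assumed).
ros2 topic hz /cmd_vel
```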

🕵️‍♂️ Object Detection

In this section, we’ll demonstrate how to perform object detection on the SMB robot using a YOLOv5 model. This includes downloading the model, launching detection nodes, adding objects in simulation, and visualizing detections in RViz.

Create a new models folder inside the object detection package and place the yolov5l6.onnx file in it. From the workspace root (/smb_ros2_workspace), the following commands create the folder and download the model to the path expected by the wget target below:

```bash
# In the host pc, from the workspace root
mkdir -p src/perception/smb_object_detection/object_detection/models
wget -O src/perception/smb_object_detection/object_detection/models/yolov5l6.onnx https://pub-3ad3dd2988de4537a845ed6aaa048dc4.r2.dev/yolov5l6.onnx
```
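
Before building, it is worth checking that the model file was downloaded completely; a failed download would leave a missing or very small file.

```bash
# The yolov5l6.onnx file should exist and be non-trivially large.
ls -lh src/perception/smb_object_detection/object_detection/models/
```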

```bash
# In the host pc, build the object detection packages
smb_build_packages_up_to object_detection object_detection_msgs
```

Now it is time to launch the simulation and the detection node. Each `ros2 launch` command blocks its terminal, so run them in separate terminals:

```bash
# In the host pc (one launch per terminal)
ros2 launch smb_bringup gazebo.launch.py
ros2 launch object_detection object_detection.launch.py
```
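
To check that the detector is up, you can search the topic list for detection-related topics. The exact names depend on the launch file, so the grep pattern is only a guess.

```bash
# Detection output topics (names are an assumption) should show up here.
ros2 topic list | grep -i detect
```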

Start by adding objects in the simulation. Scroll down the list of available models and download the object you want:

Here we choose a humanoid…

But it's lying on the ground, so how do we get it up?

To move the humanoid, click Apply Force Torque and drag with the mouse using `Ctrl` + right click.

Now, let's try another object.

Launch RViz in a separate terminal:

```bash
rviz2
```

Add an Image display from `rviz_default_plugins` and change its topic to `detections_in_image`. Here is what you would expect to see.
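
If no image shows up in RViz, a quick check is to measure whether the detection image topic is being published at all. The topic name is taken from the step above; adjust the namespace if it is published under a different prefix.

```bash
# Report the publishing rate of the detection image topic.
ros2 topic hz /detections_in_image
```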