
Husarion ROSbot XL demo

This demo uses an Open 3D Engine (O3DE) simulation and lets you work with RAI on a small mobile platform in a nice apartment.


Quick start

LLM model

The demo uses the complex_model LLM configured in config.toml. The model should be a multimodal, tool-calling model. See Vendors.
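
In practice, "multimodal, tool-calling" means the model must accept images alongside text and be able to call functions exposed by the agent. The snippet below is a minimal, vendor-agnostic sketch of that capability using LangChain; the model name and the example tool are placeholders for illustration, not part of the demo's actual configuration:

    from langchain_core.messages import HumanMessage
    from langchain_core.tools import tool
    from langchain_openai import ChatOpenAI

    @tool
    def get_robot_position() -> str:
        """Return the robot's current pose as a human-readable string."""
        return "x=1.2 m, y=0.4 m, yaw=90 deg"

    # Placeholder model; complex_model in config.toml can point at any vendor
    # that supports both image input and tool calling.
    llm = ChatOpenAI(model="gpt-4o").bind_tools([get_robot_position])

    message = HumanMessage(content=[
        {"type": "text", "text": "What do you see and where are you?"},
        {"type": "image_url", "image_url": {"url": "data:image/png;base64,..."}},
    ])
    response = llm.invoke([message])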

ROS 2 Sourced

Make sure ROS 2 is sourced. (e.g. source /opt/ros/humble/setup.bash)

  1. Download the newest binary release:

    ./scripts/download_demo.sh rosbot
    
  2. Install and download the required packages:

    sudo apt install ros-${ROS_DISTRO}-navigation2 ros-${ROS_DISTRO}-nav2-bringup
    vcs import < demos.repos
    rosdep install --from-paths src --ignore-src -r -y
    poetry install --with openset
    

Alternative: Demo source build

If you would like more freedom to adapt the simulation to your needs, you can make changes using O3DE Editor and build the project yourself. Please refer to rai husarion rosbot xl demo for more details.

Running RAI

  1. Run the RAI nodes and agents, the navigation stack, and the O3DE simulation:

    ros2 launch ./examples/rosbot-xl.launch.py game_launcher:=demo_assets/rosbot/RAIROSBotXLDemo/RAIROSBotXLDemo.GameLauncher
    
  2. Run the Streamlit GUI:

    streamlit run examples/rosbot-xl-demo.py
    
  3. Play with the demo, prompting the agent to perform tasks. Here are some examples:

    • Where are you now?
    • What do you see?
    • What is the position of bed?
    • Navigate to the kitchen.
    • Please bring me coffee from the kitchen (this one should be rejected thanks to the robot embodiment module)

Changing camera view

To change the camera view in the simulation, use the 1, 2, and 3 keys on your keyboard once its window is focused.

How it works

The rosbot demo utilizes several components:

  1. Vision processing using Grounded SAM 2 and Grounding DINO for object detection and segmentation. See RAI OpenSet Vision.
  2. A RAI agent processes the request and interacts with the environment via the tool-calling mechanism (a sketch of such a tool is shown at the end of this section).
  3. Navigation is enabled via the nav2 toolkit, which interacts with ROS 2 nav2 asynchronously by calling ROS 2 actions (see the action-client sketch after this list).
  4. Embodiment of the ROSbot is achieved using the RAI Whoami module. This makes the RAI agent aware of the hardware platform and its capabilities.
  5. Key information can be provided to the agent through RAI Whoami. In this demo, the coordinates of the Kitchen and Living Room are provided as capabilities.
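
For point 3, the sketch below shows what asynchronous interaction with nav2 looks like at the ROS 2 level. It is a minimal example assuming the standard navigate_to_pose action name and invented goal coordinates; the tool wrappers the agent actually uses live in the RAI nav2 toolkit and are more elaborate:

    import rclpy
    from rclpy.action import ActionClient
    from rclpy.node import Node
    from nav2_msgs.action import NavigateToPose
    from geometry_msgs.msg import PoseStamped

    class NavSketch(Node):
        def __init__(self):
            super().__init__("rai_nav_sketch")
            self._client = ActionClient(self, NavigateToPose, "navigate_to_pose")

        def go_to(self, x: float, y: float):
            goal = NavigateToPose.Goal()
            pose = PoseStamped()
            pose.header.frame_id = "map"
            pose.pose.position.x = x
            pose.pose.position.y = y
            pose.pose.orientation.w = 1.0
            goal.pose = pose
            self._client.wait_for_server()
            # send_goal_async returns a future immediately, so the agent can
            # keep reasoning while nav2 drives the robot toward the goal.
            return self._client.send_goal_async(goal)

    def main():
        rclpy.init()
        node = NavSketch()
        future = node.go_to(2.0, 1.0)  # hypothetical kitchen coordinates
        rclpy.spin_until_future_complete(node, future)
        rclpy.shutdown()

    if __name__ == "__main__":
        main()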

The details of the demo can be found in examples/rosbot-xl-demo.py.
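
For points 2, 4, and 5, a capability such as navigation is exposed to the LLM as a tool, and place coordinates from the Whoami description can be used inside it. The following is an illustrative sketch only: the waypoint names, coordinates, and tool name are invented for this example, and the real tools ship with RAI.

    from langchain_core.tools import tool

    # Illustrative waypoints; in the demo such coordinates come from the
    # RAI Whoami description of the apartment.
    WAYPOINTS = {"kitchen": (2.0, 1.0), "living_room": (-1.5, 0.5)}

    @tool
    def navigate_to(place: str) -> str:
        """Drive the robot to a named place known from its embodiment info."""
        if place not in WAYPOINTS:
            return f"Unknown place: {place}. Known places: {', '.join(WAYPOINTS)}"
        x, y = WAYPOINTS[place]
        # A real implementation would call the nav2 action client shown above
        # and report progress back to the agent.
        return f"Navigation goal ({x}, {y}) sent for '{place}'."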