Jen-Hung-Ho/ros2_jetbot_tools

Jetbot Tools with YOLOv11 Vision Detection and NanoLLM Container for NAV2 ROS2 Robot — Version 2.1

Jetbot Tools is a collection of ROS2 nodes that integrate a YOLO‑based vision system and the Jetson NanoLLM Docker container for NVIDIA Jetson Orin platforms. With Jetbot Tools, you can build a cost‑effective two‑wheel robot equipped with a depth camera and a lidar sensor, enabling it to perform the following tasks:

  • Voice-Activated Copilot: Unleash the power of voice control for your ROS2 robot with Jetbot Voice-Activated Copilot Tools.
  • Jetbot Tools Task Copilot (NEW in v2.1): Manage and coordinate all Jetbot Tools tasks through a unified ROS2 Action interface. The Task Copilot can start, stop, or interrupt long‑running operations, ensuring smooth cooperation between modules. When a voice‑activated command is received, it can automatically stop any currently running tasks and cleanly transition the robot into the newly requested action.
  • Large Language Model (LLM) Chat: Empower your Jetbot to respond using LLM chat. By default, it utilizes the meta-llama/Llama-2-7b-chat-hf model hosted in a ROS2 node.
  • Vision-Language Model (VLM) Robot Camera Image Description: Enable your Jetbot to describe images captured by its camera. By default, it employs the Efficient-Large-Model/VILA1.5-3b model hosted in a ROS2 node.
  • Depth‑Camera Vision Object Avoidance Self‑Driving (NEW in v2.1): Enable your robot to navigate autonomously using depth‑camera vision, allowing it to detect obstacles in 3D space and perform smooth, vision‑based avoidance behaviors.
  • Lidar-Assisted Object Avoidance Self-Driving: Enable your robot to navigate autonomously and avoid obstacles using the lidar sensor.
  • Real-Time Object Detection and Tracking: Allow your robot to detect objects using the SSD Mobilenet V2 model. You can also make your robot follow a specific object that it detects.
  • Real-Time Object Detection and Distance Measurement (NEW in v2.1): Enable your robot to detect objects using the YOLOv11 vision system and measure their distance with the depth camera. You can also make your robot follow a selected object and automatically stop when it gets too close.
  • NAV2 TF2 Position Tracking and Following: Allow your robot to track its own position and follow another Jetbot robot using the NAV2 TF2 framework.
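The lidar-assisted avoidance behavior above can be sketched as a simple decision rule over a LaserScan-style range array. This is a minimal illustration under stated assumptions, not the package's actual node; the function and parameter names are hypothetical:

```python
import math

def avoidance_command(ranges, angle_min, angle_increment,
                      stop_distance=0.4, sector_deg=30.0):
    """Decide a (linear, angular) velocity from lidar ranges.

    Looks only at a +/- sector_deg cone in front of the robot:
    drive straight if it is clear, otherwise stop and turn toward
    the side with more free space.
    """
    half = math.radians(sector_deg)
    left, right = [], []
    for i, r in enumerate(ranges):
        if not (r and math.isfinite(r)):
            continue  # skip invalid returns (0, inf, nan)
        angle = angle_min + i * angle_increment
        if -half <= angle < 0.0:
            right.append(r)
        elif 0.0 <= angle <= half:
            left.append(r)
    front = left + right
    if not front or min(front) > stop_distance:
        return 0.2, 0.0  # path clear: move forward
    # blocked: rotate toward the side with the larger mean clearance
    left_mean = sum(left) / len(left) if left else 0.0
    right_mean = sum(right) / len(right) if right else 0.0
    return 0.0, 0.5 if left_mean >= right_mean else -0.5
```

In a real node, `ranges`, `angle_min`, and `angle_increment` would come from a `sensor_msgs/LaserScan` message and the returned pair would be published as a `geometry_msgs/Twist`.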
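The detect-and-measure feature pairs 2D bounding boxes from the YOLOv11 detector with the aligned depth image. A minimal sketch of the distance step, assuming a numpy depth image in millimeters and a YOLO-style (x1, y1, x2, y2) pixel box (the helper names are hypothetical; the repo's actual node may differ):

```python
import numpy as np

def box_distance_m(depth_mm, box, invalid=0):
    """Median depth (meters) inside a detection's bounding box.

    depth_mm : HxW uint16 depth image aligned to the color frame,
               in millimeters (zero = no reading, as on many RGB-D cameras).
    box      : (x1, y1, x2, y2) pixel coordinates from the detector.
    Returns None when the box contains no valid depth pixels.
    """
    x1, y1, x2, y2 = (int(v) for v in box)
    roi = depth_mm[max(y1, 0):y2, max(x1, 0):x2]
    valid = roi[roi != invalid]
    if valid.size == 0:
        return None
    # median is robust to background pixels caught inside the box
    return float(np.median(valid)) / 1000.0

def should_stop(distance_m, stop_at=0.5):
    """A follow controller can stop once the target is this close."""
    return distance_m is not None and distance_m <= stop_at
```

The same measurement feeds the auto-stop behavior: keep following while `should_stop` is false, halt when the selected object comes within the stop threshold.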
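For the TF2-based following above, the core computation reduces a transform lookup between the two robots to a range and bearing. A minimal sketch of that geometry, kept as pure math so it stands independent of ROS2 (frame names and gains are illustrative, not the package's values):

```python
import math

def follow_command(dx, dy, follow_distance=0.5,
                   max_linear=0.3, gain_angular=1.5):
    """Range-and-bearing pursuit toward a leader at (dx, dy).

    (dx, dy) is the leader's position in the follower's base frame,
    e.g. the translation of a TF2 lookup from 'follower/base_link'
    to 'leader/base_link'.
    """
    distance = math.hypot(dx, dy)
    bearing = math.atan2(dy, dx)
    # hold position inside the follow distance, otherwise close the gap
    if distance <= follow_distance:
        linear = 0.0
    else:
        linear = min(max_linear, distance - follow_distance)
    return linear, gain_angular * bearing
```

A node would obtain (dx, dy) each cycle via `tf2_ros.Buffer.lookup_transform` and publish the result as a velocity command, so the follower keeps a fixed standoff behind the leader.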

Here is a brief overview of the Jetbot Tools design architecture:

Setup

Jetbot Tools source code and video demos:


Requirements:

References
