Video – Real-World Validation of the Proposed Visual Perception System
Authors/Creators
Description
This video presents the experimental validation of the proposed visual perception and pose estimation system in a real water tank environment. The robotic fish (UUV) is observed by a zenithal (top-down) monocular camera mounted on the surface platform.
The perception pipeline performs object detection using a YOLO-based model to localize the robotic fish in the image. A second neural network (YOLO-pose) predicts semantically defined rigid keypoints distributed over the body of the fish. These 2D keypoints are associated with their known 3D coordinates on the robot model and used to estimate the full 6D pose through a Perspective-n-Point (PnP) formulation with RANSAC-based outlier rejection.
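The following is a minimal sketch of the 2D–3D pose estimation step described above, using OpenCV's `solvePnPRansac`. All names and values (`MODEL_POINTS_3D`, `estimate_pose`, the keypoint layout, and the RANSAC threshold) are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of PnP pose estimation with RANSAC outlier rejection,
# as performed on the keypoints predicted by the YOLO-pose network.
# Keypoint coordinates and ordering below are placeholders.
import numpy as np
import cv2

# Assumed 3D coordinates of the rigid keypoints on the robotic-fish model,
# expressed in the robot body frame (metres); values are illustrative only.
MODEL_POINTS_3D = np.array([
    [0.00,  0.00, 0.00],   # nose tip
    [0.15,  0.03, 0.00],   # left pectoral fin
    [0.15, -0.03, 0.00],   # right pectoral fin
    [0.30,  0.00, 0.02],   # dorsal fin
    [0.45,  0.00, 0.00],   # tail joint
], dtype=np.float64)

def estimate_pose(keypoints_2d, camera_matrix, dist_coeffs):
    """Estimate the 6D pose of the fish from detected 2D keypoints.

    keypoints_2d: (N, 2) array of pixel coordinates predicted by the
                  keypoint network, ordered to match MODEL_POINTS_3D.
    Returns (rotation_vector, translation_vector, inlier_indices) or None.
    """
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        MODEL_POINTS_3D,
        keypoints_2d.astype(np.float64),
        camera_matrix,
        dist_coeffs,
        reprojectionError=4.0,          # pixel threshold for RANSAC inliers (assumed)
        flags=cv2.SOLVEPNP_ITERATIVE,
    )
    if not ok:
        return None
    return rvec, tvec, inliers
```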
The experiment demonstrates the robustness of the proposed monocular vision approach under real-world water tank conditions, including reflections, lighting variations, and visual disturbances, validating its suitability for cooperative surface–underwater robotic operation.
Files
(6.9 MB)

| Name | Size |
|---|---|
| riai_video_real.mp4 (md5:e71f1bfb7cbd894b62605c23a173aebd) | 6.9 MB |