Rover written in Rust
How to use the YOLO Result
2025.11.12 21:09
In the YOLO result, if it looks like this, what does it mean?
--------------------------------------------------
thermos 0.2710 0.3712 0.4716 0.9977
insulated cup 0.2716 0.3765 0.4710 0.9989
=====================================
Based on the Python code (yolo_service.py) we integrated earlier, that output represents the Bounding Boxes of the detected objects in Normalized Coordinates.
Here is the breakdown of thermos 0.2710 0.3712 0.4716 0.9977:
1. The Format
The format is: [Label] [x1] [y1] [x2] [y2]
- Label (thermos): The name of the object detected.
- x1 (0.2710): The Left edge of the box (0.0 is far left, 1.0 is far right).
- y1 (0.3712): The Top edge of the box (0.0 is top, 1.0 is bottom).
- x2 (0.4716): The Right edge of the box.
- y2 (0.9977): The Bottom edge of the box.
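If you want to read these lines back in code, here is a minimal Python sketch (not the actual yolo_service.py code; the parse_detection helper is hypothetical) that splits one output line into the label and the four normalized corners:

```python
def parse_detection(line: str):
    """Split '[Label] [x1] [y1] [x2] [y2]' into (label, x1, y1, x2, y2)."""
    # Split from the right so labels that contain spaces ("insulated cup") stay intact.
    parts = line.rsplit(maxsplit=4)
    label = parts[0]
    x1, y1, x2, y2 = (float(v) for v in parts[1:])
    return label, x1, y1, x2, y2

print(parse_detection("thermos 0.2710 0.3712 0.4716 0.9977"))
# -> ('thermos', 0.271, 0.3712, 0.4716, 0.9977)
```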
2. How to Calculate Real Pixel Position
Since your lefteye.bmp is 1280 x 720:
- Left (x1): 0.2710 x 1280 ≈ 347 pixels
- Top (y1): 0.3712 x 720 ≈ 267 pixels
- Right (x2): 0.4716 x 1280 ≈ 604 pixels
- Bottom (y2): 0.9977 x 720 ≈ 718 pixels (almost the very bottom of the screen)
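As a quick check of the arithmetic above, here is a small Python sketch (illustrative only; the to_pixels helper and the hard-coded 1280 x 720 size are assumptions based on your lefteye.bmp):

```python
IMG_W, IMG_H = 1280, 720  # size of lefteye.bmp

def to_pixels(x1, y1, x2, y2):
    """Scale normalized [0..1] box corners to pixel positions."""
    return (round(x1 * IMG_W), round(y1 * IMG_H),
            round(x2 * IMG_W), round(y2 * IMG_H))

print(to_pixels(0.2710, 0.3712, 0.4716, 0.9977))
# -> (347, 267, 604, 718)
```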
3. Observation on your Specific Data
You have two detections:
- thermos (Start: 0.2710, 0.3712)
- insulated cup (Start: 0.2716, 0.3765)
These are the same object. The coordinates are nearly identical. The AI is slightly confused; it sees a cup and thinks "This looks like a Thermos" AND "This looks like an Insulated Cup," so it reported both.
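If you want the Rover to treat such near-identical boxes as one object, a common approach (a suggestion, not something yolo_service.py is confirmed to do) is to compute the Intersection-over-Union (IoU) of the two boxes and merge them when it is close to 1.0:

```python
def iou(a, b):
    """IoU of two (x1, y1, x2, y2) boxes in normalized coordinates."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

thermos = (0.2710, 0.3712, 0.4716, 0.9977)
cup     = (0.2716, 0.3765, 0.4710, 0.9989)
print(round(iou(thermos, cup), 2))  # -> 0.98, i.e. almost certainly the same object
```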
How to use this with your 3D Point Cloud
To find the 3D Center of this object in your Pascal viewer:
- Calculate the 2D Center: center_x = (0.2710 + 0.4716) / 2 x 1280 ≈ 475, center_y = (0.3712 + 0.9977) / 2 x 720 ≈ 492.
- Lookup in Point Cloud: Look at the point in your MyCloud array corresponding to pixel (475, 492). That specific point contains the real-world coordinates (in millimeters) of that Thermos.
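Here is a sketch of both steps in Python (the flat, row-major layout of the point cloud is an assumption; your Pascal MyCloud array may be organized differently):

```python
IMG_W, IMG_H = 1280, 720  # size of lefteye.bmp

def box_center_px(x1, y1, x2, y2):
    """Pixel coordinates of the center of a normalized box (truncated, as above)."""
    return int((x1 + x2) / 2 * IMG_W), int((y1 + y2) / 2 * IMG_H)

def lookup_3d(cloud, cx, cy):
    """Read the (X, Y, Z) tuple (in millimeters) stored for pixel (cx, cy)."""
    return cloud[cy * IMG_W + cx]  # assumes one entry per pixel, row-major

cx, cy = box_center_px(0.2710, 0.3712, 0.4716, 0.9977)
print(cx, cy)  # -> 475 492
# xyz_mm = lookup_3d(my_cloud, cx, cy)  # the thermos position, once my_cloud is loaded
```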