This test examined scenarios in which more than one LiDAR sensor is used to map a scene. Our scenario used data from four sensors, each covering one of the roads at an intersection and together forming a 360-degree view of the scene. The data from these four sensors are fused into a single, highly accurate 3D view of the intersection.
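To make the fusion step concrete, the sketch below shows one common way such a merge can be done: each sensor's point cloud is expressed in its own frame, a 4x4 extrinsic transform (obtained from calibration) maps it into a shared intersection frame, and the transformed clouds are concatenated. This is a minimal illustration in NumPy, not how SENSR performs fusion internally; the sensor placements, yaw angles, and function names are hypothetical.

```python
import numpy as np

def make_extrinsic(yaw_deg, tx, ty, tz):
    """Build a 4x4 homogeneous transform from a yaw angle (degrees) and a translation.

    A real calibration also carries roll and pitch; this is a simplified illustration.
    """
    yaw = np.radians(yaw_deg)
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    T[:3, 3] = [tx, ty, tz]
    return T

def fuse_point_clouds(clouds, extrinsics):
    """Transform each sensor's points into the shared intersection frame and merge them.

    clouds:     list of (N_i, 3) arrays, one per LiDAR, in each sensor's own frame
    extrinsics: list of 4x4 sensor-to-intersection transforms from calibration
    """
    fused = []
    for points, T in zip(clouds, extrinsics):
        homogeneous = np.hstack([points, np.ones((points.shape[0], 1))])  # (N, 4)
        fused.append((homogeneous @ T.T)[:, :3])                          # back to (N, 3)
    return np.vstack(fused)

# Hypothetical setup: four sensors, one per approach road, facing the intersection center.
rng = np.random.default_rng(0)
clouds = [rng.uniform(-20.0, 20.0, size=(1000, 3)) for _ in range(4)]
extrinsics = [make_extrinsic(yaw, tx, ty, 3.0)
              for yaw, tx, ty in [(0, 25, 0), (90, 0, 25), (180, -25, 0), (270, 0, -25)]]
merged = fuse_point_clouds(clouds, extrinsics)
print(merged.shape)  # combined 360-degree cloud: (4000, 3)
```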
Results
We found it was possible to overlay the four simulated streams in the SENSR application to present a full view of the entire intersection.
Findings
- The result was excellent, producing a full 360-degree view of the intersection.
- Objects in the scene were detected seamlessly as they moved from one LiDAR sensor's coverage area to another's.
- Calibration was challenging for a novice user. Familiarity with the topography of the scene is a significant benefit, as it makes it easier to fuse the data from the different LiDAR sensors into a single visual output; a sketch of one possible calibration refinement step follows this list.
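One way the calibration burden can be reduced is to start from a rough manual alignment and refine it automatically with ICP on the overlapping region between two sensors' clouds. The snippet below is a minimal sketch of that idea using the Open3D library; it assumes Open3D is available, the helper name refine_extrinsic is hypothetical, and this is not a description of how SENSR itself performs calibration.

```python
import numpy as np
import open3d as o3d

def refine_extrinsic(source_points, target_points, initial_guess, max_corr_dist=0.5):
    """Refine a rough sensor-to-sensor extrinsic with point-to-point ICP.

    source_points / target_points: (N, 3) arrays from two LiDARs with overlapping coverage
    initial_guess:                 4x4 transform from a manual or survey-based calibration
    max_corr_dist:                 maximum correspondence distance in meters (assumed value)
    """
    source = o3d.geometry.PointCloud()
    source.points = o3d.utility.Vector3dVector(source_points)
    target = o3d.geometry.PointCloud()
    target.points = o3d.utility.Vector3dVector(target_points)

    result = o3d.pipelines.registration.registration_icp(
        source, target, max_corr_dist, initial_guess,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    # result.transformation is the refined 4x4 extrinsic; result.fitness indicates overlap quality.
    return result.transformation, result.fitness
```

In practice, the quality of the refined transform depends heavily on how good the initial guess is, which is why familiarity with the scene's topography helps: a user who knows the layout can place the clouds close enough for the automatic refinement to converge.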