MUN-FRL: A Visual Inertial LiDAR Dataset for Aerial Autonomous Navigation and Mapping

This paper is a preprint and has not been certified by peer review.

Ravindu G. Thalagala, Sahan M. Gunawardena, Oscar De Silva, Awantha Jayasiri, Arthur Gubbels, George K. I. Mann, Raymond G. Gosine


This paper presents a unique outdoor aerial visual-inertial-LiDAR dataset captured with a multi-sensor payload to promote global navigation satellite system (GNSS)-denied navigation research. The dataset features flight distances ranging from 300 m to 5 km, collected using a DJI M600 hexacopter drone and the National Research Council (NRC) Bell 412 Advanced Systems Research Aircraft (ASRA). It consists of hardware-synchronized monocular images, IMU measurements, 3D LiDAR point clouds, and high-precision real-time kinematic (RTK)-GNSS-based ground truth. Ten datasets were collected as ROS bags, totaling over 100 minutes of outdoor footage spanning urban areas, highways, hillsides, prairies, and waterfronts. The datasets are intended to facilitate the development of visual-inertial-LiDAR odometry and mapping algorithms, visual-inertial navigation algorithms, and object detection, segmentation, and landing-zone detection algorithms based on real-world drone and full-scale helicopter data. All datasets contain raw sensor measurements, hardware timestamps, and spatio-temporally aligned ground truth. Intrinsic and extrinsic sensor calibrations are provided along with the raw calibration datasets, and a performance summary of state-of-the-art methods applied to the datasets is also included.
