Multi-sensor Simultaneous Localization and Mapping (SLAM) is essential for Unmanned Aerial Vehicles (UAVs)
performing agricultural tasks such as spraying, surveying, and inspection. However, real-world,
multi-modal agricultural UAV datasets that enable research on robust operation remain scarce. To address
this gap, we present AgriLiRa4D, a multi-modal UAV dataset designed for challenging outdoor agricultural
environments. AgriLiRa4D spans three representative farmland types—flat, hilly, and terraced—and includes
both boundary and coverage operation modes, resulting in six flight sequence groups. The dataset provides
high-accuracy ground-truth trajectories from a Fiber Optic Inertial Navigation System with Real-Time
Kinematic capability (FINS_RTK), along with synchronized measurements from a 3D LiDAR, a 4D Radar, and an
Inertial Measurement Unit (IMU), accompanied by complete intrinsic and extrinsic calibrations.
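As an illustration of how such extrinsic calibrations are typically consumed, the sketch below transforms 4D Radar detections into the IMU frame with a 4x4 homogeneous extrinsic. The matrix values, frame names, and function are placeholders for illustration only, not the dataset's actual calibration or tooling.

```python
import numpy as np

# Hypothetical extrinsic T_imu_radar (radar frame -> IMU frame); real values
# would be read from the dataset's calibration files.
T_imu_radar = np.eye(4)
T_imu_radar[:3, :3] = np.array([[0., -1., 0.],
                                [1.,  0., 0.],
                                [0.,  0., 1.]])      # placeholder rotation (90 deg yaw)
T_imu_radar[:3, 3] = np.array([0.10, 0.00, -0.05])   # placeholder translation (m)

def to_imu_frame(points_radar: np.ndarray, T: np.ndarray) -> np.ndarray:
    """Transform Nx3 radar points into the IMU frame via a 4x4 extrinsic."""
    pts_h = np.hstack([points_radar, np.ones((points_radar.shape[0], 1))])
    return (T @ pts_h.T).T[:, :3]

# Example: one synthetic radar detection 5 m ahead of the sensor.
print(to_imu_frame(np.array([[5.0, 0.0, 0.0]]), T_imu_radar))
```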
Leveraging its comprehensive sensor suite and diverse real-world scenarios, AgriLiRa4D supports a
broad range of SLAM and localization studies and enables rigorous robustness evaluation against low-texture crops,
repetitive patterns, dynamic vegetation, and other challenges of real agricultural environments. To
further demonstrate its utility, we benchmark four state-of-the-art multi-sensor SLAM algorithms across
different sensor combinations, highlighting the difficulty of the proposed sequences and the necessity of
multi-modal approaches for reliable UAV localization.
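For readers reproducing such benchmarks, the following is a minimal sketch of the standard Absolute Trajectory Error (ATE) metric against the FINS_RTK ground truth, assuming time-synchronized Nx3 position arrays; it uses a scale-free Umeyama/Kabsch alignment and is not necessarily the paper's exact evaluation protocol.

```python
import numpy as np

def ate_rmse(est: np.ndarray, gt: np.ndarray) -> float:
    """ATE RMSE after SE(3) alignment (Umeyama, no scale).

    est, gt: Nx3 time-synchronized positions (estimate vs. ground truth).
    """
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    H = (est - mu_e).T @ (gt - mu_g)      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:              # guard against a reflection solution
        Vt[2] *= -1
        R = Vt.T @ U.T
    t = mu_g - R @ mu_e
    err = gt - (est @ R.T + t)
    return float(np.sqrt((err ** 2).sum(axis=1).mean()))

# Self-check with synthetic data: a rotated and shifted copy of the ground
# truth should align back to near-zero error.
rng = np.random.default_rng(0)
gt = rng.normal(size=(100, 3))
yaw = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
est = gt @ yaw.T + np.array([2.0, -1.0, 0.5])
print(f"ATE RMSE: {ate_rmse(est, gt):.6f} m")  # approx. 0
```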
By filling a critical gap in agricultural SLAM datasets, AgriLiRa4D provides a valuable benchmark for the
research community and contributes to advancing autonomous navigation technologies for agricultural UAVs.