Autonomous UAV Navigation for Rural Landscapes Using SDSA-Optimized PIDA Control and Deep Learning-Based Depth Estimation

Zhifei Wu

Abstract


Autonomous drone navigation in rural landscapes presents challenges due to irregular terrain and limited infrastructure, necessitating both robust control and reliable environmental perception. This study proposes an integrated framework that combines a proportional-integral-derivative-acceleration (PIDA) controller, optimized using the Stochastic Dual Simplex Algorithm (SDSA), with deep learning-based image understanding techniques for enhanced autonomy. Specifically, PSMNet is used to estimate depth from stereo image pairs captured by the drone's dual cameras, achieving a mean absolute error (MAE) of 0.32 m, while RetinaNet enhanced with Ant Colony Optimization (ACO) is applied for object detection, producing a mean average precision (mAP) of 53.4%. The SDSA-optimized PIDA controller significantly improves control precision, reducing overshoot by 48% and converging 35% faster than traditional PID controllers. Stability metrics show a 42% improvement in disturbance rejection under varying payloads. Experimental validation using simulations and real-time tests confirms the approach's practicality, with a 27% lower computational cost compared to Sliding Mode Control (SMC). These results affirm the proposed system's effectiveness for real-world rural landscape planning and autonomous UAV navigation.
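For illustration only, the sketch below shows a minimal discrete-time PIDA control update of the general form u = Kp·e + Ki·∫e dt + Kd·de/dt + Ka·d²e/dt². The class interface, gain values, and sampling period are hypothetical placeholders, not taken from the paper; in the proposed framework the gains would be tuned by SDSA, which is not reproduced here.

```python
# Minimal discrete-time PIDA controller sketch (hypothetical interface and gains).
# PIDA extends PID with an acceleration (second-derivative-of-error) term:
#   u(t) = Kp*e(t) + Ki*integral(e) + Kd*de/dt + Ka*d^2e/dt^2
class PIDAController:
    def __init__(self, kp, ki, kd, ka, dt):
        self.kp, self.ki, self.kd, self.ka = kp, ki, kd, ka
        self.dt = dt                 # sampling period [s]
        self.integral = 0.0          # accumulated error
        self.prev_error = 0.0        # e[k-1]
        self.prev_derivative = 0.0   # de/dt at step k-1

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        acceleration = (derivative - self.prev_derivative) / self.dt
        u = (self.kp * error + self.ki * self.integral
             + self.kd * derivative + self.ka * acceleration)
        self.prev_error = error
        self.prev_derivative = derivative
        return u

# Example: altitude hold with placeholder gains (in practice, SDSA-tuned).
ctrl = PIDAController(kp=1.2, ki=0.05, kd=0.4, ka=0.1, dt=0.01)
command = ctrl.update(setpoint=10.0, measurement=9.6)
```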



DOI: https://doi.org/10.31449/inf.v49i5.9177

This work is licensed under a Creative Commons Attribution 3.0 License.