TY - JOUR
T1 - Emergency Scheduling of Aerial Vehicles via Graph Neural Neighborhood Search
AU - Guo, Tong
AU - Mei, Yi
AU - Du, Wenbo
AU - Lv, Yisheng
AU - Li, Yumeng
AU - Song, Tao
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - The thriving advances in autonomous vehicles and aviation have enabled the efficient implementation of aerial last-mile delivery services to meet the pressing demand for urgent relief supply distribution. Variable neighborhood search (VNS) is a promising technique for aerial emergency scheduling. However, existing VNS methods usually explore all considered neighborhoods exhaustively in a prefixed order, leading to an inefficient search process and slow convergence. To address this issue, this article proposes a novel graph neural neighborhood search (GENIS) algorithm, which includes an online reinforcement learning (RL) agent that guides the search process by selecting the most appropriate low-level local search operators based on the search state. We develop a dual-graph neural representation learning method to extract comprehensive and informative feature representations from the search state. In addition, we propose a reward-shaping policy learning method to address the decaying reward issue along the search process. Extensive experiments conducted across various benchmark instances demonstrate that the proposed algorithm significantly outperforms state-of-the-art approaches. Further investigations validate the effectiveness of the newly designed knowledge guidance scheme and the learned feature representations.
AB - The thriving advances in autonomous vehicles and aviation have enabled the efficient implementation of aerial last-mile delivery services to meet the pressing demand for urgent relief supply distribution. Variable neighborhood search (VNS) is a promising technique for aerial emergency scheduling. However, existing VNS methods usually explore all considered neighborhoods exhaustively in a prefixed order, leading to an inefficient search process and slow convergence. To address this issue, this article proposes a novel graph neural neighborhood search (GENIS) algorithm, which includes an online reinforcement learning (RL) agent that guides the search process by selecting the most appropriate low-level local search operators based on the search state. We develop a dual-graph neural representation learning method to extract comprehensive and informative feature representations from the search state. In addition, we propose a reward-shaping policy learning method to address the decaying reward issue along the search process. Extensive experiments conducted across various benchmark instances demonstrate that the proposed algorithm significantly outperforms state-of-the-art approaches. Further investigations validate the effectiveness of the newly designed knowledge guidance scheme and the learned feature representations.
KW - Adaptive operator selection (AOS)
KW - combinatorial optimization
KW - reinforcement learning (RL)
KW - variable neighborhood search (VNS)
UR - http://www.scopus.com/pages/publications/85215413731
U2 - 10.1109/TAI.2025.3528381
DO - 10.1109/TAI.2025.3528381
M3 - Article
AN - SCOPUS:85215413731
SN - 2691-4581
VL - 6
SP - 1808
EP - 1822
JO - IEEE Transactions on Artificial Intelligence
JF - IEEE Transactions on Artificial Intelligence
IS - 7
ER -