Optimization of Powered Landing Control for Reusable Rockets Using Softmax DDQN

Authors

  • Rafika Arum Sari, Universitas Dirgantara Marsekal Suryadarma
  • Muhammad Hadi Widanto
  • Imron Rosadi, Universitas Dirgantara Marsekal Suryadarma

Keywords:

Softmax Double Deep Q-Networks, Landing Control Optimization, Curriculum Learning, Fuel Efficiency, Reusable Rockets

Abstract

This research presents a novel approach to optimizing powered landing control for reusable rockets using Softmax Double Deep Q-Networks (DDQN). We combine the advantages of Double DQN with Softmax exploration and curriculum learning to achieve precise and fuel-efficient landing control. Through extensive experiments in a specially developed 2D simulation environment, our method improves landing accuracy by 37% (reducing final position error from 2.4 m to 1.5 m), fuel efficiency by 28% (reducing average fuel consumption from 850 kg to 612 kg per landing), and adaptability to initial conditions (raising the successful landing rate from 76% to 94% across a wide range of altitudes and initial orientations) compared with traditional PID control methods. The results show that curriculum learning significantly outperforms the non-curriculum approach, achieving a 27% higher average reward (11.97 vs. 8.61) and 60% better performance consistency as measured by standard deviation (0.92 vs. 2.29). Both Softmax and ε-greedy exploration strategies proved effective with curriculum learning, with ε-greedy DDQN achieving the highest average reward of 11.97. This approach enables higher-precision rocket landings while reducing operational costs.
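The abstract names two core ingredients: Softmax (Boltzmann) exploration over Q-values, and the Double DQN target, in which the online network selects the next action and the target network evaluates it. The paper's implementation details (network architecture, temperature schedule, action set) are not given on this page, so the following is a minimal sketch of those two ingredients only; the three-action thrust set, the Q-values, and the temperature are illustrative assumptions, not values from the paper.

```python
import numpy as np

def softmax_policy(q_values, temperature=1.0):
    """Boltzmann (Softmax) exploration: sample an action with
    probability proportional to exp(Q(s, a) / temperature)."""
    z = np.asarray(q_values, dtype=float) / temperature
    z -= z.max()  # subtract the max for numerical stability
    probs = np.exp(z) / np.exp(z).sum()
    action = int(np.random.choice(len(probs), p=probs))
    return action, probs

def double_dqn_target(reward, q_online_next, q_target_next, gamma=0.99, done=False):
    """Double DQN target: the online network picks the greedy next
    action, the target network supplies its value estimate."""
    if done:
        return reward
    best = int(np.argmax(q_online_next))
    return reward + gamma * q_target_next[best]

# Illustrative only: Q-values for three hypothetical thrust levels
# [full, half, off] of a 2D lander.
q = [1.2, 0.4, -0.3]
action, probs = softmax_policy(q, temperature=0.5)
target = double_dqn_target(1.0, [0.5, 2.0], [1.0, 0.2], gamma=0.9)
```

Lower temperatures concentrate the Softmax distribution on the highest-valued action (approaching greedy selection), while higher temperatures flatten it toward uniform exploration, which is the knob this exploration strategy offers in place of ε-greedy's fixed random-action probability.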

Published

26-05-2025

How to Cite

Rafika Arum Sari, Muhammad Hadi Widanto, & Imron Rosadi. (2025). Optimization of Powered Landing Control for Reusable Rockets Using Softmax DDQN. Indonesian Journal of Aerospace, 22(2), 135–150. Retrieved from https://ejournal.brin.go.id/ijoa/article/view/7981
