Exploration-Driven Reinforcement Learning for Avionic System Fault Detection (Experience Paper)
Abstract
Critical software systems require stringent testing to identify possible failure cases, which can be difficult to find using manual testing. In this study, we report our industrial experience in testing a realistic R&D flight control system using a heuristic-based testing method. Our approach utilizes evolutionary strategies augmented with intrinsic motivation to yield a diverse range of test cases, each revealing different potential failure scenarios within the system. This diversity allows for a more comprehensive identification and understanding of the system's vulnerabilities. We analyze the test cases found by evolution to identify the system's weaknesses. The results of our study show that our approach can be used to improve the reliability and robustness of avionics systems by providing high-quality test cases in an efficient and cost-effective manner.
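To make the approach concrete, the sketch below shows one common way of combining an evolution strategy with an intrinsic-motivation (novelty) bonus to drive a population toward diverse failure-revealing test cases. It is a minimal illustration, not the authors' implementation: the `run_test` stand-in objective, the novelty weight `beta`, and all hyperparameters are assumptions chosen for readability.

```python
import numpy as np

def run_test(params: np.ndarray) -> float:
    """Placeholder for executing the flight-control simulation with the
    given test-case parameters and returning a failure/severity score.
    Here a toy objective stands in for the real system under test."""
    return -float(np.linalg.norm(params - 1.0))

def novelty(params: np.ndarray, archive: list, k: int = 5) -> float:
    """Intrinsic-motivation bonus: mean distance to the k nearest
    previously evaluated test cases."""
    if not archive:
        return 0.0
    dists = sorted(np.linalg.norm(params - a) for a in archive)
    return float(np.mean(dists[:k]))

def evolve(dim=8, pop=32, sigma=0.1, lr=0.05, beta=0.5, generations=200, seed=0):
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=dim)          # mean of the search distribution
    archive = []                          # test cases evaluated so far
    for _ in range(generations):
        eps = rng.normal(size=(pop, dim))            # perturbations
        candidates = theta + sigma * eps             # sampled test cases
        # Combined fitness: failure score plus weighted novelty bonus.
        fitness = np.array([run_test(c) + beta * novelty(c, archive)
                            for c in candidates])
        archive.extend(candidates)
        # Standard evolution-strategy update on the normalized fitness.
        fitness = (fitness - fitness.mean()) / (fitness.std() + 1e-8)
        theta = theta + lr / (pop * sigma) * eps.T @ fitness
    return theta, archive

if __name__ == "__main__":
    final_mean, archive = evolve()
    print("final search mean:", final_mean)
```

In practice the archive of evaluated candidates is what yields the diverse set of test cases; the search mean is only the current focus of exploration.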
Keywords
intrinsic motivation
genetic algorithms
evolutionary strategies
diversity
software reliability
physical system
critical software system
automated testing
CCS Concepts
• Computer systems organization → Reliability
• Computing methodologies → Reinforcement learning
• Applied computing → Avionics
• Software and its engineering → Software testing and debugging
Domains
Computer Science [cs]