Powered by OpenAIRE graph

Trusted Unmanned Aerial System Operations

Author: Theyyar Maalolan, Lakshman


Abstract

Proving the correctness of autonomous systems is challenged by the use of non-deterministic artificial intelligence algorithms and ever-increasing lines of code. While correctness is conventionally determined through analysis and testing, it is impossible to train and test the system for all possible scenarios or formally analyze millions of lines of code. This thesis describes an alternative method that monitors system behavior during runtime and executes a recovery action if any formally specified property is violated. Multiple parallel safety monitors, synthesized from linear temporal logic (LTL) formulas capturing the correctness and liveness properties, are implemented in isolated configurable hardware to avoid negative impacts on system performance. Model checking applied to the final implementation establishes the correctness of this last line of defense against malicious attacks and software bugs. The first part of this thesis illustrates the monitor synthesis flow with rules defining a three-dimensional cage for a commercial off-the-shelf drone and demonstrates the effectiveness of the monitoring system in enforcing strict behaviors. The second part of this work defines safety monitors to provide assurances for a virtual autonomous flight beyond visual line of sight. Distinct sets of monitors are called into action during different flight phases to monitor flight plan conformance, stability, and airborne collision avoidance. A wireless interface supported by the proposed architecture enables the configuration of monitors, thereby eliminating the need to reprogram the FPGA for every flight. Overall, the goal is to increase trust in autonomous systems, as demonstrated with two common drone operations.
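The thesis synthesizes its monitors into isolated FPGA hardware; purely as an illustration of the underlying idea, the safety property G(inside_cage) ("the drone always stays inside the three-dimensional cage") can be sketched as a runtime monitor in software. All names, cage bounds, and the trace below are hypothetical, not taken from the thesis:

```python
# Illustrative sketch (not the thesis FPGA implementation): a runtime
# monitor for the LTL safety property G(inside_cage). A safety property
# is violated by a finite trace prefix, so the monitor only needs to
# observe samples one at a time and latch the first violation.

# Hypothetical cage bounds in metres: (min, max) per axis.
CAGE = {"x": (-10.0, 10.0), "y": (-10.0, 10.0), "z": (0.0, 5.0)}

def inside_cage(pos):
    """Atomic proposition: the position lies within the cage on every axis."""
    return all(CAGE[axis][0] <= pos[axis] <= CAGE[axis][1]
               for axis in ("x", "y", "z"))

class SafetyMonitor:
    """Two-state automaton for G(p): remains in 'ok' while p holds and
    moves irrevocably to 'violated' the first time p fails."""
    def __init__(self, prop):
        self.prop = prop
        self.violated = False

    def step(self, sample):
        # Once violated, stay violated: this is where a recovery
        # action (e.g. return-to-home) would be triggered.
        if not self.violated and not self.prop(sample):
            self.violated = True
        return not self.violated

monitor = SafetyMonitor(inside_cage)
trace = [{"x": 0.0, "y": 0.0, "z": 2.0},
         {"x": 9.5, "y": -3.0, "z": 4.2},
         {"x": 12.0, "y": 0.0, "z": 3.0}]   # leaves the cage on x
results = [monitor.step(p) for p in trace]
print(results)  # [True, True, False]
```

The hardware monitors described in the thesis follow the same automaton semantics, but evaluate each sample in parallel logic isolated from the flight software, so a misbehaving autopilot cannot disable them.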

Software code in autonomous systems, like cars, drones, and robots, keeps growing not just in length but also in complexity. The use of machine learning and artificial intelligence algorithms to make decisions can result in unexpected behaviors when the system encounters completely new situations. Traditional methods of verifying software struggle to establish the absolute correctness of autonomous systems. An alternative to proving correctness is to enforce correct behaviors during execution. The system's inputs and outputs are monitored to ensure adherence to formally stated rules. These monitors, automatically generated from rules specified as mathematical formulas, are isolated from the rest of the system and do not affect system performance. The first part of this work demonstrates the feasibility of the approach by adding monitors to impose a virtual cage on a commercially available drone. The second phase extends the idea to a simulated autonomous flight with a predefined set of points that the drone must pass through. These points, along with the necessary parameters for the monitors, can be uploaded over Bluetooth. The position, speed, and distance to nearby obstacles are independently monitored, and a recovery action is executed if any rule is violated. Since the monitors do not assume anything about the source of the violations, they are effective against malicious attacks, software bugs, and sensor failures. Overall, the goal is to increase confidence in autonomous systems operations.
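The independent monitoring of position, speed, and obstacle distance described above can be sketched as several rules evaluated in parallel, with a single recovery action fired on the first violation. This is an illustrative sketch only; the rule names, thresholds, and telemetry fields are hypothetical and the real monitors run in configurable hardware:

```python
# Illustrative sketch (not the thesis design): independent rules checked
# over flight telemetry, with a recovery callback invoked on violation.
# All thresholds are hypothetical placeholders.

MAX_SPEED = 5.0           # m/s, hypothetical speed limit
MIN_OBSTACLE_DIST = 2.0   # m, hypothetical minimum clearance
WAYPOINT_TOLERANCE = 1.5  # m, hypothetical allowed path deviation

RULES = {
    "speed":    lambda t: t["speed"] <= MAX_SPEED,
    "obstacle": lambda t: t["obstacle_dist"] >= MIN_OBSTACLE_DIST,
    "conform":  lambda t: t["path_error"] <= WAYPOINT_TOLERANCE,
}

def check(telemetry, recover):
    """Evaluate every rule independently; if any rule fails, invoke the
    recovery action with the list of violated rules."""
    failed = [name for name, rule in RULES.items() if not rule(telemetry)]
    if failed:
        recover(failed)
    return failed

violations = []
check({"speed": 4.0, "obstacle_dist": 6.0, "path_error": 0.4},
      violations.extend)   # nominal sample: no rule fires
check({"speed": 6.2, "obstacle_dist": 1.1, "path_error": 0.4},
      violations.extend)   # too fast and too close to an obstacle
print(violations)  # ['speed', 'obstacle']
```

Because each rule inspects only the telemetry sample and makes no assumption about why a value is out of bounds, the same check catches attacks, bugs, and sensor failures alike, which mirrors the source-agnostic argument made in the summary above.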

Degree
Master of Science

Country
United States
Keywords

Formal methods, Field programmable gate arrays, Safety monitors, UAS, Runtime verification

  • Impact indicators provided by BIP!
    - Selected citations: 0 (citations derived from selected sources; an alternative to the "Influence" indicator, which reflects the overall/total impact of an article in the research community at large, based on the underlying citation network, diachronically)
    - Popularity: Average (reflects the "current" impact/attention, the "hype", of an article in the research community at large, based on the underlying citation network)
    - Influence: Average (reflects the overall/total impact of an article in the research community at large, based on the underlying citation network, diachronically)
    - Impulse: Average (reflects the initial momentum of an article directly after its publication, based on the underlying citation network)