Title

Securing Autonomy in Contested Environments

Guest Info

Amir Khazraei is currently a postdoctoral associate in the Cyber-Physical Systems Lab at Duke University. He earned his PhD in Electrical and Computer Engineering from Duke University in 2023 and holds B.S. and M.S. degrees in Electrical Engineering from Amirkabir University of Technology and the University of Tehran, received in 2014 and 2017, respectively. He has published in conferences and journals including CDC, ACC, ICCPS, L4DC, IROS, ICRA, IEEE Transactions on Automatic Control, and Automatica. His research interests include resilient cyber-physical systems, control theory, state estimation, and machine learning.

Abstract

The tight interaction between information technology and the physical world makes autonomous vehicles (AVs) vulnerable to attacks beyond standard cyber-attacks, highlighting the need to change the way we reason about AV security. In this talk, I will present our recent efforts in this domain, starting with security-aware modeling and vulnerability analysis of neural network-based control systems operating in adversarial environments. Building on novel notions of attack effectiveness and stealthiness that are independent of any potentially employed anomaly detector, we developed sufficient conditions for the existence of stealthy yet effective attacks that force the system into an unsafe operating region, for different levels of runtime information available to the attacker. Further, I will illustrate how such attacks can be launched against various perception architectures in modern AVs by exploiting intrinsic vulnerabilities in heterogeneous perception-based sensing (e.g., camera, LiDAR, radar). Finally, I will argue that each layer of the autonomy stack must be strengthened, and then introduce our methods for security-aware planning for AVs operating in unknown stochastic environments in the presence of attacks.