Presentation + Paper
Physically realizable adversarial examples for convolutional object detection algorithms
14 May 2019
David R. Chambers, H. Abe Garza
Abstract
In this work, we make two primary contributions to the field of adversarial example generation for convolutional neural network based perception systems. First, we extend recent work on physically realizable adversarial examples, making them more robust to translation, rotation, and scale in real-world scenarios. Second, we demonstrate attacks against object detection networks rather than considering only the simpler problem of classification, showing that these networks can be forced to mislocalize as well as misclassify targets. We demonstrate our method on multiple object detection frameworks, including Faster R-CNN, YOLO v3, and our own single-shot detection architecture.
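The robustness-to-transformation idea described in the abstract is commonly realized by averaging the attack gradient over randomly sampled transformations of the patch (an Expectation-over-Transformation style loop). The sketch below is a hypothetical, heavily simplified illustration only: it uses a fixed linear map `w_score` as a stand-in for a detector's objectness score and samples random translations (a full version would also sample rotation and scale), whereas the paper attacks real detectors such as Faster R-CNN and YOLO v3. All names here (`eot_step`, `detector_score`, etc.) are assumptions, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a detector's objectness score: a fixed
# linear map over a 32x32 image. The real attack targets deep detectors.
H, W = 32, 32
PH, PW = 8, 8  # patch size
w_score = rng.normal(size=(H, W))

def detector_score(img):
    """Toy 'objectness' score: linear in the image."""
    return float(np.sum(img * w_score))

def place_patch(image, patch, top, left):
    """Overlay the patch at (top, left); returns a copy of the image."""
    out = image.copy()
    out[top:top + PH, left:left + PW] = patch
    return out

def eot_step(patch, n_samples=16, lr=0.1):
    """One Expectation-over-Transformation gradient step.

    Samples random placements (a simple proxy for translation; a full
    EOT loop would also sample rotation and scale) and averages the
    gradient of the score w.r.t. the patch, then descends to suppress
    the detection score.
    """
    grad = np.zeros_like(patch)
    for _ in range(n_samples):
        top = rng.integers(0, H - PH + 1)
        left = rng.integers(0, W - PW + 1)
        # The score is linear, so d(score)/d(patch) is simply the slice
        # of w_score under the patch at this placement.
        grad += w_score[top:top + PH, left:left + PW]
    return patch - lr * grad / n_samples

def mean_score(image, patch):
    """Average detector score over all valid patch placements."""
    scores = [detector_score(place_patch(image, patch, t, l))
              for t in range(H - PH + 1) for l in range(W - PW + 1)]
    return float(np.mean(scores))

image = rng.normal(size=(H, W))
patch = np.zeros((PH, PW))
for _ in range(50):
    patch = eot_step(patch)
```

Because the gradient is averaged over placements rather than computed for one fixed position, the optimized patch suppresses the score regardless of where it lands, which is the property that makes such patches viable in physical scenes.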
Conference Presentation
© (2019) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
David R. Chambers and H. Abe Garza "Physically realizable adversarial examples for convolutional object detection algorithms", Proc. SPIE 10988, Automatic Target Recognition XXIX, 109880R (14 May 2019); https://doi.org/10.1117/12.2520166
KEYWORDS
Target detection
Neural networks
Cameras
Sensors
Detection and tracking algorithms
Image classification
Convolutional neural networks
