Autoencoders Without Reconstruction for Textural Anomaly Detection

Adey, P.A.; Akcay, S.; Bordewich, M.J.R.; Breckon, T.P.

Authors

P.A. Adey
S. Akcay
Professor Magnus Bordewich (m.j.r.bordewich@durham.ac.uk)
Professor Toby Breckon (toby.breckon@durham.ac.uk)
Abstract
Automatic anomaly detection in natural textures is a key component within quality control for a range of high-speed, high-yield manufacturing industries that rely on camera-based visual inspection techniques. Targeting anomaly detection through the use of autoencoder reconstruction error readily facilitates training on an often more plentiful set of non-anomalous samples, without the explicit need for a representative set of anomalous training samples that may be difficult to source. Unfortunately, autoencoders struggle to reconstruct high-frequency visual information and therefore, such approaches often fail to achieve a low enough reconstruction error for non-anomalous pixels. In this paper, we propose a new approach in which the autoencoder is trained to directly output the desired per-pixel measure of abnormality without first having to perform reconstruction. This is achieved by corrupting training samples with noise and then predicting how pixels need to be shifted so as to remove the noise. Our direct approach enables the model to compress anomaly scores for normal pixels into a tight bound close to zero, resulting in very clean anomaly segmentations that significantly improve performance. We also introduce the Reflected ReLU output activation function that better facilitates training under this direct regime by leaving values that fall within the image dynamic range unmodified. Overall, an average area under the ROC curve of 96% is achieved on the texture classes of the MVTecAD benchmark dataset, surpassing that achieved by all current state-of-the-art methods.
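The abstract describes two ideas: training pairs are built by corrupting non-anomalous samples with noise and regressing the per-pixel shift that removes it, and a "Reflected ReLU" output activation leaves values inside the image dynamic range unmodified. A minimal numpy sketch of both, under stated assumptions (the reflection-at-the-boundaries behaviour and the Gaussian noise model are plausible readings of the abstract, not the paper's confirmed definitions; all function names here are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

def reflected_relu(x, lo=0.0, hi=1.0):
    # Identity inside the image dynamic range [lo, hi]; values that
    # overshoot are reflected back across the nearest boundary.
    # (Hypothetical reading of the activation named in the abstract.)
    y = np.where(x < lo, 2.0 * lo - x, x)
    return np.where(y > hi, 2.0 * hi - y, y)

def make_training_pair(clean, sigma=0.1):
    # Corrupt a non-anomalous patch with additive Gaussian noise
    # (assumed noise model); the regression target is the per-pixel
    # shift that would restore the clean pixel, so the model learns
    # to output the abnormality measure directly, not a reconstruction.
    noise = sigma * rng.standard_normal(clean.shape)
    noisy = np.clip(clean + noise, 0.0, 1.0)
    target = clean - noisy  # shift needed to undo the corruption
    return noisy, target
```

At test time, the magnitude of the predicted shift would serve as the per-pixel anomaly score; on normal pixels the paper reports these scores compress into a tight bound near zero.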
Citation
Adey, P., Akcay, S., Bordewich, M., & Breckon, T. (2021, July). Autoencoders Without Reconstruction for Textural Anomaly Detection. Presented at the 2021 International Joint Conference on Neural Networks (IJCNN), Shenzhen, China.
| Presentation Conference Type | Conference Paper (published) |
|---|---|
| Conference Name | 2021 International Joint Conference on Neural Networks (IJCNN) |
| Start Date | Jul 18, 2021 |
| End Date | Jul 22, 2021 |
| Acceptance Date | Apr 12, 2021 |
| Online Publication Date | Sep 20, 2021 |
| Publication Date | 2021 |
| Deposit Date | Apr 22, 2021 |
| Publicly Available Date | Apr 23, 2021 |
| Publisher | Institute of Electrical and Electronics Engineers |
| DOI | https://doi.org/10.1109/ijcnn52387.2021.9533804 |
| Public URL | https://durham-repository.worktribe.com/output/1141064 |
Files

Accepted Conference Proceeding (PDF, 11.7 MB)
Copyright Statement
© 2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.