A Ranking based Attention Approach for Visual Tracking
Peng, S.; Kamata, S.; Breckon, T.P.
Abstract
Correlation filters (CF) combined with pre-trained convolutional neural network (CNN) feature extractors have shown admirable accuracy and speed in visual object tracking. However, existing CNN-CF based methods still suffer from background interference and boundary effects, even when a cosine window is introduced. This paper proposes a ranking-based attention approach that reduces background interference using only forward propagation. The ranking stores several convolution kernels and scores them. Subsequently, a convolutional Long Short-Term Memory network (ConvLSTM) is used to update this ranking, making it more robust to variation and occlusion. Moreover, a part-based multi-channel convolutional tracker is proposed to obtain the final response map. Our extensive experiments on established benchmark datasets show comparable performance against contemporary tracking approaches.
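To make the idea concrete, the sketch below gives one plausible reading of the ranking-based attention described in the abstract, under stated assumptions: a small pool of convolution kernels applied to pre-trained CNN features, a softmax over per-kernel scores that weights their responses into an attention map using forward passes only, and a recurrent score update standing in for the ConvLSTM. All names here (KernelRanking, ScoreUpdater, num_kernels) and the use of a plain LSTMCell in place of a ConvLSTM are hypothetical choices for illustration, not the authors' implementation.

```python
import torch
import torch.nn.functional as F


class KernelRanking(torch.nn.Module):
    """Sketch of a ranked pool of convolution kernels (hypothetical names).

    A softmax over per-kernel scores weights each kernel's response map, and
    the weighted sum is squashed into a spatial attention map. Only forward
    passes are needed to produce the map at tracking time.
    """

    def __init__(self, in_channels: int, num_kernels: int = 8):
        super().__init__()
        # Pool of 3x3 kernels applied to pre-trained CNN features.
        self.kernels = torch.nn.Parameter(
            0.01 * torch.randn(num_kernels, in_channels, 3, 3))
        # One scalar score per kernel; these define the ranking.
        self.scores = torch.nn.Parameter(torch.zeros(num_kernels))

    def forward(self, feat: torch.Tensor):
        # feat: (B, C, H, W) CNN features of the search region.
        responses = F.conv2d(feat, self.kernels, padding=1)     # (B, K, H, W)
        weights = torch.softmax(self.scores, dim=0)             # (K,)
        attention = (responses * weights.view(1, -1, 1, 1)).sum(1, keepdim=True)
        return torch.sigmoid(attention), responses              # map, raw responses


class ScoreUpdater(torch.nn.Module):
    """Recurrent update of the kernel scores across frames.

    A plain LSTMCell over globally pooled responses stands in for the
    ConvLSTM mentioned in the abstract -- an assumption made for brevity.
    """

    def __init__(self, num_kernels: int = 8, hidden: int = 32):
        super().__init__()
        self.cell = torch.nn.LSTMCell(num_kernels, hidden)
        self.head = torch.nn.Linear(hidden, num_kernels)

    def forward(self, responses: torch.Tensor, state=None):
        pooled = responses.mean(dim=(2, 3))                     # (B, K)
        h, c = self.cell(pooled, state)
        return self.head(h), (h, c)                             # new scores, new state


if __name__ == "__main__":
    feat = torch.randn(1, 512, 31, 31)       # e.g. conv features of a search patch
    rank = KernelRanking(in_channels=512)
    update = ScoreUpdater()

    attn, resp = rank(feat)                   # forward-only attention map
    new_scores, state = update(resp)          # recurrent score refresh
    with torch.no_grad():
        rank.scores.copy_(new_scores.squeeze(0))
    print(attn.shape)                         # torch.Size([1, 1, 31, 31])
```

In an actual tracker the resulting attention map would modulate the features fed to the correlation filter or the part-based convolutional tracker; the toy main block above only demonstrates shapes and the forward-only update loop.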
Citation
Peng, S., Kamata, S., & Breckon, T. (2019, September). A Ranking based Attention Approach for Visual Tracking. Presented at the 26th IEEE International Conference on Image Processing (ICIP), Taipei, Taiwan.
| Presentation Conference Type | Conference Paper (published) |
| --- | --- |
| Conference Name | 26th IEEE International Conference on Image Processing (ICIP) |
| Start Date | Sep 22, 2019 |
| End Date | Sep 25, 2019 |
| Acceptance Date | Apr 30, 2019 |
| Publication Date | Sep 1, 2019 |
| Deposit Date | Jun 4, 2019 |
| Publicly Available Date | Nov 12, 2019 |
| Pages | 3073-3077 |
| Series ISSN | 2381-8549 |
| Book Title | 2019 IEEE International Conference on Image Processing (ICIP): proceedings |
| DOI | https://doi.org/10.1109/icip.2019.8803358 |
| Public URL | https://durham-repository.worktribe.com/output/1142059 |
Files
Accepted Conference Proceeding (PDF, 758 KB)
Copyright Statement
© 2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.