DeGraF-Flow: Extending DeGraF Features for Accurate and Efficient Sparse-to-Dense Optical Flow Estimation
Stephenson, F.; Breckon, T.P.; Katramados, I.
Abstract
Modern optical flow methods make use of salient scene feature points, detected and matched between frames, as a basis for sparse-to-dense optical flow estimation. Current feature detectors, however, either give sparse, non-uniform point clouds (resulting in flow inaccuracies) or lack the efficiency for frame-rate real-time applications. In this work we use the novel Dense Gradient Based Features (DeGraF) as the input to a sparse-to-dense optical flow scheme. This consists of three stages: 1) efficient detection of uniformly distributed Dense Gradient Based Features (DeGraF) [1]; 2) feature tracking via robust local optical flow [2]; and 3) edge-preserving flow interpolation [3] to recover overall dense optical flow. The tunable density and uniformity of DeGraF features yield superior dense optical flow estimation compared to other popular feature detectors within this three-stage pipeline. Furthermore, the competitive speed of DeGraF feature detection lends itself well to the aim of real-time optical flow recovery. Evaluation on established real-world benchmark datasets, representative of an autonomous vehicle setting, shows that DeGraF-Flow achieves promising accuracy with competitive computational efficiency among non-GPU based methods, including a marked increase in speed over the conceptually similar EpicFlow approach.
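The three-stage pipeline above can be sketched with standard OpenCV building blocks. Note the assumptions: the DeGraF detector and the robust local optical flow (RLOF) tracker used in the paper are not assumed to be available here, so `cv2.goodFeaturesToTrack` stands in for stage 1 and pyramidal Lucas-Kanade tracking stands in for stage 2; stage 3 uses `cv2.ximgproc.createEdgeAwareInterpolator` (an EPIC-style edge-preserving interpolator, requiring opencv-contrib-python). This is a minimal illustrative sketch of the sparse-to-dense structure, not the authors' implementation.

```python
import cv2
import numpy as np


def sparse_to_dense_flow(img1, img2, max_corners=4000):
    """Estimate a dense flow field from img1 to img2 (8-bit BGR frames)
    using a sparse-to-dense pipeline analogous to the one in the abstract."""
    gray1 = cv2.cvtColor(img1, cv2.COLOR_BGR2GRAY)
    gray2 = cv2.cvtColor(img2, cv2.COLOR_BGR2GRAY)

    # Stage 1: sparse feature detection.
    # Stand-in for DeGraF, which would give a denser, more uniform point set.
    pts1 = cv2.goodFeaturesToTrack(gray1, maxCorners=max_corners,
                                   qualityLevel=0.01, minDistance=5)

    # Stage 2: sparse feature tracking.
    # Stand-in for the robust local optical flow (RLOF) tracker.
    pts2, status, _ = cv2.calcOpticalFlowPyrLK(gray1, gray2, pts1, None,
                                               winSize=(21, 21), maxLevel=3)
    good = status.ravel() == 1
    src = pts1.reshape(-1, 2)[good].astype(np.float32)
    dst = pts2.reshape(-1, 2)[good].astype(np.float32)

    # Stage 3: edge-preserving interpolation of the sparse matches into a
    # dense H x W x 2 flow field (EPIC-style, as in EpicFlow).
    interpolator = cv2.ximgproc.createEdgeAwareInterpolator()
    return interpolator.interpolate(img1, src, img2, dst)
```

Substituting a DeGraF detector into stage 1 and an RLOF tracker into stage 2 would turn this skeleton into the pipeline described in the paper; the point of the approach is that those substitutions improve the density and uniformity of the matches fed to the interpolation stage.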
Citation
Stephenson, F., Breckon, T., & Katramados, I. (2019, September). DeGraF-Flow: Extending DeGraF Features for Accurate and Efficient Sparse-to-Dense Optical Flow Estimation. Presented at the 26th IEEE International Conference on Image Processing (ICIP), Taipei, Taiwan.
Presentation Conference Type | Conference Paper (published) |
---|---|
Conference Name | 26th IEEE International Conference on Image Processing (ICIP) |
Start Date | Sep 22, 2019 |
End Date | Sep 25, 2019 |
Acceptance Date | Apr 30, 2019 |
Publication Date | 2019 |
Deposit Date | Jun 4, 2019 |
Publicly Available Date | Nov 12, 2019 |
Pages | 1277-1281 |
Series ISSN | 2381-8549 |
Book Title | 2019 IEEE International Conference on Image Processing (ICIP): Proceedings |
DOI | https://doi.org/10.1109/icip.2019.8803739 |
Public URL | https://durham-repository.worktribe.com/output/1142683 |
Files
Accepted Conference Proceeding (PDF, 5.2 MB)
Copyright Statement
© 2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.