Please use this identifier to cite or link to this item:
Title: Feature Correlation Transformer for Estimating Ambiguous Optical Flow
Authors: Fang, Guibiao
Chen, Junhong
Liang, Dayong
Asim, Muhammad
Yang, Zhenguo
Liu, Wenyin
Issue Date: 2023
Publisher: Springer Science and Business Media LLC
Status: Early view
Abstract: Cost volume is widely used to establish correspondences in optical flow estimation. However, in low-texture and occluded areas it is difficult to estimate the cost volume correctly. Therefore, we propose a replacement: the feature correlation transformer (FCTR), a transformer with alternating self- and cross-attention for obtaining global receptive fields, and positional embedding for establishing correspondences. With global context and positional information, FCTR can produce more accurate correspondences in ambiguous areas. Using position-embedded features allows the removal of the context network; positional information can be aggregated within ambiguous motion boundaries, and the number of model parameters is reduced. To speed up network convergence and strengthen robustness, we introduce a smooth L1 loss with exponential weights in the pre-training step. At the time of submission, our method achieves competitive performance with all published optical flow methods on both the KITTI-2015 and MPI-Sintel benchmarks. Moreover, it outperforms all optical flow and scene flow methods in KITTI-2015 foreground-region prediction.
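The abstract mentions a "smooth L1 loss with exponential weights" used during pre-training but does not spell out its form. Below is a minimal sketch of one common formulation (a RAFT-style sequence loss, in which each of the N iterative flow predictions is penalized with a smooth L1 term weighted by gamma**(N-1-i)). The function names, the `beta` threshold, and the value `gamma = 0.8` are illustrative assumptions, not the paper's exact definition.

```python
import numpy as np

def smooth_l1(pred, gt, beta=1.0):
    # Elementwise smooth L1 (Huber-style): quadratic below beta, linear above,
    # averaged over all flow components.
    diff = np.abs(pred - gt)
    return np.where(diff < beta, 0.5 * diff ** 2 / beta, diff - 0.5 * beta).mean()

def sequence_loss(preds, gt, gamma=0.8):
    # Exponentially weighted sum over N iterative predictions: later, more
    # refined predictions receive weights closer to 1 (gamma**(N-1-i)).
    n = len(preds)
    return sum(gamma ** (n - 1 - i) * smooth_l1(p, gt) for i, p in enumerate(preds))
```

With this weighting, early coarse estimates contribute to the gradient but the final refined prediction dominates, which is the usual rationale for exponential weights in iterative flow estimators.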
Keywords: Optical flow; Cost volume; Ambiguous correspondence; Transformer; Alternating attention
Document URI:
ISSN: 1370-4621
e-ISSN: 1573-773X
DOI: 10.1007/s11063-023-11273-6
ISI #: 000982933300003
Category: A1
Type: Journal Contribution
Appears in Collections:Research publications

Files in This Item:
File                        Description                   Size     Format
(Restricted Access)         Early view                    2.56 MB  Adobe PDF
FCTR(ral)-Final-提交版.pdf  Peer-reviewed author version  3.79 MB  Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.