TechRxiv

CoT-AMFlow: Adaptive Modulation Network with Co-Teaching Strategy for Unsupervised Optical Flow Estimation

Preprint posted on 2020-12-03, 16:08, authored by Hengli Wang, Rui Fan, Ming Liu
The interpretation of ego motion and scene change is a fundamental task for mobile robots. Optical flow information can be employed to estimate motion in the surroundings. Recently, unsupervised optical flow estimation has become a research hotspot. However, unsupervised approaches are often unreliable in partially occluded or texture-less regions. To address this problem, we propose CoT-AMFlow, an unsupervised optical flow estimation approach. In terms of the network architecture, we develop an adaptive modulation network that employs two novel module types, flow modulation modules (FMMs) and cost volume modulation modules (CMMs), to remove outliers in challenging regions. As for the training paradigm, we adopt a co-teaching strategy, where two networks simultaneously teach each other about challenging regions to further improve accuracy. Experimental results on the MPI Sintel, KITTI Flow and Middlebury Flow benchmarks demonstrate that our CoT-AMFlow outperforms all other state-of-the-art unsupervised approaches, while still running in real time. Our project page is available at https://sites.google.com/view/cot-amflow.
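To make the co-teaching idea more concrete, below is a minimal sketch of a generic co-teaching update in PyTorch: two networks each score training samples with an unsupervised per-sample loss, and the samples one network currently handles best are used to update its peer. The names `net_a`, `net_b`, `per_sample_loss`, and `keep_ratio` are hypothetical placeholders; the paper's actual exchange operates on challenging regions and uses its own losses, so this is an illustrative sketch rather than the authors' exact scheme.

```python
import torch

def co_teaching_step(net_a, net_b, opt_a, opt_b, img1, img2,
                     per_sample_loss, keep_ratio=0.7):
    """One illustrative co-teaching update (generic scheme, not the paper's exact one).

    Each network predicts flow for the frame pair and receives a per-sample
    unsupervised loss; the small-loss samples selected by one network are
    used to update the peer, so the two networks teach each other.
    """
    # Per-sample losses, shape (B,); per_sample_loss wraps e.g. a photometric loss.
    loss_a = per_sample_loss(net_a(img1, img2), img1, img2)
    loss_b = per_sample_loss(net_b(img1, img2), img1, img2)

    # Each network nominates the fraction of samples it currently fits best.
    k = max(1, int(keep_ratio * img1.size(0)))
    idx_a = torch.argsort(loss_a.detach())[:k]
    idx_b = torch.argsort(loss_b.detach())[:k]

    # Cross-update: net_a learns from net_b's selection and vice versa.
    opt_a.zero_grad()
    loss_a[idx_b].mean().backward()
    opt_a.step()

    opt_b.zero_grad()
    loss_b[idx_a].mean().backward()
    opt_b.step()

    return loss_a.detach().mean(), loss_b.detach().mean()
```

The cross-selection is the key design choice: because the two networks start from different initializations, they disagree on which samples (or regions) are hard, and exchanging their "clean" selections keeps either network from reinforcing its own errors.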

Funding

National Natural Science Foundation of China No. U1713211

Collaborative Research Fund of the Research Grants Council, Hong Kong, No. C4063-18

HKUST-SJTU Joint Research Collaboration Fund No. SJTU20EG03

History

Email Address of Submitting Author

ranger_fan@outlook.com

ORCID of Submitting Author

0000-0003-2593-6596

Submitting Author's Institution

UC San Diego

Submitting Author's Country

United States of America