Estimating the motion of moving targets from a moving platform is an extremely challenging problem in unmanned systems research. One common and often successful approach is to use optical flow to estimate and compensate for the ego-motion of the platform and then track the motion of surrounding objects. However, in the presence of video degradation such as noise, compression artifacts, and reduced frame rates, the performance of state-of-the-art optical flow algorithms diminishes greatly. We examine the effects of video degradation on two well-known optical flow datasets as well as on real-world video data. To highlight the need for optical flow algorithms that are robust under real-world conditions, we present both qualitative and quantitative results on these datasets.