A computational theory for the perception of coherent visual motion

Abstract
When we see motion, our perception of how one image feature moves depends on the behaviour of other features nearby. In particular, the Gestaltists proposed the law of shared common fate, according to which nearby features tend to be perceived as moving together, that is, coherently. Recent psychophysical findings, such as the cooperativity of the motion system and motion capture, support this law. Computationally, coherence is a sensible assumption, because if two features are close together, they probably belong to the same object and thus tend to move together. Moreover, the measurement of local motion may be inaccurate, so integrating motion information over large areas may improve performance. Present theories of visual motion, however, do not account fully for these coherent motion percepts. We propose here a theory that does account for these phenomena and also provides a solution to the aperture problem, in which the local information in the image flow is insufficient to specify the motion uniquely.