Determining Optical Flow

1981 | Berthold K.P. Horn and Brian G. Schunck
The paper by Berthold K.P. Horn and Brian G. Schunck presents a method for computing optical flow from a sequence of images. Optical flow is the distribution of apparent velocities of brightness patterns in an image, and it can yield information about the spatial arrangement of objects and the rate of change of that arrangement.

The authors note that optical flow cannot be computed locally, because each image point supplies only one constraint (the rate of change of image brightness) while the flow velocity has two unknown components (in the x and y directions). They therefore introduce a second constraint, based on the assumption that the apparent velocity of the brightness pattern varies smoothly almost everywhere in the image. The method is an iterative scheme that minimizes the brightness-change error while also enforcing the smoothness constraint on the flow velocity. The algorithm is robust to quantization and noise in the image data and can handle coarse quantization in space and time.

The authors demonstrate the method through experiments with synthetic image sequences, showing that it accurately computes optical flow for various motion patterns, including linear translation, rotation, and contraction. They also discuss its limitations: the accuracy of the estimates is affected by noisy, quantized measurements, and discontinuities in the flow, such as those occurring at occluding boundaries, pose difficulties. The paper concludes by summarizing the key contributions of the work and highlighting practical applications in computer vision and image processing.
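The two constraints and the resulting iterative scheme can be written, following the paper's notation (with E the image brightness and subscripts denoting partial derivatives), as:

```latex
% Brightness-constancy constraint: one linear equation in the two
% unknown velocity components (u, v) at each image point.
E_x u + E_y v + E_t = 0

% Error functional minimized by the method: the brightness-change
% error plus an \alpha^2-weighted penalty on departures from smoothness.
\varepsilon^2 = \iint \left[ (E_x u + E_y v + E_t)^2
  + \alpha^2 \left( \|\nabla u\|^2 + \|\nabla v\|^2 \right) \right] dx \, dy

% Iterative update derived from the Euler-Lagrange equations, where
% \bar{u}^n, \bar{v}^n are local averages of the current flow estimate:
u^{n+1} = \bar{u}^n - \frac{E_x \left( E_x \bar{u}^n + E_y \bar{v}^n + E_t \right)}
                           {\alpha^2 + E_x^2 + E_y^2}, \qquad
v^{n+1} = \bar{v}^n - \frac{E_y \left( E_x \bar{u}^n + E_y \bar{v}^n + E_t \right)}
                           {\alpha^2 + E_x^2 + E_y^2}
```

The single brightness equation underdetermines (u, v) at each point, which is why the smoothness term (weighted by the parameter alpha) is needed to make the problem well posed.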
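As a concrete illustration, the iterative scheme can be sketched in a few lines of NumPy. This is a minimal sketch rather than the paper's exact implementation: it uses simple finite differences for the brightness derivatives and a plain 4-neighbor average, whereas the paper estimates derivatives and local averages with weighted masks over a 2x2x2 cube of samples.

```python
import numpy as np

def horn_schunck(im1, im2, alpha=1.0, n_iter=100):
    """Estimate a dense optical flow field (u, v) between two grayscale frames.

    Sketch of the Horn-Schunck iteration: the brightness-constancy
    residual Ix*u + Iy*v + It is traded off against a smoothness
    penalty weighted by alpha**2.
    """
    im1 = np.asarray(im1, dtype=np.float64)
    im2 = np.asarray(im2, dtype=np.float64)

    # Finite-difference estimates of the brightness derivatives
    # (central differences in space, forward difference in time).
    Ix = np.gradient(im1, axis=1)
    Iy = np.gradient(im1, axis=0)
    It = im2 - im1

    u = np.zeros_like(im1)
    v = np.zeros_like(im1)

    def local_avg(f):
        # 4-neighbor average with wrap-around boundaries
        # (the paper uses a weighted 8-neighbor mask).
        return 0.25 * (np.roll(f, 1, axis=0) + np.roll(f, -1, axis=0) +
                       np.roll(f, 1, axis=1) + np.roll(f, -1, axis=1))

    for _ in range(n_iter):
        u_bar = local_avg(u)
        v_bar = local_avg(v)
        # Update derived from the Euler-Lagrange equations of the
        # combined brightness-error + smoothness functional.
        num = Ix * u_bar + Iy * v_bar + It
        den = alpha**2 + Ix**2 + Iy**2
        u = u_bar - Ix * num / den
        v = v_bar - Iy * num / den
    return u, v
```

For example, shifting a smooth periodic pattern one pixel to the right between frames should yield a recovered u field averaging near +1 and a v field near 0, consistent with the pure-translation experiments reported in the paper.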