COOPERATIVE COMPUTATION OF STEREO DISPARITY

June 1976 | D. Marr and T. Poggio
This paper presents a cooperative algorithm for computing stereo disparity, demonstrated on random-dot stereograms. The algorithm rests on two constraints: uniqueness (each item in an image can be assigned only one disparity value) and continuity (disparity varies smoothly almost everywhere).

The algorithm is implemented as a network of nodes, where each node represents a possible match between the two images at a particular position and disparity. Interactions between nodes enforce the constraints: nodes in the same disparity layer excite nearby neighbors (continuity), while nodes along the same lines of sight inhibit one another (uniqueness). Iterating these local operations, the network converges to a stable state that satisfies both constraints and thereby solves the correspondence problem for random-dot stereograms; a sketch of the update appears below.
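To make the node dynamics concrete, here is a minimal sketch in Python/NumPy of one plausible reading of the cooperative update: each node C(x, y, d) sums excitation from active neighbors in its own disparity layer, subtracts inhibition from active nodes along its two lines of sight, re-injects the initial match state, and is thresholded. The function name `cooperative_stereo`, the 3x3 excitatory window, and the values of `epsilon` and `theta` are illustrative assumptions, not necessarily the paper's exact configuration.

```python
import numpy as np

def cooperative_stereo(left, right, d_max, n_iters=10,
                       epsilon=2.0, theta=4.0):
    """Sketch of a cooperative disparity network (one plausible reading).

    left, right -- 2-D binary arrays (a random-dot stereogram pair)
    d_max       -- largest disparity considered (layers 0..d_max)
    epsilon     -- inhibition constant; theta -- firing threshold
    """
    h, w = left.shape
    n_d = d_max + 1

    # Initial state C0: node (x, y, d) is active when the left pixel in
    # column x matches the right pixel in column x + d on the same row.
    c0 = np.zeros((h, w, n_d))
    for d in range(n_d):
        c0[:, :w - d, d] = (left[:, :w - d] == right[:, d:]).astype(float)

    c = c0.copy()
    for _ in range(n_iters):
        # Excitation (continuity): active neighbours in the SAME
        # disparity layer, here a 3x3 window around each node.
        excite = np.zeros_like(c)
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                if dy or dx:
                    excite += np.roll(np.roll(c, dy, axis=0), dx, axis=1)

        # Inhibition (uniqueness): active nodes sharing either line of
        # sight -- the same left-eye column x, or the same right-eye
        # column x + d.  (np.roll wraps at the borders; a simplification.)
        inhibit = np.zeros_like(c)
        for d in range(n_d):
            for dp in range(n_d):
                if dp == d:
                    continue
                inhibit[:, :, d] += c[:, :, dp]                           # left-eye ray
                inhibit[:, :, d] += np.roll(c[:, :, dp], dp - d, axis=1)  # right-eye ray

        # Threshold the summed support, re-injecting the initial state.
        c = ((excite - epsilon * inhibit + c0) >= theta).astype(float)

    # Crude readout: the first active disparity layer at each position.
    return c.argmax(axis=2)
```

The wrap-around at the image borders introduced by `np.roll` is a shortcut; in the network itself, edge nodes would simply have fewer neighbors.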
Because global order arises from purely local operations, the cooperative model stands in contrast to traditional correlation techniques, which require a minimum or maximum correlation area to be specified in advance. The algorithm's ability to handle a wide range of disparity values without a fixed scale is a key feature, and it can be regarded as a generalized form of correlation.

The model also has implications for how the visual system processes stereo information. The paper discusses the possibility that the brain uses a similar cooperative mechanism, with multiple disparity layers or "pools" sensitive to different disparity values, although the exact number and nature of these layers remain unclear.

Finally, the paper considers potential applications to other areas of visual processing, such as "filling-in" phenomena, subjective contours, and motion correspondence, while emphasizing that the first step in understanding any such process is to formulate the underlying computation precisely. It concludes that, although the algorithm is a promising approach to stereo disparity computation, further research is needed to establish its biological relevance and to determine whether it extends to other perceptual processes.
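As a hypothetical end-to-end usage of the sketch above: build a random-dot stereogram by shifting a central square of a random binary field, then let the network settle. All sizes, the seed, and the square's disparity are arbitrary choices for illustration.

```python
import numpy as np
# cooperative_stereo is the sketch defined above

rng = np.random.default_rng(0)

# Random-dot stereogram: a random binary field whose central square is
# shifted 2 columns to the right in the right image (disparity 2).
h, w, shift = 64, 64, 2
left = rng.integers(0, 2, size=(h, w))
right = left.copy()
right[16:48, 16 + shift:48 + shift] = left[16:48, 16:48]
# Refill the strip uncovered by the shift with fresh random dots.
right[16:48, 16:16 + shift] = rng.integers(0, 2, size=(32, shift))

disparity = cooperative_stereo(left, right, d_max=4, n_iters=12)
print(disparity[32, 20:44])   # interior of the square: mostly 2
print(disparity[8, 20:44])    # background rows: mostly 0
```

With these toy parameters the network may need tuning (for example, a larger excitatory neighborhood) to settle cleanly; the point is the structure of the computation, not the exact constants.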