SOME CONCEPTS OF DEPENDENCE

2012 | E. L. LEHMANN
This paper discusses concepts of dependence between pairs of random variables (X, Y), focusing on positive and negative quadrant dependence. It introduces three progressively stronger definitions of positive dependence and explores their implications. The first, quadrant dependence, compares the probability of a quadrant under the joint distribution with its probability under independence: the pair is positively quadrant dependent when P(X ≤ x, Y ≤ y) ≥ P(X ≤ x)P(Y ≤ y) for all x and y, and negatively quadrant dependent when the inequality is reversed. The paper also examines properties of quadrant dependent distributions, such as the relationship between covariance and positive dependence.

The paper then applies these concepts to slippage problems, in which the goal is to test hypotheses about parameters that may have "slipped" or changed. It gives examples of distributions exhibiting positive or negative quadrant dependence, such as bivariate normal and multinomial distributions. It also explores regression dependence: Y is positively regression dependent on X when the conditional distribution of Y given X = x is stochastically non-decreasing in x.

The paper further discusses unbiasedness of tests of independence, showing that certain tests are unbiased against alternatives in specific families of distributions. It then introduces likelihood ratio dependence, a stronger form of dependence in which the conditional density of Y given X has a monotone likelihood ratio; examples include bivariate normal distributions and order statistics. The paper concludes by discussing the implications of these concepts for statistical inference and the development of tests of independence and association.
It emphasizes the importance of understanding different forms of dependence in statistical analysis and the role of these concepts in the design and interpretation of statistical tests.
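For reference, the three progressively stronger notions summarized above can be written as follows (a sketch of the standard formulations; the TP2 form of likelihood ratio dependence is the commonly used equivalent of the monotone-likelihood-ratio condition):

```latex
% Positive quadrant dependence (PQD):
\Pr(X \le x,\, Y \le y) \;\ge\; \Pr(X \le x)\,\Pr(Y \le y)
\quad \text{for all } x, y.

% Positive regression dependence:
% P(Y \le y \mid X = x) is non-increasing in x for every y,
% i.e. Y is stochastically non-decreasing in x.

% Positive likelihood ratio dependence (TP2 density):
f(x, y)\, f(x', y') \;\ge\; f(x, y')\, f(x', y)
\quad \text{whenever } x < x',\; y < y'.
```

Each condition implies the one above it, which is the sense in which the three definitions are progressively stronger.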