The paper by G. Lorden of the California Institute of Technology addresses a problem of optimal stopping arising in quality control and reliability theory. The author formulates a model in which independent random variables $X_1, X_2, \cdots$ are observed sequentially, with $X_1, \cdots, X_{m-1}$ following a distribution $F_0$ and $X_m, X_{m+1}, \cdots$ following a different distribution $F_1$, the change point $m$ being unknown. The goal is a stopping rule that reacts as quickly as possible after the change from $F_0$ to $F_1$, while controlling the frequency of false reactions, that is, reactions occurring before the change.
Lorden proposes a minimax criterion, $\bar{E}_1 N$: the expected number of observations taken from the change point until the reaction, maximized over the possible change points and over the observations preceding the change. He seeks to minimize this criterion subject to a lower bound on the expected time to a (false) reaction when no change ever occurs. The paper studies Page's procedure, which stops the first time the cumulative sum of the log-likelihood ratios, measured from its current minimum, exceeds a threshold, and shows that this procedure asymptotically minimizes $\bar{E}_1 N$ as the false-reaction bound tends to infinity.
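To make Page's rule concrete, the sketch below implements it via the standard recursion $W_n = \max(0, W_{n-1} + Z_n)$, where $Z_n$ is the log-likelihood ratio of the $n$-th observation. The normal densities, the change point, and the threshold are illustrative assumptions, not values taken from the paper.

```python
import random

def page_cusum(observations, log_likelihood_ratio, threshold):
    """Page's rule: react at the first n such that
    max_{1<=k<=n} sum_{i=k..n} log(f1(X_i)/f0(X_i)) >= threshold,
    computed with the recursion W_n = max(0, W_{n-1} + Z_n), W_0 = 0."""
    w = 0.0
    for n, x in enumerate(observations, start=1):
        w = max(0.0, w + log_likelihood_ratio(x))
        if w >= threshold:
            return n      # reaction time N
    return None           # no reaction within the observed sample

# Illustrative densities: N(0, 1) before the change, N(1, 1) after it.
def llr_unit_mean_shift(x):
    # log[f1(x) / f0(x)] for unit-variance normals with means 1 and 0
    return x - 0.5

random.seed(0)
change_point = 200  # m, unknown to the procedure
data = ([random.gauss(0.0, 1.0) for _ in range(change_point - 1)] +
        [random.gauss(1.0, 1.0) for _ in range(300)])
print(page_cusum(data, llr_unit_mean_shift, threshold=5.0))
```

In Lorden's analysis the threshold is tied to the false-reaction constraint: roughly, a threshold of order $\log \gamma$ keeps the expected time to a false reaction at least $\gamma$, and the worst-case expected delay then grows like $\log \gamma / I(F_1, F_0)$, where $I(F_1, F_0)$ is the Kullback-Leibler information number.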
The paper also extends the problem to a one-parameter family of distributions $\{F_\theta\}$ in which the post-change parameter $\theta$ is unknown, and gives conditions under which procedures built from one-sided sequential tests achieve asymptotic optimality. The results are applied to specific families, notably Koopman-Darmois (exponential) families, for which explicit stopping boundaries are derived.
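As a minimal illustration of the unknown-parameter case, the sketch below uses the one-sided generalized likelihood ratio for a unit-variance normal family with pre-change mean 0 and an unknown positive post-change mean. The statistic, threshold, and simulated data are assumptions chosen for illustration; they are not the explicit boundaries derived in the paper.

```python
import random

def glr_change_detector(observations, threshold):
    """One-sided generalized likelihood ratio statistic for an increase in
    the mean of a unit-variance normal sequence (a Koopman-Darmois family):
    react at the first n with
        max_{1<=k<=n} max(S_{k:n}, 0)^2 / (2 * (n - k + 1)) >= threshold,
    where S_{k:n} = X_k + ... + X_n; the maximum over the unknown
    post-change mean has been carried out in closed form."""
    xs = []
    for n, x in enumerate(observations, start=1):
        xs.append(x)
        s, stat = 0.0, 0.0
        for k in range(n, 0, -1):       # candidate change points
            s += xs[k - 1]              # s = S_{k:n}
            stat = max(stat, max(s, 0.0) ** 2 / (2.0 * (n - k + 1)))
        if stat >= threshold:
            return n
    return None

random.seed(1)
data = ([random.gauss(0.0, 1.0) for _ in range(150)] +
        [random.gauss(0.7, 1.0) for _ in range(150)])
print(glr_change_detector(data, threshold=6.0))
```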
Finally, the paper includes examples and applications in quality control and reliability theory, demonstrating how the proposed procedures can be used to detect changes in mean values and failure rates.
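As a usage sketch for the reliability setting, the same cumulative-sum recursion can watch for an increase in failure rate. The exponential model with rates 1.0 before and 2.0 after the change, and the threshold, are illustrative assumptions.

```python
import math
import random

def exponential_llr(x, lam0=1.0, lam1=2.0):
    # log[f1(x) / f0(x)] for exponential densities with rates lam1 and lam0
    return math.log(lam1 / lam0) - (lam1 - lam0) * x

random.seed(2)
# Lifetimes with failure rate 1.0 before the change and 2.0 afterwards.
lifetimes = ([random.expovariate(1.0) for _ in range(100)] +
             [random.expovariate(2.0) for _ in range(200)])

w, reaction = 0.0, None
for n, t in enumerate(lifetimes, start=1):
    w = max(0.0, w + exponential_llr(t))
    if w >= 5.0:          # illustrative threshold
        reaction = n
        break
print(reaction)
```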