Determinantal point processes (DPPs) are probabilistic models that capture repulsion between elements, arising in quantum physics and random matrix theory. Unlike traditional models such as Markov random fields, DPPs offer efficient algorithms for sampling, marginalization, and conditioning, making them well suited to machine learning tasks. This paper provides an introduction to DPPs, focusing on their relevance to machine learning, and discusses their applications in diverse areas such as text summarization, image search, and news timeline extraction. DPPs model diversity by assigning higher probability to sets of items that are less similar, leveraging a kernel matrix that defines similarity between items. The paper also explores extensions of DPPs, including k-DPPs, which control the number of selected items, and structured DPPs, which model complex, structured data. Efficient inference algorithms are presented, along with theoretical results that enable practical modeling and learning. The paper highlights the versatility of DPPs in capturing diverse sets of data, offering a probabilistic framework that complements traditional methods in handling structured, complex data.
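The diversity-favoring behavior described above can be sketched numerically. In the standard L-ensemble formulation, a subset Y of the ground set has probability proportional to the determinant of the kernel submatrix indexed by Y, normalized by det(L + I). The kernel matrix and item indices below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def dpp_probability(L, subset):
    """Probability of `subset` under an L-ensemble DPP: det(L_Y) / det(L + I)."""
    L = np.asarray(L, dtype=float)
    idx = np.ix_(subset, subset)
    return np.linalg.det(L[idx]) / np.linalg.det(L + np.eye(L.shape[0]))

# Hypothetical 3-item kernel: items 0 and 1 are highly similar
# (large off-diagonal entry), item 2 is dissimilar to both.
L = np.array([[1.0, 0.9, 0.0],
              [0.9, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

p_similar = dpp_probability(L, [0, 1])  # near-duplicate pair
p_diverse = dpp_probability(L, [0, 2])  # dissimilar pair

# Determinants shrink as rows become linearly dependent, so the
# similar pair is penalized relative to the diverse pair.
print(p_similar < p_diverse)
```

Geometrically, det(L_Y) is the squared volume spanned by the feature vectors of the items in Y, which is why near-duplicate items jointly receive low probability.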