Determinantal Point Processes (DPPs) are elegant probabilistic models that arise in quantum physics and random matrix theory, offering efficient algorithms for sampling, marginalization, conditioning, and other inference tasks. Unlike traditional structured models like Markov Random Fields, DPPs handle negative correlations effectively. This paper provides an introduction to DPPs, focusing on their intuition, algorithms, and extensions relevant to machine learning. DPPs are particularly useful for tasks requiring diverse sets, such as finding diverse search results, summarizing documents, modeling non-overlapping human poses in images, and building timelines of news stories. The paper covers the mathematical background, representation, algorithms, learning, and extensions like $k$-DPPs and structured DPPs, demonstrating their practical applications through various examples and experiments.
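To make the diversity-promoting behavior concrete, the following is a minimal sketch (not the paper's implementation) of the standard L-ensemble formulation, $P(Y = S) \propto \det(L_S)$, using a hypothetical $3 \times 3$ similarity kernel `L` in which items 0 and 1 are nearly redundant and item 2 is dissimilar to both:

```python
import numpy as np
from itertools import chain, combinations

# Hypothetical toy kernel: items 0 and 1 are highly similar (entry 0.9),
# item 2 is dissimilar to both. L must be positive semidefinite.
L = np.array([
    [1.0, 0.9, 0.0],
    [0.9, 1.0, 0.0],
    [0.0, 0.0, 1.0],
])

def unnormalized_dpp(L, subset):
    """Unnormalized L-ensemble probability: P(Y = S) is proportional
    to the determinant of the principal submatrix L_S."""
    idx = list(subset)
    if not idx:
        return 1.0  # det of the empty submatrix is 1 by convention
    return float(np.linalg.det(L[np.ix_(idx, idx)]))

n = L.shape[0]
all_subsets = list(
    chain.from_iterable(combinations(range(n), k) for k in range(n + 1))
)

# Normalizer: summing det(L_S) over all subsets S equals det(L + I).
Z = sum(unnormalized_dpp(L, S) for S in all_subsets)
probs = {S: unnormalized_dpp(L, S) / Z for S in all_subsets}
```

Because $\det(L_S)$ shrinks as the rows for similar items become linearly dependent, the diverse pair `{0, 2}` receives much higher probability than the redundant pair `{0, 1}`, which is exactly the negative-correlation behavior the abstract describes.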