BERTopic is a topic modeling technique that extends the cluster-embedding approach by leveraging state-of-the-art language models and applying a class-based TF-IDF procedure to generate coherent topic representations. The model first converts documents into embeddings using pre-trained transformer-based language models, then clusters these embeddings, and finally extracts topic representations with a class-based TF-IDF procedure. This pipeline allows BERTopic to produce coherent topics and to remain competitive across various benchmarks against both classical models and more recent clustering-based topic models. The paper discusses the strengths and weaknesses of BERTopic, highlighting its flexibility, its stability across different language models, and its ability to handle dynamic topic modeling. Experimental results demonstrate that BERTopic performs well in terms of topic coherence and topic diversity, and that it scales effectively with new developments in language models.
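The class-based TF-IDF step can be illustrated with a minimal sketch. The idea is to treat all documents assigned to one cluster as a single pseudo-document for that class, and weight each term by its frequency within the class times an inverse class frequency. This sketch assumes the weighting W(t, c) = tf(t, c) · log(1 + A / f(t)), where tf(t, c) is the count of term t in class c, f(t) is the count of t across all classes, and A is the average number of words per class; the function name `class_tfidf` and the input format are illustrative, not the library's API:

```python
from collections import Counter
from math import log

def class_tfidf(classes):
    """Compute class-based TF-IDF weights.

    classes: dict mapping a class (topic) id to the concatenated list of
    tokens from all documents in that class.
    Returns: dict mapping class id -> {term: weight}.
    """
    # Term frequency per class (each class treated as one pseudo-document).
    tf = {c: Counter(tokens) for c, tokens in classes.items()}
    # Frequency of each term across all classes combined.
    f = Counter()
    for counts in tf.values():
        f.update(counts)
    # Average number of words per class.
    A = sum(len(tokens) for tokens in classes.values()) / len(classes)
    # Weight terms that are frequent in a class but rare overall.
    return {
        c: {t: n * log(1 + A / f[t]) for t, n in counts.items()}
        for c, counts in tf.items()
    }
```

Terms concentrated in one class receive higher weights than terms spread across all classes, so the top-weighted terms per class serve as that topic's representation.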