# close-mar11Top10
This is a BERTopic model. BERTopic is a flexible and modular topic modeling framework that allows for the generation of easily interpretable topics from large datasets.
## Usage

To use this model, please install BERTopic:

```bash
pip install -U bertopic
```
You can then load the model and inspect its topics as follows:

```python
from bertopic import BERTopic

# Load the trained topic model from the Hugging Face Hub
topic_model = BERTopic.load("Thang203/close-mar11Top10")

# Overview of all topics: ID, size, and representative keywords
topic_model.get_topic_info()
```
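The loaded model can also assign topics to unseen documents with `transform`. The snippet below is a minimal sketch; the example documents are placeholders, not part of the training data:

```python
# Assign topics to new, unseen documents (placeholder examples)
new_docs = [
    "Large language models are changing classroom instruction.",
    "Adversarial prompts can be used to attack LLM-based systems.",
]

# Returns one topic ID per document; probabilities are None because the model
# was trained with calculate_probabilities=False (see hyperparameters below).
topics, probs = topic_model.transform(new_docs)
print(topics)
```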
## Topic overview
- Number of topics: 10
- Number of training documents: 4147
The table below gives an overview of all topics.
| Topic ID | Topic Keywords | Topic Frequency | Label |
|---|---|---|---|
| -1 | models - language - llms - gpt - chatgpt | 11 | -1_models_language_llms_gpt |
| 0 | models - language - llms - large - language models | 1366 | 0_models_language_llms_large |
| 1 | ai - chatgpt - students - education - learning | 2110 | 1_ai_chatgpt_students_education |
| 2 | llms - attacks - adversarial - attack - security | 193 | 2_llms_attacks_adversarial_attack |
| 3 | training - models - transformer - model - large | 184 | 3_training_models_transformer_model |
| 4 | financial - legal - models - llms - analysis | 106 | 4_financial_legal_models_llms |
| 5 | materials - chemistry - drug - molecule - discovery | 87 | 5_materials_chemistry_drug_molecule |
| 6 | recommendation - recommender - recommender systems - user - systems | 35 | 6_recommendation_recommender_recommender systems_user |
| 7 | game - agents - games - llms - language | 30 | 7_game_agents_games_llms |
| 8 | astronomy - scientific - wave - knowledge - data | 25 | 8_astronomy_scientific_wave_knowledge |
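Individual topics from the table can be inspected by ID. A minimal sketch (topic 1 and the query string are just examples taken from the table above):

```python
# Top keywords and their c-TF-IDF weights for a single topic,
# e.g. topic 1 (ai / chatgpt / students / education / learning)
topic_model.get_topic(1)

# Find the topics most similar to a free-text query
similar_topics, similarity = topic_model.find_topics("education", top_n=3)
```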
## Training hyperparameters
- calculate_probabilities: False
- language: None
- low_memory: False
- min_topic_size: 10
- n_gram_range: (1, 1)
- nr_topics: 10
- seed_topic_list: None
- top_n_words: 10
- verbose: True
- zeroshot_min_similarity: 0.7
- zeroshot_topic_list: None
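For reference, a model with the same configuration could be refit roughly as follows. This is a sketch, not the original training script: `docs` is a placeholder for your own corpus, and the embedding model used originally is not recorded in this card.

```python
from bertopic import BERTopic

# Placeholder corpus: the 4147 training documents are not distributed with this card.
docs = ["first document ...", "second document ...", "..."]

topic_model = BERTopic(
    nr_topics=10,                  # reduce the fitted model to 10 topics
    min_topic_size=10,             # minimum number of documents per topic
    top_n_words=10,                # keywords stored per topic
    n_gram_range=(1, 1),           # unigram keywords only
    calculate_probabilities=False,
    low_memory=False,
    verbose=True,
)
topics, probs = topic_model.fit_transform(docs)
```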
## Framework versions
- Numpy: 1.25.2
- HDBSCAN: 0.8.33
- UMAP: 0.5.5
- Pandas: 1.5.3
- Scikit-Learn: 1.2.2
- Sentence-transformers: 2.6.1
- Transformers: 4.38.2
- Numba: 0.58.1
- Plotly: 5.15.0
- Python: 3.10.12
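To approximate the original environment, the listed versions can be pinned at install time. This is a sketch: exact reproducibility also depends on the Python version (3.10.12) and platform.

```bash
pip install "numpy==1.25.2" "hdbscan==0.8.33" "umap-learn==0.5.5" "pandas==1.5.3" \
            "scikit-learn==1.2.2" "sentence-transformers==2.6.1" "transformers==4.38.2" \
            "numba==0.58.1" "plotly==5.15.0" bertopic
```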