File size: 2,925 Bytes
f64de27
bff2939
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
9d84683
 
53b37b7
9d84683
53b37b7
 
9d84683
53b37b7
 
 
 
9d84683
53b37b7
9d84683
53b37b7
9d84683
53b37b7
 
 
 
9d84683
53b37b7
 
 
9d84683
53b37b7
 
9d84683
53b37b7
 
 
 
 
9d84683
53b37b7
 
 
 
 
 
9d84683
53b37b7
9d84683
53b37b7
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
9d84683
53b37b7
9d84683
53b37b7
9d84683
53b37b7
9d84683
53b37b7
 
 
 
 
9d84683
53b37b7
 
9d84683
53b37b7
 
 
9d84683
53b37b7
9d84683
53b37b7
 
 
9d84683
53b37b7
 
9d84683
53b37b7
 
 
 
9d84683
53b37b7
9d84683
53b37b7
 
 
9d84683
53b37b7
9d84683
53b37b7
 
 
 
df5601f
53b37b7
df5601f
53b37b7
df5601f
53b37b7
1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
17
18
19
20
21
22
23
24
25
26
27
28
29
30
31
32
33
34
35
36
37
38
39
40
41
42
43
44
45
46
47
48
49
50
51
52
53
54
55
56
57
58
59
60
61
62
63
64
65
66
67
68
69
70
71
72
73
74
75
76
77
78
79
80
81
82
83
84
85
86
87
88
89
90
91
92
93
94
95
96
97
98
99
100
101
102
103
104
105
106
107
108
109
110
111
112
113
114
115
116
117
118
119
120
121
122
123
124
125
126
127
128
129
130
131
132
133
134
135
136
---
license: mit
datasets:
- Canstralian/Wordlists
- Canstralian/CyberExploitDB
- Canstralian/pentesting_dataset
- Canstralian/ShellCommands
language:
- en
metrics:
- accuracy
- code_eval
base_model:
- replit/replit-code-v1_5-3b
- WhiteRabbitNeo/Llama-3.1-WhiteRabbitNeo-2-8B
- WhiteRabbitNeo/Llama-3.1-WhiteRabbitNeo-2-70B
library_name: transformers
tags:
- code
- text-generation-inference
---

# πŸ‡ RabbitRedux Code Classification Model

## πŸ” Overview
The **RabbitRedux Code Classification Model** is a transformer-based AI designed for **code classification** in **cybersecurity** and **software engineering** contexts.

### 🧠 Features
βœ… **Pre-trained on diverse datasets**  
βœ… **Fine-tuned for cybersecurity-focused classification**  
βœ… **Optimized for Python, JavaScript, and more**  

---

## πŸš€ Usage

### **1️⃣ Install Dependencies**
```sh
pip install transformers torch
```

### **2️⃣ Load the Model**
```python
from transformers import pipeline

# Load RabbitRedux
classifier = pipeline("text-classification", model="canstralian/RabbitRedux")

# Example classification
code_snippet = "def hello_world():\n    print('Hello, world!')"
result = classifier(code_snippet)
print(result)
```

### **3️⃣ Example Output**
```json
[
  {"label": "Python Function", "score": 0.98}
]
```

---

## πŸ“Š Model Details
   β€’ **Developed by**: canstralian  
   β€’ **Architecture**: Transformer-based (Fine-tuned)  
   β€’ **Training Datasets**:
     - Canstralian/Wordlists
     - Canstralian/CyberExploitDB
     - Canstralian/pentesting_dataset
     - Canstralian/ShellCommands  
   β€’ **Fine-tuned from**:
     - replit/replit-code-v1_5-3b  
     - WhiteRabbitNeo/Llama-3.1-WhiteRabbitNeo-2-8B  
     - WhiteRabbitNeo/Llama-3.1-WhiteRabbitNeo-2-70B  
   β€’ **License**: MIT  

## πŸ† Performance

| Metric     | Value    |
|------------|----------|
| Accuracy   | 94.5%    |
| F1 Score   | 92.8%    |

---

## πŸ”₯ Deployment

You can deploy this model as an API using Hugging Face Spaces.

### **Deploy with Docker**
```sh
docker build -t rabbitredux .
docker run -p 5000:5000 rabbitredux
```

### **Use with FastAPI**
If you want a scalable API:

```sh
pip install fastapi uvicorn
```

Then, create a FastAPI server:

```python
from fastapi import FastAPI
from transformers import pipeline

app = FastAPI()
classifier = pipeline("text-classification", model="canstralian/RabbitRedux")

@app.post("/classify/")
def classify_code(data: dict):
    return {"classification": classifier(data["code"])}
```

Run with:

```sh
uvicorn app:app --host 0.0.0.0 --port 8000
```

---

## πŸ“š Useful Resources
   β€’ **GitHub**: [canstralian](https://github.com/canstralian)  
   β€’ **Hugging Face Model**: [RabbitRedux](https://huggingface.co/canstralian/RabbitRedux)  
   β€’ **Replit Profile**: [canstralian](https://replit.com/@canstralian)  

---

## πŸ“œ License

Licensed under the **MIT License**.