---
datasets:
- akridge/NOAA-ESD-CORAL-Bleaching-Dataset
language:
- en
base_model:
- google/vit-base-patch16-224
tags:
- vit
- vision-transformer
- coral
- coral-bleaching
- image-classification
- NOAA
- marine-ecosystem
pipeline_tag: image-classification
---
# NOAA ESD Coral Bleaching ViT Classifier

## πŸ“ Model Overview
This model was trained to **classify coral bleaching conditions** using the Vision Transformer (ViT) architecture on imagery from  **NOAA-PIFSC Ecosystem Sciences Division (ESD)** Coral Bleaching Classifier dataset. The dataset includes **human-annotated points** indicating healthy and bleached coral, enabling classification for marine ecosystem monitoring.

- **Model Architecture**: `google/vit-base-patch16-224`  
- **Task**: Coral Bleaching Image Classification  
- **Classes**:  
  - `CORAL`: Healthy coral  
  - `CORAL_BL`: Bleached coral  

## 📊 Model Weights
- Download the **TorchScript model** [here](./noaa-esd-coral-bleaching-vit-classifier-v1.pt)  
- Download the **ONNX model** [here](./noaa-esd-coral-bleaching-vit-classifier-v1.onnx)  
- Access the **base model folder** [here](./coral_vit_model/)  

## 📅 Dataset & Annotations
- **Dataset**: [NOAA ESD Coral Bleaching Classifier Dataset](https://huggingface.co/datasets/akridge/NOAA-ESD-CORAL-Bleaching-Dataset)  
- **Annotation Method**:  
  - Points annotated by human experts using both **randomly generated** and **targeted sampling methods**.  

## 📚 Training Configuration
- **Dataset**: NOAA ESD Coral Bleaching Classifier Dataset  
- **Data Split**: 70% training, 15% validation, 15% testing  
- **Epochs**: 100  
- **Batch Size**: 16  
- **Learning Rate**: 3e-4  
- **Image Size**: 224x224 (consistent with ViT input requirements)  
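The split above can be reproduced with a seeded shuffle over image indices. The sketch below is illustrative only: `split_indices` is a hypothetical helper, and the seed and shuffle procedure actually used in training are not documented here.

```python
import random

def split_indices(n, seed=0, train_frac=0.70, val_frac=0.15):
    # Shuffle indices reproducibly, then carve out 70/15/15 partitions.
    indices = list(range(n))
    random.Random(seed).shuffle(indices)
    n_train = int(n * train_frac)
    n_val = int(n * val_frac)
    return (indices[:n_train],
            indices[n_train:n_train + n_val],
            indices[n_train + n_val:])

train_idx, val_idx, test_idx = split_indices(1000)
print(len(train_idx), len(val_idx), len(test_idx))  # → 700 150 150
```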

## 📈 Results and Metrics
The model was evaluated on a withheld test set, with predictions compared against human-labeled points for validation.

### Confusion Matrix
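For readers reproducing the evaluation, a two-class confusion matrix can be tallied directly from predicted and human-assigned labels. A minimal sketch follows; `confusion_matrix_2class` is an illustrative helper (not part of this repository), and the label sequences are invented for demonstration.

```python
def confusion_matrix_2class(y_true, y_pred, labels=("CORAL", "CORAL_BL")):
    # Rows index the true class, columns the predicted class.
    index = {label: i for i, label in enumerate(labels)}
    matrix = [[0, 0], [0, 0]]
    for true, pred in zip(y_true, y_pred):
        matrix[index[true]][index[pred]] += 1
    return matrix

y_true = ["CORAL", "CORAL", "CORAL_BL", "CORAL_BL"]
y_pred = ["CORAL", "CORAL_BL", "CORAL_BL", "CORAL_BL"]
print(confusion_matrix_2class(y_true, y_pred))  # → [[1, 1], [0, 2]]
```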

## 🚀 How to Use the Model

### 🔗 Load with Transformers
```python
import torch
from transformers import ViTForImageClassification, AutoImageProcessor
from PIL import Image

# ✅ Load the model and processor
model = ViTForImageClassification.from_pretrained("akridge/noaa-esd-coral-bleaching-vit-classifier-v1")
processor = AutoImageProcessor.from_pretrained("akridge/noaa-esd-coral-bleaching-vit-classifier-v1")

# ✅ Load and preprocess the image
image = Image.open("your_image.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

# ✅ Run inference
with torch.no_grad():
    outputs = model(**inputs)
    prediction = outputs.logits.argmax(-1).item()

id2label = model.config.id2label
print(f"Prediction: {prediction} ({id2label[prediction]})")
```
### 🔗 Use TorchScript
```python
import torch

# ✅ Load the TorchScript model
scripted_model = torch.jit.load("noaa-esd-coral-bleaching-vit-classifier-v1.pt")
scripted_model.eval()

# ✅ Inference (reuses `inputs` and `id2label` from the Transformers example above)
with torch.no_grad():
    scripted_output = scripted_model(inputs["pixel_values"])
    scripted_prediction = scripted_output.argmax(-1).item()

print(f"TorchScript Prediction: {id2label[scripted_prediction]}")
```
### 🔗 Use ONNX
```python
import onnxruntime as ort

# ✅ Load the ONNX model
onnx_model = "noaa-esd-coral-bleaching-vit-classifier-v1.onnx"
ort_session = ort.InferenceSession(onnx_model)

# ✅ Prepare ONNX input (reuses `inputs` and `id2label` from the Transformers example above)
onnx_inputs = {"input": inputs["pixel_values"].numpy()}

# ✅ Run inference
onnx_outputs = ort_session.run(None, onnx_inputs)
onnx_prediction = onnx_outputs[0].argmax(axis=1)[0]

print(f"ONNX Prediction: {id2label[onnx_prediction]}")
```
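All three paths return raw logits. To report a confidence score alongside the predicted class, a softmax can be applied to the logits; the sketch below uses NumPy, and the logit values are invented for illustration.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the class axis.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    exps = np.exp(shifted)
    return exps / exps.sum(axis=-1, keepdims=True)

logits = np.array([[2.0, 0.5]])  # hypothetical model output for one image
probs = softmax(logits)
print(probs)  # per-class probabilities summing to 1
```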

### Intended Use
- **Monitoring coral reef health** through automated image classification.  
- **Scientific research** in marine biology and ecosystem science.  

### Limitations
- The model was trained on the NOAA ESD dataset; it may not generalize to other regions or to coral species not represented in the training data.  
- Images with **low resolution** or **poor lighting** may produce incorrect predictions.  
- **Rotated or flipped images** should be corrected to their proper orientation before inference.  
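On the orientation point, one common approach is to honor the photo's EXIF Orientation tag before preprocessing. Below is a minimal sketch using Pillow's `ImageOps.exif_transpose`; `load_upright` is a hypothetical helper name, not part of this model's API.

```python
import io
from PIL import Image, ImageOps

def load_upright(source):
    # exif_transpose applies the EXIF Orientation tag (if present),
    # so rotated photos reach the model upright.
    image = Image.open(source)
    return ImageOps.exif_transpose(image).convert("RGB")

# Works with file paths or file-like objects:
buf = io.BytesIO()
Image.new("RGB", (224, 224), "white").save(buf, format="PNG")
buf.seek(0)
print(load_upright(buf).size)  # → (224, 224)
```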

### Ethical Considerations
- Predictions should not replace expert human validation in **critical conservation decisions**.  

#### Disclaimer
This repository is a scientific product and is not official communication of the National Oceanic and Atmospheric Administration, or the United States Department of Commerce. All NOAA project content is provided on an 'as is' basis and the user assumes responsibility for its use. Any claims against the Department of Commerce or Department of Commerce bureaus stemming from the use of this project will be governed by all applicable Federal law. Any reference to specific commercial products, processes, or services by service mark, trademark, manufacturer, or otherwise, does not constitute or imply their endorsement, recommendation or favoring by the Department of Commerce. The Department of Commerce seal and logo, or the seal and logo of a DOC bureau, shall not be used in any manner to imply endorsement of any commercial product or activity by DOC or the United States Government.