---
datasets:
- akridge/NOAA-ESD-CORAL-Bleaching-Dataset
language:
- en
license: agpl-3.0
base_model:
- google/vit-base-patch16-224
tags:
- vit
- vision-transformer
- coral
- coral-bleaching
- image-classification
- NOAA
- marine-ecosystem
pipeline_tag: image-classification
library_name: transformers
model-index:
- name: noaa-esd-coral-bleaching-vit-classifier-v1
  results:
  - task:
      type: image-classification
    dataset:
      type: NOAA-ESD-CORAL-Bleaching-Dataset
      name: NOAA-ESD-CORAL-Bleaching-Dataset
    metrics:
    - type: accuracy
      value: 0.85
    - type: precision
      value: 0.84
    - type: recall
      value: 0.75
---

# NOAA ESD Coral Bleaching ViT Classifier

## πŸ“ Model Overview
This model was trained to **classify coral bleaching conditions** using the Vision Transformer (ViT) architecture on imagery from the **NOAA-PIFSC Ecosystem Sciences Division (ESD)** Coral Bleaching Classifier dataset. The dataset includes **human-annotated points** indicating healthy and bleached coral, enabling classification for marine ecosystem monitoring.

- **Model Architecture**: Google ViT Base Patch16 224 (`google/vit-base-patch16-224`)
- **Task**: Coral bleaching image classification
- **Classes**:
  - `CORAL`: Healthy coral
  - `CORAL_BL`: Bleached coral

![results](./00_example.png)
![results](./01_example.png)

## πŸ“Š Model Weights
- Download the **TorchScript model** [here](./noaa-esd-coral-bleaching-vit-classifier-v1.pt)
- Download the **ONNX model** [here](./noaa-esd-coral-bleaching-vit-classifier-v1.onnx)
- Access the **base model weights (safetensors)** [here](./model.safetensors)

## πŸ“… Dataset & Annotations
- **Dataset**: [NOAA ESD Coral Bleaching Classifier Dataset](https://huggingface.co/datasets/akridge/NOAA-ESD-CORAL-Bleaching-Dataset)
- **Annotation Method**: Points annotated by human experts using both **randomly generated** and **targeted** sampling methods.

## πŸ“š Training Configuration
- **Dataset**: NOAA ESD Coral Bleaching Classifier Dataset
- **Train/Validation/Test Split**: 70% / 15% / 15%
- **Epochs**: 100
- **Batch Size**: 16
- **Learning Rate**: 3e-4
- **Image Size**: 224x224 (consistent with ViT input requirements)

## πŸ“ˆ Results and Metrics
The model was evaluated on a withheld test set, with predictions compared against human-labeled points.
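A report like the one below can be generated with scikit-learn's `classification_report`. The snippet is a minimal sketch rather than the exact evaluation script used for these numbers; the `test` split name, the `image`/`label` column names, and the label ordering in `target_names` are assumptions that may need adjusting to the actual dataset layout.

```python
import torch
from datasets import load_dataset
from sklearn.metrics import classification_report
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "akridge/noaa-esd-coral-bleaching-vit-classifier-v1"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

# Assumes the dataset exposes a "test" split with "image" and "label" columns
dataset = load_dataset("akridge/NOAA-ESD-CORAL-Bleaching-Dataset", split="test")

y_true, y_pred = [], []
for example in dataset:
    inputs = processor(images=example["image"].convert("RGB"), return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    y_pred.append(logits.argmax(-1).item())
    y_true.append(example["label"])

# target_names assumes id 0 = CORAL, id 1 = CORAL_BL (check model.config.id2label)
print(classification_report(y_true, y_pred, target_names=["CORAL", "CORAL_BL"]))
```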
πŸ“„ Classification Report:

|              | precision | recall | f1-score | support |
|--------------|-----------|--------|----------|---------|
| CORAL        | 0.86      | 0.91   | 0.88     | 973     |
| CORAL_BL     | 0.84      | 0.75   | 0.79     | 589     |
|              |           |        |          |         |
| accuracy     |           |        | 0.85     | 1562    |
| macro avg    | 0.85      | 0.83   | 0.84     | 1562    |
| weighted avg | 0.85      | 0.85   | 0.85     | 1562    |

![results](./02_example.png)

## πŸš€ How to Use the Model

### πŸ”— Load with Transformers

```python
# Load the model and processor directly from the Hugging Face Hub
from transformers import AutoImageProcessor, AutoModelForImageClassification

processor = AutoImageProcessor.from_pretrained("akridge/noaa-esd-coral-bleaching-vit-classifier-v1")
model = AutoModelForImageClassification.from_pretrained("akridge/noaa-esd-coral-bleaching-vit-classifier-v1")
```

```python
import torch
from transformers import ViTForImageClassification, AutoImageProcessor
from PIL import Image

# βœ… Load the model and processor
model = ViTForImageClassification.from_pretrained("akridge/noaa-esd-coral-bleaching-vit-classifier-v1")
processor = AutoImageProcessor.from_pretrained("akridge/noaa-esd-coral-bleaching-vit-classifier-v1")

# βœ… Load and preprocess the image
image = Image.open("your_image.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

# βœ… Perform inference
with torch.no_grad():
    outputs = model(**inputs)
prediction = outputs.logits.argmax(-1).item()

id2label = model.config.id2label
print(f"Prediction: {prediction} ({id2label[prediction]})")
```

### πŸ”— Use TorchScript

```python
import torch

# βœ… Load the TorchScript model
scripted_model = torch.jit.load("noaa-esd-coral-bleaching-vit-classifier-v1.pt")
scripted_model.eval()

# βœ… Inference with the TorchScript model
# (`inputs` and `id2label` are defined in the Transformers example above)
with torch.no_grad():
    scripted_output = scripted_model(inputs["pixel_values"])

scripted_prediction = scripted_output.argmax(-1).item()
print(f"TorchScript Prediction: {id2label[scripted_prediction]}")
```

### πŸ”— Use ONNX

```python
import onnxruntime as ort

# βœ… Load the ONNX model
onnx_model = "noaa-esd-coral-bleaching-vit-classifier-v1.onnx"
ort_session = ort.InferenceSession(onnx_model)

# βœ… Prepare the ONNX input
# (`inputs` and `id2label` are defined in the Transformers example above)
onnx_inputs = {"input": inputs["pixel_values"].numpy()}

# βœ… Run inference with ONNX Runtime
onnx_outputs = ort_session.run(None, onnx_inputs)
onnx_prediction = onnx_outputs[0].argmax(axis=1)[0]

print(f"ONNX Prediction: {id2label[onnx_prediction]}")
```

### Intended Use
- **Monitoring coral reef health** through automated image classification.
- **Scientific research** in marine biology and ecosystem science.

### Limitations
- The model was trained on the NOAA ESD dataset; it may not generalize to other regions or to coral species not represented in the training data.
- Images with **low resolution** or **poor lighting** may lead to incorrect predictions.
- **Rotated or flipped images** should be corrected to the proper orientation before inference.

### Ethical Considerations
- Predictions should not replace expert human validation in critical conservation decisions.

#### Disclaimer
This repository is a scientific product and is not official communication of the National Oceanic and Atmospheric Administration, or the United States Department of Commerce. All NOAA project content is provided on an 'as is' basis and the user assumes responsibility for its use. Any claims against the Department of Commerce or Department of Commerce bureaus stemming from the use of this project will be governed by all applicable Federal law. Any reference to specific commercial products, processes, or services by service mark, trademark, manufacturer, or otherwise does not constitute or imply their endorsement, recommendation, or favoring by the Department of Commerce.
The Department of Commerce seal and logo, or the seal and logo of a DOC bureau, shall not be used in any manner to imply endorsement of any commercial product or activity by DOC or the United States Government.