huytranduck/convnextv2_50x_dataset

Model Description

This is a ConvNeXtV2-Atto model fine-tuned for image classification over 100 classes (tree species, listed below).

Classes

['Acacia melanoxylon', 'Acer saccharinum', 'Afzelia africana', 'Afzelia pachyloba', 'Afzelia quanzensis', 'Albizia lucida (Albizia lucidior)', 'Allophylus cobbe (Pometia pinnata)', 'Anisoptera costata (Anisoptera Robusta)', 'Apuleia leiocarpa', 'Artocarpus calophyllus (Artocarpus asperulus)', 'Artocarpus heterophyllus', 'Autranella congolensis', 'Berlinia bracteosa', 'Betula pendula', 'Bobgunnia fistuloides (Swartzia fistuloides)', 'Brachystegia sp', 'Burckella obovata', 'Burretiodendron tonkinense', 'Callitris columellaris', 'Calocedrus sp', 'Canarium album', 'Chrysophyllum sp', 'Cinnamomum camphora', 'Clarisia racemosa', 'Colophospermum mopane', 'Cunninghamia lanceolata', 'Cupressus funebris (Cupressus pendula)', 'Cylicodiscus gabunensis', 'Dalbergia cochinchinensis', 'Dalbergia oliveri', 'Detarium macrocarpum', 'Dialium bipindense', 'Didelotia africana', 'Diospyros mun', 'Diospyros salletii', 'Distemonanthus benthamianus', 'Engelhardia chrysolepis (Engelhardia roxburghiana)', 'Entandrophragma cylindricum', 'Entandrophragma utile', 'Erythrophleum fordii', 'Erythrophleum ivorense', 'Eucalyptus cladocalyx', 'Eucalyptus grandis', 'Eucalyptus microcorys', 'Eucalyptus saligna', 'Fokienia hodginsii', 'Fraxinus excelsior', 'Gilbertiodendron dewevrei', 'Guarea cedrata', 'Guibourtia coleosperma', 'Heritiera littoralis', 'Hevea brasiliensis', 'Homalium caryophyllaceum', 'Homalium foetidum', 'Hopea iriana', 'Hopea pierrei', 'Hymenaea courbaril', 'Hymenolobium heterocarpum', 'Juglans regia', 'Khaya senegalensis', 'Klainedoxa gabonensis', 'Lithocarpus ducampii', 'Lophira alata', 'Magnolia hypolampra', 'Martiodendron parviflorum', 'Milicia excelsa', 'Milicia regia', 'Millettia laurentii', 'Monopetalanthus letestui (Bikinia letestui)', 'Myracrodruon urundeuva', 'Myroxylon balsamum', 'Myroxylon balsamum_v2', 'Myroxylon peruiferum', 'Nauclea diderrichii', 'Pachyelasma tessmannii', 'Palaquium waburgianum', 'Pericopsis elata', 'Pinus sp', 'Piptadeniastrum africanum', 'Populus sp', 'Prunus serotina', 'Pterocarpus macrocarpus', 'Pterocarpus soyauxii', 'Pterocarpus sp', 'Qualea paraensis', 'Quercus petraea', 'Quercus robur', 'Quercus rubra', 'Samanea saman', 'Shorea hypochra (Anthoshorea hypochra)', 'Shorea roxburghii (Anthoshorea roxburghii)', 'Sindora cochinchinensis', 'Staudtia stipitata', 'Syzygium hemisphericum (Syzygium chanlos)', 'Tarrietia cochinchinensis (Heritiera cochinchinesis)', 'Tectona grandis', 'Terminalia superba', 'Tetraberlinia bifoliolata', 'Toona sureni', 'Xylia xylocarpa']
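
These class names should also be stored in the model configuration. Assuming the repository was saved with the standard id2label mapping (an assumption, not verified here), the list can be recovered programmatically:

from transformers import AutoConfig

# Load only the configuration (no weights) and read the label mapping.
config = AutoConfig.from_pretrained("huytranduck/convnextv2_50x_dataset")
id2label = config.id2label  # dict: class index -> species name

print(len(id2label))  # expected: 100
print(id2label[0])    # first class name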

Training Details

  • Base model: facebook/convnextv2-atto-1k-224
  • Fine-tuned on custom dataset
  • Number of classes: 100
  • Image size: 224x224

Training Configuration

  • Epochs: 2
  • Batch size: 64
  • Learning rate: 1e-4
  • Optimizer: AdamW
  • Loss function: CrossEntropyLoss

Performance Metrics

  • Metrics will be updated after evaluation

Usage

from transformers import ConvNextV2ForImageClassification, AutoImageProcessor
import torch
from PIL import Image

# Load model and processor
model = ConvNextV2ForImageClassification.from_pretrained("huytranduck/convnextv2_50x_dataset")
processor = AutoImageProcessor.from_pretrained("huytranduck/convnextv2_50x_dataset")

# Load and preprocess image
image = Image.open("path_to_your_image.jpg").convert("RGB")
inputs = processor(image, return_tensors="pt")

# Make prediction
with torch.no_grad():
    outputs = model(**inputs)
    predictions = torch.nn.functional.softmax(outputs.logits, dim=-1)
    predicted_class = torch.argmax(predictions, dim=-1)

print(f"Predicted class: {predicted_class.item()}")

Training Code

The model was trained with PyTorch and the Hugging Face Transformers library, using custom data augmentation.
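
The original training script is not published here. The following is a minimal fine-tuning sketch consistent with the configuration above (AdamW, learning rate 1e-4, batch size 64, 2 epochs, CrossEntropyLoss); the dataset path, ImageFolder layout, and transforms are placeholders rather than the actual custom augmentation pipeline.

import torch
from torch.utils.data import DataLoader
from torchvision import datasets, transforms
from transformers import ConvNextV2ForImageClassification

# Placeholder preprocessing; the original run used custom augmentation.
train_transforms = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
])

# Hypothetical ImageFolder layout: one sub-directory per species.
train_dataset = datasets.ImageFolder("path/to/train", transform=train_transforms)
train_loader = DataLoader(train_dataset, batch_size=64, shuffle=True)

# Start from the ImageNet-1k checkpoint and swap in a 100-class head.
model = ConvNextV2ForImageClassification.from_pretrained(
    "facebook/convnextv2-atto-1k-224",
    num_labels=100,
    ignore_mismatched_sizes=True,
)
device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
criterion = torch.nn.CrossEntropyLoss()

model.train()
for epoch in range(2):
    for pixel_values, labels in train_loader:
        pixel_values, labels = pixel_values.to(device), labels.to(device)
        optimizer.zero_grad()
        logits = model(pixel_values=pixel_values).logits
        loss = criterion(logits, labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch + 1} done, last batch loss {loss.item():.4f}")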

Model Format

  • Format: Safetensors
  • Model size: 3.42M params
  • Tensor type: F32