Important: No commercial use is permitted without a commercial license (yet)

Deca 3 Alpha Ultra: Model Overview

Deca 3 Alpha Ultra is a large language model built on a DynAMoE (Dynamically Activated Mixture of Experts) architecture, which takes a different approach to expert activation than traditional Mixture of Experts (MoE) models. With 4.6 trillion parameters, it aims to push the limits of natural language understanding and generation.
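
This card does not document how DynAMoE's "dynamic activation" works internally. Purely as an illustration of what dynamic expert activation can mean in general, the sketch below shows a threshold-based router that activates a variable number of experts per token instead of a fixed top-k. All class names, parameters, and the mechanism itself are assumptions for illustration and are not taken from Deca 3.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DynamicallyActivatedMoE(nn.Module):
    """Illustrative sketch only: threshold-based routing in which the number
    of active experts varies per token, in contrast to fixed top-k routing.
    The mechanism and all names are assumptions, not Deca 3 internals."""

    def __init__(self, d_model: int, d_ff: int, num_experts: int, threshold: float = 0.1):
        super().__init__()
        self.router = nn.Linear(d_model, num_experts)
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        ])
        self.threshold = threshold

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_tokens, d_model)
        probs = F.softmax(self.router(x), dim=-1)  # (num_tokens, num_experts)
        # Activate every expert whose routing probability clears the threshold,
        # and always keep the single most likely expert so no token is dropped.
        mask = probs >= self.threshold
        mask = mask | F.one_hot(probs.argmax(dim=-1), probs.size(-1)).bool()
        weights = torch.where(mask, probs, torch.zeros_like(probs))
        weights = weights / weights.sum(dim=-1, keepdim=True)

        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            selected = mask[:, i]  # tokens routed to expert i
            if selected.any():
                out[selected] += weights[selected, i].unsqueeze(-1) * expert(x[selected])
        return out
```

A fixed top-k router always activates the same number of experts per token; a dynamic variant like the one sketched above trades predictable per-token compute for adaptivity. Whether Deca 3's DynAMoE is implemented along these lines is not stated on this card.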

Key Specifications

  • Architecture: DynAMoE (Dynamically Activated Mixture of Experts), derived from existing architectures and designed for optimal performance.
  • Parameters: 4.6 trillion.
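
For a sense of what 4.6 trillion parameters implies for storage, here is a back-of-envelope calculation. This is illustrative arithmetic only; the uploaded checkpoint actually mixes several tensor types, as listed under Safetensors below.

```python
# Rough storage footprint of 4.6 trillion parameters at common precisions.
# Illustrative arithmetic only; the actual checkpoint mixes BF16/F32/F8/U8 tensors.
params = 4.6e12
for name, bytes_per_param in [("FP32", 4), ("BF16", 2), ("FP8", 1)]:
    print(f"{name}: ~{params * bytes_per_param / 1e12:.1f} TB")
# FP32: ~18.4 TB   BF16: ~9.2 TB   FP8: ~4.6 TB
```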

License and Usage

  • Current License: Access is restricted to select research institutions and strategic partners, so that this release does not slow down Deca 2.5.
  • Future License: A revised licensing model will expand access while maintaining responsible deployment.

About Deca

Founded as a small AI company in the U.S., Deca has rapidly expanded after receiving key support from GenLabs, advancing AI research and developing groundbreaking models like Deca 3 Alpha Ultra.

Safetensors

  • Model size: 4,744B params
  • Tensor types: BF16, F32, F8_E4M3, U8
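
The per-dtype breakdown behind these tensor types can be checked from a checkpoint shard with the `safetensors` Python bindings. A minimal sketch, assuming the slice objects expose `get_shape`/`get_dtype` as in recent safetensors releases; the filename is a placeholder, since actual shard names are not shown on this card.

```python
import math
from collections import Counter

from safetensors import safe_open

# Tally parameter counts per dtype in one checkpoint shard.
# "model.safetensors" is a placeholder path, not an actual shard name from this repo.
counts = Counter()
with safe_open("model.safetensors", framework="pt", device="cpu") as f:
    for name in f.keys():
        s = f.get_slice(name)  # header-only access, no full tensor load
        counts[s.get_dtype()] += math.prod(s.get_shape())

for dtype, n in sorted(counts.items()):
    print(f"{dtype}: {n / 1e9:.1f}B params")
```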