Mabeck/Heidrun-Mistral-7B-chat
Pipeline tag: Text Generation
Libraries: Transformers, PyTorch, Safetensors, GGUF
Datasets: Mabeck/danish-OpenHermes, kobprof/skolegpt-instruct
Languages: English, Danish
Tags: mistral, text-generation-inference, unsloth, trl, Inference Endpoints
License: MIT
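The tags above describe a Mistral-based causal language model served through the Transformers library. A minimal loading sketch follows; the repository id and half-precision weights come from this page, while the Danish prompt and generation settings are illustrative assumptions (the actual chat template is documented in the model card, not here).

```python
# Minimal sketch: loading Heidrun-Mistral-7B-chat with Transformers.
# Assumes a GPU with enough memory for a 7B model in float16.
# The prompt below is illustrative only; consult the model card for
# the chat template actually expected by this checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Mabeck/Heidrun-Mistral-7B-chat"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # checkpoints are stored in half precision
    device_map="auto",
)

prompt = "Hvad er hovedstaden i Danmark?"  # example Danish question (assumption)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```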
Heidrun-Mistral-7B-chat · 3 contributors · History: 18 commits
Latest commit 207219d (verified): "Update README.md" by Mabeck, about 1 year ago
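Because the listing below is pinned to commit 207219d, individual files can be fetched at exactly that revision with huggingface_hub. A short sketch, where the chosen filename is just one example from the listing:

```python
# Sketch: download a single file from the repo at the pinned revision 207219d.
# snapshot_download would pull the whole repository instead.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="Mabeck/Heidrun-Mistral-7B-chat",
    filename="config.json",   # example file from the listing below
    revision="207219d",       # short commit hash shown above
)
print(path)
```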
File                                  Size        Flags         Last commit
.gitattributes                        1.52 kB                   initial commit
README.md                             3.6 kB                    Update README.md
config.json                           698 Bytes                 Upload MistralForCausalLM
generation_config.json                111 Bytes                 Upload MistralForCausalLM
heidrun.jpeg                          144 kB                    Upload heidrun.jpeg
model-00001-of-00003.safetensors      4.94 GB     LFS           Adding `safetensors` variant of this model (#1)
model-00002-of-00003.safetensors      5 GB        LFS           Adding `safetensors` variant of this model (#1)
model-00003-of-00003.safetensors      4.54 GB     LFS           Adding `safetensors` variant of this model (#1)
model.safetensors.index.json          25.1 kB                   Adding `safetensors` variant of this model (#1)
pytorch_model-00001-of-00003.bin      4.94 GB     LFS, pickle   Upload MistralForCausalLM
pytorch_model-00002-of-00003.bin      5 GB        LFS           Upload MistralForCausalLM
pytorch_model-00003-of-00003.bin      4.54 GB     LFS, pickle   Upload MistralForCausalLM
pytorch_model.bin.index.json          24 kB                     Upload MistralForCausalLM
special_tokens_map.json               552 Bytes                 Upload tokenizer
tokenizer.json                        1.8 MB                    Upload tokenizer
tokenizer.model                       493 kB      LFS           Upload tokenizer
tokenizer_config.json                 971 Bytes                 Upload tokenizer

All files were last updated about 1 year ago. The pickle-flagged pytorch_model-*.bin shards report three detected pickle imports: torch.HalfStorage, torch._utils._rebuild_tensor_v2, and collections.OrderedDict.
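The repository ships both pickle-based .bin shards and a safetensors variant of the same weights. A small sketch that makes Transformers load only the safetensors shards, avoiding pickle deserialization entirely; recent transformers versions already prefer safetensors when both are present, so the explicit flag is belt-and-braces rather than required:

```python
# Sketch: load only the safetensors shards, skipping the pickle-based
# pytorch_model-*.bin files listed above. use_safetensors=True makes the
# preference explicit and fails rather than falling back to pickle.
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "Mabeck/Heidrun-Mistral-7B-chat",
    use_safetensors=True,       # do not fall back to the .bin checkpoints
    torch_dtype=torch.float16,
    device_map="auto",
)
```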