
Nt-novel-qwen2.5-14b

Nt-novel-qwen2.5-14b is a large language model for Chinese writing, fine-tuned from Qwen2.5-14b-Instruct by FenYi Studio (MultEase). It supports a range of writing tasks, including news articles, blog posts, and novels. The model is trained primarily on Chinese data with some English, so its effectiveness in other languages may be limited.

Features

  • Based on Qwen2.5-14b-Instruct
  • Extended output length to 32k tokens (from 8k)
  • Training data: ≈8B tokens (100k text samples)
  • Supports: news, blogs, novels
  • License: Apache-2.0

Limitations

  • Limited training data may restrict performance improvements
  • May produce hallucinations; outputs require human verification

Demo Output

I want to write a novel. Here is a passage of the text; please continue it.

Text: Le Ling burst out laughing and slowly shook her head: "Me? I want to give Zhou Dong a proper celebration, and not only because he helped me. His story is so inspiring that once I become Master of Yunyin Pavilion, spreading it will be of great benefit to the Pavilion's affairs as well." Gongsun Ruo laughed: "Well now, you haven't even won the seat yet, it's nowhere near settled, and you're already planning to use someone else for publicity." She laughed again and nodded slowly: "Le Ling, that is exactly the 'if not me, then who' spirit a person should have. Rest assured, the post of Yunyin Pavilion Master will certainly be yours." Le Ling nodded firmly. Nearby, Jin Yuxin heard the lively chatter and came over, joining the two women's conversation with a bright smile: "I'm actually hoping the competition ends soon. I want to question Zhou Dong properly: how did he cultivate his strength so quickly, and why can't I do the same?" The three laughed together, and Le Ling nodded: "Soon. The competition should be over before long, and then you can see Zhou Dong and uncover all his secrets." Jin Yuxin nodded with a smile and said slowly: "I do have one worry, though. When the competition ends and I see Zhou Dong, facing a brother who has reached the Life-and-Death Sovereign realm, how am I supposed to question him?" Gongsun Ruo burst out laughing: "I suspect that when Zhou Dong comes out, you'll find he hasn't changed at all. He is still the Zhou Dong you know." "Mm." Jin Yuxin nodded. Just then, Qing Liuhuo suddenly shouted from the other side: "Yuxin, quick, look at Zhou Dong..." "What's wrong with Zhou Dong?" Startled, Jin Yuxin and the other female immortals whirled around to look at the sky. There, in the scarlet sea of blood, Zhou Dong's body suddenly stretched out, then took a light step forward. With a whoosh, a single step carried him hundreds of zhang, his body streaking into a ray of light in an instant and charging straight ahead.

At this sight the mountains erupted with a roar. Many of the watching immortals shot to their feet almost instinctively, mouths agape, staring dumbfounded at it all; everyone was all but stupefied.
The competition had nearly reached its end, with hardly anyone able to press forward. So what was Zhou Dong doing at this moment, suddenly erupting again and flying with such utter abandon?
Under the crushing depths of the blood sea, having already advanced so far, he could still fly this fast?
Gongsun Ruo and Le Ling gasped involuntarily; both rose and looked skyward, their faces changing.
A moment ago they had thought the competition was about to end. Who could have guessed that at this late stage Zhou Dong would stir up another climax, as if everything were starting over, as if he were only now truly exerting himself.
Everyone knew Zhou Dong had broken through to the Life-and-Death realm, but even with that breakthrough, his strength should only place him where he currently stood. Ahead of him was a Life-and-Death cultivator who had broken through who knew how many years earlier. Yet Zhou Dong suddenly erupted, sweeping forward at such speed. What was he doing? Did he still mean to overtake him?
Where did Zhou Dong get such strength?
Everyone was dazed, eyes fixed on Zhou Dong's figure sweeping across the sky, as mysterious and agile as a great bird, each whoosh covering hundreds of zhang as he raced after Luo Tiandao, the sixth-place contestant ahead of him.
Closer and closer. After who knew how long, Luo Tiandao's figure came into view, and the distance between the two narrowed bit by bit.
Le Ling's face paled a little as she turned to Gongsun Ruo in astonishment and asked:
"Sister Ruo, what on earth is this Zhou Dong trying to do?"
Among the immortals across the mountains there was shock, excitement, bewilderment, and wild elation all at once. A boiling clamor of voices and debate echoed through the valleys; friend, foe, or uninvolved onlooker, nearly every gaze was drawn completely to where Zhou Dong was.
It was astounding: at this stage of the competition, Zhou Dong had erupted furiously once again, and this round of eruption was simply staggering.
"Zhou Dong, what exactly are you trying to do?"

Requirements

transformers>=4.37.0
torch>=2.0.0

The Qwen2.5 model code is included in the latest Hugging Face transformers release, and we advise you to use the latest version of transformers.

With transformers<4.37.0, you will encounter the following error:

KeyError: 'qwen2'
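
A quick way to confirm your environment meets this requirement before loading the model (a minimal sketch; packaging ships as a transformers dependency):

# Fail fast if transformers is too old to know the "qwen2" architecture.
from packaging import version
import transformers

assert version.parse(transformers.__version__) >= version.parse("4.37.0"), (
    "transformers is too old; run: pip install -U 'transformers>=4.37.0'"
)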

Quick Start

The model uses the same calling method as Qwen2.5-14b-Instruct; only the model path needs to change.

from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "MultEase/nt-novel-qwen2.5-14b"

# Load the model weights and tokenizer from the Hub.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype="auto",  # use the checkpoint's native precision
    device_map="auto"    # spread layers across available devices
)
tokenizer = AutoTokenizer.from_pretrained(model_name)

prompt = "写一篇关于春天的短文"  # "Write a short essay about spring"
messages = [
    # System message: "You are a professional writer skilled at crafting high-quality articles."
    {"role": "system", "content": "你是一个专业的写手,善于创作优质文章。"},
    {"role": "user", "content": prompt}
]

# Render the conversation with the Qwen2.5 chat template.
text = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True
)
model_inputs = tokenizer([text], return_tensors="pt").to(model.device)

# Generate up to the model's extended 32k-token output length.
generated_ids = model.generate(
    **model_inputs,
    max_new_tokens=32768
)
# Strip the prompt tokens, keeping only the newly generated ones.
generated_ids = [
    output_ids[len(input_ids):] for input_ids, output_ids in zip(model_inputs.input_ids, generated_ids)
]
response = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)[0]
print(response)
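
For long novel-style generations, it can be more practical to stream tokens as they are produced instead of waiting for the full output. A minimal sketch using transformers' TextStreamer; the sampling values below are illustrative, not tuned recommendations from FenYi Studio:

from transformers import TextStreamer

# Print decoded tokens to stdout as they are generated.
streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)
model.generate(
    **model_inputs,
    max_new_tokens=32768,
    do_sample=True,    # sampling generally suits creative writing
    temperature=0.7,   # illustrative value
    top_p=0.9,         # illustrative value
    streamer=streamer,
)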

Note

Download model_file.zip and extract it to use the model locally.
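
If you extract the archive to a local directory, you can point from_pretrained at that path instead of the Hub ID (the directory name below is only an example; adjust it to wherever you extracted the files):

from transformers import AutoModelForCausalLM, AutoTokenizer

local_dir = "./nt-novel-qwen2.5-14b"  # example path to the extracted model_file.zip
model = AutoModelForCausalLM.from_pretrained(local_dir, torch_dtype="auto", device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(local_dir)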

Training Details

  • Data: 100k Chinese texts (news, blogs, novels)
  • Total tokens: ≈8B
  • Training epochs: 4
  • Method: LoRA (an illustrative configuration is sketched after this list)
  • Base model: Qwen2.5-14b-Instruct
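
The LoRA hyperparameters were not disclosed, so the following is only a hypothetical sketch of a comparable setup using the peft library; the rank, alpha, and target modules are illustrative assumptions, not FenYi Studio's actual configuration:

from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

# Start from the base model named above.
base = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen2.5-14B-Instruct", torch_dtype="auto", device_map="auto"
)
lora_cfg = LoraConfig(
    r=16,            # assumed rank
    lora_alpha=32,   # assumed scaling factor
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # a common choice for Qwen-style attention
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, lora_cfg)
model.print_trainable_parameters()  # only the adapter weights are trained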

Acknowledgments

  • Alibaba Cloud's Qwen team for the base model
  • Open-source data contributors

License

Apache-2.0, the same license as Qwen2.5-14b-Instruct.

About Us

FenYi Studio is a tech interest group advancing large language model applications. Contact us to join the discussion or to provide computing resources. Meet us on GitHub: https://github.com/Fenyi-Studios

Citations

@misc{nt-novel-qwen,
    title = {nt-novel-qwen2.5-14b: A Chinese Writing Assistant Based on Qwen2.5},
    author = {FenYi Studio},
    year = {2025}
}

You can also cite Qwen2.5-14b-Instruct:

@misc{qwen2.5,
    title = {Qwen2.5: A Party of Foundation Models},
    url = {https://qwenlm.github.io/blog/qwen2.5/},
    author = {Qwen Team},
    month = {September},
    year = {2024}
}
@article{qwen2,
    title = {Qwen2 Technical Report},
    author = {An Yang and Baosong Yang and Binyuan Hui and Bo Zheng and Bowen Yu and Chang Zhou and Chengpeng Li and Chengyuan Li and Dayiheng Liu and Fei Huang and Guanting Dong and Haoran Wei and Huan Lin and Jialong Tang and Jialin Wang and Jian Yang and Jianhong Tu and Jianwei Zhang and Jianxin Ma and Jin Xu and Jingren Zhou and Jinze Bai and Jinzheng He and Junyang Lin and Kai Dang and Keming Lu and Keqin Chen and Kexin Yang and Mei Li and Mingfeng Xue and Na Ni and Pei Zhang and Peng Wang and Ru Peng and Rui Men and Ruize Gao and Runji Lin and Shijie Wang and Shuai Bai and Sinan Tan and Tianhang Zhu and Tianhao Li and Tianyu Liu and Wenbin Ge and Xiaodong Deng and Xiaohuan Zhou and Xingzhang Ren and Xinyu Zhang and Xipin Wei and Xuancheng Ren and Yang Fan and Yang Yao and Yichang Zhang and Yu Wan and Yunfei Chu and Yuqiong Liu and Zeyu Cui and Zhenru Zhang and Zhihao Fan},
    journal = {arXiv preprint arXiv:2407.10671},
    year = {2024}
}