Build your own GPT-2 quickly, without doing a lot of useless work.
This project is based on 🤗 Transformers. This tutorial shows you how to train a GPT-2 model for your own language (such as Chinese or Japanese) with only a little code, using TensorFlow 2.
You can try this project in Colab right now.
├── configs
│ ├── test.py
│ └── train.py
├── build_tokenizer.py
├── predata.py
├── predict.py
└── train.py
git clone git@github.com:mymusise/gpt2-quickly.git
cd gpt2-quickly
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
Here is an example of a raw dataset: raw.txt
python build_tokenizer.py
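If you want a feel for what this step does, here is a minimal sketch of training a word-piece vocabulary on the raw corpus with the 🤗 tokenizers library. The file name, vocab size and output directory are assumptions; the real settings live in build_tokenizer.py and configs/.

```python
import os
from tokenizers import BertWordPieceTokenizer

# Train a word-piece vocabulary on the raw corpus (placeholder settings).
tokenizer = BertWordPieceTokenizer()
tokenizer.train(
    files=["raw.txt"],
    vocab_size=8000,
    min_frequency=2,
    special_tokens=["[PAD]", "[UNK]", "[CLS]", "[SEP]", "[MASK]"],
)

os.makedirs("tokenizer", exist_ok=True)
tokenizer.save_model("tokenizer")  # writes vocab.txt
```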
python predata.py --n_processes=2
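Conceptually, the preprocessing step tokenizes the raw text and cuts the token stream into fixed-length blocks for causal language modeling. Below is a single-process sketch of that idea; the real predata.py shards the work across --n_processes workers, and the paths, block size and output file here are assumptions.

```python
import numpy as np
from transformers import BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("tokenizer")
block_size = 512  # placeholder; should match the model's context length

with open("raw.txt", encoding="utf-8") as f:
    text = f.read()

# Encode the whole corpus, then slice it into fixed-length training blocks.
ids = tokenizer.encode(text, add_special_tokens=False)
blocks = [ids[i:i + block_size]
          for i in range(0, len(ids) - block_size + 1, block_size)]

np.save("dataset.npy", np.asarray(blocks, dtype=np.int32))
```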
python train.py
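The training step fits a GPT-2 language model on those blocks. A minimal custom training loop might look like the sketch below, assuming the pre-tokenized blocks from the previous step; the hyperparameters are placeholders (the real ones live in configs/train.py).

```python
import numpy as np
import tensorflow as tf
from transformers import GPT2Config, TFGPT2LMHeadModel

blocks = np.load("dataset.npy")
dataset = tf.data.Dataset.from_tensor_slices(blocks).shuffle(1000).batch(4)

# Placeholder model size; vocab_size must match the trained tokenizer.
config = GPT2Config(vocab_size=8000, n_positions=512,
                    n_embd=768, n_layer=6, n_head=12)
model = TFGPT2LMHeadModel(config)
optimizer = tf.keras.optimizers.Adam(learning_rate=3e-5)

for epoch in range(3):
    for batch in dataset:
        with tf.GradientTape() as tape:
            # Passing labels makes the model compute the shifted LM loss itself.
            outputs = model(batch, labels=batch, training=True)
            loss = tf.reduce_mean(outputs.loss)
        grads = tape.gradient(loss, model.trainable_variables)
        optimizer.apply_gradients(zip(grads, model.trainable_variables))
    print(f"epoch {epoch}: loss {float(loss):.4f}")

model.save_pretrained("model")
```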
python predict.py
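Prediction samples continuations from the trained model. A hedged sketch using the standard generate API is shown below; the prompt and the tokenizer/model paths are placeholders.

```python
from transformers import BertTokenizerFast, TFGPT2LMHeadModel

tokenizer = BertTokenizerFast.from_pretrained("tokenizer")
model = TFGPT2LMHeadModel.from_pretrained("model")

# Encode a prompt and sample a continuation with top-k / nucleus sampling.
input_ids = tokenizer.encode("今天天气", add_special_tokens=False,
                             return_tensors="tf")
outputs = model.generate(input_ids, max_length=64, do_sample=True,
                         top_k=50, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```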
ENV=FINETUNE python finetune.py
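Fine-tuning reuses the same training loop as train.py, but starts from the previously saved checkpoint instead of a randomly initialized model (ENV=FINETUNE presumably selects the fine-tuning configuration in configs/). A minimal sketch, with a placeholder checkpoint path:

```python
from transformers import TFGPT2LMHeadModel

# Load the saved weights, then continue with the same optimization loop.
model = TFGPT2LMHeadModel.from_pretrained("model")
```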