AI Development Platform ModelArts - SFT Full-Parameter Fine-Tuning Task: Starting the Training Script

Date: 2024-11-05 09:54:19

Starting the Training Script

Single-Node Launch

Taking baichuan2-13b as an example, the single-node SFT full-parameter fine-tuning launch command is shown below. Go to the code directory /home/ma-user/ws/6.3.904-Ascend/llm_train/AscendSpeed and run the startup script. For detailed hyperparameter descriptions, see Table 1 Incremental pre-training hyperparameter configuration.

MODEL_TYPE=13B RUN_TYPE=sft DATA_PATH=/home/ma-user/ws/processed_for_ma_input/BaiChuan2-13B/data/finetune/alpaca_ft TOKENIZER_MODEL=/home/ma-user/ws/tokenizers/BaiChuan2-13B CKPT_LOAD_DIR=/home/ma-user/ws/processed_for_ma_input/BaiChuan2-13B/converted_weights TRAIN_ITERS=300 MBS=1 GBS=16 TP=8 PP=1 WORK_DIR=/home/ma-user/ws sh scripts/baichuan2/baichuan2.sh

Here, MODEL_TYPE, RUN_TYPE, DATA_PATH, and TOKENIZER_MODEL are required; TRAIN_ITERS, MBS, GBS, TP, PP, and WORK_DIR are optional and have default values.
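
For readability, the same single-node launch can also be written with the variables exported one per line before calling the script. This is only an equivalent sketch: it assumes, as the env-prefix form above implies, that baichuan2.sh reads these values from the environment, and the inline comments give the usual meaning of each hyperparameter (see Table 1 for the authoritative descriptions).

cd /home/ma-user/ws/6.3.904-Ascend/llm_train/AscendSpeed

export MODEL_TYPE=13B      # Baichuan2 model size variant
export RUN_TYPE=sft        # run type: SFT full-parameter fine-tuning
export DATA_PATH=/home/ma-user/ws/processed_for_ma_input/BaiChuan2-13B/data/finetune/alpaca_ft   # preprocessed fine-tuning dataset
export TOKENIZER_MODEL=/home/ma-user/ws/tokenizers/BaiChuan2-13B                                 # tokenizer files
export CKPT_LOAD_DIR=/home/ma-user/ws/processed_for_ma_input/BaiChuan2-13B/converted_weights     # converted pretrained weights
export TRAIN_ITERS=300     # number of training iterations (optional, has a default)
export MBS=1               # micro batch size (optional)
export GBS=16              # global batch size (optional)
export TP=8                # tensor parallel size (optional)
export PP=1                # pipeline parallel size (optional)
export WORK_DIR=/home/ma-user/ws   # working directory (optional)

sh scripts/baichuan2/baichuan2.sh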

Multi-Node Launch

Taking baichuan2-13b as an example, the multi-node training launch commands are shown below. For a multi-node launch, the command must be executed on every node; a two-node setup is used here as an example. Go to the code directory /home/ma-user/ws/6.3.904-Ascend/llm_train/AscendSpeed and run the startup script. For detailed hyperparameter descriptions, see Table 1 Incremental pre-training hyperparameter configuration.

 
# First node
MASTER_ADDR=xx.xx.xx.xx NNODES=2 NODE_RANK=0 MODEL_TYPE=13B  RUN_TYPE=sft DATA_PATH=/home/ma-user/ws/processed_for_ma_input/BaiChuan2-13B/data/finetune/alpaca_ft TOKENIZER_MODEL=/home/ma-user/ws/tokenizers/BaiChuan2-13B CKPT_LOAD_DIR=/home/ma-user/ws/processed_for_ma_input/BaiChuan2-13B/converted_weights TRAIN_ITERS=300 MBS=1  GBS=16 TP=8 PP=1 WORK_DIR=/home/ma-user/ws sh scripts/baichuan2/baichuan2.sh
 ... 
 ... 
# Second node
MASTER_ADDR=xx.xx.xx.xx NNODES=2 NODE_RANK=1 MODEL_TYPE=13B  RUN_TYPE=sft DATA_PATH=/home/ma-user/ws/processed_for_ma_input/BaiChuan2-13B/data/finetune/alpaca_ft TOKENIZER_MODEL=/home/ma-user/ws/tokenizers/BaiChuan2-13B CKPT_LOAD_DIR=/home/ma-user/ws/processed_for_ma_input/BaiChuan2-13B/converted_weights TRAIN_ITERS=300 MBS=1  GBS=16 TP=8 PP=1 WORK_DIR=/home/ma-user/ws sh scripts/baichuan2/baichuan2.sh

When the above command is run on multiple nodes, only the value of ${NODE_RANK} (the node ID) differs between nodes; all other parameters must stay identical.

Here, MASTER_ADDR, NODE_RANK, MODEL_TYPE, RUN_TYPE, DATA_PATH, TOKENIZER_MODEL, and CKPT_LOAD_DIR are required; TRAIN_ITERS, MBS, GBS, TP, PP, and WORK_DIR are optional and have default values.
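
To make the "only NODE_RANK differs" point concrete, the following is a minimal two-node launcher sketch. It is an illustration only, not part of the product scripts: it assumes passwordless SSH to both nodes as ma-user, identical code and data paths on each node, and placeholder IPs (xx.xx.xx.xx / yy.yy.yy.yy). In practice you can simply run the two commands above on their respective nodes.

#!/bin/bash
# Hypothetical helper: launch the same SFT command on two nodes; only NODE_RANK differs.
MASTER_ADDR=xx.xx.xx.xx               # IP of the master (first) node, placeholder
NODES=(xx.xx.xx.xx yy.yy.yy.yy)       # node IPs in rank order, placeholders

for RANK in 0 1; do
  ssh ma-user@"${NODES[$RANK]}" "cd /home/ma-user/ws/6.3.904-Ascend/llm_train/AscendSpeed && \
    MASTER_ADDR=${MASTER_ADDR} NNODES=2 NODE_RANK=${RANK} MODEL_TYPE=13B RUN_TYPE=sft \
    DATA_PATH=/home/ma-user/ws/processed_for_ma_input/BaiChuan2-13B/data/finetune/alpaca_ft \
    TOKENIZER_MODEL=/home/ma-user/ws/tokenizers/BaiChuan2-13B \
    CKPT_LOAD_DIR=/home/ma-user/ws/processed_for_ma_input/BaiChuan2-13B/converted_weights \
    TRAIN_ITERS=300 MBS=1 GBS=16 TP=8 PP=1 WORK_DIR=/home/ma-user/ws \
    sh scripts/baichuan2/baichuan2.sh" &   # background the ssh so both nodes start together
done
wait   # block until training on both nodes has exited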

support.huaweicloud.com/bestpractice-modelarts/modelarts_10_1927.html