| name | about | labels |
| --- | --- | --- |
| Bug Report | Use this template for reporting a bug | kind/bug |
[ST][MS][MF][glm3-6b-32k] KBK network batch inference fails: ValueError: Unable to create tensor
Model repository: https://gitee.com/mindspore/mindformers/blob/dev/docs/model_cards/glm3.md
Hardware Environment / 硬件环境: Ascend

/device ascend
【CANN Version】: Milan_C17/20240414
【MindSpore Version】: master_B020
【MindFormers Version】: master_B020
Execution Mode / 执行模式: PyNative / Graph

/mode pynative
/mode graph
Test case repository: /MindFormers_Test/cases/llama2/7b/train/
Test case: N/A
Get the code from mindformers, then run:
cd mindformers/scripts
python run_mindformer.py --config research/glm32k/predict_glm32k_seq_256_bs_8.yaml --run_mode predict --use_parallel False --predict_data "使用python编写快速排序代码" "晚上睡不着应该怎么办" "请介绍一下华为" "平板有什么用" "使用python编写快速排序代码" "晚上睡不着应该怎么办" "请介绍一下华为" "平板有什么用" --auto_trans_ckpt False --predict_batch_size 8 --device_id 0 > 32k_infer_seq256_bs8.log 2>&1 &
Verify whether network inference succeeds.
Expected result: network inference succeeds.
Actual result:
Traceback (most recent call last):
  File "/home/jenkins0/sjw/mindformers/run_mindformer.py", line 268, in <module>
    main(config_)
  File "/home/jenkins0/sjw/mindformers/mindformers/tools/cloud_adapter/cloud_monitor.py", line 44, in wrapper
    raise exc
  File "/home/jenkins0/sjw/mindformers/mindformers/tools/cloud_adapter/cloud_monitor.py", line 34, in wrapper
    result = run_func(*args, **kwargs)
  File "/home/jenkins0/sjw/mindformers/run_mindformer.py", line 43, in main
    trainer.predict(predict_checkpoint=config.load_checkpoint, input_data=config.input_data,
  File "/home/jenkins0/.conda/envs/wxy39/lib/python3.9/site-packages/mindspore/_checkparam.py", line 1371, in wrapper
    return func(*args, **kwargs)
  File "/home/jenkins0/sjw/mindformers/mindformers/trainer/trainer.py", line 684, in predict
    output_result = self.trainer.predict(
  File "/home/jenkins0/sjw/mindformers/mindformers/trainer/causal_language_modeling/causal_language_modeling.py", line 338, in predict
    return self.predict_process(config=config,
  File "/home/jenkins0/sjw/mindformers/mindformers/trainer/base_trainer.py", line 937, in predict_process
    output_results = self.pipeline_task(input_data, top_k=top_k)
  File "/home/jenkins0/sjw/mindformers/mindformers/pipeline/base_pipeline.py", line 149, in __call__
    outputs = self.run_multi(inputs, batch_size, preprocess_params, forward_params, postprocess_params)
  File "/home/jenkins0/sjw/mindformers/mindformers/pipeline/text_generation_pipeline.py", line 187, in run_multi
    outputs.extend(self.run_single(item, preprocess_params,
  File "/home/jenkins0/sjw/mindformers/mindformers/pipeline/base_pipeline.py", line 236, in run_single
    model_inputs = self.preprocess(inputs, **preprocess_params)
  File "/home/jenkins0/sjw/mindformers/mindformers/pipeline/text_generation_pipeline.py", line 135, in preprocess
    return self.tokenizer.build_batch_input(inputs)
  File "/home/jenkins0/sjw/mindformers/mindformers/models/glm3/glm3_tokenizer.py", line 248, in build_batch_input
    return self.batch_encode_plus(batch_inputs, return_tensors="np", is_split_into_words=True)
  File "/home/jenkins0/sjw/mindformers/mindformers/models/tokenization_utils_base.py", line 3352, in batch_encode_plus
    return self._batch_encode_plus(
  File "/home/jenkins0/sjw/mindformers/mindformers/models/tokenization_utils.py", line 871, in _batch_encode_plus
    batch_outputs = self._batch_prepare_for_model(
  File "/home/jenkins0/sjw/mindformers/mindformers/models/tokenization_utils.py", line 951, in _batch_prepare_for_model
    batch_outputs = BatchEncoding(batch_outputs, tensor_type=return_tensors)
  File "/home/jenkins0/sjw/mindformers/mindformers/models/tokenization_utils_base.py", line 273, in __init__
    self.convert_to_tensors(tensor_type=tensor_type, prepend_batch_axis=prepend_batch_axis)
  File "/home/jenkins0/sjw/mindformers/mindformers/models/tokenization_utils_base.py", line 796, in convert_to_tensors
    raise ValueError(
ValueError: Unable to create tensor, you should probably activate truncation and/or padding with 'padding=True' 'truncation=True' to have batched tensors with the same length. Perhaps your features (`input_ids` in this case) have excessive nesting (inputs type `list` where type `int` is expected).
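The failure mode behind the ValueError above can be reproduced in isolation: prompts of different lengths tokenize to sequences of different lengths, and a ragged batch cannot be stacked into one uniform NumPy tensor. This is a minimal sketch; the token ids below are made up for illustration and are not actual GLM3 tokenizer output.

```python
import numpy as np

# Two tokenized prompts of different lengths -- a ragged batch, like the
# one the eight predict_data prompts produce here (ids are illustrative).
input_ids = [[101, 2023, 102], [101, 2023, 2003, 1037, 102]]

# np.array over ragged lists cannot build a uniform 2-D integer tensor;
# the tokenizer's convert_to_tensors step hits exactly this and raises
# the ValueError quoted in the traceback.
try:
    np.array(input_ids, dtype=np.int64)
    ragged_rejected = False
except ValueError:
    ragged_rejected = True

print(ragged_rejected)
```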
Assigning to Wu Zhiyuan.
Please assign a maintainer to check this issue.
@sunjiawei999
Thanks for your question. You can comment //mindspore-assistant to get help faster.
Appearance & Root Cause
Problem: batch inference fails.
Root cause: multi-batch inputs are not padded to a uniform length before tensor conversion.
Fix Solution
Pad multi-batch data to a uniform length.
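The fix can be sketched as follows. This is a hypothetical illustration of the padding step, not the actual MindFormers change: the `pad_batch` helper and the pad id of 0 are assumptions, and the real fix lives in the GLM3 tokenizer's batch-input path.

```python
import numpy as np

def pad_batch(batch_inputs, pad_id=0):
    """Pad a list of token-id sequences to the batch's max length.

    Once every sequence shares one length, the batch can be converted
    to a single 2-D numpy tensor (return_tensors="np") without error.
    """
    max_len = max(len(seq) for seq in batch_inputs)
    input_ids = np.array(
        [seq + [pad_id] * (max_len - len(seq)) for seq in batch_inputs])
    # Mask marks real tokens with 1 and padding with 0.
    attention_mask = np.array(
        [[1] * len(seq) + [0] * (max_len - len(seq)) for seq in batch_inputs])
    return input_ids, attention_mask

ids, mask = pad_batch([[5, 6, 7], [5, 6]])
print(ids.shape)
```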
Self-test Report & DT Review
Additional ST/UT needed: No
Reason: models under the research directory are not covered.