MindSpore / mindspore

[CT][MS][CI]<test_ops_add_layernorm.py::test_add_layer_norm> failed in gate

Status: DONE
Bug-Report · Member
Created 2024-06-12 15:08

Describe the current behavior (Mandatory)

test_ops_add_layernorm.py::test_add_layer_norm failed in gate

Environment (Mandatory)

  • Hardware Environment (Ascend/GPU/CPU):

/device ascend910B

  • Software Environment (Mandatory):
    -- MindSpore version (e.g., 1.7.0.Bxxx):
    -- Python version (e.g., Python 3.7.5):
    -- OS platform and distribution (e.g., Linux Ubuntu 16.04):
    -- GCC/Compiler version (if compiled from source):

  • Execution Mode (Mandatory) (PyNative/Graph):

/mode pynative
/mode graph

Related testcase (Mandatory)

[gate failed] tests/st/ops
test_ops_add_layernorm.py::test_add_layer_norm

Steps to reproduce the issue (Mandatory)

Describe the expected behavior (Mandatory)

Related log / screenshot (Mandatory)

[2024-06-11T10:11:21.505Z] tensor_type = mindspore.bfloat16
[2024-06-11T10:11:21.505Z] 
[2024-06-11T10:11:21.505Z]     @pytest.mark.level0
[2024-06-11T10:11:21.505Z]     @pytest.mark.env_onecard
[2024-06-11T10:11:21.505Z]     @pytest.mark.platform_arm_ascend910b_training
[2024-06-11T10:11:21.505Z]     @pytest.mark.platform_x86_ascend910b_training
[2024-06-11T10:11:21.505Z]     @pytest.mark.parametrize('tensor_type', [mstype.float32, mstype.float16, mstype.bfloat16])
[2024-06-11T10:11:21.505Z]     def test_add_layer_norm(tensor_type):
[2024-06-11T10:11:21.505Z]         """
[2024-06-11T10:11:21.505Z]         Feature: test add_layernorm fusion in kbk mode
[2024-06-11T10:11:21.505Z]         Description: test add_layernorm.
[2024-06-11T10:11:21.505Z]         Expectation: the result is the same with aclnn version of two ops
[2024-06-11T10:11:21.505Z]         """
[2024-06-11T10:11:21.505Z]         os.environ["MS_DISABLE_INTERNAL_KERNELS_LIST"] = "AddLayerNorm"
[2024-06-11T10:11:21.505Z]         context.set_context(mode=0)
[2024-06-11T10:11:21.505Z]     
[2024-06-11T10:11:21.505Z]         x1 = generate_random_input((2, 3), np.float32)
[2024-06-11T10:11:21.505Z]         x2 = generate_random_input((2, 3), np.float32)
[2024-06-11T10:11:21.505Z]         gamma = np.ones([3]).astype(np.float32)
[2024-06-11T10:11:21.505Z]         beta = np.zeros([3]).astype(np.float32)
[2024-06-11T10:11:21.505Z]         x1_tensor = Tensor(x1, dtype=tensor_type)
[2024-06-11T10:11:21.505Z]         x2_tensor = Tensor(x2, dtype=tensor_type)
[2024-06-11T10:11:21.505Z]         gamma_tensor = Tensor(gamma, dtype=tensor_type)
[2024-06-11T10:11:21.505Z]         beta_tensor = Tensor(beta, dtype=tensor_type)
[2024-06-11T10:11:21.505Z]     
[2024-06-11T10:11:21.505Z]         net = Add_LayerNorm()
[2024-06-11T10:11:21.506Z]         net.set_jit_config(JitConfig(jit_level="O0", infer_boost="on"))
[2024-06-11T10:11:21.506Z]         output = net(x1_tensor, x2_tensor, gamma_tensor, beta_tensor)
[2024-06-11T10:11:21.506Z]     
[2024-06-11T10:11:21.506Z]         expect = generate_expect_forward_output(x1, x2, gamma, beta)
[2024-06-11T10:11:21.506Z]         assert np.allclose(output[0].float().asnumpy(), expect[0], rtol=5e-3, atol=5e-3)
[2024-06-11T10:11:21.506Z]         assert np.allclose(output[1].float().asnumpy(), expect[1], rtol=5e-3, atol=5e-3)
[2024-06-11T10:11:21.506Z] >       assert np.allclose(output[2].float().asnumpy(), expect[2], rtol=5e-3, atol=5e-3)
[2024-06-11T10:11:21.506Z] E       assert False
[2024-06-11T10:11:21.506Z] E        +  where False = <function allclose at 0xfffed6cc7710>(array([[5.3118715 ],\n       [0.91416574]], dtype=float32), array([[5.2235823],\n       [0.9144495]], dtype=float32), rtol=0.005, atol=0.005)
[2024-06-11T10:11:21.506Z] E        +    where <function allclose at 0xfffed6cc7710> = np.allclose
[2024-06-11T10:11:21.506Z] E        +    and   array([[5.3118715 ],\n       [0.91416574]], dtype=float32) = <bound method Tensor.asnumpy of Tensor(shape=[2, 1], dtype=Float32, value=\n[[ 5.31187153e+00],\n [ 9.14165735e-01]])>()
[2024-06-11T10:11:21.506Z] E        +      where <bound method Tensor.asnumpy of Tensor(shape=[2, 1], dtype=Float32, value=\n[[ 5.31187153e+00],\n [ 9.14165735e-01]])> = Tensor(shape=[2, 1], dtype=Float32, value=\n[[ 5.31187153e+00],\n [ 9.14165735e-01]]).asnumpy
[2024-06-11T10:11:21.506Z] E        +        where Tensor(shape=[2, 1], dtype=Float32, value=\n[[ 5.31187153e+00],\n [ 9.14165735e-01]]) = <bound method Tensor.float of Tensor(shape=[2, 1], dtype=Float32, value=\n[[ 5.31187153e+00],\n [ 9.14165735e-01]])>()
[2024-06-11T10:11:21.506Z] E        +          where <bound method Tensor.float of Tensor(shape=[2, 1], dtype=Float32, value=\n[[ 5.31187153e+00],\n [ 9.14165735e-01]])> = Tensor(shape=[2, 1], dtype=Float32, value=\n[[ 5.31187153e+00],\n [ 9.14165735e-01]]).float
[2024-06-11T10:11:21.506Z] 
[2024-06-11T10:11:21.506Z] test_ops_add_layernorm.py:81: AssertionError
[2024-06-11T10:11:21.506Z] =============================== warnings summary ===============================
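For context, the expected values the test compares against come from a NumPy reference of the fused add + LayerNorm pattern. The helper generate_expect_forward_output is not shown in the log; a plausible sketch of such a reference (an assumption for illustration, not the actual test helper) is:

```python
import numpy as np

def add_layer_norm_reference(x1, x2, gamma, beta, eps=1e-5):
    # Fused pattern: residual add followed by LayerNorm over the last axis.
    x = x1 + x2
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    rstd = 1.0 / np.sqrt(var + eps)
    y = (x - mean) * rstd * gamma + beta
    # Fused kernels typically return three outputs (normalized result,
    # mean, rstd), matching the three np.allclose assertions in the test.
    return y, mean, rstd
```

The failing assertion is on output[2] with shape [2, 1], which in this layout is likely the rstd output. That is the most error-sensitive of the three: it involves a reciprocal square root of the variance, so low-precision rounding in intermediate sums is amplified when the variance is small.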

Special notes for this issue (Optional)

Comments (4)

陈盼妙 created Bug-Report
陈盼妙 added labels: kind/ci, br_base, sig/ops, device/ascend, kind/occasionally
陈盼妙 set related branch: br_base
陈盼妙 set issue backend type: Ascend

Please assign a maintainer to check this issue.
@陈盼妙

Thanks for your question. You can comment /mindspore-assistant to get help faster:

  1. If you are new to MindSpore, you may find the answer in the tutorials.
  2. If you are an experienced PyTorch user, you may need:
  3. If you run into a dynamic-graph (PyNative) problem, set set_context(pynative_synchronize=True) to get an error stack that helps locate the failure.
  4. For model accuracy tuning, refer to the tuning guide on the official site.
  5. If you are reporting a framework bug, please confirm the issue includes the MindSpore version, the backend type used (CPU, GPU, Ascend), the environment, an official link to the training code, and the launch command for code that reproduces the error.
  6. If you have already identified the root cause, you are welcome to submit a PR to the MindSpore open-source community; we will review it as soon as possible.

Root cause: the fused operator's computation differs slightly from the NumPy reference; for a small fraction of randomly generated inputs this difference exceeds the tolerance.
Solution: fix the inputs and expected outputs, using inputs that do not trigger the random error.
Result:
(screenshot attached)
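For intuition on why only some random inputs fail: the failing parametrization uses tensor_type = bfloat16, whose maximum relative rounding error is 2^-8 ≈ 0.39%, on the same order as the test's rtol=5e-3. Intermediate rounding in the fused kernel versus the float32 NumPy reference can therefore occasionally push a comparison over the limit. A minimal sketch (round_bf16 is a hypothetical helper emulating bfloat16 rounding in NumPy, not MindSpore code):

```python
import numpy as np

def round_bf16(x):
    # Emulate float32 -> bfloat16 rounding: keep the top 16 bits of the
    # float32 representation, with round-to-nearest-even on the dropped bits.
    u = np.asarray(x, dtype=np.float32).view(np.uint32)
    rounded = (u + 0x7FFF + ((u >> 16) & 1)) & np.uint32(0xFFFF0000)
    return rounded.view(np.float32)

# bfloat16 keeps 8 significant bits, so one rounding step can already
# introduce a relative error up to 2**-8, close to the test's rtol=5e-3.
print(2.0 ** -8)  # 0.00390625
```

With several such roundings accumulating inside the kernel (sum, variance, rsqrt), an unlucky random input can exceed the 5e-3 tolerance even though both implementations are correct, which is why the fix pins the inputs instead of tightening the kernel.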

i-robot added label: foruda
dairenjie changed status: TODO → VALIDATION
dairenjie added collaborator: dairenjie
dairenjie changed assignee: dairenjie → 陈盼妙
dairenjie changed milestone: B-SIG-OPS → B-ComponentTest

The test_ops_add_layernorm.py::test_add_layer_norm case under st/ops on br_base does not exist on master.
On master, test_add_layernorm.py under st/ops/ascend has been promoted to level0.
Regression passed.

戴仁杰 2024-06-18 09:17
The SE says br_base will be overwritten, so merging into br_base is not allowed.

陈盼妙 changed status: VALIDATION → DONE
