NEEN committed on 2021-03-12 17:59 · !197 Docs Update version 1.0.1

Testing

Overview

Basic Concepts

The testing subsystem provides a one-click Python-based self-test platform for developers. It supports cross-platform tests and extension to third-party testing frameworks. The subsystem consists of modules for compiling, managing, scheduling and distributing, and executing test cases, collecting test results, generating test reports, creating test case templates, managing test environments, and many others.

Before development using the testing subsystem, you need to understand the following concepts:

  • Test case compilation

    This operation compiles the source code of test cases into binary files that can be executed on the tested device.

  • Test case scheduling & distributing

    This operation distributes test cases to different tested devices through the network port or serial port, and allocates a specific executor for each test case.

  • Test case executor

    A test case executor defines the execution logic of each test case, such as its pre-processing, execution, and result recording.

  • Test case template

    A test case template defines unified formats for test case source files and their GN build files.

  • Test platform kits

    The test platform provides common methods to be used during the running of the test tool, for example, providing the test case directory to mount the file system to a tested device, distributing test cases to the tested device, or obtaining test results from the tested device.

  • Test report generation

    This operation defines a template for generating self-test reports and web test reports.

  • Test environment management

    The tested devices can be managed through the USB port or serial port, including discovering a device and querying the device status.

Working Principles

  • The following figure shows the architecture of the test platform.

Figure 1 Platform architecture

  • The following figure shows the running sequence diagram of the test platform.

Figure 2 Running sequence of the test platform

  • Working principle of the test platform

The test platform is started using a shell script. It executes a series of testing commands entered on the command line interface (CLI) and prints the command output.

Limitations and Constraints

  • The self-test platform supports only code-level test case development and verification, such as unit testing and module testing.
  • Currently, the testing framework supports only white-box testing.
  • Only one test platform can be started on a testing device.

Setting Up a Test Environment

Environment Requirements

Table 1 Environment requirements

| Item | Testing Device | Tested Device |
| --- | --- | --- |
| Hardware | • Memory: 8 GB or above<br>• Hard disk space: 100 GB or above<br>• Hardware architecture: x86 or ARM64 | • Hi3516D V300 development board<br>• Hi3518E V300 development board |
| Software | • OS: Windows 10 (64-bit) or Ubuntu 18.04<br>• System component (Linux): libreadline-dev<br>• Python: 3.7.5 or later<br>• Python plug-ins: pySerial 3.3 or later, Paramiko 2.7.1 or later, Setuptools 40.8.0 or later, and RSA 4.0 or later<br>• NFS server: haneWIN NFS Server 1.2.50 or later, or NFSv4 or later | • OS: OpenHarmony 1.0 or later<br>• Kernel: LiteOS Cortex-A or Linux kernel |

Installing the Environment

  1. (Optional) If the test environment runs Linux, run the following command to install system component Readline:

    sudo apt-get install libreadline-dev

    If the installation is successful, the following prompts are displayed:

    Reading package lists... Done
    Building dependency tree
    Reading state information... Done
    libreadline-dev is already the newest version (7.0-3).
    0 upgraded, 0 newly installed, 0 to remove and 11 not upgraded.
  2. Install the Python extension plug-in Setuptools. Also install RSA, Paramiko, and pySerial if the tested device supports only the serial port.

    1. Run the following command to install Setuptools:

       pip install setuptools

       If the installation is successful, the following prompts are displayed:

       Requirement already satisfied: setuptools in d:\programs\python37\lib\site-packages (41.2.0)

    2. Run the following command to install RSA:

       pip install rsa

       If the installation is successful, the following prompts are displayed:

       Installing collected packages: pyasn1, rsa
       Successfully installed pyasn1-0.4.8 rsa-4.7

    3. Run the following command to install Paramiko:

       pip install paramiko

       If the installation is successful, the following prompts are displayed:

       Installing collected packages: pycparser, cffi, pynacl, bcrypt, cryptography, paramiko
       Successfully installed bcrypt-3.2.0 cffi-1.14.4 cryptography-3.3.1 paramiko-2.7.2 pycparser-2.20 pynacl-1.4.0

    4. (Optional) Run the following command to install pySerial. This step is mandatory for tested devices that support serial ports only.

       pip install pyserial

       If the installation is successful, the following prompts are displayed:

       Requirement already satisfied: pyserial in d:\programs\python37\lib\site-packages\pyserial-3.4-py3.7.egg (3.4)
  3. (Optional) Install the NFS server. This step is mandatory for tested devices that support serial ports only.

    Windows OS

    Download and install haneWIN NFS Server 1.2.50 at https://www.hanewin.net/nfs-e.htm.

    Linux OS

    sudo apt install nfs-kernel-server

    After the environment is installed, you can conduct coding and debugging for a test platform in an integrated development environment (IDE) (DevEco Studio is recommended).
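Once the plug-ins are installed, a quick way to confirm they are importable is a small Python check such as the sketch below. This is an illustration only, not part of the test platform; note that the pySerial package is imported under the name serial (an assumption about the usual distribution names):

```python
import importlib.util

# Plug-ins from step 2; pySerial is imported as "serial" (an assumption
# about the installed distribution names, not from the official docs).
REQUIRED = ["setuptools", "rsa", "paramiko", "serial"]

def missing_plugins(names=REQUIRED):
    """Return the names that the current interpreter cannot find."""
    return [n for n in names if importlib.util.find_spec(n) is None]

if __name__ == "__main__":
    missing = missing_plugins()
    print("all plug-ins found" if not missing else f"missing: {missing}")
```

If the list printed is empty, all four plug-ins are visible to the interpreter that will run the test platform.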

Verifying the Test Environment

Table 2 Environment verification

| Item | Operation | Requirement |
| --- | --- | --- |
| Check that a compliant Python version has been installed. | Run the python --version command. | The Python version must be 3.7.5 or later. |
| Check that the Python extension plug-ins have been installed. | Go to the test/xdevice directory and run run.bat (Windows) or run.sh (Linux). | The >>> prompt is displayed. |
| Check that the NFS server has been started (for tested devices that support serial ports only). | Log in to the development board through the serial port and run the mount command to mount the NFS server. | The file directory can be mounted properly. |
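The first check in Table 2 can also be performed programmatically. The sketch below compares the interpreter version against the 3.7.5 minimum; it is a minimal illustration, not part of the test platform:

```python
import sys

MINIMUM = (3, 7, 5)  # required Python version from Table 2

def python_version_ok(version_info=sys.version_info):
    """True if the interpreter meets the 3.7.5 minimum."""
    return tuple(version_info[:3]) >= MINIMUM

if __name__ == "__main__":
    print("Python OK" if python_version_ok() else "Python too old")
```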

Development Guidelines

When to Use

You can call the APIs to conduct white box tests of service code.

Available APIs

The testing framework integrates the open-source unit testing framework and expands the macros of the test cases. For details about the framework, see the official open-source documentation.

Table 3 Expanded macros of the framework

| Macro | Description |
| --- | --- |
| HWTEST | The execution of test cases does not rely on setup and teardown execution. Based on the TEST macro, this macro adds the TestSize.Level1 parameter to specify the test case level, for example, HWTEST(CalculatorAddTest, TestPoint_001, TestSize.Level1). |
| HWTEST_F | The execution of test cases (without parameters) depends on setup and teardown execution. Based on the TEST_F macro, this macro adds the TestSize.Level1 parameter to specify the test case level, for example, HWTEST_F(CalculatorAddTest, TestPoint_001, TestSize.Level1). |
| HWTEST_P | The execution of test cases (with parameters) depends on setup and teardown execution. Based on the TEST_P macro, this macro adds the TestSize.Level1 parameter to specify the test case level, for example, HWTEST_P(CalculatorAddTest, TestPoint_001, TestSize.Level1). |

How to Develop

  1. Define a test suite file based on the test case directory, for example, test/developertest/examples/lite/cxx_demo/test/unittest/common/calc_subtraction_test.cpp. The class in this test suite should be inherited from the testing::Test class and named in the format of "Tested feature_Test".

    /*
     * Copyright (c) 2020 OpenHarmony.
     * Licensed under the Apache License, Version 2.0 (the "License");
     * you may not use this file except in compliance with the License.
     * You may obtain a copy of the License at
     *
     *     http://www.apache.org/licenses/LICENSE-2.0
     *
     * Unless required by applicable law or agreed to in writing, software
     * distributed under the License is distributed on an "AS IS" BASIS,
     * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
     * See the License for the specific language governing permissions and
     * limitations under the License.
     */
    #include <gtest/gtest.h>
    
    using namespace std;
    using namespace testing::ext;
    
    class CalcSubtractionTest : public testing::Test {
    public:
        static void SetUpTestCase(void);
        static void TearDownTestCase(void);
        void SetUp();
        void TearDown();
    };

    NOTE: You must write test cases by observing the following specifications:

    • Naming: The source file name of a test case must be consistent with the test suite content. Each test suite has multiple test cases and a globally unique test source file named in the [Feature]_[Function]_[Subfunction 1]_[Subfunction 1.1] format (subfunctions can be further divided). The file name can contain only lowercase letters and underscores (_) and must end with test, for example, developertest/examples/lite/cxx_demo.
    • Coding: The test cases must comply with the coding specifications for feature code. In addition, case descriptions are required for further clarification. For details, see the test case template.
    • Compilation and configuration: The test cases must be compiled using GN, and the configurations must comply with the compilation guide of this open-source project. For details, see the Compilation and Building Guidelines.
    • Test case template: For details, see the example test case developertest/examples/lite/cxx_demo/test/unittest/common/calc_subtraction_test.cpp.
  2. Implement the preprocessing (via the SetUp function) and postprocessing (via the TearDown function) operations required by the execution of the test suite.

    void CalcSubtractionTest::SetUpTestCase(void)
    {
        // step 1: input testsuite setup step
    }
    
    void CalcSubtractionTest::TearDownTestCase(void)
    {
        // step 2: input testsuite teardown step
    }
    
    void CalcSubtractionTest::SetUp(void)
    {
        // step 3: input testcase setup step
    }
    
    void CalcSubtractionTest::TearDown(void)
    {
        // step 4: input testcase teardown step
    }
  3. Compile a test case based on the feature to be tested. The following code uses HWTEST_F as an example:

    /**
     * @tc.name: integer_sub_001
     * @tc.desc: Test Calculator
     * @tc.type: FUNC
     * @tc.require: AR00000000 SR00000000
     */
    HWTEST_F(CalcSubtractionTest, integer_sub_001, TestSize.Level1)
    {
        EXPECT_EQ(0, Subtraction(1, 0));
    }

    NOTE:

    • @tc.name: test case name, which briefly describes the test purpose
    • @tc.desc: detailed description of the test case, including the test purpose, test procedure, and expected result
    • @tc.type: test type, which can be FUNC, PERF, SECU, or RELI.
    • @tc.require: requirement ID or issue ID, which is used to associate the modification with the test case

    | SN | Test Type | Code | Description |
    | --- | --- | --- | --- |
    | 1 | Functionality test | FUNC | Verifies that each functionality of the software complies with the function design and specifications. |
    | 2 | Performance test | PERF | Verifies that the software meets the performance requirements. Performance tests include load tests, capacity tests, and stress tests. |
    | 3 | Security test | SECU | Verifies that the software complies with security requirements and related laws and regulations within the software lifecycle. |
    | 4 | Reliability test | RELI | Verifies the probability that the software does not cause system failures within a specified period of time and under given conditions. Software stability is also covered by this test. |

  4. Compile the GN file of the test case, including defining the compilation target, adding compilation dependencies, and setting the source file.

    Example file path: test/developertest/examples/lite/cxx_demo/test/unittest/common/BUILD.gn

    import("//build/lite/config/test.gni")
    
    unittest("CalcSubTest") {
        output_extension = "bin"
        sources = [
            "calc_subtraction_test.cpp"
        ]
        include_dirs = []
        deps = []
    }
  5. Add the compilation target to the subsystem compilation configuration to ensure that the test case is compiled with the version distribution. The following is an example:

    1. For devices that support connection to the Harmony device connector (hdc), the example compilation configuration file is test/developertest/examples/ohos.build.

      {
        "subsystem": "subsystem_examples",
        "parts": {
          "subsystem_examples": {
            "module_list": [
              "//test/developertest/examples/detector:detector",
              ...
            ],
            "test_list": [
              "//test/developertest/examples/detector/test:unittest",
              ...
            ]
          },
          ...
      }
    2. For devices that support serial ports only, the example compilation configuration file is test/developertest/examples/lite/BUILD.gn.

      import("//build/lite/config/test.gni")
      
      subsystem_test("test") {
          test_components = []
          if (ohos_kernel_type == "liteos_riscv") {
              test_components += [
              ]
          } else if (ohos_kernel_type == "liteos_a") {
              test_components += [
                  "//test/developertest/examples/lite/cxx_demo/test/unittest/common:CalcSubTest"
              ]
          }
      }
  6. Create a resource configuration file for the test case to use static resources.

    1. Create the resource directory in the test directory of a component or module.

    2. Create a directory for a device type, for example, phone, in the resource directory.

    3. Create a folder named after the module in the device type directory, for example, testmodule.

    4. Create the ohos_test.xml file in the folder named after the module. The file content is in the following format:

      <?xml version="1.0" encoding="UTF-8"?>
      <configuration ver="2.0">
          <target name="DetectorFileTest">
              <preparer>
                  <option name="push" value="test.txt -> /data/test/resource" src="res"/>
              </preparer>
          </target>
      </configuration>
    5. Define resource_config_file in the compilation configuration file of the test case to specify the resource file ohos_test.xml.

      NOTE: The resource file is used to push the test.txt file in the resource directory to the /data/test/resource directory of the tested device by running the hdc push command.

  7. Execute the test case after it is compiled (the preceding steps are complete).

    NOTE:

    • For devices that support connection to the hdc, test cases can be compiled separately.
    • For devices that support serial ports only, test cases are compiled together with the debug version; run the compilation commands in the code root directory. For details about how to execute a test case, see How to Use the Test Platform.
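As a quick illustration of the naming rule from step 1 (lowercase letters and underscores only, ending with test), the convention can be expressed as a short check. The helper below is hypothetical, written for this document only, and is not part of the framework:

```python
import re

# Hypothetical helper illustrating the naming rule from step 1:
# lowercase letters and underscores only, ending with "test".
_NAME_RE = re.compile(r"[a-z]+(?:_[a-z]+)*_test\.cpp")

def is_valid_test_filename(filename: str) -> bool:
    """True if the file name follows the [feature]_..._test.cpp convention."""
    return _NAME_RE.fullmatch(filename) is not None
```

For example, calc_subtraction_test.cpp passes the check, while CalcSubtractionTest.cpp does not.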

Development Example

The code repository of the testing subsystem provides complete demo cases, which are available in the test/developertest/examples/ directory. The following is an example of compiling a test case for a subtraction function:

  • The tested code is as follows:

    static int Subtraction(int a, int b)
    {
        return a - b;
    }
  • The test case code is as follows:

    /**
     * @tc.name: integer_sub_002
     * @tc.desc: Verify the Subtraction function.
     * @tc.type: FUNC
     * @tc.require: AR00000000 SR00000000
     */
    HWTEST_F(CalcSubtractionTest, integer_sub_002, TestSize.Level1)
    {
        EXPECT_EQ(1, Subtraction(2, 1));
    }
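For readers more familiar with Python than googletest, the same xUnit pattern (per-case setup and teardown plus an equality assertion) can be sketched with the standard unittest module. This is an analogy only, not how OpenHarmony test cases are written:

```python
import unittest

def subtraction(a: int, b: int) -> int:
    """Python analogue of the tested C function."""
    return a - b

class CalcSubtractionTest(unittest.TestCase):
    def setUp(self):
        # analogous to SetUp() in the C++ test suite
        pass

    def tearDown(self):
        # analogous to TearDown() in the C++ test suite
        pass

    def test_integer_sub(self):
        # analogous to EXPECT_EQ(1, Subtraction(2, 1))
        self.assertEqual(1, subtraction(2, 1))

# run the suite explicitly and report the outcome
result = unittest.TestResult()
unittest.defaultTestLoader.loadTestsFromTestCase(CalcSubtractionTest).run(result)
print("ok" if result.wasSuccessful() else "failed")
```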

How to Use the Test Platform

  1. (Optional) Install the XDevice component. XDevice can be used as a Python extension package.

    Go to the installation directory test/xdevice and run the following command:

    python setup.py install

    If the installation is successful, the following prompts are displayed:

    ...
    Installed d:\programs\python37\lib\site-packages\xdevice-0.0.0-py3.7.egg
    Processing dependencies for xdevice==0.0.0
    Searching for pyserial==3.4
    Best match: pyserial 3.4
    Processing pyserial-3.4-py3.7.egg
    pyserial 3.4 is already the active version in easy-install.pth
    Installing miniterm.py script to D:\Programs\Python37\Scripts
    
    Using d:\programs\python37\lib\site-packages\pyserial-3.4-py3.7.egg
    Finished processing dependencies for xdevice==0.0.0
  2. Modify the developertest/config/user_config.xml file to configure the Developertest component.

    1. For devices that support connection to the hdc, modify the configuration file as follows:

      Between the device tags with the "usb-hdc" attribute, modify the IP address of the device and the port number matching the HDC connection. For example:

      <device type="usb-hdc">
          <ip>192.168.1.1</ip>
          <port>9111</port>
          <sn></sn>
      </device>
    2. For devices that support serial ports only, modify the configuration file as follows:

      Between the device tags with the "ipcamera" attribute, modify the serial port information, including the COM port and baud rate. For example:

      <device type="com" label="ipcamera">
          <serial>
              <com>COM1</com>
              <type>cmd</type>
              <baud_rate>115200</baud_rate>
              <data_bits>8</data_bits>
              <stop_bits>1</stop_bits>
              <timeout>1</timeout>
          </serial>
      </device>
  3. (Optional) Modify the Developertest configuration. If a test case has been compiled, specify its compilation output path. In this case, the test platform will not recompile the test case.

    Modify the config/user_config.xml file.

    1. Specify the output path of the test case, that is, the compilation output directory between the test_cases tags. Example:

      <test_cases>
          <dir>/home/opencode/out/release/tests</dir>
      </test_cases>
    2. For devices that support serial ports only, specify the NFS directory on the PC (host_dir) and the corresponding directory on the board (board_dir) between the NFS tags. For example:

      <NFS>
          <host_dir>D:\nfs</host_dir>
          <board_dir>user</board_dir>
      </NFS>
  4. (Optional) Prepare the test environment. If devices to be tested support only serial ports, check whether the environment is ready:

    • The system image and file system have been burnt into the development board and are running properly on it. For example, in system mode, if the device displays the OHOS# prompt when you log in through the shell, the system is running properly.
    • The development host has been connected to the serial port of the development board and the network port.
    • IP addresses of the development host and development board are in the same network segment and can ping each other.
    • An empty directory has been created on the development host for mounting test cases through NFS, and the NFS service has been started properly.
  5. Start the test platform and execute the test case.

    • Start the test framework, go to the test/developertest directory, and execute the startup script.

      1. Run the following command to start the test framework in Windows:

        start.bat
      2. Run the following command to start the test framework in Linux:

        ./start.sh
    • Select a device type.

      Configure the device type based on the development board in the configuration file, for example, developertest/config/framework_config.xml.

    • Run test commands.

      1. To query the subsystems, modules, product forms, and test types supported by the test cases, run the show commands.

        Usage: 
            show productlist      Query supported product forms
            show typelist         Query the supported test type
            show subsystemlist    Query supported subsystems
            show modulelist       Query supported modules
      2. Run test commands. -t is mandatory, and -ss and -tm are optional. The following is an example:

        run -t ut -ss subsystem_examples -tm calculator
      3. Specify the arguments to execute the test suite for a specific feature or module.

        usage: run [-h] [-p PRODUCTFORM] [-t [TESTTYPE [TESTTYPE ...]]]
            [-ss SUBSYSTEM] [-tm TESTMODULE] [-ts TESTSUIT]
            [-tc TESTCASE] [-tl TESTLEVEL] 
        
        Optional arguments:
            -h, --help            Show this help message and exit.
            -p PRODUCTFORM, --productform PRODUCTFORM    Specified product form
            -t [TESTTYPE [TESTTYPE ...]], --testtype [TESTTYPE [TESTTYPE ...]]
                Specify test type(UT,MST,ST,PERF,ALL)
            -ss SUBSYSTEM, --subsystem SUBSYSTEM    Specify test subsystem
            -tm TESTMODULE, --testmodule TESTMODULE    Specified test module
            -ts TESTSUIT, --testsuite TESTSUIT    Specify test suite
            -tc TESTCASE, --testcase TESTCASE    Specify test case
            -tl TESTLEVEL, --testlevel TESTLEVEL    Specify test level
    • View the test framework help if needed.

      Run the following command to query the test commands supported by the test platform:

      help
    • Exit the test platform.

      Run the following command to exit the test platform:

      quit
  6. View the test result and logs. The test logs and reports are generated in the developertest/reports directory after you run the test commands.

    • The test result is displayed on the console. The root path of the test result is as follows:

      reports/xxxx-xx-xx-xx-xx-xx
    • The test case formatting result is stored in the following directory:

      result/
    • The test logs are stored in the following directory:

      log/plan_log_xxxx-xx-xx-xx-xx-xx.log
    • The report summary file is as follows:

      summary_report.html
    • The report details file is as follows:

      details_report.html
    • The log directory of the test platform is as follows:

      reports/platform_log_xxxx-xx-xx-xx-xx-xx.log
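The device entries edited in step 2 are ordinary XML, so a configuration can be sanity-checked with Python's standard library before starting the platform. The wrapper element names in the sample below are assumptions made for this illustration; the real layout of user_config.xml may differ:

```python
import xml.etree.ElementTree as ET

# Sample fragment modeled on step 2; the <user_config>/<environment>
# wrapper names are assumptions made for this illustration only.
CONFIG = """\
<user_config>
  <environment>
    <device type="usb-hdc">
      <ip>192.168.1.1</ip>
      <port>9111</port>
      <sn></sn>
    </device>
  </environment>
</user_config>
"""

root = ET.fromstring(CONFIG)
device = root.find(".//device[@type='usb-hdc']")
ip, port = device.findtext("ip"), device.findtext("port")
print(f"hdc device at {ip}:{port}")
```

A check like this catches malformed XML or a missing device entry early, before the platform attempts to connect.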

Directory Structure

The source code of XDevice is stored in the test/xdevice directory. The following table describes the xdevice directory structure.

Table 4 XDevice structure

| Directory | Description |
| --- | --- |
| xdevice | Basic components of the test platform |
| xdevice/src/xdevice | Source code for the basic test framework |
| xdevice/config | Configuration file of the basic test framework |
| xdevice/src/xdevice/__main__.py | Internal entrance to the basic test framework |
| xdevice/src/xdevice/__init__.py | Package and plug-in dependencies |
| xdevice/src/xdevice/variables.py | Global variables |
| xdevice/src/xdevice/_core/command | Commands input by test cases |
| xdevice/src/xdevice/_core/config | Configuration management of the basic test framework |
| xdevice/src/xdevice/_core/environment | Environment management of the basic test framework, including device management |
| xdevice/src/xdevice/_core/executor | Scheduling and distribution of test cases |
| xdevice/src/xdevice/_core/driver | Test executor for the basic test framework |
| xdevice/src/xdevice/_core/resource | Resource files and test report templates for the basic test framework |
| xdevice/src/xdevice/_core/testkit | Common operations for the basic test framework, including NFS mounting |
| xdevice/src/xdevice/_core/logger.py | Log management of the basic test framework |
| xdevice/src/xdevice/_core/plugin.py | Plug-in management of the basic test framework |
| xdevice/src/xdevice/_core/interface.py | Interfaces for plug-ins of the basic test framework |
| xdevice/setup.py | Installation script of the basic test framework |
| xdevice/run.bat | Startup script of the basic test framework (Windows) |
| xdevice/run.sh | Startup script of the basic test framework (Linux) |

The source code of Developertest is stored in the test/developertest directory. The following table describes the developertest directory structure.

Table 5 Developertest structure

| Directory | Description |
| --- | --- |
| developertest | Development test framework |
| developertest/src | Test framework source code |
| developertest/src/core | Test executor |
| developertest/src/core/build | Test case compilation |
| developertest/src/core/command | Processing of command lines entered by users |
| developertest/src/core/config | Test framework configuration management |
| developertest/src/core/driver | Test framework driver executor |
| developertest/src/core/resource | Test framework configuration file |
| developertest/src/core/testcase | Test case management |
| developertest/src/core/common.py | Common operations on the test framework |
| developertest/src/core/constants.py | Global constants of the test framework |
| developertest/src/core/exception.py | Test framework exceptions |
| developertest/src/core/utils.py | Test framework tools and methods |
| developertest/src/main | Test framework platform |
| developertest/src/main/__main__.py | Internal entrance of the test framework |
| developertest/examples | Test framework demo cases |
| developertest/third_party | Third-party components |
| developertest/BUILD.gn | Compilation configuration of the subsystem |
| developertest/start.bat | Developer test entry (Windows) |
| developertest/start.sh | Developer test entry (Linux) |
