Twister¶
Twister ⎘ is a test runner bundled with Zephyr. It discovers test suites in the source tree, compiles them, runs them, and provides a summary.
As with the accompanying ZTest introduction, this document is intended as a quick overview. For more details, please refer to the official documentation ⎘.
Discovery¶
Twister searches for testcase.yaml and sample.yaml files in the specified test root directory.
The presence of such a file indicates that the directory contains some kind of testable code, be it a regular test suite
(e.g. a ZTest project) or an application whose console output can be checked for expected content.
The two names are interchangeable; Twister does not care whether the file is named testcase.yaml or sample.yaml.
testcase.yaml¶
The demo project contains one of the simplest possible testcase.yaml files:
```yaml
tests:
  demo.sim:
    platform_allow:
      - native_posix
      - native_sim
    harness: ztest
```
This file specifies a single test scenario, named demo.sim, which is marked compatible with the native_posix
and native_sim platforms. Because the scenario has the ztest harness configured, Twister will look for and parse
the test suite summary on the console output.
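Scenarios can carry further metadata from Twister's testcase schema; two commonly useful keys are `tags` (for filtering runs) and `timeout` (to bound execution time). A hedged sketch — the scenario name `demo.tagged` is made up for illustration, and the keys are taken from the upstream Twister schema:

```yaml
tests:
  demo.tagged:
    platform_allow:
      - native_sim
    harness: ztest
    tags:          # select scenarios by tag with `west twister --tag demo`
      - demo
    timeout: 30    # seconds before the run is considered hung
```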
Harnesses¶
The specified harness tells Twister how to parse and interpret the test results. See Twister's documentation on Harnesses ⎘ for a complete list of available harnesses and their configuration options.
ZTest¶
The ZTest harness can parse the summary block between the ------ TESTSUITE SUMMARY START ------ and
------ TESTSUITE SUMMARY END ------ markers, which ZTest prints to the console after a test suite has been run.
This is what you would normally use for a ZTest project.
Console¶
The Console harness can parse the output of the application under test, which itself does not need to be a ZTest suite. Any application that generates console output can be tested with this harness.
Unlike with a ZTest suite, the expected results must be specified in the testcase.yaml (or sample.yaml) file,
as the application does not necessarily perform these assertions itself.
The BMS self test application is an example of this: the application is intended to be run by a human as part of the quality assurance process, but it can also be automated with Twister's console harness.
```yaml
# abbreviated testcase.yaml
tests:
  bms_selftest.disconnected:
    platform_allow:
      - bms@D.1
      - bms@D.2
    harness: console
    harness_config:
      fixture: disconnected
      type: one_line
      regex:
        - "Test Summary: T1: PASS, T2: PASS, T3: SKIP"
```
For the bms_selftest.disconnected scenario, Twister's console harness will search for the test summary regex.
If a line from the console matches this regex (Python re.search() ⎘, not re.match() ⎘),
the test is considered to have passed.
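The distinction matters because console output usually carries a log prefix before the interesting part. A minimal illustration of the `re.search()` vs. `re.match()` behavior — the log line below is made up for this example:

```python
import re

# The console harness pattern from the testcase.yaml above.
pattern = r"Test Summary: T1: PASS, T2: PASS, T3: SKIP"

# A hypothetical console line with a log prefix in front of the summary.
line = "[00:00:04.212] <inf> selftest: Test Summary: T1: PASS, T2: PASS, T3: SKIP"

# re.search() scans the whole line, so the prefix does not matter ...
assert re.search(pattern, line) is not None

# ... whereas re.match() only matches at the start of the string.
assert re.match(pattern, line) is None
```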
As always, see the upstream documentation ⎘ for more details.
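For applications that print several lines of interest, the console harness also supports a `multi_line` type. A hedged sketch, using option names from the upstream harness documentation (the scenario name and patterns are made up):

```yaml
# hypothetical testcase.yaml snippet
tests:
  bms_selftest.multiline:
    harness: console
    harness_config:
      type: multi_line
      ordered: true   # the lines must appear in the given order
      regex:
        - "T1: PASS"
        - "T2: PASS"
```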
Running Tests¶
Twister can be invoked via west twister, or by running the $ZEPHYR_BASE/scripts/twister script directly.
To run all tests, letting Twister select a default platform per test scenario, run
(assuming you are in the project root directory):
```shell
$ west twister -v --testsuite-root .
INFO - Using Ninja..
INFO - Zephyr version: v3.6.0+starcopter-04
INFO - Using 'zephyr' toolchain.
INFO - Selecting default platforms per test case
INFO - Building initial testsuite list...
INFO - Writing JSON report twister-out/testplan.json
INFO - JOBS: 4
INFO - Adding tasks to the queue...
INFO - Added initial list of jobs to queue
INFO - 2/36 bms@D.2 applications/bms_selftest/bms_selftest.connected PASSED (build)
INFO - 3/36 bms@D.1 applications/bms_selftest/bms_selftest.connected PASSED (build)
[... more output ...]
INFO - 11/36 native_sim tests/demo/demo.sim PASSED (native 0.005s)
INFO - 12/36 native_sim tests/state_machine_test/state_machine.sim PASSED (native 0.400s)
[... more output ...]
INFO - 35/36 bms@D.2 tests/drivers/bq76925/drivers.bq76925 PASSED (build)
INFO - 36/36 bms@D.1 tests/drivers/bq76925/drivers.bq76925 PASSED (build)
INFO - 21 test scenarios (36 test instances) selected, 1 configurations skipped (1 by static filter, 0 at runtime).
INFO - 35 of 36 test configurations passed (100.00%), 0 failed, 0 errored, 1 skipped with 0 warnings in 562.03 seconds
INFO - In total 184 test cases were executed, 10 skipped on 4 out of total 687 platforms (0.58%)
INFO - 4 test configurations executed on platforms, 31 test configurations were only built.
INFO - Saving reports...
INFO - Writing JSON report twister-out/twister.json
INFO - Writing xunit report twister-out/twister.xml...
INFO - Writing xunit report twister-out/twister_report.xml...
INFO - Run completed
```
This will run all tests that can be run in simulation, and build all other tests which cannot be run on native_sim.
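During development it is often useful to narrow the run. Twister's `-p`/`--platform` and `-T`/`--testsuite-root` options filter by platform and by directory; for example, to build and run only the demo suite on native_sim (path taken from the output above):

```shell
west twister -v -p native_sim -T tests/demo
```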
Testing on Hardware¶
[!NOTE] If you want to manually execute tests on a single device, look at the upstream docs ⎘. If you want to execute tests on multiple devices, read on.
With some preparation, you can automatically execute tests on multiple devices. Twister will take care of compiling applications, flashing the devices, monitoring console output, and collecting results.
Hardware Map File¶
Twister requires a hardware map file to know which devices are available. A template file can be generated by Twister itself:
```shell
$ west twister --generate-hardware-map hardware-map.yaml
INFO - Using Ninja..
INFO - Zephyr version: v3.6.0+starcopter-04
INFO - Using 'zephyr' toolchain.
INFO - Scanning connected hardware...
[...]
INFO - Detected devices:
| Platform   | ID                       | Serial device   |
|------------|--------------------------|-----------------|
| unknown    | 004800223232511739353236 | /dev/ttyACM0    |
| unknown    | 003C00443232511939353236 | /dev/ttyACM1    |
```
This will create a basic hardware-map.yaml file in the current directory. It will not be usable just yet,
but it serves well as a starting point:
```yaml
# hardware-map.yaml
- connected: true
  id: 004800223232511739353236
  platform: unknown
  product: STLINK-V3
  runner: openocd
  serial: /dev/ttyACM0
- connected: true
  id: 003C00443232511939353236
  platform: unknown
  product: STLINK-V3
  runner: openocd
  serial: /dev/ttyACM1
```
This file can be modified to configure each entry as needed:

- specify the actual platform
- configure the runner (we usually use pyocd)
- specify non-default serial baud rates (the BMS currently uses 1 Mbit/s)
- declare fixtures (more on this below)
As an example, here is the hardware map file Lasse currently uses:
```yaml
- baud: 1000000
  connected: true
  fixtures:
    - passive_cell_voltage_divider
  flash_timeout: 10
  id: 003C00443232511939353236
  platform: bms@D.1
  product: STLINK-V3
  runner: stm32cubeprogrammer
  runner_params:
    - --tool-opt=-quietMode
  serial: /dev/serial/by-id/usb-STMicroelectronics_STLINK-V3_003C00443232511939353236-if02
- baud: 1000000
  connected: true
  fixtures:
    - disconnected
  flash_timeout: 10
  id: 004800223232511739353236
  platform: bms@D.2
  product: STLINK-V3
  runner: stm32cubeprogrammer
  runner_params:
    - --tool-opt=-quietMode
  serial: /dev/serial/by-id/usb-STMicroelectronics_STLINK-V3_004800223232511739353236-if02
```
Running Tests on Hardware¶
With the hardware map set up, running the tests is simple:
```shell
$ west twister -v --device-testing --hardware-map path/to/hardware-map.yaml --testsuite-root .
INFO - Using Ninja..
INFO - Zephyr version: v3.6.0+starcopter-04
INFO - Using 'zephyr' toolchain.
INFO - Building initial testsuite list...
INFO - Writing JSON report twister-out/testplan.json
Device testing on:
| Platform   | ID                       | Serial device                                                                    |
|------------|--------------------------|----------------------------------------------------------------------------------|
| bms@D.1    | 003C00443232511939353236 | /dev/serial/by-id/usb-STMicroelectronics_STLINK-V3_003C00443232511939353236-if02 |
| bms@D.2    | 004800223232511739353236 | /dev/serial/by-id/usb-STMicroelectronics_STLINK-V3_004800223232511739353236-if02 |
INFO - JOBS: 4
INFO - Adding tasks to the queue...
INFO - Added initial list of jobs to queue
INFO - 19/42 bms@D.1 samples/drivers/bq76925/sample.bq76925 PASSED (device: 003C00443232511939353236, 2.178s)
INFO - 20/42 bms@D.2 samples/drivers/bq76925/sample.bq76925 PASSED (device: 004800223232511739353236, 2.160s)
INFO - 21/42 bms@D.2 samples/adc/sample.adc PASSED (device: 004800223232511739353236, 2.699s)
INFO - 22/42 bms@D.2 applications/bms_selftest/bms_selftest.disconnected PASSED (device: 004800223232511739353236, 11.572s)
INFO - 23/42 bms@D.2 tests/demo/demo.sim PASSED (device: 004800223232511739353236, 2.027s)
INFO - 24/42 bms@D.1 samples/adc/sample.adc PASSED (device: 003C00443232511939353236, 1.696s)
INFO - 25/42 bms@D.1 tests/demo/demo.sim PASSED (device: 003C00443232511939353236, 1.743s)
INFO - 26/42 bms@D.2 tests/hs_ok_test/hs_ok_test.target PASSED (device: 004800223232511739353236, 3.980s)
INFO - 27/42 bms@D.2 tests/drivers/scp/drivers.scp PASSED (device: 004800223232511739353236, 3.047s)
INFO - 28/42 bms@D.2 tests/drivers/comp/drivers.comp.vi_slow PASSED (device: 004800223232511739353236, 4.414s)
INFO - 29/42 bms@D.1 tests/drivers/comp/drivers.comp.vi_slow PASSED (device: 003C00443232511939353236, 4.408s)
INFO - 30/42 bms@D.2 tests/drivers/comp/drivers.comp.vi_fast PASSED (device: 004800223232511739353236, 3.275s)
INFO - 31/42 bms@D.1 tests/drivers/comp/drivers.comp.vi_fast PASSED (device: 003C00443232511939353236, 3.465s)
INFO - 32/42 bms@D.2 tests/drivers/comp/drivers.comp.70mV_hysteresis PASSED (device: 004800223232511739353236, 2.808s)
INFO - 33/42 bms@D.1 tests/drivers/comp/drivers.comp.70mV_hysteresis PASSED (device: 003C00443232511939353236, 3.082s)
INFO - 34/42 bms@D.2 tests/drivers/comp/drivers.comp.10mV_hysteresis PASSED (device: 004800223232511739353236, 3.326s)
INFO - 35/42 bms@D.1 tests/drivers/comp/drivers.comp.10mV_hysteresis PASSED (device: 003C00443232511939353236, 3.320s)
INFO - 36/42 bms@D.2 tests/drivers/comp/drivers.comp.no_hysteresis PASSED (device: 004800223232511739353236, 2.433s)
INFO - 37/42 bms@D.1 tests/drivers/comp/drivers.comp.no_hysteresis PASSED (device: 003C00443232511939353236, 3.236s)
INFO - 38/42 bms@D.2 tests/drivers/comp/drivers.comp PASSED (device: 004800223232511739353236, 3.507s)
INFO - 39/42 bms@D.1 tests/drivers/comp/drivers.comp PASSED (device: 003C00443232511939353236, 4.467s)
INFO - 40/42 bms@D.1 tests/drivers/bq76925/drivers.bq76925.cell_voltage_passive PASSED (device: 003C00443232511939353236, 3.936s)
INFO - 41/42 bms@D.2 tests/drivers/bq76925/drivers.bq76925 PASSED (device: 004800223232511739353236, 4.104s)
INFO - 42/42 bms@D.1 tests/drivers/bq76925/drivers.bq76925 PASSED (device: 003C00443232511939353236, 2.979s)
INFO - 21 test scenarios (42 test instances) selected, 18 configurations skipped (18 by static filter, 0 at runtime).
INFO - 24 of 42 test configurations passed (100.00%), 0 failed, 0 errored, 18 skipped with 0 warnings in 218.68 seconds
INFO - In total 95 test cases were executed, 141 skipped on 2 out of total 687 platforms (0.29%)
INFO - 24 test configurations executed on platforms, 0 test configurations were only built.
Hardware distribution summary:
| Board   | ID                       | Counter   |
|---------|--------------------------|-----------|
| bms@D.1 | 003C00443232511939353236 | 11        |
| bms@D.2 | 004800223232511739353236 | 13        |
INFO - Saving reports...
INFO - Writing JSON report twister-out/twister.json
INFO - Writing xunit report twister-out/twister.xml...
INFO - Writing xunit report twister-out/twister_report.xml...
INFO - Run completed
```
Fixtures¶
Some tests require a specific test environment. To measure cell voltages, for example, the BMS must have real cells or a cell voltage simulator connected. For other tests it may be important that the BMS is not connected to any load, so that its power output can be tested safely; to interact with the CAN network, both the device under test and the testing host must be attached to the same bus. These physical hardware environment requirements can be specified as Fixtures ⎘.
Fixtures are referenced in two locations: in the test scenario definition (testcase.yaml), and in the hardware map file.
The test scenario can specify which fixtures are required, and the hardware map can specify which devices have which fixtures.
[!NOTE] A device can have multiple fixtures at once, but a test scenario can only require a single fixture.
The self test application, for example, has a test scenario which requires the BMS to be disconnected from any load:
```yaml
# partial testcase.yaml
tests:
  bms_selftest.disconnected:
    platform_allow: bms@D.2
    harness_config:
      fixture: disconnected  # <-- this scenario requires the 'disconnected' fixture
```
This is matched by the hardware map file:
```yaml
# partial hardware-map.yaml
- connected: true  # <-- this only means the device itself is connected and available for testing
  fixtures:
    - disconnected                  # <-- this device has the 'disconnected' fixture
    - some_other_unrelated_fixture  # <-- devices can have multiple fixtures
  id: 004800223232511739353236
  platform: bms@D.2
```
From Twister's perspective, fixtures are just arbitrary labels. It is up to us to decide which fixtures to define, how to name them, and to make sure the hardware map file assigns the correct fixtures to the correct devices.
[!NOTE] The fixtures available at starcopter should be listed in the fixtures/ directory.