Diffstat (limited to 'documentation/test-manual')
-rw-r--r--  documentation/test-manual/index.rst                    |   2
-rw-r--r--  documentation/test-manual/intro.rst                    |  22
-rw-r--r--  documentation/test-manual/ptest.rst                    | 135
-rw-r--r--  documentation/test-manual/reproducible-builds.rst      |  60
-rw-r--r--  documentation/test-manual/runtime-testing.rst          | 595
-rw-r--r--  documentation/test-manual/test-process.rst             |   2
-rw-r--r--  documentation/test-manual/understand-autobuilder.rst   |   2
7 files changed, 790 insertions(+), 28 deletions(-)
diff --git a/documentation/test-manual/index.rst b/documentation/test-manual/index.rst
index 86a2f436ea..d365d337ea 100644
--- a/documentation/test-manual/index.rst
+++ b/documentation/test-manual/index.rst
@@ -12,6 +12,8 @@ Yocto Project Test Environment Manual
 
    intro
    test-process
+   ptest
+   runtime-testing
    understand-autobuilder
    reproducible-builds
    yocto-project-compatible
diff --git a/documentation/test-manual/intro.rst b/documentation/test-manual/intro.rst
index c31fd11c7a..d55540c8df 100644
--- a/documentation/test-manual/intro.rst
+++ b/documentation/test-manual/intro.rst
@@ -51,13 +51,11 @@ fashion. Basically, during the development of a Yocto Project release,
 the Autobuilder tests if things work. The Autobuilder builds all test
 targets and runs all the tests.
 
-The Yocto Project uses now uses standard upstream
-Buildbot (`version 3.8 <https://docs.buildbot.net/3.8.0/>`__) to
-drive its integration and testing. Buildbot has a plug-in interface
-that the Yocto Project customizes using code from the
-``yocto-autobuilder2`` repository, adding its own console UI plugin. The
-resulting UI plug-in allows you to visualize builds in a way suited to
-the project's needs.
+The Yocto Project uses standard upstream Buildbot to drive its integration and
+testing. Buildbot has a plug-in interface that the Yocto Project customizes
+using code from the :yocto_git:`yocto-autobuilder2 </yocto-autobuilder2>`
+repository, adding its own console UI plugin. The resulting UI plug-in allows
+you to visualize builds in a way suited to the project's needs.
 
 A ``helper`` layer provides configuration and job management through
 scripts found in the ``yocto-autobuilder-helper`` repository. The
@@ -130,7 +128,9 @@ the following types of tests:
       $ bitbake image -c testimage
 
    The tests use the :ref:`ref-classes-testimage`
-   class and the :ref:`ref-tasks-testimage` task.
+   class and the :ref:`ref-tasks-testimage` task. See the
+   :ref:`test-manual/runtime-testing:Performing Automated Runtime Testing`
+   section of the Yocto Project Test Environment Manual for more information.
 
 - *Layer Testing:* The Autobuilder has the possibility to test whether
   specific layers work with the test of the system. The layers tested
@@ -140,7 +140,7 @@ the following types of tests:
 - *Package Testing:* A Package Test (ptest) runs tests against packages
   built by the OpenEmbedded build system on the target machine. See the
   :ref:`Testing Packages With
-  ptest <dev-manual/packages:Testing Packages With ptest>` section
+  ptest <test-manual/ptest:Testing Packages With ptest>` section
   in the Yocto Project Development Tasks Manual and the
   ":yocto_wiki:`Ptest </Ptest>`" Wiki page for more
   information on Ptest.
@@ -380,7 +380,7 @@ with common tasks, including:
 - *Running a bitbake invocation for a build:* Use
   ``oeqa.utils.commands.bitbake()``
 
-- *Running a command:* Use ``oeqa.utils.commandsrunCmd()``
+- *Running a command:* Use ``oeqa.utils.commands.runCmd()``
 
 There is also a ``oeqa.utils.commands.runqemu()`` function for launching
 the ``runqemu`` command for testing things within a running, virtualized
@@ -458,7 +458,7 @@ the ``devtool build`` command within the eSDK.
 
 These tests are run against built SDKs. The tests can assume that an SDK
 has already been extracted and its environment file has been sourced. A
-simple example from ``meta/lib/oeqa/sdk/cases/python2.py`` contains the
+simple example from ``meta/lib/oeqa/sdk/cases/python.py`` contains the
 following::
 
     class Python3Test(OESDKTestCase):
diff --git a/documentation/test-manual/ptest.rst b/documentation/test-manual/ptest.rst
new file mode 100644
index 0000000000..4e6be35df5
--- /dev/null
+++ b/documentation/test-manual/ptest.rst
@@ -0,0 +1,135 @@
.. SPDX-License-Identifier: CC-BY-SA-2.0-UK

***************************
Testing Packages With ptest
***************************

A Package Test (ptest) runs tests against packages built by the
OpenEmbedded build system on the target machine. A ptest contains at
least two items: the actual test, and a shell script (``run-ptest``)
that starts the test. The shell script that starts the test must not
contain the actual test --- the script only starts the test. On the other
hand, the test can be anything from a simple shell script that runs a
binary and checks the output to an elaborate system of test binaries and
data files.

The test generates output in the format used by Automake::

   result: testname

where the result can be ``PASS``, ``FAIL``, or ``SKIP``, and
the testname can be any identifying string.
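
For example, the output of a ``run-ptest`` script for a hypothetical
package might contain lines such as the following (the test names are
purely illustrative)::

   PASS: test_open_file
   FAIL: test_write_file
   SKIP: test_network_access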

For a list of Yocto Project recipes that are already enabled with ptest,
see the :yocto_wiki:`Ptest </Ptest>` wiki page.

.. note::

   A recipe is "ptest-enabled" if it inherits the :ref:`ref-classes-ptest`
   class.

Adding ptest to Your Build
==========================

To add package testing to your build, add the :term:`DISTRO_FEATURES` and
:term:`EXTRA_IMAGE_FEATURES` variables to your ``local.conf`` file, which
is found in the :term:`Build Directory`::

   DISTRO_FEATURES:append = " ptest"
   EXTRA_IMAGE_FEATURES += "ptest-pkgs"

Once your build is complete, the ptest files are installed into the
``/usr/lib/package/ptest`` directory within the image, where ``package``
is the name of the package.
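
Once the image is running, a quick way to confirm which ptests were
installed is to list the ``run-ptest`` scripts on the target. The package
names shown below are only illustrative::

   # ls /usr/lib/*/ptest/run-ptest
   /usr/lib/openssl/ptest/run-ptest
   /usr/lib/zlib/ptest/run-ptest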

Running ptest
=============

The ``ptest-runner`` package installs a shell script that loops through
all installed ptest test suites and runs them in sequence.

During execution, ``ptest-runner`` keeps count of the total number of
ptests and of the failed ones. At the end of the execution, a summary is
written to the console. If any of the ``run-ptest`` scripts fail,
``ptest-runner`` returns '1'.

You might therefore want to add ``ptest-runner`` to your image.
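
For example, you could install the runner through your ``local.conf``::

   IMAGE_INSTALL:append = " ptest-runner"

Then, on the target, run every installed test suite in sequence, or just
the suite of a single package (``zlib`` is only an illustrative name)::

   # ptest-runner
   # ptest-runner zlib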


Getting Your Package Ready
==========================

In order to enable a recipe to run installed ptests on target hardware,
you need to prepare the recipes that build the packages you want to
test. Here is what you have to do for each recipe (a combined example
sketch follows this list):

- *Be sure the recipe inherits the* :ref:`ref-classes-ptest` *class:*
  Include the following line in each recipe::

     inherit ptest

  .. note::

     Classes for common frameworks already exist in :term:`OpenEmbedded-Core
     (OE-Core)`, such as:

     - :oe_git:`go-ptest </openembedded-core/tree/meta/classes-recipe/go-ptest.bbclass>`
     - :ref:`ref-classes-ptest-cargo`
     - :ref:`ref-classes-ptest-gnome`
     - :oe_git:`ptest-perl </openembedded-core/tree/meta/classes-recipe/ptest-perl.bbclass>`
     - :oe_git:`ptest-python-pytest </openembedded-core/tree/meta/classes-recipe/ptest-python-pytest.bbclass>`

  Inheriting these classes with the ``inherit`` keyword in your recipe will
  make the next steps automatic.

- *Create run-ptest:* This script starts your test. Locate the
  script where you will refer to it using
  :term:`SRC_URI`. Be sure ``run-ptest`` exits with 0 so the test is
  marked as successfully executed; otherwise, it is marked as failed.
  Here is an example that starts a test for ``dbus``::

     #!/bin/sh
     cd test
     make -k runtest-TESTS

- *Ensure dependencies are met:* If the test adds build or runtime
  dependencies that normally do not exist for the package (such as
  requiring "make" to run the test suite), use the
  :term:`DEPENDS` and
  :term:`RDEPENDS` variables in
  your recipe in order for the package to meet the dependencies. Here
  is an example where the package has a runtime dependency on "make"::

     RDEPENDS:${PN}-ptest += "make"

- *Add a function to build the test suite:* Not many packages support
  cross-compilation of their test suites. Consequently, you usually
  need to add a cross-compilation function to the package.

  Many packages based on Automake compile and run the test suite by
  using a single command such as ``make check``. However, the host
  ``make check`` builds and runs on the same computer, while
  cross-compiling requires that the package is built on the host but
  executed for the target architecture (though often, as in the case
  for ptest, the execution occurs on the target). The built version of
  Automake that ships with the Yocto Project includes a patch that
  separates building and execution. Consequently, packages that use the
  unaltered, patched version of ``make check`` automatically
  cross-compile.

  Regardless, you still must add a ``do_compile_ptest`` function to
  build the test suite. Add a function similar to the following to your
  recipe::

     do_compile_ptest() {
        oe_runmake buildtest-TESTS
     }

- *Ensure special configurations are set:* If the package requires
  special configurations prior to compiling the test code, you must
  insert a ``do_configure_ptest`` function into the recipe.

- *Install the test suite:* The :ref:`ref-classes-ptest` class
  automatically copies the file ``run-ptest`` to the target and then runs
  ``make install-ptest`` to install the tests. If this is not enough, you
  need to create a ``do_install_ptest`` function and make sure it gets
  called after the "make install-ptest" completes.
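
Putting the steps above together, the ptest-related fragments of a recipe
might look like the following. This is a minimal sketch for a hypothetical
Automake-based package, not a drop-in addition for any particular recipe::

   SRC_URI += "file://run-ptest"

   inherit ptest

   # The test suite is driven by make on the target.
   RDEPENDS:${PN}-ptest += "make"

   do_compile_ptest() {
       oe_runmake buildtest-TESTS
   }

   do_install_ptest() {
       # Copy any extra data files the tests expect at runtime
       # (the "test/data" directory here is hypothetical).
       install -d ${D}${PTEST_PATH}/data
       install -m 0644 ${S}/test/data/* ${D}${PTEST_PATH}/data/
   }
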
diff --git a/documentation/test-manual/reproducible-builds.rst b/documentation/test-manual/reproducible-builds.rst
index 91f94a5c74..b913aa4eaf 100644
--- a/documentation/test-manual/reproducible-builds.rst
+++ b/documentation/test-manual/reproducible-builds.rst
@@ -91,13 +91,21 @@ run::
 
    oe-selftest -r reproducible.ReproducibleTests.test_reproducible_builds
 
-This defaults to including a ``world`` build so, if other layers are added, it would
-also run the tests for recipes in the additional layers. Different build targets
-can be defined using the :term:`OEQA_REPRODUCIBLE_TEST_TARGET` variable in ``local.conf``.
-The first build will be run using :ref:`Shared State <overview-manual/concepts:Shared State>` if
-available, the second build explicitly disables
-:ref:`Shared State <overview-manual/concepts:Shared State>` except for recipes defined in
-the :term:`OEQA_REPRODUCIBLE_TEST_SSTATE_TARGETS` variable, and builds on the
+This defaults to including a ``world`` build so, if other layers are added, it
+would also run the tests for recipes in the additional layers. Different build
+targets can be defined using the :term:`OEQA_REPRODUCIBLE_TEST_TARGET` variable
+in ``local.conf``. For example, running reproducibility tests for only the
+``python3-numpy`` recipe can be done by setting::
+
+   OEQA_REPRODUCIBLE_TEST_TARGET = "python3-numpy"
+
+in ``local.conf`` before running the ``oe-selftest`` command shown above.
+
+Reproducibility builds the target list twice. The first build will be run using
+:ref:`Shared State <overview-manual/concepts:Shared State>` if available, the
+second build explicitly disables :ref:`Shared State
+<overview-manual/concepts:Shared State>` except for recipes defined in the
+:term:`OEQA_REPRODUCIBLE_TEST_SSTATE_TARGETS` variable, and builds on the
 specific host the build is running on. This means we can test reproducibility
 builds between different host distributions over time on the Autobuilder.
 
@@ -111,16 +119,15 @@ https://autobuilder.yocto.io/pub/repro-fail/ in the form ``oe-reproducible +
 The project's current reproducibility status can be seen at
 :yocto_home:`/reproducible-build-results/`
 
-You can also check the reproducibility status on supported host distributions:
-
-- CentOS: :yocto_ab:`/typhoon/#/builders/reproducible-centos`
-- Debian: :yocto_ab:`/typhoon/#/builders/reproducible-debian`
-- Fedora: :yocto_ab:`/typhoon/#/builders/reproducible-fedora`
-- Ubuntu: :yocto_ab:`/typhoon/#/builders/reproducible-ubuntu`
-
-===============================
-Can I test my layer or recipes?
-===============================
-
+You can also check the reproducibility status on the Autobuilder:
+:yocto_ab:`/valkyrie/#/builders/reproducible`.
+
+===================================
+How can I test my layer or recipes?
+===================================
+
+With world build
+~~~~~~~~~~~~~~~~
+
 Once again, you can run a ``world`` test using the
 :ref:`oe-selftest <ref-manual/release-process:Testing and Quality Assurance>`
@@ -128,6 +135,9 @@ command provided above. This functionality is implemented
 in :oe_git:`meta/lib/oeqa/selftest/cases/reproducible.py
 </openembedded-core/tree/meta/lib/oeqa/selftest/cases/reproducible.py>`.
 
+Subclassing the test
+~~~~~~~~~~~~~~~~~~~~
+
 You could subclass the test and change ``targets`` to a different target.
 
 You may also change ``sstate_targets`` which would allow you to "pre-cache" some
@@ -135,3 +145,23 @@ set of recipes before the test, meaning they are excluded from reproducibility
 testing. As a practical example, you could set ``sstate_targets`` to
 ``core-image-sato``, then setting ``targets`` to ``core-image-sato-sdk`` would
 run reproducibility tests only on the targets belonging only to ``core-image-sato-sdk``.
+
+Using :term:`OEQA_REPRODUCIBLE_TEST_* <OEQA_REPRODUCIBLE_TEST_LEAF_TARGETS>` variables
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+If you want to test the reproducibility of a set of recipes, you can define
+:term:`OEQA_REPRODUCIBLE_TEST_LEAF_TARGETS` in your ``local.conf``::
+
+   OEQA_REPRODUCIBLE_TEST_LEAF_TARGETS = "my-recipe"
+
+This will test the reproducibility of ``my-recipe`` but will use the
+:ref:`Shared State <overview-manual/concepts:Shared State>` for most of its
+dependencies (i.e. the ones explicitly listed in :term:`DEPENDS`, which may not
+be all dependencies, cf. ``[depends]`` varflags, ``PACKAGE_DEPENDS`` and other
+implementations).
+
+You can have finer control over the test with:
+
+- :term:`OEQA_REPRODUCIBLE_TEST_TARGET`: lists recipes to be built,
+- :term:`OEQA_REPRODUCIBLE_TEST_SSTATE_TARGETS`: lists recipes that will
+  be built using :ref:`Shared State <overview-manual/concepts:Shared State>`.
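
As an illustration of the subclassing approach mentioned above, a test in
your own layer could look something like the following sketch. The layer
path and class name are hypothetical; ``targets`` and ``sstate_targets``
are the attributes discussed above::

   # meta-mylayer/lib/oeqa/selftest/cases/mylayer_repro.py
   from oeqa.selftest.cases.reproducible import ReproducibleTests

   class MyLayerReproducibleTests(ReproducibleTests):
       # Build core-image-sato from Shared State first, then check only
       # the additional targets for reproducibility.
       sstate_targets = ['core-image-sato']
       targets = ['core-image-sato-sdk']

You could then run it with ``oe-selftest -r
mylayer_repro.MyLayerReproducibleTests.test_reproducible_builds``.
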
diff --git a/documentation/test-manual/runtime-testing.rst b/documentation/test-manual/runtime-testing.rst
new file mode 100644
index 0000000000..557e0530b0
--- /dev/null
+++ b/documentation/test-manual/runtime-testing.rst
@@ -0,0 +1,595 @@
.. SPDX-License-Identifier: CC-BY-SA-2.0-UK

************************************
Performing Automated Runtime Testing
************************************

The OpenEmbedded build system makes available a series of automated
tests for images to verify runtime functionality. You can run these
tests on either QEMU or actual target hardware. Tests are written in
Python making use of the ``unittest`` module, and the majority of them
run commands on the target system over SSH. This section describes how
you set up the environment to use these tests, run available tests, and
write and add your own tests.

For information on the test and QA infrastructure available within the
Yocto Project, see the ":ref:`ref-manual/release-process:testing and quality assurance`"
section in the Yocto Project Reference Manual.

Enabling Tests
==============

Depending on whether you are planning to run tests using QEMU or on the
hardware, you have to take different steps to enable the tests. See the
following subsections for information on how to enable both types of
tests.

Enabling Runtime Tests on QEMU
------------------------------

In order to run tests, you need to do the following:

- *Set up to avoid interaction with sudo for networking:* To
  accomplish this, you must do one of the following:

  - Add ``NOPASSWD`` for your user in ``/etc/sudoers`` either for all
    commands or just for ``runqemu-ifup``. You must provide the full
    path as that can change if you are using multiple clones of the
    source repository (see the example ``sudoers`` entry after this
    list).

    .. note::

       On some distributions, you also need to comment out "Defaults
       requiretty" in ``/etc/sudoers``.

  - Manually configure a tap interface for your system.

  - Run as root the script in ``scripts/runqemu-gen-tapdevs``, which
    should generate a list of tap devices. This is the option
    typically chosen for Autobuilder-type environments.

    .. note::

       - Be sure to use an absolute path when calling this script
         with sudo.

       - Ensure that your host has the package ``iptables`` installed.

       - The package recipe ``qemu-helper-native`` is required to run
         this script. Build the package using the following command::

            $ bitbake qemu-helper-native

- *Set the DISPLAY variable:* You need to set this variable so that
  you have an X server available (e.g. start ``vncserver`` for a
  headless machine).

- *Be sure your host's firewall accepts incoming connections from
  192.168.7.0/24:* Some of the tests (in particular DNF tests) start an
  HTTP server on a random high number port, which is used to serve
  files to the target. The DNF module serves
  ``${WORKDIR}/oe-rootfs-repo`` so it can run DNF channel commands.
  That means your host's firewall must accept incoming connections from
  192.168.7.0/24, which is the default IP range used for tap devices by
  ``runqemu``.

- *Be sure your host has the correct packages installed:* Depending on
  your host's distribution, you need to have the following packages
  installed:

  - Ubuntu and Debian: ``sysstat`` and ``iproute2``

  - openSUSE: ``sysstat`` and ``iproute2``

  - Fedora: ``sysstat`` and ``iproute``

  - CentOS: ``sysstat`` and ``iproute``
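
As an example of the ``NOPASSWD`` option above, a ``sudoers`` entry
allowing ``runqemu-ifup`` and ``runqemu-ifdown`` to run without a password
could look like the following. The user name and the absolute paths are
illustrative and depend on where your source repository is checked out::

   myuser ALL = NOPASSWD: /home/myuser/poky/scripts/runqemu-ifup
   myuser ALL = NOPASSWD: /home/myuser/poky/scripts/runqemu-ifdown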

Once you start running the tests, the following happens:

#. A copy of the root filesystem is written to ``${WORKDIR}/testimage``.

#. The image is booted under QEMU using the standard ``runqemu`` script.

#. A default timeout of 500 seconds occurs to allow for the boot process
   to reach the login prompt. You can change the timeout period by
   setting
   :term:`TEST_QEMUBOOT_TIMEOUT`
   in the ``local.conf`` file.

#. Once the boot process is reached and the login prompt appears, the
   tests run. The full boot log is written to
   ``${WORKDIR}/testimage/qemu_boot_log``.

#. Each test module loads in the order found in :term:`TEST_SUITES`. You can
   find the full output of the commands run over SSH in
   ``${WORKDIR}/testimage/ssh_target_log``.

#. If no failures occur, the task running the tests ends successfully.
   You can find the output from the ``unittest`` in the task log at
   ``${WORKDIR}/temp/log.do_testimage`` (see below for a quick way of
   locating ``${WORKDIR}``).
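
If you are not sure where ``${WORKDIR}`` resolves to for a given image,
you can ask BitBake, for example for ``core-image-sato``::

   $ bitbake -e core-image-sato | grep "^WORKDIR="

and then look for ``testimage/qemu_boot_log``, ``testimage/ssh_target_log``
and ``temp/log.do_testimage`` under the directory that is printed.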

Enabling Runtime Tests on Hardware
----------------------------------

The OpenEmbedded build system can run tests on real hardware, and for
certain devices it can also deploy the image to be tested onto the
device beforehand.

For automated deployment, a "controller image" is installed onto the
hardware once as part of setup. Then, each time tests are to be run, the
following occurs:

#. The controller image is booted into and used to write the image to be
   tested to a second partition.

#. The device is then rebooted using an external script that you need to
   provide.

#. The device boots into the image to be tested.

When running tests (independent of whether the image has been deployed
automatically or not), the device is expected to be connected to a
network on a pre-determined IP address. You can either use static IP
addresses written into the image, or set the image to use DHCP and have
your DHCP server on the test network assign a known IP address based on
the MAC address of the device.

In order to run tests on hardware, you need to set :term:`TEST_TARGET` to an
appropriate value. For QEMU, you do not have to change anything, the
default value is "qemu". For running tests on hardware, the following
options are available:

- *"simpleremote":* Choose "simpleremote" if you are going to run tests
  on a target system that is already running the image to be tested and
  is available on the network. You can use "simpleremote" in
  conjunction with either real hardware or an image running within a
  separately started QEMU or any other virtual machine manager (a
  minimal ``local.conf`` example for this case follows this list).

- *"SystemdbootTarget":* Choose "SystemdbootTarget" if your hardware is
  an EFI-based machine with ``systemd-boot`` as bootloader and
  ``core-image-testmaster`` (or something similar) is installed. Also,
  your hardware under test must be in a DHCP-enabled network that gives
  it the same IP address for each reboot.

  If you choose "SystemdbootTarget", there are additional requirements
  and considerations. See the
  ":ref:`test-manual/runtime-testing:selecting systemdboottarget`" section, which
  follows, for more information.

- *"BeagleBoneTarget":* Choose "BeagleBoneTarget" if you are deploying
  images and running tests on the BeagleBone "Black" or original
  "White" hardware. For information on how to use these tests, see the
  comments at the top of the BeagleBoneTarget
  ``meta-yocto-bsp/lib/oeqa/controllers/beaglebonetarget.py`` file.

- *"GrubTarget":* Choose "GrubTarget" if you are deploying images and running
  tests on any generic PC that boots using GRUB. For information on how
  to use these tests, see the comments at the top of the GrubTarget
  ``meta-yocto-bsp/lib/oeqa/controllers/grubtarget.py`` file.

- *"your-target":* Create your own custom target if you want to run
  tests when you are deploying images and running tests on a custom
  machine within your BSP layer. To do this, you need to add a Python
  unit that defines the target class under ``lib/oeqa/controllers/``
  within your layer. You must also provide an empty ``__init__.py``.
  For examples, see files in ``meta-yocto-bsp/lib/oeqa/controllers/``.
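
For instance, running the tests against a device that is already booted
and reachable over the network with "simpleremote" only needs a few
additional lines in ``local.conf``. The IP addresses shown are examples
for a typical lab network::

   IMAGE_CLASSES += "testimage"
   TEST_TARGET = "simpleremote"
   TEST_TARGET_IP = "192.168.1.50"
   TEST_SERVER_IP = "192.168.1.2"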

Selecting SystemdbootTarget
---------------------------

If you did not set :term:`TEST_TARGET` to "SystemdbootTarget", then you do
not need any information in this section. You can skip down to the
":ref:`test-manual/runtime-testing:running tests`" section.

If you did set :term:`TEST_TARGET` to "SystemdbootTarget", you also need to
perform a one-time setup of your controller image by doing the following:

#. *Set EFI_PROVIDER:* Be sure that :term:`EFI_PROVIDER` is as follows::

      EFI_PROVIDER = "systemd-boot"

#. *Build the controller image:* Build the ``core-image-testmaster`` image.
   The ``core-image-testmaster`` recipe is provided as an example for a
   "controller" image and you can customize the image recipe as you would
   any other recipe.

   Image recipe requirements are:

   - Inherits ``core-image`` so that kernel modules are installed.

   - Installs normal linux utilities not BusyBox ones (e.g. ``bash``,
     ``coreutils``, ``tar``, ``gzip``, and ``kmod``).

   - Uses a custom :term:`Initramfs` image with a custom
     installer. A normal image that you can install usually creates a
     single root filesystem partition. This image uses another installer that
     creates a specific partition layout. Not all Board Support
     Packages (BSPs) can use an installer. For such cases, you need to
     manually create the following partition layout on the target:

     - First partition mounted under ``/boot``, labeled "boot".

     - The main root filesystem partition where this image gets installed,
       which is mounted under ``/``.

     - Another partition labeled "testrootfs" where test images get
       deployed.

#. *Install image:* Install the image that you just built on the target
   system.

The final thing you need to do when setting :term:`TEST_TARGET` to
"SystemdbootTarget" is to set up the test image:

#. *Set up your local.conf file:* Make sure you have the following
   statements in your ``local.conf`` file::

      IMAGE_FSTYPES += "tar.gz"
      IMAGE_CLASSES += "testimage"
      TEST_TARGET = "SystemdbootTarget"
      TEST_TARGET_IP = "192.168.2.3"

#. *Build your test image:* Use BitBake to build the image::

      $ bitbake core-image-sato

Power Control
-------------

For most hardware targets other than "simpleremote", you can control
power:

- You can use :term:`TEST_POWERCONTROL_CMD` together with
  :term:`TEST_POWERCONTROL_EXTRA_ARGS` as a command that runs on the host
  and does power cycling. The test code passes one argument to that
  command: off, on or cycle (off then on). Here is an example that
  could appear in your ``local.conf`` file::

     TEST_POWERCONTROL_CMD = "powercontrol.exp test 10.11.12.1 nuc1"

  In this example, the expect
  script does the following:

  .. code-block:: shell

     ssh test@10.11.12.1 "pyctl nuc1 arg"

  It then runs a Python script that controls power for a label called
  ``nuc1``.

  .. note::

     You need to customize :term:`TEST_POWERCONTROL_CMD` and
     :term:`TEST_POWERCONTROL_EXTRA_ARGS` for your own setup. The one
     requirement is that it accepts "on", "off", and "cycle" as the last
     argument (a sketch of such a wrapper script follows this list).

- When no command is defined, it connects to the device over SSH and
  uses the classic reboot command to reboot the device. Classic reboot
  is fine as long as the machine actually reboots (i.e. the SSH test
  has not failed). It is useful for scenarios where you have a simple
  setup, typically with a single board, and where some manual
  interaction is okay from time to time.
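
Here is a minimal sketch of such a wrapper script. It accepts "on", "off"
or "cycle" as its last argument; the ``my-pdu-tool`` utility it calls is
purely hypothetical --- substitute whatever actually controls your power
switch::

   #!/bin/sh
   # Usage: power-ctl.sh <port> <on|off|cycle>   (illustrative only)
   PORT="$1"
   ACTION="$2"

   case "$ACTION" in
       on)    my-pdu-tool --port "$PORT" --power on ;;
       off)   my-pdu-tool --port "$PORT" --power off ;;
       cycle) my-pdu-tool --port "$PORT" --power off
              sleep 5
              my-pdu-tool --port "$PORT" --power on ;;
       *)     echo "Usage: $0 <port> <on|off|cycle>" >&2; exit 1 ;;
   esac

You would then point :term:`TEST_POWERCONTROL_CMD` at the script and pass
the port through :term:`TEST_POWERCONTROL_EXTRA_ARGS`, so that the test
code appends the power action as the final argument.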

If you have no hardware to automatically perform power control but still
wish to experiment with automated hardware testing, you can use the
``dialog-power-control`` script that shows a dialog prompting you to perform
the required power action. This script requires either KDialog or Zenity
to be installed. To use this script, set the
:term:`TEST_POWERCONTROL_CMD`
variable as follows::

   TEST_POWERCONTROL_CMD = "${COREBASE}/scripts/contrib/dialog-power-control"

Serial Console Connection
-------------------------

For test target classes requiring a serial console to interact with the
bootloader (e.g. BeagleBoneTarget and GrubTarget),
you need to specify a command to use to connect to the serial console of
the target machine by using the
:term:`TEST_SERIALCONTROL_CMD`
variable and optionally the
:term:`TEST_SERIALCONTROL_EXTRA_ARGS`
variable.

These cases could be a serial terminal program if the machine is
connected to a local serial port, or a ``telnet`` or ``ssh`` command
connecting to a remote console server. Regardless of the case, the
command simply needs to connect to the serial console and forward that
connection to standard input and output as any normal terminal program
does. For example, to use the picocom terminal program on serial device
``/dev/ttyUSB0`` at 115200bps, you would set the variable as follows::

   TEST_SERIALCONTROL_CMD = "picocom /dev/ttyUSB0 -b 115200"

For local
devices where the serial port device disappears when the device reboots,
an additional "serdevtry" wrapper script is provided. To use this
wrapper, simply prefix the terminal command with
``${COREBASE}/scripts/contrib/serdevtry``::

   TEST_SERIALCONTROL_CMD = "${COREBASE}/scripts/contrib/serdevtry picocom -b 115200 /dev/ttyUSB0"

Running Tests
=============

You can start the tests automatically or manually:

- *Automatically running tests:* To run the tests automatically after the
  OpenEmbedded build system successfully creates an image, first set the
  :term:`TESTIMAGE_AUTO` variable to "1" in your ``local.conf`` file in the
  :term:`Build Directory`::

     TESTIMAGE_AUTO = "1"

  Next, build your image. If the image successfully builds, the
  tests run::

     bitbake core-image-sato

- *Manually running tests:* To manually run the tests, first globally
  inherit the :ref:`ref-classes-testimage` class by editing your
  ``local.conf`` file::

     IMAGE_CLASSES += "testimage"

  Next, use BitBake to run the tests::

     bitbake -c testimage image

All test files reside in ``meta/lib/oeqa/runtime/cases`` in the
:term:`Source Directory`. A test name maps
directly to a Python module. Each test module may contain a number of
individual tests. Tests are usually grouped together by the area tested
(e.g. tests for systemd reside in ``meta/lib/oeqa/runtime/cases/systemd.py``).

You can add tests to any layer provided you place them in the proper
area and you extend :term:`BBPATH` in
the ``local.conf`` file as normal. Be sure that tests reside in
``layer/lib/oeqa/runtime/cases``.

.. note::

   Be sure that module names do not collide with module names used in
   the default set of test modules in ``meta/lib/oeqa/runtime/cases``.

You can change the set of tests run by appending to or overriding the
:term:`TEST_SUITES` variable in
``local.conf``. Each name in :term:`TEST_SUITES` represents a required test
for the image. Test modules named within :term:`TEST_SUITES` cannot be
skipped even if a test is not suitable for an image (e.g. running the
RPM tests on an image without ``rpm``). Appending "auto" to
:term:`TEST_SUITES` causes the build system to try to run all tests that are
suitable for the image (i.e. each test module may elect to skip itself).

The order you list tests in :term:`TEST_SUITES` is important and influences
test dependencies. Consequently, tests that depend on other tests should
be added after the test on which they depend. For example, since the
``ssh`` test depends on the ``ping`` test, "ssh" needs to come after
"ping" in the list. The test class provides no re-ordering or dependency
handling.

.. note::

   Each module can have multiple classes with multiple test methods.
   And, Python ``unittest`` rules apply.

Here are some things to keep in mind when running tests:

- The default tests for the image are defined as::

     DEFAULT_TEST_SUITES:pn-image = "ping ssh df connman syslog xorg scp vnc date rpm dnf dmesg"

- Add your own test to the list by using the following::

     TEST_SUITES:append = " mytest"

- Run a specific list of tests as follows::

     TEST_SUITES = "test1 test2 test3"

  Remember, order is important. Be sure to place a test that is
  dependent on another test later in the order.

Exporting Tests
===============

You can export tests so that they can run independently of the build
system. Exporting tests is required if you want to be able to hand the
test execution off to a scheduler. You can only export tests that are
defined in :term:`TEST_SUITES`.

If your image is already built, make sure the following are set in your
``local.conf`` file::

   INHERIT += "testexport"
   TEST_TARGET_IP = "IP-address-for-the-test-target"
   TEST_SERVER_IP = "IP-address-for-the-test-server"

You can then export the tests with the
following BitBake command form::

   $ bitbake image -c testexport

Exporting the tests places them in the :term:`Build Directory` in
``tmp/testexport/``\ image, which is controlled by the :term:`TEST_EXPORT_DIR`
variable.

You can now run the tests outside of the build environment::

   $ cd tmp/testexport/image
   $ ./runexported.py testdata.json

Here is a complete example that shows IP addresses and uses the
``core-image-sato`` image::

   INHERIT += "testexport"
   TEST_TARGET_IP = "192.168.7.2"
   TEST_SERVER_IP = "192.168.7.1"

Use BitBake to export the tests::

   $ bitbake core-image-sato -c testexport

Run the tests outside of
the build environment using the following::

   $ cd tmp/testexport/core-image-sato
   $ ./runexported.py testdata.json

Writing New Tests
=================

As mentioned previously, all new test files need to be in the proper
place for the build system to find them. New tests for additional
functionality outside of the core should be added to the layer that adds
the functionality, in ``layer/lib/oeqa/runtime/cases`` (as long as
:term:`BBPATH` is extended in the
layer's ``layer.conf`` file as normal). Just remember the following
(a sketch of such a layer layout follows this list):

- Filenames need to map directly to test (module) names.

- Do not use module names that collide with existing core tests.

- Minimally, an empty ``__init__.py`` file must be present in the runtime
  directory.
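
For example, a layer carrying its own runtime tests might be laid out as
follows, where ``meta-mylayer`` and ``mytest.py`` are hypothetical names::

   meta-mylayer/
   ├── conf/layer.conf
   └── lib/oeqa/runtime/cases/
       ├── __init__.py
       └── mytest.py

with ``conf/layer.conf`` extending :term:`BBPATH` in the usual way::

   BBPATH .= ":${LAYERDIR}"

The new test can then be selected by adding its module name to
:term:`TEST_SUITES`, e.g. ``TEST_SUITES:append = " mytest"``.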

To create a new test, start by copying an existing module (e.g.
``oe_syslog.py`` or ``gcc.py`` are good ones to use). Test modules can use
code from ``meta/lib/oeqa/utils``, which are helper classes.

.. note::

   Structure shell commands so that they return a single code you can
   rely on to indicate success. Be aware that sometimes you will need to
   parse the output. See the ``df.py`` and ``date.py`` modules for examples.

You will notice that all test classes inherit ``oeRuntimeTest``, which
is found in ``meta/lib/oetest.py``. This base class offers some helper
attributes, which are described in the following sections:

Class Methods
-------------

Class methods are as follows:

- *hasPackage(pkg):* Returns "True" if ``pkg`` is in the installed
  package list of the image, which is based on the manifest file that
  is generated during the :ref:`ref-tasks-rootfs` task.

- *hasFeature(feature):* Returns "True" if the feature is in
  :term:`IMAGE_FEATURES` or
  :term:`DISTRO_FEATURES`.

Class Attributes
----------------

Class attributes are as follows:

- *pscmd:* Equals "ps -ef" if ``procps`` is installed in the image.
  Otherwise, ``pscmd`` equals "ps" (busybox).

- *tc:* The called test context, which gives access to the
  following attributes:

  - *d:* The BitBake datastore, which allows you to use stuff such
    as ``oeRuntimeTest.tc.d.getVar("VIRTUAL-RUNTIME_init_manager")``.

  - *testslist and testsrequired:* Used internally. The tests
    do not need these.

  - *filesdir:* The absolute path to
    ``meta/lib/oeqa/runtime/files``, which contains helper files for
    tests meant for copying on the target such as small files written
    in C for compilation.

  - *target:* The target controller object used to deploy and
    start an image on a particular target (e.g. Qemu, SimpleRemote,
    and SystemdbootTarget). Tests usually use the following:

    - *ip:* The target's IP address.

    - *server_ip:* The host's IP address, which is usually used
      by the DNF test suite.

    - *run(cmd, timeout=None):* The single most used method.
      This command is a wrapper for: ``ssh root@host "cmd"``. The
      command returns a tuple: (status, output), which are what their
      names imply --- the return code of "cmd" and whatever output it
      produces. The optional timeout argument represents the number
      of seconds the test should wait for "cmd" to return. If the
      argument is "None", the test uses the default instance's
      timeout period, which is 300 seconds. If the argument is "0",
      the test runs until the command returns.

    - *copy_to(localpath, remotepath):*
      ``scp localpath root@ip:remotepath``.

    - *copy_from(remotepath, localpath):*
      ``scp root@host:remotepath localpath``.

Instance Attributes
-------------------

There is a single instance attribute, which is ``target``. The ``target``
instance attribute is identical to the class attribute of the same name,
which is described in the previous section. This attribute exists as
both an instance and class attribute so tests can use
``self.target.run(cmd)`` in instance methods instead of
``oeRuntimeTest.tc.target.run(cmd)``.
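
Putting the pieces above together, a minimal test module placed in your
layer could look like the following sketch. The module and class names are
hypothetical, and the import assumes the ``oeRuntimeTest`` base class is
importable as ``oeqa.oetest``::

   from oeqa.oetest import oeRuntimeTest

   class HostnameTest(oeRuntimeTest):

       def test_hostname_is_set(self):
           # self.target.run() returns a (status, output) tuple for the
           # command executed on the target over SSH.
           (status, output) = self.target.run('hostname', timeout=60)
           self.assertEqual(status, 0, msg='hostname failed: %s' % output)
           self.assertNotEqual(output.strip(), '', msg='hostname output is empty')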

Installing Packages in the DUT Without the Package Manager
===========================================================

When a test requires a package built by BitBake, it is possible to
install that package. Installing the package does not require a package
manager be installed in the device under test (DUT). It does, however,
require an SSH connection and the target must be using the
``sshcontrol`` class.

.. note::

   This method uses ``scp`` to copy files from the host to the target, which
   causes permissions and special attributes to be lost.

A JSON file is used to define the packages needed by a test. This file
must be in the same path as the file used to define the tests.
Furthermore, the filename must map directly to the test module name with
a ``.json`` extension.

The JSON file must include an object with the test name as keys of an
object or an array. This object (or array of objects) uses the following
data:

- "pkg" --- a mandatory string that is the name of the package to be
  installed.

- "rm" --- an optional boolean, which defaults to "false", that specifies
  to remove the package after the test.

- "extract" --- an optional boolean, which defaults to "false", that
  specifies if the package must be extracted from the package format.
  When set to "true", the package is not automatically installed into
  the DUT.

Here is an example JSON file that handles test "foo" installing
package "bar" and test "foobar" installing packages "foo" and "bar".
Once the test is complete, the packages are removed from the DUT::

   {
       "foo": {
           "pkg": "bar"
       },
       "foobar": [
           {
               "pkg": "foo",
               "rm": true
           },
           {
               "pkg": "bar",
               "rm": true
           }
       ]
   }

diff --git a/documentation/test-manual/test-process.rst b/documentation/test-manual/test-process.rst
index 7bec5ba828..945b56830f 100644
--- a/documentation/test-manual/test-process.rst
+++ b/documentation/test-manual/test-process.rst
@@ -20,7 +20,7 @@ helps review and test patches and this is his testing tree).
 We have two broad categories of test builds, including "full" and
 "quick". On the Autobuilder, these can be seen as "a-quick" and
 "a-full", simply for ease of sorting in the UI. Use our Autobuilder
-:yocto_ab:`console view </typhoon/#/console>` to see where we manage most
+:yocto_ab:`console view </valkyrie/#/console>` to see where we manage most
 test-related items.
 
 Builds are triggered manually when the test branches are ready. The
diff --git a/documentation/test-manual/understand-autobuilder.rst b/documentation/test-manual/understand-autobuilder.rst
index 6b4fab4f0b..7f4d1be3cd 100644
--- a/documentation/test-manual/understand-autobuilder.rst
+++ b/documentation/test-manual/understand-autobuilder.rst
@@ -10,7 +10,7 @@ Execution Flow within the Autobuilder
 The "a-full" and "a-quick" targets are the usual entry points into the
 Autobuilder and it makes sense to follow the process through the system
 starting there. This is best visualized from the :yocto_ab:`Autobuilder
-Console view </typhoon/#/console>`.
+Console view </valkyrie/#/console>`.
 
 Each item along the top of that view represents some "target build" and
 these targets are all run in parallel. The 'full' build will trigger the