.. SPDX-License-Identifier: CC-BY-SA-2.0-UK

*****************************************
The Yocto Project Test Environment Manual
*****************************************

Welcome
=======

Welcome to the Yocto Project Test Environment Manual! This manual is a
work in progress. The manual contains information about the testing
environment used by the Yocto Project to make sure each major and minor
release works as intended. All the project's testing infrastructure and
processes are publicly visible and available so that the community can
see what testing is being performed, how it is being done, and the
current status of the tests and the project at any given time. Other
organizations can therefore leverage the process and testing
environment used by the Yocto Project to create their own automated,
production test environment, building upon the foundations of the
project core.

Currently, the Yocto Project Test Environment Manual has no projected
release date. This manual is a work in progress and is being initially
loaded with information from the README files and notes from key
engineers:

- *yocto-autobuilder2:* This
  :yocto_git:`README.md </yocto-autobuilder2/tree/README.md>`
  is the main README, which details how to set up the Yocto Project
  Autobuilder. The ``yocto-autobuilder2`` repository represents the
  Yocto Project's console UI plugin to Buildbot and the configuration
  necessary to configure Buildbot to perform the testing the project
  requires.

- *yocto-autobuilder-helper:* This :yocto_git:`README </yocto-autobuilder-helper/tree/README/>`
  and repository contain the Yocto Project Autobuilder Helper scripts and
  configuration. The ``yocto-autobuilder-helper`` repository contains
  the "glue" logic that defines which tests to run and how to run them.
  As a result, it can be used by any Continuous Integration (CI) system
  to run builds, support getting the correct code revisions, configure
  builds and layers, run builds, and collect results. The code is
  independent of any CI system, which means the code can work with `Buildbot <https://docs.buildbot.net/0.9.15.post1/>`__,
  Jenkins, or others. This repository has a branch per release of the
  project, defining the tests to run on a per-release basis.

Yocto Project Autobuilder Overview
==================================

The Yocto Project Autobuilder collectively refers to the software,
tools, scripts, and procedures used by the Yocto Project to test
released software across supported hardware in an automated and regular
fashion. Basically, during the development of a Yocto Project release,
the Autobuilder tests if things work. The Autobuilder builds all test
targets and runs all the tests.

The Yocto Project now uses standard upstream
`Buildbot <https://docs.buildbot.net/0.9.15.post1/>`__ (version 9) to
drive its integration and testing. Buildbot Nine has a plug-in interface
that the Yocto Project customizes using code from the
``yocto-autobuilder2`` repository, adding its own console UI plugin. The
resulting UI plug-in allows you to visualize builds in a way suited to
the project's needs.

A ``helper`` layer provides configuration and job management through
scripts found in the ``yocto-autobuilder-helper`` repository. The
``helper`` layer contains the bulk of the build configuration
information and is release-specific, which makes it highly customizable
on a per-project basis. The layer is CI system-agnostic and contains a
number of helper scripts that can generate build configurations from
simple JSON files.

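To illustrate the idea (the JSON keys below are invented for this
sketch and are not the actual ``yocto-autobuilder-helper`` schema), a
helper-style script might expand a small JSON description into
per-target build configurations like this:

```python
import json

# Hypothetical example: the real configuration files in
# yocto-autobuilder-helper are richer, but the principle is the same --
# a small JSON file drives the generation of many build configurations.
CONFIG = """
{
    "defaults": {"DISTRO": "poky"},
    "targets": {
        "qemux86-64": {"MACHINE": "qemux86-64"},
        "qemuarm":    {"MACHINE": "qemuarm"}
    }
}
"""

def generate_configs(raw):
    """Merge each target's settings over the shared defaults."""
    data = json.loads(raw)
    configs = {}
    for name, overrides in data["targets"].items():
        merged = dict(data["defaults"])
        merged.update(overrides)
        configs[name] = merged
    return configs

for name, cfg in generate_configs(CONFIG).items():
    print(name, cfg["DISTRO"], cfg["MACHINE"])
```

The defaults-plus-overrides shape is what keeps the JSON "simple": only
the values that differ per target need to be spelled out.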
.. note::

   The project uses Buildbot for historical reasons, but also because
   many of the project developers have knowledge of Python. It is
   possible to use the outer layers from another Continuous Integration
   (CI) system such as
   `Jenkins <https://en.wikipedia.org/wiki/Jenkins_(software)>`__
   instead of Buildbot.

The following figure shows the Yocto Project Autobuilder stack with a
topology that includes a controller and a cluster of workers:

.. image:: figures/ab-test-cluster.png
   :align: center

Yocto Project Tests - Types of Testing Overview
===============================================

The Autobuilder tests different elements of the project by using
the following types of tests:

- *Build Testing:* Tests whether specific configurations build by
  varying :term:`MACHINE`,
  :term:`DISTRO`, other configuration
  options, and the specific target images being built (or world). Used
  to trigger builds of all the different test configurations on the
  Autobuilder. Builds usually cover many different targets for
  different architectures, machines, and distributions, as well as
  different configurations, such as different init systems. The
  Autobuilder tests literally hundreds of configurations and targets.

  - *Sanity Checks During the Build Process:* Tests initiated through
    the :ref:`insane <ref-classes-insane>`
    class. These checks ensure the output of the builds is correct.
    For example, does the ELF architecture in the generated binaries
    match the target system? ARM binaries would not work in a MIPS
    system!

- *Build Performance Testing:* Tests whether or not commonly used steps
  during builds work efficiently and avoid regressions. Tests that time
  common usage scenarios are run through ``oe-build-perf-test``.
  These tests are run on isolated machines so that the time
  measurements of the tests are accurate and no other processes
  interfere with the timing results. The project currently tests
  performance on two different distributions, Fedora and Ubuntu, to
  ensure there is no single point of failure and that the
  different distros work effectively.

- *eSDK Testing:* Image tests initiated through the following command::

     $ bitbake image -c testsdkext

  The tests utilize the ``testsdkext`` class and the ``do_testsdkext`` task.

- *Feature Testing:* Various scenario-based tests are run through the
  :ref:`OpenEmbedded Self test (oe-selftest) <ref-manual/ref-release-process:Testing and Quality Assurance>`. We test oe-selftest on each of the main distributions
  we support.

- *Image Testing:* Image tests initiated through the following command::

     $ bitbake image -c testimage

  The tests utilize the :ref:`testimage* <ref-classes-testimage*>`
  classes and the :ref:`ref-tasks-testimage` task.

- *Layer Testing:* The Autobuilder can test whether
  specific layers work with the rest of the system. The layers tested
  may be selected by members of the project. Some key community layers
  are also tested periodically.

- *Package Testing:* A Package Test (ptest) runs tests against packages
  built by the OpenEmbedded build system on the target machine. See the
  :ref:`Testing Packages With
  ptest <dev-manual/dev-manual-common-tasks:Testing Packages With ptest>` section
  in the Yocto Project Development Tasks Manual and the
  ":yocto_wiki:`Ptest </Ptest>`" Wiki page for more
  information on Ptest.

- *SDK Testing:* Image tests initiated through the following command::

     $ bitbake image -c testsdk

  The tests utilize the :ref:`testsdk <ref-classes-testsdk>` class and
  the ``do_testsdk`` task.

- *Unit Testing:* Unit tests on various components of the system run
  through :ref:`bitbake-selftest <ref-manual/ref-release-process:Testing and Quality Assurance>` and
  :ref:`oe-selftest <ref-manual/ref-release-process:Testing and Quality Assurance>`.

- *Automatic Upgrade Helper:* This target tests whether new versions of
  software are available and whether we can automatically upgrade to
  those new versions. If so, this target emails the maintainers with a
  patch to let them know this is possible.

How Tests Map to Areas of Code
==============================

Tests map into the codebase as follows:

- *bitbake-selftest:*

  These tests are self-contained and test BitBake as well as its APIs,
  which include the fetchers. The tests are located in
  ``bitbake/lib/*/tests``.

  From within the BitBake repository, run the following::

     $ bitbake-selftest

  To skip tests that access the Internet, use the ``BB_SKIP_NETTEST``
  variable when running ``bitbake-selftest`` as follows::

     $ BB_SKIP_NETTEST=yes bitbake-selftest

  The default output is quiet and just prints a summary of what was
  run. To see more information, there is a verbose option::

     $ bitbake-selftest -v

  The tests that access the network mostly exercise the fetcher
  modules, so skipping them is useful when you are not working on the
  fetchers. To specify individual test modules to run, append the test
  module name to the ``bitbake-selftest`` command. For example, to run
  the tests in the ``bb.tests.data`` module, run::

     $ bitbake-selftest bb.tests.data

  You can also specify individual tests by defining the full name and module
  plus the class path of the test, for example::

     $ bitbake-selftest bb.tests.data.TestOverrides.test_one_override

  The tests are based on `Python
  unittest <https://docs.python.org/3/library/unittest.html>`__.

- *oe-selftest:*

  - These tests use OE to test the workflows, which include testing
    specific features, behaviors of tasks, and API unit tests.

  - The tests can take advantage of parallelism through the ``-j``
    option, which can specify a number of threads to spread the tests
    across. Note that all tests from a given class of tests will run
    in the same thread. To parallelize large numbers of tests, you can
    split the class into multiple units.

  - The tests are based on Python unittest.

  - The code for the tests resides in
    ``meta/lib/oeqa/selftest/cases/``.

  - To run all the tests, enter the following command::

       $ oe-selftest -a

  - To run a specific test, use the following command form, where
    ``testname`` is the name of the specific test::

       $ oe-selftest -r <testname>

    For example, the following command would run the ``tinfoil``
    ``getVar`` API test::

       $ oe-selftest -r tinfoil.TinfoilTests.test_getvar

    It is also possible to run a set of tests. For example, the
    following command will run all of the ``tinfoil`` tests::

       $ oe-selftest -r tinfoil

- *testimage:*

  - These tests build an image, boot it, and run tests against the
    image's content.

  - The code for these tests resides in ``meta/lib/oeqa/runtime/cases/``.

  - You need to set the :term:`IMAGE_CLASSES` variable as follows::

       IMAGE_CLASSES += "testimage"

  - Run the tests using the following command form::

       $ bitbake image -c testimage

- *testsdk:*

  - These tests build an SDK, install it, and then run tests against
    that SDK.

  - The code for these tests resides in ``meta/lib/oeqa/sdk/cases/``.

  - Run the tests using the following command form::

       $ bitbake image -c testsdk

- *testsdk_ext:*

  - These tests build an extended SDK (eSDK), install that eSDK, and
    run tests against the eSDK.

  - The code for these tests resides in ``meta/lib/oeqa/esdk``.

  - To run the tests, use the following command form::

       $ bitbake image -c testsdkext

- *oe-build-perf-test:*

  - These tests run through common usage scenarios and measure
    the performance times.

  - The code for these tests resides in ``meta/lib/oeqa/buildperf``.

  - To run the tests, use the following command form::

       $ oe-build-perf-test <options>

    The command takes a number of options,
    such as where to place the test results. The Autobuilder Helper
    Scripts include the ``build-perf-test-wrapper`` script with
    examples of how to use ``oe-build-perf-test`` from the command
    line.

    Use the ``oe-git-archive`` command to store test results into a
    Git repository.

    Use the ``oe-build-perf-report`` command to generate text reports
    and HTML reports with graphs of the performance data. For
    examples, see
    :yocto_dl:`/releases/yocto/yocto-2.7/testresults/buildperf-centos7/perf-centos7.yoctoproject.org_warrior_20190414204758_0e39202.html`
    and
    :yocto_dl:`/releases/yocto/yocto-2.7/testresults/buildperf-centos7/perf-centos7.yoctoproject.org_warrior_20190414204758_0e39202.txt`.

  - The tests are contained in ``lib/oeqa/buildperf/test_basic.py``.

Test Examples
=============

This section provides example tests for each of the tests listed in the
:ref:`test-manual/intro:How Tests Map to Areas of Code` section.

For oeqa tests, testcases for each area reside in the main test
directory, ``meta/lib/oeqa/selftest/cases``.

For bitbake-selftest, BitBake testcases reside in the ``lib/bb/tests/``
directory.

``bitbake-selftest``
--------------------

A simple test example from ``lib/bb/tests/data.py`` is::

   class DataExpansions(unittest.TestCase):
       def setUp(self):
           self.d = bb.data.init()
           self.d["foo"] = "value_of_foo"
           self.d["bar"] = "value_of_bar"
           self.d["value_of_foo"] = "value_of_'value_of_foo'"

       def test_one_var(self):
           val = self.d.expand("${foo}")
           self.assertEqual(str(val), "value_of_foo")

In this example, a ``DataExpansions`` class of tests is created,
derived from standard Python unittest. The class has a common ``setUp``
function which is shared by all the tests in the class. A simple test is
then added to test that when a variable is expanded, the correct value
is found.

BitBake selftests are straightforward Python unittest. Refer to the
Python unittest documentation for additional information on writing
these tests at: https://docs.python.org/3/library/unittest.html.

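The example above needs a BitBake checkout for ``bb.data``. The same
``setUp``/test pattern can be tried with nothing but the standard
library; the ``DictExpansions`` class below is a made-up stand-in for
illustration, not part of BitBake:

```python
import unittest

# Stand-alone analogue of DataExpansions: bb.data.init() is replaced
# with a plain dict and a trivial expand() helper, so the setUp/test
# structure can be run without a BitBake checkout.
class DictExpansions(unittest.TestCase):
    def setUp(self):
        # Rebuilt before every test method in the class
        self.d = {"foo": "value_of_foo", "bar": "value_of_bar"}

    def expand(self, s):
        # Minimal ${var} expansion, just enough for the test below
        for key, value in self.d.items():
            s = s.replace("${%s}" % key, value)
        return s

    def test_one_var(self):
        self.assertEqual(self.expand("${foo}"), "value_of_foo")

# Run with: python -m unittest <this module>
```

As in the real test, the fixture in ``setUp`` is rebuilt before each
test method, so tests stay independent of each other.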
``oe-selftest``
---------------

These tests are more complex due to the setup required behind the scenes
for full builds. Rather than directly using Python's unittest, the code
wraps most of the standard objects. The tests can be simple, such as
testing a command from within the OE build environment using the
following example::

   class BitbakeLayers(OESelftestTestCase):
       def test_bitbakelayers_showcrossdepends(self):
           result = runCmd('bitbake-layers show-cross-depends')
           self.assertTrue('aspell' in result.output, msg = "No dependencies were shown. bitbake-layers show-cross-depends output: %s" % result.output)

This example, taken from ``meta/lib/oeqa/selftest/cases/bblayers.py``,
creates a testcase from the ``OESelftestTestCase`` class, derived
from ``unittest.TestCase``, which runs the ``bitbake-layers`` command
and checks the output to ensure it contains something we know should be
there.

The ``oeqa.utils.commands`` module contains helpers which can assist
with common tasks, including:

- *Obtaining the value of a bitbake variable:* Use
  ``oeqa.utils.commands.get_bb_var()`` or use
  ``oeqa.utils.commands.get_bb_vars()`` for more than one variable.

- *Running a bitbake invocation for a build:* Use
  ``oeqa.utils.commands.bitbake()``.

- *Running a command:* Use ``oeqa.utils.commands.runCmd()``.

There is also a ``oeqa.utils.commands.runqemu()`` function for launching
the ``runqemu`` command for testing things within a running, virtualized
image.

You can run these tests in parallel. Parallelism works per test class,
so tests within a given test class should always run in the same build,
while tests in different classes or modules may be split into different
builds. There is no data store available for these tests since the tests
launch the ``bitbake`` command and exist outside of its context. As a
result, common bitbake library functions (bb.\*) are also unavailable.

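Because the scheduler treats each class as one unit, splitting a large
class in two lets its tests be distributed. A minimal sketch of the
idea, using made-up class and test names (not real oe-selftest cases):

```python
import unittest

# Hypothetical split: instead of one large ImageFeatureTests class,
# two smaller classes give the runner two schedulable units, since
# oe-selftest parallelism (-j) is applied per test class.
class ImageFeatureTestsCore(unittest.TestCase):
    def test_empty_image(self):
        # Placeholder for a real build check
        self.assertIn("core-image-minimal", ["core-image-minimal"])

class ImageFeatureTestsGraphics(unittest.TestCase):
    def test_x11_image(self):
        # Placeholder for a real build check
        self.assertIn("core-image-sato", ["core-image-sato"])

# Each class loads as its own suite, i.e. its own unit of parallelism:
loader = unittest.defaultTestLoader
print(loader.loadTestsFromTestCase(ImageFeatureTestsCore).countTestCases())
print(loader.loadTestsFromTestCase(ImageFeatureTestsGraphics).countTestCases())
```

The trade-off is that any shared ``setUpClass`` work is repeated once
per class, so splitting pays off only when the tests themselves dominate.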
``testimage``
-------------

These tests are run once an image is up and running, either on target
hardware or under QEMU. As a result, they are assumed to be running in a
target image environment, as opposed to a host build environment. A
simple example from ``meta/lib/oeqa/runtime/cases/python.py`` contains
the following::

   class PythonTest(OERuntimeTestCase):
       @OETestDepends(['ssh.SSHTest.test_ssh'])
       @OEHasPackage(['python3-core'])
       def test_python3(self):
           cmd = "python3 -c \"import codecs; print(codecs.encode('Uryyb, jbeyq', 'rot13'))\""
           status, output = self.target.run(cmd)
           msg = 'Exit status was not 0. Output: %s' % output
           self.assertEqual(status, 0, msg=msg)

In this example, the ``OERuntimeTestCase`` class wraps
``unittest.TestCase``. Within the test, ``self.target`` represents the
target system, where commands can be run on it using the ``run()``
method.

To ensure certain test or package dependencies are met, you can use the
``OETestDepends`` and ``OEHasPackage`` decorators. For example, the test
in this example would only make sense if ``python3-core`` is installed in
the image.

``testsdk_ext``
---------------

These tests are run against built extensible SDKs (eSDKs). The tests can
assume that the eSDK environment has already been set up. An example from
``meta/lib/oeqa/sdk/cases/devtool.py`` contains the following::

   class DevtoolTest(OESDKExtTestCase):
       @classmethod
       def setUpClass(cls):
           myapp_src = os.path.join(cls.tc.esdk_files_dir, "myapp")
           cls.myapp_dst = os.path.join(cls.tc.sdk_dir, "myapp")
           shutil.copytree(myapp_src, cls.myapp_dst)
           subprocess.check_output(['git', 'init', '.'], cwd=cls.myapp_dst)
           subprocess.check_output(['git', 'add', '.'], cwd=cls.myapp_dst)
           subprocess.check_output(['git', 'commit', '-m', "'test commit'"], cwd=cls.myapp_dst)

       @classmethod
       def tearDownClass(cls):
           shutil.rmtree(cls.myapp_dst)

       def _test_devtool_build(self, directory):
           self._run('devtool add myapp %s' % directory)
           try:
               self._run('devtool build myapp')
           finally:
               self._run('devtool reset myapp')

       def test_devtool_build_make(self):
           self._test_devtool_build(self.myapp_dst)

In this example, the ``devtool``
command is tested to see whether a sample application can be built with
the ``devtool build`` command within the eSDK.

``testsdk``
-----------

These tests are run against built SDKs. The tests can assume that an SDK
has already been extracted and its environment file has been sourced. A
simple example from ``meta/lib/oeqa/sdk/cases/python2.py`` contains the
following::

   class Python3Test(OESDKTestCase):
       def setUp(self):
           if not (self.tc.hasHostPackage("nativesdk-python3-core") or
                   self.tc.hasHostPackage("python3-core-native")):
               raise unittest.SkipTest("No python3 package in the SDK")

       def test_python3(self):
           cmd = "python3 -c \"import codecs; print(codecs.encode('Uryyb, jbeyq', 'rot13'))\""
           output = self._run(cmd)
           self.assertEqual(output, "Hello, world\n")

In this example, if ``nativesdk-python3-core`` has been installed into
the SDK, the code runs the python3 interpreter with a basic command to
check it is working correctly. The test only runs if python3 is
installed in the SDK.

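The payload used by these ``python3`` checks is simply ROT13-encoded
text, so the expected output can be verified on any build host:

```python
import codecs

# 'Uryyb, jbeyq' is ROT13 for 'Hello, world' -- decoding it exercises
# both the interpreter and its codecs module inside the SDK or image.
print(codecs.encode('Uryyb, jbeyq', 'rot13'))  # → Hello, world
```

Using an encoded string (rather than printing a literal) makes the test
prove that the interpreter actually ran the code, not just echoed input.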
``oe-build-perf-test``
----------------------

The performance tests usually measure how long operations take and the
resource utilisation as that happens. An example from
``meta/lib/oeqa/buildperf/test_basic.py`` contains the following::

   class Test3(BuildPerfTestCase):

       def test3(self):
           """Bitbake parsing (bitbake -p)"""
           # Drop all caches and parse
           self.rm_cache()
           oe.path.remove(os.path.join(self.bb_vars['TMPDIR'], 'cache'), True)
           self.measure_cmd_resources(['bitbake', '-p'], 'parse_1',
                                      'bitbake -p (no caches)')
           # Drop tmp/cache
           oe.path.remove(os.path.join(self.bb_vars['TMPDIR'], 'cache'), True)
           self.measure_cmd_resources(['bitbake', '-p'], 'parse_2',
                                      'bitbake -p (no tmp/cache)')
           # Parse with fully cached data
           self.measure_cmd_resources(['bitbake', '-p'], 'parse_3',
                                      'bitbake -p (cached)')

This example shows how three specific parsing timings are
measured, with and without various caches, to show how BitBake's parsing
performance trends over time.

Considerations When Writing Tests
=================================

When writing good tests, there are several things to keep in mind. Since
things running on the Autobuilder are accessed concurrently by multiple
workers, consider the following:

**Running "cleanall" is not permitted.**

This can delete files from :term:`DL_DIR` which would potentially break
other builds running in parallel. If this is required, :term:`DL_DIR`
must be set to an isolated directory.

**Running "cleansstate" is not permitted.**

This can delete files from :term:`SSTATE_DIR` which would potentially
break other builds running in parallel. If this is required,
:term:`SSTATE_DIR` must be set to an isolated directory. Alternatively,
you can use the ``-f`` option with the ``bitbake`` command to "taint"
tasks by changing the sstate checksums to ensure sstate cache items
will not be reused.

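If a test genuinely needs ``cleanall`` or ``cleansstate``, the build can
be pointed at isolated directories first. A minimal ``local.conf``
sketch (the paths here are examples only, not project conventions)::

   # Example only: redirect the shared caches to build-local paths so
   # cleanall/cleansstate cannot affect other builds running in parallel.
   DL_DIR = "${TOPDIR}/test-downloads"
   SSTATE_DIR = "${TOPDIR}/test-sstate"
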
**Tests should not change the metadata.**

This is particularly true for oe-selftests, since these can run in
parallel and changing metadata leads to changing checksums, which
confuses BitBake while running in parallel. Some tests do need to
change metadata, such as the devtool tests; to prevent the original
metadata from changing, such tests should set up temporary copies of
that data first and modify the copies.
