Diffstat (limited to 'documentation/test-manual')
-rw-r--r-- | documentation/test-manual/history.rst | 16 | ||||
-rw-r--r-- | documentation/test-manual/index.rst | 5 | ||||
-rw-r--r-- | documentation/test-manual/intro.rst | 182 | ||||
-rw-r--r-- | documentation/test-manual/ptest.rst | 135 | ||||
-rw-r--r-- | documentation/test-manual/reproducible-builds.rst | 167 | ||||
-rw-r--r-- | documentation/test-manual/runtime-testing.rst | 595 | ||||
-rw-r--r-- | documentation/test-manual/test-process.rst | 50 | ||||
-rw-r--r-- | documentation/test-manual/understand-autobuilder.rst | 80 | ||||
-rw-r--r-- | documentation/test-manual/yocto-project-compatible.rst | 129 |
9 files changed, 1189 insertions, 170 deletions
diff --git a/documentation/test-manual/history.rst b/documentation/test-manual/history.rst deleted file mode 100644 index 89d4aad21c..0000000000 --- a/documentation/test-manual/history.rst +++ /dev/null | |||
@@ -1,16 +0,0 @@ | |||
1 | .. SPDX-License-Identifier: CC-BY-SA-2.0-UK | ||
2 | |||
3 | *********************** | ||
4 | Manual Revision History | ||
5 | *********************** | ||
6 | |||
7 | .. list-table:: | ||
8 | :widths: 10 15 40 | ||
9 | :header-rows: 1 | ||
10 | |||
11 | * - Revision | ||
12 | - Date | ||
13 | - Note | ||
14 | * - 3.2 | ||
15 | - October 2020 | ||
16 | - The initial document released with the Yocto Project 3.2 Release | ||
diff --git a/documentation/test-manual/index.rst b/documentation/test-manual/index.rst index e2198c4c39..d365d337ea 100644 --- a/documentation/test-manual/index.rst +++ b/documentation/test-manual/index.rst | |||
@@ -12,7 +12,10 @@ Yocto Project Test Environment Manual | |||
12 | 12 | ||
13 | intro | 13 | intro |
14 | test-process | 14 | test-process |
15 | ptest | ||
16 | runtime-testing | ||
15 | understand-autobuilder | 17 | understand-autobuilder |
16 | history | 18 | reproducible-builds |
19 | yocto-project-compatible | ||
17 | 20 | ||
18 | .. include:: /boilerplate.rst | 21 | .. include:: /boilerplate.rst |
diff --git a/documentation/test-manual/intro.rst b/documentation/test-manual/intro.rst index 81c24a8c3f..d55540c8df 100644 --- a/documentation/test-manual/intro.rst +++ b/documentation/test-manual/intro.rst | |||
@@ -14,19 +14,17 @@ release works as intended. All the project's testing infrastructure and | |||
14 | processes are publicly visible and available so that the community can | 14 | processes are publicly visible and available so that the community can |
15 | see what testing is being performed, how it's being done and the current | 15 | see what testing is being performed, how it's being done and the current |
16 | status of the tests and the project at any given time. It is intended | 16 | status of the tests and the project at any given time. It is intended |
17 | that Other organizations can leverage off the process and testing | 17 | that other organizations can leverage off the process and testing |
18 | environment used by the Yocto Project to create their own automated, | 18 | environment used by the Yocto Project to create their own automated, |
19 | production test environment, building upon the foundations from the | 19 | production test environment, building upon the foundations from the |
20 | project core. | 20 | project core. |
21 | 21 | ||
22 | Currently, the Yocto Project Test Environment Manual has no projected | 22 | This manual is a work-in-progress and is being initially loaded with |
23 | release date. This manual is a work-in-progress and is being initially | 23 | information from the README files and notes from key engineers: |
24 | loaded with information from the README files and notes from key | ||
25 | engineers: | ||
26 | 24 | ||
27 | - *yocto-autobuilder2:* This | 25 | - *yocto-autobuilder2:* This |
28 | :yocto_git:`README.md </yocto-autobuilder2/tree/README.md>` | 26 | :yocto_git:`README.md </yocto-autobuilder2/tree/README.md>` |
29 | is the main README which detials how to set up the Yocto Project | 27 | is the main README which details how to set up the Yocto Project |
30 | Autobuilder. The ``yocto-autobuilder2`` repository represents the | 28 | Autobuilder. The ``yocto-autobuilder2`` repository represents the |
31 | Yocto Project's console UI plugin to Buildbot and the configuration | 29 | Yocto Project's console UI plugin to Buildbot and the configuration |
32 | necessary to configure Buildbot to perform the testing the project | 30 | necessary to configure Buildbot to perform the testing the project |
@@ -39,7 +37,7 @@ engineers: | |||
39 | As a result, it can be used by any Continuous Integration (CI) system | 37 |
40 | to run builds, support getting the correct code revisions, configure | 38 | to run builds, support getting the correct code revisions, configure |
41 | builds and layers, run builds, and collect results. The code is | 39 | builds and layers, run builds, and collect results. The code is |
42 | independent of any CI system, which means the code can work `Buildbot <https://docs.buildbot.net/0.9.15.post1/>`__, | 40 | independent of any CI system, which means the code can work with `Buildbot <https://docs.buildbot.net/current/>`__, |
43 | Jenkins, or others. This repository has a branch per release of the | 41 | Jenkins, or others. This repository has a branch per release of the |
44 | project defining the tests to run on a per release basis. | 42 | project defining the tests to run on a per release basis. |
45 | 43 | ||
@@ -53,13 +51,11 @@ fashion. Basically, during the development of a Yocto Project release, | |||
53 | the Autobuilder tests if things work. The Autobuilder builds all test | 51 | the Autobuilder tests if things work. The Autobuilder builds all test |
54 | targets and runs all the tests. | 52 | targets and runs all the tests. |
55 | 53 | ||
56 | The Yocto Project uses now uses standard upstream | 54 | The Yocto Project uses standard upstream Buildbot to drive its integration and |
57 | `Buildbot <https://docs.buildbot.net/0.9.15.post1/>`__ (version 9) to | 55 | testing. Buildbot has a plug-in interface that the Yocto Project customizes |
58 | drive its integration and testing. Buildbot Nine has a plug-in interface | 56 | using code from the :yocto_git:`yocto-autobuilder2 </yocto-autobuilder2>` |
59 | that the Yocto Project customizes using code from the | 57 | repository, adding its own console UI plugin. The resulting UI plug-in allows |
60 | ``yocto-autobuilder2`` repository, adding its own console UI plugin. The | 58 | you to visualize builds in a way suited to the project's needs. |
61 | resulting UI plug-in allows you to visualize builds in a way suited to | ||
62 | the project's needs. | ||
63 | 59 | ||
64 | A ``helper`` layer provides configuration and job management through | 60 | A ``helper`` layer provides configuration and job management through |
65 | scripts found in the ``yocto-autobuilder-helper`` repository. The | 61 | scripts found in the ``yocto-autobuilder-helper`` repository. The |
@@ -72,10 +68,9 @@ simple JSON files. | |||
72 | .. note:: | 68 | .. note:: |
73 | 69 | ||
74 | The project uses Buildbot for historical reasons but also because | 70 | The project uses Buildbot for historical reasons but also because |
75 | many of the project developers have knowledge of python. It is | 71 | many of the project developers have knowledge of Python. It is |
76 | possible to use the outer layers from another Continuous Integration | 72 | possible to use the outer layers from another Continuous Integration |
77 | (CI) system such as | 73 | (CI) system such as :wikipedia:`Jenkins <Jenkins_(software)>` |
78 | `Jenkins <https://en.wikipedia.org/wiki/Jenkins_(software)>`__ | ||
79 | instead of Buildbot. | 74 | instead of Buildbot. |
80 | 75 | ||
81 | The following figure shows the Yocto Project Autobuilder stack with a | 76 | The following figure shows the Yocto Project Autobuilder stack with a |
@@ -83,29 +78,29 @@ topology that includes a controller and a cluster of workers: | |||
83 | 78 | ||
84 | .. image:: figures/ab-test-cluster.png | 79 | .. image:: figures/ab-test-cluster.png |
85 | :align: center | 80 | :align: center |
81 | :width: 70% | ||
86 | 82 | ||
87 | Yocto Project Tests - Types of Testing Overview | 83 | Yocto Project Tests --- Types of Testing Overview |
88 | =============================================== | 84 | ================================================= |
89 | 85 | ||
90 | The Autobuilder tests different elements of the project by using | 86 | The Autobuilder tests different elements of the project by using |
91 | thefollowing types of tests: | 87 | the following types of tests: |
92 | 88 | ||
93 | - *Build Testing:* Tests whether specific configurations build by | 89 | - *Build Testing:* Tests whether specific configurations build by |
94 | varying :term:`MACHINE`, | 90 | varying :term:`MACHINE`, |
95 | :term:`DISTRO`, other configuration | 91 | :term:`DISTRO`, other configuration |
96 | options, and the specific target images being built (or world). Used | 92 | options, and the specific target images being built (or ``world``). This is |
97 | to trigger builds of all the different test configurations on the | 93 | used to trigger builds of all the different test configurations on the |
98 | Autobuilder. Builds usually cover many different targets for | 94 | Autobuilder. Builds usually cover many different targets for |
99 | different architectures, machines, and distributions, as well as | 95 | different architectures, machines, and distributions, as well as |
100 | different configurations, such as different init systems. The | 96 | different configurations, such as different init systems. The |
101 | Autobuilder tests literally hundreds of configurations and targets. | 97 | Autobuilder tests literally hundreds of configurations and targets. |
102 | 98 | ||
103 | - *Sanity Checks During the Build Process:* Tests initiated through | 99 | - *Sanity Checks During the Build Process:* Tests initiated through the |
104 | the :ref:`insane <ref-classes-insane>` | 100 | :ref:`ref-classes-insane` class. These checks ensure the output of the |
105 | class. These checks ensure the output of the builds are correct. | 101 | builds are correct. For example, does the ELF architecture in the |
106 | For example, does the ELF architecture in the generated binaries | 102 | generated binaries match the target system? ARM binaries would not work |
107 | match the target system? ARM binaries would not work in a MIPS | 103 | in a MIPS system! |
108 | system! | ||
109 | 104 | ||
110 | - *Build Performance Testing:* Tests whether or not commonly used steps | 105 | - *Build Performance Testing:* Tests whether or not commonly used steps |
111 | during builds work efficiently and avoid regressions. Tests to time | 106 | during builds work efficiently and avoid regressions. Tests to time |
@@ -121,18 +116,21 @@ thefollowing types of tests: | |||
121 | 116 | ||
122 | $ bitbake image -c testsdkext | 117 | $ bitbake image -c testsdkext |
123 | 118 | ||
124 | The tests utilize the ``testsdkext`` class and the ``do_testsdkext`` task. | 119 | The tests use the :ref:`ref-classes-testsdk` class and the |
120 | ``do_testsdkext`` task. | ||
125 | 121 | ||
126 | - *Feature Testing:* Various scenario-based tests are run through the | 122 | - *Feature Testing:* Various scenario-based tests are run through the |
127 | :ref:`OpenEmbedded Self test (oe-selftest) <ref-manual/release-process:Testing and Quality Assurance>`. We test oe-selftest on each of the main distrubutions | 123 | :ref:`OpenEmbedded Self test (oe-selftest) <ref-manual/release-process:Testing and Quality Assurance>`. We test oe-selftest on each of the main distributions |
128 | we support. | 124 | we support. |
129 | 125 | ||
130 | - *Image Testing:* Image tests initiated through the following command:: | 126 | - *Image Testing:* Image tests initiated through the following command:: |
131 | 127 | ||
132 | $ bitbake image -c testimage | 128 | $ bitbake image -c testimage |
133 | 129 | ||
134 | The tests utilize the :ref:`testimage* <ref-classes-testimage*>` | 130 | The tests use the :ref:`ref-classes-testimage` |
135 | classes and the :ref:`ref-tasks-testimage` task. | 131 | class and the :ref:`ref-tasks-testimage` task. See the |
132 | :ref:`test-manual/runtime-testing:Performing Automated Runtime Testing` | ||
133 | section of the Yocto Project Test Environment Manual for more information. | ||
136 | 134 | ||
137 | - *Layer Testing:* The Autobuilder can test whether | 135 |
138 | specific layers work with the rest of the system. The layers tested | 136 |
@@ -142,7 +140,7 @@ thefollowing types of tests: | |||
142 | - *Package Testing:* A Package Test (ptest) runs tests against packages | 140 | - *Package Testing:* A Package Test (ptest) runs tests against packages |
143 | built by the OpenEmbedded build system on the target machine. See the | 141 | built by the OpenEmbedded build system on the target machine. See the |
144 | :ref:`Testing Packages With | 142 | :ref:`Testing Packages With |
145 | ptest <dev-manual/common-tasks:Testing Packages With ptest>` section | 143 | ptest <test-manual/ptest:Testing Packages With ptest>` section |
146 | in this manual and the | 144 |
147 | ":yocto_wiki:`Ptest </Ptest>`" Wiki page for more | 145 | ":yocto_wiki:`Ptest </Ptest>`" Wiki page for more |
148 | information on Ptest. | 146 | information on Ptest. |
@@ -151,7 +149,7 @@ thefollowing types of tests: | |||
151 | 149 | ||
152 | $ bitbake image -c testsdk | 150 | $ bitbake image -c testsdk |
153 | 151 | ||
154 | The tests utilize the :ref:`testsdk <ref-classes-testsdk>` class and | 152 | The tests use the :ref:`ref-classes-testsdk` class and |
155 | the ``do_testsdk`` task. | 153 | the ``do_testsdk`` task. |
156 | 154 | ||
157 | - *Unit Testing:* Unit tests on various components of the system run | 155 | - *Unit Testing:* Unit tests on various components of the system run |
@@ -174,48 +172,55 @@ Tests map into the codebase as follows: | |||
174 | which include the fetchers. The tests are located in | 172 | which include the fetchers. The tests are located in |
175 | ``bitbake/lib/*/tests``. | 173 | ``bitbake/lib/*/tests``. |
176 | 174 | ||
175 | Some of these tests run the ``bitbake`` command, so ``bitbake/bin`` | ||
176 | must be added to the ``PATH`` before running ``bitbake-selftest``. | ||
177 | From within the BitBake repository, run the following:: | 177 | From within the BitBake repository, run the following:: |
178 | 178 | ||
179 | $ bitbake-selftest | 179 | $ export PATH=$PWD/bin:$PATH |
180 | 180 | ||
181 | To skip tests that access the Internet, use the ``BB_SKIP_NETTEST`` | 181 | After that, you can run the selftest script:: |
182 | variable when running "bitbake-selftest" as follows:: | ||
183 | 182 | ||
184 | $ BB_SKIP_NETTEST=yes bitbake-selftest | 183 | $ bitbake-selftest |
185 | 184 | ||
186 | The default output is quiet and just prints a summary of what was | 185 | The default output is quiet and just prints a summary of what was |
187 | run. To see more information, there is a verbose option:: | 186 | run. To see more information, there is a verbose option:: |
188 | 187 | ||
189 | $ bitbake-selftest -v | 188 | $ bitbake-selftest -v |
190 | 189 | ||
190 | To skip tests that access the Internet, use the ``BB_SKIP_NETTESTS`` | ||
191 | variable when running ``bitbake-selftest`` as follows:: | ||
192 | |||
193 | $ BB_SKIP_NETTESTS=yes bitbake-selftest | ||
194 | |||
191 | Use this option when you wish to skip tests that access the network, | 195 | Use this option when you wish to skip tests that access the network, |
192 | which are mostly necessary to test the fetcher modules. To specify | 196 | which are mostly necessary to test the fetcher modules. To specify |
193 | individual test modules to run, append the test module name to the | 197 | individual test modules to run, append the test module name to the |
194 | "bitbake-selftest" command. For example, to specify the tests for the | 198 | ``bitbake-selftest`` command. For example, to specify the tests for |
195 | bb.data.module, run:: | 199 | ``bb.tests.data.DataExpansions``, run:: |
196 | 200 | ||
197 | $ bitbake-selftest bb.test.data.module | 201 | $ bitbake-selftest bb.tests.data.DataExpansions |
198 | 202 | ||
199 | You can also specify individual tests by defining the full name and module | 203 | You can also specify individual tests by defining the full name and module |
200 | plus the class path of the test, for example:: | 204 | plus the class path of the test, for example:: |
201 | 205 | ||
202 | $ bitbake-selftest bb.tests.data.TestOverrides.test_one_override | 206 | $ bitbake-selftest bb.tests.data.DataExpansions.test_one_var |
203 | 207 | ||
204 | The tests are based on `Python | 208 | The tests are based on |
205 | unittest <https://docs.python.org/3/library/unittest.html>`__. | 209 | `Python unittest <https://docs.python.org/3/library/unittest.html>`__. |
206 | 210 | ||
207 | - *oe-selftest:* | 211 | - *oe-selftest:* |
208 | 212 | ||
209 | - These tests use OE to test the workflows, which include testing | 213 | - These tests use OE to test the workflows, which include testing |
210 | specific features, behaviors of tasks, and API unit tests. | 214 | specific features, behaviors of tasks, and API unit tests. |
211 | 215 | ||
212 | - The tests can take advantage of parallelism through the "-j" | 216 | - The tests can take advantage of parallelism through the ``-j`` |
213 | option, which can specify a number of threads to spread the tests | 217 | option, which can specify a number of threads to spread the tests |
214 | across. Note that all tests from a given class of tests will run | 218 | across. Note that all tests from a given class of tests will run |
215 | in the same thread. To parallelize large numbers of tests you can | 219 | in the same thread. To parallelize large numbers of tests you can |
216 | split the class into multiple units. | 220 | split the class into multiple units. |
217 | 221 | ||
218 | - The tests are based on Python unittest. | 222 | - The tests are based on |
223 | `Python unittest <https://docs.python.org/3/library/unittest.html>`__. | ||
219 | 224 | ||
220 | - The code for the tests resides in | 225 | - The code for the tests resides in |
221 | ``meta/lib/oeqa/selftest/cases/``. | 226 | ``meta/lib/oeqa/selftest/cases/``. |
@@ -225,18 +230,18 @@ Tests map into the codebase as follows: | |||
225 | $ oe-selftest -a | 230 | $ oe-selftest -a |
226 | 231 | ||
227 | - To run a specific test, use the following command form where | 232 | - To run a specific test, use the following command form where |
228 | testname is the name of the specific test:: | 233 | ``testname`` is the name of the specific test:: |
229 | 234 | ||
230 | $ oe-selftest -r <testname> | 235 | $ oe-selftest -r <testname> |
231 | 236 | ||
232 | For example, the following command would run the tinfoil | 237 | For example, the following command would run the ``tinfoil`` |
233 | getVar API test:: | 238 | ``getVar`` API test:: |
234 | 239 | ||
235 | $ oe-selftest -r tinfoil.TinfoilTests.test_getvar | 240 | $ oe-selftest -r tinfoil.TinfoilTests.test_getvar |
236 | 241 | ||
237 | It is also possible to run a set | 242 | It is also possible to run a set |
238 | of tests. For example the following command will run all of the | 243 | of tests. For example the following command will run all of the |
239 | tinfoil tests:: | 244 | ``tinfoil`` tests:: |
240 | 245 | ||
241 | $ oe-selftest -r tinfoil | 246 | $ oe-selftest -r tinfoil |
242 | 247 | ||
@@ -271,7 +276,7 @@ Tests map into the codebase as follows: | |||
271 | - These tests build an extended SDK (eSDK), install that eSDK, and | 276 | - These tests build an extended SDK (eSDK), install that eSDK, and |
272 | run tests against the eSDK. | 277 | run tests against the eSDK. |
273 | 278 | ||
274 | - The code for these tests resides in ``meta/lib/oeqa/esdk``. | 279 | - The code for these tests resides in ``meta/lib/oeqa/sdkext/cases/``. |
275 | 280 | ||
276 | - To run the tests, use the following command form:: | 281 | - To run the tests, use the following command form:: |
277 | 282 | ||
@@ -298,13 +303,13 @@ Tests map into the codebase as follows: | |||
298 | Git repository. | 303 | Git repository. |
299 | 304 | ||
300 | Use the ``oe-build-perf-report`` command to generate text reports | 305 | Use the ``oe-build-perf-report`` command to generate text reports |
301 | and HTML reports with graphs of the performance data. For | 306 | and HTML reports with graphs of the performance data. See |
302 | examples, see | 307 | :yocto_dl:`html </releases/yocto/yocto-4.3/testresults/buildperf-debian11/perf-debian11_nanbield_20231019191258_15b576c410.html>` |
303 | :yocto_dl:`/releases/yocto/yocto-2.7/testresults/buildperf-centos7/perf-centos7.yoctoproject.org_warrior_20190414204758_0e39202.html` | ||
304 | and | 308 | and |
305 | :yocto_dl:`/releases/yocto/yocto-2.7/testresults/buildperf-centos7/perf-centos7.yoctoproject.org_warrior_20190414204758_0e39202.txt`. | 309 | :yocto_dl:`txt </releases/yocto/yocto-4.3/testresults/buildperf-debian11/perf-debian11_nanbield_20231019191258_15b576c410.txt>` |
310 | examples. | ||
306 | 311 | ||
307 | - The tests are contained in ``lib/oeqa/buildperf/test_basic.py``. | 312 | - The tests are contained in ``meta/lib/oeqa/buildperf/test_basic.py``. |
308 | 313 | ||
309 | Test Examples | 314 | Test Examples |
310 | ============= | 315 | ============= |
@@ -312,16 +317,14 @@ Test Examples | |||
312 | This section provides example tests for each of the tests listed in the | 317 | This section provides example tests for each of the tests listed in the |
313 | :ref:`test-manual/intro:How Tests Map to Areas of Code` section. | 318 | :ref:`test-manual/intro:How Tests Map to Areas of Code` section. |
314 | 319 | ||
315 | For oeqa tests, testcases for each area reside in the main test | 320 | - ``oe-selftest`` testcases reside in the ``meta/lib/oeqa/selftest/cases`` directory. |
316 | directory at ``meta/lib/oeqa/selftest/cases`` directory. | ||
317 | 321 | ||
318 | For oe-selftest. bitbake testcases reside in the ``lib/bb/tests/`` | 322 | - ``bitbake-selftest`` testcases reside in the ``bitbake/lib/bb/tests/`` directory. |
319 | directory. | ||
320 | 323 | ||
321 | ``bitbake-selftest`` | 324 | ``bitbake-selftest`` |
322 | -------------------- | 325 | -------------------- |
323 | 326 | ||
324 | A simple test example from ``lib/bb/tests/data.py`` is:: | 327 | A simple test example from ``bitbake/lib/bb/tests/data.py`` is:: |
325 | 328 | ||
326 | class DataExpansions(unittest.TestCase): | 329 | class DataExpansions(unittest.TestCase): |
327 | def setUp(self): | 330 | def setUp(self): |
@@ -334,21 +337,24 @@ A simple test example from ``lib/bb/tests/data.py`` is:: | |||
334 | val = self.d.expand("${foo}") | 337 | val = self.d.expand("${foo}") |
335 | self.assertEqual(str(val), "value_of_foo") | 338 | self.assertEqual(str(val), "value_of_foo") |
336 | 339 | ||
337 | In this example, a ``DataExpansions`` class of tests is created, | 340 | In this example, a ``DataExpansions`` class of tests is created, derived from |
338 | derived from standard python unittest. The class has a common ``setUp`` | 341 | standard `Python unittest <https://docs.python.org/3/library/unittest.html>`__. |
339 | function which is shared by all the tests in the class. A simple test is | 342 | The class has a common ``setUp`` function which is shared by all the tests in |
340 | then added to test that when a variable is expanded, the correct value | 343 | the class. A simple test is then added to test that when a variable is |
341 | is found. | 344 | expanded, the correct value is found. |
342 | 345 | ||
343 | Bitbake selftests are straightforward python unittest. Refer to the | 346 | BitBake selftests are straightforward |
344 | Python unittest documentation for additional information on writing | 347 | `Python unittest <https://docs.python.org/3/library/unittest.html>`__. |
345 | these tests at: https://docs.python.org/3/library/unittest.html. | 348 | Refer to the `Python unittest documentation |
349 | <https://docs.python.org/3/library/unittest.html>`__ for additional information | ||
350 | on writing such tests. | ||
346 | 351 | ||
347 | ``oe-selftest`` | 352 | ``oe-selftest`` |
348 | --------------- | 353 | --------------- |
349 | 354 | ||
350 | These tests are more complex due to the setup required behind the scenes | 355 | These tests are more complex due to the setup required behind the scenes |
351 | for full builds. Rather than directly using Python's unittest, the code | 356 | for full builds. Rather than directly using `Python unittest |
357 | <https://docs.python.org/3/library/unittest.html>`__, the code | ||
352 | wraps most of the standard objects. The tests can be simple, such as | 358 | wraps most of the standard objects. The tests can be simple, such as |
353 | testing a command from within the OE build environment using the | 359 | testing a command from within the OE build environment using the |
354 | following example:: | 360 | following example:: |
@@ -374,7 +380,7 @@ with common tasks, including: | |||
374 | - *Running a bitbake invocation for a build:* Use | 380 | - *Running a bitbake invocation for a build:* Use |
375 | ``oeqa.utils.commands.bitbake()`` | 381 | ``oeqa.utils.commands.bitbake()`` |
376 | 382 | ||
377 | - *Running a command:* Use ``oeqa.utils.commandsrunCmd()`` | 383 | - *Running a command:* Use ``oeqa.utils.commands.runCmd()`` |
378 | 384 | ||
379 | There is also a ``oeqa.utils.commands.runqemu()`` function for launching | 385 | There is also a ``oeqa.utils.commands.runqemu()`` function for launching |
380 | the ``runqemu`` command for testing things within a running, virtualized | 386 | the ``runqemu`` command for testing things within a running, virtualized |
@@ -385,14 +391,14 @@ so tests within a given test class should always run in the same build, | |||
385 | while tests in different classes or modules may be split into different | 391 | while tests in different classes or modules may be split into different |
386 | builds. There is no data store available for these tests since the tests | 392 | builds. There is no data store available for these tests since the tests |
387 | launch the ``bitbake`` command and exist outside of its context. As a | 393 | launch the ``bitbake`` command and exist outside of its context. As a |
388 | result, common bitbake library functions (bb.\*) are also unavailable. | 394 | result, common BitBake library functions (``bb.*``) are also unavailable. |
389 | 395 | ||
390 | ``testimage`` | 396 | ``testimage`` |
391 | ------------- | 397 | ------------- |
392 | 398 | ||
393 | These tests are run once an image is up and running, either on target | 399 | These tests are run once an image is up and running, either on target |
394 | hardware or under QEMU. As a result, they are assumed to be running in a | 400 | hardware or under QEMU. As a result, they are assumed to be running in a |
395 | target image environment, as opposed to a host build environment. A | 401 | target image environment, as opposed to in a host build environment. A |
396 | simple example from ``meta/lib/oeqa/runtime/cases/python.py`` contains | 402 | simple example from ``meta/lib/oeqa/runtime/cases/python.py`` contains |
397 | the following:: | 403 | the following:: |
398 | 404 | ||
@@ -407,19 +413,19 @@ the following:: | |||
407 | 413 | ||
408 | In this example, the ``OERuntimeTestCase`` class wraps | 414 | In this example, the ``OERuntimeTestCase`` class wraps |
409 | ``unittest.TestCase``. Within the test, ``self.target`` represents the | 415 | ``unittest.TestCase``. Within the test, ``self.target`` represents the |
410 | target system, where commands can be run on it using the ``run()`` | 416 | target system, where commands can be run using the ``run()`` |
411 | method. | 417 | method. |
412 | 418 | ||
413 | To ensure certain test or package dependencies are met, you can use the | 419 | To ensure certain tests or package dependencies are met, you can use the |
414 | ``OETestDepends`` and ``OEHasPackage`` decorators. For example, the test | 420 | ``OETestDepends`` and ``OEHasPackage`` decorators. For example, the test |
415 | in this example would only make sense if python3-core is installed in | 421 | in this example would only make sense if ``python3-core`` is installed in |
416 | the image. | 422 | the image. |
417 | 423 | ||
418 | ``testsdk_ext`` | 424 | ``testsdk_ext`` |
419 | --------------- | 425 | --------------- |
420 | 426 | ||
421 | These tests are run against built extensible SDKs (eSDKs). The tests can | 427 | These tests are run against built extensible SDKs (eSDKs). The tests can |
422 | assume that the eSDK environment has already been setup. An example from | 428 | assume that the eSDK environment has already been set up. An example from |
423 | ``meta/lib/oeqa/sdkext/cases/devtool.py`` contains the following:: | 429 |
424 | 430 | ||
425 | class DevtoolTest(OESDKExtTestCase): | 431 | class DevtoolTest(OESDKExtTestCase): |
@@ -452,7 +458,7 @@ the ``devtool build`` command within the eSDK. | |||
452 | 458 | ||
453 | These tests are run against built SDKs. The tests can assume that an SDK | 459 | These tests are run against built SDKs. The tests can assume that an SDK |
454 | has already been extracted and its environment file has been sourced. A | 460 | has already been extracted and its environment file has been sourced. A |
455 | simple example from ``meta/lib/oeqa/sdk/cases/python2.py`` contains the | 461 | simple example from ``meta/lib/oeqa/sdk/cases/python.py`` contains the |
456 | following:: | 462 | following:: |
457 | 463 | ||
458 | class Python3Test(OESDKTestCase): | 464 | class Python3Test(OESDKTestCase): |
@@ -466,15 +472,15 @@ following:: | |||
466 | output = self._run(cmd) | 472 | output = self._run(cmd) |
467 | self.assertEqual(output, "Hello, world\n") | 473 | self.assertEqual(output, "Hello, world\n") |
468 | 474 | ||
469 | In this example, if nativesdk-python3-core has been installed into the SDK, the code runs | 475 | In this example, if ``nativesdk-python3-core`` has been installed into the SDK, |
470 | the python3 interpreter with a basic command to check it is working | 476 | the code runs the ``python3`` interpreter with a basic command to check it is |
471 | correctly. The test would only run if python3 is installed in the SDK. | 477 | working correctly. The test would only run if Python3 is installed in the SDK. |
472 | 478 | ||
473 | ``oe-build-perf-test`` | 479 | ``oe-build-perf-test`` |
474 | ---------------------- | 480 | ---------------------- |
475 | 481 | ||
476 | The performance tests usually measure how long operations take and the | 482 | The performance tests usually measure how long operations take and the |
477 | resource utilisation as that happens. An example from | 483 | resource utilization as that happens. An example from |
478 | ``meta/lib/oeqa/buildperf/test_basic.py`` contains the following:: | 484 | ``meta/lib/oeqa/buildperf/test_basic.py`` contains the following:: |
479 | 485 | ||
480 | class Test3(BuildPerfTestCase): | 486 | class Test3(BuildPerfTestCase): |
@@ -506,15 +512,15 @@ workers, consider the following: | |||
506 | 512 | ||
507 | **Running "cleanall" is not permitted.** | 513 | **Running "cleanall" is not permitted.** |
508 | 514 | ||
509 | This can delete files from DL_DIR which would potentially break other | 515 | This can delete files from :term:`DL_DIR` which would potentially break other |
510 | builds running in parallel. If this is required, DL_DIR must be set to | 516 | builds running in parallel. If this is required, :term:`DL_DIR` must be set to |
511 | an isolated directory. | 517 | an isolated directory. |
512 | 518 | ||
513 | **Running "cleansstate" is not permitted.** | 519 | **Running "cleansstate" is not permitted.** |
514 | 520 | ||
515 | This can delete files from SSTATE_DIR which would potentially break | 521 | This can delete files from :term:`SSTATE_DIR` which would potentially break |
516 | other builds running in parallel. If this is required, SSTATE_DIR must | 522 | other builds running in parallel. If this is required, :term:`SSTATE_DIR` must |
517 | be set to an isolated directory. Alternatively, you can use the "-f" | 523 | be set to an isolated directory. Alternatively, you can use the ``-f`` |
518 | option with the ``bitbake`` command to "taint" tasks by changing the | 524 | option with the ``bitbake`` command to "taint" tasks by changing the |
519 | sstate checksums to ensure sstate cache items will not be reused. | 525 | sstate checksums to ensure sstate cache items will not be reused. |
520 | 526 | ||
@@ -524,5 +530,5 @@ This is particularly true for oe-selftests since these can run in | |||
524 | parallel and changing metadata leads to changing checksums, which | 530 | parallel and changing metadata leads to changing checksums, which |
525 | confuses BitBake while running in parallel. If this is necessary, copy | 531 | confuses BitBake while running in parallel. If this is necessary, copy |
526 | layers to a temporary location and modify them. Some tests need to | 532 | layers to a temporary location and modify them. Some tests need to |
527 | change metadata, such as the devtool tests. To prevent the metadate from | 533 | change metadata, such as the devtool tests. To protect the metadata from |
528 | changes, set up temporary copies of that data first. | 534 | changes, set up temporary copies of that data first. |
diff --git a/documentation/test-manual/ptest.rst b/documentation/test-manual/ptest.rst new file mode 100644 index 0000000000..4e6be35df5 --- /dev/null +++ b/documentation/test-manual/ptest.rst | |||
@@ -0,0 +1,135 @@ | |||
1 | .. SPDX-License-Identifier: CC-BY-SA-2.0-UK | ||
2 | |||
3 | *************************** | ||
4 | Testing Packages With ptest | ||
5 | *************************** | ||
6 | |||
7 | A Package Test (ptest) runs tests against packages built by the | ||
8 | OpenEmbedded build system on the target machine. A ptest contains at | ||
9 | least two items: the actual test, and a shell script (``run-ptest``) | ||
10 | that starts the test. The shell script that starts the test must not | ||
11 | contain the actual test --- the script only starts the test. On the other | ||
12 | hand, the test can be anything from a simple shell script that runs a | ||
13 | binary and checks the output to an elaborate system of test binaries and | ||
14 | data files. | ||
15 | |||
16 | The test generates output in the format used by Automake:: | ||
17 | |||
18 | result: testname | ||
19 | |||
20 | where the result can be ``PASS``, ``FAIL``, or ``SKIP``, and | ||
21 | the testname can be any identifying string. | ||
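
For example, a test run might print lines such as the following (the test
names here are purely illustrative)::

   PASS: test_endian
   FAIL: test_threads
   SKIP: test_network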
22 | |||
23 | For a list of Yocto Project recipes that are already enabled with ptest, | ||
24 | see the :yocto_wiki:`Ptest </Ptest>` wiki page. | ||
25 | |||
26 | .. note:: | ||
27 | |||
28 | A recipe is "ptest-enabled" if it inherits the :ref:`ref-classes-ptest` | ||
29 | class. | ||
30 | |||
31 | Adding ptest to Your Build | ||
32 | ========================== | ||
33 | |||
34 | To add package testing to your build, add the :term:`DISTRO_FEATURES` and | ||
35 | :term:`EXTRA_IMAGE_FEATURES` variables to your ``local.conf`` file, which | ||
36 | is found in the :term:`Build Directory`:: | ||
37 | |||
38 | DISTRO_FEATURES:append = " ptest" | ||
39 | EXTRA_IMAGE_FEATURES += "ptest-pkgs" | ||
40 | |||
41 | Once your build is complete, the ptest files are installed into the | ||
42 | ``/usr/lib/package/ptest`` directory within the image, where ``package`` | ||
43 | is the name of the package. | ||
44 | |||
45 | Running ptest | ||
46 | ============= | ||
47 | |||
48 | The ``ptest-runner`` package installs a shell script that loops through | ||
49 | all installed ptest test suites and runs them in sequence. | ||
50 | |||
51 | During execution, ``ptest-runner`` keeps count of the total and failed | ||
52 | ptests. At the end, the execution summary is written to the console. | ||
53 | If any of the ``run-ptest`` scripts fail, ``ptest-runner`` returns ``1``. | ||
54 | |||
55 | Consequently, you might want to add ``ptest-runner`` to your image. | ||
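
On the target, you can then run all installed ptests, or only those of
specific packages, with commands such as the following (``dbus`` is just an
example of a ptest-enabled package here)::

   # ptest-runner
   # ptest-runner dbus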
56 | |||
57 | |||
58 | Getting Your Package Ready | ||
59 | ========================== | ||
60 | |||
61 | In order to enable a recipe to run installed ``ptests`` on target hardware, | ||
62 | you need to prepare the recipes that build the packages you want to | ||
63 | test. Here is what you have to do for each recipe: | ||
64 | |||
65 | - *Be sure the recipe inherits the* :ref:`ref-classes-ptest` *class:* | ||
66 | Include the following line in each recipe:: | ||
67 | |||
68 | inherit ptest | ||
69 | |||
70 | .. note:: | ||
71 | |||
72 | Classes for common frameworks already exist in :term:`OpenEmbedded-Core | ||
73 | (OE-Core)`, such as: | ||
74 | |||
75 | - :oe_git:`go-ptest </openembedded-core/tree/meta/classes-recipe/go-ptest.bbclass>` | ||
76 | - :ref:`ref-classes-ptest-cargo` | ||
77 | - :ref:`ref-classes-ptest-gnome` | ||
78 | - :oe_git:`ptest-perl </openembedded-core/tree/meta/classes-recipe/ptest-perl.bbclass>` | ||
79 | - :oe_git:`ptest-python-pytest </openembedded-core/tree/meta/classes-recipe/ptest-python-pytest.bbclass>` | ||
80 | |||
81 | Inheriting these classes with the ``inherit`` keyword in your recipe will | ||
82 | make the next steps automatic. | ||
83 | |||
84 | - *Create run-ptest:* This script starts your test. Locate the | ||
85 | script where you will refer to it using | ||
86 | :term:`SRC_URI`. Be sure ``run-ptest`` exits with 0 when the test | ||
87 | succeeds; otherwise, the run will be marked as failed. | ||
88 | Here is an example that starts a test for ``dbus``:: | ||
89 | |||
90 | #!/bin/sh | ||
91 | cd test | ||
92 | make -k runtest-TESTS | ||
93 | |||
94 | - *Ensure dependencies are met:* If the test adds build or runtime | ||
95 | dependencies that normally do not exist for the package (such as | ||
96 | requiring "make" to run the test suite), use the | ||
97 | :term:`DEPENDS` and | ||
98 | :term:`RDEPENDS` variables in | ||
99 | your recipe in order for the package to meet the dependencies. Here | ||
100 | is an example where the package has a runtime dependency on "make":: | ||
101 | |||
102 | RDEPENDS:${PN}-ptest += "make" | ||
103 | |||
104 | - *Add a function to build the test suite:* Not many packages support | ||
105 | cross-compilation of their test suites. Consequently, you usually | ||
106 | need to add a cross-compilation function to the package. | ||
107 | |||
108 | Many packages based on Automake compile and run the test suite by | ||
109 | using a single command such as ``make check``. However, the host | ||
110 | ``make check`` builds and runs on the same computer, while | ||
111 | cross-compiling requires that the package is built on the host but | ||
112 | executed for the target architecture (though often, as in the case | ||
113 | for ptest, the execution occurs on the target). The built version of | ||
114 | Automake that ships with the Yocto Project includes a patch that | ||
115 | separates building and execution. Consequently, packages that use the | ||
116 | unaltered, patched version of ``make check`` automatically | ||
117 | cross-compile. | ||
118 | |||
119 | Regardless, you still must add a ``do_compile_ptest`` function to | ||
120 | build the test suite. Add a function similar to the following to your | ||
121 | recipe:: | ||
122 | |||
123 | do_compile_ptest() { | ||
124 | oe_runmake buildtest-TESTS | ||
125 | } | ||
126 | |||
127 | - *Ensure special configurations are set:* If the package requires | ||
128 | special configurations prior to compiling the test code, you must | ||
129 | insert a ``do_configure_ptest`` function into the recipe. | ||
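
For example, here is a sketch of such a function, generating a
configuration header that the test suite expects (the make target name is
hypothetical)::

   do_configure_ptest() {
       # Hypothetical example: generate the configuration header
       # that the test suite expects to find when it is compiled.
       oe_runmake tests/config.h
   }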
130 | |||
131 | - *Install the test suite:* The :ref:`ref-classes-ptest` class | ||
132 | automatically copies the file ``run-ptest`` to the target and then runs | ||
133 | ``make install-ptest`` to install the tests. If this is not enough, you need | ||
134 | to create a ``do_install_ptest`` function and make sure it gets | ||
135 | called after ``make install-ptest`` completes. | ||
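
For example, here is a sketch of such a function, assuming the test suite
needs extra data files next to ``run-ptest`` at runtime (the ``tests/data``
layout is hypothetical; ``${PTEST_PATH}`` is the standard installation
directory used by the class)::

   do_install_ptest() {
       # Hypothetical example: ship extra data files the tests
       # expect to find in the ptest installation directory.
       install -d ${D}${PTEST_PATH}/data
       cp -r ${B}/tests/data/* ${D}${PTEST_PATH}/data/
   }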
diff --git a/documentation/test-manual/reproducible-builds.rst b/documentation/test-manual/reproducible-builds.rst new file mode 100644 index 0000000000..b913aa4eaf --- /dev/null +++ b/documentation/test-manual/reproducible-builds.rst | |||
@@ -0,0 +1,167 @@ | |||
1 | .. SPDX-License-Identifier: CC-BY-SA-2.0-UK | ||
2 | |||
3 | ******************* | ||
4 | Reproducible Builds | ||
5 | ******************* | ||
6 | |||
7 | ================ | ||
8 | How we define it | ||
9 | ================ | ||
10 | |||
11 | The Yocto Project defines reproducibility as where a given input build | ||
12 | configuration will give the same binary output regardless of when it is built | ||
13 | (now or in 5 years time), regardless of the path on the filesystem the build is | ||
14 | run in, and regardless of the distro and tools on the underlying host system the | ||
15 | build is running on. | ||
16 | |||
17 | ============== | ||
18 | Why it matters | ||
19 | ============== | ||
20 | |||
21 | The project aligns with the `Reproducible Builds project | ||
22 | <https://reproducible-builds.org/>`__, which shares information about why | ||
23 | reproducibility matters. The primary focus of the project is the ability to | ||
24 | detect security issues being introduced. However, from a Yocto Project | ||
25 | perspective, it is also hugely important that our builds are deterministic. When | ||
26 | you build a given input set of metadata, we expect you to get consistent output. | ||
27 | This has always been a key focus but, :ref:`since release 3.1 ("dunfell") | ||
28 | <migration-guides/migration-3.1:reproducible builds now enabled by default>`, | ||
29 | it is now true down to the binary level including timestamps. | ||
30 | |||
31 | For example, at some point in the future life of a product, you find that you | ||
32 | need to rebuild to add a security fix. If this happens, only the components that | ||
33 | have been modified should change at the binary level. This would lead to much | ||
34 | easier and clearer bounds on where validation is needed. | ||
35 | |||
36 | This also gives an additional benefit to the project builds themselves, our | ||
37 | :ref:`overview-manual/concepts:Hash Equivalence` for | ||
38 | :ref:`overview-manual/concepts:Shared State` object reuse works much more | ||
39 | effectively when the binary output remains the same. | ||
40 | |||
41 | .. note:: | ||
42 | |||
43 | We strongly advise you to make sure your project builds reproducibly | ||
44 | before finalizing your production images. It would be too late if you | ||
45 | only address this issue when the first updates are required. | ||
46 | |||
47 | =================== | ||
48 | How we implement it | ||
49 | =================== | ||
50 | |||
51 | There are many different aspects to build reproducibility, but some particular | ||
52 | things we do within the build system to ensure reproducibility include: | ||
53 | |||
54 | - Adding mappings to the compiler options to ensure debug filepaths are mapped | ||
55 | to consistent target compatible paths. This is done through the | ||
56 | :term:`DEBUG_PREFIX_MAP` variable which sets the ``-fmacro-prefix-map`` and | ||
57 | ``-fdebug-prefix-map`` compiler options correctly to map to target paths. | ||
58 | - Being explicit about recipe dependencies and their configuration (no floating | ||
59 | configure options or host dependencies creeping in). In particular this means | ||
60 | making sure :term:`PACKAGECONFIG` covers configure options which might | ||
61 | otherwise auto-detect host dependencies (see the example after this list). | ||
62 | - Using recipe specific sysroots to isolate recipes so they only see their | ||
63 | dependencies. These are visible as ``recipe-sysroot`` and | ||
64 | ``recipe-sysroot-native`` directories within the :term:`WORKDIR` of a given | ||
65 | recipe and are populated only with the dependencies a recipe has. | ||
66 | - Build images from a reduced package set: only packages from recipes the image | ||
67 | depends upon. | ||
68 | - Filtering the tools available from the host's ``PATH`` to only a specific set | ||
69 | of tools, set using the :term:`HOSTTOOLS` variable. | ||
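
To illustrate the :term:`PACKAGECONFIG` point above, a recipe can make every
optional feature explicit, so that nothing is silently auto-detected from the
host. A hypothetical recipe excerpt::

   # Each feature maps to explicit configure options and dependencies, so
   # the result cannot vary with what happens to be installed on the host.
   PACKAGECONFIG ??= "zlib"
   PACKAGECONFIG[zlib] = "--with-zlib,--without-zlib,zlib"
   PACKAGECONFIG[ssl] = "--enable-ssl,--disable-ssl,openssl"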
70 | |||
71 | ========================================= | ||
72 | Can we prove the project is reproducible? | ||
73 | ========================================= | ||
74 | |||
75 | Yes, we can prove it and we regularly test this on the Autobuilder. At the | ||
76 | time of writing (release 3.3, "hardknott"), :term:`OpenEmbedded-Core (OE-Core)` | ||
77 | is 100% reproducible for all its recipes (i.e. world builds) apart from the Go | ||
78 | language and Ruby documentation packages. Unfortunately, the current | ||
79 | implementation of the Go language has fundamental reproducibility problems as | ||
80 | it always depends upon the paths it is built in. | ||
81 | |||
82 | .. note:: | ||
83 | |||
84 | Only BitBake and :term:`OpenEmbedded-Core (OE-Core)`, which is the ``meta`` | ||
85 | layer in Poky, guarantee complete reproducibility. The moment you add | ||
86 | another layer, this warranty is voided, because of additional configuration | ||
87 | files, ``bbappend`` files, overridden classes, etc. | ||
88 | |||
89 | To run our automated selftest, as we use in our CI on the Autobuilder, you can | ||
90 | run:: | ||
91 | |||
92 | oe-selftest -r reproducible.ReproducibleTests.test_reproducible_builds | ||
93 | |||
94 | This defaults to including a ``world`` build so, if other layers are added, it | ||
95 | would also run the tests for recipes in the additional layers. Different build | ||
96 | targets can be defined using the :term:`OEQA_REPRODUCIBLE_TEST_TARGET` variable | ||
97 | in ``local.conf``. For example, running reproducibility tests for only the | ||
98 | ``python3-numpy`` recipe can be done by setting:: | ||
99 | |||
100 | OEQA_REPRODUCIBLE_TEST_TARGET = "python3-numpy" | ||
101 | |||
102 | in ``local.conf`` before running the ``oe-selftest`` command shown above. | ||
103 | |||
104 | The reproducibility test builds the target list twice. The first build runs using | ||
105 | :ref:`Shared State <overview-manual/concepts:Shared State>` if available, the | ||
106 | second build explicitly disables :ref:`Shared State | ||
107 | <overview-manual/concepts:Shared State>` except for recipes defined in the | ||
108 | :term:`OEQA_REPRODUCIBLE_TEST_SSTATE_TARGETS` variable, and builds on the | ||
109 | specific host the build is running on. This means we can test reproducibility | ||
110 | builds between different host distributions over time on the Autobuilder. | ||
111 | |||
112 | If ``OEQA_DEBUGGING_SAVED_OUTPUT`` is set, any differing packages are saved | ||
113 | in that location. The test can also run the ``diffoscope`` command on the output to | ||
114 | generate HTML files showing the differences between the packages, to aid | ||
115 | debugging. On the Autobuilder, these appear under | ||
116 | https://autobuilder.yocto.io/pub/repro-fail/ in the form ``oe-reproducible + | ||
117 | <date> + <random ID>``, e.g. ``oe-reproducible-20200202-1lm8o1th``. | ||
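
When debugging such failures locally, ``diffoscope`` can also be invoked by
hand on any two packages that should have been identical, for example
(illustrative file names)::

   $ diffoscope --html-dir html-report/ package-A.ipk package-B.ipk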
118 | |||
119 | The project's current reproducibility status can be seen at | ||
120 | :yocto_home:`/reproducible-build-results/`. | ||
121 | |||
122 | You can also check the reproducibility status on the Autobuilder: | ||
123 | :yocto_ab:`/valkyrie/#/builders/reproducible`. | ||
124 | |||
125 | =================================== | ||
126 | How can I test my layer or recipes? | ||
127 | =================================== | ||
128 | |||
129 | With world build | ||
130 | ~~~~~~~~~~~~~~~~ | ||
131 | |||
132 | Once again, you can run a ``world`` test using the | ||
133 | :ref:`oe-selftest <ref-manual/release-process:Testing and Quality Assurance>` | ||
134 | command provided above. This functionality is implemented | ||
135 | in :oe_git:`meta/lib/oeqa/selftest/cases/reproducible.py | ||
136 | </openembedded-core/tree/meta/lib/oeqa/selftest/cases/reproducible.py>`. | ||
137 | |||
138 | Subclassing the test | ||
139 | ~~~~~~~~~~~~~~~~~~~~ | ||
140 | |||
141 | You could subclass the test and change ``targets`` to a different target. | ||
142 | |||
143 | You may also change ``sstate_targets`` which would allow you to "pre-cache" some | ||
144 | set of recipes before the test, meaning they are excluded from reproducibility | ||
145 | testing. As a practical example, you could set ``sstate_targets`` to | ||
146 | ``core-image-sato``, then setting ``targets`` to ``core-image-sato-sdk`` would | ||
147 | run reproducibility tests only on the targets belonging to ``core-image-sato-sdk``. | ||
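
For example, a minimal subclass implementing that scenario might look like
the following sketch (the file location and class name are hypothetical)::

   # meta-mylayer/lib/oeqa/selftest/cases/myrepro.py
   from oeqa.selftest.cases.reproducible import ReproducibleTests

   class MyLayerReproducibleTests(ReproducibleTests):
       # Pre-cache everything core-image-sato pulls in...
       sstate_targets = ['core-image-sato']
       # ...and check reproducibility only for what the SDK image adds.
       targets = ['core-image-sato-sdk']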
148 | |||
149 | Using :term:`OEQA_REPRODUCIBLE_TEST_* <OEQA_REPRODUCIBLE_TEST_LEAF_TARGETS>` variables | ||
150 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ | ||
151 | |||
152 | If you want to test the reproducibility of a set of recipes, you can define | ||
153 | :term:`OEQA_REPRODUCIBLE_TEST_LEAF_TARGETS` in your ``local.conf``:: | ||
154 | |||
155 | OEQA_REPRODUCIBLE_TEST_LEAF_TARGETS = "my-recipe" | ||
156 | |||
157 | This will test the reproducibility of ``my-recipe`` but will use the | ||
158 | :ref:`Shared State <overview-manual/concepts:Shared State>` for most of its | ||
159 | dependencies (i.e. the ones explicitly listed in :term:`DEPENDS`, which may not | ||
160 | be all dependencies, cf. ``[depends]`` varflags, ``PACKAGE_DEPENDS`` and other | ||
161 | implementations). | ||
162 | |||
163 | You can have finer control on the test with: | ||
164 | |||
165 | - :term:`OEQA_REPRODUCIBLE_TEST_TARGET`: lists recipes to be built, | ||
166 | - :term:`OEQA_REPRODUCIBLE_TEST_SSTATE_TARGETS`: lists recipes that will | ||
167 | be built using :ref:`Shared State <overview-manual/concepts:Shared State>`. | ||
diff --git a/documentation/test-manual/runtime-testing.rst b/documentation/test-manual/runtime-testing.rst new file mode 100644 index 0000000000..557e0530b0 --- /dev/null +++ b/documentation/test-manual/runtime-testing.rst | |||
@@ -0,0 +1,595 @@ | |||
1 | .. SPDX-License-Identifier: CC-BY-SA-2.0-UK | ||
2 | |||
3 | ************************************ | ||
4 | Performing Automated Runtime Testing | ||
5 | ************************************ | ||
6 | |||
7 | The OpenEmbedded build system makes available a series of automated | ||
8 | tests for images to verify runtime functionality. You can run these | ||
9 | tests on either QEMU or actual target hardware. Tests are written in | ||
10 | Python, making use of the ``unittest`` module, and the majority of them | ||
11 | run commands on the target system over SSH. This section describes how | ||
12 | you set up the environment to use these tests, run available tests, and | ||
13 | write and add your own tests. | ||
14 | |||
15 | For information on the test and QA infrastructure available within the | ||
16 | Yocto Project, see the ":ref:`ref-manual/release-process:testing and quality assurance`" | ||
17 | section in the Yocto Project Reference Manual. | ||
18 | |||
19 | Enabling Tests | ||
20 | ============== | ||
21 | |||
22 | Depending on whether you are planning to run tests using QEMU or on the | ||
23 | hardware, you have to take different steps to enable the tests. See the | ||
24 | following subsections for information on how to enable both types of | ||
25 | tests. | ||
26 | |||
27 | Enabling Runtime Tests on QEMU | ||
28 | ------------------------------ | ||
29 | |||
30 | In order to run tests, you need to do the following: | ||
31 | |||
32 | - *Set up to avoid interaction with sudo for networking:* To | ||
33 | accomplish this, you must do one of the following: | ||
34 | |||
35 | - Add ``NOPASSWD`` for your user in ``/etc/sudoers`` either for all | ||
36 | commands or just for ``runqemu-ifup``. You must provide the full | ||
37 | path as that can change if you are using multiple clones of the | ||
38 | source repository (see the example after this list). | ||
39 | |||
40 | .. note:: | ||
41 | |||
42 | On some distributions, you also need to comment out "Defaults | ||
43 | requiretty" in ``/etc/sudoers``. | ||
44 | |||
45 | - Manually configure a tap interface for your system. | ||
46 | |||
47 | - Run as root the script in ``scripts/runqemu-gen-tapdevs``, which | ||
48 | should generate a list of tap devices. This is the option | ||
49 | typically chosen for Autobuilder-type environments. | ||
50 | |||
51 | .. note:: | ||
52 | |||
53 | - Be sure to use an absolute path when calling this script | ||
54 | with sudo. | ||
55 | |||
56 | - Ensure that your host has the package ``iptables`` installed. | ||
57 | |||
58 | - The package recipe ``qemu-helper-native`` is required to run | ||
59 | this script. Build the package using the following command:: | ||
60 | |||
61 | $ bitbake qemu-helper-native | ||
62 | |||
63 | - *Set the DISPLAY variable:* You need to set this variable so that | ||
64 | you have an X server available (e.g. start ``vncserver`` for a | ||
65 | headless machine). | ||
66 | |||
67 | - *Be sure your host's firewall accepts incoming connections from | ||
68 | 192.168.7.0/24:* Some of the tests (in particular DNF tests) start an | ||
69 | HTTP server on a random high number port, which is used to serve | ||
70 | files to the target. The DNF module serves | ||
71 | ``${WORKDIR}/oe-rootfs-repo`` so it can run DNF channel commands. | ||
72 | That means your host's firewall must accept incoming connections from | ||
73 | 192.168.7.0/24, which is the default IP range used for tap devices by | ||
74 | ``runqemu``. | ||
75 | |||
76 | - *Be sure your host has the correct packages installed:* Depending on | ||
77 | your host's distribution, you need to have the following packages | ||
78 | installed: | ||
79 | |||
80 | - Ubuntu and Debian: ``sysstat`` and ``iproute2`` | ||
81 | |||
82 | - openSUSE: ``sysstat`` and ``iproute2`` | ||
83 | |||
84 | - Fedora: ``sysstat`` and ``iproute`` | ||
85 | |||
86 | - CentOS: ``sysstat`` and ``iproute`` | ||
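
For reference, the ``NOPASSWD`` entry mentioned earlier might look like the
following line in ``/etc/sudoers`` (the user name and path are examples; use
the full path to ``runqemu-ifup`` in your own checkout)::

   myuser ALL = NOPASSWD: /home/myuser/poky/scripts/runqemu-ifup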
87 | |||
88 | Once you start running the tests, the following happens: | ||
89 | |||
90 | #. A copy of the root filesystem is written to ``${WORKDIR}/testimage``. | ||
91 | |||
92 | #. The image is booted under QEMU using the standard ``runqemu`` script. | ||
93 | |||
94 | #. A default timeout of 500 seconds occurs to allow for the boot process | ||
95 | to reach the login prompt. You can change the timeout period by | ||
96 | setting | ||
97 | :term:`TEST_QEMUBOOT_TIMEOUT` | ||
98 | in the ``local.conf`` file. | ||
99 | |||
100 | #. Once the boot process completes and the login prompt appears, the | ||
101 | tests run. The full boot log is written to | ||
102 | ``${WORKDIR}/testimage/qemu_boot_log``. | ||
103 | |||
104 | #. Each test module loads in the order found in :term:`TEST_SUITES`. You can | ||
105 | find the full output of the commands run over SSH in | ||
106 | ``${WORKDIR}/testimage/ssh_target_log``. | ||
107 | |||
108 | #. If no failures occur, the task running the tests ends successfully. | ||
109 | You can find the output from the ``unittest`` in the task log at | ||
110 | ``${WORKDIR}/temp/log.do_testimage``. | ||
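
As a quick recap, a minimal ``local.conf`` setup for QEMU testing might look
like the following sketch (the values are illustrative; ``IMAGE_CLASSES``
enables the :ref:`ref-classes-testimage` class and :term:`TEST_SUITES` lists
the suites to run)::

   IMAGE_CLASSES += "testimage"
   TEST_TARGET = "qemu"
   TEST_SUITES = "ping ssh df date"
   TEST_QEMUBOOT_TIMEOUT = "1000"

You would then start the tests with ``bitbake <image> -c testimage``.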
111 | |||
112 | Enabling Runtime Tests on Hardware | ||
113 | ---------------------------------- | ||
114 | |||
115 | The OpenEmbedded build system can run tests on real hardware, and for | ||
116 | certain devices it can also deploy the image to be tested onto the | ||
117 | device beforehand. | ||
118 | |||
119 | For automated deployment, a "controller image" is installed onto the | ||
120 | hardware once as part of setup. Then, each time tests are to be run, the | ||
121 | following occurs: | ||
122 | |||
123 | #. The controller image is booted into and used to write the image to be | ||
124 | tested to a second partition. | ||
125 | |||
126 | #. The device is then rebooted using an external script that you need to | ||
127 | provide. | ||
128 | |||
129 | #. The device boots into the image to be tested. | ||
130 | |||
131 | When running tests (independent of whether the image has been deployed | ||
132 | automatically or not), the device is expected to be connected to a | ||
133 | network on a pre-determined IP address. You can either use static IP | ||
134 | addresses written into the image, or set the image to use DHCP and have | ||
135 | your DHCP server on the test network assign a known IP address based on | ||
136 | the MAC address of the device. | ||
137 | |||
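For example, with a ``dnsmasq``-based DHCP server (assumed here purely for
illustration), a fixed address can be tied to the device's MAC address
with a single configuration line::

   dhcp-host=00:11:22:33:44:55,192.168.2.3
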
138 | In order to run tests on hardware, you need to set :term:`TEST_TARGET` to an | ||
139 | appropriate value. For QEMU, you do not have to change anything, as the | ||
140 | default value is "qemu". For running tests on hardware, the following | ||
141 | options are available: | ||
142 | |||
143 | - *"simpleremote":* Choose "simpleremote" if you are going to run tests | ||
144 | on a target system that is already running the image to be tested and | ||
145 | is available on the network. You can use "simpleremote" in | ||
146 | conjunction with either real hardware or an image running within a | ||
147 | separately started QEMU or any other virtual machine manager. | ||
148 | |||
149 | - *"SystemdbootTarget":* Choose "SystemdbootTarget" if your hardware is | ||
150 | an EFI-based machine with ``systemd-boot`` as bootloader and | ||
151 | ``core-image-testmaster`` (or something similar) is installed. Also, | ||
152 | your hardware under test must be in a DHCP-enabled network that gives | ||
153 | it the same IP address for each reboot. | ||
154 | |||
155 | If you choose "SystemdbootTarget", there are additional requirements | ||
156 | and considerations. See the | ||
157 | ":ref:`test-manual/runtime-testing:selecting systemdboottarget`" section, which | ||
158 | follows, for more information. | ||
159 | |||
160 | - *"BeagleBoneTarget":* Choose "BeagleBoneTarget" if you are deploying | ||
161 | images and running tests on the BeagleBone "Black" or original | ||
162 | "White" hardware. For information on how to use these tests, see the | ||
163 | comments at the top of the BeagleBoneTarget | ||
164 | ``meta-yocto-bsp/lib/oeqa/controllers/beaglebonetarget.py`` file. | ||
165 | |||
166 | - *"GrubTarget":* Choose "GrubTarget" if you are deploying images and running | ||
167 | tests on any generic PC that boots using GRUB. For information on how | ||
168 | to use these tests, see the comments at the top of the GrubTarget | ||
169 | ``meta-yocto-bsp/lib/oeqa/controllers/grubtarget.py`` file. | ||
170 | |||
171 | - *"your-target":* Create your own custom target if you want to run | ||
172 | tests when you are deploying images and running tests on a custom | ||
173 | machine within your BSP layer. To do this, you need to add a Python | ||
174 | unit that defines the target class under ``lib/oeqa/controllers/`` | ||
175 | within your layer. You must also provide an empty ``__init__.py``. | ||
176 | For examples, see files in ``meta-yocto-bsp/lib/oeqa/controllers/``. | ||
177 | |||
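As a rough sketch of what such a Python unit can look like, the skeleton
below mirrors the structure of the BeagleBone example. The module, class
and base-class names are illustrative assumptions; check the files in
``meta-yocto-bsp/lib/oeqa/controllers/`` for the exact API used by your
release:

.. code-block:: python

   # layer/lib/oeqa/controllers/mytarget.py (illustrative name)
   from oeqa.controllers.controllerimage import ControllerImageHardwareTarget

   class MyBoardTarget(ControllerImageHardwareTarget):

       def _deploy(self):
           # Write the image to be tested to the "testrootfs" partition
           # using the tools available in the controller image.
           pass

       def _start(self, params=None):
           # Reboot the board into the freshly deployed image, for example
           # through the configured power control or serial console commands.
           pass
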
178 | Selecting SystemdbootTarget | ||
179 | --------------------------- | ||
180 | |||
181 | If you did not set :term:`TEST_TARGET` to "SystemdbootTarget", then you do | ||
182 | not need any information in this section. You can skip down to the | ||
183 | ":ref:`test-manual/runtime-testing:running tests`" section. | ||
184 | |||
185 | If you did set :term:`TEST_TARGET` to "SystemdbootTarget", you also need to | ||
186 | perform a one-time setup of your controller image by doing the following: | ||
187 | |||
188 | #. *Set EFI_PROVIDER:* Be sure that :term:`EFI_PROVIDER` is as follows:: | ||
189 | |||
190 | EFI_PROVIDER = "systemd-boot" | ||
191 | |||
192 | #. *Build the controller image:* Build the ``core-image-testmaster`` image. | ||
193 | The ``core-image-testmaster`` recipe is provided as an example for a | ||
194 | "controller" image and you can customize the image recipe as you would | ||
195 | any other recipe. | ||
196 | |||
197 | Image recipe requirements are: | ||
198 | |||
199 | - Inherits ``core-image`` so that kernel modules are installed. | ||
200 | |||
201 | - Installs normal Linux utilities, not BusyBox ones (e.g. ``bash``, | ||
202 | ``coreutils``, ``tar``, ``gzip``, and ``kmod``). | ||
203 | |||
204 | - Uses a custom :term:`Initramfs` image with a custom | ||
205 | installer. A normal image that you can install usually creates a | ||
206 | single root filesystem partition. This image uses another installer that | ||
207 | creates a specific partition layout. Not all Board Support | ||
208 | Packages (BSPs) can use an installer. For such cases, you need to | ||
209 | manually create the following partition layout on the target: | ||
210 | |||
211 | - First partition mounted under ``/boot``, labeled "boot". | ||
212 | |||
213 | - The main root filesystem partition where this image gets installed, | ||
214 | which is mounted under ``/``. | ||
215 | |||
216 | - Another partition labeled "testrootfs" where test images get | ||
217 | deployed. | ||
218 | |||
219 | #. *Install image:* Install the image that you just built on the target | ||
220 | system. | ||
221 | |||
222 | The final thing you need to do when setting :term:`TEST_TARGET` to | ||
223 | "SystemdbootTarget" is to set up the test image: | ||
224 | |||
225 | #. *Set up your local.conf file:* Make sure you have the following | ||
226 | statements in your ``local.conf`` file:: | ||
227 | |||
228 | IMAGE_FSTYPES += "tar.gz" | ||
229 | IMAGE_CLASSES += "testimage" | ||
230 | TEST_TARGET = "SystemdbootTarget" | ||
231 | TEST_TARGET_IP = "192.168.2.3" | ||
232 | |||
233 | #. *Build your test image:* Use BitBake to build the image:: | ||
234 | |||
235 | $ bitbake core-image-sato | ||
236 | |||
237 | Power Control | ||
238 | ------------- | ||
239 | |||
240 | For most hardware targets other than "simpleremote", you can control | ||
241 | power: | ||
242 | |||
243 | - You can use :term:`TEST_POWERCONTROL_CMD` together with | ||
244 | :term:`TEST_POWERCONTROL_EXTRA_ARGS` as a command that runs on the host | ||
245 | and does power cycling. The test code passes one argument to that | ||
246 | command: off, on or cycle (off then on). Here is an example that | ||
247 | could appear in your ``local.conf`` file:: | ||
248 | |||
249 | TEST_POWERCONTROL_CMD = "powercontrol.exp test 10.11.12.1 nuc1" | ||
250 | |||
251 | In this example, the expect | ||
252 | script does the following: | ||
253 | |||
254 | .. code-block:: shell | ||
255 | |||
256 | ssh test@10.11.12.1 "pyctl nuc1 arg" | ||
257 | |||
258 | It then runs a Python script that controls power for a label called | ||
259 | ``nuc1``. | ||
260 | |||
261 | .. note:: | ||
262 | |||
263 | You need to customize :term:`TEST_POWERCONTROL_CMD` and | ||
264 | :term:`TEST_POWERCONTROL_EXTRA_ARGS` for your own setup. The one requirement | ||
265 | is that the command accepts "on", "off", and "cycle" as the last argument (a minimal wrapper sketch is shown after this list). | ||
266 | |||
267 | - When no command is defined, the test framework connects to the | ||
268 | device over SSH and uses the classic ``reboot`` command to reboot the | ||
269 | device. Classic reboot is fine as long as the machine actually reboots | ||
270 | (i.e. the SSH test has not failed). This approach is useful for | ||
271 | scenarios where you have a simple setup, typically with a single | ||
272 | board, and where some manual interaction is acceptable from time to time. | ||
273 | |||
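The wrapper referenced in the note above can be very simple. In the shell
sketch below, ``pdu-cli`` is a hypothetical placeholder for whatever
command actually drives your power switch:

.. code-block:: shell

   #!/bin/sh
   # The test code appends "on", "off" or "cycle" as the last argument;
   # when no extra arguments are configured, that is also "$1".
   case "$1" in
       on)    pdu-cli outlet1 on ;;
       off)   pdu-cli outlet1 off ;;
       cycle) pdu-cli outlet1 off; sleep 5; pdu-cli outlet1 on ;;
   esac
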
274 | If you have no hardware to automatically perform power control but still | ||
275 | wish to experiment with automated hardware testing, you can use the | ||
276 | ``dialog-power-control`` script that shows a dialog prompting you to perform | ||
277 | the required power action. This script requires either KDialog or Zenity | ||
278 | to be installed. To use this script, set the | ||
279 | :term:`TEST_POWERCONTROL_CMD` | ||
280 | variable as follows:: | ||
281 | |||
282 | TEST_POWERCONTROL_CMD = "${COREBASE}/scripts/contrib/dialog-power-control" | ||
283 | |||
284 | Serial Console Connection | ||
285 | ------------------------- | ||
286 | |||
287 | For test target classes requiring a serial console to interact with the | ||
288 | bootloader (e.g. BeagleBoneTarget and GrubTarget), | ||
289 | you need to specify a command to use to connect to the serial console of | ||
290 | the target machine by using the | ||
291 | :term:`TEST_SERIALCONTROL_CMD` | ||
292 | variable and optionally the | ||
293 | :term:`TEST_SERIALCONTROL_EXTRA_ARGS` | ||
294 | variable. | ||
295 | |||
296 | This command could be a serial terminal program if the machine is | ||
297 | connected to a local serial port, or a ``telnet`` or ``ssh`` command | ||
298 | connecting to a remote console server. Regardless of the case, the | ||
299 | command simply needs to connect to the serial console and forward that | ||
300 | connection to standard input and output as any normal terminal program | ||
301 | does. For example, to use the ``picocom`` terminal program on serial | ||
302 | device ``/dev/ttyUSB0`` at 115200 bps, you would set the variable as follows:: | ||
303 | |||
304 | TEST_SERIALCONTROL_CMD = "picocom /dev/ttyUSB0 -b 115200" | ||
305 | |||
306 | For local | ||
307 | devices where the serial port device disappears when the device reboots, | ||
308 | an additional "serdevtry" wrapper script is provided. To use this | ||
309 | wrapper, simply prefix the terminal command with | ||
310 | ``${COREBASE}/scripts/contrib/serdevtry``:: | ||
311 | |||
312 | TEST_SERIALCONTROL_CMD = "${COREBASE}/scripts/contrib/serdevtry picocom -b 115200 /dev/ttyUSB0" | ||
313 | |||
314 | Running Tests | ||
315 | ============= | ||
316 | |||
317 | You can start the tests automatically or manually: | ||
318 | |||
319 | - *Automatically running tests:* To run the tests automatically after the | ||
320 | OpenEmbedded build system successfully creates an image, first set the | ||
321 | :term:`TESTIMAGE_AUTO` variable to "1" in your ``local.conf`` file in the | ||
322 | :term:`Build Directory`:: | ||
323 | |||
324 | TESTIMAGE_AUTO = "1" | ||
325 | |||
326 | Next, build your image. If the image successfully builds, the | ||
327 | tests run:: | ||
328 | |||
329 | $ bitbake core-image-sato | ||
330 | |||
331 | - *Manually running tests:* To manually run the tests, first globally | ||
332 | inherit the :ref:`ref-classes-testimage` class by editing your | ||
333 | ``local.conf`` file:: | ||
334 | |||
335 | IMAGE_CLASSES += "testimage" | ||
336 | |||
337 | Next, use BitBake to run the tests:: | ||
338 | |||
339 | $ bitbake -c testimage image | ||
340 | |||
341 | All test files reside in ``meta/lib/oeqa/runtime/cases`` in the | ||
342 | :term:`Source Directory`. A test name maps | ||
343 | directly to a Python module. Each test module may contain a number of | ||
344 | individual tests. Tests are usually grouped together by the area tested | ||
345 | (e.g. tests for systemd reside in ``meta/lib/oeqa/runtime/cases/systemd.py``). | ||
346 | |||
347 | You can add tests to any layer provided you place them in the proper | ||
348 | area and you extend :term:`BBPATH` in | ||
349 | the ``local.conf`` file as normal. Be sure that tests reside in | ||
350 | ``layer/lib/oeqa/runtime/cases``. | ||
351 | |||
352 | .. note:: | ||
353 | |||
354 | Be sure that module names do not collide with module names used in | ||
355 | the default set of test modules in ``meta/lib/oeqa/runtime/cases``. | ||
356 | |||
357 | You can change the set of tests run by appending to or overriding the | ||
358 | :term:`TEST_SUITES` variable in | ||
359 | ``local.conf``. Each name in :term:`TEST_SUITES` represents a required test | ||
360 | for the image. Test modules named within :term:`TEST_SUITES` cannot be | ||
361 | skipped even if a test is not suitable for an image (e.g. running the | ||
362 | RPM tests on an image without ``rpm``). Appending "auto" to | ||
363 | :term:`TEST_SUITES` causes the build system to try to run all tests that are | ||
364 | suitable for the image (i.e. each test module may elect to skip itself). | ||
365 | |||
366 | The order you list tests in :term:`TEST_SUITES` is important and influences | ||
367 | test dependencies. Consequently, tests that depend on other tests should | ||
368 | be added after the test on which they depend. For example, since the | ||
369 | ``ssh`` test depends on the ``ping`` test, "ssh" needs to come after | ||
370 | "ping" in the list. The test class provides no re-ordering or dependency | ||
371 | handling. | ||
372 | |||
373 | .. note:: | ||
374 | |||
375 | Each module can have multiple classes with multiple test methods. | ||
376 | And, Python ``unittest`` rules apply. | ||
377 | |||
378 | Here are some things to keep in mind when running tests: | ||
379 | |||
380 | - The default tests for the image are defined as:: | ||
381 | |||
382 | DEFAULT_TEST_SUITES:pn-image = "ping ssh df connman syslog xorg scp vnc date rpm dnf dmesg" | ||
383 | |||
384 | - Add your own test to the list by using the following:: | ||
385 | |||
386 | TEST_SUITES:append = " mytest" | ||
387 | |||
388 | - Run a specific list of tests as follows:: | ||
389 | |||
390 | TEST_SUITES = "test1 test2 test3" | ||
391 | |||
392 | Remember, order is important. Be sure to place a test that is | ||
393 | dependent on another test later in the order. | ||
394 | |||
395 | Exporting Tests | ||
396 | =============== | ||
397 | |||
398 | You can export tests so that they can run independently of the build | ||
399 | system. Exporting tests is required if you want to be able to hand the | ||
400 | test execution off to a scheduler. You can only export tests that are | ||
401 | defined in :term:`TEST_SUITES`. | ||
402 | |||
403 | If your image is already built, make sure the following are set in your | ||
404 | ``local.conf`` file:: | ||
405 | |||
406 | INHERIT += "testexport" | ||
407 | TEST_TARGET_IP = "IP-address-for-the-test-target" | ||
408 | TEST_SERVER_IP = "IP-address-for-the-test-server" | ||
409 | |||
410 | You can then export the tests with the | ||
411 | following BitBake command form:: | ||
412 | |||
413 | $ bitbake image -c testexport | ||
414 | |||
415 | Exporting the tests places them in the :term:`Build Directory` in | ||
416 | ``tmp/testexport/``\ image, which is controlled by the :term:`TEST_EXPORT_DIR` | ||
417 | variable. | ||
418 | |||
419 | You can now run the tests outside of the build environment:: | ||
420 | |||
421 | $ cd tmp/testexport/image | ||
422 | $ ./runexported.py testdata.json | ||
423 | |||
424 | Here is a complete example that shows IP addresses and uses the | ||
425 | ``core-image-sato`` image:: | ||
426 | |||
427 | INHERIT += "testexport" | ||
428 | TEST_TARGET_IP = "192.168.7.2" | ||
429 | TEST_SERVER_IP = "192.168.7.1" | ||
430 | |||
431 | Use BitBake to export the tests:: | ||
432 | |||
433 | $ bitbake core-image-sato -c testexport | ||
434 | |||
435 | Run the tests outside of | ||
436 | the build environment using the following:: | ||
437 | |||
438 | $ cd tmp/testexport/core-image-sato | ||
439 | $ ./runexported.py testdata.json | ||
440 | |||
441 | Writing New Tests | ||
442 | ================= | ||
443 | |||
444 | As mentioned previously, all new test files need to be in the proper | ||
445 | place for the build system to find them. New tests for additional | ||
446 | functionality outside of the core should be added to the layer that adds | ||
447 | the functionality, in ``layer/lib/oeqa/runtime/cases`` (as long as | ||
448 | :term:`BBPATH` is extended in the | ||
449 | layer's ``layer.conf`` file as normal). Just remember the following: | ||
450 | |||
451 | - Filenames need to map directly to test (module) names. | ||
452 | |||
453 | - Do not use module names that collide with existing core tests. | ||
454 | |||
455 | - Minimally, an empty ``__init__.py`` file must be present in the runtime | ||
456 | directory. | ||
457 | |||
458 | To create a new test, start by copying an existing module (e.g. | ||
459 | ``oe_syslog.py`` or ``gcc.py`` are good ones to use). Test modules can use | ||
460 | code from ``meta/lib/oeqa/utils``, which contains helper classes. | ||
461 | |||
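A minimal module might look like the following sketch (the module and
test names are illustrative; existing modules such as ``oe_syslog.py``
show more complete patterns):

.. code-block:: python

   # layer/lib/oeqa/runtime/cases/mytest.py
   from oeqa.oetest import oeRuntimeTest

   class MyTest(oeRuntimeTest):

       def test_hostname_runs(self):
           # target.run() returns a (status, output) tuple
           (status, output) = self.target.run('hostname')
           self.assertEqual(status, 0, msg='hostname failed: %s' % output)
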
462 | .. note:: | ||
463 | |||
464 | Structure shell commands such that they return a single, reliable | ||
465 | exit code for success. Be aware that sometimes you will need to | ||
466 | parse the output. See the ``df.py`` and ``date.py`` modules for examples. | ||
467 | |||
468 | You will notice that all test classes inherit ``oeRuntimeTest``, which | ||
469 | is found in ``meta/lib/oeqa/oetest.py``. This base class offers some helper | ||
470 | methods and attributes, which are described in the following sections. | ||
471 | |||
472 | Class Methods | ||
473 | ------------- | ||
474 | |||
475 | Class methods are as follows: | ||
476 | |||
477 | - *hasPackage(pkg):* Returns "True" if ``pkg`` is in the installed | ||
478 | package list of the image, which is based on the manifest file that | ||
479 | is generated during the :ref:`ref-tasks-rootfs` task. | ||
480 | |||
481 | - *hasFeature(feature):* Returns "True" if the feature is in | ||
482 | :term:`IMAGE_FEATURES` or | ||
483 | :term:`DISTRO_FEATURES`. | ||
484 | |||
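These class methods are typically used at module level to skip tests that
do not apply to the image under test. The following sketch mirrors the
pattern used by existing modules such as ``systemd.py``:

.. code-block:: python

   from oeqa.oetest import oeRuntimeTest, skipModule

   def setUpModule():
       # Skip every test in this module if the image does not use systemd
       if not oeRuntimeTest.hasFeature('systemd'):
           skipModule("target doesn't have systemd in DISTRO_FEATURES")
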
485 | Class Attributes | ||
486 | ---------------- | ||
487 | |||
488 | Class attributes are as follows: | ||
489 | |||
490 | - *pscmd:* Equals "ps -ef" if ``procps`` is installed in the image. | ||
491 | Otherwise, ``pscmd`` equals "ps" (busybox). | ||
492 | |||
493 | - *tc:* The called test context, which gives access to the | ||
494 | following attributes: | ||
495 | |||
496 | - *d:* The BitBake datastore, which allows you to use calls such | ||
497 | as ``oeRuntimeTest.tc.d.getVar("VIRTUAL-RUNTIME_init_manager")``. | ||
498 | |||
499 | - *testslist and testsrequired:* Used internally. The tests | ||
500 | do not need these. | ||
501 | |||
502 | - *filesdir:* The absolute path to | ||
503 | ``meta/lib/oeqa/runtime/files``, which contains helper files for | ||
504 | tests meant for copying onto the target such as small files written | ||
505 | in C for compilation. | ||
506 | |||
507 | - *target:* The target controller object used to deploy and | ||
508 | start an image on a particular target (e.g. Qemu, SimpleRemote, | ||
509 | and SystemdbootTarget). Tests usually use the following: | ||
510 | |||
511 | - *ip:* The target's IP address. | ||
512 | |||
513 | - *server_ip:* The host's IP address, which is usually used | ||
514 | by the DNF test suite. | ||
515 | |||
516 | - *run(cmd, timeout=None):* The single most used method. | ||
517 | This command is a wrapper for: ``ssh root@host "cmd"``. The | ||
518 | command returns a tuple: (status, output), which are what their | ||
519 | names imply - the return code of "cmd" and whatever output it | ||
520 | produces. The optional timeout argument represents the number | ||
521 | of seconds the test should wait for "cmd" to return. If the | ||
522 | argument is "None", the test uses the default instance's | ||
523 | timeout period, which is 300 seconds. If the argument is "0", | ||
524 | the test runs until the command returns. | ||
525 | |||
526 | - *copy_to(localpath, remotepath):* | ||
527 | ``scp localpath root@ip:remotepath``. | ||
528 | |||
529 | - *copy_from(remotepath, localpath):* | ||
530 | ``scp root@host:remotepath localpath``. | ||
531 | |||
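As an illustrative sketch, a test method might combine these helpers as
follows (paths and assertions are examples only, and ``self.target`` is
the convenience attribute described in the next section):

.. code-block:: python

   # (inside a test class inheriting oeRuntimeTest)
   def test_copy_and_run(self):
       # run() returns (status, output); a zero status means success
       (status, output) = self.target.run('ls -l /etc', timeout=60)
       self.assertEqual(status, 0, msg=output)

       # Push a local file to the DUT, then pull a file back
       self.target.copy_to('/tmp/hello.c', '/tmp/hello.c')
       self.target.copy_from('/etc/os-release', '/tmp/os-release')
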
532 | Instance Attributes | ||
533 | ------------------- | ||
534 | |||
535 | There is a single instance attribute, which is ``target``. The ``target`` | ||
536 | instance attribute is identical to the class attribute of the same name, | ||
537 | which is described in the previous section. This attribute exists as | ||
538 | both an instance and class attribute so tests can use | ||
539 | ``self.target.run(cmd)`` in instance methods instead of | ||
540 | ``oeRuntimeTest.tc.target.run(cmd)``. | ||
541 | |||
542 | Installing Packages in the DUT Without the Package Manager | ||
543 | ========================================================== | ||
544 | |||
545 | When a test requires a package built by BitBake, it is possible to | ||
546 | install that package. Installing the package does not require a package | ||
547 | manager be installed in the device under test (DUT). It does, however, | ||
548 | require an SSH connection and the target must be using the | ||
549 | ``sshcontrol`` class. | ||
550 | |||
551 | .. note:: | ||
552 | |||
553 | This method uses ``scp`` to copy files from the host to the target, which | ||
554 | causes permissions and special attributes to be lost. | ||
555 | |||
556 | A JSON file is used to define the packages needed by a test. This file | ||
557 | must be in the same path as the file used to define the tests. | ||
558 | Furthermore, the filename must map directly to the test module name with | ||
559 | a ``.json`` extension. | ||
560 | |||
561 | The JSON file must include an object whose keys are the test names. The | ||
562 | value for each key is an object, or an array of objects, with the following | ||
563 | data: | ||
564 | |||
565 | - "pkg" --- a mandatory string that is the name of the package to be | ||
566 | installed. | ||
567 | |||
568 | - "rm" --- an optional boolean, which defaults to "false", that specifies | ||
569 | to remove the package after the test. | ||
570 | |||
571 | - "extract" --- an optional boolean, which defaults to "false", that | ||
572 | specifies if the package must be extracted from the package format. | ||
573 | When set to "true", the package is not automatically installed into | ||
574 | the DUT. | ||
575 | |||
576 | Here is an example JSON file that handles test "foo" installing | ||
577 | package "bar" and test "foobar" installing packages "foo" and "bar". | ||
578 | In the "foobar" case, the packages are removed from the DUT once the test completes:: | ||
579 | |||
580 | { | ||
581 | "foo": { | ||
582 | "pkg": "bar" | ||
583 | }, | ||
584 | "foobar": [ | ||
585 | { | ||
586 | "pkg": "foo", | ||
587 | "rm": true | ||
588 | }, | ||
589 | { | ||
590 | "pkg": "bar", | ||
591 | "rm": true | ||
592 | } | ||
593 | ] | ||
594 | } | ||
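
If a test only needs the package contents extracted from the package
format rather than installed into the DUT, the "extract" option described
above can be used. Here is a minimal sketch (test and package names are
illustrative)::

   {
       "baz": {
           "pkg": "qux",
           "extract": true
       }
   }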
595 | |||
diff --git a/documentation/test-manual/test-process.rst b/documentation/test-manual/test-process.rst index 8a5e29d922..945b56830f 100644 --- a/documentation/test-manual/test-process.rst +++ b/documentation/test-manual/test-process.rst | |||
@@ -20,8 +20,8 @@ helps review and test patches and this is his testing tree). | |||
20 | We have two broad categories of test builds, including "full" and | 20 | We have two broad categories of test builds, including "full" and |
21 | "quick". On the Autobuilder, these can be seen as "a-quick" and | 21 | "quick". On the Autobuilder, these can be seen as "a-quick" and |
22 | "a-full", simply for ease of sorting in the UI. Use our Autobuilder | 22 | "a-full", simply for ease of sorting in the UI. Use our Autobuilder |
23 | console view to see where me manage most test-related items, available | 23 | :yocto_ab:`console view </valkyrie/#/console>` to see where we manage most |
24 | at: :yocto_ab:`/typhoon/#/console`. | 24 | test-related items. |
25 | 25 | ||
26 | Builds are triggered manually when the test branches are ready. The | 26 | Builds are triggered manually when the test branches are ready. The |
27 | builds are monitored by the SWAT team. For additional information, see | 27 | builds are monitored by the SWAT team. For additional information, see |
@@ -34,24 +34,21 @@ which the result was required. | |||
34 | 34 | ||
35 | The Autobuilder does build the ``master`` branch once daily for several | 35 | The Autobuilder does build the ``master`` branch once daily for several |
36 | reasons, in particular, to ensure the current ``master`` branch does | 36 | reasons, in particular, to ensure the current ``master`` branch does |
37 | build, but also to keep ``yocto-testresults`` | 37 | build, but also to keep :yocto_git:`yocto-testresults </yocto-testresults/>`,
38 | (:yocto_git:`/yocto-testresults/`), | 38 | :yocto_git:`buildhistory </poky-buildhistory/>`, and
39 | buildhistory | 39 | our sstate up to date. On the weekend, there is a ``master-next`` build |
40 | (:yocto_git:`/poky-buildhistory/`), and | ||
41 | our sstate up to date. On the weekend, there is a master-next build | ||
42 | instead to ensure the test results are updated for the less frequently | 40 | instead to ensure the test results are updated for the less frequently |
43 | run targets. | 41 | run targets. |
44 | 42 | ||
45 | Performance builds (buildperf-\* targets in the console) are triggered | 43 | Performance builds (``buildperf-*`` targets in the console) are triggered |
46 | separately every six hours and automatically push their results to the | 44 | separately every six hours and automatically push their results to the |
47 | buildstats repository at: | 45 | :yocto_git:`buildstats </yocto-buildstats/>` repository. |
48 | :yocto_git:`/yocto-buildstats/`. | ||
49 | 46 | ||
50 | The 'quick' targets have been selected to be the ones which catch the | 47 | The "quick" targets have been selected to be the ones which catch the |
51 | most failures or give the most valuable data. We run 'fast' ptests in | 48 | most failures or give the most valuable data. We run "fast" ptests in |
52 | this case for example but not the ones which take a long time. The quick | 49 | this case for example but not the ones which take a long time. The quick |
53 | target doesn't include \*-lsb builds for all architectures, some world | 50 | target doesn't include ``*-lsb`` builds for all architectures, some ``world`` |
54 | builds and doesn't trigger performance tests or ltp testing. The full | 51 | builds and doesn't trigger performance tests or ``ltp`` testing. The full |
55 | build includes all these things and is slower but more comprehensive. | 52 | build includes all these things and is slower but more comprehensive. |
56 | 53 | ||
57 | Release Builds | 54 | Release Builds |
@@ -59,20 +56,20 @@ Release Builds | |||
59 | 56 | ||
60 | The project typically has two major releases a year with a six month | 57 | The project typically has two major releases a year with a six month |
61 | cadence in April and October. Between these there would be a number of | 58 | cadence in April and October. Between these there would be a number of |
62 | milestone releases (usually four) with the final one being stablization | 59 | milestone releases (usually four) with the final one being stabilization |
63 | only along with point releases of our stable branches. | 60 | only along with point releases of our stable branches. |
64 | 61 | ||
65 | The build and release process for these project releases is similar to | 62 | The build and release process for these project releases is similar to |
66 | that in `Day to Day Development <#test-daily-devel>`__, in that the | 63 | that in :ref:`test-manual/test-process:day to day development`, in that the |
67 | a-full target of the Autobuilder is used but in addition the form is | 64 | a-full target of the Autobuilder is used but in addition the form is |
68 | configured to generate and publish artefacts and the milestone number, | 65 | configured to generate and publish artifacts and the milestone number, |
69 | version, release candidate number and other information is entered. The | 66 | version, release candidate number and other information is entered. The |
70 | box to "generate an email to QA"is also checked. | 67 | box to "generate an email to QA" is also checked. |
71 | 68 | ||
72 | When the build completes, an email is sent out using the send-qa-email | 69 | When the build completes, an email is sent out using the ``send-qa-email`` |
73 | script in the ``yocto-autobuilder-helper`` repository to the list of | 70 | script in the :yocto_git:`yocto-autobuilder-helper </yocto-autobuilder-helper>` |
74 | people configured for that release. Release builds are placed into a | 71 | repository to the list of people configured for that release. Release builds |
75 | directory in https://autobuilder.yocto.io/pub/releases on the | 72 | are placed into a directory in https://autobuilder.yocto.io/pub/releases on the |
76 | Autobuilder which is included in the email. The process from here is | 73 | Autobuilder which is included in the email. The process from here is |
77 | more manual and control is effectively passed to release engineering. | 74 | more manual and control is effectively passed to release engineering. |
78 | The next steps include: | 75 | The next steps include: |
@@ -80,14 +77,15 @@ The next steps include: | |||
80 | - QA teams respond to the email saying which tests they plan to run and | 77 | - QA teams respond to the email saying which tests they plan to run and |
81 | when the results will be available. | 78 | when the results will be available. |
82 | 79 | ||
83 | - QA teams run their tests and share their results in the yocto- | 80 | - QA teams run their tests and share their results in the |
84 | testresults-contrib repository, along with a summary of their | 81 | :yocto_git:`yocto-testresults-contrib </yocto-testresults-contrib>` |
85 | findings. | 82 | repository, along with a summary of their findings. |
86 | 83 | ||
87 | - Release engineering prepare the release as per their process. | 84 | - Release engineering prepare the release as per their process. |
88 | 85 | ||
89 | - Test results from the QA teams are included into the release in | 86 | - Test results from the QA teams are included into the release in |
90 | separate directories and also uploaded to the yocto-testresults | 87 | separate directories and also uploaded to the |
88 | :yocto_git:`yocto-testresults </yocto-testresults>` | ||
91 | repository alongside the other test results for the given revision. | 89 | repository alongside the other test results for the given revision. |
92 | 90 | ||
93 | - The QA report in the final release is regenerated using resulttool to | 91 | - The QA report in the final release is regenerated using resulttool to |
diff --git a/documentation/test-manual/understand-autobuilder.rst b/documentation/test-manual/understand-autobuilder.rst index 199cc97a85..7f4d1be3cd 100644 --- a/documentation/test-manual/understand-autobuilder.rst +++ b/documentation/test-manual/understand-autobuilder.rst | |||
@@ -9,31 +9,31 @@ Execution Flow within the Autobuilder | |||
9 | 9 | ||
10 | The "a-full" and "a-quick" targets are the usual entry points into the | 10 | The "a-full" and "a-quick" targets are the usual entry points into the |
11 | Autobuilder and it makes sense to follow the process through the system | 11 | Autobuilder and it makes sense to follow the process through the system |
12 | starting there. This is best visualised from the Autobuilder Console | 12 | starting there. This is best visualized from the :yocto_ab:`Autobuilder |
13 | view (:yocto_ab:`/typhoon/#/console`). | 13 | Console view </valkyrie/#/console>`. |
14 | 14 | ||
15 | Each item along the top of that view represents some "target build" and | 15 | Each item along the top of that view represents some "target build" and |
16 | these targets are all run in parallel. The 'full' build will trigger the | 16 | these targets are all run in parallel. The 'full' build will trigger the |
17 | majority of them, the "quick" build will trigger some subset of them. | 17 | majority of them, the "quick" build will trigger some subset of them. |
18 | The Autobuilder effectively runs whichever configuration is defined for | 18 | The Autobuilder effectively runs whichever configuration is defined for |
19 | each of those targets on a seperate buildbot worker. To understand the | 19 | each of those targets on a separate buildbot worker. To understand the |
20 | configuration, you need to look at the entry on ``config.json`` file | 20 | configuration, you need to look at the entry on ``config.json`` file |
21 | within the ``yocto-autobuilder-helper`` repository. The targets are | 21 | within the :yocto_git:`yocto-autobuilder-helper </yocto-autobuilder-helper>` |
22 | defined in the ‘overrides' section, a quick example could be qemux86-64 | 22 | repository. The targets are defined in the ``overrides`` section, a quick |
23 | which looks like:: | 23 | example could be ``qemux86-64`` which looks like:: |
24 | 24 | ||
25 | "qemux86-64" : { | 25 | "qemux86-64" : { |
26 | "MACHINE" : "qemux86-64", | 26 | "MACHINE" : "qemux86-64", |
27 | "TEMPLATE" : "arch-qemu", | 27 | "TEMPLATE" : "arch-qemu", |
28 | "step1" : { | 28 | "step1" : { |
29 | "extravars" : [ | 29 | "extravars" : [ |
30 | "IMAGE_FSTYPES_append = ' wic wic.bmap'" | 30 | "IMAGE_FSTYPES:append = ' wic wic.bmap'" |
31 | ] | 31 | ] |
32 | } | 32 | } |
33 | }, | 33 | }, |
34 | 34 | ||
35 | And to expand that, you need the "arch-qemu" entry from | 35 | And to expand that, you need the ``arch-qemu`` entry from |
36 | the "templates" section, which looks like:: | 36 | the ``templates`` section, which looks like:: |
37 | 37 | ||
38 | "arch-qemu" : { | 38 | "arch-qemu" : { |
39 | "BUILDINFO" : true, | 39 | "BUILDINFO" : true, |
@@ -54,20 +54,20 @@ the "templates" section, which looks like:: | |||
54 | } | 54 | } |
55 | }, | 55 | }, |
56 | 56 | ||
57 | Combining these two entries you can see that "qemux86-64" is a three step build where the | 57 | Combining these two entries you can see that ``qemux86-64`` is a three step |
58 | ``bitbake BBTARGETS`` would be run, then ``bitbake SANITYTARGETS`` for each step; all for | 58 | build where ``bitbake BBTARGETS`` would be run, then ``bitbake SANITYTARGETS`` |
59 | ``MACHINE="qemx86-64"`` but with differing SDKMACHINE settings. In step | 59 | for each step; all for ``MACHINE="qemux86-64"`` but with differing |
60 | 1 an extra variable is added to the ``auto.conf`` file to enable wic | 60 | :term:`SDKMACHINE` settings. In step 1, an extra variable is added to the |
61 | image generation. | 61 | ``auto.conf`` file to enable wic image generation. |
62 | 62 | ||
63 | While not every detail of this is covered here, you can see how the | 63 | While not every detail of this is covered here, you can see how the |
64 | template mechanism allows quite complex configurations to be built up | 64 | template mechanism allows quite complex configurations to be built up |
65 | yet allows duplication and repetition to be kept to a minimum. | 65 | yet allows duplication and repetition to be kept to a minimum. |
66 | 66 | ||
67 | The different build targets are designed to allow for parallelisation, | 67 | The different build targets are designed to allow for parallelization, |
68 | so different machines are usually built in parallel, operations using | 68 | so different machines are usually built in parallel, operations using |
69 | the same machine and metadata are built sequentially, with the aim of | 69 | the same machine and metadata are built sequentially, with the aim of |
70 | trying to optimise build efficiency as much as possible. | 70 | trying to optimize build efficiency as much as possible. |
71 | 71 | ||
72 | The ``config.json`` file is processed by the scripts in the Helper | 72 | The ``config.json`` file is processed by the scripts in the Helper |
73 | repository in the ``scripts`` directory. The following section details | 73 | repository in the ``scripts`` directory. The following section details |
@@ -88,9 +88,9 @@ roughly consist of: | |||
88 | 88 | ||
89 | #. *Obtain yocto-autobuilder-helper* | 89 | #. *Obtain yocto-autobuilder-helper* |
90 | 90 | ||
91 | This step clones the ``yocto-autobuilder-helper`` git repository. | 91 | This step clones the :yocto_git:`yocto-autobuilder-helper </yocto-autobuilder-helper>` |
92 | This is necessary to prevent the requirement to maintain all the | 92 | git repository. This is necessary to avoid the requirement to maintain all |
93 | release or project-specific code within Buildbot. The branch chosen | 93 | the release or project-specific code within Buildbot. The branch chosen |
94 | matches the release being built so we can support older releases and | 94 | matches the release being built so we can support older releases and |
95 | still make changes in newer ones. | 95 | still make changes in newer ones. |
96 | 96 | ||
@@ -111,7 +111,7 @@ roughly consist of: | |||
111 | :ref:`test-manual/understand-autobuilder:Autobuilder Clone Cache`. | 111 | :ref:`test-manual/understand-autobuilder:Autobuilder Clone Cache`. |
112 | 112 | ||
113 | This step has two possible modes of operation. If the build is part | 113 | This step has two possible modes of operation. If the build is part |
114 | of a parent build, its possible that all the repositories needed may | 114 | of a parent build, it's possible that all the repositories needed may |
115 | already be available, ready in a pre-prepared directory. An "a-quick" | 115 | already be available, ready in a pre-prepared directory. An "a-quick" |
116 | or "a-full" build would prepare this before starting the other | 116 | or "a-full" build would prepare this before starting the other |
117 | sub-target builds. This is done for two reasons: | 117 | sub-target builds. This is done for two reasons: |
@@ -130,7 +130,7 @@ roughly consist of: | |||
130 | 130 | ||
131 | #. *Call scripts/run-config* | 131 | #. *Call scripts/run-config* |
132 | 132 | ||
133 | This is another call into the Helper scripts where its expected that | 133 | This is another call into the Helper scripts where it's expected that |
134 | the main functionality of this target will be executed. | 134 | the main functionality of this target will be executed. |
135 | 135 | ||
136 | Autobuilder Technology | 136 | Autobuilder Technology |
@@ -163,16 +163,17 @@ Autobuilder Worker Janitor | |||
163 | -------------------------- | 163 | -------------------------- |
164 | 164 | ||
165 | This is a process running on each Worker that performs two basic | 165 | This is a process running on each Worker that performs two basic |
166 | operations, including background file deletion at IO idle (see :ref:`test-manual/understand-autobuilder:Autobuilder Target Execution Overview`: Run clobberdir) and | 166 | operations, including background file deletion at IO idle (see |
167 | maintainenance of a cache of cloned repositories to improve the speed | 167 | "Run clobberdir" in :ref:`test-manual/understand-autobuilder:Autobuilder Target Execution Overview`) |
168 | and maintenance of a cache of cloned repositories to improve the speed at which | ||
168 | the system can checkout repositories. | 169 | the system can check out repositories. | ||
169 | 170 | ||
170 | Shared DL_DIR | 171 | Shared DL_DIR |
171 | ------------- | 172 | ------------- |
172 | 173 | ||
173 | The Workers are all connected over NFS which allows DL_DIR to be shared | 174 | The Workers are all connected over NFS which allows :term:`DL_DIR` to be shared |
174 | between them. This reduces network accesses from the system and allows | 175 | between them. This reduces network accesses from the system and allows |
175 | the build to be sped up. Usage of the directory within the build system | 176 | the build to be sped up. The usage of the directory within the build system |
176 | is designed to be able to be shared over NFS. | 177 | is designed to be able to be shared over NFS. |
177 | 178 | ||
178 | Shared SSTATE_DIR | 179 | Shared SSTATE_DIR |
@@ -180,8 +181,8 @@ Shared SSTATE_DIR | |||
180 | 181 | ||
181 | The Workers are all connected over NFS which allows the ``sstate`` | 182 | The Workers are all connected over NFS which allows the ``sstate`` |
182 | directory to be shared between them. This means once a Worker has built | 183 | directory to be shared between them. This means once a Worker has built |
183 | an artifact, all the others can benefit from it. Usage of the directory | 184 | an artifact, all the others can benefit from it. The usage of the directory |
184 | within the directory is designed for sharing over NFS. | 185 | within the build system is designed for sharing over NFS. |
185 | 186 | ||
186 | Resulttool | 187 | Resulttool |
187 | ---------- | 188 | ---------- |
@@ -192,7 +193,7 @@ in a given build and their status. Additional information, such as | |||
192 | failure logs or the time taken to run the tests, may also be included. | 193 | failure logs or the time taken to run the tests, may also be included. |
193 | 194 | ||
194 | Resulttool is part of OpenEmbedded-Core and is used to manipulate these | 195 | Resulttool is part of OpenEmbedded-Core and is used to manipulate these |
195 | json results files. It has the ability to merge files together, display | 196 | JSON results files. It has the ability to merge files together, display |
196 | reports of the test results and compare different result files. | 197 | reports of the test results and compare different result files. |
197 | 198 | ||
198 | For details, see :yocto_wiki:`/Resulttool`. | 199 | For details, see :yocto_wiki:`/Resulttool`. |
@@ -204,9 +205,9 @@ The ``scripts/run-config`` execution is where most of the work within | |||
204 | the Autobuilder happens. It runs through a number of steps; the first | 205 | the Autobuilder happens. It runs through a number of steps; the first |
205 | are general setup steps that are run once and include: | 206 | are general setup steps that are run once and include: |
206 | 207 | ||
207 | #. Set up any ``buildtools-tarball`` if configured. | 208 | #. Set up any :term:`buildtools` tarball if configured. |
208 | 209 | ||
209 | #. Call "buildhistory-init" if buildhistory is configured. | 210 | #. Call ``buildhistory-init`` if :ref:`ref-classes-buildhistory` is configured. |
210 | 211 | ||
211 | For each step that is configured in ``config.json``, it will perform the | 212 | For each step that is configured in ``config.json``, it will perform the |
212 | following: | 213 | following: |
@@ -242,7 +243,7 @@ of post-build steps, including: | |||
242 | #. Call ``scripts/upload-error-reports`` to send any error reports | 243 | #. Call ``scripts/upload-error-reports`` to send any error reports |
243 | generated to the remote server. | 244 | generated to the remote server. |
244 | 245 | ||
245 | #. Cleanup the build directory using | 246 | #. Cleanup the :term:`Build Directory` using |
246 | :ref:`test-manual/understand-autobuilder:clobberdir` if the build was successful, | 247 | :ref:`test-manual/understand-autobuilder:clobberdir` if the build was successful, |
247 | else rename it to "build-renamed" for potential future debugging. | 248 | else rename it to "build-renamed" for potential future debugging. |
248 | 249 | ||
@@ -250,15 +251,16 @@ Deploying Yocto Autobuilder | |||
250 | =========================== | 251 | =========================== |
251 | 252 | ||
252 | The most up to date information about how to setup and deploy your own | 253 | The most up to date information about how to setup and deploy your own |
253 | Autbuilder can be found in README.md in the ``yocto-autobuilder2`` | 254 | Autobuilder can be found in :yocto_git:`README.md </yocto-autobuilder2/tree/README.md>` |
254 | repository. | 255 | in the :yocto_git:`yocto-autobuilder2 </yocto-autobuilder2>` repository. |
255 | 256 | ||
256 | We hope that people can use the ``yocto-autobuilder2`` code directly but | 257 | We hope that people can use the :yocto_git:`yocto-autobuilder2 </yocto-autobuilder2>` |
257 | it is inevitable that users will end up needing to heavily customise the | 258 | code directly but it is inevitable that users will end up needing to heavily |
258 | ``yocto-autobuilder-helper`` repository, particularly the | 259 | customize the :yocto_git:`yocto-autobuilder-helper </yocto-autobuilder-helper>` |
259 | ``config.json`` file as they will want to define their own test matrix. | 260 | repository, particularly the ``config.json`` file as they will want to define |
261 | their own test matrix. | ||
260 | 262 | ||
261 | The Autobuilder supports wo customization options: | 263 | The Autobuilder supports two customization options: |
262 | 264 | ||
263 | - variable substitution | 265 | - variable substitution |
264 | 266 | ||
@@ -278,7 +280,7 @@ environment:: | |||
278 | $ ABHELPER_JSON="config.json /some/location/local.json" | 280 | $ ABHELPER_JSON="config.json /some/location/local.json" |
279 | 281 | ||
280 | One issue users often run into is validation of the ``config.json`` files. A | 282 | One issue users often run into is validation of the ``config.json`` files. A |
281 | tip for minimizing issues from invalid json files is to use a Git | 283 | tip for minimizing issues from invalid JSON files is to use a Git |
282 | ``pre-commit-hook.sh`` script to verify the JSON file before committing | 284 | ``pre-commit-hook.sh`` script to verify the JSON file before committing |
283 | it. Create a symbolic link as follows:: | 285 | it. Create a symbolic link as follows:: |
284 | 286 | ||
diff --git a/documentation/test-manual/yocto-project-compatible.rst b/documentation/test-manual/yocto-project-compatible.rst new file mode 100644 index 0000000000..65d924fad9 --- /dev/null +++ b/documentation/test-manual/yocto-project-compatible.rst | |||
@@ -0,0 +1,129 @@ | |||
1 | .. SPDX-License-Identifier: CC-BY-SA-2.0-UK | ||
2 | |||
3 | ************************ | ||
4 | Yocto Project Compatible | ||
5 | ************************ | ||
6 | |||
7 | ============ | ||
8 | Introduction | ||
9 | ============ | ||
10 | |||
11 | After the introduction of layers to OpenEmbedded, it quickly became clear | ||
12 | that while some layers were popular and worked well, others developed a | ||
13 | reputation for being "problematic". Those were layers which didn't | ||
14 | interoperate well with others and tended to assume they controlled all | ||
15 | the aspects of the final output. This usually isn't intentional but happens | ||
16 | because such layers are often created by developers with a particular focus | ||
17 | (e.g. a company's :term:`BSP<Board Support Package (BSP)>`) whilst the end | ||
18 | users have a different one (e.g. integrating that | ||
19 | :term:`BSP<Board Support Package (BSP)>` into a product). | ||
20 | |||
21 | As a result of noticing such patterns and friction between layers, the project | ||
22 | developed the "Yocto Project Compatible" badge program, allowing layers | ||
23 | following the best known practices to be marked as being widely compatible | ||
24 | with other ones. This takes the form of a set of "yes/no" binary answer | ||
25 | questions where layers can declare if they meet the appropriate criteria. | ||
26 | In the second version of the program, a script was added to make validation | ||
27 | easier and clearer. The script is called ``yocto-check-layer`` and is | ||
28 | available in :term:`OpenEmbedded-Core (OE-Core)`. | ||
29 | |||
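For example, the script can be run from an initialized build environment,
pointing it at the layer to check (the path below is illustrative)::

   $ source oe-init-build-env
   $ yocto-check-layer /path/to/your-layer
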
30 | See :ref:`dev-manual/layers:making sure your layer is compatible with yocto project` | ||
31 | for details. | ||
32 | |||
33 | ======== | ||
34 | Benefits | ||
35 | ======== | ||
36 | |||
37 | :ref:`overview-manual/yp-intro:the yocto project layer model` is powerful | ||
38 | and flexible: it gives users the ultimate power to change pretty much any | ||
39 | aspect of the system but as with most things, power comes with responsibility. | ||
40 | The Yocto Project would like to see people able to mix and match BSPs with | ||
41 | distro configs or software stacks and be able to merge them successfully. | ||
42 | Over time, the project identified characteristics in layers that allow them | ||
43 | to operate well together. "Anti-patterns" that prevent layers from | ||
44 | working well together were also found. | ||
45 | |||
46 | The intent of the compatibility program is simple: if the layer passes the | ||
47 | compatibility tests, it is considered "well behaved" and should operate | ||
48 | and cooperate well with other compatible layers. | ||
49 | |||
50 | The benefits of compatibility can be seen from multiple different user and | ||
51 | member perspectives. From a hardware perspective | ||
52 | (a :ref:`overview-manual/concepts:bsp layer`), compatibility means the | ||
53 | hardware can be used in many different products and use cases without | ||
54 | impacting the software stacks being run with it. For a company developing | ||
55 | a product, compatibility provides a specification / standard you can | ||
56 | require in a contract, knowing the layer will then have certain desired | ||
57 | characteristics for interoperability. It also puts constraints on how invasive | ||
58 | the code bases are into the rest of the system, meaning that multiple | ||
59 | different separate hardware support layers can coexist (e.g. for multiple | ||
60 | product lines from different hardware manufacturers). This can also make it | ||
61 | easier for one or more parties to upgrade those system components for security | ||
62 | purposes during the lifecycle of a product. | ||
63 | |||
64 | ================== | ||
65 | Validating a layer | ||
66 | ================== | ||
67 | |||
68 | The badges are available to members of the Yocto Project (as a member benefit) | ||
69 | and to open source projects run on a non-commercial basis. However, anyone can | ||
70 | answer the questions and run the script. | ||
71 | |||
72 | The project encourages all layer maintainers to review the questions and the | ||
73 | output from the script against their layer, as the way some layers are | ||
74 | constructed often has unintended consequences. The questions and the script | ||
75 | are designed to highlight known issues which are often easy to solve. This | ||
76 | makes layers easier to use and therefore more popular. | ||
77 | |||
78 | It is intended that over time, the tests will evolve, both as new best | ||
79 | known practices are identified and as existing criteria are found to | ||
80 | restrict layer interoperability unnecessarily. If anyone becomes aware of | ||
81 | either case, please let the project know through the | ||
82 | :yocto_home:`technical calls </public-virtual-meetings/>`, | ||
83 | the :yocto_home:`mailing lists </community/mailing-lists/>` | ||
84 | or through the :oe_wiki:`Technical Steering Committee (TSC) </TSC>`. | ||
85 | The TSC is responsible for the technical criteria used by the program. | ||
86 | |||
87 | Layers are divided into three types: | ||
88 | |||
89 | - :ref:`"BSP" or "hardware support"<overview-manual/concepts:bsp layer>` | ||
90 | layers contain support for particular pieces of hardware. This includes | ||
91 | kernel and boot loader configuration, and any recipes for firmware or | ||
92 | kernel modules needed for the hardware. Such layers usually correspond | ||
93 | to a :term:`MACHINE` setting. | ||
94 | |||
95 | - :ref:`"distro" layers<overview-manual/concepts:distro layer>` defined | ||
96 | as layers providing configuration options and settings such as the | ||
97 | choice of init system, compiler and optimisation options, and | ||
98 | configuration and choices of software components. This would usually | ||
99 | correspond to a :term:`DISTRO` setting. | ||
100 | |||
101 | - "software" layers are usually recipes. A layer might target a | ||
102 | particular graphical UI or software stack component. | ||
103 | |||
104 | Here are key best practices the program tries to encourage: | ||
105 | |||
106 | - A layer should clearly show who maintains it, and to whom change | ||
107 | submissions and bug reports should be sent. | ||
108 | |||
109 | - Where multiple types of functionality are present, the layer should | ||
110 | be internally divided into sublayers to separate these components. | ||
111 | That's because some users may only need one of them and separability | ||
112 | is a key best practice. | ||
113 | |||
114 | - Adding a layer to a build should not modify that build, unless the | ||
115 | user changes a configuration setting to activate the layer, by selecting | ||
116 | a :term:`MACHINE`, a :term:`DISTRO` or a :term:`DISTRO_FEATURES` setting. | ||
117 | |||
118 | - Layers should document where they don't support normal "core" | ||
119 | functionality, such as where debug symbols are disabled or missing, where | ||
120 | development headers and on-target library usage may not work, or where | ||
121 | functionality like the SDK/eSDK would not be expected to work. | ||
122 | |||
123 | The project does test the compatibility status of the core project layers on | ||
124 | its :doc:`Autobuilder </test-manual/understand-autobuilder>`. | ||
125 | |||
126 | The official form for submitting compatibility applications is at | ||
127 | :yocto_home:`/ecosystem/branding/compatible-registration/`. | ||
128 | Applicants can display the badge they get when their application is successful. | ||
129 | |||