diff options
Diffstat (limited to 'documentation/test-manual/test-manual-intro.rst')
-rw-r--r-- documentation/test-manual/test-manual-intro.rst | 486
1 file changed, 486 insertions, 0 deletions
diff --git a/documentation/test-manual/test-manual-intro.rst b/documentation/test-manual/test-manual-intro.rst
new file mode 100644
index 0000000000..491c4bad9a
--- /dev/null
+++ b/documentation/test-manual/test-manual-intro.rst
@@ -0,0 +1,486 @@
*****************************************
The Yocto Project Test Environment Manual
*****************************************

.. _test-welcome:

Welcome
=======

Welcome to the Yocto Project Test Environment Manual! This manual is a
work in progress. The manual contains information about the testing
environment used by the Yocto Project to make sure each major and minor
release works as intended. All of the project's testing infrastructure
and processes are publicly visible and available so that the community
can see what testing is being performed, how it is being done, and the
current status of the tests and the project at any given time. Other
organizations can leverage the process and testing environment used by
the Yocto Project to create their own automated, production test
environment, building upon the foundations from the project core.

Currently, the Yocto Project Test Environment Manual has no projected
release date. This manual is a work in progress and is being initially
loaded with information from the README files and notes from key
engineers:

- *``yocto-autobuilder2``:* This
  ```README.md`` <http://git.yoctoproject.org/clean/cgit.cgi/yocto-autobuilder2/tree/README.md>`__
  is the main README, which details how to set up the Yocto Project
  Autobuilder. The ``yocto-autobuilder2`` repository contains the
  Yocto Project's console UI plugin to Buildbot and the configuration
  necessary for Buildbot to perform the testing the project
  requires.

- *``yocto-autobuilder-helper``:* This
  ```README`` <http://git.yoctoproject.org/clean/cgit.cgi/yocto-autobuilder-helper/tree/README>`__
  and its repository contain the Yocto Project Autobuilder Helper
  scripts and configuration. The ``yocto-autobuilder-helper``
  repository contains the "glue" logic that defines which tests to run
  and how to run them. As a result, it can be used by any Continuous
  Integration (CI) system to run builds, get the correct code
  revisions, configure builds and layers, and collect results. The
  code is independent of any particular CI system, which means it can
  work with Buildbot, Jenkins, or others. This repository has a branch
  per release of the project, defining the tests to run on a
  per-release basis.

.. _test-yocto-project-autobuilder-overview:

Yocto Project Autobuilder Overview
==================================

The Yocto Project Autobuilder collectively refers to the software,
tools, scripts, and procedures used by the Yocto Project to test
released software across supported hardware in an automated and regular
fashion. Basically, during the development of a Yocto Project release,
the Autobuilder tests if things work. The Autobuilder builds all test
targets and runs all the tests.

The Yocto Project now uses standard upstream
`Buildbot <https://docs.buildbot.net/0.9.15.post1/>`__ (version 9) to
drive its integration and testing. Buildbot has a plug-in interface
that the Yocto Project customizes using code from the
``yocto-autobuilder2`` repository, adding its own console UI plugin. The
resulting UI plug-in allows you to visualize builds in a way suited to
the project's needs.

A ``helper`` layer provides configuration and job management through
scripts found in the ``yocto-autobuilder-helper`` repository. The
``helper`` layer contains the bulk of the build configuration
information and is release-specific, which makes it highly customizable
on a per-project basis. The layer is CI system-agnostic and contains a
number of helper scripts that can generate build configurations from
simple JSON files.

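That JSON-driven approach can be sketched as follows. This is a hypothetical, minimal illustration: the key names (``templates``, ``targets``, ``TEMPLATE``) and file layout are invented for the example and are not the actual schema used by ``yocto-autobuilder-helper``.

```python
import json

# Invented sample configuration: a target entry inherits defaults from
# a named template and overrides selected values.
SAMPLE_JSON = """
{
    "templates": {
        "base": {"MACHINE": "qemux86-64", "DISTRO": "poky"}
    },
    "targets": {
        "quick-arm": {"TEMPLATE": "base", "MACHINE": "qemuarm"}
    }
}
"""

def expand_target(config, name):
    """Merge a target's settings over its template's defaults."""
    target = dict(config["targets"][name])
    template = config["templates"].get(target.pop("TEMPLATE", ""), {})
    merged = dict(template)
    merged.update(target)          # target-specific values win
    return merged

config = json.loads(SAMPLE_JSON)
print(expand_target(config, "quick-arm"))
```

The real helper scripts perform a similar expansion so that one small JSON file can describe many build configurations without repetition.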
.. note::

   The project uses Buildbot for historical reasons but also because
   many of the project developers have knowledge of Python. It is
   possible to use the outer layers from another Continuous Integration
   (CI) system such as
   `Jenkins <https://en.wikipedia.org/wiki/Jenkins_(software)>`__
   instead of Buildbot.

The following figure shows the Yocto Project Autobuilder stack with a
topology that includes a controller and a cluster of workers:

.. _test-project-tests:

Yocto Project Tests - Types of Testing Overview
===============================================

The Autobuilder tests different elements of the project by using
the following types of tests:

- *Build Testing:* Tests whether specific configurations build by
  varying ```MACHINE`` <&YOCTO_DOCS_REF_URL;#var-MACHINE>`__,
  ```DISTRO`` <&YOCTO_DOCS_REF_URL;#var-DISTRO>`__, other configuration
  options, and the specific target images being built (or world). Used
  to trigger builds of all the different test configurations on the
  Autobuilder. Builds usually cover many different targets for
  different architectures, machines, and distributions, as well as
  different configurations, such as different init systems. The
  Autobuilder tests literally hundreds of configurations and targets.

- *Sanity Checks During the Build Process:* Tests initiated through
  the ```insane`` <&YOCTO_DOCS_REF_URL;#ref-classes-insane>`__
  class. These checks ensure the output of the builds is correct.
  For example, does the ELF architecture in the generated binaries
  match the target system? ARM binaries would not work in a MIPS
  system!

- *Build Performance Testing:* Tests whether or not commonly used steps
  during builds work efficiently and avoid regressions. Tests that time
  commonly used usage scenarios are run through ``oe-build-perf-test``.
  These tests are run on isolated machines so that the time
  measurements of the tests are accurate and no other processes
  interfere with the timing results. The project currently tests
  performance on two different distributions, Fedora and Ubuntu, to
  ensure we have no single point of failure and that the
  different distros work effectively.

- *eSDK Testing:* Image tests initiated through the following
  command::

     $ bitbake image -c testsdkext

  The tests utilize the ``testsdkext`` class and the ``do_testsdkext``
  task.

- *Feature Testing:* Various scenario-based tests are run through the
  `OpenEmbedded
  Self-Test <&YOCTO_DOCS_REF_URL;#testing-and-quality-assurance>`__
  (oe-selftest). We test oe-selftest on each of the main distributions
  we support.

- *Image Testing:* Image tests initiated through the following
  command::

     $ bitbake image -c testimage

  The tests utilize the
  ```testimage*`` <&YOCTO_DOCS_REF_URL;#ref-classes-testimage*>`__
  classes and the
  ```do_testimage`` <&YOCTO_DOCS_REF_URL;#ref-tasks-testimage>`__ task.

- *Layer Testing:* The Autobuilder can test whether
  specific layers work with the rest of the system. The layers tested
  may be selected by members of the project. Some key community layers
  are also tested periodically.

- *Package Testing:* A Package Test (ptest) runs tests against packages
  built by the OpenEmbedded build system on the target machine. See the
  "`Testing Packages With
  ptest <&YOCTO_DOCS_DEV_URL;#testing-packages-with-ptest>`__" section
  in the Yocto Project Development Tasks Manual and the
  "`Ptest <&YOCTO_WIKI_URL;/wiki/Ptest>`__" Wiki page for more
  information on ptest.

- *SDK Testing:* Image tests initiated through the following
  command::

     $ bitbake image -c testsdk

  The tests utilize the
  ```testsdk`` <&YOCTO_DOCS_REF_URL;#ref-classes-testsdk>`__ class and
  the ``do_testsdk`` task.

- *Unit Testing:* Unit tests on various components of the system run
  through ``oe-selftest`` and
  ```bitbake-selftest`` <&YOCTO_DOCS_REF_URL;#testing-and-quality-assurance>`__.

- *Automatic Upgrade Helper:* This target tests whether new versions of
  software are available and whether we can automatically upgrade to
  those new versions. If so, this target emails the maintainers with a
  patch to let them know this is possible.

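To illustrate the kind of sanity check the ``insane`` class performs, the sketch below (plain Python for illustration, not the actual ``insane.bbclass`` code) reads the ``e_machine`` field of an ELF header and compares it against the expected target architecture. The sample header bytes are fabricated for the example.

```python
import struct

# A few e_machine values from the ELF specification.
EM_MIPS, EM_ARM, EM_X86_64 = 8, 40, 62

def elf_machine(header: bytes) -> int:
    """Return e_machine from a little-endian ELF header."""
    assert header[:4] == b"\x7fELF", "not an ELF file"
    # e_machine is the 16-bit field at offset 18 of the ELF header,
    # right after the 16-byte e_ident block and the 2-byte e_type.
    return struct.unpack_from("<H", header, 18)[0]

# Fabricated 20-byte prefix of a little-endian ARM executable's header:
# 16 bytes of e_ident, then e_type=2 (EXEC) and e_machine=EM_ARM.
fake_arm_elf = (b"\x7fELF" + b"\x01\x01\x01\x00" + b"\x00" * 8 +
                struct.pack("<HH", 2, EM_ARM))

print(elf_machine(fake_arm_elf) == EM_ARM)   # matches the target
print(elf_machine(fake_arm_elf) == EM_MIPS)  # would fail the QA check
```

A binary whose ``e_machine`` disagrees with the target architecture is exactly the "ARM binary on a MIPS system" failure the build-time checks are designed to catch.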
.. _test-test-mapping:

How Tests Map to Areas of Code
==============================

Tests map into the codebase as follows:

- *bitbake-selftest*:

  These tests are self-contained and test BitBake as well as its APIs,
  which include the fetchers. The tests are located in
  ``bitbake/lib/*/tests``.

  From within the BitBake repository, run the following::

     $ bitbake-selftest

  To skip tests that access the Internet, use the ``BB_SKIP_NETTEST``
  variable when running ``bitbake-selftest`` as follows::

     $ BB_SKIP_NETTEST=yes bitbake-selftest

  Use this option when you wish to skip tests that access the network,
  which are mostly necessary to test the fetcher modules.

  The default output is quiet and just prints a summary of what was
  run. To see more information, there is a verbose option::

     $ bitbake-selftest -v

  To specify individual test modules to run, append the test module
  name to the ``bitbake-selftest`` command. For example, to specify the
  tests for the ``bb.tests.data`` module, run::

     $ bitbake-selftest bb.tests.data

  You can also specify an individual test by defining the full name and
  module plus the class path of the test, for example::

     $ bitbake-selftest bb.tests.data.TestOverrides.test_one_override

  The tests are based on `Python
  unittest <https://docs.python.org/3/library/unittest.html>`__.

- *oe-selftest*:

  - These tests use OE to test the workflows, which include testing
    specific features, behaviors of tasks, and API unit tests.

  - The tests can take advantage of parallelism through the ``-j``
    option, which can specify a number of threads to spread the tests
    across. Note that all tests from a given class of tests will run
    in the same thread. To parallelize large numbers of tests you can
    split the class into multiple units.

  - The tests are based on Python unittest.

  - The code for the tests resides in
    ``meta/lib/oeqa/selftest/cases/``.

  - To run all the tests, enter the following command::

       $ oe-selftest -a

  - To run a specific test, use the following command form, where
    "testname" is the name of the specific test::

       $ oe-selftest -r testname

    For example, the following command would run the tinfoil
    ``getVar`` API test::

       $ oe-selftest -r tinfoil.TinfoilTests.test_getvar

    It is also possible to run a set of tests. For example, the
    following command will run all of the tinfoil tests::

       $ oe-selftest -r tinfoil

- *testimage:*

  - These tests build an image, boot it, and run tests against the
    image's content.

  - The code for these tests resides in
    ``meta/lib/oeqa/runtime/cases/``.

  - You need to set the
    ```IMAGE_CLASSES`` <&YOCTO_DOCS_REF_URL;#var-IMAGE_CLASSES>`__
    variable as follows::

       IMAGE_CLASSES += "testimage"

  - Run the tests using the following command form::

       $ bitbake image -c testimage

- *testsdk:*

  - These tests build an SDK, install it, and then run tests against
    that SDK.

  - The code for these tests resides in ``meta/lib/oeqa/sdk/cases/``.

  - Run the tests using the following command form::

       $ bitbake image -c testsdk

- *testsdk_ext:*

  - These tests build an extended SDK (eSDK), install that eSDK, and
    run tests against the eSDK.

  - The code for these tests resides in ``meta/lib/oeqa/esdk``.

  - To run the tests, use the following command form::

       $ bitbake image -c testsdkext

- *oe-build-perf-test:*

  - These tests run through commonly used usage scenarios and measure
    the performance times.

  - The code for these tests resides in ``meta/lib/oeqa/buildperf``.

  - To run the tests, use the following command form::

       $ oe-build-perf-test options

    The command takes a number of options, such as where to place the
    test results. The Autobuilder Helper Scripts include the
    ``build-perf-test-wrapper`` script with examples of how to use
    ``oe-build-perf-test`` from the command line.

    Use the ``oe-git-archive`` command to store test results in a
    Git repository.

    Use the ``oe-build-perf-report`` command to generate text reports
    and HTML reports with graphs of the performance data. For
    examples, see
    http://downloads.yoctoproject.org/releases/yocto/yocto-2.7/testresults/buildperf-centos7/perf-centos7.yoctoproject.org_warrior_20190414204758_0e39202.html
    and
    http://downloads.yoctoproject.org/releases/yocto/yocto-2.7/testresults/buildperf-centos7/perf-centos7.yoctoproject.org_warrior_20190414204758_0e39202.txt.

  - The tests are contained in ``lib/oeqa/buildperf/test_basic.py``.

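Both ``bitbake-selftest`` and ``oe-selftest -r`` select tests by dotted name, as in ``bb.tests.data.TestOverrides.test_one_override`` above. The sketch below shows the stock ``unittest`` loader mechanism that this style of selection builds upon; the test class and module here are stand-ins invented for the example, not real BitBake code.

```python
import types
import unittest

# Stand-in test class; in BitBake this role is played by classes such
# as bb.tests.data.TestOverrides.
class TestOverrides(unittest.TestCase):
    def test_one_override(self):
        d = {"A": "x", "A_override": "y"}
        self.assertEqual(d.get("A_override", d["A"]), "y")

# Simulate a module so the dotted-name lookup has something to resolve
# against (the selftest tools resolve names like "bb.tests.data.…").
mod = types.ModuleType("fake_bb_tests")
mod.TestOverrides = TestOverrides

# Load exactly one test by its dotted name, then run it.
suite = unittest.defaultTestLoader.loadTestsFromName(
    "TestOverrides.test_one_override", module=mod)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.testsRun, result.wasSuccessful())
```

Dropping trailing components of the dotted name widens the selection from a single test to a class or a whole module, which is why ``oe-selftest -r tinfoil`` runs every tinfoil test.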
Test Examples
=============

This section provides example tests for each of the tests listed in the
`How Tests Map to Areas of Code <#test-test-mapping>`__ section.

For oeqa tests, the testcases for each area reside in the main test
directory at ``meta/lib/oeqa/selftest/cases``.

For bitbake-selftest, the bitbake testcases reside in the
``lib/bb/tests/`` directory.

.. _bitbake-selftest-example:

``bitbake-selftest``
--------------------

A simple test example from ``lib/bb/tests/data.py`` is::

   class DataExpansions(unittest.TestCase):
       def setUp(self):
           self.d = bb.data.init()
           self.d["foo"] = "value_of_foo"
           self.d["bar"] = "value_of_bar"
           self.d["value_of_foo"] = "value_of_'value_of_foo'"

       def test_one_var(self):
           val = self.d.expand("${foo}")
           self.assertEqual(str(val), "value_of_foo")

In this example, a ``DataExpansions`` class of tests is created,
derived from standard Python unittest. The class has a common ``setUp``
function which is shared by all the tests in the class. A simple test is
then added to test that when a variable is expanded, the correct value
is found.

BitBake selftests are straightforward Python unittest. Refer to the
Python unittest documentation for additional information on writing
these tests at https://docs.python.org/3/library/unittest.html.

.. _oe-selftest-example:

``oe-selftest``
---------------

These tests are more complex due to the setup required behind the scenes
for full builds. Rather than directly using Python's unittest, the code
wraps most of the standard objects. The tests can be simple, such as
testing a command from within the OE build environment, as in the
following example::

   class BitbakeLayers(OESelftestTestCase):
       def test_bitbakelayers_showcrossdepends(self):
           result = runCmd('bitbake-layers show-cross-depends')
           self.assertTrue('aspell' in result.output,
                           msg="No dependencies were shown. "
                               "bitbake-layers show-cross-depends "
                               "output: %s" % result.output)

This example, taken from ``meta/lib/oeqa/selftest/cases/bblayers.py``,
creates a testcase from the ``OESelftestTestCase`` class, derived
from ``unittest.TestCase``, which runs the ``bitbake-layers`` command
and checks the output to ensure it contains something we know should be
there.

The ``oeqa.utils.commands`` module contains helpers which can assist
with common tasks, including:

- *Obtaining the value of a bitbake variable:* Use
  ``oeqa.utils.commands.get_bb_var()`` or, for more than one variable,
  ``oeqa.utils.commands.get_bb_vars()``.

- *Running a bitbake invocation for a build:* Use
  ``oeqa.utils.commands.bitbake()``.

- *Running a command:* Use ``oeqa.utils.commands.runCmd()``.

There is also an ``oeqa.utils.commands.runqemu()`` function for launching
the ``runqemu`` command for testing things within a running, virtualized
image.

You can run these tests in parallel. Parallelism works per test class,
so tests within a given test class should always run in the same build,
while tests in different classes or modules may be split into different
builds. There is no data store available for these tests since the tests
launch the ``bitbake`` command and exist outside of its context. As a
result, common bitbake library functions (``bb.*``) are also
unavailable.

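Conceptually, a helper such as ``get_bb_var()`` obtains a value by running ``bitbake -e`` and scanning its output for the final ``VAR="value"`` assignment line. The sketch below illustrates the idea against a fabricated output sample; the real helper handles quoting, exports, and many other cases that this toy parser ignores.

```python
# Fabricated fragment in the style of "bitbake -e" output.
SAMPLE_BITBAKE_E = '''\
# $MACHINE [2 operations]
MACHINE="qemux86-64"
# $DISTRO
DISTRO="poky"
'''

def get_var(output, name):
    """Return the last value assigned to name in bitbake -e style output."""
    value = None
    for line in output.splitlines():
        # Match lines of the form: NAME="value"
        if line.startswith('%s="' % name) and line.endswith('"'):
            value = line[len(name) + 2:-1]
    return value

print(get_var(SAMPLE_BITBAKE_E, "MACHINE"))
print(get_var(SAMPLE_BITBAKE_E, "DISTRO"))
```

Because every call spawns a full ``bitbake -e`` invocation, ``get_bb_vars()`` (one invocation for several variables) is cheaper than repeated ``get_bb_var()`` calls.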
.. _testimage-example:

``testimage``
-------------

These tests are run once an image is up and running, either on target
hardware or under QEMU. As a result, they are assumed to be running in a
target image environment, as opposed to a host build environment. A
simple example from ``meta/lib/oeqa/runtime/cases/python.py`` contains
the following::

   class PythonTest(OERuntimeTestCase):
       @OETestDepends(['ssh.SSHTest.test_ssh'])
       @OEHasPackage(['python3-core'])
       def test_python3(self):
           cmd = "python3 -c \"import codecs; print(codecs.encode('Uryyb, jbeyq', 'rot13'))\""
           status, output = self.target.run(cmd)
           msg = 'Exit status was not 0. Output: %s' % output
           self.assertEqual(status, 0, msg=msg)

In this example, the ``OERuntimeTestCase`` class wraps
``unittest.TestCase``. Within the test, ``self.target`` represents the
target system, where commands can be run on it using the ``run()``
method.

To ensure certain test or package dependencies are met, you can use the
``OETestDepends`` and ``OEHasPackage`` decorators. For example, the test
in this example would only make sense if ``python3-core`` is installed in
the image.

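The effect of ``OEHasPackage`` can be approximated with stock ``unittest`` skips, as in this sketch. The package manifest and ``has_package`` helper here are invented for illustration; they are not the real OEQA decorators.

```python
import unittest

# Hypothetical image manifest standing in for the real package data.
INSTALLED_PACKAGES = {"python3-core", "busybox"}

def has_package(name):
    """Skip the decorated test unless the package is in the image."""
    return unittest.skipUnless(name in INSTALLED_PACKAGES,
                               "%s not in image" % name)

class PythonTest(unittest.TestCase):
    @has_package("python3-core")
    def test_python3(self):
        self.assertTrue(True)  # would run a command on the target here

    @has_package("perl")
    def test_perl(self):
        self.fail("never reached: perl is not installed")

result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(PythonTest))
print(len(result.skipped), result.wasSuccessful())
```

Skipping rather than failing keeps the results meaningful across images with different package sets, which is exactly why the runtime tests declare their dependencies this way.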
.. _testsdk_ext-example:

``testsdk_ext``
---------------

These tests are run against built extensible SDKs (eSDKs). The tests can
assume that the eSDK environment has already been set up. An example from
``meta/lib/oeqa/sdk/cases/devtool.py`` contains the following::

   class DevtoolTest(OESDKExtTestCase):
       @classmethod
       def setUpClass(cls):
           myapp_src = os.path.join(cls.tc.esdk_files_dir, "myapp")
           cls.myapp_dst = os.path.join(cls.tc.sdk_dir, "myapp")
           shutil.copytree(myapp_src, cls.myapp_dst)
           subprocess.check_output(['git', 'init', '.'], cwd=cls.myapp_dst)
           subprocess.check_output(['git', 'add', '.'], cwd=cls.myapp_dst)
           subprocess.check_output(['git', 'commit', '-m', "'test commit'"],
                                   cwd=cls.myapp_dst)

       @classmethod
       def tearDownClass(cls):
           shutil.rmtree(cls.myapp_dst)

       def _test_devtool_build(self, directory):
           self._run('devtool add myapp %s' % directory)
           try:
               self._run('devtool build myapp')
           finally:
               self._run('devtool reset myapp')

       def test_devtool_build_make(self):
           self._test_devtool_build(self.myapp_dst)

In this example, the ``devtool`` command is tested to see whether a
sample application can be built with the ``devtool build`` command
within the eSDK.

.. _testsdk-example:

``testsdk``
-----------

These tests are run against built SDKs. The tests can assume that an SDK
has already been extracted and its environment file has been sourced. A
simple example from ``meta/lib/oeqa/sdk/cases/python2.py`` contains the
following::

   class Python3Test(OESDKTestCase):
       def setUp(self):
           if not (self.tc.hasHostPackage("nativesdk-python3-core") or
                   self.tc.hasHostPackage("python3-core-native")):
               raise unittest.SkipTest("No python3 package in the SDK")

       def test_python3(self):
           cmd = "python3 -c \"import codecs; print(codecs.encode('Uryyb, jbeyq', 'rot13'))\""
           output = self._run(cmd)
           self.assertEqual(output, "Hello, world\n")

In this example, if ``nativesdk-python3-core`` has been installed into
the SDK, the code runs the ``python3`` interpreter with a basic command
to check it is working correctly. The test would only run if python3 is
installed in the SDK.

.. _oe-build-perf-test-example:

``oe-build-perf-test``
----------------------

The performance tests usually measure how long operations take and the
resource utilisation as that happens. An example from
``meta/lib/oeqa/buildperf/test_basic.py`` contains the following::

   class Test3(BuildPerfTestCase):
       def test3(self):
           """Bitbake parsing (bitbake -p)"""
           # Drop all caches and parse
           self.rm_cache()
           oe.path.remove(os.path.join(self.bb_vars['TMPDIR'], 'cache'), True)
           self.measure_cmd_resources(['bitbake', '-p'], 'parse_1',
                                      'bitbake -p (no caches)')
           # Drop tmp/cache
           oe.path.remove(os.path.join(self.bb_vars['TMPDIR'], 'cache'), True)
           self.measure_cmd_resources(['bitbake', '-p'], 'parse_2',
                                      'bitbake -p (no tmp/cache)')
           # Parse with fully cached data
           self.measure_cmd_resources(['bitbake', '-p'], 'parse_3',
                                      'bitbake -p (cached)')

This example shows how three specific parsing timings are
measured, with and without various caches, to show how BitBake's parsing
performance trends over time.

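The measurement pattern behind calls like ``measure_cmd_resources()`` can be reduced to the sketch below; this is a simplified stand-in, since the real implementation also records CPU and I/O statistics and writes structured results, not just wall-clock time.

```python
import time

def measure(name, func):
    """Time a callable and return a small result record."""
    start = time.perf_counter()
    func()
    elapsed = time.perf_counter() - start
    return {"name": name, "elapsed": elapsed}

# Stand-in workload playing the role of something like "bitbake -p".
sample = measure("parse_1", lambda: sum(range(100000)))
print(sample["name"], sample["elapsed"] >= 0.0)
```

Running each named measurement under identical, isolated conditions is what lets ``oe-build-perf-report`` plot ``parse_1`` versus ``parse_3`` over time and expose regressions.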
.. _test-writing-considerations:

Considerations When Writing Tests
=================================

When writing good tests, there are several things to keep in mind. Since
things running on the Autobuilder are accessed concurrently by multiple
workers, consider the following:

**Running "cleanall" is not permitted.**

This can delete files from ``DL_DIR``, which would potentially break
other builds running in parallel. If this is required, ``DL_DIR`` must
be set to an isolated directory.

**Running "cleansstate" is not permitted.**

This can delete files from ``SSTATE_DIR``, which would potentially break
other builds running in parallel. If this is required, ``SSTATE_DIR``
must be set to an isolated directory. Alternatively, you can use the
``-f`` option with the ``bitbake`` command to "taint" tasks by changing
the sstate checksums to ensure sstate cache items will not be reused.

**Tests should not change the metadata.**

This is particularly true for oe-selftests, since these can run in
parallel and changing metadata leads to changing checksums, which
confuses BitBake while running in parallel. If this is necessary, copy
layers to a temporary location and modify them. Some tests need to
change metadata, such as the devtool tests. To prevent the metadata from
changing, set up temporary copies of that data first.
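
A minimal sketch of that "copy first, modify the copy" pattern is shown below. The layer path and ``layer.conf`` content are invented here purely for illustration; a real test would copy an actual layer out of the shared checkout.

```python
import os
import shutil
import tempfile

# Create a stand-in "shared" layer so the example is self-contained.
layer_src = tempfile.mkdtemp(prefix="layer-orig-")
with open(os.path.join(layer_src, "layer.conf"), "w") as f:
    f.write('BBFILE_PRIORITY_mylayer = "6"\n')

workdir = tempfile.mkdtemp(prefix="layer-copy-")
layer_copy = os.path.join(workdir, "meta-mylayer")
try:
    # Copy the layer aside and modify only the copy; the shared
    # metadata (and thus its checksums) stays untouched.
    shutil.copytree(layer_src, layer_copy)
    with open(os.path.join(layer_copy, "layer.conf"), "a") as f:
        f.write('BBFILE_PRIORITY_mylayer = "7"\n')
    modified = open(os.path.join(layer_copy, "layer.conf")).read()
    original = open(os.path.join(layer_src, "layer.conf")).read()
finally:
    shutil.rmtree(workdir)
    shutil.rmtree(layer_src)

print('"7"' in modified, '"7"' in original)
```

Cleaning up in a ``finally`` block (or ``tearDown``) matters just as much as copying: leftover temporary layers on a shared worker are their own source of cross-build interference.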