.. SPDX-License-Identifier: CC-BY-SA-2.0-UK
*******************************************
Understanding the Yocto Project Autobuilder
*******************************************
Execution Flow within the Autobuilder
=====================================
The "a-full" and "a-quick" targets are the usual entry points into the
Autobuilder and it makes sense to follow the process through the system
starting there. This is best visualized from the Autobuilder Console
view (:yocto_ab:`/typhoon/#/console`).
Each item along the top of that view represents a "target build", and
these targets all run in parallel. The "full" build triggers the
majority of them, while the "quick" build triggers a subset of them.
The Autobuilder effectively runs whichever configuration is defined for
each of those targets on a separate Buildbot worker. To understand the
configuration, you need to look at the entry in the ``config.json`` file
within the ``yocto-autobuilder-helper`` repository. The targets are
defined in the "overrides" section. A quick example is "qemux86-64",
which looks like::
"qemux86-64" : {
"MACHINE" : "qemux86-64",
"TEMPLATE" : "arch-qemu",
"step1" : {
"extravars" : [
"IMAGE_FSTYPES_append = ' wic wic.bmap'"
]
}
},
To expand that, you need the "arch-qemu" entry from
the "templates" section, which looks like::
"arch-qemu" : {
"BUILDINFO" : true,
"BUILDHISTORY" : true,
"step1" : {
"BBTARGETS" : "core-image-sato core-image-sato-dev core-image-sato-sdk core-image-minimal core-image-minimal-dev core-image-sato:do_populate_sdk",
"SANITYTARGETS" : "core-image-minimal:do_testimage core-image-sato:do_testimage core-image-sato-sdk:do_testimage core-image-sato:do_testsdk"
},
"step2" : {
"SDKMACHINE" : "x86_64",
"BBTARGETS" : "core-image-sato:do_populate_sdk core-image-minimal:do_populate_sdk_ext core-image-sato:do_populate_sdk_ext",
"SANITYTARGETS" : "core-image-sato:do_testsdk core-image-minimal:do_testsdkext core-image-sato:do_testsdkext"
},
"step3" : {
"BUILDHISTORY" : false,
"EXTRACMDS" : ["${SCRIPTSDIR}/checkvnc; DISPLAY=:1 oe-selftest ${HELPERSTMACHTARGS} -j 15"],
"ADDLAYER" : ["${BUILDDIR}/../meta-selftest"]
}
},
Combining these two entries, you can see that "qemux86-64" is a
three-step build where ``bitbake BBTARGETS`` is run, followed by
``bitbake SANITYTARGETS``, for each step; all for
``MACHINE="qemux86-64"`` but with differing ``SDKMACHINE`` settings. In
step 1, an extra variable is added to the ``auto.conf`` file to enable
wic image generation.
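
Conceptually, the override entry is layered on top of the template it
names. As a hand-merged illustration only (this is not literal
``config.json`` content), the effective step 1 configuration for
"qemux86-64" works out to something like::

   "step1" : {
       "BBTARGETS" : "core-image-sato core-image-sato-dev core-image-sato-sdk core-image-minimal core-image-minimal-dev core-image-sato:do_populate_sdk",
       "SANITYTARGETS" : "core-image-minimal:do_testimage core-image-sato:do_testimage core-image-sato-sdk:do_testimage core-image-sato:do_testsdk",
       "extravars" : [
           "IMAGE_FSTYPES_append = ' wic wic.bmap'"
       ]
   }
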
While not every detail of this is covered here, you can see how the
template mechanism allows quite complex configurations to be built up
while keeping duplication and repetition to a minimum.
The different build targets are designed to allow for parallelization:
different machines are usually built in parallel, while operations using
the same machine and metadata run sequentially, with the aim of
optimizing build efficiency as much as possible.
The ``config.json`` file is processed by the scripts in the Helper
repository in the ``scripts`` directory. The following section details
how this works.
Autobuilder Target Execution Overview
=====================================
For each given target in a build, the Autobuilder executes several
steps. These are configured in ``yocto-autobuilder2/builders.py`` and
roughly consist of:
#. *Run clobberdir*
This cleans out any previous build. Old builds are left in place to
allow easier debugging of failed builds, so they must be cleared out
before a new build starts. For additional information,
see :ref:`test-manual/understand-autobuilder:clobberdir`.
#. *Obtain yocto-autobuilder-helper*
This step clones the ``yocto-autobuilder-helper`` git repository.
This avoids having to maintain all the release- or project-specific
code within Buildbot itself. The branch chosen
matches the release being built so we can support older releases and
still make changes in newer ones.
#. *Write layerinfo.json*
This transfers the data entered in the Buildbot UI when the build was
configured over to the Helper.
#. *Call scripts/shared-repo-unpack*
This is a call into the Helper scripts to set up a checkout of all
the pieces this build might need. It might clone the BitBake
repository and the OpenEmbedded-Core repository. It may clone the
Poky repository, as well as additional layers. It will use the data
from the ``layerinfo.json`` file to help understand the
configuration. It will also use a local cache of repositories to
speed up the clone checkouts. For additional information, see
:ref:`test-manual/understand-autobuilder:Autobuilder Clone Cache`.
This step has two possible modes of operation. If the build is part
of a parent build, it's possible that all the repositories needed are
already available in a pre-prepared directory. An "a-quick"
or "a-full" build would prepare this before starting the other
sub-target builds. This is done for two reasons:
- the upstream may change during a build, for example from a forced
push, and this ensures we have matching content for the whole build
- if 15 Workers all tried to pull the same data from the same
repositories, we could hit resource limits on upstream servers, as
they may think they are under some kind of network attack
This pre-prepared directory is shared among the Workers over NFS. If
the build is an individual build and there is no "shared" directory
available, it would clone from the cache and the upstreams as
necessary. This is considered the fallback mode.
#. *Call scripts/run-config*
This is another call into the Helper scripts where it's expected that
the main functionality of this target will be executed.
Autobuilder Technology
======================
The Autobuilder has Yocto Project-specific functionality to allow builds
to operate with increased efficiency and speed.
clobberdir
----------
When deleting files, the Autobuilder uses ``clobberdir``, a special
script that moves files to a holding location rather than deleting
them immediately. Files in this location are later deleted by an ``rm``
command run under ``ionice -c 3``, which means the deletion only
happens when there is idle IO capacity on the Worker. The Autobuilder
Worker Janitor runs this deletion. See :ref:`test-manual/understand-autobuilder:Autobuilder Worker Janitor`.
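
The pattern is roughly as follows (a minimal sketch; the trash location
and naming here are illustrative, not the actual Helper paths)::

   # Instantly move the old build aside (a cheap rename on the same filesystem)
   mv build /trash/build-$(date +%s)

   # Later, the Worker Janitor removes it, but only when IO is otherwise idle
   ionice -c 3 rm -rf /trash/build-*
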
Autobuilder Clone Cache
-----------------------
Cloning repositories from scratch each time they are required was slow
on the Autobuilder. We therefore have a stash of commonly used
repositories pre-cloned on the Workers. Data is fetched from these
during clones first, then "topped up" with later revisions from any
upstream when necessary. The cache is maintained by the Autobuilder
Worker Janitor. See :ref:`test-manual/understand-autobuilder:Autobuilder Worker Janitor`.
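
The effect is similar to the following git usage (an illustrative
sketch; the cache path is hypothetical and the Helper scripts may use a
different mechanism)::

   # Clone from the local pre-cloned cache first (fast, no network access)
   git clone file:///srv/cache/poky.git poky

   # Then "top up" with any newer revisions from the real upstream
   cd poky
   git remote set-url origin https://git.yoctoproject.org/poky
   git fetch origin
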
Autobuilder Worker Janitor
--------------------------
This is a process running on each Worker that performs two basic
operations: background file deletion at IO idle (see
:ref:`test-manual/understand-autobuilder:Autobuilder Target Execution Overview`: Run clobberdir) and
maintenance of a cache of cloned repositories to improve the speed at
which the system can check out repositories.
Shared DL_DIR
-------------
The Workers are all connected over NFS, which allows ``DL_DIR`` to be
shared between them. This reduces network accesses from the system and
speeds up builds. Usage of the directory within the build system is
designed to be safe when shared over NFS.
Shared SSTATE_DIR
-----------------
The Workers are all connected over NFS, which allows the ``sstate``
directory to be shared between them. This means that once a Worker has
built an artifact, all the others can benefit from it. Usage of the
directory within the build system is designed for sharing over NFS.
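
In terms of build configuration, this amounts to pointing both
variables at NFS-mounted paths, for example (the paths here are
illustrative)::

   DL_DIR = "/nfs/shared/downloads"
   SSTATE_DIR = "/nfs/shared/sstate"
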
Resulttool
----------
All of the different tests run as part of the build generate output into
``testresults.json`` files. This allows us to determine which tests ran
in a given build and their status. Additional information, such as
failure logs or the time taken to run the tests, may also be included.
Resulttool is part of OpenEmbedded-Core and is used to manipulate these
JSON result files. It can merge files together, display reports of the
test results and compare different result files.
For details, see :yocto_wiki:`/Resulttool`.
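
Typical invocations look like the following (illustrative; check
``resulttool --help`` in a build environment for the exact subcommands
and arguments)::

   # Generate a report from the results in a directory
   resulttool report path/to/results/

   # Merge one set of result files into another
   resulttool merge base-results/ target-results/

   # Compare two sets of results to spot regressions
   resulttool regression base-results/ target-results/
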
run-config Target Execution
===========================
The ``scripts/run-config`` execution is where most of the work within
the Autobuilder happens. It runs through a number of steps. The first
ones are general setup steps that are run once and include:
#. Set up any ``buildtools-tarball`` if configured.
#. Call "buildhistory-init" if buildhistory is configured.
For each step that is configured in ``config.json``, it will perform the
following:
#. Add any layers that are specified using the
``bitbake-layers add-layer`` command (logging as stepXa)
#. Call the ``scripts/setup-config`` script to generate the necessary
``auto.conf`` configuration file for the build
#. Run the ``bitbake BBTARGETS`` command (logging as stepXb)
#. Run the ``bitbake SANITYTARGETS`` command (logging as stepXc)
#. Run the ``EXTRACMDS`` command(s), which are run within the BitBake build
environment (logging as stepXd)
#. Run the ``EXTRAPLAINCMDS`` command(s), which are run outside the
BitBake build environment (logging as stepXd)
#. Remove any layers added in step 1 using the
``bitbake-layers remove-layer`` command (logging as stepXa)
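
For a given step N, the sequence above corresponds roughly to the
following shell commands (an illustrative sketch of the flow, not the
literal Helper code; the layer path is taken from the "qemux86-64"
example earlier)::

   # stepNa: add any configured layers
   bitbake-layers add-layer ../meta-selftest

   # scripts/setup-config generates the auto.conf for this step (arguments omitted)

   # stepNb: build the main targets
   bitbake $BBTARGETS

   # stepNc: run the test targets
   bitbake $SANITYTARGETS

   # stepNd: run any extra commands within the BitBake environment
   $EXTRACMDS

   # stepNa: remove the layers that were added
   bitbake-layers remove-layer ../meta-selftest
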
Once the execution steps above complete, ``run-config`` executes a set
of post-build steps, including:
#. Call ``scripts/publish-artifacts`` to collect any output which is to
be saved from the build.
#. Call ``scripts/collect-results`` to collect any test results to be
saved from the build.
#. Call ``scripts/upload-error-reports`` to send any error reports
generated to the remote server.
#. Clean up the build directory using
:ref:`test-manual/understand-autobuilder:clobberdir` if the build was
successful, or rename it to "build-renamed" for potential future
debugging.
Deploying Yocto Autobuilder
===========================
The most up-to-date information about how to set up and deploy your own
Autobuilder can be found in the README.md file in the
``yocto-autobuilder2`` repository.
We hope that people can use the ``yocto-autobuilder2`` code directly, but
it is inevitable that users will end up needing to heavily customize the
``yocto-autobuilder-helper`` repository, particularly the
``config.json`` file, as they will want to define their own test matrix.
The Autobuilder supports two customization options:
- variable substitution
- overlaying configuration files
The standard ``config.json`` minimally attempts to allow substitution of
the paths. The Helper script repository includes a
``local-example.json`` file to show how you could override these from a
separate configuration file. Pass the following into the environment of
the Autobuilder::

   $ ABHELPER_JSON="config.json local-example.json"

As another example, you could also pass the following into the
environment::

   $ ABHELPER_JSON="config.json /some/location/local.json"

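For illustration, such an override file might look like the following
(the variable names are modeled on ``local-example.json``; treat the
values as placeholders for your own paths)::

   {
       "BASE_HOMEDIR" : "/home/pokybuild",
       "BASE_SHAREDDIR" : "/srv/autobuilder/shared"
   }
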
One issue users often run into is validation of the ``config.json`` files. A
tip for minimizing issues from invalid JSON files is to use a Git
``pre-commit-hook.sh`` script to verify the JSON file before committing
it. Create a symbolic link as follows::

   $ ln -s ../../scripts/pre-commit-hook.sh .git/hooks/pre-commit