-rw-r--r--  handbook/extendpoky.xml                                        95
-rw-r--r--  meta/classes/base.bbclass                                     102
-rw-r--r--  meta/classes/insane.bbclass                                     2
-rw-r--r--  meta/classes/package.bbclass                                   20
-rw-r--r--  meta/classes/package_deb.bbclass                                4
-rw-r--r--  meta/classes/package_ipk.bbclass                                4
-rw-r--r--  meta/classes/packagedata.bbclass                               82
-rw-r--r--  meta/packages/libidl/libidl-native_0.8.12.bb (renamed from meta/packages/libidl/libidl-native_0.8.3.bb)   0
-rw-r--r--  meta/packages/libidl/libidl_0.8.12.bb (renamed from meta/packages/libidl/libidl_0.8.3.bb)                 0
-rw-r--r--  meta/packages/xorg-xserver/xserver-xf86-dri-lite_1.6.0.bb     15
10 files changed, 205 insertions, 119 deletions
diff --git a/handbook/extendpoky.xml b/handbook/extendpoky.xml
index f259d2ef0a..fa789d4afb 100644
--- a/handbook/extendpoky.xml
+++ b/handbook/extendpoky.xml
@@ -26,7 +26,15 @@
 </para>
 
 <para>
-The simplest way to add a new package is to base it on a similar
+Before writing a recipe from scratch it is often useful to check
+someone else hasn't written one already. OpenEmbedded is a good place
+to look as it has a wider scope and hence a wider range of packages.
+Poky aims to be compatible with OpenEmbedded so most recipes should
+just work in Poky.
+</para>
+
+<para>
+For new packages, the simplest way to add a recipe is to base it on a similar
 pre-existing recipe. There are some examples below of how to add
 standard types of packages:
 </para>
@@ -556,6 +564,37 @@ BBFILE_PRIORITY_extras = "5"</literallayout>
 </para>
 </section>
 
+<section id="usingpoky-changes-supplement">
+<title>Supplementary Metadata Repositories</title>
+
+<para>
+Often when developing a project based on Poky there will be components
+that are not ready for public consumption for whatever reason. By making
+use of the collections mechanism and other functionality within Poky, it
+is possible to have a public repository which is supplemented by a private
+one just containing the pieces that need to be kept private.
+</para>
+<para>
+The usual approach with these is to create a separate git repository called
+"meta-prvt-XXX" which is checked out alongside the other meta-*
+directories included in Poky. Under this directory there can be several
+different directories such as classes, conf and packages which all
+function as per the usual Poky directory structure.
+</para>
+<para>
+If extra meta-* directories are found, Poky will automatically add them
+into the BBPATH variable so the conf and class files contained there
+are found. If a file called poky-extra-environment is found within the
+meta-* directory, this will be sourced as the environment is set up,
+allowing certain configuration to be overridden, such as the location of the
+local.conf.sample file that is used.
+</para>
+<para>
+Note that at present, BBFILES is not automatically changed and this needs
+to be adjusted to find files in the packages/ directory. Usually a custom
+local.conf.sample file will be used to handle this instead.
+</para>
+</section>
 
 <section id='usingpoky-changes-commits'>
 <title>Committing Changes</title>
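To illustrate the BBFILES adjustment mentioned in the new section above, a custom local.conf.sample for a hypothetical supplementary repository named meta-prvt-xxx might extend the collections setup roughly as follows; the layer name, the ${OEROOT} path variable and the priority value are placeholders rather than part of this change:

    BBFILES += "${OEROOT}/meta-prvt-xxx/packages/*/*.bb"
    BBFILE_COLLECTIONS += "prvt"
    BBFILE_PATTERN_prvt = "^${OEROOT}/meta-prvt-xxx/"
    BBFILE_PRIORITY_prvt = "10"

This mirrors the BBFILE_PRIORITY_extras example earlier in the handbook and simply adds the private layer's packages directory to the set of recipe files BitBake parses.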
@@ -564,8 +603,8 @@ BBFILE_PRIORITY_extras = "5"</literallayout>
 Modifications to Poky are often managed under some kind of source
 revision control system. The policy for committing to such systems
 is important as some simple policy can significantly improve
-usability. The tips below are based on the policy that OpenedHand
-uses for commits to Poky.
+usability. The tips below are based on the policy followed for the
+Poky core.
 </para>
 
 <para>
@@ -622,6 +661,56 @@ BBFILE_PRIORITY_extras = "5"</literallayout>
 upgradable packages in all cases.
 </para>
 </section>
+<section id='usingpoky-changes-collaborate'>
+<title>Using Poky in a Team Environment</title>
+
+<para>
+It may not be immediately clear how Poky can work in a team environment,
+or scale to a large team of developers. The specifics of any situation
+will determine the best solution and Poky offers immense flexibility in
+that respect, but there are some practices that experience has shown to work
+well.
+</para>
+
+<para>
+The core component of any development effort with Poky is often an
+automated build testing framework and image generation process. This
+can be used to check that the metadata is buildable, highlight when
+commits break the builds and provide up-to-date images, allowing people
+to test the end result and use them as a base platform for further
+development. Experience shows that buildbot is a good fit for this role
+and that it works well to configure it to make two types of build -
+incremental builds and 'from scratch'/full builds. The incremental builds
+can be tied to a commit hook which triggers them each time a commit is
+made to the metadata and are a useful acid test of whether a given commit
+breaks the build in some serious way. They catch lots of simple errors
+and whilst they won't catch 100% of failures, the tests are fast so
+developers can get feedback on their changes quickly. The full builds
+rebuild everything from the ground up and test everything.
+They usually happen at preset times such as at night when the machine
+load isn't high from the incremental builds.
+</para>
+
+<para>
+Most teams have pieces of software undergoing active development. It is of
+significant benefit to put these under the control of a source control system
+compatible with Poky, such as git or svn. The autobuilder can then be set to
+pull the latest revisions of these packages so the latest commits get tested
+by the builds, allowing any issues to be highlighted quickly. Poky easily
+supports configurations where there is both a stable known good revision
+and a floating revision to test. Poky can also take changes only from specific
+source control branches, giving another way it can be used to track/test only
+specified changes.
+</para>
+<para>
+Perhaps the hardest part of setting this up is the policy that surrounds
+the different source control systems, be they software projects or the Poky
+metadata itself. The circumstances will be different in each case but this is
+one of Poky's advantages - the system itself doesn't force any particular policy,
+unlike a lot of build systems, allowing the best policy to be chosen for the
+circumstances.
+</para>
+</section>
 </section>
 
 <section id='usingpoky-modifing-packages'>
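The "stable known good revision" versus "floating revision" approach described in the new team-environment section can be expressed through SRCREV settings in the build configuration. A minimal sketch, with hypothetical recipe names and placeholder values, might look like this (not part of this commit):

    # Pin one component to a known good commit for stable builds
    SRCREV_pn-stable-component = "<known-good-commit-id>"

    # Let the autobuilder follow the latest commits of another component
    SRCREV_pn-devel-component = "${AUTOREV}"

Restricting which branch is followed is normally handled in the recipe's SRC_URI (for example, via the git fetcher's branch parameter), which is how the section's point about tracking only specific branches is usually put into practice.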
diff --git a/meta/classes/base.bbclass b/meta/classes/base.bbclass
index b7eb62c01a..e801fd12a9 100644
--- a/meta/classes/base.bbclass
+++ b/meta/classes/base.bbclass
@@ -946,108 +946,6 @@ addtask build after do_populate_staging
 do_build = ""
 do_build[func] = "1"
 
-# Functions that update metadata based on files outputted
-# during the build process.
-
-def explode_deps(s):
-    r = []
-    l = s.split()
-    flag = False
-    for i in l:
-        if i[0] == '(':
-            flag = True
-            j = []
-        if flag:
-            j.append(i)
-            if i.endswith(')'):
-                flag = False
-                r[-1] += ' ' + ' '.join(j)
-        else:
-            r.append(i)
-    return r
-
-def packaged(pkg, d):
-    import os, bb
-    return os.access(get_subpkgedata_fn(pkg, d) + '.packaged', os.R_OK)
-
-def read_pkgdatafile(fn):
-    pkgdata = {}
-
-    def decode(str):
-        import codecs
-        c = codecs.getdecoder("string_escape")
-        return c(str)[0]
-
-    import os
-    if os.access(fn, os.R_OK):
-        import re
-        f = file(fn, 'r')
-        lines = f.readlines()
-        f.close()
-        r = re.compile("([^:]+):\s*(.*)")
-        for l in lines:
-            m = r.match(l)
-            if m:
-                pkgdata[m.group(1)] = decode(m.group(2))
-
-    return pkgdata
-
-def get_subpkgedata_fn(pkg, d):
-    import bb, os
-    archs = bb.data.expand("${PACKAGE_ARCHS}", d).split(" ")
-    archs.reverse()
-    pkgdata = bb.data.expand('${TMPDIR}/pkgdata/', d)
-    targetdir = bb.data.expand('${TARGET_VENDOR}-${TARGET_OS}/runtime/', d)
-    for arch in archs:
-        fn = pkgdata + arch + targetdir + pkg
-        if os.path.exists(fn):
-            return fn
-    return bb.data.expand('${PKGDATA_DIR}/runtime/%s' % pkg, d)
-
-def has_subpkgdata(pkg, d):
-    import bb, os
-    return os.access(get_subpkgedata_fn(pkg, d), os.R_OK)
-
-def read_subpkgdata(pkg, d):
-    import bb
-    return read_pkgdatafile(get_subpkgedata_fn(pkg, d))
-
-def has_pkgdata(pn, d):
-    import bb, os
-    fn = bb.data.expand('${PKGDATA_DIR}/%s' % pn, d)
-    return os.access(fn, os.R_OK)
-
-def read_pkgdata(pn, d):
-    import bb
-    fn = bb.data.expand('${PKGDATA_DIR}/%s' % pn, d)
-    return read_pkgdatafile(fn)
-
-python read_subpackage_metadata () {
-    import bb
-    data = read_pkgdata(bb.data.getVar('PN', d, 1), d)
-
-    for key in data.keys():
-        bb.data.setVar(key, data[key], d)
-
-    for pkg in bb.data.getVar('PACKAGES', d, 1).split():
-        sdata = read_subpkgdata(pkg, d)
-        for key in sdata.keys():
-            bb.data.setVar(key, sdata[key], d)
-}
-
-
-#
-# Collapse FOO_pkg variables into FOO
-#
-def read_subpkgdata_dict(pkg, d):
-    import bb
-    ret = {}
-    subd = read_pkgdatafile(get_subpkgedata_fn(pkg, d))
-    for var in subd:
-        newvar = var.replace("_" + pkg, "")
-        ret[newvar] = subd[var]
-    return ret
-
 # Make sure MACHINE isn't exported
 # (breaks binutils at least)
 MACHINE[unexport] = "1"
diff --git a/meta/classes/insane.bbclass b/meta/classes/insane.bbclass
index 1f136d78ce..2b0c284775 100644
--- a/meta/classes/insane.bbclass
+++ b/meta/classes/insane.bbclass
@@ -439,7 +439,7 @@ def package_qa_check_rdepends(pkg, workdir, d):
         bb.data.update_data(localdata)
 
         # Now check the RDEPENDS
-        rdepends = explode_deps(bb.data.getVar('RDEPENDS', localdata, True) or "")
+        rdepends = bb.utils.explode_deps(bb.data.getVar('RDEPENDS', localdata, True) or "")
 
 
         # Now do the sanity check!!!
diff --git a/meta/classes/package.bbclass b/meta/classes/package.bbclass
index 282315567f..df870142f1 100644
--- a/meta/classes/package.bbclass
+++ b/meta/classes/package.bbclass
@@ -2,6 +2,8 @@
 # General packaging help functions
 #
 
+inherit packagedata
+
 PKGDEST = "${WORKDIR}/install"
 
 def legitimize_package_name(s):
@@ -208,7 +210,7 @@ def runtime_mapping_rename (varname, d):
     #bb.note("%s before: %s" % (varname, bb.data.getVar(varname, d, 1)))
 
     new_depends = []
-    for depend in explode_deps(bb.data.getVar(varname, d, 1) or ""):
+    for depend in bb.utils.explode_deps(bb.data.getVar(varname, d, 1) or ""):
         # Have to be careful with any version component of the depend
         split_depend = depend.split(' (')
         new_depend = get_package_mapping(split_depend[0].strip(), d)
@@ -438,7 +440,7 @@ python populate_packages () {
             dangling_links[pkg].append(os.path.normpath(target))
 
     for pkg in package_list:
-        rdepends = explode_deps(bb.data.getVar('RDEPENDS_' + pkg, d, 1) or bb.data.getVar('RDEPENDS', d, 1) or "")
+        rdepends = bb.utils.explode_deps(bb.data.getVar('RDEPENDS_' + pkg, d, 1) or bb.data.getVar('RDEPENDS', d, 1) or "")
         for l in dangling_links[pkg]:
             found = False
             bb.debug(1, "%s contains dangling link %s" % (pkg, l))
@@ -868,7 +870,7 @@ python package_do_pkgconfig () {
 python read_shlibdeps () {
     packages = bb.data.getVar('PACKAGES', d, 1).split()
     for pkg in packages:
-        rdepends = explode_deps(bb.data.getVar('RDEPENDS_' + pkg, d, 0) or bb.data.getVar('RDEPENDS', d, 0) or "")
+        rdepends = bb.utils.explode_deps(bb.data.getVar('RDEPENDS_' + pkg, d, 0) or bb.data.getVar('RDEPENDS', d, 0) or "")
         for extension in ".shlibdeps", ".pcdeps", ".clilibdeps":
             depsfile = bb.data.expand("${PKGDEST}/" + pkg + extension, d)
             if os.access(depsfile, os.R_OK):
@@ -901,7 +903,7 @@ python package_depchains() {
     def pkg_adddeprrecs(pkg, base, suffix, getname, depends, d):
 
         #bb.note('depends for %s is %s' % (base, depends))
-        rreclist = explode_deps(bb.data.getVar('RRECOMMENDS_' + pkg, d, 1) or bb.data.getVar('RRECOMMENDS', d, 1) or "")
+        rreclist = bb.utils.explode_deps(bb.data.getVar('RRECOMMENDS_' + pkg, d, 1) or bb.data.getVar('RRECOMMENDS', d, 1) or "")
 
         for depend in depends:
             if depend.find('-native') != -1 or depend.find('-cross') != -1 or depend.startswith('virtual/'):
@@ -922,7 +924,7 @@ python package_depchains() {
     def pkg_addrrecs(pkg, base, suffix, getname, rdepends, d):
 
         #bb.note('rdepends for %s is %s' % (base, rdepends))
-        rreclist = explode_deps(bb.data.getVar('RRECOMMENDS_' + pkg, d, 1) or bb.data.getVar('RRECOMMENDS', d, 1) or "")
+        rreclist = bb.utils.explode_deps(bb.data.getVar('RRECOMMENDS_' + pkg, d, 1) or bb.data.getVar('RRECOMMENDS', d, 1) or "")
 
         for depend in rdepends:
             if depend.find('virtual-locale-') != -1:
@@ -946,15 +948,15 @@ python package_depchains() {
             list.append(dep)
 
     depends = []
-    for dep in explode_deps(bb.data.getVar('DEPENDS', d, 1) or ""):
+    for dep in bb.utils.explode_deps(bb.data.getVar('DEPENDS', d, 1) or ""):
         add_dep(depends, dep)
 
     rdepends = []
-    for dep in explode_deps(bb.data.getVar('RDEPENDS', d, 1) or ""):
+    for dep in bb.utils.explode_deps(bb.data.getVar('RDEPENDS', d, 1) or ""):
         add_dep(rdepends, dep)
 
     for pkg in packages.split():
-        for dep in explode_deps(bb.data.getVar('RDEPENDS_' + pkg, d, 1) or ""):
+        for dep in bb.utils.explode_deps(bb.data.getVar('RDEPENDS_' + pkg, d, 1) or ""):
             add_dep(rdepends, dep)
 
     #bb.note('rdepends is %s' % rdepends)
@@ -987,7 +989,7 @@ python package_depchains() {
             pkg_addrrecs(pkg, base, suffix, func, rdepends, d)
         else:
             rdeps = []
-            for dep in explode_deps(bb.data.getVar('RDEPENDS_' + base, d, 1) or bb.data.getVar('RDEPENDS', d, 1) or ""):
+            for dep in bb.utils.explode_deps(bb.data.getVar('RDEPENDS_' + base, d, 1) or bb.data.getVar('RDEPENDS', d, 1) or ""):
                 add_dep(rdeps, dep)
             pkg_addrrecs(pkg, base, suffix, func, rdeps, d)
 }
diff --git a/meta/classes/package_deb.bbclass b/meta/classes/package_deb.bbclass
index 28e67fcc9b..d90939fdb6 100644
--- a/meta/classes/package_deb.bbclass
+++ b/meta/classes/package_deb.bbclass
@@ -194,9 +194,9 @@ python do_package_deb () {
 
         bb.build.exec_func("mapping_rename_hook", localdata)
 
-        rdepends = explode_deps(unicode(bb.data.getVar("RDEPENDS", localdata, 1) or ""))
+        rdepends = bb.utils.explode_deps(unicode(bb.data.getVar("RDEPENDS", localdata, 1) or ""))
         rdepends = [dep for dep in rdepends if not '*' in dep]
-        rrecommends = explode_deps(unicode(bb.data.getVar("RRECOMMENDS", localdata, 1) or ""))
+        rrecommends = bb.utils.explode_deps(unicode(bb.data.getVar("RRECOMMENDS", localdata, 1) or ""))
         rrecommends = [rec for rec in rrecommends if not '*' in rec]
         rsuggests = (unicode(bb.data.getVar("RSUGGESTS", localdata, 1) or "")).split()
         rprovides = (unicode(bb.data.getVar("RPROVIDES", localdata, 1) or "")).split()
diff --git a/meta/classes/package_ipk.bbclass b/meta/classes/package_ipk.bbclass
index c4f53046f5..1aa2c814bb 100644
--- a/meta/classes/package_ipk.bbclass
+++ b/meta/classes/package_ipk.bbclass
@@ -235,8 +235,8 @@ python do_package_ipk () {
 
         bb.build.exec_func("mapping_rename_hook", localdata)
 
-        rdepends = explode_deps(bb.data.getVar("RDEPENDS", localdata, 1) or "")
-        rrecommends = explode_deps(bb.data.getVar("RRECOMMENDS", localdata, 1) or "")
+        rdepends = bb.utils.explode_deps(bb.data.getVar("RDEPENDS", localdata, 1) or "")
+        rrecommends = bb.utils.explode_deps(bb.data.getVar("RRECOMMENDS", localdata, 1) or "")
         rsuggests = (bb.data.getVar("RSUGGESTS", localdata, 1) or "").split()
         rprovides = (bb.data.getVar("RPROVIDES", localdata, 1) or "").split()
         rreplaces = (bb.data.getVar("RREPLACES", localdata, 1) or "").split()
diff --git a/meta/classes/packagedata.bbclass b/meta/classes/packagedata.bbclass
new file mode 100644
index 0000000000..c9d64d6da2
--- /dev/null
+++ b/meta/classes/packagedata.bbclass
@@ -0,0 +1,82 @@
+def packaged(pkg, d):
+    import os, bb
+    return os.access(get_subpkgedata_fn(pkg, d) + '.packaged', os.R_OK)
+
+def read_pkgdatafile(fn):
+    pkgdata = {}
+
+    def decode(str):
+        import codecs
+        c = codecs.getdecoder("string_escape")
+        return c(str)[0]
+
+    import os
+    if os.access(fn, os.R_OK):
+        import re
+        f = file(fn, 'r')
+        lines = f.readlines()
+        f.close()
+        r = re.compile("([^:]+):\s*(.*)")
+        for l in lines:
+            m = r.match(l)
+            if m:
+                pkgdata[m.group(1)] = decode(m.group(2))
+
+    return pkgdata
+
+def get_subpkgedata_fn(pkg, d):
+    import bb, os
+    archs = bb.data.expand("${PACKAGE_ARCHS}", d).split(" ")
+    archs.reverse()
+    pkgdata = bb.data.expand('${TMPDIR}/pkgdata/', d)
+    targetdir = bb.data.expand('${TARGET_VENDOR}-${TARGET_OS}/runtime/', d)
+    for arch in archs:
+        fn = pkgdata + arch + targetdir + pkg
+        if os.path.exists(fn):
+            return fn
+    return bb.data.expand('${PKGDATA_DIR}/runtime/%s' % pkg, d)
+
+def has_subpkgdata(pkg, d):
+    import bb, os
+    return os.access(get_subpkgedata_fn(pkg, d), os.R_OK)
+
+def read_subpkgdata(pkg, d):
+    import bb
+    return read_pkgdatafile(get_subpkgedata_fn(pkg, d))
+
+def has_pkgdata(pn, d):
+    import bb, os
+    fn = bb.data.expand('${PKGDATA_DIR}/%s' % pn, d)
+    return os.access(fn, os.R_OK)
+
+def read_pkgdata(pn, d):
+    import bb
+    fn = bb.data.expand('${PKGDATA_DIR}/%s' % pn, d)
+    return read_pkgdatafile(fn)
+
+python read_subpackage_metadata () {
+    import bb
+    data = read_pkgdata(bb.data.getVar('PN', d, 1), d)
+
+    for key in data.keys():
+        bb.data.setVar(key, data[key], d)
+
+    for pkg in bb.data.getVar('PACKAGES', d, 1).split():
+        sdata = read_subpkgdata(pkg, d)
+        for key in sdata.keys():
+            bb.data.setVar(key, sdata[key], d)
+}
+
+
+#
+# Collapse FOO_pkg variables into FOO
+#
+def read_subpkgdata_dict(pkg, d):
+    import bb
+    ret = {}
+    subd = read_pkgdatafile(get_subpkgedata_fn(pkg, d))
+    for var in subd:
+        newvar = var.replace("_" + pkg, "")
+        ret[newvar] = subd[var]
+    return ret
+
diff --git a/meta/packages/libidl/libidl-native_0.8.3.bb b/meta/packages/libidl/libidl-native_0.8.12.bb
index ce59fd4b86..ce59fd4b86 100644
--- a/meta/packages/libidl/libidl-native_0.8.3.bb
+++ b/meta/packages/libidl/libidl-native_0.8.12.bb
diff --git a/meta/packages/libidl/libidl_0.8.3.bb b/meta/packages/libidl/libidl_0.8.12.bb
index ac10a2422f..ac10a2422f 100644
--- a/meta/packages/libidl/libidl_0.8.3.bb
+++ b/meta/packages/libidl/libidl_0.8.12.bb
diff --git a/meta/packages/xorg-xserver/xserver-xf86-dri-lite_1.6.0.bb b/meta/packages/xorg-xserver/xserver-xf86-dri-lite_1.6.0.bb
new file mode 100644
index 0000000000..8f5ed47d38
--- /dev/null
+++ b/meta/packages/xorg-xserver/xserver-xf86-dri-lite_1.6.0.bb
@@ -0,0 +1,15 @@
+require xserver-xf86-dri-lite.inc
+
+PE = "1"
+PR = "r0"
+
+PROTO_DEPS += "xf86driproto"
+
+SRC_URI += "file://nodolt.patch;patch=1 \
+            file://libdri-xinerama-symbol.patch;patch=1 \
+            file://xserver-boottime.patch;patch=1"
+
+# Misc build failure for master HEAD
+SRC_URI += "file://fix_open_max_preprocessor_error.patch;patch=1"
+
+EXTRA_OECONF += "--enable-dri --enable-dri2"