path: root/dynamic-layers/openembedded-layer

* dldt-inference-engine: Add ISSL license for the firmware files (Martin Jansa, 2019-10-25; 1 file, -2/+5)
  Signed-off-by: Martin Jansa <Martin.Jansa@gmail.com>
  Signed-off-by: Anuj Mittal <anuj.mittal@intel.com>
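
  A hedged sketch of the kind of license metadata this entry describes; the
  main license and the firmware package name below are assumptions, not the
  actual recipe contents:

      LICENSE = "Apache-2.0 & ISSL"
      # Hypothetical per-package license: the VPU firmware blobs are covered by
      # the Intel Simplified Software License (ISSL) rather than Apache-2.0.
      LICENSE_${PN}-vpu-firmware = "ISSL"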

* dldt-inference-engine: return support for VPU (Martin Jansa, 2019-10-25; 1 file, -2/+21)
  * add PACKAGECONFIG for vpu
  * add extra package for firmware files
  * tested on rpi4 with NCS2
  Signed-off-by: Martin Jansa <Martin.Jansa@gmail.com>
  Signed-off-by: Anuj Mittal <anuj.mittal@intel.com>
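
  A minimal sketch of how a vpu PACKAGECONFIG and a firmware package are
  typically wired up in a recipe; the CMake options, dependencies, firmware
  file pattern and package name are assumptions, not the actual recipe
  contents:

      PACKAGECONFIG ??= "vpu"
      # Hypothetical option names and dependencies for the VPU (MYRIAD/NCS2) plugin.
      PACKAGECONFIG[vpu] = "-DENABLE_MYRIAD=ON,-DENABLE_MYRIAD=OFF,libusb1,${PN}-vpu-firmware"

      # Ship the firmware blobs in their own package (install path is an assumption).
      PACKAGES =+ "${PN}-vpu-firmware"
      FILES_${PN}-vpu-firmware = "${libdir}/*.mvcmd"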

* dldt-inference-engine: install extension headers in includedir instead of share/doc (Martin Jansa, 2019-10-24; 1 file, -1/+1)
  * otherwise components depending on them won't be able to find them
  Signed-off-by: Martin Jansa <Martin.Jansa@gmail.com>
  Signed-off-by: Anuj Mittal <anuj.mittal@intel.com>
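
  One way to express that relocation in a recipe, as a sketch only; the actual
  change is a one-line install-path fix and the paths below are hypothetical:

      do_install_append() {
          # Move the extension headers out of ${datadir}/doc and into
          # ${includedir} so dependent recipes find them in the sysroot.
          install -d ${D}${includedir}/ext
          mv ${D}${datadir}/doc/ext/*.hpp ${D}${includedir}/ext/
      }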

* dldt-inference-engine: add SSTATE_SCAN_FILES to fix CMake files (Martin Jansa, 2019-10-24; 1 file, -0/+4)
  Signed-off-by: Martin Jansa <Martin.Jansa@gmail.com>
  Signed-off-by: Anuj Mittal <anuj.mittal@intel.com>
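
  A sketch of what such an addition usually looks like; the exact glob pattern
  used by the recipe is an assumption:

      # Let the sstate machinery relocate the hard-coded paths embedded in the
      # exported CMake config files when installing from shared state.
      SSTATE_SCAN_FILES += "*.cmake"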

* dldt-inference-engine: build clDNN against opencl-icd-loader (Chin Huat Ang, 2019-10-21; 1 file, -2/+1)
  Instead of letting clDNN build against the intel_ocl_icd prebuilt binaries
  under clDNN/common/intel_ocl_icd, configure the cmake build to pick up
  opencl-icd-loader headers and libraries from the staging directory.
  Do not set CMAKE_INSTALL_LOCAL_ONLY as it is unused.
  Signed-off-by: Chin Huat Ang <chin.huat.ang@intel.com>
  Signed-off-by: Anuj Mittal <anuj.mittal@intel.com>
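
  A hedged sketch of how a recipe might point the clDNN build at the ICD
  loader staged in the sysroot; the CMake variable names below are
  assumptions, not the options the recipe actually passes:

      DEPENDS += "opencl-icd-loader"

      # Use the staged OpenCL headers and loader library instead of the
      # prebuilt binaries bundled under clDNN/common/intel_ocl_icd.
      EXTRA_OECMAKE += " \
          -DOPENCL_INCLUDE_DIRS=${STAGING_INCDIR} \
          -DOPENCL_LIBRARIES=${STAGING_LIBDIR}/libOpenCL.so \
      "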

* dldt-inference-engine: update 2019r2 -> 2019r3 (Chin Huat Ang, 2019-10-21; 9 files, -246/+215)
  Refresh patches so that they apply cleanly on 2019r3.
  Signed-off-by: Chin Huat Ang <chin.huat.ang@intel.com>
  Signed-off-by: Anuj Mittal <anuj.mittal@intel.com>

* dldt-inference-engine: disable VPU plugins (Chin Huat Ang, 2019-10-21; 1 file, -10/+1)
  VPU plugins are untested; temporarily disable them.
  Signed-off-by: Chin Huat Ang <chin.huat.ang@intel.com>
  Signed-off-by: Anuj Mittal <anuj.mittal@intel.com>
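
  A sketch of one way such plugins get switched off in a recipe; the option
  name is an assumption, and the change itself may simply drop the relevant
  configuration lines instead:

      # VPU plugins are untested on this release, so build without them for now.
      EXTRA_OECMAKE += "-DENABLE_VPU=OFF"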

* dldt-inference-engine: remove DEPENDS mkl-dnn (Chin Huat Ang, 2019-10-21; 1 file, -1/+0)
  The inference engine is still downloading and building its own copy of
  mkl-dnn, so remove it from DEPENDS.
  Signed-off-by: Chin Huat Ang <chin.huat.ang@intel.com>
  Signed-off-by: Anuj Mittal <anuj.mittal@intel.com>

* dldt-inference-engine: fix ptest failures (Chin Huat Ang, 2019-10-21; 3 files, -0/+34)
  Package libmock_engine.so as part of dldt-inference-engine-ptest and update
  run-ptest to set LD_LIBRARY_PATH to fix the following InferenceEngineUnitTests
  failures:
    FAIL: 12 tests, listed below:
    FAIL: PluginTest.canCreatePlugin
    FAIL: PluginTest.canCreatePluginUsingSmartPtr
    FAIL: PluginTest.shouldThrowExceptionIfPluginNotExist
    FAIL: PluginTest.canCallErrorHandlerIfNecessary
    FAIL: PluginTest.canForwardPluginEnginePtr
    FAIL: PluginTest.canSetConfiguration
    FAIL: PluginDispatcherTests.canLoadMockPlugin
    FAIL: PluginDispatcherTests.returnsIfLoadSuccessfull
    FAIL: SharedObjectLoaderTests.canLoadExistedPlugin
    FAIL: SharedObjectLoaderTests.canFindExistedMethod
    FAIL: SharedObjectLoaderTests.throwIfMethodNofFoundInLibrary
    FAIL: SharedObjectLoaderTests.canCallExistedMethod
  Signed-off-by: Chin Huat Ang <chin.huat.ang@intel.com>
  Signed-off-by: Anuj Mittal <anuj.mittal@intel.com>
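
  A sketch of the kind of ptest packaging change described; the build-tree
  source path and install location below are assumptions:

      do_install_ptest_append() {
          # Hypothetical source path: ship the mock plugin the unit tests
          # dlopen next to the installed test binaries.
          install -d ${D}${PTEST_PATH}/lib
          install -m 0644 ${B}/lib/libmock_engine.so ${D}${PTEST_PATH}/lib/
      }

      # run-ptest is then updated to export an LD_LIBRARY_PATH that includes
      # the directory above before invoking InferenceEngineUnitTests.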

* lms: Do not build on musl (Khem Raj, 2019-10-10; 1 file, -0/+2)
  It depends on ace, which is marked as incompatible with musl as well.
  Signed-off-by: Khem Raj <raj.khem@gmail.com>
  Signed-off-by: Anuj Mittal <anuj.mittal@intel.com>
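
  A minimal sketch of the usual way a recipe is excluded on musl-based
  targets; the accompanying comment is an assumption about the actual
  two-line change:

      # ace, which lms depends on, is marked incompatible with musl, so skip
      # lms on musl as well.
      COMPATIBLE_HOST_libc-musl = "null"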

* dldt-inference-engine: add PACKAGECONFIG for python API (Chin Huat Ang, 2019-10-10; 1 file, -5/+11)
  Add PACKAGECONFIG[python3] for building the dldt-inference-engine-python3
  package, which contains the inference engine python API.
  Also tweak the recipe to inherit python3native instead of relying on host
  python, as building the python API requires python3-cython, which might not
  be available on the host.
  Signed-off-by: Chin Huat Ang <chin.huat.ang@intel.com>
  Signed-off-by: Anuj Mittal <anuj.mittal@intel.com>
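
  A hedged sketch of the sort of PACKAGECONFIG and packaging this describes;
  the CMake option and the exact dependency list are assumptions:

      inherit python3native

      # Optional Python API for the inference engine.
      PACKAGECONFIG[python3] = "-DENABLE_PYTHON=ON,-DENABLE_PYTHON=OFF,python3-cython-native,python3"

      PACKAGES =+ "${PN}-python3"
      FILES_${PN}-python3 = "${PYTHON_SITEPACKAGES_DIR}"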

* dldt-inference-engine: fix clDNN install directory (Chin Huat Ang, 2019-09-30; 2 files, -0/+28)
  Install clDNN to /usr/lib to resolve the following inference engine error
  when running with GPU plugin:
    [ ERROR ] Failed to create plugin libclDNNPlugin.so for device GPU
    Please, check your environment
    Cannot load library 'libclDNNPlugin.so': libclDNNPlugin.so: cannot open shared object file: No such file or directory
    /usr/src/debug/dldt-inference-engine/2019r2-r0/git/inference-engine/include/details/os/lin_shared_object_loader.h:36
    /usr/src/debug/dldt-inference-engine/2019r2-r0/git/inference-engine/src/inference_engine/ie_core.cpp:277
  Signed-off-by: Chin Huat Ang <chin.huat.ang@intel.com>
  Signed-off-by: Anuj Mittal <anuj.mittal@intel.com>
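
  A sketch of one way to express the relocation in a recipe; the actual fix
  may differ, and the source path below is hypothetical:

      do_install_append() {
          # libclDNNPlugin.so must live in ${libdir} so the plugin loader's
          # dlopen("libclDNNPlugin.so") can find it at runtime.
          if [ -f ${D}${datadir}/clDNN/libclDNNPlugin.so ]; then
              install -d ${D}${libdir}
              mv ${D}${datadir}/clDNN/libclDNNPlugin.so ${D}${libdir}/
          fi
      }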

* dldt-inference-engine: upgrade 2019r1.1 -> 2019r2 (Anuj Mittal, 2019-09-30; 14 files, -404/+751)
  * Release notes: https://software.intel.com/en-us/articles/OpenVINO-RelNotes
  * Enable unit tests to be built and tested using the ptest mechanism.
  * Include patches from Clear Linux for build fixes.
  * Switch to python3, and switch threading to TBB. Switch ENABLE_OPENCV to
    off so the system opencv is used.
  * Remove do_install and patch the Makefiles instead to install libraries
    correctly.
  Signed-off-by: Anuj Mittal <anuj.mittal@intel.com>
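
  A sketch of the configuration the notes above imply; treat the exact option
  spellings and dependency names as assumptions:

      inherit ptest
      DEPENDS += "opencv tbb"

      # Build against the system OpenCV and use TBB-based threading.
      EXTRA_OECMAKE += "-DENABLE_OPENCV=OFF -DTHREADING=TBB"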

* dldt-inference-engine: add recipe (Chin Huat Ang, 2019-09-28; 8 files, -0/+498)
  This recipe builds the inference engine from the opencv/dldt 2019 R1.1
  release.
  OpenVINO™ toolkit, short for Open Visual Inference and Neural network
  Optimization toolkit, provides developers with improved neural network
  performance on a variety of Intel® processors and helps further unlock
  cost-effective, real-time vision applications. The toolkit enables deep
  learning inference and easy heterogeneous execution across multiple Intel®
  platforms (CPU, Intel® Processor Graphics), providing implementations
  across cloud architectures to edge devices.
  For more details, see: https://01.org/openvinotoolkit
  The recipe needs components from meta-oe, so move it to
  dynamic-layers/openembedded-layer. GPU plugin support needs
  intel-compute-runtime, which can be built by including the clang layer in
  the mix as well.
  CPU and GPU plugins have been sanity-tested to work using
  classification_sample. Further fine-tuning is still needed to improve the
  performance.
  Original patch by Anuj Mittal.
  Signed-off-by: Chin Huat Ang <chin.huat.ang@intel.com>
  Signed-off-by: Anuj Mittal <anuj.mittal@intel.com>
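
  For reference, dynamic layers are typically enabled through BBFILES_DYNAMIC
  in the layer's conf/layer.conf, keyed on the meta-oe collection name; this
  is a sketch of that mechanism, not necessarily the exact pattern meta-intel
  uses:

      BBFILES_DYNAMIC += " \
          openembedded-layer:${LAYERDIR}/dynamic-layers/openembedded-layer/*/*/*.bb \
      "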

* lms: upgrade 1921.0.0.0 -> 1932.0.0.0 (Alexander Usyskin, 2019-08-07; 2 files, -11/+6)
  Drop the library packaging workarounds; they are not needed with the new
  sources.
  Signed-off-by: Alexander Usyskin <alexander.usyskin@intel.com>
  Signed-off-by: Anuj Mittal <anuj.mittal@intel.com>

* lms: add recipe for lms 1921.0.0.0 (Alexander Usyskin, 2019-06-19; 2 files, -0/+70)
  This is a new release of Local Manageability Service. This open-source
  release deprecates unsupported lms7 and lms8.
  This recipe depends on ACE and MeTee library recipes.
  Signed-off-by: Alexander Usyskin <alexander.usyskin@intel.com>
  Signed-off-by: Anuj Mittal <anuj.mittal@intel.com>
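
  A minimal sketch of the dependency declaration described above; the recipe
  names "ace" and "metee" are assumptions:

      # Local Manageability Service builds against the ACE framework and the
      # Intel MeTee library.
      DEPENDS = "ace metee"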