path: root/dynamic-layers/openembedded-layer/recipes-support/opencv/files/0002-use-ade-and-pugixml-from-system.patch
Commit message    Author    Age    Files    Lines
* dldt-inference-engine: upgrade 2019r1.1 -> 2019r2    Anuj Mittal    2019-09-30    1    -32/+0
    * Release notes: https://software.intel.com/en-us/articles/OpenVINO-RelNotes
    * Enable unit tests to be built and tested using the ptest mechanism.
    * Include patches from Clear Linux for build fixes.
    * Switch to python3 and switch threading to TBB. Switch ENABLE_OPENCV to off so that the opencv from the system is used.
    * Remove do_install and patch the Makefiles instead so that libraries are installed correctly.

    Signed-off-by: Anuj Mittal <anuj.mittal@intel.com>
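The commit above is essentially a set of build-configuration changes to the dldt-inference-engine recipe. A minimal sketch of how they might look in BitBake form follows; the CMake option names and the inherit line are assumptions based on the commit message, not the actual contents of the upstream recipe:

    # Hypothetical excerpt from dldt-inference-engine_2019r2.bb.
    # Option names are assumed from the commit message above.
    inherit cmake ptest python3native

    EXTRA_OECMAKE += " \
        -DTHREADING=TBB \
        -DENABLE_OPENCV=OFF \
        -DENABLE_PYTHON=ON \
    "

With ENABLE_OPENCV off, the build links against the opencv provided by the distro rather than the copy bundled with dldt, and the ptest class packages the unit tests so they can be run on target.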
* dldt-inference-engine: add recipe    Chin Huat Ang    2019-09-28    1    -0/+32
    This recipe builds the inference engine from the opencv/dldt 2019 R1.1 release.

    OpenVINO™ toolkit, short for Open Visual Inference and Neural network Optimization toolkit, provides developers with improved neural network performance on a variety of Intel® processors and helps further unlock cost-effective, real-time vision applications. The toolkit enables deep learning inference and easy heterogeneous execution across multiple Intel® platforms (CPU, Intel® Processor Graphics), providing implementations across cloud architectures to edge devices. For more details, see: https://01.org/openvinotoolkit

    The recipe needs components from meta-oe, so move it to dynamic-layers/openembedded-layer. GPU plugin support needs intel-compute-runtime, which can be built by including the clang layer in the mix as well.

    The CPU and GPU plugins have been sanity-tested using classification_sample. Further fine-tuning is still needed to improve the performance.

    Original patch by Anuj Mittal.

    Signed-off-by: Chin Huat Ang <chin.huat.ang@intel.com>
    Signed-off-by: Anuj Mittal <anuj.mittal@intel.com>
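Going by the layer dependencies named in the commit above, a build that includes this recipe needs meta-oe and, for the GPU plugin, the clang layer alongside meta-intel. A minimal sketch of that configuration, assuming a standard Poky-style checkout (the layer paths and the image variable below are illustrative assumptions, not part of the commit):

    # conf/bblayers.conf (excerpt) -- layer paths are placeholders
    BBLAYERS += " \
        ${TOPDIR}/../meta-intel \
        ${TOPDIR}/../meta-openembedded/meta-oe \
        ${TOPDIR}/../meta-clang \
    "

    # conf/local.conf (excerpt) -- pulls the recipe output into the image
    IMAGE_INSTALL_append = " dldt-inference-engine"

With those layers present, the recipe under dynamic-layers/openembedded-layer becomes visible to BitBake, and intel-compute-runtime can be built from the clang layer to back the GPU plugin.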