author:    Chin Huat Ang <chin.huat.ang@intel.com>  2019-09-27 06:11:51 +0800
committer: Anuj Mittal <anuj.mittal@intel.com>      2019-09-28 17:18:30 +0800
commit:    096598691de246c23902d49d228c7562ba2c9cc5 (patch)
tree:      8df7c03aaf84bd38374ae049ba0f31031c20c6d0 /recipes-multimedia
parent:    7aef51c962c023f27e3fcda4c2419f1ced9942b9 (diff)
download:  meta-intel-096598691de246c23902d49d228c7562ba2c9cc5.tar.gz
dldt-inference-engine: add recipe
This recipe builds the inference engine from the opencv/dldt 2019 R1.1
release.
OpenVINO™ toolkit, short for Open Visual Inference and Neural network
Optimization toolkit, provides developers with improved neural network
performance on a variety of Intel® processors and helps further unlock
cost-effective, real-time vision applications.
The toolkit enables deep learning inference and easy heterogeneous
execution across multiple Intel® platforms (CPU and Intel® Processor
Graphics), providing implementations from cloud architectures down to
edge devices.
For more details, see:
https://01.org/openvinotoolkit
The recipe needs components from meta-oe, so it is placed under
dynamic-layers/openembedded-layer. GPU plugin support needs
intel-compute-runtime, which can be built by also adding the clang
layer to the mix.
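As a sketch of how a build might wire these layers together (the
checkout paths below are placeholders, and the IMAGE_INSTALL line is an
assumption about how one would pull the recipe into an image, not text
from this patch), the conf additions could look like:

```
# conf/bblayers.conf -- hypothetical paths; adjust to your checkout
BBLAYERS += " \
    /path/to/meta-openembedded/meta-oe \
    /path/to/meta-clang \
    /path/to/meta-intel \
"

# conf/local.conf -- install the inference engine into the image
IMAGE_INSTALL_append = " dldt-inference-engine"
```

The meta-clang entry is only needed when building the GPU plugin's
intel-compute-runtime dependency.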
The CPU and GPU plugins have been sanity tested using
classification_sample. Further fine-tuning is still needed to improve
performance.
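On the target, that sanity test might look roughly like the following
(the model and image paths are illustrative assumptions;
classification_sample is one of the samples shipped with the toolkit):

```
# Hypothetical invocation on the target; paths are assumptions.
# -d CPU selects the CPU plugin; -d GPU exercises the GPU plugin.
./classification_sample -i car.png -m squeezenet1.1.xml -d CPU
./classification_sample -i car.png -m squeezenet1.1.xml -d GPU
```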
Original patch by Anuj Mittal.
Signed-off-by: Chin Huat Ang <chin.huat.ang@intel.com>
Signed-off-by: Anuj Mittal <anuj.mittal@intel.com>
Diffstat (limited to 'recipes-multimedia')
0 files changed, 0 insertions, 0 deletions