Optimization Best Practices to Maximize App Performance with the OpenVINO™ toolkit

The Intel® Distribution of OpenVINO™ toolkit enables high-performance deep learning deployments.

In this session, software specialists Zoe Cayetano, Anna Belova, and Dmitry Temnov discuss optimization best practices to maximize your deep learning metrics, including throughput, accuracy, and latency.

You will:

  • Get an understanding of the different optimization metrics that are important in deep learning
  • Learn about the available tools and techniques to optimize your deep learning applications within Intel® Distribution of OpenVINO™ toolkit
  • See a demo of how to use the Deep Learning Workbench (DL Workbench) to visually fine-tune your models
  • Explore benchmarking resources to test on your own applications (a minimal latency/throughput sketch follows this list)
  • And more
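
The session walks through these tools live; as a rough illustration of the latency and throughput metrics listed above, the sketch below uses the OpenVINO Python API (openvino.runtime, available in release 2022.1 and later). The model path, device name, and input shape are placeholders rather than anything taken from the session.

    # Minimal latency/throughput sketch, assuming an IR model produced by
    # Model Optimizer ("model.xml"/"model.bin") with one 1x3x224x224 input.
    import time
    import numpy as np
    from openvino.runtime import Core

    core = Core()
    model = core.read_model("model.xml")         # placeholder IR path
    compiled = core.compile_model(model, "CPU")  # target device: CPU, GPU, ...

    dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)
    compiled([dummy_input])                      # warm-up run

    iterations = 100
    start = time.perf_counter()
    for _ in range(iterations):
        compiled([dummy_input])                  # synchronous inference
    elapsed = time.perf_counter() - start

    print(f"Average latency: {elapsed / iterations * 1000:.2f} ms")
    print(f"Throughput:      {iterations / elapsed:.2f} FPS")

For more rigorous measurements, the toolkit's benchmark_app tool (for example, benchmark_app -m model.xml -d CPU) runs asynchronous requests across multiple streams, and DL Workbench exposes similar experiments through a visual interface.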

Zoe Cayetano, AI Product Manager, Intel Corporation

Passionate about democratizing access to technology and working on projects with outsized impact on the world, Zoe is a Product Manager for AI and IoT who works on a variety of interdisciplinary business and engineering problems. Prior to Intel, she was a data science researcher for a particle accelerator at Arizona State University, where she analyzed the electron beam dynamics of novel x-ray lasers used for crystallography, quantum materials research, and bioimaging. She holds Bachelor's degrees in Applied Physics and Business. Born in the Philippines, she is currently based in San Francisco, California.

Anna Belova, Deep Learning Software Engineer, Intel Corporation

Anna is a Deep Learning Software Engineer at Intel. Her role is to keep Intel customers happy with OpenVINO by providing technical support. Before this, she worked on other Intel software products, such as MediaSDK, and provided pre-sales support for Android-based mobile devices. She graduated with a Bachelor's in Business Informatics and a Master's in Applied Mathematics and Informatics from the Higher School of Economics, Nizhny Novgorod, Russia.

Dmitry Temnov, Technical Consulting Engineer, Intel Corporation

Dmitry Temnov is a Technical Consulting Engineer at Intel, enabling customers to accelerate computer vision and deep learning applications using Intel® Distribution of OpenVINO™ toolkit.

For more complete information about compiler optimizations, see our Optimization Notice.