Bringing Model Intelligence to the Edge Using the Intel OpenVINO Toolkit and TensorFlow Lite (Part 1)

Road to Edge AI

In this series of blog posts, I am going to show how we can bring model intelligence from the CPU down to small Raspberry Pi devices using the Intel OpenVINO toolkit. You can observe the power of this toolkit on the CPU as well. This tutorial is meant to help you learn more about OpenVINO and TensorFlow Lite. For simplicity, I have used the hello-world dataset of deep learning (Fashion MNIST) to build intuition for the complete workflow, so you can replicate the same logic with your own model. I will also show you how to convert the model to TensorFlow Lite and compare it with the OpenVINO model in terms of performance and frames per second.

Training a model on a CPU or GPU is now quite convenient, although it still takes time and effort to benchmark the model and achieve good results. But have you thought about how you are going to extract that learned knowledge to solve real-world problems? Is training the model by itself enough? No, it is not. Most of the time, you need to deploy it on an edge device so that the device itself carries your model’s intelligence. Moving intelligence to small devices is challenging and still being explored. The common issues we face are:

  1. Latency: Delays are hazardous in many critical applications, such as autonomous vehicles. It takes time to complete a round trip when large volumes of data flow over limited bandwidth.
  2. Reliability: Depending on the internet and other communication networks makes the whole system fragile.
  3. Privacy: Handling private information far away from the data source is a major concern that every large company is facing.
  4. Cost: CPUs and GPUs generally consume more power and are much more expensive than edge devices.

If we have computing power near the data sources, we can solve most of these issues to a great extent. Let’s see what the end-edge-cloud architecture looks like:

Fig 1.1: End-Edge-Cloud Architecture

In the data-processing world, most problems can be mapped onto this structure. Take a moment to think about where your own application fits.

End: Place where data is generated

Edge: Computing power near data sources to process data in order to make critical decisions and/or preprocess that data to send to the cloud for other non-critical tasks.

Cloud Services: We all know this well.

What is the Intel OpenVINO Toolkit?

Fig 1.2: Intel OpenVino Toolkit Workflow

According to Intel, “The OpenVINO™ toolkit is a comprehensive toolkit for quickly developing applications and solutions that emulate human vision. Based on Convolutional Neural Networks (CNNs), the toolkit extends CV workloads across Intel® hardware, maximizing performance.” Let’s simplify that into three steps:

  1. Train and benchmark your model in a supported framework such as TensorFlow, PyTorch, or MXNet.
  2. Use the Model Optimizer of the OpenVINO Toolkit to convert it into an Intermediate Representation (IR).
  3. Use the platform-specific, highly optimized Inference Engine to run inference on images (a minimal sketch of these steps follows below).
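To make this concrete, here is a minimal sketch of the workflow, assuming a Keras Fashion MNIST classifier saved as a TensorFlow SavedModel. The file paths, the input shape, and the exact Model Optimizer invocation are illustrative, and the Inference Engine API shown (IECore) is the legacy Python API, so details may differ slightly between OpenVINO releases.

```python
import numpy as np
from openvino.inference_engine import IECore  # legacy Inference Engine Python API

# Step 1: train and save your model in TensorFlow, e.g.
#   model.save("fashion_mnist_saved_model")
#
# Step 2: convert it to Intermediate Representation (IR) with the Model Optimizer,
# roughly like this (the script name and flags vary a bit between releases):
#   mo --saved_model_dir fashion_mnist_saved_model --data_type FP16 --output_dir ir_model
# This produces an .xml file (topology) and a .bin file (weights).

# Step 3: load the IR and run inference with the Inference Engine
ie = IECore()
net = ie.read_network(model="ir_model/saved_model.xml",
                      weights="ir_model/saved_model.bin")
exec_net = ie.load_network(network=net, device_name="CPU")  # e.g. "MYRIAD" for an NCS2 on a Raspberry Pi

input_blob = next(iter(net.input_info))
output_blob = next(iter(net.outputs))

# Build a dummy input that matches whatever layout the IR expects (often NCHW)
input_shape = net.input_info[input_blob].input_data.shape
image = np.random.rand(*input_shape).astype(np.float32)

result = exec_net.infer(inputs={input_blob: image})
print("Predicted class:", np.argmax(result[output_blob]))
```

The same inference code runs on a laptop CPU and on a Raspberry Pi with a Neural Compute Stick; in practice only the `device_name` changes, which is what makes the IR format convenient for edge deployment.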

What is TensorFlow Lite?

TensorFlow Lite is an open-source deep learning framework for on-device inference.
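As a quick preview of what the later posts will cover, here is a minimal sketch of converting a trained Keras model to TensorFlow Lite and running it with the TFLite interpreter; the model path and file names are illustrative placeholders.

```python
import numpy as np
import tensorflow as tf

# Convert a trained Keras model (e.g. the Fashion MNIST classifier) to TensorFlow Lite
model = tf.keras.models.load_model("fashion_mnist_saved_model")  # illustrative path
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # optional: default weight quantization
tflite_model = converter.convert()

with open("fashion_mnist.tflite", "wb") as f:
    f.write(tflite_model)

# Run on-device-style inference with the TFLite interpreter
interpreter = tf.lite.Interpreter(model_path="fashion_mnist.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy input matching the shape and dtype the converted model expects
dummy = np.random.rand(*input_details[0]["shape"]).astype(input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()
print("Predicted class:", np.argmax(interpreter.get_tensor(output_details[0]["index"])))
```

On a Raspberry Pi you would typically install only the lightweight tflite-runtime package instead of full TensorFlow, but the interpreter calls stay the same.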

These pieces will fall into place when I walk through full examples in the upcoming posts. Many more things are on the way!

Stay Safe. Stay hungry.

Please give it a clap and feel free to connect with me: My LinkedIn

References:

  1. OpenVINO Toolkit
  2. Convergence of Edge Computing and Deep Learning: A Comprehensive Survey
  3. TensorFlow Lite

Open to Edge AI opportunities.