Introduction
This article takes YOLOv3, a public model, and shows how to convert it to the Intermediate Representation (IR) format (.bin / .xml) used by the OpenVINO™ toolkit. The YOLOv3 public model we will use is available at the following URL.
This document targets the Windows version of the OpenVINO™ toolkit 2020.2, the latest version at the time of writing.
The main steps are as follows:
- Download the YOLOv3 TensorFlow model from the GitHub repository
- Convert it to a YOLOv3 frozen model (.pb)
- Convert the YOLOv3 frozen model to IR
1. Download the YOLOv3 TensorFlow model from the GitHub repository
First, we will show how to download the YOLOv3 TensorFlow model from the GitHub repository (https://github.com/mystic123/tensorflow-yolo-v3).
1.1 Clone the YOLOv3 repository on GitHub
git clone https://github.com/mystic123/tensorflow-yolo-v3.git
When the clone is complete, a directory like the one shown below will be created. (Here, we proceed on the assumption that you cloned it under C:\home.)
1.2 Navigate to the cloned directory (C:\home\tensorflow-yolo-v3)
cd tensorflow-yolo-v3
1.3 Download coco.names from the DarkNet site and save it to C:\home\tensorflow-yolo-v3
Right-click the link above and choose "Save Link As" to download the file.
- Reference URL:
Save the downloaded file in the same directory (C:\home\tensorflow-yolo-v3).
1.4 Edit coco.names slightly: open coco.names in a text editor, delete line 81 (the last line, which contains only spaces), and save the file. A command-line alternative is sketched after the before/after comparison below.
- Before editing
- After editing
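If you prefer to make this edit from the command line, here is a minimal sketch that strips any trailing blank lines from coco.names. It assumes the file is in the current directory (C:\home\tensorflow-yolo-v3); the script name is just an example.
# strip_blank_line.py (example name): remove trailing whitespace-only lines from coco.names
with open("coco.names", "r", encoding="utf-8") as f:
    lines = f.read().splitlines()

# Drop trailing lines that contain only spaces (e.g. the former line 81)
while lines and not lines[-1].strip():
    lines.pop()

with open("coco.names", "w", encoding="utf-8", newline="\n") as f:
    f.write("\n".join(lines) + "\n")
The result is the same as the manual edit: 80 class names with no blank line at the end.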
1.5 Download yolov3.weights from the following URL and save it to C:\home\tensorflow-yolo-v3
The download is now complete.
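As an optional sanity check before converting, the short sketch below confirms that coco.names and yolov3.weights are both present in the working directory and prints their sizes (it makes no assumptions about the exact sizes).
# check_inputs.py (example name): confirm the two input files are in place
import os

work_dir = r"C:\home\tensorflow-yolo-v3"
for name in ("coco.names", "yolov3.weights"):
    path = os.path.join(work_dir, name)
    if os.path.isfile(path):
        print("{}: {:,} bytes".format(name, os.path.getsize(path)))
    else:
        print("{}: NOT FOUND - download it before continuing".format(name))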
2. Convert to a YOLOv3 frozen model (.pb)
Run the converter script (a Python program) stored in the cloned YOLOv3 directory.
python convert_weights_pb.py --class_names coco.names --data_format NHWC --weights_file yolov3.weights
When the conversion completes, the following message is displayed.
You can also confirm that the output file, frozen_darknet_yolov3_model.pb, has been generated.
NOTE: If you see the error "ModuleNotFoundError: No module named 'PIL'" during the conversion, install the Python image processing library Pillow (pip install pillow).
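Before moving on to the Model Optimizer, you can optionally check that the frozen graph parses correctly. This is a minimal sketch that assumes TensorFlow 1.x (the version the converter repository targets) and that the .pb file is in the current directory.
# check_frozen_pb.py (example name): confirm the frozen graph can be parsed (TensorFlow 1.x API)
import tensorflow as tf

with tf.gfile.GFile("frozen_darknet_yolov3_model.pb", "rb") as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())

print("Parsed frozen graph with {} nodes".format(len(graph_def.node)))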
3. Convert the YOLOv3 frozen model to IR
From here, we use the Model Optimizer, a Python-based tool included in the OpenVINO™ toolkit, to perform the conversion.
In addition to frozen_darknet_yolov3_model.pb, a configuration file, yolo_v3.json, is required as an input for this conversion. yolo_v3.json is stored in the following directory:
- <OPENVINO_INSTALL_DIR>\deployment_tools\model_optimizer\extensions\front\tf
Run one of the following commands, depending on the data type you need.
- Data type FP32
python C:\IntelSWTools\openvino\deployment_tools\model_optimizer\mo_tf.py ^
--input_model C:\home\tensorflow-yolo-v3\frozen_darknet_yolov3_model.pb ^
--transformations_config C:\IntelSWTools\openvino\deployment_tools\model_optimizer\extensions\front\tf\yolo_v3.json ^
--input_shape [1,416,416,3] ^
--data_type=FP32 ^
--model_name yolov3 ^
--output_dir C:\home\tensorflow-yolo-v3\FP32
When the conversion completes, the following message is displayed.
You can also confirm that the IR files have been generated in the FP32 output directory.
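The IR consists of a .xml file (network topology) and a .bin file (weights); the Model Optimizer typically also writes a .mapping file. The following is a minimal sketch to list what was written to the FP32 output directory.
# list_ir_files.py (example name): list the files generated in the FP32 output directory
import os

out_dir = r"C:\home\tensorflow-yolo-v3\FP32"
for name in sorted(os.listdir(out_dir)):
    size = os.path.getsize(os.path.join(out_dir, name))
    print("{:<20} {:,} bytes".format(name, size))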
- Data type FP16
python C:\IntelSWTools\openvino\deployment_tools\model_optimizer\mo_tf.py ^
--input_model C:\home\tensorflow-yolo-v3\frozen_darknet_yolov3_model.pb ^
--transformations_config C:\IntelSWTools\openvino\deployment_tools\model_optimizer\extensions\front\tf\yolo_v3.json ^
--input_shape [1,416,416,3] ^
--data_type=FP16 ^
--model_name yolov3 ^
--output_dir C:\home\tensorflow-yolo-v3\FP16
When the conversion completes, the following message is displayed.
You can also confirm that the IR files have been generated in the FP16 output directory.
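To double-check that the generated IR actually loads, you can read it back with the Inference Engine Python API. The sketch below assumes you have run setupvars.bat so that the openvino Python module is importable; IECore.read_network is available in the 2020.x releases.
# load_ir.py (example name): verify the converted IR loads with the Inference Engine
from openvino.inference_engine import IECore

ie = IECore()
net = ie.read_network(
    model=r"C:\home\tensorflow-yolo-v3\FP16\yolov3.xml",
    weights=r"C:\home\tensorflow-yolo-v3\FP16\yolov3.bin")  # point at the FP32 files to check the FP32 IR

print("IR loaded; output layers:", list(net.outputs.keys()))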
4. Summary
As you can see, the OpenVINO™ toolkit makes it easy to convert a publicly available model into its dedicated IR format. In addition to public models like this one, Intel® offers a variety of pre-trained models. These are available for commercial use, making it easy to build your own applications by combining the models you need.
The OpenVINO™ toolkit can also be tried on your own laptop, so we invite you to give it a try and develop your own AI system.
Reference Links