Introduction
We have built a Yocto Linux environment on an
ADLINK I-Pi SMARC 1200 with an octa-core (4x Arm® Cortex-A78 + 4x Cortex-A55) MediaTek® Genio 1200.
To build the environment, we used the AIoT tools provided by MediaTek to flash and control the board.
We have summarized the procedure below and hope you will try it.
Main products used in this project:
- I-Pi SMARC 1200, a MediaTek® Genio 1200 platform-based I-Pi SMARC development kit (I-Pi SMARC Plus carrier, ADLINK SMARC LEC-MTK-I1200 module, 4 GB LPDDR4X, 64 GB UFS storage)
Other items to be prepared for this project:
- Linux Host PC (used to write Yocto image to SMARC UFS storage)
- HDMI cable and monitor
- USB mouse and keyboard
- LAN cable (used to connect to the Internet)
- Webcam (used for AI demo)
Reference:
- Installing AIoT Tools on Linux (I-Pi SMARC 1200 wiki)
- Boot From UFS (I-Pi SMARC 1200 wiki)
- AIoT tools manual
The main steps are as follows:
- Setup of Linux Host PC writing environment (AIoT tools)
- Hardware setup
- Writing Yocto image from Host PC to SMARC development kit and booting Yocto
- (Extra) Object Detection demo using a TensorFlow Lite quantized model
1. Setting up a Linux Host PC for writing (AIoT Tools)
STEP 1-1 : Install Git and Pip
Install Ubuntu 22.04 on the Host PC and execute the following commands to install Git and check the Python version.
sudo apt update
sudo apt install git
python3 -V
Install the latest pip for Python 3.10.
sudo apt install python3-pip
pip --version
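Since pip here is tied to Python 3.10, it can be worth confirming the interpreter version programmatically rather than by eye. A minimal sketch that checks a version string in the form `python3 -V` reports it (the `parse_version` helper is illustrative, not part of any tool used in this article):

```python
# Sketch: check that a "python3 -V"-style version string is at least 3.10.
# parse_version/is_supported are illustrative helpers, not part of any tool here.
def parse_version(text):
    # "Python 3.10.12" -> (3, 10, 12)
    return tuple(int(p) for p in text.split()[-1].split("."))

def is_supported(text, minimum=(3, 10)):
    # Tuple comparison handles 3.8 < 3.10 correctly, unlike string comparison.
    return parse_version(text)[:2] >= minimum

print(is_supported("Python 3.10.12"))  # True
print(is_supported("Python 3.8.10"))   # False
```

Tuple comparison is used deliberately: comparing the raw strings would rank "3.8" above "3.10".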
STEP 1-2 : Install fastboot
The AIoT tool uses fastboot to write images.
sudo apt update
sudo apt install android-tools-adb android-tools-fastboot
STEP 1-3 : USB device rules
Add new udev rules and add your user account to the plugdev group.
echo -n 'SUBSYSTEM=="usb", ATTR{idVendor}=="0e8d", ATTR{idProduct}=="201c", MODE="0660", TAG+="uaccess"
SUBSYSTEM=="usb", ATTR{idVendor}=="0e8d", ATTR{idProduct}=="0003", MODE="0660", TAG+="uaccess"
SUBSYSTEM=="usb", ATTR{idVendor}=="0403", MODE="0660", TAG+="uaccess"
SUBSYSTEM=="gpio", MODE="0660", TAG+="uaccess"
' | sudo tee /etc/udev/rules.d/72-aiot.rules
sudo udevadm control --reload-rules
sudo udevadm trigger
sudo usermod -a -G plugdev $USER
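As a hedged aside on what these rules match: 0e8d is MediaTek's USB vendor ID (201c and 0003 are assumed here to be the board's download-mode interfaces) and 0403 is the FTDI vendor ID used by common USB-serial adapters. A small Python sketch that regenerates the same rules text from that table, purely to make the structure explicit (the table and helper are illustrative, not part of the AIoT tools):

```python
# Sketch: generate the udev rules text from a small device table.
# The ID meanings in the comments are assumptions, not from the AIoT docs.
RULES = [
    # (subsystem, vendor_id, product_id)
    ("usb", "0e8d", "201c"),  # MediaTek vendor ID; download interface (assumed)
    ("usb", "0e8d", "0003"),
    ("usb", "0403", None),    # FTDI USB-serial adapters (any product)
    ("gpio", None, None),
]

def make_rules(rules):
    lines = []
    for subsystem, vendor, product in rules:
        parts = [f'SUBSYSTEM=="{subsystem}"']
        if vendor:
            parts.append(f'ATTR{{idVendor}}=="{vendor}"')
        if product:
            parts.append(f'ATTR{{idProduct}}=="{product}"')
        parts += ['MODE="0660"', 'TAG+="uaccess"']
        lines.append(", ".join(parts))
    return "\n".join(lines) + "\n"

print(make_rules(RULES))
```

Every rule carries MODE="0660" and TAG+="uaccess" so that a logged-in desktop user can access the devices without root.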
STEP 1-4 : Install AIoT tools
Run the following commands to install AIoT tools on your Linux Host PC.
pip3 install -U -e "git+https://gitlab.com/mediatek/aiot/bsp/aiot-tools.git#egg=aiot-tools"
export PATH="$HOME/.local/bin:$PATH"
sudo adduser $USER dialout
Reboot the Host PC and type the following command in the terminal to check the AIoT tools:
aiot-config
If the tool prints its configuration without errors, the environment setup on the Host PC is complete.
2. Hardware Setup
Make the following connections with the I-Pi SMARC Development Kit.
- Connect a monitor with the HDMI cable
- Insert a LAN cable (used to connect to the Internet)
- USB mouse and keyboard (in this case, we used a USB hub)
- Webcam
- USB type-A to micro-USB cable (included in the kit)
Connect the USB type-A end to the Linux Host PC and the micro-USB end to the I-Pi SMARC Development Kit.
3. Writing the Yocto image to the SMARC development kit from the Host PC and booting Yocto
Create a Yocto directory under your home directory.
mkdir Yocto
I-Pi SMARC 1200 Download (ipi.wiki)
Access the above link, right-click the Yocto pre-built image (Yocto LEC-MTK-1200), select "Save Link As", and save it in the directory you just created.
(The file is 3.5 GB, so the download will take some time.)
Once the download has finished, move to the directory where you saved the file.
cd Yocto
Unzip the downloaded file using the unzip command.
unzip LEC-1200-IPi-SMARC-PLUS_Yocto_kirkstone_V2_R8_2023_08_02.zip
Move to the unzipped directory.
cd LEC-1200-IPi-SMARC-PLUS_Yocto_kirkstone_V2_R8_2023_08_02/
Connect the Host PC and the SMARC Development Kit with the USB cable, connect the SMARC Development Kit power supply, and press the reset button on the Development Kit.
Reference: reset button location: https://www.ipi.wiki/pages/1200-docs?page=CarrierIntroduction.html
To write the image to the SMARC's UFS storage, execute the following command:
aiot-flash
After running the command, the tool waits without writing; pressing the power button on the development kit starts the write.
Reference: Location of the power button https://www.ipi.wiki/pages/1200-docs?page=CarrierIntroduction.html
When the write completes successfully, a completion message is output and the SMARC Development Kit restarts automatically.
If the "terminal" icon appears in the upper-left corner of the monitor connected to the SMARC Development Kit via HDMI, the boot was successful.
4. (Extra) Object Detection demo using a TensorFlow Lite quantized model
STEP 4-1 : Download the TensorFlow Lite runtime samples from the GitHub repository
After booting Yocto Linux with SMARC development kit, click the terminal icon in the upper left corner to start it.
Create a py_work directory under your home directory.
mkdir py_work
Move to the created directory.
cd py_work
Use the wget command to download a set of Python examples from the following GitHub repository:
https://github.com/PINTO0309/TensorflowLite-bin
wget https://github.com/PINTO0309/TensorflowLite-bin/archive/refs/heads/main.zip
When the download completes successfully, you will have a file named main.zip.
STEP 4-2: Edit the Python source code
Unzip the downloaded file using the unzip command.
unzip main.zip
Move to the generated TensorflowLite-bin-main directory with the cd command.
cd TensorflowLite-bin-main
Execute the ls command and you will see that there are three Python files.
We will use the file mobilenetv2ssd-async-usbcam.py.
Open mobilenetv2ssd-async-usbcam.py in a text editor.
(vi is used below)
vi mobilenetv2ssd-async-usbcam.py
- Change the webcam number and FPS in the main function
# parser.add_argument("--usbcamno", type=int, default=0, help="USB Camera number.")
parser.add_argument("--usbcamno", type=int, default=1, help="USB Camera number.")
# vidfps = 60
vidfps = 30
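The effect of the default change can be checked in isolation: with default=1, running the script without arguments selects camera index 1 instead of 0. A standalone sketch of the same argparse pattern (this is not the demo script itself, just the argument handling it uses):

```python
import argparse

# Sketch of the demo's argument handling after the edit above:
# the webcam index defaults to 1 instead of 0.
parser = argparse.ArgumentParser()
parser.add_argument("--usbcamno", type=int, default=1, help="USB Camera number.")

args = parser.parse_args([])  # no CLI arguments: the default applies
print(args.usbcamno)          # 1
args = parser.parse_args(["--usbcamno", "0"])  # explicit override
print(args.usbcamno)          # 0
```

Editing the default is convenient here, but passing --usbcamno 1 on the command line would achieve the same result without touching the source.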
STEP 4-3 : Running the Object Detection Python Demo
Run the Python demo with the following command:
python3 mobilenetv2ssd-async-usbcam.py
You can see that objects are detected. Inference performance reaches up to 30 FPS, which is the upper limit of the webcam used this time.
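To see why 30 FPS is the ceiling: a 30 FPS webcam delivers a new frame only every ~33 ms, so even if the quantized model infers faster than that, the displayed rate cannot exceed the capture rate. The frame-time budget is simple arithmetic:

```python
# The demo cannot display faster than the webcam delivers frames,
# regardless of inference speed; this is the per-frame time budget.
def frame_budget_ms(fps):
    return 1000.0 / fps

for fps in (30, 60):
    print(f"{fps} FPS -> {frame_budget_ms(fps):.1f} ms per frame")
# 30 FPS -> 33.3 ms per frame
# 60 FPS -> 16.7 ms per frame
```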
If you want to stop the demo, press Ctrl + C in the terminal window.
Conclusion
This time we used the Yocto Linux image provided by ADLINK, but you can also build and use your own image. Please try it.
We also modified mobilenetv2ssd-async-usbcam.py to run the webcam demo in Python, but GStreamer can also be used.