Convert to QNN for Linux Host on CPU Backend

Note

This is Part 3 of the Convert to QNN tutorial for Linux host machines. If you have not completed Part 2, please do so before continuing.

Transferring over all relevant files

  1. On the target device, open a terminal and make a destination folder by running:

    mount -o remount,rw /
    mkdir -p /data/local/tmp
    cd /data/local/tmp
    ln -s /etc/ /data/local/tmp
    chmod -R 777 /data/local/tmp
    mkdir -p "/data/local/tmp/qnn_tutorial"
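The scp commands in the following steps rely on a few shell variables exported on the host. As a reminder, they look something like this (the values shown are placeholders, not real paths; use the values you set in Part 2 of this tutorial):

```shell
# Placeholder values -- substitute your own from Part 2.
export QNN_SDK_ROOT="/opt/qnn-sdk"                  # hypothetical SDK install path
export QNN_TARGET_ARCH="aarch64-oe-linux-gcc11.2"   # hypothetical target architecture
export TARGET_USER="root"                           # login user on the target device
export TARGET_IP="192.168.1.10"                     # IP address of the target device
export QNN_MODEL_PATH="libInception_v3.so"          # hypothetical built model library
```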
    
  2. On the host device, use scp to transfer libQnnCpu.so from your host machine to /data/local/tmp/qnn_tutorial on the target device.

    scp "${QNN_SDK_ROOT}/lib/${QNN_TARGET_ARCH}/libQnnCpu.so" "${TARGET_USER}@${TARGET_IP}:/data/local/tmp/qnn_tutorial"
    
  3. Use scp to transfer the built example model. The architecture folder in the model path (for example, x64) depends on your host machine’s architecture; update ${QNN_MODEL_PATH} accordingly.

    scp "${QNN_MODEL_PATH}" "${TARGET_USER}@${TARGET_IP}:/data/local/tmp/qnn_tutorial"
    
  4. Transfer the input data, input list, and script from the QNN SDK examples folder into /data/local/tmp/qnn_tutorial on the target device using scp in a similar way:

    scp -r "${QNN_SDK_ROOT}/examples/Models/InceptionV3/data/cropped"  "${TARGET_USER}@${TARGET_IP}:/data/local/tmp/qnn_tutorial"
    scp "${QNN_SDK_ROOT}/examples/Models/InceptionV3/data/target_raw_list.txt"  "${TARGET_USER}@${TARGET_IP}:/data/local/tmp/qnn_tutorial"
    scp "${QNN_SDK_ROOT}/examples/Models/InceptionV3/data/imagenet_slim_labels.txt"  "${TARGET_USER}@${TARGET_IP}:/data/local/tmp/qnn_tutorial"
    scp "${QNN_SDK_ROOT}/examples/Models/InceptionV3/scripts/show_inceptionv3_classifications.py"  "${TARGET_USER}@${TARGET_IP}:/data/local/tmp/qnn_tutorial"
    
  5. Transfer qnn-net-run from $QNN_SDK_ROOT/bin/$QNN_TARGET_ARCH/qnn-net-run to /data/local/tmp/qnn_tutorial on the target device:

    scp "$QNN_SDK_ROOT/bin/$QNN_TARGET_ARCH/qnn-net-run" "${TARGET_USER}@${TARGET_IP}:/data/local/tmp/qnn_tutorial"
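Steps 2–5 can also be consolidated into a single transfer script. The sketch below assumes the same shell variables as above (the defaults in the script are placeholders so it runs standalone) and, by default, only prints the scp commands; set DRY_RUN=0 to actually copy:

```shell
#!/bin/sh
# Sketch: transfer every tutorial file to the target in one loop.
# Defaults below are placeholders so the script is self-contained.
QNN_SDK_ROOT="${QNN_SDK_ROOT:-/opt/qnn-sdk}"
QNN_TARGET_ARCH="${QNN_TARGET_ARCH:-aarch64-oe-linux-gcc11.2}"
QNN_MODEL_PATH="${QNN_MODEL_PATH:-libInception_v3.so}"
TARGET_USER="${TARGET_USER:-root}"
TARGET_IP="${TARGET_IP:-192.168.1.10}"
DEST="/data/local/tmp/qnn_tutorial"
DRY_RUN="${DRY_RUN:-1}"   # 1 = only print the scp commands

# Everything the tutorial copies to the target, one path per line.
list_files() {
  cat <<EOF
${QNN_SDK_ROOT}/lib/${QNN_TARGET_ARCH}/libQnnCpu.so
${QNN_MODEL_PATH}
${QNN_SDK_ROOT}/examples/Models/InceptionV3/data/cropped
${QNN_SDK_ROOT}/examples/Models/InceptionV3/data/target_raw_list.txt
${QNN_SDK_ROOT}/examples/Models/InceptionV3/data/imagenet_slim_labels.txt
${QNN_SDK_ROOT}/examples/Models/InceptionV3/scripts/show_inceptionv3_classifications.py
${QNN_SDK_ROOT}/bin/${QNN_TARGET_ARCH}/qnn-net-run
EOF
}

list_files | while read -r f; do
  if [ "$DRY_RUN" -eq 1 ]; then
    echo scp -r "$f" "${TARGET_USER}@${TARGET_IP}:${DEST}"
  else
    scp -r "$f" "${TARGET_USER}@${TARGET_IP}:${DEST}"
  fi
done
```

Passing -r unconditionally is harmless for regular files and covers the cropped directory without special-casing it.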
    

Running inference on the target device processor

  1. Open a terminal instance on the target device. Alternatively, you can ssh into the target from your Linux host machine with the command below; these console variables were set in the “Transferring over all relevant files” instructions above.

    ssh "${TARGET_USER}@${TARGET_IP}"
    

    Note

    You will have to log in with your target device’s login for that username.

  2. Navigate to the directory containing the test files:

    cd /data/local/tmp/qnn_tutorial
    
  3. Run the following command on the target device to execute an inference:

    ./qnn-net-run \
       --model "./<model_name_here>.so" \
       --input_list "./target_raw_list.txt" \
       --backend "./libQnnCpu.so" \
       --output_dir "./output"
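qnn-net-run typically writes one Result_<n> folder per line of the input list under the --output_dir. A quick sanity check can compare the two counts; this is a sketch using a hypothetical helper name, and the Result_<n> naming should be adjusted if your SDK version differs:

```shell
#!/usr/bin/env bash
# Compare the number of Result_<n> folders against the number of inputs.
check_results() {
  local input_list="$1" output_dir="$2"
  local expected actual
  expected=$(grep -c . "$input_list")   # count non-empty input lines
  actual=$(find "$output_dir" -maxdepth 1 -type d -name 'Result_*' | wc -l | tr -d ' ')
  echo "inputs=${expected} results=${actual}"
  [ "$actual" -eq "$expected" ]
}

# Example: check_results ./target_raw_list.txt ./output
```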
    
  4. Run the following script on the target device to view the classification results:

    Note

    You can alternatively copy the output folder back to your host machine with scp and run the following script there to avoid having to install Python on your target device.

    python3 "./show_inceptionv3_classifications.py" \
        -i "./cropped/raw_list.txt" \
        -o "output" \
        -l "./imagenet_slim_labels.txt"
    
  5. Verify that the classification results in output match the following (input files under ${QNN_SDK_ROOT}/examples/Models/InceptionV3/data/cropped):

     File               Expected Output
     trash_bin.raw      0.777344 413 ashcan
     chairs.raw         0.253906 832 studio couch
     plastic_cup.raw    0.980469 648 measuring cup
     notice_sign.raw    0.167969 459 brass
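The verification above can be scripted by grepping the classification printout for the expected top-1 labels. This is a sketch with a hypothetical helper name, and it assumes the script's output contains the label strings shown in the table:

```shell
#!/usr/bin/env bash
# Check that every expected top-1 label appears in the saved printout.
check_labels() {
  local results_file="$1"
  local label
  for label in "ashcan" "studio couch" "measuring cup" "brass"; do
    # Fail fast on the first missing label.
    grep -q "$label" "$results_file" || { echo "missing: $label"; return 1; }
  done
  echo "all expected labels found"
}

# Example: python3 ./show_inceptionv3_classifications.py ... > results.txt
#          check_labels results.txt
```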