
Question about i.MX 8M Plus EVB

Question asked by Ryan Huang on Aug 26, 2020
Latest reply on Aug 27, 2020 by Manish Bajaj



1. How do we run inference with our own models? Will the board only accept TensorFlow Lite, Arm NN, and OpenCV models? If so, should we convert our models to one of those formats? I would also like to know what specs or format an inference model must meet to be accepted by the NXP inference engine.
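One format-related aside that may help whoever answers: a converted `.tflite` model is a FlatBuffer that carries the 4-byte file identifier "TFL3" at byte offset 4, so a quick sanity check that a converted file really is in TensorFlow Lite format could look like the following (a sketch added for illustration, not part of NXP's eIQ tooling):

```python
def looks_like_tflite(data: bytes) -> bool:
    # A FlatBuffer stores its 4-byte file identifier at offset 4,
    # immediately after the root-table offset; TFLite uses "TFL3".
    return len(data) >= 8 and data[4:8] == b"TFL3"
```

For example, `looks_like_tflite(open("model.tflite", "rb").read(8))` should return `True` for any valid TensorFlow Lite model file.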

2. How do we know whether the latest PyeIQ version is running on the NPU or the CPU? When I check with the top command, it shows only CPU load, and it is above 150%.
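For context on this question: top only counts CPU time, so work actually offloaded to the NPU would not appear there at all, while a sustained reading above 150% suggests multi-threaded CPU execution. One coarse sanity check (a sketch, assuming a stock i.MX 8M Plus BSP where the galcore kernel module backs the NPU stack) is to confirm the driver is loaded at all:

```python
from pathlib import Path

def npu_driver_loaded(modules_file: str = "/proc/modules") -> bool:
    """Coarse check: is the galcore kernel module (which the i.MX 8M
    Plus NPU/OpenVX stack sits on) currently loaded?"""
    try:
        text = Path(modules_file).read_text()
    except OSError:
        return False
    # /proc/modules lists one module per line, name in the first column.
    return any(line.split(maxsplit=1)[0] == "galcore"
               for line in text.splitlines() if line.strip())
```

If this returns False on the board, inference has nothing to fall back on but the CPU, which would match the top output described above.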