Now that you have your environment set up correctly, you can build the ONNX Runtime inference engine.
ONNX Runtime is an open-source inference engine designed to accelerate the deployment of machine learning models, particularly those in the Open Neural Network Exchange (ONNX) format. It is optimized for high performance and low latency, making it a popular choice for production deployment of AI models. You can learn more by reading the ONNX Runtime Overview.
Open a Windows PowerShell session and check out the source tree:
cd C:\Users\$env:USERNAME
git clone --recursive https://github.com/Microsoft/onnxruntime.git
cd onnxruntime
git checkout 9b37b3ea4467b3aab9110e0d259d0cf27478697d
You might be able to use a later commit. These steps have been tested with commit 9b37b3ea4467b3aab9110e0d259d0cf27478697d.
Use the Ninja generator to build for Android on Windows. First, set JAVA_HOME to the path of your JDK installation. You can point it at the JDK bundled with Android Studio or at a standalone JDK install.
$env:JAVA_HOME="C:\Program Files\Android\Android Studio\jbr"
Now run the following command:
./build.bat --config Release --build_shared_lib --android --android_sdk_path C:\Users\$env:USERNAME\AppData\Local\Android\Sdk --android_ndk_path C:\Users\$env:USERNAME\AppData\Local\Android\Sdk\ndk\27.0.12077973 --android_abi arm64-v8a --android_api 27 --cmake_generator Ninja --build_java
Because the command above includes --build_java, it also generates an Android Archive (AAR) file, which can be imported directly into Android Studio.
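As a sketch of how the generated AAR might be consumed later: in an Android Studio project, one common approach is to copy the file into a module's `libs` directory and declare a local file dependency. The module layout and paths below are illustrative assumptions, not part of the ONNX Runtime build output:

```groovy
// app/build.gradle (Groovy DSL) -- illustrative; assumes you copied
// onnxruntime-release.aar into the app module's libs/ directory
dependencies {
    implementation files('libs/onnxruntime-release.aar')
}
```

Alternatively, you can use Android Studio's module import tooling; either way, syncing the project makes the ONNX Runtime Java API available to your app code.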
When the build is complete, confirm that the shared library and the AAR file have been created:
ls build\Windows\Release\libonnxruntime.so
ls build\Windows\Release\java\build\android\outputs\aar\onnxruntime-release.aar