Now that you have your environment set up correctly, you can build the ONNX Runtime inference engine.
ONNX Runtime is an open-source inference engine designed to accelerate the deployment of machine learning models, particularly those in the Open Neural Network Exchange (ONNX) format. ONNX Runtime is optimized for high performance and low latency, making it popular for production deployment of AI models. You can learn more by reading the ONNX Runtime Overview.
Open a Windows PowerShell session and check out the source tree:
cd C:\Users\$env:USERNAME
git clone --recursive https://github.com/Microsoft/onnxruntime.git
cd onnxruntime
git checkout 5630b081cd25e4eccc7516a652ff956e51676794
You might be able to use a later commit. These steps have been tested with commit 5630b081cd25e4eccc7516a652ff956e51676794, which corresponds to ONNX Runtime 1.22.2.
Use the Ninja CMake generator to build for Android on Windows. First, set JAVA_HOME to the path of your JDK installation. ONNX Runtime compiles well with JDK 17; if you run into compilation issues, check your JDK version.
$env:JAVA_HOME="C:\Program Files\Microsoft\jdk-17.0.16.8-hotspot\"
Now run the following command:
./build.bat --config Release --build_shared_lib --android --android_sdk_path C:\Users\$env:USERNAME\AppData\Local\Android\Sdk --android_ndk_path C:\Users\$env:USERNAME\AppData\Local\Android\Sdk\ndk\27.3.13750724 --android_abi arm64-v8a --android_api 27 --cmake_generator Ninja --build_java
Because the command includes --build_java, it also generates an Android Archive (AAR) file, which can be imported directly into Android Studio.
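As a sketch of that import, assuming the AAR has been copied into an app module's libs folder, a Gradle Kotlin DSL dependency could look like the following (the module name and path here are illustrative, not produced by the build):

```kotlin
// app/build.gradle.kts (illustrative module and path)
dependencies {
    // Reference the locally built AAR file directly; alternatively, publish
    // it to a local Maven repository and depend on it by coordinates.
    implementation(files("libs/onnxruntime-release.aar"))
}
```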
When the build is complete, confirm that the shared library and the AAR file have been created:
ls build\Windows\Release\libonnxruntime.so
ls build\Windows\Release\java\build\android\outputs\aar\onnxruntime-release.aar
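If you want to go one step further and confirm that the AAR actually packages the arm64-v8a native library, note that an AAR is a ZIP archive, so it can be inspected with only the Java standard library. This is a hypothetical check, not part of the official build steps; the entry path below follows the conventional AAR layout, where native libraries sit under jni/&lt;abi&gt;:

```java
import java.io.File;
import java.util.zip.ZipFile;

public class CheckAar {
    public static void main(String[] args) throws Exception {
        // Path matches the build output location shown above; run this from
        // the root of the onnxruntime checkout.
        File aar = new File(
            "build/Windows/Release/java/build/android/outputs/aar/onnxruntime-release.aar");
        if (!aar.isFile()) {
            // Report rather than throw when the build tree is absent.
            System.out.println("AAR not found: " + aar);
            return;
        }
        try (ZipFile zip = new ZipFile(aar)) {
            // In the AAR layout, per-ABI shared libraries live under jni/.
            boolean hasLib = zip.stream().anyMatch(
                e -> e.getName().equals("jni/arm64-v8a/libonnxruntime.so"));
            System.out.println("arm64-v8a libonnxruntime.so present: " + hasLib);
        }
    }
}
```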