Set up your Python environment

Before building ExecuTorch, create an isolated Python environment. This is strongly recommended because it prevents dependency conflicts with your system Python installation and keeps all required build and runtime dependencies consistent:

sudo apt update
sudo apt install -y python3 python3.12-dev python3-venv build-essential cmake
python3 -m venv pyenv
source pyenv/bin/activate

Keep your Python virtual environment activated while you complete the next steps. This ensures all dependencies install in the correct location.
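
To confirm that the environment is active, check which Python interpreter and pip are in use; both paths should point inside the pyenv directory you just created:

which python
python --version
pip --version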

Download the ExecuTorch source code

Clone the ExecuTorch repository from GitHub. The following commands check out the stable v1.0.0 release and fetch all required submodules:

export WORKSPACE=$HOME
cd $WORKSPACE
git clone -b v1.0.0 --recurse-submodules https://github.com/pytorch/executorch.git

Note

The instructions in this Learning Path were tested on ExecuTorch v1.0.0. Commands or configuration options might differ in later releases.
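
Optionally, before building, you can confirm that the expected tag is checked out and that the submodules were fetched. These are standard Git checks rather than ExecuTorch-specific steps:

cd $WORKSPACE/executorch
git describe --tags
git submodule status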

Build and install the ExecuTorch Python components

Next, you’ll build the ExecuTorch Python bindings and install them into your active virtual environment. This process compiles the C++ runtime, links hardware-optimized backends such as KleidiAI and XNNPACK, and enables optional developer utilities for debugging and profiling.

Run the following command from your ExecuTorch workspace:

cd $WORKSPACE/executorch
CMAKE_ARGS="-DEXECUTORCH_BUILD_DEVTOOLS=ON" ./install_executorch.sh

This builds ExecuTorch and its dependencies with CMake and enables optional developer utilities such as ETDump and Inspector.
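
The install_executorch.sh script passes anything you set in CMAKE_ARGS through to CMake, so other optional components can be enabled the same way. As a sketch, the command below also turns on the XNNPACK backend; the exact option names (such as EXECUTORCH_BUILD_XNNPACK) can vary between releases, so check the CMake options in your checkout if a flag is not recognized:

CMAKE_ARGS="-DEXECUTORCH_BUILD_DEVTOOLS=ON -DEXECUTORCH_BUILD_XNNPACK=ON" ./install_executorch.sh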

Verify the installation

After the build completes, check that ExecuTorch is installed in your active Python environment. Run the following command:

python -c "import executorch; print('ExecuTorch built and installed successfully.')"

If you see the success message, your environment is ready. You can now move on to cross-compiling and preparing to profile KleidiAI micro-kernels.
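
For a slightly deeper check, you can also inspect the installed package metadata and confirm that the ahead-of-time export API is importable. Module layout can change between releases, so treat this as an optional sanity check:

pip show executorch
python -c "from executorch.exir import to_edge; print('executorch.exir.to_edge is importable')"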
