Can you run the MLPerf Inference Benchmark suite on ML models with different backends?
Yes. The suite supports multiple backends, including TensorFlow, ONNX Runtime, PyTorch, and TFLite.
Can you run the MLPerf Inference Benchmark suite on either CPU or GPU on your machine?
Yes. You can choose the processor by setting the target device option to CPU or GPU.
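As an illustration of how such backend and device selection is typically exposed, here is a minimal sketch of a command-line configuration parser. The flag names, sets of choices, and function names are hypothetical, not MLPerf's actual interface; consult the suite's documentation for the real invocation.

```python
# Hypothetical sketch of backend/device selection for a benchmark
# harness; flag names and choices are illustrative, not MLPerf's API.
import argparse

SUPPORTED_BACKENDS = {"tensorflow", "onnxruntime", "pytorch", "tflite"}
SUPPORTED_DEVICES = {"cpu", "gpu"}

def parse_run_config(argv):
    """Parse backend and target-device options from argv."""
    parser = argparse.ArgumentParser(description="benchmark runner sketch")
    parser.add_argument("--backend", required=True,
                        choices=sorted(SUPPORTED_BACKENDS),
                        help="inference backend to benchmark")
    parser.add_argument("--device", default="cpu",
                        choices=sorted(SUPPORTED_DEVICES),
                        help="target processor (CPU or GPU)")
    return parser.parse_args(argv)

# Example: benchmark the ONNX Runtime backend on a GPU.
config = parse_run_config(["--backend", "onnxruntime", "--device", "gpu"])
print(config.backend, config.device)
```

Invalid values are rejected up front by `choices`, which is why harnesses usually validate the backend and device before any model loading begins.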