Can you run the MLPerf Inference Benchmark suite on ML models with different backends?
Yes, several backends are supported, such as TensorFlow, ONNX Runtime, PyTorch, and TFLite.
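As an illustration, the vision reference benchmarks take the backend as a command-line argument. The `run_local.sh` helper and its `backend model device` argument order below follow the `vision/classification_and_detection` reference implementation; treat them as an assumption and check the README of the benchmark you are running.

```shell
# Sketch: select a backend by name when launching a reference benchmark.
# run_local.sh and its argument order are assumptions from the reference README.
BACKEND="onnxruntime"   # alternatives: tf, pytorch, tflite
CMD="./run_local.sh $BACKEND resnet50 cpu"
echo "$CMD"             # print the command here instead of launching the benchmark
```

Swapping the `BACKEND` value is all that is needed to rerun the same model and scenario on a different framework.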
Can you run the MLPerf Inference Benchmark suite on either the CPU or the GPU of your machine?
Yes, you can choose the target device by setting the device option when launching a benchmark.
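For example, in the `vision/classification_and_detection` reference implementation the device is the last positional argument to the `run_local.sh` helper; the script name, argument order, and the `cpu`/`gpu` values are assumptions taken from that README, so verify them for your benchmark.

```shell
# Sketch: the same benchmark targeted at CPU vs. GPU via the device argument.
# run_local.sh and the cpu/gpu values are assumptions from the reference README.
CPU_CMD="./run_local.sh onnxruntime resnet50 cpu"
GPU_CMD="./run_local.sh onnxruntime resnet50 gpu"
echo "$CPU_CMD"   # printed here rather than executed
echo "$GPU_CMD"
```

The GPU run additionally requires a framework build with GPU support and a visible CUDA device; otherwise the benchmark falls back to failing at startup rather than silently using the CPU.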