Measure Machine Learning Inference Performance on Arm servers

What you've learned

You should now know how to:

  • Install and run TensorFlow on your Arm-based cloud server.
  • Use the MLPerf Inference benchmark suite, an open-source benchmark from MLCommons, to test ML performance on your Arm server.

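Before benchmarking, it is worth confirming that the TensorFlow you installed actually imports and can see your hardware. The helper below is a minimal sketch (the function name `tensorflow_status` is ours, not from the learning path); it reports the installed version and visible device types, or tells you TensorFlow is missing.

```python
def tensorflow_status():
    """Return (version, device_types) for the installed TensorFlow,
    or None if TensorFlow is not installed."""
    try:
        import tensorflow as tf
    except ImportError:
        return None
    # list_physical_devices() reports devices TensorFlow can use, e.g. CPU, GPU.
    devices = [d.device_type for d in tf.config.list_physical_devices()]
    return tf.__version__, devices


status = tensorflow_status()
if status is None:
    print("TensorFlow is not installed; try: pip install tensorflow")
else:
    version, devices = status
    print(f"TensorFlow {version}, devices: {devices}")
```

On an Arm-based cloud server without a GPU you would typically see only `CPU` in the device list.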
Knowledge Check

Can you run the MLPerf Inference benchmark suite on ML models with different backends?

Can you run the MLPerf Inference benchmark suite on either the CPU or the GPU of your machine?
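As a reminder, both questions come down to the arguments you pass when launching a benchmark. The commands below are a sketch based on the vision benchmarks in the MLCommons inference repository, where `run_local.sh` takes a backend, a model, and a device; check the repository README for the exact options in your checkout.

```shell
# Fetch the MLPerf Inference benchmark suite from MLCommons.
git clone https://github.com/mlcommons/inference.git
cd inference/vision/classification_and_detection

# run_local.sh <backend> <model> <device>
./run_local.sh tf resnet50 cpu            # TensorFlow backend on the CPU
./run_local.sh onnxruntime mobilenet cpu  # ONNX Runtime backend on the CPU
# ./run_local.sh tf resnet50 gpu          # same benchmark on the GPU, if present
```

Switching the first argument changes the backend; switching the last argument moves the same workload between CPU and GPU.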
