Measure Machine Learning Inference Performance on Arm servers

About this Learning Path

Who is this for?

This is an introductory topic for software developers interested in benchmarking machine learning workloads on Arm servers.

What will you learn?

Upon completion of this Learning Path, you will be able to:

  • Install and run TensorFlow on your Arm-based cloud server.
  • Use the MLPerf Inference benchmark suite, an open-source benchmark from MLCommons, to test ML performance on your Arm server.
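As an orientation for the steps above, the setup can be sketched as follows. This is a minimal sketch assuming an Ubuntu-based aarch64 instance; package names and the virtual-environment layout are illustrative, and the detailed commands are what this Learning Path walks through.

```shell
# Sketch only: assumes Ubuntu on an aarch64 (Arm) server.

# Install Python tooling
sudo apt-get update
sudo apt-get install -y python3 python3-pip python3-venv git

# Create an isolated environment and install TensorFlow
# (TensorFlow publishes aarch64 wheels on PyPI)
python3 -m venv venv
source venv/bin/activate
pip install tensorflow

# Confirm TensorFlow imports and report its version
python3 -c "import tensorflow as tf; print(tf.__version__)"

# Fetch the MLPerf Inference benchmark suite from MLCommons
git clone https://github.com/mlcommons/inference.git
```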


Prerequisites

Before starting, you will need the following:

  • An Arm-based instance from a cloud service provider, or an on-premise Arm server.