If you’re running any AI or ML workloads on Ampere processors, you want to make sure you use Ampere Optimized Frameworks for a 2-5X boost in performance. Since this free AI software has not yet been added to the Azure marketplace, we have provided workaround access via the Azure Community Images gallery.
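For readers who prefer to script their Azure setup, here is a minimal sketch of how one might browse a public community gallery with the Azure Python SDK (azure-identity and azure-mgmt-compute). The subscription ID, region, and gallery name below are placeholders, not the actual values for the Ampere images, so substitute whatever Ampere publishes for its community gallery.

```python
# Minimal sketch: listing image definitions in a public community gallery.
# All identifiers below are placeholders -- replace them with the real values.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

SUBSCRIPTION_ID = "<your-subscription-id>"          # placeholder
LOCATION = "westus2"                                # a region where the gallery is shared
PUBLIC_GALLERY_NAME = "<ampere-community-gallery>"  # hypothetical gallery name

client = ComputeManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Enumerate the image definitions shared through the community gallery
for image in client.community_gallery_images.list(
    location=LOCATION, public_gallery_name=PUBLIC_GALLERY_NAME
):
    print(image.name)
```

Once you know the image definition name, the image can be referenced when creating a VM using an ID of the form /CommunityGalleries/&lt;galleryName&gt;/Images/&lt;imageDefinition&gt;/Versions/latest.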
Ampere Optimized Frameworks are by no means exclusive to, or even specific to, Azure.
On the contrary, the only reason for this guide is that installing them from the Azure marketplace, which would be the most obvious route for Azure users, isn’t possible yet. Likewise, they have yet to be added to the GCP marketplace, so we also offer a simple workaround for GCP users.
The software is platform agnostic as long as the platform in question uses Ampere CPUs, and multiple distribution channels are already in place. I’d say a direct download from the Ampere AI Solutions site, Artificial Intelligence Inference Performance, is still probably the simplest and fastest option; at the moment it’s also the channel we see the most downloads from, which makes it the channel of choice for most developers out there.