Home: https://onnxruntime.ai/
Package license: MIT
Feedstock license: BSD-3-Clause
Summary: Cross-platform, high performance ML inferencing and training accelerator
Installing onnxruntime from the astrorama channel can be achieved by adding astrorama to your channels with:
```
conda config --add channels astrorama
conda config --set channel_priority strict
```
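These commands record the channel and the priority setting in your user-level conda configuration. A minimal sketch of what the resulting ~/.condarc might look like (illustrative only; your file may already list other channels such as defaults):

```
# Illustrative ~/.condarc after the two commands above.
# Any channels configured previously will still be listed.
channels:
  - astrorama
  - defaults
channel_priority: strict
```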
Once the astrorama channel has been enabled, onnxruntime can be installed with:
```
conda install onnxruntime
```
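If you want to keep the package isolated from your base environment, the same install can go into a fresh environment. A minimal sketch, in which the environment name is just a placeholder:

```
# Illustrative: install into a dedicated environment; "ort-env" is a placeholder name.
conda create --name ort-env --channel astrorama onnxruntime
conda activate ort-env
```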
It is possible to list all of the versions of onnxruntime available on your platform with:
```
conda search onnxruntime --channel astrorama
```
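conda search also accepts a match specification, so the listing can be narrowed to a particular version range; the bound below is purely illustrative:

```
# Illustrative: restrict the search to a version range (the bound is a placeholder).
conda search 'onnxruntime>=1.0' --channel astrorama
```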
If you would like to improve the onnxruntime recipe or build a new
package version, please fork this repository and submit a PR. Upon submission,
your changes will be run on the appropriate platforms to give the reviewer an
opportunity to confirm that the changes result in a successful build. Once
merged, the recipe will be re-built and uploaded automatically to the astrorama channel,
whereupon the built conda packages will be available for everybody to install and use
from the astrorama channel.
Note that all branches in the astrorama/onnxruntime-feedstock are
immediately built and any created packages are uploaded, so PRs should be based
on branches in forks and branches in the main repository should only be used to
build distinct package versions.
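Since branches in the main repository trigger builds and uploads, a typical contribution starts from a fork. A minimal sketch of that flow, with the GitHub username and branch name as placeholders:

```
# Illustrative fork-based workflow; <your-username> and the branch name are placeholders.
git clone https://github.com/<your-username>/onnxruntime-feedstock.git
cd onnxruntime-feedstock
git checkout -b update-recipe
# ... edit the recipe ...
git commit -am "Update the onnxruntime recipe"
git push origin update-recipe
# then open a PR against astrorama/onnxruntime-feedstock
```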
In order to produce a uniquely identifiable distribution:
- If the version of a package is not being increased, please add or increase the build/number.
- If the version of a package is being increased, please remember to return the build/number back to 0.
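Both cases refer to the build number in the recipe's meta.yaml. A minimal sketch of the relevant fragment, with placeholder version and build numbers:

```
# Illustrative meta.yaml fragment; version and build numbers are placeholders.
package:
  name: onnxruntime
  version: "1.2.3"

build:
  number: 1   # bump this when re-releasing the same version;
              # reset it to 0 when the version above is increased
```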