Input parameters for Local Repositories

Input all required values in `input/software_config.json`.
| Parameter | Details |
|---|---|
| `cluster_os_type` | Required |
| `cluster_os_version` | Required |
| `repo_config` | Required |
| `softwares` | Required. **Note**: The accepted names for software are taken from the JSON files in `input/config/<operating_system>/<operating_system_version>` (explained below). |
Below is a sample version of the file:
```json
{
  "cluster_os_type": "ubuntu",
  "cluster_os_version": "22.04",
  "repo_config": "partial",
  "softwares": [
    {"name": "k8s", "version": "1.26.12"},
    {"name": "jupyter"},
    {"name": "openldap"},
    {"name": "kubeflow"},
    {"name": "beegfs", "version": "7.4.2"},
    {"name": "nfs"},
    {"name": "kserve"},
    {"name": "amdgpu", "version": "6.0"},
    {"name": "cuda", "version": "12.3.2"},
    {"name": "ofed", "version": "24.01-0.3.3.1"},
    {"name": "vllm"},
    {"name": "pytorch"},
    {"name": "tensorflow"},
    {"name": "bcm_roce", "version": "229.2.9.0"}
  ],
  "kserve": [
    {"name": "istio"},
    {"name": "cert_manager"},
    {"name": "knative"}
  ],
  "amdgpu": [
    {"name": "rocm", "version": "6.0"}
  ],
  "vllm": [
    {"name": "vllm_amd"},
    {"name": "vllm_nvidia"}
  ],
  "pytorch": [
    {"name": "pytorch_cpu"},
    {"name": "pytorch_amd"},
    {"name": "pytorch_nvidia"}
  ],
  "tensorflow": [
    {"name": "tensorflow_cpu"},
    {"name": "tensorflow_amd"},
    {"name": "tensorflow_nvidia"}
  ]
}
```
For a list of accepted values in `softwares`, go to `input/config/<operating_system>/<operating_system_version>` and view the list of JSON files available. The filenames in this location (without the `.json` extension) are the accepted software names. The repositories to be downloaded for each software are listed in the corresponding JSON file. For example, for a cluster running Ubuntu 22.04, go to `input/config/ubuntu/22.04/` and view the file list:
```
amdgpu.json
bcm_roce.json
beegfs.json
cuda.json
jupyter.json
k8s.json
kserve.json
kubeflow.json
nfs.json
ofed.json
openldap.json
pytorch.json
tensorflow.json
vllm.json
```
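Since the accepted software names are simply the JSON filenames in this directory, a configuration can be sanity-checked before running Omnia by comparing the names in `software_config.json` against those files. The helper functions below are illustrative, not part of Omnia:

```python
from pathlib import Path

def accepted_softwares(config_dir):
    # Accepted names = JSON filenames (minus the .json extension)
    # in the OS/version directory, e.g. input/config/ubuntu/22.04
    return {p.stem for p in Path(config_dir).glob("*.json")}

def unknown_softwares(software_config, config_dir):
    # Return names listed under "softwares" that have no matching JSON file
    accepted = accepted_softwares(config_dir)
    return [s["name"] for s in software_config.get("softwares", [])
            if s["name"] not in accepted]
```

For example, after loading `input/software_config.json` with `json.load`, `unknown_softwares(cfg, "input/config/ubuntu/22.04")` would return an empty list for a valid configuration.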
For a list of repositories (and their types) configured for `amdgpu`, view the `amdgpu.json` file:
```json
{
  "amdgpu": {
    "cluster": [
      {"package": "linux-headers-$(uname -r)", "type": "deb", "repo_name": "jammy"},
      {"package": "linux-modules-extra-$(uname -r)", "type": "deb", "repo_name": "jammy"},
      {"package": "amdgpu-dkms", "type": "deb", "repo_name": "amdgpu"}
    ]
  },
  "rocm": {
    "cluster": [
      {"package": "rocm-hip-sdk{{ rocm_version }}*", "type": "deb", "repo_name": "rocm"}
    ]
  }
}
```
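The `{{ rocm_version }}` marker in the package name is a Jinja-style placeholder that gets filled from the version specified in `software_config.json`. Omnia resolves these through its Ansible templating; a simplified, stand-alone sketch of the substitution looks like this:

```python
import re

def render_placeholders(text, versions):
    # Replace Jinja-style "{{ var }}" markers with values from the mapping.
    # Unknown variables are left untouched rather than raising an error.
    return re.sub(r"\{\{\s*(\w+)\s*\}\}",
                  lambda m: str(versions.get(m.group(1), m.group(0))),
                  text)

print(render_placeholders("rocm-hip-sdk{{ rocm_version }}*", {"rocm_version": "6.0"}))
# prints: rocm-hip-sdk6.0*
```

The same mechanism applies to the `{{ amdgpu_version }}`, `{{ beegfs_version }}`, and `{{ cluster_os_version }}` placeholders that appear in the repository URLs below.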
**Note**: To configure a locally available repository that does not have a pre-defined JSON file, click here.
Input the required values in `input/local_repo_config.yml`.
| Parameter | Details |
|---|---|
| `repo_store_path` | Required |
| `user_repo_url` | Optional |
| `user_registry` | Optional |
| `os_repo_url` | Optional |
| `omnia_repo_url_rhel` | Required. Default value listed below. |
| `omnia_repo_url_rocky` | Required. Default value listed below. |
| `omnia_repo_url_ubuntu` | Required. Default value listed below. |

Default value of `omnia_repo_url_rhel`:

```yaml
- { url: "https://download.docker.com/linux/centos/$releasever/$basearch/stable", gpgkey: "https://download.docker.com/linux/centos/gpg" }
- { url: "https://repo.radeon.com/rocm/rhel8/{{ rocm_version }}/main", gpgkey: "https://repo.radeon.com/rocm/rocm.gpg.key" }
- { url: "https://download.fedoraproject.org/pub/epel/8/Everything/$basearch", gpgkey: "https://dl.fedoraproject.org/pub/epel/RPM-GPG-KEY-EPEL-8" }
- { url: "https://repo.radeon.com/amdgpu/{{ amdgpu_version }}/rhel/{{ cluster_os_version }}/main/x86_64", gpgkey: "https://repo.radeon.com/rocm/rocm.gpg.key" }
- { url: "https://www.beegfs.io/release/beegfs_{{beegfs_version}}/dists/rhel8", gpgkey: "https://www.beegfs.io/release/beegfs_{{beegfs_version}}/gpg/GPG-KEY-beegfs" }
- { url: "https://yum.repos.intel.com/oneapi", gpgkey: "https://yum.repos.intel.com/intel-gpg-keys/GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB" }
- { url: "https://ltb-project.org/rpm/openldap25/$releasever/$basearch", gpgkey: "" }
```

Default value of `omnia_repo_url_rocky`:

```yaml
- { url: "https://download.docker.com/linux/centos/$releasever/$basearch/stable", gpgkey: "https://download.docker.com/linux/centos/gpg" }
- { url: "https://repo.radeon.com/rocm/rhel8/{{ rocm_version }}/main", gpgkey: "https://repo.radeon.com/rocm/rocm.gpg.key" }
- { url: "https://download.fedoraproject.org/pub/epel/8/Everything/$basearch", gpgkey: "https://dl.fedoraproject.org/pub/epel/RPM-GPG-KEY-EPEL-8" }
- { url: "https://repo.radeon.com/amdgpu/{{ amdgpu_version }}/rhel/{{ cluster_os_version }}/main/x86_64", gpgkey: "https://repo.radeon.com/rocm/rocm.gpg.key" }
- { url: "https://www.beegfs.io/release/beegfs_{{beegfs_version}}/dists/rhel8", gpgkey: "https://www.beegfs.io/release/beegfs_{{beegfs_version}}/gpg/GPG-KEY-beegfs" }
- { url: "https://yum.repos.intel.com/oneapi", gpgkey: "https://yum.repos.intel.com/intel-gpg-keys/GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB" }
- { url: "https://ltb-project.org/rpm/openldap25/$releasever/$basearch", gpgkey: "" }
- { url: "http://dl.rockylinux.org/$contentdir/$releasever/PowerTools/$basearch/os/", gpgkey: "" }
```

Default value of `omnia_repo_url_ubuntu`:

```yaml
- { url: "https://download.docker.com/linux/ubuntu {{ os_release }} stable", gpgkey: "https://download.docker.com/linux/ubuntu/gpg" }
- { url: "https://repo.radeon.com/rocm/apt/{{ rocm_version }} {{ os_release }} main", gpgkey: "https://repo.radeon.com/rocm/rocm.gpg.key" }
- { url: "https://www.beegfs.io/release/beegfs_{{beegfs_version}} {{ os_release }} non-free", gpgkey: "https://www.beegfs.io/release/beegfs_{{beegfs_version}}/gpg/GPG-KEY-beegfs" }
- { url: "https://repo.radeon.com/amdgpu/{{ amdgpu_version }}/ubuntu {{ os_release }} main", gpgkey: "https://repo.radeon.com/rocm/rocm.gpg.key" }
- { url: "https://ltb-project.org/debian/openldap25/jammy jammy main", publickey: "https://ltb-project.org/documentation/_static/RPM-GPG-KEY-LTB-project" }
- { url: "https://nvidia.github.io/libnvidia-container/stable/deb/amd64 /", gpgkey: "https://nvidia.github.io/libnvidia-container/gpgkey" }
- { url: "http://ppa.launchpad.net/deadsnakes/ppa/ubuntu {{ os_release }} main", gpgkey: "" }
- { url: "https://a2o.github.io/snoopy-packages/repo/ubuntu {{ os_release }} stable", publickey: "https://a2o.github.io/snoopy-packages/snoopy-packages-key.pub" }
```
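As an illustration, a minimal `local_repo_config.yml` setting only the user-facing parameters might look like the fragment below. All values here are placeholders invented for this example, not shipped defaults:

```yaml
# Illustrative values only -- substitute your environment's paths and URLs.
repo_store_path: "/opt/omnia_repo"                                # hypothetical path
user_repo_url:
  - { url: "http://mirror.example.com/ubuntu", gpgkey: "" }       # hypothetical local mirror
user_registry:
  - { host: "registry.example.com:5000", cert_path: "" }          # hypothetical private registry
```

The `omnia_repo_url_*` lists can normally be left at their defaults shown above.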
Input `docker_username` and `docker_password` in `input/provision_config_credentials.yml` to avoid image pullback errors.
If you have any feedback about Omnia documentation, please reach out at omnia.readme@dell.com.