...

  • SkyHiGh:
    • gpu.v100.8G: 8 CPU, 90GB RAM and 1/4 of an Nvidia Tesla V100 with 8GB VRAM.
    • gpu.micro.a100.4G: 5 CPU, 25GB RAM and 1/10 of an Nvidia Tesla A100 with 4GB VRAM.
    • gpu.medium.a100.10G: 12 CPU, 60GB RAM and 1/4 of an Nvidia Tesla A100 with 10GB VRAM.
      • Currently primarily for affiliates of SFI NORCICS
    • gpu.large.a100.20G: 24 CPU, 120GB RAM and 1/2 of an Nvidia Tesla A100 with 20GB VRAM.
      • Currently primarily for affiliates of SFI NORCICS
    • gpu.large.a100d.20G: 24 CPU, 120GB RAM and 1/4 of an Nvidia Tesla A100D with 20GB VRAM.
      • Currently exclusively for affiliates of NBL
    • gpu.large+r.a100.20G: 24 CPU, 240GB RAM and 1/2 of an Nvidia Tesla A100 with 20GB VRAM.
      • Currently exclusively for affiliates of NBL
  • SkyLow:
    • gpu.m10.8G: 8 CPU, 20GB RAM and 1 Nvidia Tesla M10 with 8GB VRAM.
  • StackIT:
    • gpu.a100.10G: 14 CPU, 60GB RAM and 1/4 of an Nvidia Tesla A100 with 10GB VRAM.
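
The GPU flavors above are requested like any other flavor when launching an instance. Below is a minimal sketch using the openstacksdk Python client, assuming a configured clouds.yaml entry; the cloud, image, network and key pair names are placeholders, and only the flavor name gpu.v100.8G is taken from the list above.

import openstack

# Connect using an entry from clouds.yaml; "skyhigh" is a placeholder name.
conn = openstack.connect(cloud="skyhigh")

# List the GPU flavors visible to the project.
for flavor in conn.compute.flavors():
    if flavor.name.startswith("gpu."):
        print(flavor.name, flavor.vcpus, "vCPU,", flavor.ram, "MB RAM")

# Boot an instance on one of the GPU flavors. The image, network and
# key pair names below must be replaced with ones that exist in your project.
server = conn.create_server(
    name="gpu-test",
    flavor="gpu.v100.8G",
    image="Ubuntu Server 22.04",
    network="my-network",
    key_name="my-key",
    wait=True,
)
print(server.status)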

Custom flavors

Some of our tenants have requirements that do not fit the "general purpose" use cases, and invest in dedicated compute nodes equipped with accelerators, local storage and similar. These tenants usually get custom flavors (identified by the keyword 'custom' in their names) providing access to these resources. The meaning of each custom flavor is communicated to the personnel needing access.

IO-Intensive flavors

Most of our General-Purpose flavors are available in high-IO versions. If 300 IOPS is not enough for your use case, you can be granted access to flavors providing 600 or 1200 IOPS:

...
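
As a rough sketch of how to see which flavors (and therefore which IOPS class) your project has been granted: if the operator exposes the disk I/O limits through standard Nova flavor extra specs (the quota:* disk keys, which is an assumption about how the flavors are built, not something stated here), they can be read with openstacksdk as below. The cloud name is again a placeholder.

import openstack

# Placeholder cloud name from clouds.yaml.
conn = openstack.connect(cloud="skyhigh")

# Fetch the flavors available to the project together with their extra specs
# and print any disk-I/O quota keys. Which keys are set (if any) depends on
# how the operator built the flavors, so an empty result is possible.
for flavor in conn.list_flavors(get_extra=True):
    iops_specs = {k: v for k, v in (flavor.extra_specs or {}).items() if "iops" in k}
    if iops_specs:
        print(flavor.name, iops_specs)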