List available GPU types and their specifications.

```sh
runpodctl gpu <subcommand> [flags]
```
## Subcommands

- `list`: List available GPUs.
List GPUs that are currently available:

```sh
runpodctl gpu list
```
Include unavailable GPUs in the list:

```sh
runpodctl gpu list --include-unavailable
```
## List flags

- `--include-unavailable`: Include GPUs that are currently unavailable.
## Example output

```json
[
  {
    "available": true,
    "communityCloud": true,
    "displayName": "RTX 4090",
    "gpuId": "NVIDIA GeForce RTX 4090",
    "memoryInGb": 24,
    "secureCloud": true,
    "stockStatus": "High"
  },
  {
    "available": true,
    "communityCloud": true,
    "displayName": "A100 PCIe",
    "gpuId": "NVIDIA A100 80GB PCIe",
    "memoryInGb": 80,
    "secureCloud": true,
    "stockStatus": "High"
  }
]
```
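Because the output is JSON, it is easy to filter in a script. Below is a minimal sketch (not an official client) that parses output shaped like the example above and selects `gpuId` values by availability and memory; the 48 GB threshold is an arbitrary illustration:

```python
import json

# Sample output from `runpodctl gpu list`, using the field names shown above.
raw = """
[
  {"available": true, "communityCloud": true, "displayName": "RTX 4090",
   "gpuId": "NVIDIA GeForce RTX 4090", "memoryInGb": 24,
   "secureCloud": true, "stockStatus": "High"},
  {"available": true, "communityCloud": true, "displayName": "A100 PCIe",
   "gpuId": "NVIDIA A100 80GB PCIe", "memoryInGb": 80,
   "secureCloud": true, "stockStatus": "High"}
]
"""

gpus = json.loads(raw)

# Keep the gpuId of every available GPU with at least 48 GB of memory.
big_gpus = [g["gpuId"] for g in gpus if g["available"] and g["memoryInGb"] >= 48]
print(big_gpus)  # ['NVIDIA A100 80GB PCIe']
```

In a shell pipeline, the same filtering could be done by piping `runpodctl gpu list` into a tool such as `jq`.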
## Using GPU IDs

When creating Pods or Serverless endpoints, pass the `gpuId` value from the list to the `--gpu-id` flag:

```sh
runpodctl pod create --template-id runpod-torch-v21 --gpu-id "NVIDIA GeForce RTX 4090"
```