inf2.24xlarge

Pricing for the US East (N. Virginia) region, Partial Upfront payment term, dedicated tenancy. Rates are in USD per hour; reserved rates are effective hourly.

| Term | Rate ($/hr) |
|------|-------------|
| On Demand | 14.279 |
| Spot | N/A |
| Reserved 1 Year | 8.56779 |
| Reserved 3 Year | 5.71186 |
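A quick way to read the reserved rates is as a discount against On Demand. A minimal Python sketch, using the rates from the table above (assumed to be effective hourly USD):

```python
on_demand = 14.279   # On Demand, $/hr
ri_1yr = 8.56779     # 1-yr Reserved, Partial Upfront, effective $/hr
ri_3yr = 5.71186     # 3-yr Reserved, Partial Upfront, effective $/hr

def savings(rate, baseline=on_demand):
    """Percent saved versus the On Demand rate."""
    return (1 - rate / baseline) * 100

print(f"1-yr reserved saves {savings(ri_1yr):.1f}%")  # ~40.0%
print(f"3-yr reserved saves {savings(ri_3yr):.1f}%")  # ~60.0%
```

Note the reserved discounts land almost exactly at 40% and 60%, which is typical of how these effective rates are derived.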
Family Sizes

| Size | vCPU | Memory (GiB) |
|------|------|--------------|
| inf2.xlarge | 4 | 16 |
| inf2.8xlarge | 32 | 128 |
| inf2.24xlarge | 96 | 384 |
| inf2.48xlarge | 192 | 768 |
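Every size in the family keeps the same memory-to-vCPU ratio, which a small check against the table above confirms:

```python
# (size, vCPUs, memory in GiB) as listed in the Family Sizes table
sizes = [
    ("inf2.xlarge",   4,   16),
    ("inf2.8xlarge",  32,  128),
    ("inf2.24xlarge", 96,  384),
    ("inf2.48xlarge", 192, 768),
]

for name, vcpus, mem_gib in sizes:
    assert mem_gib / vcpus == 4, name  # 4 GiB of memory per vCPU across the family
    print(f"{name}: {mem_gib // vcpus} GiB per vCPU")
```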
Summary
The inf2.24xlarge instance is in the Machine Learning ASIC Instances family, with 96 vCPUs and 384 GiB of memory.
Tech Specs
| Compute | Value |
|---------|-------|
| vCPUs | 96 |
| Memory (GiB) | 384 |
| Memory per vCPU (GiB) | 4 |
| Physical Processor | AMD EPYC 7R13 |
| Clock Speed (GHz) | 2.95 |
| CPU Architecture | x86_64 |
| GPU | 6 |
| GPU Architecture | AWS Inferentia2 |
| Video Memory (GiB) | 192 |
| GPU Compute Capability | 0 |
| FPGA | 0 |
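The "GPU" rows above count AWS Inferentia2 accelerators rather than conventional GPUs, and the 192 GiB of "video memory" is the total across devices. Dividing it out, with both figures taken from the spec table:

```python
accelerators = 6       # Inferentia2 devices on inf2.24xlarge
total_mem_gib = 192    # total accelerator memory from the spec table

per_device_gib = total_mem_gib / accelerators
print(f"{per_device_gib:.0f} GiB per Inferentia2 device")  # 32 GiB
```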
| Storage | Value |
|---------|-------|
| EBS Optimized | False |
| Max Bandwidth (Mbps) on EBS | 30000 |
| Max Throughput (MB/s) on EBS | 3750 |
| Max I/O Operations/second | 120000 |
| Baseline Bandwidth (Mbps) on EBS | 30000 |
| Baseline Throughput (MB/s) on EBS | 3750 |
| Baseline I/O Operations/second | 120000 |
| Devices | N/A |
| Swap Partition | N/A |
| NVMe Drive | N/A |
| Disk Space (GiB) | N/A |
| SSD | N/A |
| Initialize Storage | N/A |
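The EBS bandwidth and throughput rows describe the same limit in different units (megabits versus megabytes per second), which a one-line check makes explicit:

```python
max_bandwidth_mbps = 30_000  # EBS bandwidth, megabits/s
max_throughput_mbs = 3_750   # EBS throughput, megabytes/s

# 8 bits per byte: the two rows state the same limit
assert max_bandwidth_mbps / 8 == max_throughput_mbs
print("EBS bandwidth and throughput figures agree")
```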
| Networking | Value |
|------------|-------|
| Network Performance (Gbps) | 50 |
| Enhanced Networking | False |
| IPv6 | False |
| Placement Group | False |
| Amazon | Value |
|--------|-------|
| Generation | current |
| Instance Type | inf2.24xlarge |
| Family | Machine Learning ASIC Instances |
| Name | INF2 24xlarge |
| Elastic Map Reduce | False |