inf2.8xlarge
Region: US East (N. Virginia) · Payment Term: Partial Upfront · Variant: dedicated

| Pricing | Price (USD) |
|---|---|
| On Demand | 14.279 |
| Spot | N/A |
| Reserved (1 Year) | 8.56779 |
| Reserved (3 Year) | 5.71186 |
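Assuming the listed prices are hourly USD rates, as is typical for instance pricing pages, a minimal sketch of the discount each reserved term gives over on-demand:

```python
# Prices from the table above, assumed to be USD per hour.
ON_DEMAND = 14.279
RESERVED = {"1-year": 8.56779, "3-year": 5.71186}

def savings_pct(reserved_rate: float, on_demand_rate: float) -> float:
    """Percent saved by a reserved rate versus the on-demand rate."""
    return (1 - reserved_rate / on_demand_rate) * 100

for term, rate in RESERVED.items():
    print(f"{term}: {savings_pct(rate, ON_DEMAND):.1f}% below on-demand")
```

Under the partial-upfront rates shown, the 1-year term works out to roughly 40% below on-demand and the 3-year term to roughly 60%.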
Family Sizes

| Size | vCPU | Memory (GiB) |
|---|---|---|
| inf2.xlarge | 4 | 16 |
| inf2.8xlarge | 32 | 128 |
| inf2.24xlarge | 96 | 384 |
| inf2.48xlarge | 192 | 768 |
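The family scales linearly: a small sketch encoding the sizes above and checking that every size keeps the same memory-to-vCPU ratio:

```python
# (vCPUs, memory in GiB) for each Inf2 size, from the table above.
INF2_SIZES = {
    "inf2.xlarge": (4, 16),
    "inf2.8xlarge": (32, 128),
    "inf2.24xlarge": (96, 384),
    "inf2.48xlarge": (192, 768),
}

def memory_per_vcpu(size: str) -> float:
    vcpus, mem_gib = INF2_SIZES[size]
    return mem_gib / vcpus

# Every Inf2 size provides 4 GiB of memory per vCPU.
assert all(memory_per_vcpu(s) == 4.0 for s in INF2_SIZES)
```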
Summary
The inf2.8xlarge instance is in the Machine Learning ASIC Instances family, with 32 vCPUs and 128 GiB of memory.
Tech Specs
| Compute | Value |
|---|---|
| vCPUs | 32 |
| Memory (GiB) | 128 |
| Memory per vCPU (GiB) | 4 |
| Physical Processor | AMD EPYC 7R13 |
| Clock Speed (GHz) | 2.95 |
| CPU Architecture | x86_64 |
| GPU | 1 |
| GPU Architecture | AWS Inferentia2 |
| Video Memory (GiB) | 32 |
| GPU Compute Capability | 0 |
| FPGA | 0 |
| Storage | Value |
|---|---|
| EBS Optimized | False |
| Max Bandwidth (Mbps) on EBS | 10000 |
| Max Throughput (MB/s) on EBS | 1250 |
| Max I/O Operations/Second | 40000 |
| Baseline Bandwidth (Mbps) on EBS | 10000 |
| Baseline Throughput (MB/s) on EBS | 1250 |
| Baseline I/O Operations/Second | 40000 |
| Devices | N/A |
| Swap Partition | N/A |
| NVMe Drive | N/A |
| Disk Space (GiB) | N/A |
| SSD | N/A |
| Initialize Storage | N/A |
| Networking | Value |
|---|---|
| Network Performance (Gibps) | Up to 25 |
| Enhanced Networking | False |
| IPv6 | False |
| Placement Group | False |
| Amazon | Value |
|---|---|
| Generation | current |
| Instance Type | inf2.8xlarge |
| Family | Machine Learning ASIC Instances |
| Name | INF2 Eight Extra Large |
| Elastic Map Reduce | False |