inf1.2xlarge

Pricing
On Demand          5.193
Spot               N/A
Reserved 1 Year    3.115991
Reserved 3 Year    2.311556
Region             US East (N. Virginia)
Payment Term       Partial Upfront
Variant            Dedicated
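For a rough sense of scale, the sketch below compares the monthly cost of each purchase option. It assumes the listed prices are effective hourly USD rates and uses roughly 730 hours per month; treat it as an estimate, not billing guidance.

    # Rough monthly cost comparison for inf1.2xlarge, assuming the
    # listed prices are effective hourly USD rates.
    HOURS_PER_MONTH = 730

    on_demand   = 5.193    * HOURS_PER_MONTH   # ~3790.89
    reserved_1y = 3.115991 * HOURS_PER_MONTH   # ~2274.67
    reserved_3y = 2.311556 * HOURS_PER_MONTH   # ~1687.44

    print(f"On Demand:        ${on_demand:,.2f}/month")
    print(f"Reserved 1 Year:  ${reserved_1y:,.2f}/month")
    print(f"Reserved 3 Year:  ${reserved_3y:,.2f}/month")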
Family Sizes
Size             vCPU    Memory (GiB)
inf1.xlarge      4       8
inf1.2xlarge     8       16
inf1.6xlarge     24      48
inf1.24xlarge    96      192
Summary
The inf1.2xlarge instance belongs to the Machine Learning ASIC Instances family, with 8 vCPUs and 16 GiB of memory.
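The same vCPU and memory figures can be retrieved programmatically; a minimal sketch with boto3, assuming AWS credentials are configured and the us-east-1 region is used:

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")
    resp = ec2.describe_instance_types(InstanceTypes=["inf1.2xlarge"])
    info = resp["InstanceTypes"][0]

    print(info["VCpuInfo"]["DefaultVCpus"])        # 8 vCPUs
    print(info["MemoryInfo"]["SizeInMiB"] / 1024)  # 16.0 GiB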
Tech Specs
Compute                    Value
vCPUs                      8
Memory (GiB)               16
Memory per vCPU (GiB)      2
Physical Processor         Intel Xeon Platinum 8275CL (Cascade Lake)
Clock Speed (GHz)          N/A
CPU Architecture           x86_64
GPU                        1
GPU Architecture           AWS Inferentia
Video Memory (GiB)         0
GPU Compute Capability     0
FPGA                       0
Storage                              Value
EBS Optimized                        False
Max Bandwidth (Mbps) on EBS          4750
Max Throughput (MB/s) on EBS         593.75
Max I/O Operations/Second            20000
Baseline Bandwidth (Mbps) on EBS     1190
Baseline Throughput (MB/s) on EBS    148.75
Baseline I/O Operations/Second       6000
Devices                              N/A
Swap Partition                       N/A
NVMe Drive                           N/A
Disk Space (GiB)                     N/A
SSD                                  N/A
Initialize Storage                   N/A
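The EBS bandwidth and throughput rows are two views of the same limit: AWS quotes bandwidth in megabits per second and throughput in megabytes per second, so throughput is bandwidth divided by 8. A quick check against the figures above:

    # EBS bandwidth (Mbps) vs. throughput (MB/s): divide by 8.
    baseline_mbps = 1190
    max_mbps = 4750

    print(baseline_mbps / 8)  # 148.75 MB/s, matching the baseline throughput row
    print(max_mbps / 8)       # 593.75 MB/s, matching the max throughput row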
Networking                     Value
Network Performance (Gbps)     Up to 25
Enhanced Networking            False
IPv6                           False
Placement Group                False
Amazon                     Value
Generation                 current
Instance Type              inf1.2xlarge
Family                     Machine Learning ASIC Instances
Name                       INF1 Double Extra Large
Elastic MapReduce (EMR)    False