WhichVM

inf2.8xlarge

The inf2.8xlarge instance belongs to AWS's Machine Learning ASIC family, with 32 vCPUs and 128 GiB of memory, starting at $1.9679/hr on demand.

Pricing Summary

On Demand: $1.9679/hr
Spot: $0.967413/hr
1-Year Reserved: $1.2397/hr
3-Year Reserved: $0.850120/hr
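The hourly rates above can be translated into monthly costs and savings versus on-demand. A minimal sketch in Python, assuming the common 730-hours-per-month billing convention (the rates are the us-east-1 Linux figures listed on this page):

```python
# Hypothetical cost comparison for inf2.8xlarge using the hourly
# rates listed above. 730 hours/month is an assumed convention.
HOURS_PER_MONTH = 730

rates = {
    "On Demand": 1.9679,
    "Spot": 0.967413,
    "1y Reserved": 1.2397,
    "3y Reserved": 0.850120,
}

on_demand = rates["On Demand"]
for name, rate in rates.items():
    monthly = rate * HOURS_PER_MONTH
    # Percentage saved relative to the on-demand rate.
    savings = (1 - rate / on_demand) * 100
    print(f"{name:12s} ${monthly:9.2f}/mo  ({savings:5.1f}% vs on-demand)")
```

Spot works out to roughly half the on-demand rate, and the 3-year commitment to a little under half.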


Compute

vCPUs: 32
Memory: 128 GiB
Memory/vCPU: 4.0 GiB
Processor: AMD EPYC 7R13
Architecture: x86_64
GPU: No
GPU Model: N/A
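The Memory/vCPU figure is derived from the two values above; a one-line sanity check:

```python
# Memory per vCPU for inf2.8xlarge: 128 GiB across 32 vCPUs.
vcpus = 32
memory_gib = 128
print(memory_gib / vcpus)  # 4.0 GiB per vCPU
```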

Connectivity

Network Performance: Up to 25 Gigabit
Storage: EBS only

Cloud Provider

Provider: AWS
Instance Type: inf2.8xlarge
Family: Machine Learning ASIC
Region ID: us-east-1


Regional Availability

No regional availability data found for this instance.
