WhichVM

inf2.24xlarge

The inf2.24xlarge instance belongs to the Machine Learning ASIC family, with 96 vCPUs and 384 GiB of memory, starting at $6.4906/hr on demand.

Pricing Summary

On Demand: $6.4906/hr
Spot: $4.0898/hr
1-Year Reserved: $4.0891/hr
3-Year Reserved: $2.8039/hr
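To put the hourly rates above in perspective, here is a minimal Python sketch that projects them to monthly cost and percentage savings versus On Demand. The ~730 hours/month figure is an assumption (a common billing approximation), not something stated on this page.

```python
# Project the listed hourly rates for inf2.24xlarge to monthly cost
# and savings vs On Demand. 730 hours/month is an assumed average.
HOURS_PER_MONTH = 730

rates = {
    "On Demand": 6.4906,
    "Spot": 4.0898,
    "1y Reserved": 4.0891,
    "3y Reserved": 2.8039,
}

on_demand = rates["On Demand"]
for name, hourly in rates.items():
    monthly = hourly * HOURS_PER_MONTH
    savings = (1 - hourly / on_demand) * 100
    print(f"{name}: ${monthly:,.2f}/month ({savings:.1f}% vs On Demand)")
```

At these rates, the 3-year commitment roughly halves the monthly bill relative to On Demand.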


Compute

vCPUs: 96
Memory: 384 GiB
Memory/vCPU: 4.0 GiB
Processor: AMD EPYC 7R13
Architecture: x86_64
GPU: No
GPU Model: N/A

Connectivity

Network Performance: 50 Gigabit
Storage: EBS only

Cloud Provider

Provider: AWS
Instance Type: inf2.24xlarge
Family: Machine Learning ASIC
Region ID: us-east-1

Regional Availability

No regional availability data found for this instance.
