WhichVM

inf2.xlarge

The inf2.xlarge instance belongs to AWS's Machine Learning ASIC family, with 4 vCPUs and 16 GiB of memory, starting at $0.758200/hr On Demand.

Pricing Summary

On Demand: $0.758200/hr
Spot: $0.154203/hr
1-Year Reserved: $0.477670/hr
3-Year Reserved: $0.327540/hr
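To put the pricing options in perspective, here is a minimal sketch computing each option's discount relative to On Demand, using the hourly rates listed above (the rounding convention is an assumption, not something this page specifies):

```python
# Hourly prices for inf2.xlarge (us-east-1, Linux) from the pricing summary above.
ON_DEMAND = 0.758200
PRICES = {
    "Spot": 0.154203,
    "1y Reserved": 0.477670,
    "3y Reserved": 0.327540,
}

def discount_vs_on_demand(hourly: float, on_demand: float = ON_DEMAND) -> float:
    """Percent saved relative to the On Demand rate, rounded to one decimal."""
    return round((1 - hourly / on_demand) * 100, 1)

for name, price in PRICES.items():
    print(f"{name}: {discount_vs_on_demand(price)}% off On Demand")
# Spot: 79.7% off On Demand
# 1y Reserved: 37.0% off On Demand
# 3y Reserved: 56.8% off On Demand
```

Note that Spot pricing fluctuates with capacity, so the roughly 80% discount shown here is a snapshot, not a guarantee.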


Compute

vCPUs: 4
Memory: 16 GiB
Memory/vCPU: 4.0 GiB
Processor: AMD EPYC 7R13
Architecture: x86_64
GPU: No
GPU Model: N/A
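The Memory/vCPU figure above is simply total memory divided by vCPU count, a ratio worth checking when comparing instance sizes. A minimal sketch (the dictionary field names are illustrative, not any real API):

```python
# Compute specs for inf2.xlarge as listed above; keys are illustrative.
SPECS = {"vcpus": 4, "memory_gib": 16}

def memory_per_vcpu(specs: dict) -> float:
    """GiB of memory available per vCPU."""
    return specs["memory_gib"] / specs["vcpus"]

print(memory_per_vcpu(SPECS))  # → 4.0
```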

Connectivity

Network Performance: Up to 15 Gigabit
Storage: EBS only

Cloud Provider

Provider: AWS
Instance Type: inf2.xlarge
Family: Machine Learning ASIC
Region ID: us-east-1


Regional Availability

No regional availability data found for this instance.

Find Better Options


Comparison of instances with the same vCPU count across families and providers:

No better alternatives found for this instance with current filters.
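The comparison restricts candidates to the same vCPU count and keeps only those priced below the current instance. A sketch of that filtering logic, where every catalog entry other than inf2.xlarge is a placeholder invented for illustration, not a real quote:

```python
# Hypothetical catalog; only inf2.xlarge's figures come from this page.
CATALOG = [
    {"name": "inf2.xlarge", "vcpus": 4, "on_demand": 0.758200},
    {"name": "example.a",   "vcpus": 4, "on_demand": 0.900000},  # placeholder
    {"name": "example.b",   "vcpus": 8, "on_demand": 0.500000},  # placeholder
]

def cheaper_same_vcpus(catalog: list, target_name: str) -> list:
    """Instances matching the target's vCPU count but with a lower On Demand rate."""
    target = next(i for i in catalog if i["name"] == target_name)
    return sorted(
        (i for i in catalog
         if i["vcpus"] == target["vcpus"]
         and i["on_demand"] < target["on_demand"]),
        key=lambda i: i["on_demand"],
    )

print(cheaper_same_vcpus(CATALOG, "inf2.xlarge"))  # → [] (no cheaper match here)
```

With this catalog the result is empty, matching the "no better alternatives" message above: the only other 4-vCPU entry costs more, and the cheaper entry has a different vCPU count.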