Types and Comparison

Amazon S3 Storage Classes: Features, Limits, and Use Cases

| Storage Class | Purpose | Definition | Differences | Limits | Use Cases | When to Use |
| --- | --- | --- | --- | --- | --- | --- |
| S3 Standard | General-purpose storage for frequently accessed data | Durable, low-latency object storage | Highest availability (99.99%), low latency | Max object size: 5 TB; unlimited total storage | Websites, applications, backups, data lakes | When data needs low latency and frequent access |
| S3 Intelligent-Tiering | Cost-optimized storage for unpredictable access | Automatically moves objects between access tiers | Optimizes costs by shifting data between frequent and infrequent tiers | Min storage duration: 30 days; max object size: 5 TB | Data with unknown or changing access patterns | When you are unsure how frequently data will be accessed |
| S3 Standard-IA (Infrequent Access) | Lower-cost storage for less frequently accessed data | Same durability as Standard but lower availability | Cheaper than Standard; retrieval fees apply | Min storage duration: 30 days; max object size: 5 TB | Disaster recovery, backups, long-term storage | When data is accessed occasionally but must be available immediately |
| S3 One Zone-IA | Infrequent-access storage in a single Availability Zone | Lower resilience, as data is stored in only one AZ | Costs less than Standard-IA but lacks multi-AZ redundancy | Min storage duration: 30 days; max object size: 5 TB | Secondary backups, easily recreatable data | When cost savings matter more than high availability |
| S3 Glacier Instant Retrieval | Archival storage with immediate access | Similar to Standard-IA but priced for archive data | Costs less than Standard-IA but optimized for rarely accessed objects | Min storage duration: 90 days; max object size: 5 TB | Compliance archives, medical imaging | When data is rarely accessed but must be retrievable in milliseconds |
| S3 Glacier Flexible Retrieval | Low-cost archival storage with retrieval flexibility | Long-term, infrequently accessed storage | Cheaper than Glacier Instant Retrieval; retrievals take minutes to hours | Min storage duration: 90 days; max object size: 5 TB | Compliance, historical records, regulatory storage | When access is rare and retrieval speed isn't urgent |
| S3 Glacier Deep Archive | Lowest-cost archival storage | Cold storage for data retained for years | Cheapest option; retrievals take hours (typically up to 12) | Min storage duration: 180 days; max object size: 5 TB | Legal archives, raw research data, financial records | When data is accessed at most once every few years |
| S3 Reduced Redundancy Storage (RRS) | Legacy storage option for non-critical data | Designed for non-critical, easily reproducible data | Lower durability (99.99%) than other classes | Max object size: 5 TB | Temporary data, non-essential backups | Rarely used now; S3 Standard is preferred |
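The storage class is chosen per object rather than per bucket: it is set when the object is written, and a lifecycle rule can later move objects into cheaper classes as they age. Below is a minimal boto3 (Python) sketch of both operations; the bucket name, keys, and day thresholds are placeholder assumptions, and any transition schedule must respect the minimum storage durations in the table above.

```python
import boto3

s3 = boto3.client("s3")

# Write an object directly into a non-default storage class.
# "example-bucket" and the key are placeholder names.
s3.put_object(
    Bucket="example-bucket",
    Key="backups/db-dump-2024-01.sql.gz",
    Body=b"...backup bytes...",          # placeholder payload
    StorageClass="STANDARD_IA",          # or INTELLIGENT_TIERING, GLACIER_IR, ...
)

# Lifecycle rule: age objects under backups/ into cheaper classes over time.
# The day thresholds are illustrative and must honor the minimum storage
# durations listed in the table (30 / 90 / 180 days).
s3.put_bucket_lifecycle_configuration(
    Bucket="example-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-old-backups",
                "Status": "Enabled",
                "Filter": {"Prefix": "backups/"},
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    {"Days": 120, "StorageClass": "GLACIER"},      # Flexible Retrieval
                    {"Days": 365, "StorageClass": "DEEP_ARCHIVE"},
                ],
            }
        ]
    },
)
```

Once an object is in Glacier Flexible Retrieval or Deep Archive it must be restored (for example with `restore_object`) before it can be read again; Glacier Instant Retrieval does not require that extra step.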

Key Limits for S3

  • Bucket limit per AWS account: 100 (can request an increase)

  • Object size limit: 5 TB per object (the 40 TB figure sometimes quoted applies to archives in the legacy Glacier vault API, not to S3 storage classes)

  • PUT/COPY/POST/DELETE requests per second: 3,500 per prefix

  • GET/HEAD requests per second: 5,500 per prefix

  • Total storage: Unlimited
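A single PUT request tops out at 5 GB, so objects approaching the 5 TB limit must be written with multipart upload (AWS recommends it for anything over roughly 100 MB). The sketch below uses boto3's managed transfer, which splits the file into parts automatically; the file name, bucket, and tuning values are illustrative assumptions.

```python
import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client("s3")

# Managed transfer: boto3 switches to multipart upload above the threshold.
config = TransferConfig(
    multipart_threshold=64 * 1024 * 1024,   # switch to multipart above 64 MiB
    multipart_chunksize=64 * 1024 * 1024,   # 64 MiB parts (max 10,000 parts per object)
    max_concurrency=8,                      # upload parts in parallel
)

s3.upload_file(
    Filename="research-dataset.tar",        # placeholder local file
    Bucket="example-bucket",                # placeholder bucket name
    Key="datasets/research-dataset.tar",
    ExtraArgs={"StorageClass": "INTELLIGENT_TIERING"},
    Config=config,
)
```

Besides getting past the single-PUT cap, multipart upload lets a failed part be retried on its own instead of re-sending the whole object.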
