
Amazon Cloud Storage: 7 Factors that Affect Amazon S3 Pricing


One of Amazon’s most popular cloud storage services is Amazon Simple Storage Service (S3). Despite its name, AWS cloud storage pricing for Amazon S3 causes more confusion than any of Amazon’s other cloud storage services. 

If you’re not clear about what levels of latency, redundancy, and accessibility you need, you can end up paying too much, or potentially losing business-critical data due to a lack of replication. To help customers avoid unnecessary cloud spend, we’ll explain the seven factors that affect AWS S3 pricing and share practical tips for getting the most out of your cloud spend with Amazon S3.

What is Amazon S3?

First of all, what is Amazon S3? Amazon S3 is object storage built to store and retrieve any amount of data from anywhere on the internet. 

If you’re looking for a durable, available, and scalable storage solution at the lowest possible cost, Amazon S3 is likely your best option. S3 is designed for 99.99% availability and 99.999999999% (11 nines) of data durability, offers virtually unlimited storage, and protects sensitive data with a range of security and compliance capabilities. Common use cases include big data analytics, websites, mobile applications, backup and restore, archive, enterprise applications, and more.

Sounds great, right? So how much does it cost?

Seven factors that affect Amazon S3 pricing

There are seven factors that affect Amazon S3 pricing. We’ll look at each of the following factors independently to explain what each is and what influence it has on price.

  1. The region where you store your data
  2. The volume of data you store
  3. The level of redundancy
  4. The storage class
  5. Data requests
  6. Data transfers
  7. Data retrievals (Glacier only)

1. The region where you store your data

As of November 2020, there are 21 different regions where it’s possible to store data. Prices are different for each storage region, so you should compare costs for all regions where it’s acceptable for your business to store data. Below is a sample for the Northern California and Oregon regions.

[Image: sample Amazon S3 pricing for the Northern California and Oregon regions]
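If you'd rather compare regional rates programmatically than read the pricing pages, the AWS Price List API can be queried with boto3. The sketch below is illustrative only: it assumes AWS credentials are configured, and the product attribute names used in the filters (such as volumeType) are best-effort assumptions worth verifying against the attributes the API actually returns.

```python
# Sketch: compare S3 Standard storage rates across regions using the
# AWS Price List API. Filter field values (e.g. "volumeType") are
# assumptions; inspect the returned product attributes if this comes
# back empty for your account.
import json
import boto3

pricing = boto3.client("pricing", region_name="us-east-1")  # Price List API endpoint

def standard_storage_rates(location):
    """Return (beginRange GB, endRange, USD per GB-month) tuples for S3 Standard in one region."""
    resp = pricing.get_products(
        ServiceCode="AmazonS3",
        Filters=[
            {"Type": "TERM_MATCH", "Field": "productFamily", "Value": "Storage"},
            {"Type": "TERM_MATCH", "Field": "volumeType", "Value": "Standard"},
            {"Type": "TERM_MATCH", "Field": "location", "Value": location},
        ],
    )
    rates = []
    for item in resp["PriceList"]:                  # each entry is a JSON string
        product = json.loads(item)
        for term in product["terms"]["OnDemand"].values():
            for dim in term["priceDimensions"].values():
                rates.append((float(dim["beginRange"]), dim["endRange"],
                              float(dim["pricePerUnit"]["USD"])))
    return sorted(rates)

for region in ("US West (Oregon)", "US West (N. California)"):
    print(region, standard_storage_rates(region))
```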

2. The volume of data you store

With AWS cloud storage pricing, the more data you store in the Standard storage tier, the less you pay per GB. Like many AWS services, S3 has a free tier that gives you 5GB of Standard S3 storage for the first 12 months after you sign up. Once the free allowance is exhausted, the rate per GB decreases as the total amount stored in the Standard storage tier passes certain thresholds. The following pricing is for S3 Standard storage in the Oregon region (a cost-calculation sketch follows the list):

  • First 50TB per month – $0.0230 per GB
  • Next 450TB per month – $0.0220 per GB
  • Over 500TB per month – $0.0210 per GB
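
To see how these volume tiers combine, here's a minimal Python sketch that estimates the monthly S3 Standard storage charge in the Oregon region using the rates quoted above (rates change over time, so treat the figures as illustrative):

```python
# Estimate the monthly S3 Standard storage charge (Oregon) from the
# tiered rates quoted above. Illustrative; check current AWS pricing.
TIERS = [
    (50 * 1024, 0.0230),        # first 50 TB (expressed in GB) at $0.023/GB
    (450 * 1024, 0.0220),       # next 450 TB at $0.022/GB
    (float("inf"), 0.0210),     # everything over 500 TB at $0.021/GB
]

def monthly_standard_cost(total_gb):
    """Return the estimated monthly charge in USD for `total_gb` of S3 Standard storage."""
    cost, remaining = 0.0, total_gb
    for tier_size, rate in TIERS:
        in_tier = min(remaining, tier_size)
        cost += in_tier * rate
        remaining -= in_tier
        if remaining <= 0:
            break
    return cost

print(f"100 TB -> ${monthly_standard_cost(100 * 1024):,.2f}/month")
# 100 TB -> $2,304.00/month: 50 TB at $0.023/GB plus 50 TB at $0.022/GB
```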

3. The level of redundancy

Data stored in the AWS S3 storage service is highly durable because Amazon replicates data stored in the Standard, Standard-Infrequent Access, and archive (Glacier) storage tiers across a minimum of three Availability Zones (AZs).

Businesses can increase or decrease this level of redundancy according to the nature of the data. For mission-critical data, you can configure S3 to automatically replicate data to another S3 bucket, either across different AWS regions or within the same region. This approach more than doubles your storage costs, because you pay not only the additional per-GB S3 storage cost for the replica, but also the cost of the PUT requests that copy the data into the destination bucket, and potentially the cost of transferring the data from one region to another.

A more cost-efficient approach is to replicate mission-critical data to another region and then migrate that data to a less expensive storage class to reduce disaster recovery costs. Complete Amazon S3 replication pricing is available on the Amazon S3 pricing page.
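
As an illustration of that setup, here's a minimal boto3 sketch of a replication rule that copies new objects to a bucket in another region and lands the replicas directly in a cheaper storage class. The bucket names and IAM role ARN are placeholders, versioning must be enabled on both buckets, and the role must grant S3 the necessary replication permissions:

```python
# Sketch: S3 Cross-Region Replication rule that writes replicas straight
# into the Glacier storage class to keep disaster-recovery copies cheap.
# Bucket names and the IAM role ARN are placeholders.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_replication(
    Bucket="my-primary-bucket",                      # placeholder source bucket
    ReplicationConfiguration={
        "Role": "arn:aws:iam::123456789012:role/replication-role",
        "Rules": [
            {
                "ID": "dr-copy-to-glacier",
                "Priority": 1,
                "Status": "Enabled",
                "Filter": {},                        # replicate every new object
                "DeleteMarkerReplication": {"Status": "Disabled"},
                "Destination": {
                    "Bucket": "arn:aws:s3:::my-dr-bucket-us-east-1",
                    "StorageClass": "GLACIER",       # replicas land in archive storage
                },
            }
        ],
    },
)
```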

4. The storage class

The AWS S3 service allows businesses to store data in different storage classes (also known as tiers) depending on how frequently the data will be accessed, how long you need to store it, and the redundancy and availability required. Frequently accessed data should be placed in the Standard S3 tier because, although the Infrequent Access and Archive tiers cost less per GB stored, they charge more for requests (PUT, COPY, GET, etc.) and for retrieving data.

For the complete list of Amazon S3 storage classes, see our Ultimate Guide to Amazon Cloud Storage Pricing.

Amazon also provides an Intelligent-Tiering service that automatically moves data between the Standard and Standard-Infrequent Access tiers and, as announced at AWS re:Invent 2020, between the Archive Access and Deep Archive Access tiers as well.

The Intelligent-Tiering service can significantly reduce management overhead by automatically moving data to the most cost-effective access tier. But keep in mind that it incurs a monitoring and automation charge of $0.0025 per 1,000 objects monitored, so costs can rack up quickly if the service goes unchecked.
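
The archive tiers aren't enabled by default; they're switched on per bucket. Below is a minimal boto3 sketch, with the bucket name and day thresholds as placeholders (it only applies to objects already stored in the Intelligent-Tiering storage class):

```python
# Sketch: enable the Intelligent-Tiering archive tiers on a bucket so that
# objects untouched for 90/180 days move to archive storage automatically.
# Bucket name and day thresholds are placeholders.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_intelligent_tiering_configuration(
    Bucket="my-analytics-bucket",                    # placeholder bucket
    Id="archive-cold-objects",
    IntelligentTieringConfiguration={
        "Id": "archive-cold-objects",
        "Status": "Enabled",
        "Tierings": [
            {"Days": 90, "AccessTier": "ARCHIVE_ACCESS"},
            {"Days": 180, "AccessTier": "DEEP_ARCHIVE_ACCESS"},
        ],
    },
)
```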

5. Data requests

How you access your data also impacts your S3 costs. S3 costs depend on the request type (PUT, COPY, GET, etc.), the number of requests, and the volume of data retrieved. For example, 1,000 GET requests to a Glacier tier cost only $0.0004, but 1,000 PUT requests to the same tier cost 125 times as much ($0.05) and add up much more quickly. It’s important to examine how you’ll access your data when choosing a storage class.
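
Here's a quick back-of-the-envelope sketch using the Glacier request prices above; the request counts are made up purely for illustration:

```python
# Compare monthly request charges for a hypothetical workload writing and
# reading objects in the Glacier storage class, using the per-1,000-request
# prices quoted above (illustrative; check current AWS pricing).
GET_PER_1000 = 0.0004
PUT_PER_1000 = 0.05

def request_cost(puts, gets):
    """Monthly request charge in USD for the given numbers of PUT and GET requests."""
    return puts / 1000 * PUT_PER_1000 + gets / 1000 * GET_PER_1000

# Example: archiving 2 million small objects a month dwarfs the read-request cost.
print(f"2M PUTs + 100k GETs -> ${request_cost(2_000_000, 100_000):.2f}/month")
# The PUTs account for $100.00 of that total; the GETs cost just $0.04
```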

6. Data transfers

Data transfer costs are incurred when data is transferred out from an S3 storage bucket to the internet or to another region (data transfer into S3 is free). Like the volume discounts for S3 storage, the more data you transfer out, the lower the rate you pay per GB (a quick cost comparison follows the list below).

  • For data transfer out to the internet, the first 1GB/month is free, and after that, rates start at $0.09 per GB (Oregon region) and go as low as $0.05 per GB
  • Outbound data transfers to another region typically cost around $0.02 per GB for most regions in North America and Europe, but can be higher or lower depending on the distance between regions. Outbound data transfers for data originating in Asia Pacific, South America, Africa, and the Middle East are significantly higher
  • Data transfer out to CloudFront is always free
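
For a rough sense of scale, the sketch below compares the two headline rates quoted above ($0.09 per GB to the internet from Oregon versus roughly $0.02 per GB between North American regions). The volumes are hypothetical, and the real internet rate steps down at higher tiers:

```python
# Rough comparison of outbound transfer charges using the headline rates
# quoted above. Illustrative only: the internet rate steps down at higher
# volume tiers, and the first 1 GB per month is free.
INTERNET_RATE = 0.09        # USD per GB, Oregon to the internet (first paid tier)
CROSS_REGION_RATE = 0.02    # USD per GB, typical North America / Europe inter-region rate

def transfer_cost(gb, rate, free_gb=0):
    """Estimated charge in USD for transferring `gb` of data at `rate` USD per GB."""
    return max(gb - free_gb, 0) * rate

gb_out = 5 * 1024  # a hypothetical 5 TB of monthly egress
print(f"5 TB to the internet:   ${transfer_cost(gb_out, INTERNET_RATE, free_gb=1):,.2f}")
print(f"5 TB to another region: ${transfer_cost(gb_out, CROSS_REGION_RATE):,.2f}")
# Roughly $460.71 vs $102.40 -- egress to the internet dominates S3 transfer bills
```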

7. Data retrievals

The last factor that affects Amazon S3 pricing is data retrievals. Retrieving data from the Standard S3 storage class is free of charge. However, if you place data in an Infrequent Access or Glacier storage class, charges apply when you retrieve it (and for requesting it, as outlined above). Pricing for data retrieval is based on the volume of data retrieved, but can vary depending on whether an expedited retrieval or bulk retrieval from an archive storage tier is required.

AWS offers the first 10GB of S3 Glacier retrievals for free, and after that, a per GB fee is charged. Glacier data retrieval request prices range from pennies per 1,000 requests for bulk requests, to $10 per 1,000 requests for expedited requests. If you’re making thousands of requests to retrieve data from archive storage every month, you really need to consider relocating the data.

AWS offers retrieval services called S3 Select and Glacier Select that enable you to retrieve subsets of data rather than entire objects. These services accelerate the speed at which data is retrieved and can save money on data transfer costs.
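
For example, here's a minimal boto3 sketch that uses S3 Select to pull two columns out of a large CSV object instead of downloading the whole thing; the bucket, key, and column names are placeholders:

```python
# Sketch: use S3 Select to retrieve a filtered subset of a CSV object
# rather than transferring the entire object. The bucket, key, and SQL
# expression are placeholders for illustration.
import boto3

s3 = boto3.client("s3")

response = s3.select_object_content(
    Bucket="my-analytics-bucket",                # placeholder bucket
    Key="logs/2020-11/requests.csv",             # placeholder object key
    ExpressionType="SQL",
    Expression="SELECT s.request_id, s.bytes_sent FROM S3Object s WHERE s.status = '200'",
    InputSerialization={"CSV": {"FileHeaderInfo": "USE"}},
    OutputSerialization={"CSV": {}},
)

# The result arrives as an event stream; collect only the record payloads.
for event in response["Payload"]:
    if "Records" in event:
        print(event["Records"]["Payload"].decode("utf-8"), end="")
    elif "Stats" in event:
        stats = event["Stats"]["Details"]
        print(f"\nScanned {stats['BytesScanned']} bytes, returned {stats['BytesReturned']} bytes")
```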

Optimizing cloud costs with Amazon S3

AWS offers paid-for S3 storage management solutions to help you manage, tag, and analyze your inventory of data. The Amazon CloudWatch and AWS CloudTrail services offer free tiers, but they can incur costs depending on the number of dashboards, metrics, alarms, logs, and custom events you use or create each month.

If your head isn’t already exploding with the variety of Amazon S3 storage options and associated costs, you may want to consider an AWS management solution, such as CloudHealth. CloudHealth simplifies S3 cost calculations by analyzing how your data is stored, providing reports that can identify where inefficiencies exist, and helping you to optimize S3 costs over time.

Once optimized, the platform maintains that state through policy-driven automation. You simply create policies that, for example, alert you when S3 costs have exceeded a certain amount for an account or region, so you can take action and migrate objects to a less expensive tier. You can also get reports and alerts based on granular S3 charges like transfer costs, and analyze where data is being transferred to.

You can see how this works in practice by scheduling a free demo. And for more detailed information and best practices for managing and optimizing your Amazon cloud storage costs, see our in-depth eBook: The Ultimate Guide to Amazon Cloud Storage Pricing.