AWS Simple Storage Service (S3) is one of the most popular AWS storage services. It is used for a wide range of use cases such as static websites, blogs, personal media storage, big data analytics, staging areas for Redshift data, and enterprise backup storage. Because of this widespread usage, AWS S3 spend is one of the top five cost drivers among all AWS services for most enterprises. For a deeper analysis of AWS and Azure cloud spend across market segments, check out our Cloud Usage Report.
There are three primary costs associated with S3: storage, charged per GB per month; request costs for operations such as PUT, GET, and LIST; and data transfer costs for moving data out of the AWS region. Despite these expenses, S3 buckets remain popular because they are a durable solution and offer a simple web interface to store and retrieve data. However, enterprises that use S3 buckets for content delivery can drive up their cloud spend quickly.
S3 costs depend on a variety of attributes, and cloud users need to carefully analyze factors such as associated services, instances, and tags to curtail their AWS S3 spend.
If you’re a cloud operations admin or cloud engineer, you’re likely aware of the moving parts of S3 storage: data reads and writes, as well as data moved in and out, are all billable. This means that AWS S3 expenses are influenced by a lot more than just the storage cost. A detailed analysis of all these factors can help you avoid the dreaded AWS S3 bill shock.
Making the most of AWS S3 buckets:
With AWS, you pay for the services you use and the storage units you’ve consumed. If AWS S3 service is a significant component of your AWS cost, then implementing best practices for managing AWS S3 costs becomes critical.
Here are some easy-to-implement checks that can help you manage your AWS S3 costs:
1. Store your data in a compressed format:
While there is no charge for transferring data into an S3 bucket, there is a charge for data storage and for requests such as PUT, GET, and LIST. To avoid paying extra, store your data in a compressed format.
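As a quick illustration, the snippet below compares the raw and gzip-compressed size of a repetitive text payload of the kind you might upload to S3 (the payload itself is hypothetical); the stored bytes, and therefore the storage bill, scale with the compressed size.

```python
import gzip

# Hypothetical payload: 1,000 repeated log lines, the kind of
# repetitive data that compresses extremely well.
raw = ("timestamp,level,message\n"
       + "2019-01-01T00:00:00Z,INFO,request served\n" * 1000).encode()
compressed = gzip.compress(raw)

# S3 bills on stored bytes, so this ratio approximates the storage
# saving for this particular payload.
ratio = len(compressed) / len(raw)
print(len(raw), len(compressed), round(ratio, 3))
```

Real-world ratios depend heavily on the data; highly repetitive logs compress far better than already-compressed media.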
2. Evenly distribute S3 objects:
If S3 objects are distributed evenly across a virtual folder (prefix) structure, fewer file operations are needed to read them, because readers can list and fetch only the prefixes they need. Since LIST and GET operations each carry an additional cost, this leads to lower spend.
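To make the idea concrete, here is a small sketch (the key layout and prefix names are hypothetical) showing how a date-partitioned prefix structure lets a reader request one partition rather than paging through the whole bucket; the `keys_under` helper stands in for a `ListObjectsV2` call with a `Prefix` parameter.

```python
# Hypothetical date-partitioned key layout: 30 days x 4 files each.
keys = [f"logs/dt=2019-06-{d:02d}/part-{i}.gz"
        for d in range(1, 31) for i in range(4)]

def keys_under(prefix, all_keys):
    """Stand-in for a prefixed LIST request (ListObjectsV2 with Prefix=...)."""
    return [k for k in all_keys if k.startswith(prefix)]

# One prefixed LIST returns 4 keys instead of paging through all 120.
one_day = keys_under("logs/dt=2019-06-15/", keys)
print(len(keys), len(one_day))  # 120 4
```

The same layout benefits downstream query engines, which can prune partitions by prefix instead of scanning every object.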
3. Use S3 for hosting static websites:
This helps you avoid EC2 costs and administrative overhead. S3-hosted static sites can scale to millions of users without any manual intervention.
4. Appropriately tag buckets:
Tag buckets appropriately so that you can prevent misuse of S3 resources and easily identify the affected data if a compromise occurs.
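A minimal sketch of what cost-allocation tagging can look like; the tag keys, values, and bucket name are hypothetical, but the `TagSet` payload shape is what boto3's `put_bucket_tagging` call expects.

```python
# Hypothetical cost-allocation tags for a bucket.
cost_tags = {
    "Owner": "analytics-team",
    "Environment": "prod",
    "CostCenter": "cc-1234",
}

# Shape the tags into the TagSet structure the S3 API expects.
tag_set = {"TagSet": [{"Key": k, "Value": v}
                      for k, v in sorted(cost_tags.items())]}

# With real credentials this would be applied as:
#   boto3.client("s3").put_bucket_tagging(
#       Bucket="my-bucket", Tagging=tag_set)
print(tag_set["TagSet"])
```

Once tags like these are activated as cost allocation tags in the billing console, S3 spend can be broken down per team or environment.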
5. Monitor S3 buckets:
Monitor the access patterns of your S3 objects and move objects to the appropriate storage classes. Different storage classes are priced differently, so storing objects in the right class helps reduce costs.
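The pricing gap between classes is large enough to matter. The per-GB figures below are illustrative approximations of 2019 us-east-1 list prices (actual prices vary by region and change over time, and infrequent-access classes add retrieval fees), but they show the shape of the saving:

```python
# Illustrative per-GB-month prices (approximate us-east-1 figures;
# check the current S3 pricing page before relying on them).
PRICE_PER_GB = {
    "STANDARD":    0.023,
    "STANDARD_IA": 0.0125,
    "GLACIER":     0.004,
}

def monthly_cost(gb, storage_class):
    """Storage-only monthly cost; ignores request and retrieval fees."""
    return gb * PRICE_PER_GB[storage_class]

gb = 500  # hypothetical bucket size
for cls in PRICE_PER_GB:
    print(cls, round(monthly_cost(gb, cls), 2))
```

For rarely accessed data, the storage saving usually dwarfs the occasional retrieval fee; for hot data, STANDARD remains the right choice.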
6. Enable “Lifecycle” feature:
With versioning enabled on S3 buckets, it becomes easier to delete unused objects or older versions. AWS Lifecycle Management lets you define time-based rules that trigger ‘Transition’ (moving objects to a different storage class) and ‘Expiration’ (permanent deletion of objects). This limits your S3 costs by reducing the amount of data stored.
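A sketch of what such a rule looks like as a lifecycle configuration; the rule ID, prefix, and day counts are hypothetical, but the structure matches what boto3's `put_bucket_lifecycle_configuration` accepts.

```python
# Hypothetical rule: move logs to STANDARD_IA after 30 days,
# delete them permanently after a year.
lifecycle = {
    "Rules": [
        {
            "ID": "archive-then-expire",
            "Filter": {"Prefix": "logs/"},
            "Status": "Enabled",
            # 'Transition': demote to a cheaper storage class.
            "Transitions": [{"Days": 30, "StorageClass": "STANDARD_IA"}],
            # 'Expiration': permanent deletion.
            "Expiration": {"Days": 365},
        }
    ]
}

# With real credentials this would be applied as:
#   boto3.client("s3").put_bucket_lifecycle_configuration(
#       Bucket="my-bucket", LifecycleConfiguration=lifecycle)
print(lifecycle["Rules"][0]["ID"])
```

On versioned buckets, analogous `NoncurrentVersionTransition` and `NoncurrentVersionExpiration` settings keep old object versions from accumulating silently.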
7. Use compressible formats:
When using S3 for big data analytics or as a staging area for Redshift data, use compact formats such as Avro, Parquet, or ORC, which reduce the amount of S3 storage consumed.
8. Remove unused S3 buckets:
Cloud users often leave S3 buckets in place even when they hold no data. At Nutanix, we’ve discovered unused storage buckets across many of the customers who use Xi Beam, our cloud cost management service. For more on that, check out our Cloud Usage Report, which goes into a lot of detail about cloud usage patterns.
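One way to spot candidates for cleanup is to flag buckets that contain no objects. The sketch below does this against any client exposing the two S3 calls it uses, so an in-memory stub (hypothetical, for demonstration) stands in here; with real AWS you would pass `boto3.client("s3")` instead.

```python
def find_empty_buckets(client):
    """Return names of buckets with zero objects.

    ``client`` needs only ``list_buckets`` and ``list_objects_v2``,
    matching the boto3 S3 client's call shapes.
    """
    names = [b["Name"] for b in client.list_buckets()["Buckets"]]
    return [n for n in names
            if client.list_objects_v2(Bucket=n, MaxKeys=1).get("KeyCount", 0) == 0]

class StubS3:
    """Hypothetical in-memory stand-in mapping bucket name -> object count."""
    def __init__(self, buckets):
        self.buckets = buckets
    def list_buckets(self):
        return {"Buckets": [{"Name": n} for n in self.buckets]}
    def list_objects_v2(self, Bucket, MaxKeys=1000):
        return {"KeyCount": min(self.buckets[Bucket], MaxKeys)}

print(find_empty_buckets(StubS3({"prod-data": 120, "old-test": 0})))  # ['old-test']
```

Note that even an empty bucket can matter for security and governance, so confirm ownership before deleting.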
S3 buckets are also notorious for leading to data breaches. Though this blog focuses on cloud spend, the potential for data breaches is critical to address as well. Given the way S3 buckets are deployed, missing a single checkbox can leave your data exposed, so review the access permissions on your S3 buckets regularly.
9. Use a Cloud Cost Optimization service:
To keep your cloud costs under control on an ongoing basis, there is one last recommendation we’d like to make: use a dedicated cost optimization tool instead of relying on the cloud provider’s free options, which do not go far enough to help you control your costs. There are a lot of options on the market, so which one should you use? Glad you asked.
Xi Beam provides deep visibility and unparalleled insights into the spend in your multi-cloud environment, including AWS. Cloud operators can proactively identify underutilized resources, get specific recommendations to right-size infrastructure services, and easily ensure optimal cloud consumption. Beam’s machine-intelligence-driven reserved instance purchase recommendations help drive deep cost savings and keep your cloud spend under control. If you’re interested in trying Beam and saving on your cloud spend, we offer a free 14-day trial so you can test it out.
Get started with Beam and get those cloud costs under control today!
Disclaimer: This blog may contain links to external websites that are not part of Nutanix.com. Nutanix does not control these sites and disclaims all responsibility for the content or accuracy of any external site. Our decision to link to an external site should not be considered an endorsement of any content on such site.
© 2019 Nutanix, Inc. All rights reserved. Nutanix, the Nutanix logo and the other Nutanix products and features mentioned herein are registered trademarks or trademarks of Nutanix, Inc. in the United States and other countries. All other brand names mentioned herein are for identification purposes only and may be the trademarks of their respective holder(s).