Every year, IT analytics company Flexera releases its State of the Cloud Report, finding that public cloud adoption is skyrocketing, but so is wasted spend. In the 2020 State of the Cloud Report, companies self-estimated their public cloud waste at almost 30%; since organizations tend to underestimate their waste, Flexera puts the real figure closer to 35%. It's not possible to cut wasted spend all the way to 0%, but even a small reduction helps. Saving money where you can matters, because you can then spend it where it's needed most: business goals, product innovation, and so on.
What are the root causes of this wasteful spending in AWS Cloud?
Mismanaged cloud resources: Idle, unused, over-provisioned.
Pricing complexity and difficulty predicting spending.
AWS offers over 200 fully featured services, and with that many options come many decisions.
Because cloud resources are easy to deploy and costs are tightly coupled to usage, companies must rely on good governance and user behavior to manage and optimize costs. When the StormIT team architects technology solutions on Amazon Web Services (AWS), cost optimization is one of our main focus areas. We always aim to lower costs, but doing so requires the kind of knowledge the StormIT team can offer.
Evaluate AWS Cloud spend
It’s important to understand not just what you’re spending, but the value you’re getting in return. A bigger bill doesn’t necessarily indicate a problem if it means you’re growing your business.
The best way to evaluate cloud value is to identify a unit cost tied to a metric that matters to your business, such as new users, subscribers, API calls, or page views. The unit cost is your total AWS spend divided by the number of those units. You can then focus on reducing the unit cost while knowing your business can still grow.
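As a minimal sketch, the unit-cost calculation above looks like this; the bill and user numbers are made up for illustration:

```python
# Unit cost = total AWS spend divided by a business metric you care about.
# The figures below are illustrative, not real billing data.

def unit_cost(total_aws_spend: float, units: int) -> float:
    """Return the cost per business unit (e.g. per active user)."""
    if units == 0:
        raise ValueError("need at least one unit to compute a unit cost")
    return total_aws_spend / units

# Example: a $12,000 monthly bill serving 40,000 monthly active users.
cost_per_user = unit_cost(12_000, 40_000)
print(f"${cost_per_user:.2f} per user")  # $0.30 per user
```

If next month's bill rises to $15,000 but users grow to 60,000, the unit cost actually drops to $0.25, which is the signal that the bigger bill is healthy growth rather than waste.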
8 AWS Cloud Cost Optimization Strategies
The majority of users of Amazon Web Services are familiar with at least some AWS cost optimization best practices, but probably not all of them. Below you will find a condensed list of the main tools for AWS Cloud cost optimization and management. However, which tools will work for you always depends on your architecture and it is almost impossible to tell which of these strategies and tools can bring the most cost savings to your use case.
The StormIT team has more than seven years of experience working with the AWS Cloud, and we can help you implement these recommendations. If you have any questions, please contact us for a quick chat.
1. Use tags in your environment
Tagging helps you organize your resources and track your AWS costs at a detailed level. Categorize resources by owner, purpose, or environment, which helps you organize them and assign cost accountability. You should also enforce at least a minimum standard of tagging quality. You can set up cost allocation tags using either AWS-generated tags or user-defined cost allocation tags.
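To illustrate why tags enable cost accountability, here is a small sketch that groups spend by a cost allocation tag. The line items, tag keys, and amounts are hypothetical; in practice this data comes from the AWS Cost and Usage Report after your tags are activated for cost allocation:

```python
from collections import defaultdict

# Hypothetical cost line items carrying cost allocation tags.
line_items = [
    {"service": "AmazonEC2", "cost": 310.0, "tags": {"team": "web", "env": "prod"}},
    {"service": "AmazonRDS", "cost": 120.0, "tags": {"team": "web", "env": "dev"}},
    {"service": "AmazonS3",  "cost": 45.0,  "tags": {"team": "data", "env": "prod"}},
]

def cost_by_tag(items, tag_key):
    """Aggregate cost per value of one cost allocation tag."""
    totals = defaultdict(float)
    for item in items:
        totals[item["tags"].get(tag_key, "untagged")] += item["cost"]
    return dict(totals)

print(cost_by_tag(line_items, "team"))  # {'web': 430.0, 'data': 45.0}
```

Untagged resources fall into an "untagged" bucket here, which mirrors why enforcing tagging quality matters: anything untagged is cost nobody owns.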
2. Choose the right pricing model
AWS provides a range of pricing models for compute, storage, and other services. Choose the right pricing model to optimize costs based on the nature of your workload. The following pricing models are used across a range of AWS services:
On-Demand: Depending on the service, On-Demand is billed at an hourly rate or in one-second increments (for example, Amazon RDS or Linux EC2 instances). On-Demand is mainly recommended for short-term workloads (one year or less) that have periodic spikes, are unpredictable, or can't be interrupted.
Amazon EC2 Spot Instances: Spot Instances let you take advantage of unused EC2 capacity at discounts of up to 90% off the On-Demand price. Use Spot Instances for fault-tolerant or flexible applications and for test and development workloads, because Spot Instances can be interrupted with a two-minute warning when AWS needs the capacity back. You can combine Spot Instances with RIs and On-Demand Instances using EC2 Auto Scaling.
Commitment discounts: Best for long-term projects and workloads with stable and predictable behavior. Users can select from multiple types depending on their business needs:
Savings Plans: Make an hourly spend commitment (measured in USD per hour) for one or three years and receive discounts across your compute resources (Amazon EC2, AWS Lambda, and AWS Fargate).
Reserved Instances (RIs): Provide a capacity reservation for one or three years when purchased, with discounts of up to 72% compared to On-Demand EC2 pricing. Reservations are also available for Amazon RDS, Amazon Elasticsearch, Amazon ElastiCache, Amazon Redshift, and Amazon DynamoDB.
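A quick way to reason about commitment discounts is a break-even calculation: a commitment is billed whether the instance runs or not, so it only pays off above a certain utilization. The rates below are made-up placeholders, not real AWS prices; check the pricing pages for your instance type and region:

```python
# Illustrative comparison of On-Demand vs. a one-year commitment discount.
# Both hourly rates are assumed for illustration only.

HOURS_PER_YEAR = 8760

def yearly_cost(hourly_rate: float, utilization: float = 1.0) -> float:
    """Cost of running one instance for a year at a given utilization."""
    return hourly_rate * HOURS_PER_YEAR * utilization

on_demand_rate = 0.10   # assumed On-Demand $/hour
committed_rate = 0.062  # assumed committed $/hour (~38% discount)

# The commitment is paid for all 8,760 hours regardless of usage, so it
# beats On-Demand only when utilization exceeds the price ratio:
break_even = committed_rate / on_demand_rate
print(f"Commitment pays off above {break_even:.0%} utilization")
```

With these assumed rates, an instance running less than about 62% of the year is cheaper On-Demand, which is why commitments suit stable, predictable workloads.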
3. Stop paying for idle or low utilized Amazon EC2 or RDS instances
Identify idle or low utilized Amazon RDS instances:
You can use the Trusted Advisor Amazon RDS Idle DB instances check to identify RDS instances that have not had any connection over the last seven days. To reduce costs, stop these DB instances using the AWS Instance Scheduler.
Identify Amazon EC2 instances with low utilization:
You can use AWS Cost Explorer Resource Optimization to get a report of EC2 instances that are either idle or have low utilization. You can reduce costs by either stopping or downsizing these instances. Or you can use AWS Instance Scheduler to automatically stop instances when they are not needed.
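The low-utilization check boils down to averaging CPU metrics and flagging instances under a threshold. A minimal sketch, using hypothetical instance IDs and datapoints (real data would come from Amazon CloudWatch, e.g. via boto3):

```python
# Flag instances whose average CPU utilization sits below a threshold.
# Datapoints and instance IDs are hypothetical stand-ins for CloudWatch data.

def flag_low_utilization(cpu_datapoints, threshold=20.0):
    """Return True if average CPU utilization (%) is below the threshold."""
    return sum(cpu_datapoints) / len(cpu_datapoints) < threshold

instances = {
    "i-0aaa": [5.0, 3.2, 7.1, 4.8],     # mostly idle
    "i-0bbb": [65.0, 72.3, 58.9, 80.1], # busy
}

idle = [iid for iid, cpu in instances.items() if flag_low_utilization(cpu)]
print(idle)  # ['i-0aaa']
```

Instances flagged this way are candidates for stopping, downsizing, or scheduling; always check why utilization is low before acting.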
4. Choose the right type of Amazon EC2 instance
You can analyze EC2 instances with AWS Compute Optimizer and receive right-sizing recommendations based on their utilization data.
5. Start using specific Amazon S3 storage tiers
When users start with Amazon S3, they usually choose the Standard storage tier, which is the right option in most cases. But if you have files you won't need within 30 days, you can leverage other S3 tiers. Use S3 analytics to analyze file-access frequency; it recommends where the S3 Standard-Infrequent Access (S3 Standard-IA) tier can reduce costs. You can also use lifecycle policies to automate moving these files into the lower-cost storage tier. Alternatively, use S3 Intelligent-Tiering, which automatically analyzes your objects and moves them to the appropriate storage tier.
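As a sketch, a lifecycle rule that transitions objects to S3 Standard-IA after 30 days can be expressed as the configuration shape that boto3's `put_bucket_lifecycle_configuration` accepts. The rule ID, bucket name, and prefix are placeholders:

```python
# A lifecycle rule moving objects under a prefix to S3 Standard-IA after
# 30 days. Rule ID, prefix, and bucket name below are hypothetical.

lifecycle_configuration = {
    "Rules": [
        {
            "ID": "archive-old-logs",       # hypothetical rule name
            "Filter": {"Prefix": "logs/"},  # hypothetical prefix
            "Status": "Enabled",
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},
            ],
        }
    ]
}

# Applying it requires AWS credentials, so it is only shown here:
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="my-bucket", LifecycleConfiguration=lifecycle_configuration)
print(lifecycle_configuration["Rules"][0]["Transitions"][0]["StorageClass"])
```

Additional transitions (e.g. to Glacier storage classes after a longer period) can be appended to the same `Transitions` list.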
6. Use the right volume type of Amazon Elastic Block Store (Amazon EBS)
Amazon EBS offers several volume types at different price and performance points. For example, where performance requirements are lower, Amazon EBS Throughput Optimized HDD (st1) storage typically costs about half as much as the default General Purpose SSD (gp2) option. You can read more about every volume type here.
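A back-of-the-envelope comparison makes the difference concrete. The per-GB prices below are assumptions roughly in line with published us-east-1 list prices; always check the current EBS pricing page for your region:

```python
# Rough monthly cost comparison between gp2 and st1 EBS volumes.
# Per-GB-month prices are assumed for illustration.

PRICE_PER_GB_MONTH = {"gp2": 0.10, "st1": 0.045}  # assumed USD prices

def monthly_ebs_cost(volume_type: str, size_gb: int) -> float:
    return PRICE_PER_GB_MONTH[volume_type] * size_gb

size = 1000  # a 1 TB volume
saving = monthly_ebs_cost("gp2", size) - monthly_ebs_cost("st1", size)
print(f"Switching 1 TB from gp2 to st1 saves about ${saving:.0f}/month")
```

The trade-off is performance: st1 is a throughput-optimized HDD, so it suits sequential workloads like log processing, not latency-sensitive databases.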
7. Use Auto Scaling or On-demand features for DynamoDB tables
Automatically scale your DynamoDB table with the Auto Scaling feature. You can enable this feature by using the simple steps described here.
You can also use on-demand mode, which lets you pay per request for reads and writes, so you only pay for what you use.
The key difference between auto scaling and on-demand mode in DynamoDB is that auto scaling lets you control the upper limits of your read and write capacity, while on-demand mode has no capacity settings to manage.
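A rough cost comparison shows when each mode wins. The prices below are illustrative placeholders, not DynamoDB's actual rates; see the DynamoDB pricing page for real numbers:

```python
# Back-of-the-envelope comparison of DynamoDB provisioned capacity
# (as used with auto scaling) vs. on-demand mode. Prices are assumed.

HOURS_PER_MONTH = 730
WCU_PRICE_PER_HOUR = 0.00065        # assumed provisioned write-capacity price
ON_DEMAND_PRICE_PER_MILLION = 1.25  # assumed on-demand price per 1M writes

def provisioned_monthly_cost(avg_wcu: float) -> float:
    return avg_wcu * WCU_PRICE_PER_HOUR * HOURS_PER_MONTH

def on_demand_monthly_cost(writes_per_month: float) -> float:
    return writes_per_month / 1_000_000 * ON_DEMAND_PRICE_PER_MILLION

# A steady workload of 50 writes/second (1 KB items -> ~50 WCU):
writes = 50 * 3600 * HOURS_PER_MONTH
print(f"provisioned: ${provisioned_monthly_cost(50):.2f}")
print(f"on-demand:   ${on_demand_monthly_cost(writes):.2f}")
```

Under these assumptions the steady workload is far cheaper on provisioned capacity; on-demand tends to win when traffic is spiky, unpredictable, or mostly idle.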
8. Reduce your data transfer costs
Data transfer from AWS resources (EC2, S3) to the public internet (your users) can create significant expenditure. If so, consider using the Amazon CloudFront CDN. Dynamic and static web content can usually be cached at Amazon CloudFront edge locations worldwide, reducing the cost of data transfer out (DTO) to the public internet. If you already use Amazon CloudFront and want to know more about possible cost savings, consider reading this article.
But there are also other data transfer cost optimizations based on specific scenarios:
For example, accessing data in Amazon S3 from Amazon EC2 within the same region is free of charge, whereas accessing Amazon S3 from a different region incurs a cost.
Avoid using public IP addresses for internal communication within the same Availability Zone (AZ). If you use private IP addresses for communication (data transfers) between resources inside one AZ, the data transfer is free.
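A rough estimate of the CloudFront DTO saving, using assumed per-GB prices (real rates vary by region and volume tier, and transfer from AWS origins to CloudFront edges is not charged):

```python
# Rough data-transfer-out (DTO) comparison: EC2 direct to the internet
# vs. serving through CloudFront. Per-GB prices are assumed placeholders.

EC2_DTO_PER_GB = 0.09          # assumed EC2 -> internet price
CLOUDFRONT_DTO_PER_GB = 0.085  # assumed CloudFront -> internet price

def direct_cost(gb_out: float) -> float:
    return gb_out * EC2_DTO_PER_GB

def cdn_cost(gb_out: float) -> float:
    # Origin -> CloudFront transfer is free, so only edge egress is billed.
    return gb_out * CLOUDFRONT_DTO_PER_GB

gb = 10_000  # hypothetical 10 TB/month of outbound traffic
saving = direct_cost(gb) - cdn_cost(gb)
print(f"Estimated DTO saving: ${saving:.0f}/month")
```

Beyond the per-GB spread, CloudFront's caching also reduces load on the origin, and volume discounts and private pricing can widen the gap further.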
Do you need help choosing the right cost optimization strategy?
10 Native AWS Cloud Cost Optimization Tools and Services
AWS is aware that cost optimization is something almost every customer will need. Over many years, AWS has built a vast array of tools and services for controlling cloud spend.
1. Amazon CloudWatch
One of the keys to reducing cloud bills is visibility into your services. Amazon CloudWatch is an AWS service for collecting and tracking metrics, monitoring log files, creating alarms on resources, and reacting automatically to changes in your AWS resources.
Example of usage:
You can set up an alarm that notifies you when an EC2 instance's CPU utilization drops below 20%, then investigate why the instance is underutilized and take action.
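The evaluation logic CloudWatch applies to such an alarm can be sketched locally: the alarm fires only when the metric breaches the threshold for several consecutive evaluation periods. The datapoints below are hypothetical 5-minute CPU averages:

```python
# A sketch of CloudWatch-style alarm evaluation: ALARM only when the last
# N datapoints all breach the threshold. Datapoints are hypothetical.

def alarm_state(datapoints, threshold=20.0, evaluation_periods=3):
    """Return 'ALARM' if the last N datapoints are all below the threshold."""
    recent = datapoints[-evaluation_periods:]
    if len(recent) < evaluation_periods:
        return "INSUFFICIENT_DATA"
    return "ALARM" if all(d < threshold for d in recent) else "OK"

cpu = [45.0, 30.0, 18.0, 12.5, 9.8]  # hypothetical CPUUtilization averages
print(alarm_state(cpu))  # ALARM
```

Requiring several consecutive breaching periods avoids alarming on brief dips, which is why CloudWatch exposes the evaluation-periods setting.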
2. AWS Cost Explorer
AWS Cost Explorer lets you see patterns in your AWS spend over time, project future costs, identify areas that need further inquiry, observe Reserved Instance utilization and coverage, and receive Reserved Instance recommendations.
3. AWS Trusted Advisor
Get real-time identification of potential areas for optimization. Cost optimization is one of the five areas checked by Trusted Advisor.
Automated recommendations related to cost optimization:
EC2 reserved instance optimization
Low utilization of EC2 instances
Idle elastic load balancers
Underutilized EBS volumes
Unassociated elastic IP addresses
Idle DB instances on Amazon RDS
4. AWS Budgets
Set custom budgets that trigger alerts when cost or usage exceed, or are forecasted to exceed, a budgeted amount. Budgets can be set based on tags and accounts as well as resource types.
Example of usage:
You can create an overall budget for the whole account or create the budget for specific resources, such as several Amazon EC2 instances or Amazon CloudFront CDN data usage.
5. Amazon S3 analytics and Amazon S3 Storage Lens
Use Amazon S3 analytics – Storage Class Analysis for automated analysis and visualization of Amazon S3 storage patterns to help you decide when to shift data to a different storage class.
Amazon S3 Storage Lens delivers organization-wide visibility into object storage usage and activity trends, and makes recommendations to improve cost efficiency and apply best practices.
6. Amazon S3 Intelligent-Tiering
Delivers automatic cost savings in Amazon S3 by moving data between two access tiers: frequent access and infrequent access.
7. AWS Auto Scaling
Monitors your applications and automatically adjusts resource capacity to maintain steady, predictable performance at the lowest possible cost.
8. AWS Cost and Usage Report (AWS CUR)
After setup, you can receive hourly, daily, or monthly reports that break out your costs by product or resource and by tags you define yourself. The report files are delivered to an Amazon S3 bucket of your choice.
Example of usage:
You can determine which S3 bucket is driving data transfer spend.
9. AWS Compute Optimizer
Recommends optimal AWS resources for your workloads to reduce costs and improve performance by using machine learning. AWS Compute Optimizer analyzes resource utilization to identify AWS resources, such as Amazon EC2 instances, Amazon EBS volumes, and AWS Lambda functions, that might be under-provisioned or over-provisioned.
10. AWS Instance Scheduler
Simple service that enables customers to easily configure custom start and stop schedules for their Amazon EC2 and Amazon RDS instances.
Third-party AWS Cloud Cost Management Tools
There are also third-party tools that can help you with overall cost-effective cloud operations and they usually support checks across multiple public clouds and hybrid workloads.
1. CloudCheckr
The StormIT team understands that AWS cost optimization is an ongoing process. To help you achieve it, we provide our expertise along with access to the CloudCheckr platform. CloudCheckr contains everything you need to manage and allocate costs, optimize spending, and save money in your AWS Cloud environment. It includes products for cost management, cloud security, compliance, resource inventory and optimization, and cloud automation.
Start your cloud project with StormIT and get free access to CloudCheckr Cost Management.
2. CloudHealth by VMware
CloudHealth is a cloud management platform designed to drive increased business value at every stage of the cloud journey. CloudHealth can consolidate data across multiple cloud providers, on-premises environments, and integration partners, to provide visibility across your infrastructure. CloudHealth enhances the transparency of cloud usage and its overall impact on cost, performance, and security.
3. Centilytics
Centilytics is an intelligent cloud management platform that helps organizations on the public cloud manage, secure, and optimize their cloud infrastructure. You can use Centilytics' six-step cost optimization strategy: Resource Rightsizing, Instance Scheduling, Instance Reservation, Reserved Instance (RI) Utilization, Orphaned Resource Termination, and Under-utilized Resource Identification.
Do you have any questions? Contact us and get a free consultation