How to Achieve Cost Optimization using Multi Cloud Environment


There are many factors that IT decision-makers must take into account when developing and implementing new strategies for their organizations. Without a doubt, though, the need to limit and reduce cloud expenses is always a leading consideration. The potential for cloud cost savings is tempting for businesses in every sector.

With that in mind, many organizations are turning to a multi-cloud strategy. Just like any other service your business uses, different cloud providers are good at different things. Finding the right combination of cloud providers to match your business needs and your budget can give your company an extra boost of efficiency while cutting your costs accordingly.

But most organizations find it difficult to map their cost optimization strategies to workload types. While implementing a multi-cloud strategy, cloud architects encounter two critical problems:

(1) How to choose appropriate cloud services to minimize monetary cost in the presence of different pricing policies?
(2) How to meet the different availability requirements of different services?

Monetary cost depends mainly on storage capacity consumption, data-level usage, and network bandwidth consumption. As to the availability requirement, the major concern is which redundancy mechanism (i.e., replication) is more economical for a given data access pattern. In other words, the primary challenge is to find a solution that reduces monetary cost while guaranteeing the required availability.

In this blog, we discuss some use cases for finding appropriate cloud services in a multi-cloud environment, helping organizations reduce cloud costs and improve redundancy.

How Does a Multi-Cloud Strategy Reduce Costs in a Complex Application?

Using a multi-cloud strategy, organizations leverage the best services of different cloud providers and spread critical workloads across them. This practice helps organizations reduce cost and increase redundancy.

Let’s understand this with the example of a face detection system. Suppose an enterprise wants to build a door entry system for shared workplaces, where each company is notified of a visitor based purely on whether that person’s face is in its database. The system should distinguish among customers, prospects, visitors, strangers, patrons, suspects, and so on.

Below are the objectives for building an app:

  • Higher redundancy for globally distributed users
  • Accuracy in face detection
  • Cost optimization

Before considering the above objectives, it is important to know how hard it is to build such an app. It requires sound knowledge of machine learning, artificial intelligence, and other technical skills. The operation also demands huge computing power, whose cost can become an impediment for enterprises.

Now such functionality is affordable and available to everyone. Thanks to APIs like AWS Rekognition and Google Vision, creating a custom face recognition app is no longer a problem from a cost or functional perspective.

To serve these requests effectively, the app needs a compute service and a face recognition API service. Application architects can use either AWS Lambda or Google Cloud Functions for compute. For face recognition APIs, they can use either AWS Rekognition or Google Vision.

Based on these services there can be four sets of options to choose from, listed below:

  1. AWS Lambda and Google Vision
  2. Google Cloud Functions and Google Vision
  3. AWS Lambda and AWS Rekognition
  4. Google Cloud Functions and AWS Rekognition

Here we have calculated pricing for all four scenarios to identify the cheapest solution.

AWS Lambda and Google Vision

AWS Lambda charges:

Let’s take an example in which we allocate 512MB of memory to the function, execute it 5 million times in one month, and it runs for 1 second each time.

Monthly compute charges:

The monthly compute price is $0.00001667 per GB-s and the free tier provides 400,000 GB-s.
Total compute (seconds) = 5M * (1s) = 5,000,000 seconds
Total compute (GB-s) = 5,000,000 * 512MB/1024 = 2,500,000 GB-s
Total compute – Free tier compute = Monthly billable compute GB-s
2,500,000 GB-s – 400,000 free tier GB-s = 2,100,000 GB-s
Monthly compute charges = 2,100,000 * $0.00001667 = $35.007

Monthly request charges:

The monthly request price is $0.20 per 1 million requests and the free tier provides 1M requests per month.
Total requests – Free tier requests = Monthly billable requests
5M requests – 1M free tier requests = 4M Monthly billable requests

Monthly request charges = 4M * $0.2/M = $0.80

Total monthly cost for AWS Lambda = Compute charges + Request charges = $35.007 + $0.80 = $35.81 per month
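The Lambda calculation above can be sketched as a small function. The rates ($0.00001667 per GB-s, $0.20 per 1M requests, and the free tiers) are the ones quoted in this example and may change over time.

```python
def lambda_monthly_cost(invocations, duration_s, memory_mb):
    """Estimate AWS Lambda monthly cost using the rates from this example."""
    free_gb_s, free_requests = 400_000, 1_000_000
    gb_s = invocations * duration_s * memory_mb / 1024   # memory-seconds in GB-s
    compute = max(gb_s - free_gb_s, 0) * 0.00001667      # $ per GB-s
    requests = max(invocations - free_requests, 0) / 1_000_000 * 0.20
    return compute + requests

# 5M invocations, 1s each, 512MB memory
print(round(lambda_monthly_cost(5_000_000, 1, 512), 2))  # 35.81
```

Changing the memory allocation or duration immediately shows how sensitive the compute portion of the bill is to function sizing.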

Now, let’s calculate the pricing of the Google Vision API for 5,000,000 requests per month.

The first 1,000 facial detection requests are free, leaving 5,000,000 – 1,000 = 4,999,000 billable requests.
It costs $1.50 per 1,000 requests.
Cost of 4,999,000 requests = 4,999 * $1.50 = $7,498.50

The total cost of a facial detection app performing 5 million requests per month using AWS Lambda and the Google Vision API is $35.81 + $7,498.50 = $7,534.31 per month.

Google Cloud Functions and Google Vision

Monthly compute charges of Google cloud functions:

The monthly compute price is $0.0000025 per GB-s and the free tier provides 400,000 GB-s.
Monthly compute charges = 2,100,000 * $0.0000025 = $ 5.25

Monthly request charges:

The monthly request price is $0.40 per 1 million requests and the free tier provides 2M requests per month.
Total requests – Free tier requests = Monthly billable requests
5M requests – 2M free tier requests = 3M Monthly billable requests
Monthly request charges = 3M * $0.4/M = $1.2

Total monthly cost of Google Cloud Functions = Compute charges + Request charges = $5.25 + $1.20 = $6.45

The cost of Google Vision is $7,498.50, as calculated above.

Total cost of the Google Cloud Functions and Google Vision combination = $6.45 + $7,498.50 = $7,504.95 per month.

Google Cloud Functions and AWS Rekognition

The cost of AWS Rekognition for 5 million images is $4,196 (tiered per-image pricing).

The cost of Google Cloud Functions is $6.45.

Total cost of Google Cloud Functions and AWS Rekognition = $4,196 + $6.45 = $4,202.45 per month.

AWS Lambda and AWS Rekognition

The cost of AWS Rekognition is $4,196.

Cost of AWS Lambda is $35.81

Total cost of AWS Rekognition & AWS Lambda combination = $4196 + $35.81 = $4231.81 per month.

It is clear that Google Cloud Functions costs less than AWS Lambda, and the AWS Rekognition API costs less than the Google Vision API. Based on these prices and the above calculations, we can conclude:

  • The total cost of AWS Lambda and Google Vision is the highest, $7,534.31 per month.
  • Google Cloud Functions and Google Vision comes second, $7,504.95 per month.
  • AWS Lambda and AWS Rekognition comes third, $4,231.81 per month.
  • Google Cloud Functions and AWS Rekognition is the lowest, $4,202.45 per month.
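These totals can be tallied in a short sketch from the per-service monthly figures (Cloud Functions at $5.25 compute + $1.20 requests = $6.45):

```python
# Per-service monthly costs from the calculations above (USD)
services = {
    "AWS Lambda": 35.81,
    "Google Cloud Functions": 6.45,     # $5.25 compute + $1.20 requests
    "Google Vision": 7498.50,
    "AWS Rekognition": 4196.00,
}

# Every compute/API pairing and its combined monthly cost
combos = {
    (compute, api): services[compute] + services[api]
    for compute in ("AWS Lambda", "Google Cloud Functions")
    for api in ("Google Vision", "AWS Rekognition")
}
for (compute, api), cost in sorted(combos.items(), key=lambda kv: kv[1]):
    print(f"{compute} + {api}: ${cost:,.2f}/month")
```

Sorting by cost surfaces the cheapest pairing first, which is Google Cloud Functions with AWS Rekognition.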

These results make it clear that using multi-cloud services can be costly if implemented without analysis. Done deliberately, however, a multi-cloud combination yields the cheapest solution for a given requirement.

Optimize Storage Costs in a Multi Cloud Environment

Storage in a multi-cloud environment provides a huge opportunity to optimize costs. It is important to note the subtle differences between cloud storage options. While all the major cloud vendors offer an object storage service, they vary significantly in the type of data to be stored, save and retrieval latency, durability guarantees, and proximity to compute resources.

For example, Microsoft Azure offers Block Blobs for object storage and Storage Disks for I/O-intensive read/write operations. Block Blob storage comes at a lower price but has high latency, so it is not suitable for high-performance, frequent read/write workloads. Google, on the other hand, offers Nearline storage at a low cost that’s suitable for file systems; be aware that Nearline storage has higher latency. If you are looking for archival storage, it may make sense to choose a single cloud provider to keep storage management to a minimum. If redundancy and cost optimization are important, though, you may want to consider archiving on multiple clouds.

Cloud providers focus on delivering “four nines”, i.e., cloud availability of 99.99%. Though this looks sufficient, a monthly downtime of even 4.32 minutes (0.01%) can mean significant loss for a high-traffic application, and this availability alone is not enough to meet the SLAs of enterprise customers. High-end applications require “five nines” availability. To ensure this, the client decomposes its data, reformulates queries accordingly, and replicates each decomposed relation (chunk) to at least two cloud providers. Because each chunk is stored with more than one provider, availability rises from four nines (99.99%) to at least five nines (99.999%). If one chunk storage provider goes down, another provider serves the chunks that were stored with the failed provider.
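The replication arithmetic behind this claim is simple: a chunk stored with two independent providers is unavailable only when both fail at the same time (assuming independent failures).

```python
single = 0.9999                  # one provider: "four nines" availability
both_down = (1 - single) ** 2    # both providers down at once (independence assumed)
replicated = 1 - both_down
print(f"{replicated:.8f}")       # roughly 0.99999999 -> well above five nines
```

Two four-nines providers in parallel actually give closer to eight nines, so the five-nines SLA target is met with room to spare.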

Let’s understand storage cost optimization by taking an example of photo sharing application.

A photo sharing app requires object storage to store and display photos uploaded by their end-users. Their current end-user customer base is about 10,000, and on average, 5,000 users upload around 80,000 photos per month. The approximate average size of each photo is around 1MB. Each photo is viewed around three times per month.

Let’s calculate the tentative storage costs incurred by this photo sharing site using the three major cloud providers. The website has different storage requirements for object storage, file storage, and data analysis logs.

Object Storage

AWS S3

  • Total storage per month: 80,000 images x 1MB = 80,000MB = 80GB
  • Storage cost for 80GB: 80GB x $0.023 = $1.84 per month
  • Total PUT/POST requests per month: 80,000 requests
  • PUT/POST request cost: 80,000 x ($0.005/1000) = $0.40 per month
  • Total GET requests per month: 80,000 photographs x 3 views = 240,000 requests
  • Total GET request cost: 240,000 x ($0.004/10000) = $0.096 per month
  • Total data transfer OUT cost from Amazon S3 to the internet: 80GB of data x 3 views x $0.09 per GB = $21.60 per month

Total cost = storage cost + PUT/POST request cost + GET request cost + data transfer cost = $23.936 per month

Azure Blob storage

Storage cost: 80GB x $0.0208 = $1.664 per month
PUT/Write request cost: $0.40 per month
Read/GET request cost: $0.096 per month
Data transfer cost: 80GB of data x 3 views x $0.087 per GB = $20.88 per month

Total cost = $1.664 + $0.40 + $0.096 + $20.88 = $23.04 per month

Google Cloud Standard Storage

Storage cost for multi-region: 80GB x $0.026 = $2.08 per month
PUT/Write request cost: $0.40 per month
Read/GET request cost: $0.096 per month

Total cost = $2.08 + $0.40 + $0.096 = $2.576 per month (data transfer charges are not included in this estimate)
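The three object storage estimates above can be reproduced in a few lines. Unit prices are the ones quoted in this example, and the Google figure omits egress, as noted.

```python
size_gb, uploads, views = 80, 80_000, 3
reads = uploads * views  # 240,000 GET/read requests per month

# AWS S3: storage + PUT/POST + GET + data transfer out
s3 = (size_gb * 0.023 + uploads * 0.005 / 1_000
      + reads * 0.004 / 10_000 + size_gb * views * 0.09)
# Azure Blob: storage + write + read + data transfer out
azure = size_gb * 0.0208 + 0.40 + 0.096 + size_gb * views * 0.087
# Google Cloud Standard: storage + write + read (egress not included here)
gcs = size_gb * 0.026 + 0.40 + 0.096

print(f"S3: ${s3:.3f}  Azure: ${azure:.2f}  GCS: ${gcs:.3f}")
```

Because data transfer dominates the S3 and Azure totals, the comparison changes dramatically once egress is (or is not) counted.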

File Storage

The app also needs to store 200 GB of data that can be accessed by multiple virtual machines at the same time, which calls for a managed Network File System (NFS).

Amazon EFS cost: 200 GB x $0.30 per GB/month = $60
Azure Files cost: 200 GB x $0.10 per GB/month = $20
Google Cloud cost: 200 GB x $0.20 per GB/month = $40

Cold storage

The photo sharing app generates lots of log files (~10TB per month) from its data analysis that must be retained for a long period due to compliance constraints. However, there is no immediate need to access this log data, so the company can keep it in cold storage.

Amazon Glacier cost: 10 TB (10,000 GB) x $0.004 = $40 per month
Azure Cool Blob storage cost: 10 TB (10,000 GB) x $0.0152 = $152 per month
Google Coldline cost: 10 TB (10,000 GB) x $0.007 = $70 per month

According to the above calculations, storing files across multiple clouds is highly beneficial: such a photo sharing application should use Google Cloud Standard storage for objects, Azure Files for shared file storage, and Amazon Glacier for cold storage.
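The per-tier picks behind this recommendation can be tallied from the monthly figures calculated above:

```python
# Monthly cost per provider for each storage tier (USD, from the examples above)
tiers = {
    "object": {"AWS S3": 23.936, "Azure Blob": 23.04, "GCS Standard": 2.576},
    "file":   {"Amazon EFS": 60, "Azure Files": 20, "Google Cloud": 40},
    "cold":   {"Amazon Glacier": 40, "Azure Cool Blob": 152, "GCS Coldline": 70},
}
for tier, options in tiers.items():
    cheapest = min(options, key=options.get)
    print(f"{tier}: {cheapest} at ${options[cheapest]}/month")
```

Each tier has a different winner, which is exactly why a single-provider deployment leaves money on the table here.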

Moreover, dividing the app’s storage into chunks makes it more secure. The client maintains a mapping table of the various relations, chunk names, chunk sequence, and storage locations, and each chunk is given a random name. Even if an adversary extracts some information from individual chunks, it does not know the proper order of the chunks needed to make the information valuable, so the data stays secure. Splitting data into smaller chunks also greatly restricts data mining attacks, since each chunk contains an insufficient amount of data.
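A hypothetical sketch of the client-side mapping table just described; the function name, record shape, and provider list are illustrative, not a real library API:

```python
import secrets

def split_into_chunks(data: bytes, chunk_size: int, providers):
    """Assign each chunk a random name and a provider; only the client
    keeps the table recording the true order and storage locations."""
    mapping = []
    for seq, start in enumerate(range(0, len(data), chunk_size)):
        name = secrets.token_hex(16)            # random, meaningless chunk name
        provider = providers[seq % len(providers)]
        mapping.append({"seq": seq, "name": name, "provider": provider})
        # the bytes data[start:start+chunk_size] would be uploaded to
        # `provider` under `name` here
    return mapping

table = split_into_chunks(b"x" * 1000, 256, ["aws", "azure", "gcp"])
print(len(table))  # 4 chunks spread across three providers
```

Without the mapping table, a provider (or attacker) holding some chunks sees only randomly named fragments with no recoverable order.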

Cost optimization in a multi-cloud disaster recovery system

Businesses must maintain their data in different locations to comply with service-level agreements and to ensure business continuity in the event of a disaster. Though disaster recovery does not cost much compared to other cloud services, it is still not a small investment. Disaster recovery cost can be reduced by avoiding traditional methods such as third-party migration tools in favor of the data transfer services offered by cloud providers, which now migrate data to another cloud platform at lower cost.

For example, an organization sends multiple SQL Server backups to Azure. It has 1TB of data on Azure Blob storage and wants to store the same data on AWS Glacier for longer-term disaster recovery. The average data volume is 1TB/month.

There are two ways to accomplish data transfer:

1 – Using third party tools
2 – Using AWS Database Migration Service.

Cost of third party tool

Let’s take the example of Cloudsfer, which provides migration from one cloud to another and charges $25 per 20 GB. At that rate, migrating 1TB (1,000 GB) of data from Azure to AWS works out to 50 * $25 = $1,250.

Cost of AWS Database Migration Service (DMS)

AWS DMS uses the C4 instance family to accomplish data migration. Here we use c4.large, which includes 100GB of GP2 network-attached storage, transfers roughly 100GB per hour, and costs $0.175 per hour. To transfer 1TB of data we therefore need 10 c4.large instance-hours.

Cost of 10 c4.large instance-hours = $0.175 * 10 = $1.75

Now, let’s calculate the data transfer cost. AWS charges $0.01 per GB for data transferred from Azure to AWS over a VPN connection.

Data transfer cost= $0.01*1000 GB = $10

Total cost of migrating from Azure server to AWS = $1.75 + $10 = $11.75

As you can see, the cost of DMS is very low compared to third-party tools.
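The two options can be compared in a quick sketch, using Cloudsfer’s listed rate of $25 per 20 GB and the DMS instance-hour plus per-GB transfer charges from above:

```python
data_gb = 1_000  # 1 TB of SQL Server backups

# Third-party tool: flat $25 per 20 GB migrated
third_party = data_gb / 20 * 25

# AWS DMS: one c4.large hour per 100 GB, plus $0.01/GB transfer in
dms = data_gb / 100 * 0.175 + data_gb * 0.01

print(f"Cloudsfer: ${third_party:.2f}, AWS DMS: ${dms:.2f}")
```

Even if the per-GB rates drift, the two orders of magnitude between the approaches make the conclusion robust.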

Opportunities to reduce cost in multi cloud environment

Multi cloud deployment and integration is a three-step process. First, apply policies to determine the best place to host an application component. Second, deploy the component in the selected cloud – a private cloud or one of your multiple public clouds. Finally, provide the necessary IP address information to tie the new hosting location into your application workflow.

Because every public cloud service manages IP addressing a little differently, it’s often necessary to customize deployment steps for the target cloud. This means it’s easy to make a mistake, and potentially expensive to recover from it. In some cases, adding a cloud provider can increase deployment and redeployment costs by over 50%.

Costs tend to rise in a multi-cloud environment in the following areas:

  • Geographical region
  • Storage
  • Rate of consumption
  • Data movement

Geographical region – Major cloud providers have data centers across the globe, and pricing changes by region. For example, Azure Blob storage for the first 50 TB/month costs approximately $0.0192 per GB in Australia East but only $0.0128 per GB in North Europe. In our earlier multi-cloud storage use case we explained how a combination of different clouds can reduce cost.

Rate of consumption – Pricing is generally tiered to the rate of consumption of cloud storage. For example, in US West, AWS offers the first 50 TB/month of standard storage at $0.023 per GB, the next 450 TB/month at $0.022 per GB, and anything over 500 TB/month at $0.021 per GB. Thus it’s advisable to store related files in one cloud.
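Tiered pricing like this is easy to model. A sketch using the S3 US West rates quoted above:

```python
def s3_storage_cost(gb):
    """Monthly S3 Standard storage cost with the tiered rates above."""
    tiers = [(50_000, 0.023), (450_000, 0.022), (float("inf"), 0.021)]
    cost, remaining = 0.0, gb
    for tier_gb, price_per_gb in tiers:
        used = min(remaining, tier_gb)  # fill this tier before the next
        cost += used * price_per_gb
        remaining -= used
        if remaining <= 0:
            break
    return cost

# 600 TB: 50 TB @ $0.023 + 450 TB @ $0.022 + 100 TB @ $0.021
print(round(s3_storage_cost(600_000), 2))  # 13150.0
```

Consolidating related data with one provider pushes more of it into the cheaper upper tiers, which is the point the paragraph makes.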

Data movement – Moving data from one cloud to a database, or between multiple clouds, attracts significant costs. Poorly planned multi-cloud deployments can more than double the cost by weaving workflow traffic in and out of various public clouds. For example, Azure Hot Blob storage with geo-redundant storage is charged an extra $0.10 per 10,000 container operations, $0.004 per 10,000 other operations, and another $0.02 per GB of geo-replication data transfer.

Some applications, however, require hosting in certain locations to overcome performance and latency issues. In such cases the application should use a microservice architecture: the microservices are hosted with multiple cloud providers and treated as separate applications. Rather than communicating across cloud providers, each microservice cluster links back to the company VPN. Workflows then pass through only one cloud provider boundary, making the cost comparable to that of a single provider.

For bandwidth-heavy workloads that run over the network connection between clouds, dedicated links such as AWS Direct Connect and Google Cloud Interconnect reduce the network costs into and out of the clouds. All data transferred over a dedicated connection is charged at the reduced Direct Connect data transfer rate rather than internet egress rates.

Conclusion

In this blog, we have looked at the main areas where organizations overspend when using a single cloud provider, and calculated tentative costs across different cloud services, while maintaining efficiency, to identify the lowest-priced option. If you believe you are being hit with extra costs, follow the steps above to optimize your multi-cloud spend. These cost-reduction methods can work for any organization that wants to adopt a multi-cloud environment.

Jignesh Solanki

A thought leader, Jignesh leads Simform's Product Engineering team by DevOps Institutionalization, end-to-end product development and consulting led transformation programs.
