Cloud Development, Code Quality, and Cost Savings
Published: August 26, 2018
Last Updated: October 29, 2022

Are you considering moving applications to the cloud and questioning whether your current development staff is ready for the move? Rightfully so. In the cloud, where everything costs a penny here and a nickel there, organizations are putting a fresh focus on cost. When you pay for computing resources as you go instead of buying a flat allocation up front, code quality starts to equate to cost savings, and a lack of it runs up the tab. In this article we cover a brief history of purchasing behaviors for computing resources, how that model is changing, and how cut-rate developers working on high-volume applications could be costing you a fortune!
Application Infrastructure Purchasing Evolution
When programming first emerged, computational resources were a huge constraint on what software businesses could build and sell. Over time, memory and computing capacity increased, ushering in the information age. As costs fell, client-side computing resources grew, and applications became more complex and feature-rich. This has bred a generation of web developers who churn out code with little regard for how many bytes are transmitted, how much compute is required, or what the impact is on storage.

IT departments typically look at current computing usage, extrapolate it over a planning period (say, five years), and purchase hardware to support the projected growth. Once the capital outlay is made, there is no incentive to conserve the excess capacity, because costs remain static regardless of consumption. IT departments also don't like critical applications experiencing downtime or latency, so these capital outlays usually budget for excess capacity even at the end of the depreciation period, again promoting a culture of programmers unconstrained by computing resources. Enter the cloud: a model where you pay for only the computing resources you need, at the time you need them.

Instead of costs that are static for the life of the application hardware, cloud development lets costs grow along the same curve as your application's resource consumption. In the old model, when you have purchased and planned for static computing capacity that exceeds your consumption estimates for the next five years, is there any reason to worry about what the consumption curve looks like? Hint: no, there is not. When costs correlate directly with consumption, there is finally an incentive to conserve the computing resources an application uses. Companies that want to compete on software price, or simply control costs, now need a cloud development staff that can think about more than meeting the next deadline and obtaining delivery sign-off. Compute costs grow in ways that may not become apparent until the amount of data in the application grows sufficiently.
Cloud Development Cost Example
Let's consider a feature common to most applications: a report! In a hypothetical example, a small development company builds a boutique accounts payable application. It exposes a report that lets customers view their invoices so they can process them. The report allows customers to search by date range, but since the most recent invoices appear at the top anyway, most customers click search as soon as they hit the page without filtering the results. The result: every one of a customer's invoices is returned on each search. The developers pulled in a cool little JavaScript control that implements client-side paging of the report, so the screen doesn't get too cluttered. Since the company is new, and its customers are mostly new, the volume of invoices in the database is low. The report returns results in a speedy fashion and everyone is happy.

Fast forward two years. The company has gained traction and new customers are coming on board quickly, but existing customers are getting frustrated with the application's performance. Pages load slowly, the volume of invoice data keeps growing, and customers who cannot process their invoices in a timely manner are holding our fictitious company accountable. The decision is made to hire a database administrator to help improve the application's performance. Because database administrators can tune a database and improve performance blindly (without understanding, or even looking at, the application's code or design), the administrator spots some missing indexes, adds them to support the report, and customers are back to being happy.
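As an illustration, here is a minimal sketch of the kind of index the administrator might add. The `invoices` table, its columns, and the report query are hypothetical stand-ins for the scenario above, not the fictitious company's actual schema.

```python
import sqlite3

# In-memory database standing in for the application's invoice store.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE invoices (
           id INTEGER PRIMARY KEY,
           customer_id INTEGER,
           invoice_date TEXT,
           amount REAL
       )"""
)

# The report filters by customer and sorts newest-first, so an index on
# (customer_id, invoice_date DESC) lets the database seek straight to one
# customer's rows in report order instead of scanning the whole table.
conn.execute(
    "CREATE INDEX idx_invoices_customer_date "
    "ON invoices (customer_id, invoice_date DESC)"
)

# The report query the index supports.
rows = conn.execute(
    "SELECT id, invoice_date, amount FROM invoices "
    "WHERE customer_id = ? ORDER BY invoice_date DESC",
    (42,),
).fetchall()
```

The index restores response times, but notice what it does not change: the query still returns every invoice the customer has.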
What may not be apparent is that even though customers are happy again once performance is restored, you are still silently racking up cloud computing costs. Why? Because the fix above never addressed the real issue: all of a customer's invoices are returned every time the report is pulled. A savvy cloud developer will recognize that this report needs database paging. Database paging pulls and loads into memory only the first page of query results, then loads subsequent pages of invoices as they are requested. Our current scenario queries and loads all invoices into memory and pages them on the client side (the end user's computer). That translates directly into extra compute costs. The existing report, though performing well from a response-time perspective, could be costing us $1.50 in compute resources per query run when it could cost $0.05 if implemented by a cloud developer who understands these concerns. As the volume of invoices grows, so do the bytes transmitted over the network, loaded into memory, and processed by the CPU: in a year's time the same report could cost $2.00 per run, versus $0.07 if optimized for the cloud. These are hypothetical numbers, but very real concerns; at, say, 10,000 report runs a month, that is the difference between $20,000 and $700 in monthly compute spend. Repeated across an enterprise's applications, modules that were not developed in a cost-optimized manner add up quickly and produce a death-by-a-thousand-cuts cost run-up. Scenarios like this are quite common, and many organizations aren't prepared for the transition.
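To make the difference concrete, here is a minimal sketch contrasting the two approaches, reusing the hypothetical `invoices` table from the sketch above. The page size is an assumption for illustration.

```python
import sqlite3

PAGE_SIZE = 50  # hypothetical page size

def fetch_all_invoices(conn, customer_id):
    """Client-side paging: every invoice crosses the network and is held
    in memory, even though the user only ever looks at the first page."""
    return conn.execute(
        "SELECT id, invoice_date, amount FROM invoices "
        "WHERE customer_id = ? ORDER BY invoice_date DESC",
        (customer_id,),
    ).fetchall()

def fetch_invoice_page(conn, customer_id, page=0):
    """Database paging: only one page of rows is read, transmitted, and
    held in memory per request."""
    return conn.execute(
        "SELECT id, invoice_date, amount FROM invoices "
        "WHERE customer_id = ? ORDER BY invoice_date DESC "
        "LIMIT ? OFFSET ?",
        (customer_id, PAGE_SIZE, page * PAGE_SIZE),
    ).fetchall()
```

One design note: OFFSET still forces the database to walk past the skipped rows, so for very deep pages a keyset approach (filtering on the last invoice date seen) scales better. Either way, only a page's worth of data ever leaves the database.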
Cloud Development Cost Control
How can you avoid these types of silent killers?
- Code review your applications before moving them to the cloud.
- Enable performance monitoring on your applications and analyze the data to identify bottlenecks before migration (see the sketch after this list).
- Engage consultants to fill in expertise gaps.
- Train development staff for cloud computing.
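As a starting point for the monitoring item above, here is a minimal sketch of instrumenting queries to surface slow or oversized results before a migration. The thresholds and labels are assumptions; in production you would more likely reach for a dedicated APM or monitoring tool.

```python
import logging
import time
from contextlib import contextmanager

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("query-monitor")

SLOW_MS = 200     # hypothetical latency threshold
MAX_ROWS = 1_000  # hypothetical result-size threshold

@contextmanager
def monitored(label):
    """Time a block of work and flag it when it runs long."""
    start = time.perf_counter()
    try:
        yield
    finally:
        elapsed_ms = (time.perf_counter() - start) * 1000
        if elapsed_ms > SLOW_MS:
            log.warning("%s took %.0f ms", label, elapsed_ms)

def check_result_size(label, rows):
    """Flag queries returning far more rows than any screen can show:
    a hint that client-side paging is hiding a full-table pull."""
    if len(rows) > MAX_ROWS:
        log.warning("%s returned %d rows", label, len(rows))

# Usage, with the hypothetical report query from earlier:
# with monitored("invoice report"):
#     rows = fetch_all_invoices(conn, customer_id=42)
# check_result_size("invoice report", rows)
```

Flagging queries that return thousands of rows is exactly how the invoice-report problem above would surface before, rather than after, the cloud bill arrives.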
In this article we discussed how cloud development brings additional concerns around code quality, and how code quality equates to cost savings and competitive advantage. We covered how legacy purchasing models fostered applications that will yield suboptimal performance and costs in the cloud, walked through a common enterprise example that can lead to increased costs, and finally listed a few ways to get cloud-ready before moving your applications. Contact us if you have any questions.