
Software To Change In A Big Way!
Moore’s Law holds that the number of transistors on a chip doubles roughly every 24 months as feature sizes shrink, making hardware about twice as capable. We are reaching the end of silicon-based efficiency gains because the laws of physics mean the etching on chips cannot get much smaller.
Don’t take our word for it – Gartner forecast, as of January 2022, that the current tech stack would reach end-of-life within three years – which is about now!
Some say the age of quantum computing lies ahead.
Perhaps for some scientific applications or weather forecasting, but for business computing, quantum computers are not going to be a big seller any time soon. Applications would have to be rewritten in languages for which there are few, if any, programmers. Quantum computers are not the next big thing.
A.I. is the next big thing – but it is hamstrung by a lack of energy – as in electric power.
Highly efficient, optimized software stacks are the future. And that future is at hand!
The good news is these tech stacks optimize I/O – thus reducing the need for ever-larger data centers.
Software, particularly the DevOps movement, has been the lazy uncle in this equation for over a generation. Software has never had a major architectural innovation that improved speed or efficiency. It has tagged along with faster hardware, letting cheaper processors hide the inefficiency of its bloat.
A typical tech stack, from the hardware up, has a data management layer, middleware, virtualization, security, app code, and a user interface. Each layer is general purpose. That means every layer carries every conceivable feature, 95% of which any given customer never needs. Yet those features remain and must be supported.
Each layer introduces I/O wait states. Every I/O wait state means the CPU is wasting time not doing anything productive; the application is flailing. A flailing app eats up energy and compute resources, with no productive business outcome.
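The flailing-app claim can be made concrete. A minimal Python sketch (our illustration, not the original author's code): during a blocking wait the process accumulates wall-clock time but almost no CPU time, which is exactly what an I/O wait state looks like from the operating system's point of view. `time.sleep` stands in for a blocking read.

```python
import time

def blocking_io_simulation(wait_s: float) -> tuple[float, float]:
    """Simulate a blocking I/O wait; return (wall_time, cpu_time) consumed."""
    wall_start = time.perf_counter()   # wall-clock time
    cpu_start = time.process_time()    # CPU time actually charged to this process
    time.sleep(wait_s)                 # stands in for a blocking read: the CPU sits idle
    return (time.perf_counter() - wall_start,
            time.process_time() - cpu_start)

wall, cpu = blocking_io_simulation(0.5)
print(f"wall time: {wall:.2f}s, CPU time: {cpu:.4f}s")
# Wall time passes; almost no CPU time accrues. The app holds memory,
# connections, and a core's attention while producing nothing.
```

Multiply that idle slice by every layer in the stack and every transaction, and the energy cost of wait states becomes visible.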
Software engineers once delivered purpose-built apps based on knowledge of how a CPU works. They wrote applications that pipelined data in its most efficient form.
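What "pipelining data in its most efficient form" means can be sketched in a few lines of Python (a hypothetical illustration, assuming a simple parse/filter/total job): each stage hands one record at a time to the next, so the processor streams through the data instead of materializing it all and stalling between layers.

```python
def parse(lines):
    """Stage 1: turn raw lines into integers, one at a time."""
    for line in lines:
        yield int(line)

def filter_valid(values):
    """Stage 2: drop negative (invalid) readings as they flow past."""
    for v in values:
        if v >= 0:
            yield v

def total(values):
    """Stage 3: consume the stream and accumulate a result."""
    s = 0
    for v in values:
        s += v
    return s

raw = ["3", "-1", "4", "1", "-5", "9"]
print(total(filter_valid(parse(raw))))  # -> 17
```

Memory use stays constant no matter how long the input is; no stage waits for another to finish a full pass.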
Today’s software is too expensive to license, too difficult to maintain, and too slow to offer anything close to business agility.
Then there are the secondary costs of inefficient software: data centers sprawl across entire city blocks and consume an estimated 3% – 5% of the energy grid.
Eliminating or reducing the data center footprint is now the largest energy-saving opportunity for a major corporation. As the need to conserve power increases, firms will pay more attention to lowering the energy footprint of those data centers.
The best option at hand to reduce those costs and save energy is to dramatically increase the efficiency of software.
That efficiency – measured by the elimination of I/O wait states, which keeps the processors busy – is fast becoming the domain of the I/O-optimizing tech stack and its micro-apps.
Each micro-app has built into it the full tech stack it needs to operate. The data management, middleware, security, and even the GUI are purpose-built for that micro-app.
If the micro app needs to manage cell call billing, its data management understands how cell phones generate transactions. If it needs to manage an electric meter with unique types of data feeds, that comes as part of the data management layer.
The result is efficient use of the machine's processor. Reducing I/O wait states can make a micro-app run 1,000 to a million times faster. That’s a pretty good result if you are looking for transformation.
Low I/O wait state software is a difference in kind, not just in degree.
The micro-app costs one-tenth as much as an app built on traditional technology. It uses 85% less storage. It can be built from scratch in a quarter.
One of the most common ways to introduce Low I/O wait state software is to build a parallel app for a QA process – a digital twin.
One can take a typical billing system, build a parallel app in a quarter, watch it run 1,000 times faster, use it as a QA checker for every transaction, and then – after a month, a year, or several years – eliminate the legacy app and reap the benefits.
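The QA-checker step of that migration can be sketched as a shadow comparison: feed every transaction to both engines and flag any disagreement. A minimal Python illustration, with hypothetical stand-in billing rules (the function names and the per-minute rate are our assumptions, not the product's API):

```python
from typing import Callable, Iterable

def shadow_compare(transactions: Iterable[dict],
                   legacy_bill: Callable[[dict], float],
                   micro_bill: Callable[[dict], float]) -> list[dict]:
    """Run both billing engines on every transaction; collect mismatches."""
    mismatches = []
    for txn in transactions:
        expected = legacy_bill(txn)   # legacy app stays the system of record, for now
        actual = micro_bill(txn)      # parallel micro-app under test
        if abs(expected - actual) > 0.005:   # half-cent tolerance
            mismatches.append({"txn": txn, "legacy": expected, "micro": actual})
    return mismatches

# Hypothetical billing rules, for illustration only: $0.10 per minute.
legacy = lambda t: t["minutes"] * 0.10
micro = lambda t: round(t["minutes"] * 0.10, 2)

txns = [{"minutes": m} for m in (12, 305, 7)]
print(shadow_compare(txns, legacy, micro))  # -> [] (every transaction agrees)
```

Once the mismatch list stays empty over a long enough run, the legacy system can be retired with confidence.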
This often cuts a data center's energy draw – Low I/O tech can run on less power than a table lamp.
Energy reduction, greenhouse gas reduction, cost reduction, and increased business agility – these are all readily achievable with increased software efficiency provided when you move to Low I/O wait state tech.