Despite a nearly 8% drop in IT spending in 2020, driven by the global pandemic and the reduced operations of many organizations, IT spending as a whole remains high, growing by as much as 5% per year on average and expected to exceed $4 trillion in 2021.
IT budgets are notorious for their sprawl and the waste that sprawl entails. In fact, it’s estimated that as much as 50% of an organization’s total technology spend is managed outside of the IT department, while the systems IT does run tend to be legacy systems that are expensive to manage. Dell estimates that roughly 70% of applications in use at Fortune 5000 companies were developed 20 or more years ago, and that 60-80% of the IT budget is allocated to maintaining existing on-premise hardware and legacy software applications.
Let’s take a look at just how much money an organization can potentially save by modernizing IT infrastructure, what that entails, and how the investment to make those changes results in positive ROI faster than you might expect.
A Blueprint of IT Expenses in 2021
An IT budget consists of a number of major components: compensation for IT professionals on staff and outside consultants; the core back-office systems that power your organization, such as ERP, accounting, finance, and HR applications; hardware expenditures; and networking equipment and infrastructure. More modern line items include data center costs and bills for the cloud applications or platforms running business systems.
One of the most costly elements of the IT budget is the maintenance of legacy systems, including on-premise servers, hardware infrastructure, and out-of-date software that could be replaced and augmented by cloud solutions. Consider the federal government, notorious for holding on to outdated legacy systems well after their shelf life has expired. A recent GAO report found that 80% of the $90 billion allocated for IT by the US government in 2019 went to maintaining software systems, many of them outdated. Why is this such a problem?
To start, old code is more difficult to work with. It tends to be longer, written in outdated languages, and maintained by a shrinking pool of people fluent in it. This creates a larger support burden and can lead to a maintenance backlog that opens security vulnerabilities. Older technologies are also often incompatible with newer ones: when a new system is bolted onto old infrastructure, it can take a tremendous amount of time to customize one or both systems so they work together.
When it comes to hardware, the costs can be complicated. On the surface, the hard costs of running an on-premise server and paying for a cloud solution are comparable. Dig beneath the surface, however, and there are dozens of other factors: implementation time, the flexibility to scale up or down with cloud services versus the relative rigidity of on-premise solutions, the maintenance and lifecycle costs of replacing a server, uptime reliability when relying on a cloud provider, and the offloading of certain responsibilities, including the security of the underlying infrastructure. If you have variable computing needs, limited IT staff, or better uses for the upfront capital required to purchase and maintain a server, IaaS can help reduce costs and improve flexibility as an organization.
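To make that comparison concrete, the factors above can be rolled into a simple total-cost-of-ownership (TCO) sketch. The model and every figure in it are illustrative assumptions, not vendor pricing or benchmarks from this article; the point is only that upfront capital, staff time, and hardware refresh cycles belong in the comparison alongside the monthly bill.

```python
# Hypothetical TCO sketch: on-premise server vs. an IaaS instance.
# All inputs are illustrative placeholders, not real vendor pricing.

def on_premise_tco(hardware_cost, annual_maintenance, annual_admin_hours,
                   hourly_rate, years, refresh_interval_years=5):
    """Upfront capital plus recurring maintenance and staff time, with an
    additional hardware purchase each time the refresh interval elapses."""
    refreshes = max(0, (years - 1) // refresh_interval_years)
    capital = hardware_cost * (1 + refreshes)
    recurring = years * (annual_maintenance + annual_admin_hours * hourly_rate)
    return capital + recurring

def iaas_tco(monthly_instance_cost, annual_admin_hours, hourly_rate, years):
    """No upfront capital: pay-as-you-go plus (reduced) staff time,
    since infrastructure maintenance is offloaded to the provider."""
    return years * (12 * monthly_instance_cost + annual_admin_hours * hourly_rate)

if __name__ == "__main__":
    years = 5
    onprem = on_premise_tco(hardware_cost=12_000, annual_maintenance=2_000,
                            annual_admin_hours=120, hourly_rate=75, years=years)
    cloud = iaas_tco(monthly_instance_cost=400, annual_admin_hours=40,
                     hourly_rate=75, years=years)
    print(f"{years}-year on-premise TCO: ${onprem:,.0f}")
    print(f"{years}-year IaaS TCO:       ${cloud:,.0f}")
```

With these placeholder inputs the cloud option comes out ahead, but the model is deliberately sensitive: raise the admin hours or shorten the refresh interval and the gap widens, while steady, predictable workloads can narrow it.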
The Security Risk of Legacy IT Systems
Another significant cost of legacy IT systems is the potential risk of a security breach. Some of the biggest security gaps in enterprise IT stem from incompatibility between software and hardware systems, new and old, and from general operational inefficiencies within an organization. This often manifests as a lack of the resources or expertise needed to stay on top of security issues as they develop.
When software is no longer supported by its provider, major vulnerabilities can develop. Windows 7 and other older versions of Windows still hold an 18.6% market share, despite Microsoft ending support and urging all users to move to Windows 10. Old software that no longer receives patches when new vulnerabilities are discovered can be a major liability for your organization, especially when the average cybersecurity breach costs $3.86 million globally and $8.64 million in the United States, with particularly high costs for healthcare and financial organizations.
A Lack of Experienced IT Professionals
There are several issues for organizations attempting to get the right people into the right roles. To start, there just aren’t enough people with the expertise needed to fill key jobs. Info Security Magazine, for example, reports that the global shortfall of cybersecurity professionals is nearly 4 million.
At the same time, existing IT professionals are falling behind on the newest technologies. Even in organizations with ample staff, the specific skills a project requires aren’t always present; expertise in high-demand programming languages like Python, for example, is difficult to acquire. Upskilling and reskilling programs are a great start, but it can also help to look for outside support.
The Long Term Benefits of Modernization in Your IT Department
Whether you’re still running on-premise servers with limited staff and all the maintenance and regulatory requirements that come with them, or your legacy software systems are difficult to manage and present growing security risks, modernization is an important strategy.
By working with an experienced service provider that can support your modernization efforts at the organizational level, you can transform legacy systems into modern platforms with improved functionality that reduces the cost of maintenance and support, addresses security risks that could lead to a data breach, and improves productivity not only for your IT team, but for every part of your organization. Learn more about how Bedroc helps organizations identify and create new, efficient processes for their IT systems that ultimately save time and reduce total costs.