Ancient legacy software threatens to become a major problem at too-big-to-fail (TBTF) banks and other financial institutions. These dinosaur apps, which are often mission-critical, can have so many subsystems cobbled onto them that no one is quite sure how the whole system works. Programmers call this “spaghetti code”: a convoluted mess of difficult-to-understand code. Even worse is “black hole code”: data goes in, the right answer comes out, and no one knows how or why it works.
Systems tend to agglomerate rather than having their data exported into newer, tidier, faster software:
The result of all this agglomeration is either that you lose any clear idea of how things hang together, or you have people working manually or with kludged programs across systems. The danger with overbuilding is that parts you built around, and didn’t even know were still there, can spring back to life in costly, nasty ways.
California has tried at least twice to migrate its ancient payroll system to a modern platform and failed miserably. Its legacy system is reportedly a rat’s nest of dozens, if not hundreds, of subsystems, with roots going back to COBOL written decades ago. The federal government has the same problem with much of its software.
I convert ancient DOS-based databases to Windows, so I have some insight here. A big problem is that you can’t simply start from scratch and write a new system: the old data has to be brought into the new one, a process that can be maddeningly complicated. The new system also generally needs the same functionality as the old one, and, since many such systems are mission-critical, it has to work perfectly and quickly, a daunting task.
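To give a flavor of why bringing old data forward is so fiddly, here is a minimal sketch of one common step: parsing fixed-width records from a legacy export into a modern, typed schema. The field layout, names, and sample rows below are entirely hypothetical; a real legacy layout has to be reverse-engineered from whatever documentation (or behavior) the old system left behind.

```python
# Hedged sketch: converting hypothetical fixed-width DOS-era records
# into a typed schema, surfacing bad rows instead of silently dropping
# them (otherwise the new system's totals won't match the old one's).
from dataclasses import dataclass
from datetime import datetime

# Hypothetical layout: name (cols 0-20), hire date YYYYMMDD (20-28),
# salary in cents, right-justified (28-38).
FIELDS = [("name", 0, 20), ("hire_date", 20, 28), ("salary_cents", 28, 38)]

@dataclass
class Employee:
    name: str
    hire_date: datetime
    salary_cents: int

def parse_record(line: str) -> Employee:
    raw = {name: line[start:end].strip() for name, start, end in FIELDS}
    return Employee(
        name=raw["name"],
        hire_date=datetime.strptime(raw["hire_date"], "%Y%m%d"),
        salary_cents=int(raw["salary_cents"]),
    )

def migrate(lines):
    """Return (converted records, rejected rows with line number and error)."""
    good, bad = [], []
    for lineno, line in enumerate(lines, 1):
        try:
            good.append(parse_record(line))
        except ValueError as err:
            bad.append((lineno, line, str(err)))
    return good, bad

# Hypothetical sample data: one valid row, one corrupt row.
sample = [
    "DOE JOHN            19891106" + "42000000".rjust(10),
    "BAD ROW WITH NO DATE AT ALL             ",
]
good, bad = migrate(sample)
```

Even this toy version has to decide what to do with rows that don’t parse; in a real migration, reconciling the rejects against the old system’s reports is where much of the maddening complexity lives.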
All of this is made worse by the current practice of using contractors or outsourcing to write the code. When something breaks or needs changing a few years later, the original programmers may not be around. You may not even know who they were.