Legacy software creates balls of confusion at big banks, elsewhere


Ancient legacy software threatens to become a major problem at TBTF banks and other financial institutions as well. These dinosaur apps, which are often mission-critical, can have so many subsystems cobbled onto them that no one is quite sure how the system works. Programmers call this “spaghetti code,” a convoluted mess of difficult-to-understand code. Even worse is “black hole code”: the data goes into it, the right answer comes out, and no one knows how or why it works.

Systems tend to agglomerate, rather than having data exported out into newer, tidier, faster software:

The result of all this agglomeration is either that you lose a clear idea of how things all hang together, or you have people working manually or with kludged programs across systems. The danger with overbuilding is that parts you’d built around, and didn’t even know were still there, can spring back to life in costly, nasty ways.

California has tried at least twice now to migrate its ancient payroll system to a modern platform and failed miserably. Apparently their legacy setup is a rat’s nest of dozens, if not hundreds, of systems, with roots going back to COBOL written decades ago. The federal government has the same problem with much of its software too.

I convert ancient DOS-based databases to Windows, so I have some insight here. A big problem is that you can’t simply start from scratch and write a new system. The old data has to be brought into the new system, a process which can be maddeningly complicated. Plus, the new system generally needs to have the same functionality as the old one and, since many such systems are mission-critical, the new system has to work perfectly and quickly, a daunting task.
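To give a flavor of what “bringing the old data into the new system” involves, here is a minimal sketch of the kind of conversion step that comes up. The fixed-width field layout, field names, and sample records are invented for illustration; real legacy formats are far messier. The key habit shown is the audit-style cross-check: the totals must match before and after, so nothing is silently dropped or mangled.

```python
# Hypothetical sketch: migrating fixed-width DOS-era records to a modern
# format (CSV here). The layout below is invented for illustration.
import csv
import io

# Assumed legacy layout: name (cols 0-19), balance in cents (cols 20-29),
# single-character account status flag (col 30).
FIELDS = [("name", 0, 20), ("balance_cents", 20, 30), ("status", 30, 31)]

def parse_record(line):
    """Slice one fixed-width line into a dict, validating as we go."""
    rec = {name: line[start:end].strip() for name, start, end in FIELDS}
    if not rec["balance_cents"].isdigit():
        # Refuse to guess: bad records get flagged, not silently skipped.
        raise ValueError(f"bad balance in record: {line!r}")
    rec["balance_cents"] = int(rec["balance_cents"])
    return rec

def migrate(legacy_lines):
    """Convert legacy lines to CSV text, cross-checking the totals."""
    records = [parse_record(line) for line in legacy_lines]
    # Audit-style check: the sum of balances in the new data must equal
    # the sum recomputed independently from the raw legacy lines.
    assert sum(r["balance_cents"] for r in records) == sum(
        int(line[20:30]) for line in legacy_lines
    ), "totals diverged during migration"
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=[f[0] for f in FIELDS])
    writer.writeheader()
    writer.writerows(records)
    return out.getvalue()

legacy = [
    "ALICE SMITH         0000012345A",
    "BOB JONES           0000098765C",
]
print(migrate(legacy))
```

Even this toy version hints at why real migrations drag on: every field needs a validation rule, every bad record needs a policy, and the old and new systems have to agree on every total before anyone trusts the cutover.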

All of this is made worse by the current practice of using contractors or outsourcing to write the code. When something breaks or needs changing a few years later, the original programmers may not be around. You may not even know who they were.

  • Jonathan Lundell

    This is, to connect to a recent post by stretching a point very little, the problem with Excel: every somewhat complex spreadsheet (and Excel encourages complexity) is instant legacy software. It’s generally opaque to anyone but the author, including the author herself a month or three later.

    Excel (it’s not alone, of course, in this regard) not only encourages overcomplex spreadsheet logic, but it also encourages authors who have no real grasp of the problem. If it’s possible to write structured code in Excel, what percentage of spreadsheets are well-structured? Safe to say that it’s 0% within any reasonable margin of error.

    • Exactly. When such code becomes legacy code, everyone just assumes it is producing the correct data, and that may not be the case. The standard CPA audit rule is that everything gets checked twice, and by different people. That doesn’t always happen in companies.
