To date, roughly 8,500 computer programming languages have existed. But what happens when these languages cease to be of use? The Tiobe Index measures language popularity: as languages fall into disuse, they slide down the rankings, and languages in greater demand climb. Perl and Ruby, for instance, have fallen out of favor over time. Perl’s decline likely stems from its long-stalled development (its next major iteration, announced in 2000, spent well over a decade in the works), while Ruby’s heavy use in the development of Twitter was its only real claim to fame, and Twitter has since shifted to a mix of Swift, Java, and Scala.

Programming languages can become obsolete or fall out of favor for a variety of reasons. Remember Flash? The plugin used to be required for filling out internet forms or viewing media content in the pre-YouTube era. But the combination of Flash’s lingering security risks and the advent of HTML5, which allows video to be embedded directly into website code, pushed Adobe, the company that owns Flash, to cease development of the platform as of 2020.

It’s tempting to think that, as programming languages become obsolete, fall in their Tiobe ranking, or are simply forgotten over time, they cease to influence coding applications and can safely be ignored. By this logic, new coders and software engineers who spend time and effort learning dying languages at the expense of current ones risk being pigeonholed or rendered obsolete along with the languages they studied. It would be like a priest mastering the Latin rituals just before the Second Vatican Council converted Masses to the vernacular.

However enticing it is for new programmers to focus on learning presently popular languages, it does not follow that long-relegated languages cannot come back to haunt organizations. For example, after slipping down the Tiobe Index for years, R has staged a recent comeback. SAS and SQL continue to be go-to statistical and data processing tools for many large businesses and corporate entities, despite having changed little over the past 30 years. The most striking recent example of this phenomenon, however, is the strange case of COBOL.

 

The Walking Dead (Computer Language)

In 1959, the Common Business-Oriented Language (COBOL), based on Rear Admiral Grace Hopper’s 1955 FLOW-MATIC language, which hewed closely to English syntax, was introduced as a temporary measure to make the Department of Defense’s data processing procedures portable. However, in keeping with Milton Friedman’s dictum that “there is nothing so permanent as a temporary government program,” the DoD essentially forced its contracted suppliers to produce COBOL-capable software by requiring it on all computers it purchased after 1961. Other federal and, soon, state departments also began converting their data processing to COBOL, including the Departments of Veterans Affairs, Justice, Health and Human Services, and, as will become crucial later in the story, Labor.
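To see what that English-like syntax looks like in practice, here is a minimal, hypothetical COBOL sketch (the program and data names are invented for illustration) that computes a net pay figure:

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. PAY-SUMMARY.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01 GROSS-PAY PIC 9(5)V99 VALUE 1000.50.
       01 TAX-RATE  PIC V99     VALUE .20.
       01 NET-PAY   PIC 9(5)V99.
       PROCEDURE DIVISION.
      *    Verbs read like English imperatives: first compute the tax,
      *    then subtract it from gross pay, reusing NET-PAY as storage.
           MULTIPLY GROSS-PAY BY TAX-RATE GIVING NET-PAY.
           SUBTRACT NET-PAY FROM GROSS-PAY GIVING NET-PAY.
           DISPLAY "NET PAY: " NET-PAY.
           STOP RUN.
```

The sentence-like verbs made COBOL approachable to business users of the era, but also verbose and, at scale, hard for outsiders to untangle.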

Generations of academic computer scientists eschewed COBOL, dismissing it for its lack of program structure and its reliance on vague programming heuristics that more organized languages did away with. These two facets, compounded by COBOL practitioners’ early and ongoing failure to document their code, made the language byzantine and difficult for non-specialists to decipher. As a result, COBOL suffered from a dual problem: a dwindling number of bespoke coders, and, because of that lack of documentation, great difficulty converting it to modern code.

Despite these problems, COBOL caught on outside academic circles in a big way. U.S. federal and state governments were quick to adopt COBOL for its flexibility and portability, and it eventually spilled over into the corporate world, too. All of this took place before the computing revolution of the 1970s: prior to then, many federal, state, and corporate organizations built their mainframe systems on COBOL. What’s more, many of these same entities’ core processes still rely on those systems, written in COBOL code that is now 50 or more years old. And the problem is not minuscule. According to a 2017 Reuters report, 220 billion lines of COBOL code remain in use.

The central predicament of COBOL, namely a shrinking pool of specialists who can either maintain existing core processes or convert them to other languages, has only grown starker recently. The average age of COBOL programmers is roughly 60, and these professionals currently retire at a mean rate of 10 percent per year. In addition, because of the historical antipathy toward COBOL in academia and, now, in big tech, there is virtually no supply of skilled labor to replace the retiring programmers. Yet many businesses’ and government entities’ source programs still rely on COBOL for core data processing tasks. It is therefore becoming too risky to keep COBOL, for lack of skilled labor to maintain and update programs, and too risky to attempt to convert core business functions to more prominent programming languages, for the same reason. Thus, bizarrely, the more obsolete COBOL becomes, the more entrenched it gets.

These problems were made manifest in this past spring’s unemployment applications fiasco. In the wake of statewide lockdowns due to the coronavirus pandemic, tens of millions of workers suddenly found themselves unemployed. Many of the hardest-hit states’ labor departments’ unemployment filing systems were built on COBOL code and were not equipped to handle sudden surges in applications for unemployment relief. This led to weeks of delays in distributing unemployment benefits just as the pandemic began to surge. The spike in filings also exposed just how scarce COBOL programmers had become: the governor of New Jersey himself put out an open call for COBOL coders to assist with processing claims.

Because of COBOL’s early uptake by large-scale government and corporate entities on mainframe systems, its lack of systematic documentation, its dwindling number of specialist coders, and its antiquated, hard-to-decipher syntax, systems that rely on it cannot be easily migrated to newer programming languages any time soon. The complexity of the language does not lend itself to automated translation, so attempts to fold COBOL into other processes have been ineffective. Outsourcing COBOL-related tasks to contract workers abroad has also failed: these workers turn over at an even higher rate than the retiring programmers (roughly 20 percent), and they can struggle with COBOL’s English-based syntax if they are not native speakers. There are also national security concerns around non-citizens working directly on sensitive data processes at higher levels of government.

More on COBOL: Why COBOL Has Stuck Around for All These Years

 

Battling the Zombies

The COBOL riddle, as with other legacy code, is how to get programmers to learn obsolete but process-critical languages when nearly every incentive for tech professionals works against doing so. The solution likely lies in firm recruitment and clear communication of organizational needs around these programs. Accomplishing this first requires organizations to stop ignoring mainframe and legacy systems in favor of newer, more faddish computing breakthroughs; myopic, short-term thinking is responsible for the intransigence of COBOL and other legacy systems. IBM has already started a free, online COBOL tutorial, and online forums have opened up to connect COBOL programmers with potential employers. But these are short-run measures for what are inevitably going to be longer-term issues.

Two solutions present themselves for the longer term. COBOL and other legacy or mainframe process languages will either need to be converted to modern code, or organizations will need to hire new specialists to maintain and update legacy code. These solutions need not be mutually exclusive. With a larger ecosystem of programming languages than ever before and a wide variety of online resources available for learning current coding languages, legacy programmers have a wealth of resources to complement their legacy-system-specific knowledge. Ideally, legacy programmers could learn enough about modern languages to convert processes written in old code into new code. Companies that employ these individuals could offer financial incentives to them for code conversion rather than simple maintenance. Government entities or companies might also consider putting out open contracts or awards for retired legacy programmers’ work on temporary conversion projects with presently employed programmers who work in newer coding languages.

To address the longer-term issue of a diminishing supply of legacy programmers, one solution might be for colleges to offer courses or certificates in legacy languages outside of traditional computing degrees. Community colleges and trade schools, which already supply some of this training, could hire retired legacy-system programmers on contractual or adjunct bases toward these same goals. A potential source of students for these ironically titled “new legacy programmer” tracks could be transitioning professionals or those who seek to break into large-scale organizations but lack traditional computing backgrounds or relevant work experience.

More than anything, it is crucial for large organizations and government entities not to let years go by without some thought as to the languages and code structures that undergird critical computing processes and the resources required to maintain or update them. With hundreds of billions, or even trillions, of lines of legacy code still operating worldwide, the problems of maintenance and conversion are only becoming more pronounced. Long-neglected programming languages thought to be obsolete, or even dead, can thus resurface to pose serious challenges to the digital integrity of organizations, industries, and governments. Legacy systems present challenges, but also opportunities for those who can think strategically about long-term objectives.

More From Edward Hearn: Bigger Isn’t Better — When It Comes to Data

Expert Contributors

Built In’s expert contributor network publishes thoughtful, solutions-oriented stories written by innovative tech professionals. It is the tech industry’s definitive destination for sharing compelling, first-person accounts of problem-solving on the road to innovation.
