Hi, this is a question that popped into my mind when I saw an article about some AWS engineer talking about AI assistants taking over the job of programmers. It reminded me that it’s not the first time something like this has been said.

My software engineering teacher once told me that a few years ago people believed graphical tools like Enterprise Architect would let a single engineer just draw a pretty UML diagram and generate 90% of the project without touching any code.
And further back, COBOL was supposed to replace programmers by letting accountants write their own programs.

Now I’m curious: were there many other technologies you remember that were supposedly going to replace programmers?

I hope someone who’s been around much longer than me knows something more or has some funny stories to share.

  • It’s happened a few times in my career where people tell me I’ll be obsolete, but it’s always been some company hyping their new product and suits frothing at the prospect of not having to pay me anymore.

    So far they’re like 0 for 8 or so.

    Now I will say the goalposts move. What I’m doing now is for sure not what I was doing 10 years ago. I’m definitely heavier in devops and infra than where I was before (ironic, because they said we’d never have to worry about that stuff again if we moved to the cloud). AI is still basically machine learning, just in a while loop, so I’ve spent time learning that. So, in a way, yes, we’re obsolete in the sense that if I were the same engineer I was 10 years ago I wouldn’t be worth nearly this much; I had to grow and evolve with the technology.

    • @scrubbles
      cool

      but it’s always been some company hyping their new product and suits frothing at the prospect of not having to pay me anymore

      I half expected it; after all, it’s what’s happening right now.

      What I’m doing now is for sure not what I was doing 10 years ago.

      That’s right, I guess some aspects of programming have really been made obsolete.

      • some aspects of programming have really been made obsolete

        I’d agree that some specifics have been made obsolete. Some habits and routines are currently being ignored or skipped, but the amount of skill that’s gone away is very small.

        As mentioned before, we downsized brutally after Y2K. The people most affected were the highest-paid who weren’t the best code-grinders, and these were the documenters, the programme people, and the mentor types. We lost our guides, our structure, and our historians. We’ve been growing again like feral children rebuilding society from the wasteland like it’s Mad Max, and there’s a LOT of the Why that we either don’t know, that we ignore, or that we skip in the interests of (insert manufactured urgency here).

        We are re-learning some of the whys, but we haven’t yet seen the half-assedry chickens come home to roost on that. The symptoms are there: Boeing’s Gilligan’s Island in Space, supply-chain sploits in waves, personal information lost weekly, all consequences of the clipboard hassles we stopped doing, the ones that prevent massively expensive things later.

        CrowdStrike may die now, mainly because they were marauding leopards we allowed to eat our face. SolarWinds before that, same issue, but they seem to be okay. There are dozens of ohShit moments that could lead to similarly preventable problems, things we knew not to do … once.

        We’ll get there again, but we’ll be rediscovering a lot of what some techbro will claim is obsolete, old-practice, too-cautious hand-wringing in our neu and moderne go-hard/break-lives paradigm.

  • Oracle has a product called Oracle Policy Automation (OPA) that it sells as “you can write the rules in plain English in MS Word documents, you don’t need developers”. I worked for an insurance organization where the business side bought OPA without consulting IT, hoping they wouldn’t have to deal with developers. It totally failed because it doesn’t matter that they get to write “plain English” in Word documents. They still lack the structured, formal thinking to deal with anything except the happiest of happy paths.

    The important difference between a developer and a non-developer isn’t the ability to understand the syntax of a programming language. It’s the willingness and ability to formalize and crystallize requirements and think about all the edge cases. As an architect/programmer, when I talk to the business side they get bored and lose interest because of all my questions about what they actually want.

  • Salesforce advertised “No more developers” for a while in the mid-2010s. It was great fun trying to clean up the mess all the “not programmers” made of those systems. I really hate Salesforce. They must have some of the best salespeople on the planet.

  • If a tool were created that properly converted a UML diagram into a project without any need for code, all the programmers who lost their jobs to this tool would then be hired by the company that offered it, in order to maintain and support everything the customers want in their programs.

    It would remove programmers from the payroll of some companies, but they would still be working for them, just further down the chain.

    The same is true for AI. If AI could completely replace programmers in some area, it would need a lot of programmers itself to keep dealing with all the edge cases that would show up from being used everywhere that a programmer was needed before.

    • To be fair, a lot of roles simply disappeared over the years.

      Developers today are much more productive than 30 years ago, mostly because someone automated the boring parts away.

      A modern developer can spin up a simple CRUD app, including infrastructure, in a day or so. That’s much, much more productive than in 1995. We just cram a lot more of the world into software, so we need 20x the number of developers we needed back then.

  • Rational Rose etc. could generate code from UML diagrams; then you “only” needed architects.

    In reality it only gave a little help during the design phase; as soon as someone touched the generated code, you had to manually merge the changes back into the UML.

    • It’s really weird, though, that nobody really created a language/tool to bridge these two worlds. It’s always just generating one representation from the other, mostly in a bad way.

      I’d argue that for many problems a graphical view of the system can help reasoning. But there simply is nothing in that regard.

      • For OOP languages, you can definitely get IDE plugins that create UML from code.

        Personally, I’ve never found them useful, though, partially because our code was never OOP enough, e.g. we were using the actor pattern, or had important modules with functions, or had lots of small classes for handing data around, etc.

        But also because it just makes for bad architecture diagrams. It has no sense of what’s important and what should be abstracted away, or of how to structure the diagram to make it readable, e.g. REST API at the top, database at the bottom.

        What I also really don’t like about generated architecture diagrams in general (even when the contents are specified via e.g. PlantUML), is that things jump around every time you make a structural change. This means people looking at the diagram have no chance of learning what it looks like, so they can spot changes or know where to look for what they’re interested in.

    • I was thinking that may be why it’s taking so long. It’s akin to knowing you have to train your human replacement before you’re fired. You can’t possibly teach a program or a human everything you know in a limited time, and a great many don’t want to.

  • The earliest I can think of (from personal experience) is 4GLs, the early low-code platforms that first started to get traction in the early 80s. They wouldn’t have replaced programmers, but some thought/hoped they would usher in an age of “low skill” programmers that companies could get away with paying minimum wage to.

    • Oh yeah, low-code platforms in general are pretty much always a thing, in every industry for various tasks.

      I’ve also never seen any of them that were not horribly abused with ridiculous workarounds or custom code snippets, which effectively made them as complex as a real program.

  • So far my experience with AI is that it cannot evaluate the quality of the data it uses to any significant degree. As such it can summarize, which is convenient for searching, and give examples, but ultimately you have to correct its mistakes and know enough to do so. There are some savings for a programmer in the sense that you might be able to get some rough scaffolding and it’s a bit easier to identify relevant search links, but I don’t see it replacing developers. It definitely allows one to do more, though, or even increase the quality. One really great thing it can do is auto-commenting of the code, which does not need as much improvement as actual code and makes it more likely for you to do the task (both because it does it and because it causes you to go, “no, don’t explain it like that”). It similarly helps with documentation. I doubt it could more than double productivity, though, at least as it stands now. I’m not sure it can do much better without becoming general AI.

  • The first time I heard about programming being obsolete was when I was taught UML in university. That was almost 15 years ago and it didn’t happen; if anything, programmers now also have to know UML, which isn’t all that bad, but it definitely didn’t replace anything. It’s just useful for designing and documenting projects.

    I also heard from colleagues that in the 80s and 90s people said that SQL was supposed to be used by users directly, making (some) programming obsolete.

    Now AI bullshit claims to be making programming obsolete. I won’t hold my breath.