The article discusses the impact of artificial intelligence (AI) on the job market. While AI is increasingly making some jobs obsolete, demand for computer science degrees is also predicted to grow as understanding AI systems becomes necessary. The article highlights a protest by Hollywood’s Writers Guild of America against the use of AI for writing scripts, citing concerns about proper attribution and copyright. It also presents the views of experts who suggest that AI will not cause instant mass unemployment, but will displace jobs over time while making people more productive. Companies like IBM are already replacing jobs that can be done by AI. Overall, the article suggests that the future impact of AI on jobs is uncertain, and that humans may need to be trained, much as AI models are, to be unbiased, capable, and proficient with these systems.

  •  pingveno   ( @pingveno@lemmy.ml ) 
    link
    fedilink
    English
    10
    edit-2
    11 months ago

    I’m skeptical about any large impact on software development above the code-monkey level. At least today’s AI is a statistical model based on its training set. It will spit out something that might work, so it can be useful for autocompletion and snippet generation. But when you’re trying to architect a system, gather requirements, or conceive of how data routes through a system, AI won’t help at all.

    •  pancake   ( @pancake@lemmy.ml ) 
      link
      fedilink
      English
      3
      11 months ago

      It’s not a matter of intelligence. We humans use paper, pens, and computers to assist us, and we combine visual and auditory perception with language rather than using them separately. AI, as it works right now, is doing the equivalent of reciting an essay from scratch or imagining an artwork in perfect detail in a second. Give it the chance to draw diagrams and process them iteratively, and it will do your job and mine, and do them well.

      •  pingveno   ( @pingveno@lemmy.ml ) 
        link
        fedilink
        English
        4
        11 months ago

        That’s not really what it’s doing, though. It’s mimicking what it’s seen before and producing something that looks like it fits with the query. It has no conceptual understanding of what it’s doing. It will likely be an aid, not a replacement.

        •  pancake   ( @pancake@lemmy.ml ) 
          link
          fedilink
          English
          2
          11 months ago

          Current AI systems are regression models: they are trained on some input and thus encode whatever rules and patterns govern that input. So, by definition, they do have conceptual understanding, since any information that constitutes this understanding is encoded within the model in a form it can use for whatever purpose it needs.

          For example, if you give an AI many examples of long multiplication, it will eventually learn the rules that govern it, and it will be able to perform any multiplication by itself. There’s no practical difference between this and actual understanding (also it doesn’t help that our brains are just analog tensor-multiplication machines, much like the AI we’re creating).

          •  pingveno   ( @pingveno@lemmy.ml ) 
            link
            fedilink
            English
            0
            11 months ago

            But long multiplication and software development are two different breeds. Long multiplication is just one set of very mechanical behaviors. Software has to be highly specialized to the task at hand.
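            To make “mechanical” concrete: the entire behavior of long multiplication fits in a short, fixed routine, worked digit by digit the way it’s done on paper. A sketch in Python:

```python
def long_multiply(a: int, b: int) -> int:
    """Multiply two non-negative integers digit by digit,
    the way long multiplication is done on paper."""
    a_digits = [int(d) for d in str(a)][::-1]  # least-significant digit first
    b_digits = [int(d) for d in str(b)][::-1]
    # One slot per possible result digit
    result = [0] * (len(a_digits) + len(b_digits))
    for i, db in enumerate(b_digits):
        carry = 0
        for j, da in enumerate(a_digits):
            # Partial product plus whatever is already in this column
            total = result[i + j] + da * db + carry
            result[i + j] = total % 10
            carry = total // 10
        result[i + len(a_digits)] += carry
    # Reassemble the digit list into an integer
    return int("".join(map(str, result[::-1])))
```

            There are no decisions to make beyond following the fixed carry rules, which is exactly why it is learnable from examples in a way an open-ended design task is not.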

            Take a project I maintain, an access request system. I had to:

            • Get a concept of what was needed
            • Review the access request system that I was replacing
            • Model the database
            • Draft email templates
            • Make REST API calls to systems for provisioning access
            • Handle authentication and authorization
            • Fill in a tree of requestable items
            • Work on a deployment process

            All this required that I have a very deep understanding of this specific application, how it works with the rest of our systems, and how it works with our business processes. Access request systems are a dime a dozen, but getting it honed to our specific requirements took precise knowledge and some creativity.
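            As a sketch of just one of the bullets above, the tree of requestable items might be modeled like this. The names and fields here (RequestableItem, “VPN access”, and so on) are invented for illustration, not taken from the real system:

```python
from dataclasses import dataclass, field

@dataclass
class RequestableItem:
    """A node in the tree of things a user can request access to."""
    name: str
    requires_approval: bool = True
    children: list["RequestableItem"] = field(default_factory=list)

    def flatten(self):
        """Yield this item and all descendants, depth-first."""
        yield self
        for child in self.children:
            yield from child.flatten()

# A tiny example catalog
catalog = RequestableItem("All systems", requires_approval=False, children=[
    RequestableItem("VPN access"),
    RequestableItem("Finance systems", children=[
        RequestableItem("Payroll"),
    ]),
])

names = [item.name for item in catalog.flatten()]
```

            The structure itself is trivial; the hard part is knowing which items belong in it and which approval rules your organization actually needs.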

    • Most software development work involves mindless tasks such as connecting to service endpoints or dbs, querying data, transforming it in some way, then either presenting it to the user, storing it in a db, or sending it to another endpoint. It’s pretty clear that even current GPT based systems can do much of this work.
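      A minimal sketch of that fetch–transform–forward shape (the field names, and the endpoint in the comment, are invented for illustration):

```python
def transform(records):
    """The 'transforming it in some way' step: filter and reshape rows."""
    return [
        {"id": r["id"], "name": r["name"].strip().title()}
        for r in records
        if r.get("active")
    ]

# In a real service the records would come from a db query or an HTTP
# call, e.g. requests.get("https://example.invalid/api/users").json()
records = [
    {"id": 1, "name": "  ada lovelace ", "active": True},
    {"id": 2, "name": "ghost user", "active": False},
]

cleaned = transform(records)  # would then be stored or sent onward
```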

      What that means in practical terms is that work that was previously done by a team of ten developers could now be done by a team of a couple of devs who do the requirements gathering and write specification tests to ensure that GPT generated code is doing what’s intended.

      The nature of the industry could easily shift towards people learning how to write effective prompts for ML systems and then creating specifications for testing. For example, the human would write what an API is expected to produce for the defined use cases, and then the algorithm would try to find a solution that fits the spec. Even if it’s no better than monkeys on typewriters, if it can produce a thousand solutions per hour and one of them is correct, that’s still faster and cheaper than paying a human to do this.

      The specification could include time, memory, and processing-power constraints as well. Basically, this would be the same approach as using genetic algorithms or logic programming in the style of Prolog.
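      A minimal sketch of that generate-and-test loop, with hand-written lambdas standing in for machine-generated candidates (the spec cases here are invented for illustration):

```python
def meets_spec(candidate) -> bool:
    """Specification tests: what the API must produce for the defined use cases."""
    cases = [(("hello",), "HELLO"), (("Mixed Case",), "MIXED CASE")]
    try:
        return all(candidate(*args) == expected for args, expected in cases)
    except Exception:
        return False  # a crashing candidate simply fails the spec

# Stand-ins for generated solutions; in practice these would come from
# an ML system, a genetic algorithm, or a constraint solver.
candidates = [
    lambda s: s.lower(),  # wrong
    lambda s: s,          # wrong
    lambda s: s.upper(),  # passes the spec
]

solution = next(c for c in candidates if meets_spec(c))
```

      The human effort goes into writing `meets_spec` well; the search over candidates is cheap to automate.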

      The reality is that coding teams have a handful of senior devs doing tasks like architecture design and mapping user requirements to code, while most of the team does what you refer to as “code monkey” level tasks. So AI will be replacing the majority of the work that software developers do today.

      Furthermore, the way you get senior developers is by hiring junior developers who need years of training to gain the necessary experience. Companies will have little incentive to invest in junior devs when they can just have the AI do the same thing orders of magnitude cheaper. The context of the whole discussion is that new graduates will have a hard time finding jobs by the time they graduate.

      As @pancake@lemmy.ml correctly points out, the ML system does encode understanding in the model. And it’s silly to think that this won’t rapidly improve in the coming years. We have no idea what the plateau is for ML systems and how rapidly they will be improving going forward. Certainly, pretty much nobody expected to see anything as functional as ChatGPT even a few years ago.