For some background: I originally wanted to break into programming back in college, but I drifted into desktop tech support and, from there, systems administration. SysAdmin work is draining me, though, and I want to pick programming back up and see if I can make a career of it, but the industry seems to be moving toward relying on AI for coding. Everything I’ve heard says AI isn’t there yet, but if it does reach the point where it can fully automate coding, should I even bother? Am I going to be obsolete in a year? Five years?

  • Kbin_space_program@kbin.social · 10 months ago

    I’ll add that one of the biggest flaws in the fundamental architecture of current AI is its inability to reapply a solution from a specific use case to a slightly different scenario.

    This flaw surfaced during self-driving development and, as far as I understand, played a major role in why the hype around it died off so fast. Adding more data to the model can’t fix that flaw.

    • AggressivelyPassive@feddit.de · 10 months ago

      Well, look at 99% of business software. It’s always the same.

      “Tech” in the sense of FAANG and so on isn’t that large an employer worldwide, and even within those companies, a significant portion of the work is regular old business software. Billing compute hours in AWS and billing phone calls aren’t that different.

      I’d argue that if an AI could create a form, transform the submitted data into a given schema, and push it somewhere (see the sketch below), 20% of all developers would need a new job.
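      A minimal sketch of the kind of glue code meant here, for the curious. The endpoint URL, the form field names, and the target schema are all made up for illustration; Python just stands in for whatever a given shop actually uses:

      ```python
      # Form data in, reshaped payload out, POSTed to some backend.
      # Everything named here (fields, schema, URL) is hypothetical.
      import json
      import urllib.request


      def to_schema(form_data: dict) -> dict:
          """Reshape raw form fields into the schema the backend expects."""
          return {
              "customer": {
                  "firstName": form_data["first_name"],
                  "lastName": form_data["last_name"],
              },
              "contact": {"email": form_data["email"]},
          }


      def push(payload: dict, url: str) -> int:
          """POST the payload as JSON and return the HTTP status code."""
          req = urllib.request.Request(
              url,
              data=json.dumps(payload).encode("utf-8"),
              headers={"Content-Type": "application/json"},
              method="POST",
          )
          with urllib.request.urlopen(req) as resp:
              return resp.status


      if __name__ == "__main__":
          payload = to_schema({"first_name": "Ada",
                               "last_name": "Lovelace",
                               "email": "ada@example.com"})
          print(json.dumps(payload, indent=2))
          # push(payload, "https://example.invalid/api/customers")  # needs a real endpoint
      ```

      Swap the dict mapping and the URL and you’ve described a large share of line-of-business development.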

      • mspencer712@programming.dev · 10 months ago

        And those jobs are critical to the process of making new developers.

        An important part of my education, the part grad school can’t teach you and that you have to learn on the job, was being new and terrible: grinding on a simple problem and feeling like a waste of money. Any of the experienced guys sitting behind me could have done the thing in a few hours, but I’d been working on it for a week. “What’s the point? Any minute now they’re going to tap me on the shoulder and tell me I’m done, that it’s time to go find another job.”

        But that never happened.

        Those early problems weren’t fun. At home I would never have chosen to work on them; I’d have left them for someone else. “But now that I’m collecting a paycheck for this, it isn’t up to me. I have to work on it. I can’t give up. I can ask for help, but I need to show my peers that I belong. I can solve difficult problems. I can persevere.”

        As a mediocre professional developer, I had to struggle to learn that. I wasn’t getting far on my own, without mentorship and motivation; homework and pursuing degrees weren’t getting me there. (Even now, I seem to have about a two-week attention span for projects at home.)

        • AggressivelyPassive@feddit.de · 10 months ago

          Those jobs, or rather that kind of task, won’t go away; they’ll just look different.

          Think about how completely different modern development looks compared to, say, the 80s. Nobody writes assembly anymore, so nobody learns the basics! Nobody reads the manual anymore; instead we use autocomplete and Google. Memorizing the arcane runes of C used to be exactly the kind of “hazing grunt work” you’re talking about. That skill used to be valuable, but who among us can honestly claim to remember more than a handful of nontrivial method signatures?