For some background, I originally wanted to break into programming back when I was in college but drifted into desktop tech support and now systems administration. SysAdmin work is draining me, though, and I want to pick programming back up and see if I can make a career out of it, but the industry seems like it could be moving toward relying on AI for coding. Everything I’ve heard says AI is not there yet, but if it looks like it will reach the point of fully automating coding, should I even bother? Am I going to be obsolete after a year? Five years?

  • NocturnalMorning@lemmy.world · 10 months ago

    If you can pair coding with something in engineering or physics, there’s a lot less chance of that happening. That said, AI taking programming jobs is still a long way off, if it ever gets there at all.

    A lot of the programs you ask ChatGPT to write have a lot of nonsense in them, so someone has to manually sift through fake API calls or flat-out incorrect algorithms. That said, it’s not a bad starting point if you’re a seasoned expert who can sift through all that stuff.
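
    To make the “fake API calls” point concrete, here’s a sketch (the hallucinated line is invented, as hallucinations are): an LLM will happily borrow JavaScript’s JSON.parse into Python, and only someone who knows the real json module catches it.

```python
import json

payload = '{"user": "alice", "active": true}'

# The kind of plausible-looking call an LLM might invent -- Python's json
# module has no parse() function (that's JavaScript's JSON.parse):
# data = json.parse(payload)   # AttributeError at runtime

# The real API a human reviewer has to know to substitute:
data = json.loads(payload)
print(data["user"])
```

    Nothing flags the fake line until it runs; that’s the manual sifting being described.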

  • echo64@lemmy.world · 10 months ago

    No. But not because of AI. There’s currently hundreds of thousands of out of work people surrounding tech. You’re competing with them for every job.

    Even then, most of engineering isn’t in the nuts and bolts of putting it together. It’s in the endless discussions and decisions that lead to the nuts and bolts.

  • pixxelkick@lemmy.world · 10 months ago

    The automobile didn’t put cabbies out of jobs, it put horses out of work.

    If anything it actually made demand for cabbies skyrocket, because now they could do the same job but way faster, so they became more affordable and no longer a service reserved for the wealthy.

    In other words, expect that AI will increase demand for programmers exceptionally, as the bar for entry lowers.

    An LLM still needs a “pilot” to “drive” it, and you need to still know code well enough to interpret the output and catch mistakes or hallucinations.

    But typically when a field becomes more affordable, it goes up in demand, not down, because the target audience that can afford the service grows exponentially.

    “But if it’s so easy to program now, what’s to stop people from just using ChatGPT and never hiring a programmer?”

    Same reason people still, today, hire cabs even if they can drive themselves.

    Convenience. Time is money, and just because one person can do all the jobs of a company doesn’t mean they physically have the time to do it.

    • MajorHavoc@programming.dev · 10 months ago

      expect that AI will increase demand for programmers exceptionally, as the bar for entry lowers.

      Bingo! This also happened when web frameworks promised to take away our jobs. Also when code generators promised to take away our jobs.

      It turns out that expertise in computers remains pretty useful in a society that uses computers for almost everything. Even after the exact previous computer skill is no longer relevant.

      Source: Y’all can pry Commodore BASIC from my cold dead hands. I may not be getting paid for it, but I’ll keep producing that beautiful line numbered code until my last breath.

    • jadero@programming.dev · 10 months ago

      But typically when a field becomes more affordable, it goes up in demand, not down, because the target audience that can afford the service grows exponentially.

      I’ve always been very up front with the fact that I could not have made a career out of programming without tools like Delphi and Visual Basic. I’m simply not productive enough to have to also transcribe my mental images into text to get useful and productive UIs.

      All of my employers and the vast majority of my clients were small businesses with fewer than 150 employees and most had fewer than a dozen employees. Not a one of them could afford a programmer who had to type everything out.

      If that’s what happens with AI tooling, then I’m all for it. There are still far too many small businesses, village administrators, and the like being left using general purpose office “productivity” software instead of something tailored to their actual needs.

      • MajorHavoc@programming.dev · 10 months ago

        There are still far too many small businesses, village administrators, and the like being left using general purpose office “productivity” software instead of something tailored to their actual needs.

        Exactly. The “AI will do it all” crowd don’t have this perspective. There’s so much more work to be done, and I hope AI is hugely impactful to help. But I’ve been at this long enough to know that’s still a long road.

    • Lydia_K@startrek.website · 10 months ago

      Thank you for being the rare rational outlook on AI, and for not buying into the AI fear mongering or the AI hype train.

      AI is the new auto hammer: it can do things faster and sometimes better. Why, you can build a house faster and with less effort! Or you can bash someone’s skull in faster and with less effort! It’s just a new tool we can use, for better or for worse, like every other new tool before it.

  • palebluethought@lemmy.world · 10 months ago

    Other than maybe a few very rote, boilerplate types of development, all this shit about replacing coders is almost entirely noise made by either the wishful thinking of oligarchs or credulous repetition of that wishful thinking by clueless journalists.

    But it’s still a pretty rough time to be just getting into tech, just because of the state of the job market.

    • AggressivelyPassive@feddit.de · 10 months ago

      The real question here is: how much “coding work” is there left to do?

      Currently, the bottleneck is available developers (barring short-term problems). Even if AI made every developer 30% more efficient, there would still be work to do. But there will be a point where this tips over; at some point, there’s no additional demand anymore. We just don’t know when that will happen. At 50% more efficiency? 100%? Maybe 700%?

      One thing to keep in mind is that AI code doesn’t have to be good, just good enough. Many nerds seem to think that efficiency, beauty, or elegance have value, but to a business that’s just a collateral benefit. Software can be buggy, slow, and hard to update; none of that matters if the results and costs are in a good-enough ratio.

      • MagicShel@programming.dev · 10 months ago

        How much coding work is left to be done? Infinity. There will always be more needed. Always. And while there is a certain truth to the idea that software just needs to be good enough, merely good-enough software very quickly becomes nearly impossible to maintain and add new features to.

        AI doesn’t make us 30% more efficient. There are certain tasks it’s really helpful for, but they are really limited. I can see junior developers being replaced by AI while they’re at the stage where it takes more work to train them than to just do their job for them. Beyond that, a good developer has skills and experience that AI will never be able to replace, especially since the code has to be maintained.

        • AggressivelyPassive@feddit.de · 10 months ago

          How much coding work is left to be done? Infinity.

          Well, no. That’s just plain wrong. There is only a certain amount of demand for software, like for every other product or service. That’s literally economics 101.

          AI doesn’t make us 30% more efficient.

          You don’t know that. Think about how much time you spend on boilerplate: not only the “traditional” kind, but maintenance, regular updates, breaking upgrades for dependencies, documentation.

          Think about search. Google isn’t that good at actually understanding what you want to find; an AI might find that one obscure blog post from 5 years ago, but in 10 seconds, not 10 hours.

          Think about all the tests you write that are super obvious: testing for HTTP 500 handling, etc.
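
          As a sketch of that kind of near-mechanical test (the handler here is hypothetical, purely for illustration): a wrapper that maps uncaught exceptions to HTTP 500, plus the super-obvious assertions an AI could churn out.

```python
def handle(view):
    """Call a view function; translate any uncaught error into a 500 response."""
    try:
        return 200, view()
    except Exception:
        return 500, "internal server error"

# The obvious, boilerplate-grade tests:
assert handle(lambda: "ok") == (200, "ok")       # happy path returns 200
assert handle(lambda: 1 / 0)[0] == 500           # uncaught error becomes 500
print("handler tests passed")
```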

          A technology doesn’t have to replace you to make you more efficient, just taking some work off your shoulders can boost productivity.

          • abhibeckert@lemmy.world · 10 months ago

            Well, no. That’s just plain wrong. There is only a certain amount of demand for software, like for every other product or service. That’s literally economy 101.

            But that demand isn’t going anywhere. A company with good profits is always going to be willing to re-invest a percentage of those profits in better software. A new company starting out is always going to invest whatever amount of risk they can tolerate on brand new software written from scratch.

            That money will not be spent on AI, because AI is practically free. It will always be spent on humans doing work to create software. Maybe the work looks a bit different, as in “computer, make that button green” vs button.color = 'green', but it’s still work that needs to be done, and honestly it’s not that big of an efficiency gain. It’s certainly not as big as the jump we made decades ago from punch cards to typing code on a keyboard. That jump did not result in layoffs; we have far more programmers now than we did then.

            If productivity improves, if anything that will mean more job opportunities. It lowers the barrier to entry allowing new projects that were not financially viable in the past.

            • festus@lemmy.ca · 10 months ago

              Yes there will always be demand for coders, but will there be enough demand for the current (increasing) supply? Right now the global number of software developers is growing by about a million per year (total is only 28.7 million) - this means that (very roughly) to keep salaries stable we also need demand for new software to be growing by about 3.5% per year. I know that doesn’t sound like a lot, but a decade from now you’ll need 1.4 jobs for every job now to keep up with the supply.
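
              The arithmetic behind those figures (using only the numbers in the paragraph above) checks out:

```python
developers = 28.7e6    # current global developer count (figure from the comment)
new_per_year = 1.0e6   # new developers entering per year

annual_rate = new_per_year / developers
print(f"annual supply growth: {annual_rate:.1%}")   # ~3.5%

# Compounding that for a decade gives the "1.4 jobs per current job" figure:
decade_factor = (1 + annual_rate) ** 10
print(f"ten-year factor: {decade_factor:.2f}")      # ~1.41
```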

              In the past we had new dynamics to get end-users to spend more and more time using computers and hence software (desktop PCs, video games, the internet, mobile phones, social media, etc.). At this point there’s so little time left in a consumer’s day for tech to grow into that I worry any further advancements will have to cannibalize another area; i.e., we’ve reached peak demand for software.

          • VoterFrog@kbin.social · 10 months ago

            One thing that is somewhat unique about software engineering is that a large part of it is dedicated to making itself more efficient and always has been. From programming languages, protocols, frameworks, and services, all of it has made programmers thousands of times more efficient than the guys who used to punch holes into cards to program the computer.

            Nothing has infinite demand, clearly, but the question is more whether or not we’re anywhere near the peak, such that more efficiency will result in an overall decrease in employment. So far, the answer has been no. The industry has only grown as it’s become more efficient.

            I still think the answer is no. There’s far more of our lives and the way people do business that can be automated as the cost of doing so is reduced. I don’t think we’re close to any kind of maximum saturation of tech.

            • AggressivelyPassive@feddit.de · 10 months ago

              Here again, I think, is a somewhat tech-centric view on economics.

              There is only a finite amount of automation demand, simply because human labor exists.

              Inside of our tech bubble, automation simply means more “functionality” per person per time unit. What took 10 devs a year yesterday can be done by 5 people in 6 months today. That’s all fine and dandy, but at some point software clashes with the hard reality of physics. Software doesn’t produce anything; it’s often just an enabler for physical production. Lube, or grease.

              Now, that production obviously can be automated tremendously as well, but with diminishing returns. Each generation of automation is harder than the one before. And each generation has to compete with a guy in Vietnam/Kenya/Mexico. And each generation also has to compete with its own costs.

              Why do you think chips are so incredibly expensive lately? R&D costs are going through the roof, production equipment is getting harder and harder to produce, and due to the time pressure, you have to squeeze as much money as possible out of your equipment. So prices go up. But that can’t go on forever; at some point the customers can’t justify/afford the expense. So there’s a kind of feedback loop.

              • VoterFrog@kbin.social · 10 months ago

                Yes, what I’m saying is that lower costs for software, which AI will help with, will make software more competitive against human production labor. The standard assumption is that if software companies can reduce the cost of producing software, they’ll start firing programmers but the entire history of software engineering has shown us that that’s not true as long as the lower cost opens up new economic opportunities for software users, thus increasing demand.

                That pattern stops only when there are no economic opportunities to be unlocked. The only way I think that happens is when automation has become so prevalent that further advancement has minimal impact. I don’t think we’re there yet. Labor costs are still huge and automation is still relatively primitive.

          • MagicShel@programming.dev · 10 months ago

            I agree with you on the testing point, but I disagree with everything else that you’ve said. I didn’t say infinite demand, did I? I said there is an infinite need for coding. Just in the realm of business software, there will never come a day when there isn’t one more business requirement, and there will never come a day when there isn’t a need to translate that requirement into code. Is that literal infinity or only effectively so? I don’t care. I’m not writing an academic thesis here.

            Coding demand is not constrained by the amount of work that needs to be done, but by money (and, to a degree, organization, because larger groups become less efficient), because there will always be more coding that needs to be done.

      • MajorHavoc@programming.dev · 10 months ago

        Currently, the bottleneck is available developers

        The bottleneck is developers competent to code in each specific domain. We have nowhere near enough, even for the most popular subject - writing an eCommerce store.

        Source: Let’s all play a quick round of “think of every eCommerce site you don’t hate.” Okay, now let’s all try to think of a second one. Anyone still trying for three?

        The specific domain list is, I suspect, currently uncountable.

        Twenty five years from now, I suspect we can get a ballpark approximate count of developer specializations by starting at the top of the popular WordPress plugins list, and stopping when we start hitting obvious duplicates.

        After that, we can compare that count to the known number of tailored-to-purpose AIs available.

        Subtract the two, and that’s the remaining runway on “AI is taking all of our jobs”. (If feeling generous, add a bit of leeway for each AI also needing some time to stop sucking.)

        Source: I’m part of the AI problem/solution. It’s fun times, but y’all excited folks are going to be disappointed for a long while, unless you’re incredibly satisfied with what you already have.

        Even when we reach true sentient AI (which may still be impossible), there will be an uphill road to teach it each domain we want it to work in.

        Teaching humans is difficult. Teaching computers is difficult. Teaching a computer that thinks like a human might be much easier.

        Perhaps it will teach itself at an incredible speed. Some humans do. But history suggests that getting each AI up to speed, in each domain we need it for, will still be…difficult.

  • mspencer712@programming.dev · 10 months ago

    As a professional C# developer since 2012, I’d say a programmer needs four kinds of knowledge. As an organizational user of Github Copilot for a couple months, I’d say AI tools can help with one, maybe two of those.

    Understanding language and syntax, so you can communicate the ideas in your head to the machine accurately: AI is fairly good at this, will certainly get a lot better.

    Understanding algorithms and data structures, well enough to compare and contrast, and choose the most appropriate ones for each circumstance: AI can randomly select something, unless it’s a frequently solved problem. I don’t expect this to get better except for the most repetitive of coding tasks.
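
    As a toy sketch of why that kind of judgment matters (hypothetical example, standard library only): the exact same membership test, where the data-structure choice, not the syntax, is the whole difference.

```python
import timeit

# Membership tests: O(n) scan in a list vs O(1) average hash lookup in a set.
items_list = list(range(100_000))
items_set = set(items_list)

t_list = timeit.timeit(lambda: 99_999 in items_list, number=100)
t_set = timeit.timeit(lambda: 99_999 in items_set, number=100)

print(f"list: {t_list:.4f}s  set: {t_set:.6f}s")  # set wins by orders of magnitude
```

    An AI that pattern-matches on frequently solved problems may pick either form; knowing which one fits the circumstance is the human part.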

    Understanding your execution environment and adapting your solutions to use it well: I don’t see the current generation of AI tools ever approaching this. I don’t think they have context for how a piece of code is used, when trying to learn from it. One size fits all is not a great approach.

    Understanding your customer’s needs and specific problems, and creating products, not code. Problem domains and solutions are a business’s entire reason for existence. This is all kept confidential (and outside the reach of an AI training data set) for competitive reasons. As a human employee, you get to peek behind the curtain and learn these things yourself.

    • Kbin_space_program@kbin.social · 10 months ago

      I’ll add that one of the biggest hurdles/flaws in the fundamental architecture of current AI is its inability to reapply a specific use case in a slightly different scenario.

      This was a flaw that came out of self-driving development and, as far as I understand, it plays a major role in why the hype around that died off really fast. Adding more to the model can’t fix that flaw.

      • AggressivelyPassive@feddit.de · 10 months ago

        Well, look at 99% of business software. It’s always the same.

        “Tech” in the sense of FAANG and so on isn’t that large of an employer worldwide and even within them, a significant portion is regular old business software. Billing hours in AWS and billing phone calls isn’t that different.

        I’d argue that if an AI could create a form and transform its data into a given schema to push somewhere, 20% of all developers would need a new job.

        • mspencer712@programming.dev · 10 months ago

          And those jobs are critical to the process of making new developers.

          An important part of my education - the part that grad school can’t teach you, you have to learn it on the job - was being new and terrible, grinding on a simple problem and feeling like a waste of money. Any of the experienced guys sitting behind me could have done this thing in a few hours but I’ve been working on it for a week. “What’s the point? Any minute now they’re going to tap me on the shoulder and tell me I’m done, it’s time to go find another job.”

          But that never happened.

          Those early problems weren’t fun. At home I would have never chosen to work on them. I’d leave them for someone else. “But now that I’m collecting a paycheck for it, this isn’t up to me. I have to work on it. I can’t give up. I can ask for help, but I need to show my peers that I belong. I can solve difficult problems. I can persevere.”

          As a mediocre professional developer, I had to struggle to learn that. I wasn’t getting far on my own, without mentorship and motivation. Homework, pursuing degrees, wasn’t getting me there. (And even now, I seem to have about two weeks of attention span, for projects at home.)

          • AggressivelyPassive@feddit.de · 10 months ago

            Those jobs, or rather that kind of task, won’t go away. It will just become different.

            Think about how completely different our modern development looks compared to, say, the 80s. Nobody writes assembly anymore, so nobody learns the basics! Nobody reads the manual anymore; instead we use autocomplete and Google. Remembering the arcane runes of C used to be exactly the kind of “hazing grunt work” you’re talking about. That skill used to be valuable, but how many of us can honestly say we remember more than a handful of nontrivial method signatures?

  • vrighter@discuss.tchncs.de · 10 months ago

    Lol. Of course there is. AI cannot code. It’s a glorified autocomplete that mostly gets things subtly wrong, so you’ll spend more time trying to understand code you didn’t write and looking for bugs than if you had written and understood it yourself.

  • BrianTheeBiscuiteer@lemmy.world · 10 months ago

    I don’t know what kind of sysadmin stuff you do but I don’t think it has to be a complete shift from infrastructure to software. There are lots of tools out there to help automate provisioning and maintenance. Terraform, Ansible, Puppet, and Chef are a few, and to a certain extent they’re really a language of their own. I find it scary how much infrastructure is managed by hand, by Bash scripts, or software engineers that don’t really grasp infrastructure.
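
    For a flavor of what those tools look like, here’s a minimal Ansible sketch (the host group, file names, and the nginx example are all made up for illustration): a declarative, repeatable replacement for the hand-run steps and Bash scripts mentioned above.

```yaml
# Hypothetical playbook: install a package, template a config, restart on change.
- name: Configure web servers
  hosts: webservers
  become: true
  tasks:
    - name: Install nginx
      ansible.builtin.package:
        name: nginx
        state: present

    - name: Deploy site config
      ansible.builtin.template:
        src: site.conf.j2        # hypothetical template file
        dest: /etc/nginx/conf.d/site.conf
      notify: Restart nginx

  handlers:
    - name: Restart nginx
      ansible.builtin.service:
        name: nginx
        state: restarted
```

    The point is that this is code: it can be reviewed, versioned, and rerun, which is exactly the skill overlap between sysadmin and developer work.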

  • fidodo@lemmy.world · 10 months ago

    AI will trivialize the day-to-day work of programming, but the really hard part of programming is being able to put together complex systems in a maintainable way. It’s really more of a project-structure problem than a programming problem. The best way to future-proof yourself is to focus on the higher-level architectural challenges and putting together complex infrastructure, as opposed to learning things like sorting algorithms (although I don’t think interviewing has caught up with that yet). LLM-powered systems may eventually get to the point where they can even replace architectural tasks, but at that point almost all office jobs will be obsolete.

    EDIT: Another thought: getting out of tech now is like getting out of tech right before the Internet took off. Yes, AI will replace a ton of jobs and there’s going to be a big reckoning, but there’s a shit ton of work to get these systems in place in a reliable way. The next gen of AI is super capable, but it’s also pretty jank, and getting it to work to its capability will require a ton of work, which provides a ton of opportunity to get in on companies that will become big when they do it well.

  • Falcon@lemmy.world · 10 months ago

    There’s hardly any work for programmers and there hasn’t been for years, I’d say over a decade.

    There is a massive demand for professionals, scientists and engineers who are tech literate and know programming.

    Go develop hard skills in some field and use programming to set yourself apart.

  • abhibeckert@lemmy.world · 10 months ago

    AI is a tool for coders to use and it will never make coders obsolete. As someone trying to enter the industry, my advice is lean into it and use AI as a learning tool.

    Having said that, it is pretty hard to find a job in the industry right now, due to all the layoffs. Those layoffs are related to COVID, not AI, so it should be temporary… but in the meantime you’re likely to be competing for jobs with people who have decades of experience.

    I believe there is still a shortage of developers long term, but short term not so much.

    • RandomDevOpsDude@programming.dev · 10 months ago

      I find it very difficult to recommend generative AI as a learning tool (specifically for juniors), as it often spits out terrible (or even straight-up non-working) code that could be mistaken for “good” code. I think the more experienced a dev is, the better it works as something like a pair programmer.

      The problem is it cannot go back and correct/improve already-generated output unless prompted to. It is getting better and better, but for the most part it is still an overly glorified template generator that often includes import statements for packages that don’t exist, one-off functions that could have been inlined (it cannot go back and correct itself), and numerous garbage variables that are referenced only once and take up space for seemingly no good reason.
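
      A sketch of the bloat being described (an invented example, not real model output): a one-off helper and a single-use variable that a reviewer would immediately inline.

```python
# Generated-style version: one-off helper plus a variable referenced exactly once.
def add_tax(price):
    tax_rate = 0.08            # single-use "garbage" variable
    return price * (1 + tax_rate)

def total_generated(prices):
    result = []
    for p in prices:
        taxed = add_tax(p)     # helper called from exactly one place
        result.append(taxed)
    return sum(result)

# What it inlines down to after human review -- same behavior, half the names:
def total(prices, tax_rate=0.08):
    return sum(p * (1 + tax_rate) for p in prices)

assert abs(total_generated([10, 20]) - total([10, 20])) < 1e-9
```

      The generator can produce either version, but it won’t volunteer the second one unless someone who knows the difference asks for it.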

      I’m mainly speaking about GPT-4; Copilot is better, but both have licensing concerns (where did it get this code from?) if you are creating something real and not just for fun.

  • MajorHavoc@programming.dev · 10 months ago

    SysAdmin work is draining me, though, and I want to pick back up programming and see if I can make a career out of it

    SysAdmin plus Dev skills makes a DevOps Specialist which is still in high demand. I suppose it might feel too much like the SysAdmin work.

    Am I going to be obsolete after a year? Five years?

    I was advised not to bother decades ago based on similar logic, but I’m still very much in demand.