• Valmond@lemmy.world
    2 months ago

    try writing it in Assembly

    Small error, the game crashes and takes the whole PC with it, burning a hole in the ground.

    • Flying Squid@lemmy.world
      2 months ago

      It was really easy to crash an Apple II game and get into the assembler. And my goodness am I glad I didn’t destroy my computer as a kid randomly typing things in to see what would happen.

      • Valmond@lemmy.world
        2 months ago

        I remember old Apples, we had to use them when learning to program. There were two types: one with the OS on a diskette, one with a small hard drive. They randomly showed a large bomb in the middle of the screen and you had to reload the OS. Probably the compiler that broke everything.

      • addie@feddit.uk
        2 months ago

        Writing in ASM is not too bad provided that there’s no operating system getting in the way. If you’re on some old 8-bit microcomputer where you’re free to read directly from the input buffers and write directly to the screen framebuffer, or if you’re doing embedded where it’s all memory-mapped IO anyway, then great. Very easy, makes a lot of sense. For games, that era basically ended with DOS, and VGA-compatible cards that you could just write bits to and have them appear on screen.

        Now, you have to display things on the screen by telling the graphics driver to do it, and so a lot of your assembly is just going to be arranging all of your data according to your platform’s C calling convention and then making syscalls, plus other tedious-but-essential requirements like making sure the stack is aligned whenever you make a jump. You might as well write macros to do that since you’ll be doing it a lot, and if you’ve written macros to do it then you might as well be using C instead, since most of C’s keywords and syntax map very closely to the ASM that would be generated by macros.

        A shame - you do learn a lot by having to tell the computer exactly what you want it to do - but I couldn’t recommend it for any non-trivial task any more. Maybe a wee bit of assembly here-and-there when you’ve some very specific data alignment or timing-sensitive requirement.

        • henfredemars@infosec.pub
          2 months ago

          I like ASM because it can be delightfully simple, but it’s just not very productive especially in light of today’s tooling. In practice, I use it only when nothing else will do, such as for operating system task schedulers or hardware control. It’s nice to have the opportunity every once in a while to work on an embedded system with no OS but not something I get the chance to do very often.

          On one large ASM project I worked on (an RTOS), it was exactly as you described. You end up developing your own version of everything a C compiler could have done for you for free.

    • Capt. Wolf@lemmy.world
      2 months ago

      I tried decades ago. I grew up learning BASIC and then C, so how hard could it be? For a 12-year-old with no formal teacher and only books to go off of, it turns out: very. I’ve learned a lot of programming languages on my own since, but I still can’t make heads or tails of assembly.

      • zod000@lemmy.ml
        2 months ago

        Sounds very similar to my own experience though there was a large amount of Pascal in between BASIC and C.

        • Capt. Wolf@lemmy.world
          2 months ago

          Yeah, I skipped Pascal, but it at least makes sense when you look at it. By the time my family finally jumped over to PC, C was more viable. Then in college, when I finally had the opportunity to formally learn, it was just C++ and HTML… We didn’t even get Java!

          • zod000@lemmy.ml
            2 months ago

            I had used like four different flavors of BASIC by the time I got an IBM-compatible PC, but I ended up getting on the Borland train and wound up with Turbo Pascal, Turbo C, and Turbo ASM (and Turbo C++, which I totally bounced off of). I was in the first class at my school that learned Java in college. It was the brand new version 1.0.6! It was so rough and new, but honestly I liked it. It’s wildly different now.

      • Dubiousx99@lemmy.world
        2 months ago

        Assembly requires knowledge of the CPU architecture, its pipeline, and memory addressing. Those concepts are generally abstracted away in modern languages.

        • WolfLink@sh.itjust.works
          edit-2
          2 months ago

          You don’t need to know the details of the CPU architecture and pipeline, just the instruction set.

          Memory addressing is barely abstracted in C, and indexing in some form of list is common in most programming languages, so I don’t think that’s too hard to learn.

          You might need to learn the details of the OS. That would get more complicated.

          • Dubiousx99@lemmy.world
            2 months ago

            I said modern programming languages. I do not consider C a modern language. The point still stands about abstraction in modern languages: you don’t need to understand memory allocation to code in them, though the understanding will greatly benefit you.

            I still contend that knowledge of the CPU pipeline is important, or else you’ll wind up with code that constantly stalls the pipeline. I guess you could say you can code in assembly without knowledge of the CPU architecture, but you won’t be writing any code that runs better than the output of other languages’ compilers.

  • UnderpantsWeevil@lemmy.world
    2 months ago

    Step 1: Begin writing in Assembly

    Step 2: Write C

    Step 3: Use C to write C#

    Step 4: Implement Unity

    Step 5: Write your game

    Step 6: ???

    Step 7: Profit

  • mlg@lemmy.world
    2 months ago

    I wanna see someone make a GPU accelerated game in assembly.

    Just throw the Vulkan and DX12 C APIs in the garbage and do it all yourself lol.

  • bratorange@feddit.org
    edit-2
    2 months ago

    Don’t want to be that guy, but you can actually use libraries in assembly, and you probably want to, as otherwise you have no good way of interacting with the OS.

    • jdr@lemmy.ml
      2 months ago

      In fact, Chris Sawyer did use C for the purpose of linking the OS libraries necessary for windowing, rendering, sound, etc.

  • MonkeMischief@lemmy.today
    2 months ago

    I love Roller Coaster Tycoon. It’s absolutely crazy how he managed to write a game in a way many wouldn’t even attempt in those days, but it’s not just a technical feat, it’s a creative masterpiece that’s still an absolute blast to play.

    It still blows my mind how smoothly it gives the illusion of 3D and physics, yet it can run on almost anything.

    OpenRCT brings a lot of quality of life and is often the recommended way to play today, but the original RCT will always deserve a spot on any “Best Games of All Time” list.

    • dai@lemmy.world
      2 months ago

      It was even ported to the original Xbox. I remember the game’s total file size being incredibly small compared to most other titles on that system.

        • ziggurat@lemmy.world
          2 months ago

          Like the classic: you inherit a broken code base and aren’t allowed by the owner to rewrite it from scratch. So you have to spend extra time making each part work without the others working. And before you’re finished, the customer says they have something else for you to do.

          • derpgon@programming.dev
            2 months ago

            That’s when you start introducing modules that have the least impact on the legacy code base. Messaging is a good place to start, but building new code next to the existing one and slowly refactoring whenever you’ve got time to spare is at least a bearable way to go about it.

            • drphungky@lemmy.world
              2 months ago

              Shhhh you just described iterative development. Careful not to be pro agile, or the developers with no social skills will start attacking you for being a scrum master in disguise!

              • derpgon@programming.dev
                2 months ago

                Fuck agile, or scrum, or whatever it is called. I just look at the issues and pick whatever I feel like doing. Kanban for life.

          • jabjoe@feddit.uk
            2 months ago

            Programmers love to rewrite things, but it’s often not a good idea, let alone good for a business. Old code can be ugly because it is covered with hard-won lessons and compromises. A rewrite can be the right thing, but it’s not to be taken lightly. It needs to be budgeted for, signed off on, and carefully planned. The old system needs to be stable enough to continue until the new system can replace it.

            • ziggurat@lemmy.world
              2 months ago

              Okay, I’ll tell you: in this situation, the code never really worked outside of the demo stage. It was written in bash + Ansible + Terraform + Puppet, designed to use ssh from a Docker container and run stages of the code on different servers. Some of it supposedly worked on his computer, but when it failed to run without him clicking the buttons, and I read through each part, I can promise you that it never worked.

              I didn’t say “broken code base” because I didn’t like the code; I meant that it didn’t work.

              • jabjoe@feddit.uk
                2 months ago

                The whole point of Docker is to solve the “works on my computer” problem by shipping the developer’s hacked-up OS along with the app (rather than fixing it and dealing with dependencies like a grown-up).

                It’s a bit special for it to still be broken. If it flat out doesn’t work, at all, then it may well be sunk cost fallacy to keep working on it. There is no universal answer, but there is a developer tendency to rewrite.

                • ziggurat@lemmy.world
                  2 months ago

                  I’ll concede that his point in using Docker was to avoid the “it works on my computer” problem. It was literally one of his talking points in his handover meeting. But that is not the problem Docker is trying to solve, and it’s not its strength.

                  Docker and similar container software makes many things very convenient, and has uses far outside its originally intended usage.

                  And in this situation, we wanted stable package versions and a simpler, uniform setup. You don’t get stable package versions, because Docker doesn’t provide reproducible builds (and he didn’t do the work around that), and it is not a simpler setup when you want to use the host’s ssh agent with ssh inside Docker, which requires different steps for different distros and for Mac, and I don’t know if Windows would have worked at all. Sharing your ssh agent into the Docker image isn’t stable either: even if you set it up, it isn’t sure to work after the next reboot, and it can be very difficult on some Linux distros due to permissions, etc.

                  Then I ended up putting it on a VM that was already used for utilities. If I were to do it today, I would probably use Nix to run these version-sensitive programs in a stable, reproducible environment that can run on any Linux distro, including in Docker.

                  But the program had many more issues, like editing YAML files by catting them, piping them into tac, piping that into sed, and then into tac again… And before you say you could do that with just one sed command: sure, but the sane solution is to use yq. Let’s just say that was the tip of the iceberg.

                  Oh, and I just have to note: it claimed working features, but there was no way for that code to be executed, and when I actually tried to hook it up, I can’t believe it ever fully worked.

    • ayyy@sh.itjust.works
      2 months ago

      The game Roller Coaster Tycoon was famously hand-written in raw CPU instructions (called assembly language). It’s only one step removed from writing literal ones and zeros. Normally, computers are programmed using a human-friendly language which is then “compiled” into CPU instructions, so that humans don’t have to deal with the tedium and complication of writing CPU instructions directly.

        • ericbomb@lemmy.world
          2 months ago

          To send the point home even more, this is how in python you make a line of text display:

          print("Hello World")

          This is the same thing, in assembly (according to a blog I found; I can’t read this, I am not built better):

            org  0x100        ; .com files always start 256 bytes into the segment
          
              ; int 21h is going to want...
          
              mov  dx, msg      ; the address of our message in dx
              mov  ah, 9        ; ah=9 - "print string" sub-function
              int  0x21         ; call dos services
          
              mov  ah, 0x4c     ; "terminate program" sub-function
              int  0x21         ; call dos services
          
              msg  db 'Hello, World!', 0x0d, 0x0a, '$'   ; $-terminated message
          

          But Python turns that cute little line up top into that mess at the bottom.

          I like python. Python is cute. Anyone can read python.

          • pivot_root@lemmy.world
            2 months ago

            That assembly is for a DOS application. It would be more verbose for a modern Linux or Win32 application and probably require a linker script.

            But python turns that cute little line up top, into that mess at the bottom.

            Technically, not quite. Python is interpreted, so it’s more like “call the print function with this string parameter” gets fed into another program, which calls its own functions to make it happen.

      • OsrsNeedsF2P@lemmy.ml
        edit-2
        2 months ago

        To further emphasize this, I had an assembly course in university. During my first lab, the instructor told us to add a comment explaining what every line of assembly code did, because if we didn’t, we would forget what we wrote.

        I listened to his advice, but one day I was in a rush, so I didn’t leave comments. I swear, I looked away from the computer for like 2 minutes, looked back, and had no idea what I wrote. I basically had to redo my work.

        It is not that much better than reading 1s and 0s. In fact, in that course we spent a lot of time converting 1s and 0s to assembly and back, by hand. I got pretty good at it, but would never even think of writing a game. I would literally rather create my own compiler and programming language than write a game in assembly.

        • pivot_root@lemmy.world
          2 months ago

          I’m probably completely insane and deranged, but I actually like assembly. With decent reverse engineering software like Ghidra, it’s not terribly difficult to understand the intent and operation of isolated functions.

          Mnemonics for the amd64 AVX extensions can go the fuck right off a bridge, though. VCVTTPS2UQQ might as well be my hands rolling across a keyboard, not a truncated conversion from packed single-precision floats into packed unsigned quadword integers.

          • emergencybird@lemmy.world
            2 months ago

            I had a course in uni that taught us assembler on z/OS. My advisor told me most students fail the course on the first try because it was so tough, and my prof for that course said that if any of us managed to get at least a B, he’d write us a rec letter for graduate school. That course was the most difficult and most fun I’ve ever had: I learned how to properly use registers to store values for calculations, and I learned how to use subroutines. I earned myself that B and went on to take the follow-up course, which was COBOL. You’re not crazy; I yearn to go back to doing low-level programming. I’m mostly doing Ruby for my job, but I think my heart never left assembler hahaha

          • MonkderVierte@lemmy.ml
            edit-2
            2 months ago

            Ah yes, there was a guy in our tech school class who used to code golf in assembly. He was a crack at math and analytics too, which might explain it somewhat. Well, everyone is different, I guess.