Just use C. It solves all those problems, given that the most complicated feature is pointers, and those aren’t hard to understand.
It’s about a comic called “Loss”, which has the same character positions as the rectangles.
Fuck, now you made me have a loss as well. I am like -900 points in The Game.
Thank you all for offering advice. I did eventually get it working and repaired all the packages.
I couldn’t figure out how to mount /dev/sda1, so I ran pacman -Syu first. Then I mounted it once I figured it out, and now pacman says there is nothing to do.
Did systemd or grub not even show up?
Will this work from Slax Linux? I am sorry if I seem like I can’t fix the issue myself, seeing as you have given me the resources to do so, but what would be the exact steps to do that?
Thanks I might try that out later.
I ran it on my PC with a GTX 1070, with CUDA enabled and compiled with the CUDA compile hint, but it ran really slowly. How do you get it to run fast?
I’m not saying it’s unhealthy, I am just saying they don’t help if they don’t pay above the cost of living. Sure, you can get a job paying 15 USD, but that isn’t even going to cover rent + utilities. So for now you’re stuck with your job and don’t have the option to switch.
That only works when workers are less replaceable and desperate. There are lots of open job positions today, but most pay less than the cost of living.
I don’t know why this made me giggle so much.
Foo Fighters are a great band.
Just to warn you, it might be very bulky, and the model the script is downloading is deprecated, so you’ll have to find a different .gguf model on Hugging Face. Try to find a lightweight .gguf model and replace the MODEL variable with its name, as well as the rest of the link. Or just download it from a browser and move it into the models folder.
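Roughly, swapping the model looks like this. The repo path and filename below are placeholders, not a real model I’m recommending — substitute whatever lightweight .gguf you actually pick:

```shell
#!/usr/bin/env bash
# Placeholder repo and filename (assumptions) -- replace with the
# lightweight .gguf you found on Hugging Face.
REPO="someuser/some-tiny-model-GGUF"
MODEL="some-tiny-model.Q4_K_M.gguf"
URL="https://huggingface.co/$REPO/resolve/main/$MODEL"
# The script's download step would then be something like:
#   curl -L -o "models/$MODEL" "$URL"
echo "$URL"
```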
I believe Llama is open source, but I’m not sure how complicated it is to get running locally. Never mind: https://replicate.com/blog/run-llama-locally
You can probably write a bash wrapper around it that feeds in “Can you summarize this text: (text here)” by setting the PROMPT variable in the bash script. (Probably just do PROMPT=“Can you summarize this text: $1”.) (Obviously don’t recompile every time, so remove the clone, build, and download code.)
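As a rough sketch, something like the script below. The binary name `./main`, the flags, and the model path are assumptions (llama.cpp has renamed things between versions), so check your own build:

```shell
#!/usr/bin/env bash
# summarize.sh -- hypothetical wrapper around an already-built llama.cpp binary.
# Binary name, flags, and model path are assumptions; adjust for your checkout.
MODEL="./models/your-model.gguf"   # whatever .gguf you downloaded

# Build the summarization prompt from the first script argument.
build_prompt() {
  printf 'Can you summarize this text: %s' "$1"
}

PROMPT=$(build_prompt "$1")
# Uncomment once llama.cpp is built; -n caps the response length:
#   ./main -m "$MODEL" -p "$PROMPT" -n 256
echo "$PROMPT"   # shows the prompt that would be sent
```

Then you’d run it like `./summarize.sh "$(cat article.txt)"`.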
You can use tldr for man pages, but for generic text I don’t know. You would probably need an LLM.
No, I have never used any of those closed-source options. Why would I want cloud services when I have a perfectly good ESP32 lying around? And if I get worried about the vendor-provided system libraries, I can just buy a Raspberry Pi or something.
I agree, unless you’re doing low-level stuff where you need absolute control, you should use a modern language with proper abstractions, just to save time. Most use cases for C++ can be covered by Rust or Go, as they aren’t saddled with years of tech debt and bloat from maintaining backwards compatibility.