As someone who spends time programming, I of course find myself in conversations with people who aren’t as familiar with it. It doesn’t happen all the time, but these discussions can lead to people coming up with some pretty wild misconceptions about what programming is and what programmers do.
- I’m sure many of you have had similar experiences. So, I thought it would be interesting to ask.
A lot of people completely overrate the amount of math required. Like, it’s probably been a week since I used an arithmetic operator.
At the same time, I find it amazing how many programmers never make the cognitive jump from the “playing with legos” mental model to “software is math”.
They’re both useful, but never understanding the latter is a bit worrying. It’s not about using math, it’s about thinking about code and data in terms of mappings between arbitrary data domains. It’s a much more powerful abstraction than the legos and enables you to do a lot more with it.
For anybody who finds themselves in this situation I recommend an absolute classic: Defmacro’s “The nature of Lisp”. You don’t have to make it through the whole thing and you don’t have to know Lisp, hopefully it will click before the end.
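To make the “mappings between data domains” idea concrete, here’s a toy sketch in Python (the domains and all the names are invented for illustration): each function is a mapping from one data domain to the next, and the program is just their composition.

```python
# Toy illustration: a program as a pipeline of mappings between data domains,
# rather than a pile of bricks. Each function is a mapping from one domain
# to the next.

def parse(line: str) -> tuple[str, int]:
    """Map the 'raw CSV line' domain to the '(name, score)' domain."""
    name, score = line.split(",")
    return name.strip(), int(score)

def grade(record: tuple[str, int]) -> tuple[str, str]:
    """Map '(name, score)' to '(name, pass/fail)'."""
    name, score = record
    return name, "pass" if score >= 50 else "fail"

lines = ["ada, 91", "bob, 42"]
print([grade(parse(l)) for l in lines])
# → [('ada', 'pass'), ('bob', 'fail')]
```

Nothing deep, but the whole program is readable as a composition of mappings, which is the mental model being argued for.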
the “playing with legos” mental model
??
Function/class/variables are bricks, you stack those bricks together and you are a programmer.
I just hired a team to work on a bunch of Power platform stuff, and this “low/no-code” SaaS platform paradigm has made the mentality almost literal.
I think I misunderstood lemmyvore a bit, reading some criticism into the Lego metaphor that might not be there.
To me, “playing with bricks” is exactly how I want a lot of my coding to look. It means you can design and implement the bricks, connectors and overall architecture, and end up with something that makes sense. If running with the metaphor, that ain’t bad, in a world full of random bullshit cobbled together with broken bricks, chewing gum and exposed electrical wire.
If the whole set is wonky, or people start eating the bricks instead, I suppose there’s bigger worries.
(Definitely agree on “low code” being one of those worries, though - turns into “please, Jesus Christ, just let me write the actual code instead” remarkably often. I’m a BizTalk survivor and I’m not even sure that was the worst.)
My take was that they’re talking more about a script kiddy mindset?
I love designing good software architecture, and like you said, my object diagrams should be simple and clear to implement, and work as long as they’re implemented correctly.
But you still need knowledge of what’s going on inside those objects to design the architecture in the first place. Each of those bricks is custom made by us to suit the needs of the current project, and the way they come together needs to make sense mathematically to avoid performance pitfalls.
Removed by mod
Read that knowing nothing of lisp before and nothing clicked tbh.
When talking about tools that simplify writing boilerplate, it only makes sense to me to call them code generators if they generate code for another language. Within a single language, a tool that simplifies complex tasks is just a library, or could be implemented as one. I don’t see the point about programmers not utilizing ‘code generation’ because it requires external tools. They say that if such tools existed natively in the language:
we could save tremendous amounts of time by creating simple bits of code that do mundane code generation for us!
If code is to be reused you can just put it in a function, and doing that doesn’t take more effort than putting it in a code generation thingy. They preach about how the XML script (and Lisp, I guess) lets you introduce new operators and change the syntax tree to make things easier, but don’t acknowledge that functions, operator overloading, etc. accomplish the same thing, only with different syntax. Then they go on to say this:
We can add packages, classes, methods, but we cannot extend Java to make addition of new operators possible. Yet we can do it to our heart’s content in XML - its syntax tree isn’t restricted by anything except our interpreter!
What difference does it make whether it’s the syntax tree that changes depending on your code, or the call stack? Of course if you define an operator (apparently also called a function in Lisp) somewhere else, it’ll look better than doing each step one by one as in the Java example. Treating functions as keywords feels like a completely arbitrary decision. Honestly, they could claim Lisp has no keywords/operators and it would be more believable. If there is to be a syntax tree, the parentheses seem like a better choice for what shapes it than the functions, which just determine what happens at each step like any other function. And even going by their definition, I prefer a syntax that does a limited number of things in a more visually distinct way over one that does limitless things all in the same monotonous way.
Lisp comes with a very compact set of built in functions - the necessary minimum. The rest of the language is implemented as a standard library in Lisp itself.
Isn’t that how every programming language works? It feels unfair to raise this as an advantage against a markup language.
Data being code and code being data sounded like it was leading to something interesting, until it was revealed that functions are a separate type and that you need to mark non-function lists with an operator so they don’t get interpreted as functions. Apart from the visual similarity in how it’s written, due to the syntax limitations of the language, data doesn’t seem any more like code in Lisp than evaluating strings in Python. If the data is valid code it’ll work; otherwise it won’t.
The only compelling part was where the same compiler for the code is used to parse incoming data and perform operations on it, but even that doesn’t feel like a game changer unless you’re forbidden from using libraries for parsing.
Finally, I’m not sure how the article relates to code being math either. It just felt like inventing new words for existing things and insisting that they’re different. Or maybe I just didn’t get it at all. Sorry if this was uncalled for; it’s just that I had expected more after being promised enlightenment by the article.
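For what it’s worth, the Python comparison above can be made concrete. A rough sketch (not from the article; the transformer class name is made up) of the difference between evaluating a string and treating code as a data structure you can rewrite first, using Python’s standard `ast` module:

```python
import ast

expr = "1 + 2 * 3"

# "Data that happens to be valid code": evaluate the string directly.
print(eval(expr))  # → 7

# "Code as data": parse into a tree you can inspect and rewrite first.
tree = ast.parse(expr, mode="eval")

class SwapAddToMul(ast.NodeTransformer):
    """Rewrite every '+' node into '*' before evaluating."""
    def visit_BinOp(self, node):
        self.generic_visit(node)
        if isinstance(node.op, ast.Add):
            node.op = ast.Mult()
        return node

rewritten = ast.fix_missing_locations(SwapAddToMul().visit(tree))
print(eval(compile(rewritten, "<ast>", "eval")))  # 1 * (2 * 3) → 6
```

It’s a pale imitation of Lisp macros, but it shows the distinction the article gestures at: in the second case the program manipulates a tree, not a string.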
Negl, I absolutely did this when I was first getting into it, especially with langs where you actually have to import something to access “higher-level” math functions. All of my review materials have me making arithmetic programs, but none of it goes over, like, 9th-grade math tops. (Unless you’re fucking with satellites or lab data, but… I don’t do that.)
On the other hand, in certain applications you can replace a significant amount of programming ability with a good understanding of vector maths.
Tbf, that’s probably because most CS majors at T20 schools get a math minor as well because of the obscene amount of math they have to take.
We must do different sorts of programming…
There’s a wide variety of types of programming. It’s nice that the core concepts can carry across between the disparate branches.
If I’m doing a particular custom view I’ll end up using
sin cos tan
for some basic trig, but that’s about as complex as any mobile CRUD app gets. I’m sure there are some math-heavy mobile apps, but they’re the exception that proves the rule.
You should probably use matrices rather than trig for view transformations. (If your platform supports it and has a decent set of matrix helper functions.) It’ll be easier to code and more performant in most cases.
I mean, I’m not sure how to use matrices to draw the path of 5 out of 6 sides of a hexagon given a specific center point, but there are some surprisingly basic shapes that don’t exist in Android view libraries.
I’ll also note that this was years ago before android had all this nice composable view architecture.
Hah, yeah, a hexagon is a weird case. In my experience, devs talking about “math in a custom view” has always meant simply “I want to render some arbitrary stuff in its own coordinate system.” Sorry, my assumption went too far. 😉
Yeah it was a weird ask to be fair.
Thankfully android lets you calculate those views separately from the draw calls so all that math was contained to measurement calls rather than calculated on draw.
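For what it’s worth, the partial-hexagon case from this exchange only takes a few lines of trig. A platform-neutral sketch in Python (the function and its parameters are invented for illustration):

```python
import math

def polygon_points(cx, cy, r, sides=6):
    """Vertices of a regular polygon centered on (cx, cy), via basic trig."""
    step = 2 * math.pi / sides
    return [(cx + r * math.cos(i * step), cy + r * math.sin(i * step))
            for i in range(sides)]

pts = polygon_points(0.0, 0.0, 10.0)

# Drawing 5 of the 6 sides just means omitting the closing segment
# (the one from the last vertex back to the first):
path = list(zip(pts, pts[1:]))
print(len(path))  # → 5
```

On Android this would translate to `moveTo`/`lineTo` calls on a `Path` inside the measurement pass, as described above.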
Sometimes when people see me struggle with a bit of mental maths or use a calculator for something that is usually easy to do mentally, they remark “aren’t you a programmer?”
I always respond with “I tell computers how to do maths, I don’t do the maths”
Which leads to the other old saying, “computers do what you tell them to do, not what you want them to do”.
As long as you don’t let it turn around and let the computer dictate how you think.
I think it was Dijkstra who complained in one of his essays about naming uni departments “Computer Science” rather than “Computing Science”. He said it’s a symptom of a dangerous slope where we build our work as programmers around specific computer features, or even specific computers, instead of using them as tools that enable our minds to ask and verify more and more interesting questions.
The scholastic discipline deserves that kind of nuance and Dijkstra was one of the greatest.
The practical discipline requires you to build your work around specific computers. Much of the hard-earned domain knowledge I’ve gained as a staff software engineer would be useless if I changed the specific computer it’s built around: Android OS. An Android phone has very specific APIs, code patterns and requirements. Being ARM, even its underlying architecture is fundamentally different from the majority of computers (for now; we’ll see how much the M1-style ARM arch becomes the standard for anyone other than Mac).
If you took a web dev with 10 YOE, dropped them into my Android code base and said “ok, write”, they should get the structure and basics, but I would expect them to make mistakes common to a beginner in Android, just as I would make mistakes common to a junior web dev if I were stuck in a web dev environment and told to write.
It’s all very well and good to learn the core of CS: the structures used and why they work. Classic algorithms and when they’re appropriate. Big O and algorithmic complexity.
But work in the practical field will always require domain knowledge around specific computer features or even specific computers.
I think Dijkstra’s point was specifically about uni programs. A CS curriculum is supposed to make you train your mind for the theory of computation not for using specific computers (or specific programming languages).
Later during your career you will of course inevitably get bogged down into specific platforms, as you’ve rightly noted. And that’s normal because CS needs practical applications, we can’t all do research and “pure” science.
But I think it’s still important to keep it in mind even when you’re 10 or 20 or 30 years into your career and deeply entrenched into this and that technology. You have to always think “what am I doing this for” and “where is this piece of tech going”, because IT keeps changing and entire sections of it get discarded periodically and if you don’t ask those questions you risk getting caught in a dead-end.
He has a rant where he’s calling software engineers basically idiots who don’t know what they’re doing, saying the need for unit tests is a proof of failure. The rest of the rant is just as nonsensical, basically waving away all problems as trivial exercises left to the mentally challenged practitioner.
I have not read anything from/about him besides this piece, but he reeks of that all-too-common, insufferable academic condescension.
He does have a point about the theoretical aspect being often overlooked, but I generally don’t think his opinion on education is worth more than anyone else’s.
Article in question: https://www.cs.utexas.edu/~EWD/transcriptions/EWD10xx/EWD1036.html
Based on some places I used to work, upper management seemed convinced that the “idea” stage was the hardest and most important part of any project, and that the easy part is planning, gathering requirements, building, testing, changing, and maintaining custom business applications for needlessly complex and ever changing requirements.
I found it useful when explaining programming to lay people to try to put various programming paradigms in everyday terms.
Imperative programming is like a cooking recipe. You need specific ingredients in certain amounts and you need to perform actions in a very specific order, or the recipe won’t turn out right.
OOP is like a bicycle. Lots of pieces interconnected and working together, hopefully interchangeable and standardized. It can also be used to explain unit testing to juniors. Clock mechanisms or engines can also work but people tend to relate better to bicycles.
Declarative programming (SQL) works like ordering at the restaurant. You still need to know how restaurants work and about meal courses and how to read the menu etc. but you don’t need to know how the sausage was made, only if it’s good or not.
SELECT food FROM menu WHERE name LIKE 'Fried %';
Lemme
cat menu | grep Fried
real quick
You don’t want to order from the cat menu.
grep -i fried < menu
grep -i fried menu
Of course! It’s amazing how this stuff just flows from the keyboard when you’re typing in a shell window, but feels awkward when typing in a Lemmy comment.
That you can mix and match bugfixes like lego blocks an hour before release.
I had some dude here trying to tell me, a mathematician and data scientist who develops AIs for fun, holds an MA in Visual Effects specializing in procedural design, and has worked on algorithmic video game development, what AIs are or will be capable of in procedural/generative/algorithmic game design “because he has played a lot of games”. He argued with me for days, cherry-picking everything I wrote and trying to use my own words refuting him to support his arguments.
Just… Infuriating.
That just because I’m a programmer that must mean I’m a master of anything technology related and can totally help out with their niche problems.
“Hey computer guy, how do I search for new channels on my receiver?”
“Hey computer guy, my excel spreadsheet is acting weird”
My friend was a programmer and served in the army, and people ordered him to go fix a satellite. He said he has no idea how but they made him try anyways. It didn’t work and everyone was disappointed.
Don’t pretend you suck at these things. You know very well you are fucking equipped to fix this kind of thing when you work with programming. Unless you’re, like a web developer or something ofc
Why should I fix any of those peasant problems if I can order the lowly technician to fix my computer which I use to make crappy SQL queries? 🧐🧐🧐🧐🧐🧐🧐🧐🧐🧐🧐🧐🧐🧐🧐🧐🧐🧐🧐🧐🧐🧐🧐
Because you are a nice person to your relatives???
You are part of the problem
Nope
I used to get a lot of people asking for help with their printer. No, just because I am a software developer doesn’t mean I know why your printer isn’t working. But, yes, I can probably help you…
My neighbour asked me to take a look at her refrigerator because it wasn’t working. I am a software developer.
Ironically, most of those things are true, but only with effort. We are better than most people at solving technical problems, or even problems in general, because being a programmer requires the person to be good at research, reading documentation, creative problem solving, and following instructions. Apparently those aren’t traits that are common among average people, which is baffling to me.
Sometimes I’ll solve a computer problem for someone in an area that I know nothing about by just googling it. After telling them that all I had to do was google the problem and follow the instructions they’ll respond by saying that they wouldn’t know what to google.
Just being experienced at searching the web and having the basic vocabulary to express your problems can get you far in many situations, and a fair bit of people don’t have that.
He said he has no idea how but they made him try anyways.
Uh, I’ve been present when such a thing happened. Not in the military, though. A guy was told to install drivers on a telephone system despite not being a software guy (he was the guy running the wires). Result: about as bad as expected. The company then sent two specialists on Saturday/Sunday to re-install everything.
And everyone expects you to know how to make phone apps.
Like, I think I know what to google in order to start learning how.
If you know Java or Javascript you can easily build apps.
But like in every other software field, design is often more important.
I mean, the classic is that you must be “really good at computers”. Like, I’m okay at debugging, just by being methodical, but if you plop me in front of a Windows desktop and ask me to fix your printer: brother, I haven’t fucked with any of those three things in over a decade.
I would be as a baby, learning everything anew, to solve your problem.
I enjoy your comment so much because your methodical and patient approach to debugging code is exactly what’s required to fix a printer. You literally are really good at computers, even if you aren’t armed with a lot of specific knowledge. It’s absolutely the worst, because troubleshooting without knowledge and experience is painfully slow, and the whole time I’m thinking “they know so much more about this than I do! If they’d just slow down and read what’s on the screen…” But many people struggle to do even basic troubleshooting. Their lack of what you have makes them inept.
I was gonna say, the OP here sounds perfectly good at computers. Most people either have so little knowledge they can’t even start on solving their printer problem no matter what, or don’t have the problem solving mindset needed to search for and try different things until they find the actual solution.
There’s a reason why specific knowledge beyond the basic concepts is rarely a hard requirement in software. The learning and problem solving abilities are way more important.
I use a car analogy for these situations: you need a mechanic (IT professional); I’m an engineer (coder). They’re both technically demanding jobs, but they use very different skillsets: IT pros, like mechanics, have to think laterally across a wide array of technology to pinpoint and solve vague problems, and they are very good at it because they do it often.
Software engineers are more like the guy that designed one part of the transmission on one very specific make of car. Can they solve the same problems as IT pros? Sure! But it’ll take them longer and the solution might be a little weird.
Can they solve the same problems as IT pros? Sure! But it’ll take them longer and the solution might be a little weird.
Well the person just wants a solution that works. They didn’t say it has to be the best solution of all solutions.
I can definitely solve their problems, but I’d have to go through all of the same research they would have to. They’re basically just being lazy and asking us to do their work for them.
I work in service design and delivery. It’s my job to understand how devices actually function and interact. Can confirm that dev types can learn the stuff if they want to but most have not. Knowing how to set up your fancy computer with all the IDEs in the world is great but not the same as doing that for 5,000 people at once.
I think the difference is that they don’t know where to even start, and we clearly do. In their minds, that’s what separates a perfectly working computer from a brick.
My go-to excuse now is “I haven’t used Windows in 10 years” when people call me for tech support.
I literally can’t help them lol
“I don’t know anything about your Apple device, I prefer to own my devices and not have someone else dictate what I can use them for”
fuck printers
fuck you. my uncle was a dot matrix.
That reminds me of one of those shit jokes from the eighties:
“There’s two new ladies in the typing pool who do a hundred times the work of anyone else.”
“What’re they called?”
“Daisy Wheel and Dot Matrix.”
Oh man love it. In the 80s, I used to go to my grandmother’s work after school. She was a stenographer at the neighborhood newspaper in Brooklyn. If she was alive she’d probably love this. My mom ran a copy room for a high school but I think it would go over her head.
The files are IN the computer.
That it’s mostly sitting behind a computer writing code. More than half my time is spent in the exploration phase: math, research, communication and developing a concept. The actual writing of code is typically less than 1/3.
Also as someone mentioned before, that it’s considered something ‘dry’. I honestly wouldn’t be able to code properly without my intuition. Take for example code smell. I don’t know why the code is bad, I just feel that it’s off somehow, and I keep chipping away until it feels just right.
I really like the word you used, code smell. I often have a hard time expressing to co-workers in code reviews why something feels off, it just does.
That they have any business telling me how complicated something is or how long something should take for me to implement.
Yeah like “Just add a simple button here”. Yeah of course, the button is not the difficult part.
It’s like they think we just tell the computer what they asked us to do and we’re done.
I was coming here to talk about that recent post saying how easy it is to make a GUI and every program should already have one…
Command line is a GUI, change my mind
- You’re a hacker (only if you count the shit I program as hacks, being hack jobs)
- You can fix printers
- You’re some sort of super sherlock for guessing the reason behind problems (they’ll tell you “my computer is giving me an error”, fail to provide further details and fume at your inability to guess what’s wrong when they fail to replicate)
- If it’s on the screen, it’s production ready
If it’s on the screen, it’s production ready
“I gave you a PNG, why can’t you just make it work?”
I actually get that somewhat often, but for 3D printing. People think a photo of a 3D model is “the model”
Dude, I would just 2d print the png they sent and give them the piece of paper.
If they complained, I would say: “I literally printed the thing you told me to print.”
I’ve had questions like your 3rd bullet point in relation to why somebody’s friend is having trouble with connecting a headset to a TV.
No idea. I don’t know what kind of headset or what kind of TV. They are all different Grandma.
Doesn’t happen as much anymore, but family and non-tech friends would introduce me to other people who “worked with computers”, thinking I could take new job opportunities. They were always wildly unrelated to my field.
I know, I know… they acted in good faith, and I probably could have adapted a bit. But while 30 years ago there was a lot of overlap and systems were somewhat similar, nowadays somebody trained in Linux kernel maintenance isn’t going to learn how to create SharePoint SPFx webparts. Development is very specific now!
I was at a party explaining that we were finishing up a release trying to decide which bugs were critical to fix. The person that I was talking to was shocked that we would release software with known bugs.
When I explained that all software has bugs, known bugs, he didn’t believe me.
I think that non-tech people think that tech just goes. Like you pull it out of a box and turn it on and it just works. They have no idea how much jenk is in everything and how much jenk was eradicated before a user ever went anywhere near it.
What do you mean by jenk? Is that a specific term used to refer to tech junk?
Jank?
They mean jank:
jan·ky (adjective, informal): of extremely poor or unreliable quality. “the software is pretty janky”
I’m not in IT but used to work with a very old terminal based data storage and retrieval system.
If the original programmers had implemented a particular feature, it was very easy to enter a command and have it spit out the relevant info.
But as times changed, the product outgrew its original boundaries, and on a regular basis clients would ask for specific info that would require printing out decades worth of data before searching and editing it to get what the client wanted.
I cannot tell you how many times I heard the phrase, “Can’t you just push a button or something and get the information now??”
The thing that infuriated me the most was the idea that somehow we could do that, but didn’t want to, as if there was some secret button under the desk that we could push for our favourite clients. Ugh.
Not programming per se but my sister thinks it’s okay to have 300+ Chrome tabs open and just memorize the relative locations of them whenever she needs something. She’s lucky she has a beefy computer.
I might be your sister 😐
Stop that.
This is horrifying lol
High spec machines are a curse 🤔
I also leave every firefox tab open until I run out of RAM, at which point I use the “close tabs to left” button, moving some tabs that I still want to check out to the right beforehand. On firefox, one can simply use the list all tabs button to easily navigate or search through all tabs, so no memorization is needed (or just type the title of the document in the address bar and it will just switch to the tab if you have it open).