C++ because I use it for embedded systems, it interfaces easily with C code (opening up 40+ years of libraries and already-written code), and I’m 43 years old and don’t feel like learning an entirely new programming paradigm (I like OOP; it makes the most sense to me).
I also like being able to drill down and manage my own resources, like memory, when I need to. I’ll mix raw pointers with higher-level abstractions all day, depending on what’s convenient.
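A minimal sketch of what I mean, with a made-up Sensor class (not from any real project): a unique_ptr owns the object, and raw pointers get handed out wherever that’s convenient, since they don’t imply ownership.

#include <iostream>
#include <memory>

// Hypothetical example type; stands in for whatever you're actually managing.
struct Sensor {
    int read() const { return 42; }
};

// Takes a non-owning raw pointer; the caller keeps ownership.
void log_reading(const Sensor* s) {
    if (s != nullptr) {
        std::cout << "reading: " << s->read() << '\n';
    }
}

int main() {
    auto sensor = std::make_unique<Sensor>();  // owned here, freed automatically
    log_reading(sensor.get());                 // raw pointer where it's convenient
    return 0;
}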
Started messing around with it sometime in 2003, on Mandrake Linux, when I was 21 years old. I experimented and ran servers with various distros in the years since, but it didn’t become my daily driver until about 2014-15, with Debian.
Nope, stolen from a Telegram group chat, not sure where it’s originally from.
That’s the static variable in the function sticking around and watching the madness unfold.
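For anyone who hasn’t seen it in the wild, here’s a tiny sketch (mine, not from the meme) of a function-local static sticking around between calls:

#include <iostream>

// The static local is initialized once and keeps its value across calls.
int times_called() {
    static int count = 0;  // lives for the whole program, not just this call
    return ++count;
}

int main() {
    std::cout << times_called() << '\n';  // prints 1
    std::cout << times_called() << '\n';  // prints 2
    std::cout << times_called() << '\n';  // prints 3
    return 0;
}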
I remember some dude on the Internet with a jar and an MLP figurine that might be able to help you out.
KDE since 2002. KDE 4 lyfe.
C and assembly programmers: first time?
It’s also because we started doing shit like using JS in places it really shouldn’t belong. Half the programs on my PC are just webapps running in a sandboxed environment, instead of using systems languages like C/C++ directly, as was the case 15-20 years ago. Abstractions on top of abstractions on top of abstractions. JS was fine for embellishing elements of a website and facilitating AJAX; it should never have been turned into an app language.
That’d be like if interpreted BASIC had been taken seriously in the 80s as more than just a toy and the majority of popular software had been written in it. We’d rightfully question WTF society was thinking.
Another big part is learning how to set it up in a way that it’s functional and productive the first time and then STOP FUCKING WITH IT.
“WTF, none ‘a these are fuckin’ socks!”
have lost GPUs before, but not yet to mining.
I have. Power rails on GPU shorted due to failed MOSFET, blew the PSU as well as the card. The mini-explosion woke the whole family at 2am. I was also running it just under what was considered the high end of tolerable for the card, with a huge box fan attached to the side of the open case.
And if it survives until then, you’ll have a couple months of gaming out of it before a dried-out capacitor or overstressed MOSFET blows. Ask me how I know.
Something wrong with:
#include <Arduino.h>

// Pin 13 is the built-in LED on most boards.
void setup() {
  pinMode(13, OUTPUT);
}

void loop() {
  digitalWrite(13, HIGH);  // LED on
  delay(1000);
  digitalWrite(13, LOW);   // LED off
  delay(1000);
}
? 😂🤮
COBOL: you are old, and a nerd, and probably making some sweet cheddar right now propping up a mid-to-late-20th-century beast somewhere.
Assembly: you are a cyborg.
Welp, back to NCSA Mosaic I guess. We never needed CSS and JS anyway, those were a huge mistake.
I can relate. I can empathize with someone who’s learned every nuance of a language, and after 30-40 years these kids suddenly come in with their strange hieroglyphics, slowly replacing everything they’ve worked on.
It’s not very good at it, though, if you’ve ever used it to code. It automates and eases a lot of mundane tasks, but it still requires a LOT of supervision and domain knowledge to keep it from going off the rails or hallucinating code that’s either full of bugs or will never work. It’s not a “prompt and forget” thing, not by a long shot. It’s just an easier way to steal code it picked up from Stack Overflow and GitHub.
As a human, I know to check how much data is going into a fixed-size buffer somewhere and to bail out if it exceeds the limit. The LLM has no qualms about putting buffer overflow vulnerabilities all over your shit, because it doesn’t care; it only wants to fulfill the prompt and get something that works.
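A rough sketch of the kind of check I mean, with a made-up copy_field helper (illustrative only): refuse input that won’t fit the fixed-size buffer instead of blindly copying it.

#include <cstdio>
#include <cstring>

// Hypothetical helper: copy src into a fixed-size buffer only if it fits.
// Returns false and copies nothing when the input would overflow dst.
bool copy_field(char* dst, std::size_t dst_size, const char* src) {
    const std::size_t len = std::strlen(src);
    if (len >= dst_size) {
        return false;  // bail out instead of corrupting memory
    }
    std::memcpy(dst, src, len + 1);  // +1 copies the terminating NUL
    return true;
}

int main() {
    char name[16];
    if (!copy_field(name, sizeof name, "way_too_long_user_supplied_input")) {
        std::fprintf(stderr, "input too large, refusing to copy\n");
        return 1;
    }
    std::printf("name: %s\n", name);
    return 0;
}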