> I still do not understand how one can consider writing to memory the OS owns to be ok.
Things were different back then. People did a lot of hacky stuff to fit their programs into memory, because you were genuinely constrained by hardware limitations.
Not to mention, the idea of the OS owning the machine was not as well developed as it is today. Windows 3.11 was just another program, it didn't have special permissions like modern OSes, and you would routinely bypass it to talk to the hardware directly.
"Not to mention, the idea of the OS owning the machine "
I agree--back then, when computers had <=4MB of RAM, I would've called hogging unused memory for some selfish speculative future use "professional malpractice".
When an OS uses any memory that's otherwise unused as a file cache, which is instantly available if an application wants more memory, but isn't shown as "unused": "This OS is terrible, I have 16GB of RAM but all of it is being used!"
When an OS doesn't do this: "This OS is terrible, I bought all this RAM and the OS doesn't use it!"
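A concrete look at what that cache accounting means, as a sketch (the numbers below are made up; on a real Linux box you'd read `/proc/meminfo` itself):

```python
# Hypothetical /proc/meminfo snapshot; on Linux, read open("/proc/meminfo").
SAMPLE = """\
MemTotal:       16318256 kB
MemFree:          512340 kB
MemAvailable:   12873120 kB
Cached:         11204884 kB
"""

def parse_meminfo(text):
    """Return {field: kB} for each 'Name: value kB' line."""
    out = {}
    for line in text.splitlines():
        key, _, rest = line.partition(":")
        out[key.strip()] = int(rest.split()[0])
    return out

m = parse_meminfo(SAMPLE)
# "MemFree" looks alarmingly small, but "MemAvailable" counts the page
# cache the kernel will drop the moment an application asks for memory.
print(m["MemAvailable"] - m["MemFree"], "kB of 'used' RAM is reclaimable cache")
```

This is exactly the gap behind both complaints: task managers that report `MemTotal - MemFree` make a healthy cache look like exhaustion.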
> Things were different back then. People did a lot of hacky stuff to fit their programs into memory, because you were genuinely constrained by hardware limitations.
Are you going to tell them what "32-bit Clean" meant for Mac developers, or will we let them find out that particular horror movie for themselves?
A lot of software doing useful work halts pretty trivially, consuming inputs and doing bounded computation on each of them. You're not going to recurse much in click handlers or keep making larger requests to handle the current one.
I was just very naive at 18 about program analysis. I haven't lost my imagination though. I was a self-taught IOI gold division competitor. I thought every problem had an algorithm. It doesn't work like that. Program analysis is collecting special snowflakes that melt in your hand. There is no end to the ways you can write a bug in C. Ghosts of Semmle, Semgrep, Coccinelle past, be humbled. LLMs saturate test coverage in a way no sane human would. I don't think they can catch all bugs, because of the state-space explosion, but they will help all programmers get better testing. At the end of the day I believe language choice can obviate security bugs, and C/C++ is not easy or simple to secure.
If you start with safety in mind and don't just try to bolt it on, you're in a much better place. With the kind of code you need in typical applications, you could force the vast majority of it into a shape that passes termination checks in theorem provers without much overhead, especially if you can just put the gnarly things in the standard library and validate them (with proofs, hopefully) once.
Though starting with C/C++ is a losing proposition in that regard. And I guess any kind of discipline loses to just throwing half-baked javascript at the wall, because deadlines don't care about bugs.
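A tiny illustration of the shape a termination checker likes (Lean 4 here; the function is just an example, not from the thread): structural recursion, where the argument visibly shrinks on every call, is accepted automatically, while an arbitrary while-loop would need a manual proof.

```lean
-- Lean's termination checker accepts this without help: the recursive
-- call is on `n`, which is structurally smaller than `n + 1`.
def sumTo : Nat → Nat
  | 0     => 0
  | n + 1 => (n + 1) + sumTo n

#eval sumTo 4  -- 10
```

Most application code (handle an event, do bounded work, return) can be written in this shape; the gnarly unbounded parts are what you'd quarantine in a verified standard library.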
You've never seen the full power of static analysis, dynamic analysis, and test generation. The best examples were always silo'd, academic codebases. If they were combined, and matured, the results would be amazing. I wanted to do that back when I was in INFOSEC.
That doesn't even account for lightweight formal methods: SPARK Ada, the Jahob verification system with its many solvers, Design by Contract, LLMs spitting this stuff out from human descriptions, type systems like Rust's, etc. Speed-run (with AI) producing those, with the unsafe stuff checked by the combo of tools I already described.
The silo'd codebases I was referring to are verification tools they produce. They're used to prevent attacks. Each tool has one or more capabilities others lack. If combined, they'd catch many problems.
Examples: the KLEE test generator; combinatorial or path-based testing; CPAchecker; race detectors for concurrency; SIF information flow control; symbolic execution; the Why3 verifier, which commercial tools already build on.
It's not about linear algebra (which is just used as a way to represent arbitrary functions), it's about data. When your problem is better specified from data than from first principles, it's time to use an ML model.
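A toy sketch of that distinction, with invented numbers: instead of deriving a rule from first principles, let labeled examples define the function (a one-nearest-neighbour "model" in plain Python).

```python
# Labeled examples stand in for a first-principles rule (data is made up).
train = [(1.0, "low"), (2.0, "low"), (8.0, "high"), (9.0, "high")]

def nearest_label(examples, x):
    """Predict by copying the label of the closest training feature."""
    return min(examples, key=lambda e: abs(e[0] - x))[1]

print(nearest_label(train, 1.5))  # -> low
print(nearest_label(train, 8.5))  # -> high
```

The linear algebra in a real model is just the machinery for representing such a function at scale; the specification lives in the data.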
>Remember that the majority of industry is upstream of consumption.
People forget this. Oil companies may have dug up the oil, but they did so because we paid them to, so we could use the energy for good and useful things.
Climate change isn't 'evil billionaire companies are ruining the world', it's 'these things we did to improve our lives turn out to have side effects'.
This is backwards. If it weren't for 'eliminating jobs' we'd both be peasant farmers right now. Automation has improved the standard of living and raised wages for everyone, rich and poor alike.
I don't think you realize how bad NLP was prior to transformers. Oldschool entity recognition was extremely brittle to the point that it basically didn't work.
CV too for that matter, object recognition before deep learning required a white background and consistent angles. Remember this XKCD from only 2014? https://xkcd.com/1425/
CV is a space where I would 100% agree with you. But - edge cases notwithstanding - there's not so much of a dropoff with NER that I would first go to an LLM.
Certainly it's not impossible to DIY, but it's more difficult than just popping some aligners out of your 3D printer.
Manufacturing them requires a resin printer and a vacuforming setup, but that's still the easy part. It's a whole system with a dental 3D scanner, software for rearranging your mouth, and attachment points that have to be epoxied onto (and later removed from) your teeth by a dentist.
They have to have at least two different materials as well. The temporary trays were much softer, and I had almost ground through them in my sleep by the time I had to switch to the next one, but the final set is much more robust.