Hacker News | past | comments | ask | show | jobs | submit | login

> the language is (for many purposes) an excellent replacement for C.

No, it's not. The runtime wouldn't even fit on many platforms, and GC pauses are similarly unacceptable on soft real-time systems, etc.

Ada/Rust get closer to a replacement.



Rust really felt like a language that took the decades of "lessons learned" and the all-star features of many, many languages and neatly packaged them, starting from scratch, as an ambitious replacement for C/C++.


I've heard someone once say "if Rust is a criticism of C++, it comes from a position of respect" and that was definitely true in the early days of the programming language.

Interestingly though, I take the stance that Rust is much closer to C than C++.


I like to put it this way: C++ was supposed to be a "better C," and Rust is a better "better C" than C++.


That is a matter of tooling.

There are bare metal Go runtimes being shipped in products today, e.g. TamaGo unikernel for firmware.

Just as there are real-time Java implementations, real-time Go implementations could exist if the market cared about having them.


embedded Java, yes. real time?

Oh:

> The RTSJ addressed the critical issues by mandating a minimum specification for the threading model (and allowing other models to be plugged into the VM) and by providing for areas of memory that are not subject to garbage collection, along with threads that are not preemptable by the garbage collector. These areas are instead managed using region-based memory management. The latest specification, 2.0, supports direct device access and deterministic garbage collection as well.

https://en.wikipedia.org/wiki/Real_time_Java

Though 2.0 is still a WIP at this time.

TIL


Instead of reading Wikipedia articles about standards, look at products that have been delivering value in production for the last 20 years:

https://www.ptc.com/en/products/developer-tools/perc

https://www.aicas.com/wp/products-services/jamaicavm/


thanks for the pointers

why not just without the part up to the comma? as in

"here are two..."

https://qht.co/newsguidelines.html


Probably given the somewhat snarky remark about the state of the real-time Java specification, as if that is what matters, maybe.


oh it was a real TIL. I first thought: "GCed and real time? not that I know of..." and _then_ I found that spec and deterministic GC, and that's the big TIL I shared. sorry if I came across as snarky myself.


Honestly, on safety critical systems you should never have used C to begin with. Formal validation is critical, so I'd argue that you should have used Ada SPARK instead of the burning dumpster fire that is MISRA-C or similar.


MISRA-C is far from a dumpster fire if the extremely widespread use in the automotive industry is to be believed!


MISRA-C is barely better than a coding style for which people wrote partial enforcement tools.

Studies suggest that some of MISRA's rules, if followed, reduce significant bugs in software; some others increase bugs; and many are neutral because they forbid things nobody outside an Obfuscated C Contest would think to do, even in C.

e.g. IIRC MISRA says don't put variable declarations inside one part of a switch and then use the variables in other parts of the switch. Nobody does that; it's very silly.
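For illustration, a minimal sketch of the pattern being described (function and variable names are hypothetical). It's legal C, but control flow can jump past the initializer, which is why nobody writes it on purpose:

```c
/* The (silly) pattern MISRA forbids: a variable declared in one part
 * of a switch, used in another part. When x == 2, the switch jumps
 * straight to `case 2`, skipping the initialization of n entirely,
 * so n is read with an indeterminate value. */
int confusing(int x) {
    switch (x) {
    case 1:;            /* null statement so a declaration may follow (pre-C23) */
        int n = 42;     /* declared and initialized here... */
        return n;
    case 2:
        return n;       /* ...read here, but never initialized on this path */
    default:
        return 0;
    }
}
```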

Or MISRA says you must have a default switch case. So your three-way switch over the headlights enum, OFF / DIPPED / FULL, fails because it needs a default. What's the default? Well, I guess you can assert it's never hit? But your C compiler already checks enum exhaustiveness: without a default, it would have flagged that you forgot SPECIAL_HACK, which is in the enum. Thanks to the default, the compiler thinks you remembered about that, and at runtime in somebody's car SPECIAL_HACK is enabled and their car's CPU crashes.
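A sketch of that failure mode, with a hypothetical headlights enum: with no `default` arm, GCC/Clang's `-Wswitch` warns about any enumerator missing from the switch; the MISRA-mandated `default` silences that warning, so the forgotten case only surfaces at runtime.

```c
/* Hypothetical headlight states, as in the comment above. */
enum headlights { HL_OFF, HL_DIPPED, HL_FULL, HL_SPECIAL_HACK };

/* Without `default:`, -Wswitch would flag that HL_SPECIAL_HACK is
 * unhandled at compile time. With `default:` (to satisfy MISRA),
 * the compiler stays quiet and the missing case falls through to
 * the catch-all at runtime instead. */
const char *beam_name(enum headlights h) {
    switch (h) {
    case HL_OFF:    return "off";
    case HL_DIPPED: return "dipped";
    case HL_FULL:   return "full";
    default:        return "unhandled"; /* hides the forgotten enumerator */
    }
}
```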

The automotive industry wanted to write C. Or at least, the programmers it hired did, and nobody said "No, that is a terrible idea, stop it"; instead they came up with MISRA C to keep excusing the inexcusable.


a = a; //MISRA


Just don't generate garbage. You shouldn't be using malloc anyway, you silly embedded programmer.

Half joking, as a C/C++ programmer here.


malloc() isn't the issue, free() is. Allocate at init, never free, and you'll still have deterministic behavior and no heap fragmentation!*

* Unless you run out of memory, of course. Then you have issues.
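A minimal sketch of this allocate-at-init pattern (buffer names and sizes are illustrative): every malloc() happens once at startup, out-of-memory is detected at init rather than mid-operation, and steady-state code never touches the allocator, so the heap cannot fragment.

```c
#include <stdlib.h>
#include <string.h>

/* All buffers are carved out once at init and kept for the lifetime
 * of the program; nothing is ever free()d. */
static unsigned char *rx_buf;
static unsigned char *tx_buf;

int buffers_init(size_t rx_len, size_t tx_len) {
    rx_buf = malloc(rx_len);
    tx_buf = malloc(tx_len);
    if (!rx_buf || !tx_buf)
        return -1;              /* out of memory: fail at startup, not in operation */
    memset(rx_buf, 0, rx_len);
    memset(tx_buf, 0, tx_len);
    return 0;                   /* from here on, only reuse, never allocate */
}
```

If init succeeds, the normal operation stage has a fixed memory footprint and deterministic timing, which is the point being made above.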


malloc() on an embedded platform can still present pitfalls. This article explains why:

https://mcuoneclipse.com/2022/11/06/how-to-make-sure-no-dyna...


Yes, you can still run out of memory. But you can't get a fragmented heap if you never free(). And if you allocate only at init, not during the normal operation stage, you can't run out of memory if you manage to start up successfully.



