Hacker News | new | past | comments | ask | show | jobs | submit | login

You appear to be making your points in a way in which no developments from the past 10-20 or so years intrude. Can you point to an extant PL/I compiler that produces good code on modern architectures? How sure are you that the issues that historically separated C, Fortran, and PL/I are even remotely relevant now?

I would imagine inlined functions (and, for the brave, interprocedural analysis) render many of your points moot. Fortran still has an edge on numerics, as I understand it, but I don't think it's all that decisive.

The other reason that baking functionality into the language is problematic is that you wind up with a few things that go fast, while you neglect to build facilities that optimize other people's data structures. So you get the world's fastest dense array (say), but your sparse array is resolutely mediocre. Instead of a language packed with special cases, I would much rather have a minimal language with good analyses (many of which can be supported by language design; I think Rust has a good future here, thanks to how carefully it tracks lifetimes and how it doesn't fling aliases around with the abandon of C).



> You appear to be making your points in a way in which no developments from the past 10-20 or so years intrude. Can you point to an extant PL/I compiler that produces good code on modern architectures? How sure are you that the issues that historically separated C, Fortran, and PL/I are even remotely relevant now?

It's simple. Again, once again, yet again, one more time: the main point is that functionality defined in a language and implemented in a compiler is faster than the same functionality implemented via external functions. Simple. Dirt simple. An old point, and still both true and relevant, unchanged in the last "10-20" years.

C is not supposed to have changed from K&R -- that it is backwards compatible was one of its main selling points and the main reason to put up with such meager functionality that it had programmers digging with a teaspoon.

Fortran and PL/I just illustrate the point. For this illustration I can go back to old Fortran 66 and PL/I F level version 4; the point is still illustrated and remains true.

And as for the changes in C, does the language yet have definitions and support for multidimensional arrays and something decent for strings? And what about malloc and free -- have they been brought up to date?

I did NOT propose, and am NOT proposing, using Fortran or PL/I now. Instead, again, once again, over again, ..., I just made a point about language design and implementation. That this evidence is old is fully appropriate if we regard C as defined in K&R and unchanged since. If people changed it, then it's a different language and likely not fully compatible with the past.

So, this thread was about C: I assumed C hadn't changed, so I assumed K&R C.

I don't want to use C, but recently, in one situation, I was essentially forced into it. After that case, I left C. I'm not using Fortran or PL/I, either.

I'm using Microsoft's .NET version of Visual Basic (VB). Right, laughter, please -- except that .NET VB and .NET C# appear to differ essentially only in syntactic sugar (there are translations in both directions), and I prefer the VB flavor of that sugar as easier to teach, learn, read, and write than C#, which borrowed some of the C syntax that K&R asserted was idiosyncratic and in places really obscure.

For this thread, I was surprised that Rust would not run circles around old, standard K&R C. Soooo, the answer is in part that the C of this thread is not old K&R C. Okay, C has changed and is no longer seriously backwards compatible. Good to know, but for a huge list of reasons in my work I'm eager to f'get about C and its digging with a teaspoon.

If the new versions of C do not handle multidimensional arrays passed as arguments well, then C should still be considered TOO SLOW, and too clumsy.

But, whether or not I use C, and whatever I think of it, the point remains: Functionality in the language is faster than that implemented via external functions.

Of course, this point is important in general when considering extensible languages. Soooo, maybe if we could do for syntax as much as has been done for semantics, then we could do better on extensible languages. But then we could extend in 1000s of ways, maybe once for each application, and then have to worry about the 1000s of cases of documentation, etc. So extensibility, when we use it, has pros and cons.

I see: discussing language design, especially for sacred C, is heresy in the religious language wars. Forbidden topic. Who'd uh thunk it -- HN is full of hypersensitive and hostile religious bigots.


> C is not supposed to have changed from K&R -- that it is backwards compatible was one of its main selling points and the main reason to put up with such meager functionality that it had programmers digging with a teaspoon.

C has gained new functionality and features; it just hasn't deprecated the old ones. K&R C will still compile with a modern C compiler, but it won't be as good as modern (C99/C17) C.





