
He was a former BASIC book author too

Hmm, I'm starting to see a pattern. Is it possible that BASIC, plus the lack of internet back in the day, plus atrocious books are the reasons for turning people into terrible programmers? I happen to know only a couple of seniors, but without exception their code, no matter what language it's written in today, is horrible on all fronts. I used to think it was a lack of attention to detail, a lack of wanting to strive for even the tiniest bit more than just 'good enough for today'. Possibly stemming from a lack of education and of continuous self-education. But maybe there's more to it. Maybe they were influenced by a bad book. And/or by a not-so-optimal language like BASIC.



I would say "it depends".

Just as poor programmers later came from Visual Basic or MSVC++ (somehow we went through a phase where everyone coming to interview with MSVC++ actually had C with classes, and for a while it was a warning sign and standing joke at the place I then worked). Getting people who claimed C++ and actually knew it was pretty challenging around the millennium.

In the 80s just about everyone who had an 8-bit machine started with BASIC, and an awful lot of them managed to go on to be perfectly acceptable programmers when they moved on to C, C++, assembler, and more recent languages, or even turned out a decent game in something like Blitz Basic on the Amiga. Then again, there were some who couldn't move beyond BASIC, because it was simple enough that almost everyone could piece something together with it - so even back then it was a warning sign, like PHP can be today.

It's not age, and it's not BASIC - there's plenty of younger folks turning out abysmal code who've never been near it.

I wonder how I'd have turned out if I'd learnt C from one of this guy's books instead of K&R and having a couple of experts handy.


Might be totally wrong, but it's possible that MSVC++ programmers typically knowing 'C with classes' might come from the gaming world, as that's a fairly accurate description of some very popular game engines (Valve's Source engine[1] being the one I'm most familiar with, where e.g. std::string is unused in favor of char/wchar arrays).

[1] https://github.com/ValveSoftware/source-sdk-2013 (originally released in 2004)


It is a bit more complicated than that.

C wasn't much used on MS-DOS, as it was yet another systems language trying to take the place of Assembly for high-performance applications.

In some countries Pascal dialects (mostly TP-compatible) and Modula-2 reigned, while in others C and C++ were the main contenders.

By the time Windows (written in C) became mature enough that people started caring about it (3.x), C++ already had its place via OWL (later VCL) on the Borland side, and Microsoft eventually came up with MFC.

However, these were the days when C++ still didn't have a standard (which only came in 1998), beyond the C++ ARM book, so either you would stick with the compiler's framework, or try to minimize language features for better portability.

Additionally, on Windows everyone was learning it via Petzold's book, where he takes the approach of C compiled with a C++ compiler, not even "C with Classes".

Also, although OWL and VCL were great OOP libraries with nice abstractions, MFC was pretty much a thin Win32 wrapper, as its initial internal implementation (AFX) wasn't well received by MS employees, who felt it wasn't low-level enough over Win32.

So there were lots of issues going on that led to such cases.


Interesting, thanks, that clears up some of the mystery from the Windows side of things. It explains why so many came in who clearly had experience, but were missing most of the ++ part.

Of course, back then the STL (most of which is now the standard library) was still SGI STL, and there was also RogueWave; both were still fairly new and on the up.


Oh God, Petzold's book. What a piece of trash that was. The code, while not incorrect like the one in the article, was fairly atrocious, and the various UI things he did were unforgivable, if only because it encouraged legions of programmers to completely disregard any human interface principles. If I could erase one CS book from history, that might be the one.


You're not the first to notice a pattern, and you're in good company:

> It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.

  --Edsger W. Dijkstra 
Though, to be fair, he would likely say that about a lot of mainstream languages today. IIRC he was very fond of Miranda, which in many ways was a precursor to Haskell.


To be fair, he said that about a lot of mainstream languages back then. That quote is from EWD498 at http://www.cs.utexas.edu/users/EWD/transcriptions/EWD04xx/EW... . The larger context is (the rest of the comment quotes him):

FORTRAN —"the infantile disorder"—, by now nearly 20 years old, is hopelessly inadequate for whatever computer application you have in mind today: it is now too clumsy, too risky, and too expensive to use.

PL/I —"the fatal disease"— belongs more to the problem set than to the solution set.

It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.

The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offence.

APL is a mistake, carried through to perfection. It is the language of the future for the programming techniques of the past: it creates a new generation of coding bums.


Having a discussion with someone who quotes Dijkstra can be frustrating. He wrote Go To Statement Considered Harmful and On the Cruelty of Really Teaching Computer Science and left behind a bunch of pithy quotes to mine besides. You end up having to explain why blanket bans on goto don't make sense, why it really is quite difficult to mathematically prove that a program works, or why it makes sense to teach students software engineering and not just teach them the mathematics.

At this point I'm ready to say "Dijkstra considered harmful" or "the use of Dijkstra quotes cripples the mind, their use in discussions should, therefore, be regarded as a criminal offence."


> Having a discussion with someone who quotes Dijkstra can be frustrating.

Agreed. Most of the time, it seems the pithy quotes are the only thing that the quoter knows about Dijkstra.

> You end up having to explain why blanket bans on goto don't make sense

For Dijkstra's GOTO quote specifically, there's a wonderful document by David Tribble named "Go To Statement Considered Harmful: A Retrospective" that does a line-by-line analysis of Dijkstra's paper, explains what Dijkstra meant in the context of his time, and examines where usage of GOTO still makes sense today.

http://david.tribble.com/text/goto.html


Dijkstra is the Nietzsche of computer science: Eminent, quotable, often irritated, and prone to be taken exactly the wrong way by people who only half-understand his ideas.


The dicta of Dijkstra should be handled with the same accommodation of self-contradictory paradox as is accorded to zen koans.


Ask them whether they'd use Yogi Berra quotes to decide how best to manage a baseball team.


Dijkstra is the Godwin of Computer Science


I did learn BASIC first, but then Pascal, and in college and from books I very soon learned the foundations of structured programming. It was an instant aha!, since as soon as you write anything but trivial programs, you feel you need functions.

What I find disturbing is that the author thought every language was the same save for a few keywords, and then the publisher was happy to put that crap into print.


Yeah, the quote about a professional C programmer rejecting it and the publisher publishing it anyway was telling!


Most of the major game engines of the past 20-30 years were programmed in a large part by people that started with BASIC.


You have only two data points, and obviously the second data point (the post you’re responding directly to) was remembered and posted because of its similarity to the main article, which is anything but an independent data point.

I think your conjecture is based on confirmation bias and a bad experience you’ve had with some colleagues.


Yes, it is certainly possible there's simply a lack of data points, but I don't think confirmation bias is at play here; at least for me personally, I only know a couple of very senior devs and they all have the same problem, so it's not like I'm actively cherry-picking only cases which seem to prove my point (which, btw, is just 'I see a pattern', not 'everyone who used BASIC once and had no easy access to proper sources sucks').


2 data points is not a pattern. An equally plausible explanation is that you're in an environment full of idiots, junior and senior alike, yourself included. Your numerical and analytical skills are not unsupportive of this conclusion.


Keep in mind BASIC was the most accessible language in the 80s. Almost all computers shipped with BASIC... It was like today's JavaScript.


I think the JS comparison undersells it. Many computers would literally drop you into a BASIC prompt at boot. I remember my first encounter with MS-DOS and finding it weird that you had to actually run a program to get to BASIC.


Yes, you are absolutely correct! BASIC seemed much more accessible on early microcomputers, compared to any programming language today. BASIC essentially was the OS. The Apple II, C64, etc. extended BASIC with DOS-like commands. They often shipped with BASIC manuals!


Apple’s guide to AppleSoft BASIC came with my IIGS and was my first programming book. It seems so much harder to get started now.


Yes, it definitely seems much harder. The BASIC guide that came with my Texas Instruments TI99/4A was my first programming book!


Replace BASIC with JavaScript, and the book with one about the Scala language on Spark, for a contemporary topic adjustment. You'll see the same there.


There's a sense in which the hardest programming language you'll ever learn is actually your second one. I think the reason for this is that with just one language under your belt, you have very little ability to distinguish between the abstractions the programming language offers you, the capabilities of the machine, and the capabilities of programming itself.

So for your first language, you're learning what is actually just an approximation and simplification of that language, where all three of those things are mixed together. You don't have to spend the cognitive effort to understand the differences, and you can develop and rely on huge misconceptions without seeming to pay too large a price. But with your second, you're unlearning errors about the machine, unlearning errors about programming in general, and also learning a second language. It's particularly difficult if you're making the leap from something like BASIC to C, where the second language is also substantially more difficult than the first.

For people of a certain psychological orientation, there is the additional challenge that, having put away your first language, you now think you are a "Programmer (TM)", and learning that second language - and learning that you have a number of misconceptions - can strike at your very identity. People can get psychologically attached to their misconceptions if it means retaining the illusion that they have mastery.

Nowadays the easiest way to screw this up is to go to a computer science/engineering program that uses just one language. As tempting as it may be from a curriculum simplicity perspective, it's a big mistake. I've interviewed a number of people who think that Java === computing. Not even the "JVM", mind you, but Java, the language, itself. I don't blame Java for this, it's the education. Java itself is not a great lens to understand computer capabilities through, and it's a miserable lens to understand the general capabilities of programming through, especially 10 years ago. (It's slowly getting better, with easy closures and such, but it's still stuff bolted on the side 20 years later.)

Looking at it from that perspective you can see why 8-bit-era BASIC was even worse than that. It offers a very impoverished view of the computer's capabilities and a very impoverished view of the possibilities of computing. (It was possible to rehabilitate BASIC into at least a passable language; I'm glad I don't have to use Visual Basic to do my job, but it's still light years ahead of the BASICs that still used line numbers, and I've done Real Work (TM) in it, albeit a long time ago.) A 21st-century Java-only programmer is substantially better equipped than a 20th-century 8-bit-era BASIC-only programmer.

(By "8-bit-era", I mean the timeframe, not necessarily the CPU. I'm fairly sure there were BASIC implementations with line numbers and such on non-8-bit-machines, and they'd still be dangerous. But as computers got into the 16- and especially the 32-bit era, even BASIC had to grow up.)


I have to say that having GFA Basic on the Atari ST as my first real development language made a huge difference, given that it had code blocks, functions, local variables, typed variables, arrays and record types and built-in commands for OS calls, memory access, matrix operations and a ton of other stuff. Made moving to Turbo Pascal relatively easy.

http://www.geocities.ws/slaszcz/mup/prgnames.html

https://github.com/ggnkua/Atari_ST_Sources/blob/master/GFA%2...


> I've interviewed a number of people who think that Java === computing. Not even the "JVM", mind you, but Java, the language, itself.

Could you elaborate on this? What exactly made you realize that was how/what they thought?


I had to ponder on what it is that really sets this sort of person apart, and I think it's the sort of sneering disdain at the idea that any of the other languages in the world are worth anything, or have any good ideas. Or maybe it's the way that when you ask them what's good or bad about some other language, you get back just a list of differences those languages have with Java, and it is simply assumed that all differences are ways in which they are inferior to Java.

And let me say again that it's not specifically Java. I've seen a couple of people that way with C, for instance, though not in an interview situation.



