
On the other hand, this really hurts readability. When reading other people's code you now effectively have to learn what "language" they use too.

I'd say it's probably worth that cost, if used judiciously.



"If used judiciously" is just FUD.

Soviet-era authority figure: "Western-style freedom has its good points---if applied judiciously".

If you're a proper Lisper, you use macros like it's going out of style, and other proper Lispers love your code for it.

All programs have their own dictionary of whatever it is they define, whether it be macros or variables. You can no more understand a function call just by looking at it than a macro call. (Should functional decomposition be introduced "judiciously" into large, monolithic blocks of code that have everything "at a glance" in one place?)
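A quick sketch of that point: at the call site, a function call and a macro call are indistinguishable (the names here are made up for illustration):

```lisp
;; A function and a macro that compute the same thing.
(defun add-vat-fn (price)
  (* price 12/10))

(defmacro add-vat (price)
  `(* ,price 12/10))

;; The two call sites look identical; only the definitions differ.
(add-vat-fn 100)  ; => 120
(add-vat 100)     ; => 120
```

Either way, you look up the definition (or its documentation) to know what the call means.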


I guess the "problem" with understanding this perspective for us non-lisp people is that we may not be able to imagine all the places that we could have used macros (or something equivalent) if we haven't even learnt it. You don't know what you don't know, or in this case, you don't know how to use something you haven't used. But this seems to be the same for boatload of programming features that many people aren't taught with their first language, but rather later - higher order functions, lazy evaluation/generators/etc., type-level functions... we might at first think "this is a special tool only to be used in certain circumstances/only to be used by wizards". And then it might turn out that they are useful all the damn time.

The net result, at least for me, is that I get spoiled and then it is hard for me to go back 'more primitive ways'. :)


Macros are easy to understand if you are shown that whatever language you are using already has them. The difference is that they are locked in and wrapped behind at least two layers.

Firstly, there is rigid surface syntax, which fixes the look of every macro at the character level. For instance, in C, the do ... while(); loop must have a trailing semicolon. (No Lisp macro has such requirements.)

Secondly, that surface syntax translates to a limited set of abstract syntax tree forms, and that set is not extensible.

Lisp macros open that up. The outer layer of varnish is stripped away, so any conceivable abstract syntax tree form has a notation; you do not have to invent new character-level surface syntax in order to work with a new form. And then, custom recognizers for arbitrary forms can be written by the language users, which do tree to tree transformations.
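For instance, the C construct mentioned above can be reproduced as an ordinary user-written macro; a minimal sketch (the name do-while is mine, not a standard operator):

```lisp
;; Body runs at least once, then repeats while TEST is true:
;; the semantics of C's do ... while, with no new surface syntax.
(defmacro do-while (test &body body)
  `(loop ,@body
         (unless ,test (return))))

;; Example: increments I to 3, testing after each pass.
(let ((i 0))
  (do-while (< i 3)
    (incf i))
  i)                ; => 3
```

The macro is just a tree-to-tree transformation: it receives the test and body forms and writes a loop around them.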

So then how you use these things once you have them is the same way that you use the features of programming languages that you know.

E.g. loop is a macro in Lisp. We use it like this:

  [1]> (loop for x from 1 to 10
             for a = 2 then (* 2 a)
             collect (list x a))
  ((1 2) (2 4) (3 8) (4 16) (5 32) (6 64)
   (7 128) (8 256) (9 512) (10 1024))
Someone wrote the loop macro, so in this situation I'm just a user. I don't care whether this is a macro, or built into the language like "foreach x in list do".

When I use loop, I'm benefiting from macro-writing.
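In fact, that hypothetical "foreach x in list do" could be supplied by any user in a couple of lines; a sketch:

```lisp
;; FOREACH as a thin user-defined macro over the standard DOLIST.
(defmacro foreach ((var list) &body body)
  `(dolist (,var ,list)
     ,@body))

;; Collects 1, 4, 9. A caller needn't know FOREACH isn't built in.
(let ((squares '()))
  (foreach (x '(1 2 3))
    (push (* x x) squares))
  (nreverse squares))   ; => (1 4 9)
```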

The benefits are far-reaching. For instance, language experimentation takes place in the user base, not behind the closed doors of an ivory tower (ISO committee or whatever). Language ideas are packaged as code and shared around.

Technical problems turn into social problems, as someone noted.


This doesn't really address the issue of readability or understanding though. You still have to go through someone else's uncommented code and figure out what their weirdo macros actually do versus having standard, documented language features that you already understand.

Because there's not a ton of standardization (and for other reasons) you end up with a million different dialects of lisp and a fragmented community, in comparison to other language families.

And you can do open, community-driven standardization and language enhancement. Python is a decent example of this.


If someone writes "weirdo macros" and you take them away, that same someone will write weirdo code using something other than macros. Either way, you will have to understand what they are doing. In the worst case, that person will expand, by hand, the code their macros would have written. Now you can't fix a bug in the expander and cheaply re-expand.
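A small illustration of the re-expansion point, with made-up names: fix the macro once, recompile, and every call site picks up the fix; hand-expanded copies would each need editing by hand.

```lisp
;; A trivial code-writing macro: defines a reader function for a
;; plist key. A bug here is fixed in ONE place, and all uses are
;; re-expanded simply by recompiling.
(defmacro define-getter (name key)
  `(defun ,name (plist)
     (getf plist ,key)))

(define-getter customer-lastname :lastname)

(customer-lastname '(:lastname "Smith" :firstname "Jo"))  ; => "Smith"
```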


>When reading other people's code you now effectively have to learn what "language" they use too

And? This is true of code in languages without macros, too. You have to learn all the vocabulary, all the types they use, all the functions, all the structure of the program. Macros are just a tool for taking these domain-specific things and packing them into a denser syntactic abstraction. A well-written macro improves readability in both the short and long terms.

I mean, say I write a couple of macros to deal with a database and I write code like

    (with-sql-connection ("localhost" ...)
      (select *
         from  "customers"
         where (= "lastname" "Smith")))
Do you really think that's less readable than the nonsense you'd have to write with a typical "framework," or worse, building the command string by hand?
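And such a macro needs only a few lines. A sketch with stubbed-out connection functions (connect, disconnect, and *connection* here are placeholders, not a real database API):

```lisp
(defvar *connection* nil
  "The connection currently in scope, bound by WITH-SQL-CONNECTION.")

;; Stubs standing in for a real database layer.
(defun connect (host &rest options)
  (declare (ignore options))
  (list :connection host))

(defun disconnect (connection)
  (declare (ignore connection)))

(defmacro with-sql-connection ((host &rest options) &body body)
  "Run BODY with *CONNECTION* open; close it even on a non-local exit."
  `(let ((*connection* (connect ,host ,@options)))
     (unwind-protect
          (progn ,@body)
       (disconnect *connection*))))

;; The body sees the freshly opened connection:
(with-sql-connection ("localhost")
  (second *connection*))   ; => "localhost"
```

The unwind-protect is the point of the "with-" idiom: the connection is released no matter how the body exits.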


    When reading other people's code you now effectively
    have to learn what "language" they use too.
That's well said, and in fact it's not uncommon to talk about Lisp's capacity for crafting the language to solve the problem at hand.

Paul Graham's essay "Programming Bottom-Up" explains it really really well:

"Experienced Lisp programmers divide up their programs differently. As well as top-down design, they follow a principle which could be called bottom-up design-- changing the language to suit the problem.

In Lisp, you don't just write your program down toward the language, you also build the language up toward your program. As you're writing a program you may think 'I wish Lisp had such-and-such an operator.' So you go and write it. Afterward you realize that using the new operator would simplify the design of another part of the program, and so on. Language and program evolve together. Like the border between two warring states, the boundary between language and program is drawn and redrawn, until eventually it comes to rest along the mountains and rivers, the natural frontiers of your problem.

In the end your program will look as if the language had been designed for it. And when language and program fit one another well, you end up with code which is clear, small, and efficient."

http://www.paulgraham.com/progbot.html

    On the other hand, this really hurts readability.
As others have pointed out, no more than unfamiliar classes and so forth.

In any language, you can think of programming as building mini languages to solve problems. You have nouns and verbs - types/classes/instances and methods/functions/operators. Lisp gives you those parts of speech and also lets you manipulate the fundamental grammar as well!


My experience with heavily metaprogrammed Ruby is that this story has a dark side. Yes, when language and program fit one another well, you end up with code which is clear, small, and efficient. That is true... until you need to add a new feature that was not accounted for in the language design. Now you have to wade into the code of the "compiler".

Abstraction generally gives up flexibility to obtain conciseness. Certainly a balance must be struck, but "not flexible enough" has caused me vastly more pain than "not concise enough". As a consultant who frequently wades into other people's code, I generally consider metaprogramming to be a scourge.


After programming in CL for several years, I can honestly say that when encountering new syntax that people have introduced in an app, it's no harder to follow along with it by reading the macro definition than encountering an unknown function. And when in doubt, you can just macroexpand the syntax and see what it's doing under the hood.
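For example, expanding a standard macro at the REPL (the exact expansion varies by implementation, so none is shown here):

```lisp
;; MACROEXPAND-1 performs one step of expansion and returns the
;; resulting tree; pretty-print it to read what the macro wrote.
(pprint (macroexpand-1 '(dolist (x '(1 2 3))
                          (print x))))
```

The same works on any user-defined macro, which is why unfamiliar syntax is rarely a dead end.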


>When reading other people's code you now effectively have to learn what "language" they use too.

Paul Graham addressed this in On Lisp (page 59) [0]:

>So yes, reading a bottom-up program requires one to understand all the new operators defined by the author. But this will nearly always be less work than having to understand all the code that would have been required without them. If people complain that using utilities makes your code hard to read, they probably don’t realize what the code would look like if you hadn’t used them.

>Bottom-up programming makes what would otherwise be a large program look like a small, simple one. This can give the impression that the program doesn’t do much, and should therefore be easy to read. When inexperienced readers look closer and find that this isn’t so, they react with dismay.

>We find the same phenomenon in other fields: a well-designed machine may have fewer parts, and yet look more complicated, because it is packed into a smaller space. Bottom-up programs are conceptually denser. It may take an effort to read them, but not as much as it would take if they hadn’t been written that way.

[0] http://www.paulgraham.com/onlisp.html


> I'd say it's probably worth that cost, if used judiciously.

Can be said of anything so it really doesn't say anything at all.

This could be a good criticism in theory; in practice, you can observe how it works out. Fukamachi uses @ everywhere, attila-lendvai uses his def-star library, e.g. (def (class foo) ...) instead of (defclass foo ...), some people use iterate, etc., and all is good.

People can write shit code sans macros, no problem. I'd rather have them than not. After all, macros embody the whole point of programming[0]

[0]: http://axisofeval.blogspot.com/2011/05/why-of-macros.html



