Thanks for the solid feedback, Pauan (and akkartik!), with some very good insights. This sort of feedback is exactly what I was looking for when I posted this on the Arc forum, because I haven't actually done much Lisp before this.
On to the specifics:
For quasiquote... yeah, it had occurred to me that almost every time I am not 'unquoting' a symbol, that's a mistake... but only when writing a macro. Outside a macro it's less clear that this is what someone would want.
And to clarify, you mean that:

`(one two (three))

would expand into

(list one two (list three))

rather than

(list one two (three))
So maybe a "define-hygienic" macro could re-define quasiquote in this fashion - but elsewhere it would work as normal.
I had also realised that the "hygiene" is currently broken by the fact that, in FirstClass Lisp, "quasiquote", "quote", "unquote", "unquote-splicing", "dot" and "slash" are all "special" in terms of what the reader reads in. So something like the following:

(lambda (quote phrase)
    ('quote-tag quote phrase))

will not do what the author probably intended: the reader turns 'quote-tag into (quote quote-tag), and "quote" there now refers to the first parameter of the lambda. It's easy enough to work around - but it's unsatisfactory.
It seems to me that the correct resolution is to have the reader itself refer to a lexical environment when generating these special symbols - although you would then have the problem of how to supply this environment to the reader.
By far the most interesting statement you've made is that making the macros first class isn't actually necessary for this hygiene trick anyway! That's a very interesting statement... I need to think about this.
What I most wanted was a "simple" concept of hygienic macros - that is, simple in understanding how they work: that they are just functions which happen to operate on the unevaluated source tree and return a source tree to be evaluated. Once we have that, I think the next most important thing is that there is no performance cost to using them.
If what you're saying is true, then it might be possible to expand all of the non-first-class macros in a pre-compilation step anyway.
I'd really like to get the Sudoku solver at least at the same level of performance as the Python one.
One area in which the first class nature of the macros was particularly elegant was in the implementation of an "amb" macro. All of the classic examples of amb require some sort of global state to keep all of the continuation instances. But this is inelegant if you wanted to have multiple usages of (amb) which don't interfere with each other: suppose for example you want to create two threads - one for solving each Sudoku puzzle.
The conventional examples always add some sort of "amb-reset!" function to say "forget about everything you tried to solve before" - but this is inelegant and wouldn't work for multiple threads.
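For concreteness, the conventional pattern being criticised here can be sketched in standard Scheme roughly as follows (the names amb-fail and amb-reset! are the usual ones from such examples, not from FirstClass Lisp):

```scheme
;; One global failure thunk shared by EVERY use of amb.
(define (amb-no-more) (error "amb: no more choices"))
(define amb-fail amb-no-more)

;; The "forget about everything you tried before" reset.
(define (amb-reset!)
  (set! amb-fail amb-no-more))

(define-syntax amb
  (syntax-rules ()
    ((_) (amb-fail))              ; no choices left: fail
    ((_ x) x)                     ; one choice: just use it
    ((_ x y ...)                  ; several: try x, fall back to the rest
     (let ((old-fail amb-fail))
       (call/cc
        (lambda (k)
          (set! amb-fail
                (lambda ()
                  (set! amb-fail old-fail)
                  (k (amb y ...))))
          x))))))
```

Because amb-fail is a single global, two independent searches running at once would clobber each other's backtracking state - which is exactly the problem the factory approach below avoids.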
At the same time, you want (amb) to be a macro which lazily evaluates the possibilities, so that you can write code such as:
(define (anything-starting-from n)
    (amb n (anything-starting-from (+ n 1))))
Having amb be a first-class macro allows the following pattern, which solves both of these problems:
(define (solve puzzle failure-continuation)
    (with* (amb (make-amb-macro failure-continuation)
            assert (make-assert amb))
        ... stuff requiring amb and assert ...))
So we have a "solve" function whose use of 'amb' is entirely an implementation detail, which means you're guaranteed that it won't interfere with anyone else's use of the amb macro.
Here's the way I defined "make-amb-macro" and "make-assert":
; A hygienic 'amb' macro.
; Using a factory function allows us to maintain
; the hygiene of 'amb-fail'.
; Use 'make-amb' to make an
; amb macro that you know won't interfere
; with anyone else's amb macro.
; The ability to do this - return a hygienically
; scoped 'amb' macro from a
; function - is something that is only possible
; in a Lisp with First Class macros and continuations
; (i.e. First Class Lisp!)
(define (make-amb-macro exhausted)
    (define amb-fail exhausted)
    (define (get-fail) amb-fail)
    (define (set-fail thunk)
        (set! amb-fail thunk))
    ; Based on
    (define expand-amb (lambda
        ; (amb): no alternatives left - force the current failure thunk
        () (list force (get-fail))
        ; (amb x y): try x; if it fails, restore the old failure thunk
        ; and resume the captured continuation with a thunk of y
        (x y)
            (with* (old-fail (gensym)
                    c (gensym))
                `(,force (,call/cc (,lambda (,c)
                    (,let ,old-fail (,get-fail)
                        (,set-fail (,make-thunk (,begin
                            (,set-fail ,old-fail)
                            (,c (,make-thunk ,y)))))
                        (,make-thunk ,x))))))
        ; (amb x . rest): reduce to the binary case
        (x . rest)
            `(,amb ,x (,amb ,@rest))))
    ...)
; Given an amb macro, make an appropriate
; assert function. Once again, this sort of
; thing is only possible with first-class macros.
(define (make-assert amb)
    ...)
And to top it off: it looks like I forgot a "," before a make-thunk in there. Yes, a "quasiquote" which by default unquotes is a good idea!
"All of the classic examples of amb require some sort of global state to keep all of the continuation instances. But this is inelegant if you wanted to have multiple usages of (amb) which don't interfere with each other: suppose for example you want to create two threads - one for solving each Sudoku puzzle."
Woo! A kindred spirit! I scratched the same itch right when I was getting started with Arc:
I haven't ended up using this code, mainly because Arc isn't a great ecosystem for reentrant continuations. Arc's looping utilities use mutation, and one of the implementations of Arc which I like to support, Jarc, doesn't support reentrant continuations at all. Maybe you can give 'amb a better home.
"At the same time, you want (amb) to be a macro which lazily evaluates the possibilities"
Not me. My version of (make-amb) simply returns a procedure that chooses one of its arguments. Given that, we can get by with just one global macro for laziness:

(lazy-amb my-amb (foo a b c) (bar d e f) (baz g h i))

which expands into

((my-amb (fn () (foo a b c)) (fn () (bar d e f)) (fn () (baz g h i))))
If you can stand to define a separate macro like 'lazy-amb and use it at each call site, it's a design pattern that obviates most use cases for fexprs.
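In Scheme terms, such a lazy-amb is a one-liner (a sketch; my-amb stands for any procedure, such as one returned by make-amb, that picks one of its thunk arguments):

```scheme
(define-syntax lazy-amb
  (syntax-rules ()
    ((_ amb expr ...)
     ;; Wrap each choice in a thunk, hand the thunks to the chooser,
     ;; then invoke whichever thunk it returns.
     ((amb (lambda () expr) ...)))))
```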
"If you can stand to define a separate macro like 'lazy-amb and use it at each call site, it's a design pattern that obviates most use cases for fexprs."
Indeed. You and I had discussed this quite some time ago, about having convenience macros that expand into thunks. This thunkification is clunky and awful, but it does indeed give many of the benefits of vau.
I still think any new Lisp should use vau, but thunkification is a great way to retrofit many of the benefits of vaus into older (?) Lisps like Arc.
"Outside a macro it's less clear that this is what someone would want. [...] So maybe a "define-hygienic" macro could re-define quasiquote in this fashion - but elsewhere it would work as normal."
I don't see why... have you ever needed the normal quasiquote outside of macros? No? It's quite rare for me to want quasiquote outside of macros.
But the new quasiquote I'm proposing is useful outside of macros, precisely because it doesn't quote: it's a short version of list/cons:
(cons a b)              ==  `(a . b)
(list a b c)            ==  `(a b c)
(list* a b c)           ==  `(a b . c)
(list* a (list b c) d)  ==  `(a (b c) . d)
Then you could actually get rid of the "cons", "list" and list* functions and just use quasiquote instead.
"(list one two (list three))"
That is correct. If you want to express (list a b (c)), you can use `(a b ,(c)).
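As a sketch, this always-evaluating quasiquote can be emulated in standard Scheme with a macro (qq is an invented name for illustration; here ,e means "insert the expression e as-is", e.g. a function call):

```scheme
(define-syntax qq
  (syntax-rules (unquote)
    ((_ (unquote e)) e)                ; ,e : insert the expression unchanged
    ((_ ()) '())
    ((_ (a . b)) (cons (qq a) (qq b))) ; lists (and dotted tails) become conses
    ((_ x) x)))                        ; symbols and literals are evaluated

;; (qq (a . b))    behaves like (cons a b)
;; (qq (a b c))    behaves like (list a b c)
;; (qq (a b . c))  behaves like (list* a b c)
;; (qq (a b ,(c))) behaves like (list a b (c))
```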
"I had also realised that the "hygiene" is currently broken by the fact that, in FirstClass Lisp, "quasiquote", "quote", "unquote", "unquote-splicing", "dot" and "slash" are all "special" in terms of what the reader reads in, so something like the following:"
How I would handle this is to make the syntax expand to global values rather than symbols. So, for instance, I would define quote using vau (or a macro, whatever):
(define quote (vau _ (x) x))
And then I would have 'foo expand to (#<vau:quote> foo) where #<vau:quote> is the actual value of the global quote vau, not the symbol "quote". This avoids the redefinition problem you pointed out, because it doesn't use symbols, it uses values. Like I already said, I believe that any time you use symbols, you're hosed. Symbols are only useful for easily referring to values: it's the values that are important.
"If what you're saying is true then it might be possible to expand all of the "non" first class macros in a pre-compilation step anyway."
Various people (including rocketnia) are working on ways to inline vau, effectively turning them into compile-time macros. This is probably possible with the kind of immutable environments that Nulan has, but much much harder with mutable envs, because you need to roll things back if anything the vau depends on changes, as you already noted.
Personally, what I would do is provide full-blown Kernel-style vaus, including first-class environments:
(vau env
    (a b c) (eval env `(...))
    _ (eval env `(...)))
Note that in the above version of vau, the environment argument comes first, and the remaining arguments are the pattern matching/body pairs, just like in your "lambda", except that it's a vau, obviously.
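To make that concrete: since a vau receives its operands unevaluated along with the caller's environment, control structures fall out directly. A sketch in the same hypothetical notation (my-if is an invented name):

```scheme
; The operands c, t and e arrive as unevaluated source; env is the
; caller's first-class environment. Only the chosen branch is ever
; evaluated - no quoting required at the call site.
(define my-if
    (vau env
        (c t e) (if (eval env c)
                    (eval env t)
                    (eval env e))))
```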
Then I would provide a way to tag a vau saying, "treat this vau like a macro". When the compiler sees something that's tagged as a macro, it will always macro-expand it at compile-time. Something like this:
(macro (vau _
    (x y z) `(x y z)))
Of course, you should also define a "defmacro" or whatever to make the above shorter:

(define defmacro
    (macro (vau env
        (name . bodies)
            (eval env
                `(define ,name
                    (macro (vau _ . ,bodies)))))))

so that the example above becomes:

(defmacro my-macro
    (x y z) `(x y z))
I'd like to note that the above is a lot like the hygienic pattern matching macros found in Scheme... except it doesn't need a massively unbelievably horrifically bloated system: it's built on the very simple concept of tagging vaus with a type.
Depending on how you implement it, macros in the above system may or may not be first-class, but even if they're not first-class, they're still built on top of the first-class vau. So, most code would use macros for speed, but your amb could use a raw vau.
If macros were no longer first-class, then the performance of a program becomes clearer: macros are always expanded at compile-time, so they're always as fast as macros in other Lisps.
But vaus are slower because they're always run at runtime, so this serves as a red flag that says, "hey man, this is slow!". This also suggests a simple rule: "never use a vau when a macro will do". This complements the rule of "never use a macro when a function will do".
I would also build "lambda" on top of vau by using a tagging mechanism (like Kernel's wrap/unwrap):
(defmacro lambda bodies
    `(wrap (vau _ . ,bodies)))
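To illustrate what wrap adds in this hypothetical notation: the trivial operative (vau _ args args) hands back its operands unevaluated, and wrapping it makes the arguments get evaluated first - which is exactly the ordinary list function (my-quote-args and my-list are invented names; this mirrors Kernel's definition of list via wrap):

```scheme
; Unwrapped: an operative that returns its raw, unevaluated operands.
(define my-quote-args (vau _ args args))

; Wrapped: the arguments are evaluated before the vau sees them.
(define my-list (wrap (vau _ args args)))

; (my-quote-args 1 (+ 1 1))  => (1 (+ 1 1))   - raw source
; (my-list 1 (+ 1 1))        => (1 2)         - evaluated
```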
Basically, I believe the idea of tagging vaus to provide different callable behavior is extremely useful, and I prefer to see vau as the center of everything, with macros and functions being a subset of vaus (but with orthogonal purposes).
* : I use the word "vau" to refer to "fexprs" because it's shorter, nicer looking, easier to say, it has an analog to "lambda", and it's what Kernel uses.
* : Except I would call it "fn" rather than lambda, but that's just bike-shedding.