So, it looks like you've stumbled upon the hygienic macro technique that Arc/Nu uses, which is to unquote the macros/functions, thus relying upon the standard lexical scope of functions. It's the same technique that Kernel uses, but it doesn't actually require first-class macros at all, and so it's trivial to add it to Arc 3.1.
In any case, I would suggest that quasiquote by default unquotes things rather than quoting them:
`(foo bar qux) -> (list foo bar qux)
This makes writing hygienic macros a lot easier:
(define if
  (macro (lambda (condition true-case false-case)
           ; Anything not false is considered 'true'
           `((lambda (#f) false-case
                     _ true-case)
             condition))))
Notice that I didn't use any unquotes in the above code. If you ever want to break hygiene, just use unquote + quote:
`(foo ,'bar qux) -> (list foo 'bar qux)
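To make these semantics concrete, here's a tiny sketch in Python (a toy interpreter with invented names, not anything from Arc/Nu or Nulan): symbols in a template resolve through the lexical environment by default, and only an explicit quote produces a literal symbol.

```python
# Hypothetical sketch of a quasiquote that *unquotes* symbols by default.
# Symbols (strings here) are looked up in the environment; ("quote", x)
# keeps x as a literal symbol -- that's how hygiene is deliberately broken.

def quasi(template, env):
    """Evaluate a backquote template: symbols resolve to values,
    ("quote", sym) yields the bare symbol, lists recurse."""
    if isinstance(template, str):                      # a symbol: look it up
        return env[template]
    if isinstance(template, tuple) and template[0] == "quote":
        return template[1]                             # literal symbol
    return [quasi(t, env) for t in template]           # a list: build a list

env = {"foo": 1, "qux": 3}
# `(foo ,'bar qux)  ->  (list foo 'bar qux)
print(quasi(["foo", ("quote", "bar"), "qux"], env))    # [1, 'bar', 3]
```

The point the model makes: with these defaults, the common case (referring to lexically scoped values) needs no punctuation at all, and the rare case (injecting a bare symbol) is what requires marking.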
Over the course of the past few years, I've been slowly coming to the conclusion that hardcoding any name (whether it's a symbol or a string) is a code smell and that quote should only be used for intentionally breaking hygiene, like with Arc's "aif" or "afn".
Taken to the extreme, this also suggests not hardcoding the symbols 'unquote and 'unquote-splicing in the 'quasiquote form... instead, I would make that a part of the syntax itself.
But hey, take what I say with a grain of salt: if you listened to me, you'd wind up with an alternate version of Nulan! Your approach is traditional: it's mostly the same approach taken by most Lisps. Whether you consider that good or bad is up to you.
---
Pattern matching looks good, and that's definitely a good way to go about it, if you want a more traditional Lisp.
Thanks for the solid feedback, Pauan (and akkartik!), with some very good insights there! This sort of feedback is exactly what I was looking for when I posted this on the Arc forum, because I haven't actually done much Lisp before this.
On to the specifics:
For quasiquote... yeah, it had occurred to me that almost every time I'm not 'unquoting' a symbol, it's a mistake... but only when writing a macro. Outside a macro it's less clear that this is what someone would want.
And to clarify, you mean that:
`(one two (three))
would expand into
(list one two (list three))
NOT
(list one two (three))
So maybe a "define-hygienic" macro could re-define quasiquote in this fashion - but elsewhere it would work as normal.
I had also realised that the "hygiene" is currently broken by the fact that, in FirstClass Lisp, "quasiquote", "quote", "unquote", "unquote-splicing", "dot" and "slash" are all "special" in terms of what the reader reads in, so something like the following:
(lambda (quote phrase)
  ('quote-tag quote phrase))
will not do what the author probably intended. It's easy enough to work around it - but it's unsatisfactory.
It seems to me that the correct resolution is to have the reader itself also have a lexical environment which it refers to when generating these special symbols - although you would then have the problem of how to supply this environment to the reader.
By far the most interesting statement you've made is that making the macros first class isn't actually necessary for this hygiene trick anyway! That's very intriguing... I need to think about this.
What I most wanted was a "simple" concept of hygienic macros - that is, simple in understanding how they work: that they are just functions which happen to operate on the unevaluated source tree and return a source tree to be evaluated. Once we have that, I think the next most important thing is that there is no performance cost to using them.
If what you're saying is true then it might be possible to expand all of the "non" first class macros in a pre-compilation step anyway.
I'd really like to get the Sudoku solver at least at the same level of performance as the Python one.
One area in which the first class nature of the macros was particularly elegant was in the implementation of an "amb" macro. All of the classic examples of amb require some sort of global state to keep all of the continuation instances. But this is inelegant if you wanted to have multiple usages of (amb) which don't interfere with each other: suppose for example you want to create two threads - one for solving each Sudoku puzzle.
The conventional examples always add some sort of "amb-reset!" function to say "forget about everything you tried to solve before" - but this is inelegant and wouldn't work for multiple threads.
At the same time, you want (amb) to be a macro which lazily evaluates the possibilities, so that you can write code such as:
(define (anything-starting-from n)
  (amb n (anything-starting-from (+ n 1))))
Having amb be a first-class macro allows the following pattern, which solves both of these problems:
(define (solve puzzle failure-continuation)
  (with* (amb (make-amb-macro failure-continuation)
          assert (make-assert amb))
    ... stuff requiring amb and assert
    ))
So we have a "solve" function whose use of 'amb' is entirely an implementation detail, which means you're guaranteed that it won't interfere with anyone else's use of the amb macro.
Here's the way I defined "make-amb-macro" and "make-assert":
; A hygienic 'amb' macro.
;
; Using a factory function allows us to maintain
; the hygiene of 'amb-fail'. Use 'make-amb' to make
; an amb macro that you know won't interfere with
; anyone else's amb macro. The ability to do this -
; return a hygienically scoped 'amb' macro from a
; function - is something that is only possible in
; a Lisp with First Class macros and continuations
; (i.e. First Class Lisp!)
(define (make-amb-macro exhausted)
  (define amb-fail exhausted)
  (define (set-fail thunk)
    (set! amb-fail thunk))
  (define (get-fail)
    amb-fail)
  ; Based on
  ; http://c2.com/cgi/wiki/?AmbSpecialForm
  (define amb
    (define expand-amb (lambda
      () (list force (get-fail))
      (x) x
      (x y)
      (with* (old-fail (gensym)
              c (gensym))
        `(,let ,old-fail (,get-fail)
           (,force
             (,let-cc ,c
               (,set-fail
                 (,make-thunk
                   (,set-fail ,old-fail)
                   (,c (,make-thunk ,y))))
               (make-thunk ,x)))))
      (x . rest)
      `(,amb ,x (,amb ,@rest))))
    (macro expand-amb))
  amb)
; Given an amb macro, make an appropriate
; assert function. Once again, this sort of
; thing is only possible with first-class macros.
(define (make-assert amb)
  (lambda
    (#f) (amb)
    (condition) #t))
And to top it off: it looks like you forgot a "," before a make-thunk in there. Yes, a "quasiquote" which by default unquotes is a good idea!
"All of the classic examples of amb require some sort of global state to keep all of the continuation instances. But this is inelegant if you wanted to have multiple usages of (amb) which don't interfere with each other: suppose for example you want to create two threads - one for solving each Sudoku puzzle."
Woo! A kindred spirit! I scratched the same itch right when I was getting started with Arc:
I haven't ended up using this code, mainly because Arc isn't a great ecosystem for reentrant continuations. Arc's looping utilities use mutation, and one of the implementations of Arc which I like to support, Jarc, doesn't support reentrant continuations at all. Maybe you can give 'amb a better home.
---
"At the same time, you want (amb) to be a macro which lazily evaluates the possibilities"
Not me. My version of (make-amb) simply returns a procedure that chooses one of its arguments. Given that, we can get by with just one global macro for laziness:
(lazy-amb my-amb (foo a b c) (bar d e f) (baz g h i))
==>
((my-amb (fn () (foo a b c)) (fn () (bar d e f)) (fn () (baz g h i))))
If you can stand to define a separate macro like 'lazy-amb and use it at each call site, it's a design pattern that obviates most use cases for fexprs.
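To illustrate the thunkification shape in another language, here's a rough Python sketch (all names invented; this models only the "plain procedure over thunks" idea, not full continuation-based backtracking, which Python can't express directly):

```python
# Sketch of the thunkification pattern: amb is an ordinary function over
# zero-argument thunks; "lazy-amb" is just the discipline of wrapping each
# alternative in a lambda at the call site.

def make_amb():
    """Return an amb that calls each thunk in order until one succeeds
    (i.e. doesn't raise Fail). No global state is shared between
    different make_amb() instances."""
    class Fail(Exception):
        pass
    def amb(*thunks):
        for t in thunks:
            try:
                return t()
            except Fail:
                continue          # this alternative failed; try the next
        raise Fail()              # every alternative failed
    amb.Fail = Fail               # expose the failure signal to callers
    return amb

my_amb = make_amb()
def sqrt_of(x):
    if x < 0:
        raise my_amb.Fail()       # reject this branch
    return x ** 0.5

# ((my-amb (fn () (sqrt-of -1)) (fn () (sqrt-of 4))))
print(my_amb(lambda: sqrt_of(-1), lambda: sqrt_of(4)))   # 2.0
```

Because each `make_amb()` call closes over its own `Fail` class, two instances can't interfere with each other, which mirrors the thread-safety point above.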
"If you can stand to define a separate macro like 'lazy-amb and use it at each call site, it's a design pattern that obviates most use cases for fexprs."
Indeed. You and I had discussed this quite some time ago, about having convenience macros that expand into thunks. This thunkification is clunky and awful, but it does indeed give many of the benefits of vau.
I still think any new Lisp should use vau, but thunkification is a great way to retrofit many of the benefits of vaus into older (?) Lisps like Arc.
"Outside a macro it's less clear that this is what someone would want. [...] So maybe a "define-hygienic" macro could re-define quasiquote in this fashion - but elsewhere it would work as normal."
I don't see why... have you ever needed the normal quasiquote outside of macros? No? It's quite rare for me to want quasiquote outside of macros.
But the new quasiquote I'm proposing is useful outside of macros, precisely because it doesn't quote: it's a short version of list/cons:
(cons a b)              <=>  `(a . b)
(list a b c)            <=>  `(a b c)
(list* a b c)           <=>  `(a b . c)
(list* a (list b c) d)  <=>  `(a (b c) . d)
Then you could actually get rid of the "cons", "list" and "list*" functions and just use quasiquote instead.
---
"(list one two (list three))"
That is correct. If you want to express (list a b (c)) you can use `(a b ,(c))
---
"I had also realised that the "hygiene" is currently broken by the fact that, in FirstClass Lisp, "quasiquote", "quote", "unquote", "unquote-splicing", "dot" and "slash" are all "special" in terms of what the reader reads in, so something like the following:"
How I would handle this is to make the syntax expand to global values rather than symbols. So, for instance, I would define quote using vau[1] (or a macro, whatever):
(define quote (vau _ (x) x))
And then I would have 'foo expand to (#<vau:quote> foo) where #<vau:quote> is the actual value of the global quote vau, not the symbol "quote". This avoids the redefinition problem you pointed out, because it doesn't use symbols, it uses values. Like I already said, I believe that any time you use symbols, you're hosed. Symbols are only useful for easily referring to values: it's the values that are important.
---
"If what you're saying is true then it might be possible to expand all of the "non" first class macros in a pre-compilation step anyway."
Various people (including rocketnia) are working on ways to inline vau, effectively turning them into compile-time macros. This is probably possible with the kind of immutable environments that Nulan has, but much, much harder with mutable envs, because you need to roll things back if anything the vau depends on changes, as you already noted.
Personally, what I would do is provide full-blown Kernel-style vaus, including first-class environments:
(vau env
  (a b c) (eval env `(...))
  _       (eval env `(...)))
Note that in the above version of vau, the environment argument comes first, and the remaining arguments are the pattern matching/body pairs, just like in your "lambda", except that it's a vau, obviously.
Then I would provide a way to tag a vau saying, "treat this vau like a macro". When the compiler sees something that's tagged as a macro, it will always macro-expand it at compile-time. Something like this:
(define foo
  (macro (vau _
    (x y z) `(x y z)
    _       `())))
Of course, you should also define a "defmacro" or whatever to make the above shorter:
(define defmacro
  (macro (vau env (name . bodies)
    `(define name
       (macro (vau _ . bodies))))))
(defmacro foo
  (x y z) `(x y z)
  _       `())
I'd like to note that the above is a lot like the hygienic pattern matching macros found in Scheme... except it doesn't need a massively unbelievably horrifically bloated system: it's built on the very simple concept of tagging vaus with a type.
---
Depending on how you implement it, macros in the above system may or may not be first-class, but even if they're not first-class, they're still built on top of the first-class vau. So, most code would use macros for speed, but your amb could use a raw vau.
If macros were no longer first-class, then the performance of the program is now clearer: macros are always expanded at compile-time, so they're always as fast as macros in other Lisps.
But vaus are slower because they're always run at runtime, so this serves as a red flag that says, "hey man, this is slow!". This also suggests a simple rule: "never use a vau when a macro will do". This complements the rule of "never use a macro when a function will do".
---
I would also build "lambda"[2] on top of vau by using a tagging mechanism (like Kernel's wrap/unwrap):
(defmacro lambda bodies
  `(wrap (vau _ . bodies)))
Basically, I believe the idea of tagging vaus to provide different callable behavior is extremely useful, and I prefer to see vau as the center of everything, with macros and functions being a subset of vaus (but with orthogonal purposes).
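As a toy illustration of this layering (a Python sketch with invented names — a drastic simplification of Kernel's actual semantics): a vau receives its operand forms unevaluated plus the caller's environment, and wrap produces an applicative that evaluates the operands first.

```python
# Toy model of operatives (vaus) vs. applicatives (wrapped vaus).

def evaluate(form, env):
    if isinstance(form, str):                 # symbol: look it up
        return env[form]
    if isinstance(form, list):                # combination: eval the operator
        op = evaluate(form[0], env)
        return op(form[1:], env)              # operands passed UNevaluated
    return form                               # numbers etc. self-evaluate

class Vau:
    def __init__(self, body):
        self.body = body                      # body(operands, env)
    def __call__(self, operands, env):
        return self.body(operands, env)

def wrap(vau):
    """Make an applicative: evaluate each operand first, then hand the
    resulting values to the underlying vau."""
    def applicative(operands, env):
        args = [evaluate(o, env) for o in operands]
        return vau(args, env)
    return applicative

# quote as a vau: it returns its single operand unevaluated
quote = Vau(lambda operands, env: operands[0])
# add as a function: a vau over already-evaluated values, wrapped
add = wrap(Vau(lambda operands, env: operands[0] + operands[1]))

env = {"quote": quote, "add": add, "x": 42}
print(evaluate(["quote", "x"], env))          # x   (the symbol itself)
print(evaluate(["add", "x", 1], env))         # 43
```

This is the sense in which functions are "a subset of vaus": the only difference is a tag (here, the wrapping closure) that says the operands get evaluated first.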
---
* [1]: I use the word "vau" to refer to "fexprs" because it's shorter, nicer looking, easier to say, it has an analog to "lambda", and it's what Kernel uses.
* [2]: Except I would call it "fn" rather than lambda, but that's just bike-shedding.
Arc/Nu definitely wasn't the first: Kernel used the technique before Arc/Nu did. And in fact, Arc/Nu actually took the idea directly from ar: http://arclanguage.org/item?id=14507
I was merely pointing out that 1) the technique isn't new, and 2) it doesn't require first-class macros, so it's usable in Arc 3.1, etc. Sorry if I gave the impression that Arc/Nu was the first to come up with the idea.
When multiple people independently come up with the same idea, I see that as evidence that the idea is good, not that people are a bunch of thieving stealers. I don't believe in the idea of "stealing an idea", since ideas can only be copied, not stolen. Unfortunately, in the past I've used the word "stealing" to mean "copying". I'll stop doing that.
In any case, I don't think people should be penalized in any way shape or form for copying ideas from others. What matters is whether the idea is good or not, not who came up with it first. At the same time, pointing out that an idea has been invented in the past means that:
1) It can give extra validation to it, based on the assumption that if an idea is independently invented multiple times, it's probably a good idea
2) It suggests further research that you can do, to contrast how these other people tweaked the idea for their own uses. Whereas, if an idea is original, you're completely on your own, without much help.
While developing Nulan, it's been a lot of help for me to research how Arc, Clojure, Racket, etc. handle things like immutability, types, laziness, etc. In that case, having existing ideas to draw from is a plus not a minus.
---
"[...] not the link he referred to [...]"
At a quick glance, that link doesn't seem much like the technique that Kernel or Arc/Nu uses... it's more complicated and seems to be more like rocketnia's latemac: http://arclanguage.org/item?id=14521
---
"We're all stumbling around here, discovering and rediscovering things."
That's right, and there's nothing wrong with that. Nulan is almost entirely based on copying the good ideas from other languages: pattern matching, syntax, immutability, variables, vau... it's all been done elsewhere, in some cases long before. What's unique about Nulan is how all the ideas blend together, not the ideas themselves. The same is true of Patient0's Lisp, and wart, and indeed most languages.
Secondly, the macro caching idea is very cool: you're memoizing macros based on their input. I'm actually surprised that I haven't heard of that technique before... it seems like a really obvious thing to do.
Maybe it's because Kernel ditches macros and just uses raw vaus, in which case you'd have to memoize based on both the environment and the input... hm... that's actually a really interesting idea...
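The caching idea can be sketched in a few lines of Python (the `when` expander and all names here are hypothetical): memoize the expander on its input form, with the caveat from the thread that a cache like this goes stale if the expansion depends on mutable state.

```python
# Memoizing macro expansion keyed on the input form.
# Forms must be hashable, so they're tuples here.
from functools import lru_cache

expansions = 0                       # count how often we actually expand

@lru_cache(maxsize=None)
def expand_when(form):
    """(when test body...) => (if test (do body...) None) -- hypothetical."""
    global expansions
    expansions += 1
    _, test, *body = form
    return ("if", test, ("do", *body), None)

form = ("when", ("ready?",), ("launch!",))
first = expand_when(form)
second = expand_when(form)           # identical form: served from the cache
print(first)                         # ('if', ('ready?',), ('do', ('launch!',)), None)
print(expansions)                    # 1
```

For Kernel-style raw vaus, as noted above, the cache key would have to include the environment as well as the input, since the same form can expand differently in different environments.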
---
Thirdly, you mentioned your strategy for first-class continuations... I had actually been trying to figure out how to implement continuations in Nulan, but your explanation is simple enough that it should work, so much thanks for that!
Kernel doesn't actually go all the way down the rabbit hole; it can't handle non-functions passed to apply.
As far as we know, no fexpr lisp handles apply. I've been struggling with this; in wart apply handles at least the most common kinds of macros, and warns on the rest: http://arclanguage.org/item?id=16378.
"Kernel doesn't actually go all the way down the rabbit hole; it can't handle non-functions passed to apply."
I would just like to point out that apply does work with vaus in Kernel, you just have to wrap them first:
(apply (wrap $and) ...)
In Kernel, these two are exactly the same:
(apply foo ...)
(eval (list* (unwrap foo) ...))
Which means that you can either call apply with a wrapped vau, or you can just call eval directly: (eval (list* $and ...))
This works because "wrap" in Kernel takes a vau and returns a function, and "unwrap" takes a function and returns a vau. So, yes, you're technically right: you can only apply functions in Kernel... but it's so easy to convert back and forth between functions/vaus that I consider this a non-issue.
But I'm going to assume that when you said that Kernel "can't handle non-functions", what you really meant is that Kernel can't automagically figure out your intent... well, yeah. I'm unaware of any macro system that does any better.
Vaus and functions are fundamentally different things that have orthogonal purposes, so you can't really expect to be able to mix and match them freely without specifying some sort of conversion, like with wrap/unwrap.
...and (I think this is the simplest one to think about, assuming assignment is implemented as a macro):
(apply = '(x 3))
No lisp handles both intuitively. Just unwrapping inside apply won't do the right thing in both cases. If you try to pass in arbitrary macros to (apply (unwrap f) ...) you _will_ run into problems. An earlier discussion[1] pointed out an example using and and all. _No_ combination of quoting will fix this.
It's pretty clear that this problem is ill-posed. But we still want apply, for all its misshapen form in our heads. I spent some time trying to do without it, and did not have a good time. In cases like this I believe the first allegiance of a language is to the intuition of the humans using it, not to some abstract platonic ideal.
> "I'm unaware of any macro system that does any better."
I think wart handles many of the easy cases more intuitively (and continues to flag hard cases outside its ken, just like Kernel). My original link makes the case that it is strictly better. I've been using it and haven't run into bugs in a while. But you haven't commented on it yet; I'd love to hear counter-examples and criticism.
"No lisp handles both intuitively. Just unwrapping inside apply won't do the right thing in both cases. If you try to pass in arbitrary macros to (apply (unwrap f) ...) you _will_ run into problems. An earlier discussion[1] pointed out an example using and and all. _No_ combination of quoting will fix this."
Sure, but there's still some vaus that do work. So it's a question of how many vaus work, not whether they work at all. In any case, we both agree, I was just clarifying for the sake of Patient0.
---
"I think wart handles many of the easy cases more intuitively (and continues to flag hard cases outside its ken, just like Kernel). My original link makes the case that it is strictly better. I've been using it for a while and haven't run into bugs. But you haven't commented on it yet; I'd love to hear counter-examples and criticism."
I'm not really fond of hardcoding things. To me, it smells like a limitation in the language. So I'd rather find a way that works well with both vaus and functions. My plan for Nulan is like so:
Basically, rather than having apply... I instead have @qux expand to (splice? qux). Then vaus/functions will do different things with splice? forms. That should give an intuitive way to deal with the problem.
In other words, rather than specifying the behavior in "apply"... the behavior is specified by the vau/function themselves. This also lets you define your own custom behavior for splicing, like with lists and dictionaries.
"rather than specifying the behavior in "apply"... the behavior is specified by the vau/function themself."
Interesting. I'm going to think about this.
I agree that hard-coding is not at all ideal, and I'd love to lose it. But I don't want to give up a feature just because it's ugly to implement. And I find it totally reasonable to articulate limitations of Kernel :)
"And I find it totally reasonable to articulate limitations of Kernel :)"
Sure, otherwise all of us here would be using Kernel, right? I probably went a bit overboard in my nitpicking, sorry.
---
This actually brings me to something I wanted to discuss... I've decided to significantly simplify the way types behave in Nulan. Here's how it works:
There's a function called "type" which when called with 0 arguments will return a special function:
$def foo: type
Now we can call the special function "foo" with any number of arguments:
$def bar: foo 1 2 3
And we can then destructure it:
$let (foo A B C) bar
# in here, A is 1, B is 2, and C is 3
And that's pretty much it. Using this you can create new data similar to Racket's structs (only much more flexible), you can tag data with any values you want... you can implement traits... you can implement normal OOP (though it's a bit tricky, because this system is much more flexible than traditional OOP, so you'd have to actually add in restrictions to make it less flexible!)
And then there's a small twist... you can pass two callable things to "type":
$def foo; type
$fn ...
$fn ...
The first callable will be called when constructing the type, like when you call (foo 1 2 3), and the second callable will be called when destructuring the type, like with (foo A B C).
That's it. That should be enough to implement everything: lists, dictionaries, vau, fn, special data types that have different behavior when splicing on construction or destruction...
Basically, with the above, you can easily implement pretty much any behavior you want, and have it fit seamlessly into the language.
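Here's a rough Python analogue of the construct/destructure pair described above (the API is invented for illustration, not Nulan's actual one):

```python
# make_type() returns a tagger: calling it constructs a tagged value, and
# destructure() recovers the arguments only for values built by that same
# tagger. The tag is a fresh object compared by identity, so it can't be
# forged -- the analogue of the system not relying on hardcoded symbols.

def make_type():
    tag = object()                          # fresh, unforgeable identity
    def ctor(*args):
        return (tag, args)
    def destructure(value):
        t, args = value
        if t is not tag:
            raise ValueError("value was not built by this type")
        return args
    ctor.destructure = destructure
    return ctor

foo = make_type()
bar = foo(1, 2, 3)                          # $def bar: foo 1 2 3
a, b, c = foo.destructure(bar)              # $let (foo A B C) bar
print(a, b, c)                              # 1 2 3

other = make_type()
# other.destructure(bar) raises ValueError: tags are distinct identities
```

Because the tag is a value rather than a name, two independently created types can never collide, even if a user gives them the same variable name.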
Yeah it definitely seems worth trying. I'm reminded of our argument many months ago about a type system where you could build table-like entities using duck typing[1]. Neither of us was thinking about Go back then. I think it's worth another attempt to replace isa with something like interfaces. rocketnia hates isa too, iirc :)
[1] I was overly nitpicky in that argument, sorry. I don't know if I ever said that.
"I'm reminded of our argument many months ago about a type system where you could build table-like entities using duck typing[1]."
Yeah, but in retrospect it was an awful idea. My latest ideas for Nulan are soooo much better than my old ideas. I still think my old idea would have been better than Arc's annotate/type/rep, but...
1) It was too complicated: something that should be in a library, not in the core
2) It used hardcoded strings/symbols for the keys... something I've really tried to avoid lately. My new idea for Nulan has zero hardcoding: it's all done based on values and positions
3) It was a bit too OOP for my tastes... even back then. But at the time it was the best idea I had for extensible data types
---
"rocketnia hates isa too, iirc :)"
Well, I'm not rocketnia, but I think that's because:
1) It uses symbols for types, at least in Arc 3.1
2) It plays poorly with ssyntax, since you can say foo?.x but you have to say (isa x foo)
3) If you want to add new types that behave like existing types, you need to always extend isa (or type), but with proper foo? functions you only need to extend those individual functions
This suggests that having string?, cons?, etc. rather than (isa ... 'string) and (isa ... 'cons) is a good idea.
But I take it a step further in Nulan and also say that annotate/rep are poor because if you want to tag something with multiple types, you need to wrap it repeatedly: (annotate 'foo (annotate 'bar ...))[1] and then you need to unwrap it repeatedly: (rep (rep ...))
This also means that annotate is not transparent: an annotated thing doesn't behave the same as a non-annotated thing... quite a while back waterhouse talked about having the base primitives like car/cdr/etc. automatically call rep: that way you could annotate things and they'd transparently work with existing stuff.
But as rocketnia pointed out, sometimes you don't want that. So there are times where you want something to behave transparently and other times where you don't. I think my type system can accommodate that easily.
---
In any case, I've become pretty convinced that Paul Graham was right at least in theory: you only need the theoretical equivalents of annotate/type/rep to implement any type system you want. The problem is that they were implemented poorly in Arc 3.1, not that the idea itself is bad. Nulan's (current) type system is theoretically equivalent to Arc's annotate/type/rep but I believe it's far superior in implementation.
---
* [1]: Alternatively, you could use a list of types: (annotate (list 'foo 'bar) ...) but that doesn't work with isa, at least not without retrofitting it, and it still doesn't solve the transparency problem.
"Well, I'm not rocketnia, but I think that's because"
You're a pretty good representative of... my past self. ^_^
---
"1) It uses symbols for types, at least in Arc 3.1"
Yep, there's no need to have first-class type tags of any sort when first-class wrapper and unwrapper functions could do that job.
---
"2) It plays poorly with ssyntax, since you can say foo?.x but you have to say (isa x foo)"
Nowadays I'd rather design the technical parts of the system first and then design the syntax to support it the best it can.
---
"3) If you want to add new types that behave like existing types, you need to always extend isa (or type), but with proper foo? functions you only need to extend those individual functions"
These days I would rather give library writers the ability to say what's extensible and what isn't. If it isn't extensible enough to support a new type, too bad. Fork the library.
This applies to languages too. If the language isn't extensible enough, fork it. I once wanted to design languages such that nobody would need to fork them, but I don't put much faith in that anymore, so I see it as a lower priority.
Anyway, I don't like Arc's 'type and 'rep because even though they can be used on any value, that misses the point. If I want to make a value that anyone can take apart, I'll use a table or a cons cell. Type wrappers are good for setting aside new areas of value space that _no_ existing utility knows how to handle yet.
--
"But I take it a step further in Nulan and also say that annotate/rep are poor because if you want to tag something with multiple types, you need to wrap it repeatedly: (annotate 'foo (annotate 'bar ...)) and then you need to unwrap it repeatedly: (rep (rep ...))"
In that particular case, I would consider making an 'unwrap-bar function and extending it to handle both 'foo and 'bar.
I would probably name 'unwrap-bar more descriptively, reducing its association with the concrete implementation details of the 'bar type, if only these weren't metasyntactic variables. :-p
Anyway, what I'm doing isn't necessarily strictly better than what you're doing. We're thinking of different overall systems.
" I once wanted to design languages such that nobody would need to fork them, but I don't put much faith in that anymore, so I see it as a lower priority."
Welcome to the club :) It's a pretty select club, I think. Too much effort is expended to avoid forking in the name of 'reuse', IMO.
But reuse in what context? Reuse all across the universe? I think the notion of reuse is often ill-posed. The only reuse that matters is in the context of a codebase. And as long as a codebase only has one fork of any software you can have unlimited forking in the world without reducing reuse.
(This also kinda feeds into our conversations about namespaces and libraries and backwards compatibility.)
"(This also kinda feeds into our conversations about namespaces and libraries and backwards compatibility.)"
Well, I only care about libraries because it means that (theoretically) one guy can write the code and a bunch of people can use it. The same is true of languages.
This is nice for things that are pretty stable, like regexps. Everybody knows regexps. The syntax is reasonably standardized across implementations. It doesn't make sense to have everybody write their own custom regexp implementation: just write one solid one and use it as a library.
But that only works when there really is a Single Right Way to do it. As soon as there's different goals and priorities, you just end up with a lot of little custom libraries (or lots of forks of libraries), in which case they might as well not be libraries to begin with, since they're only really useful to their original author.
This means you should only write libraries when there's some sort of standard or consensus. If there isn't, just write your code in a very flat simple style, no libraries needed.
---
As for namespaces... I don't actually care about those for code reuse. Libraries handle code reuse just fine with or without namespaces: look at Emacs Lisp, a language with no namespaces and dynamic scope, yet it seems to manage okay. C also has lots of libraries and code reuse yet doesn't have namespaces.
The reason I care about namespaces is that it makes certain things easier to reason about, that is, it reduces the cognitive load needed to design and understand a program. It also makes your programs shorter because when conflicts do arise, you don't need to do things like prefixing all your global variables, like they do in C/Emacs/Python/JavaScript/etc.
Well, that's the theory, anyways... in practice, I agree with you that traditional namespaces are overrated and only mildly helpful while requiring a lot of infrastructure to support them. An overall net loss that can only be recouped by writing many libraries over a long period of time that make use of the namespace system.
But I think Nulan is a bit unique in that it doesn't have traditional namespaces, but its immutable environments give you simple partitioning that I believe reduces cognitive load while not restricting you too much.
By the way... because they're (mostly) transparent and you can easily have multiple types attached to a single blob of data, you can use the type system to attach any arbitrary metadata you want. I can't help but feel that there's some connection there to monads...
I also just realized it might have some interesting uses as an alternative to exceptions. Rather than throwing an exception, you would instead return an object that has been tagged with an "exception" type.
By default this exception type would be propagated upwards until it reaches the top of the scope, but you can catch it by using pattern matching:
$let (exception? X) ...
...
I still have no clue how to handle errors in Nulan, though... error values, exceptions, Maybe type, something else...
"By the way... because they're (mostly) transparent and you can easily have multiple types attached to a single blob of data, you can use the type system to attach any arbitrary metadata you want. I can't help but feel that there's some connection there to monads..."
Yep! There's a pretty close connection to comonads, the opposite of monads. Comonads are for structures that can be unwrapped into a value at any time, but which can also be transformed without unwrapping by... confusingly, sending them a function that would unwrap them.
extract :: MyWrap x -> x
extend :: (MyWrap x -> y) -> MyWrap x -> MyWrap y
The point is that the wrapper that appears around y is generally related to the wrapper x started with. Note that extend's function argument might actually process multiple (MyWrap x) values, all hidden inside the main (MyWrap x).[1]
To illustrate, a pair of a Kernel expression and a Kernel environment can be used as a comonad, as long as we design Kernel expressions so that they can reify arbitrary values using 'quote. The extract method is eval, and the extend method bundles up the result as a literal expression together with the original environment.
[1] Aside from a few laws this has to follow, this kind of variation of behavior is particular to the implementation of each comonad. There will typically be some special-case utilities outside the comonad interface that take advantage of these specific features.
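To ground those signatures, here is one concrete comonad in Python (the standard "env"/product comonad, chosen for simplicity — not anything specific from the thread): MyWrap x is a pair (env, x).

```python
# The env (product) comonad: a value bundled with a read-only environment.

def extract(w):
    """extract :: MyWrap x -> x -- drop the environment."""
    env, x = w
    return x

def extend(f, w):
    """extend :: (MyWrap x -> y) -> MyWrap x -> MyWrap y
    Run a whole-wrapper function, keeping the original env around
    the result -- the wrapper around y is related to the one x had."""
    env, _ = w
    return (env, f(w))

w = ({"scale": 10}, 4)
# a whole-wrapper function: it may consult the env, not just the value
scaled = extend(lambda pair: pair[0]["scale"] * extract(pair), w)
print(extract(scaled))                            # 40
# one of the comonad laws: extend extract == identity
print(extend(extract, w) == w)                    # True
```

Note how `extend`'s argument really does take the whole wrapped value, matching the "confusingly, sending them a function that would unwrap them" description.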
"I also just realized it might have some interesting uses as an alternative to exceptions. Rather than throwing an exception, you would instead return an object that has been tagged with an "exception" type."
This is monadic. Generally if you build a program out of a lot of functions of type (x -> MyWrap y) you can probably treat MyWrap as a monad, whereas if you build it out of a lot of functions of type (MyWrap x -> y) you can probably treat it as a comonad.
---
"By default this exception type would be propagated upwards until it reaches the top of the scope, but you can catch it by using pattern matching"
In this case you might be implementing it in a monadic way, but for most purposes the language user might as well treat it as a regular side effect.
I've been pondering the question of how to make the syntax of a language generic enough to integrate user-defined side effect models as though they're built in, just like this. However, this is a rabbit hole which has prevented me from getting very much done. :-p It's also straying pretty far from this thread's topic.
I just want to say thanks akkartik for your public writings about "apply" and first class macros - this allowed me to not go down the rabbit hole of figuring out what `(apply <macro> args)` ought to mean and concentrate on getting the Sudoku solver to work instead ;-)
Thanks for that "kernel" of information about "apply"!!
Yeah, the one catch to caching the macro expansion is if the macro depends on something which subsequently changes after the expansion is cached. In practice I was not that bothered about this - I never ended up needing to write such a macro.
Ah, I've been struggling to build an inliner and partial evaluator for speeding up my interpreter (http://github.com/akkartik/wart#readme). Why didn't I think of just caching macro expansions?! Many thanks.