1. The function "expand-macro", and support for reader macros. The former makes code-walkers possible, and the latter lets the user create more syntactic structures than [+ _ 1], etc. These two features may not be used as often as ordinary macros, but if Arc is going to put much emphasis on meta-programming, they are important.
2. Implicit currying. Strictly speaking we cannot call it an axiom, but it's quite handy for writing functional-style programs and would make them significantly shorter and easier to read. Implicit currying is not hard to implement. There might be problems when mixing implicit currying with variable-arity parameters, but I suspect those problems are not unsolvable.
One of my friends is a Haskell programmer, and he and I once tried to add implicit currying to Scheme.
The biggest problem is Scheme's variable arity. You might think you can curry based only on the required arguments, but that doesn't work in practice. Consider this simple "complement" procedure:
(def complement (f) (fn args (no (apply f args))))
This loses the arity information of the original function f. It is extremely annoying that a two-argument function f gets curried by
  (f 1)                ; => a one-argument procedure
and its complement does not:
  ((complement f) 1)   ; complement takes any number of args, so f gets called with one and errors
Another example that breaks implicit currying is a simple debugging aid that wraps a given function to trace its invocations. Just redefine f with the wrapped f (which is what many simple 'trace' macros do), and implicit currying on f breaks. It's very fragile.
If you have access to the internals of the implementation, you may be able to extract the arity information from the original function and transplant it onto the new one. But automating that in general is difficult, and relying on programmers to do it manually is cumbersome.
Our conclusion was that implicit currying and variable arity (without type information) didn't mix. You have to choose one or the other.
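Our failed attempt is easy to reconstruct in any language whose functions carry introspectable arity. Below is a minimal Python sketch (auto_curry and the helper names are mine, not from our Scheme code): it curries based on the number of required parameters, and shows how a variadic complement-style wrapper erases exactly the information the scheme depends on.

```python
import inspect
from functools import partial

def auto_curry(f):
    """Return a version of f that curries when given too few positional args.
    Arity is read from f's signature -- the scheme under discussion."""
    required = sum(
        1 for p in inspect.signature(f).parameters.values()
        if p.kind in (p.POSITIONAL_ONLY, p.POSITIONAL_OR_KEYWORD)
        and p.default is p.empty
    )
    def wrapper(*args):
        if len(args) < required:
            return auto_curry(partial(f, *args))  # too few args: curry
        return f(*args)
    return wrapper

def sub(x, y):
    return x - y

csub = auto_curry(sub)
print(csub(10)(3))          # curried, then applied: prints 7

# A complement-style wrapper is variadic, so its signature says
# "zero required arguments" and the currying machinery is defeated:
def complement(f):
    return lambda *args: not f(*args)

cnot = auto_curry(complement(lambda x, y: x < y))
try:
    cnot(1)                 # looks fully applied, so it calls immediately...
except TypeError:
    print("complement broke the currying")  # ...and errors instead of currying
```

This is exactly the failure mode described above: nothing is wrong with auto_curry or with complement individually; they just cannot coexist without extra arity metadata.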
I don't quite get the parallel. I'm talking about the introduction of a new feature X breaking a model implied by an existing feature Y, even if there's nothing wrong with either X or Y individually. Can you elaborate on your remark?
If the [...] syntax had a notation for passing on all arguments, implicit currying would be less important. Being explicit about the currying has the advantage that you don't have to know the arity of a function to recognize that currying is taking place.
How about allowing (+ 3 7 4 6 3) to be written as:
(let numbers '(4 6 3)
  (+ 3 7 . numbers))
Couple this with [...] capturing all arguments in a variable and you get something like this (feel free to come up with a better variable than ^):
  [+ 3 7 . ^]
Yes, by all means. Here, I was talking about currying, though, so splicing was not a concern.
I'm not quite sure whether @ should be allowed to reuse the list when used at the end, but I believe it would be most correct not to. So (as mentioned several times before) the dot notation means cons, while @ means splice, which is not the same thing, even when used at the end of the surrounding list.
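Python's `*` unpacking is a handy point of comparison here: splicing a list into a call builds a fresh argument tuple, so the callee never shares structure with the spliced list. That matches the behaviour argued for above, where splice, unlike a dotted tail, should not reuse the original list. A small sketch:

```python
def grab(*xs):
    collected = list(xs)      # the callee receives a fresh tuple of arguments
    collected.append(99)      # mutating it cannot reach the caller's list
    return collected

numbers = [4, 6, 3]
result = grab(3, 7, *numbers)  # splice `numbers` at the end of the call
print(result)                  # [3, 7, 4, 6, 3, 99]
print(numbers)                 # [4, 6, 3] -- untouched: spliced by copy, not shared
```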
In order to benefit most from this, languages such as Haskell that include this feature have their core library functions written such that the parameters most likely to be "curried away" (which are usually obvious) come first, leaving the main data argument at the end of the parameter list.
For instance, you may want to redefine the function signature for subseq from:
(def subseq (seq start (o end (len seq)))
to:
(def subseq ((start (o end (len seq))) seq)
With that signature you would write (subseq '(1 4) "qwerty"), which is arguably a more elegant style as well. (Yes, you can actually place an optional param inside a destructuring like this in Arc, even though it isn't at the end.) Of course, the need for the quote is perhaps a small minus for some people.
Then you could use it in a curried fashion in a useful way:
;chop two letters off of each word
> (map (subseq '(2)) '("apple" "orange" "banana"))
("ple" "ange" "nana")
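The same parameter-ordering trick carries over directly to other languages. Here is a hypothetical Python chop (my name, not a stdlib function) with the bounds first and the sequence last, using functools.partial in place of implicit currying:

```python
from functools import partial

def chop(start, seq, end=None):
    """Bounds first, sequence last, so the bounds can be curried away."""
    return seq[start:end if end is not None else len(seq)]

# chop two letters off of each word
words = ["apple", "orange", "banana"]
print(list(map(partial(chop, 2), words)))   # ['ple', 'ange', 'nana']
```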
However, the core Arc library commands are, by their nature, very general and hence have few mandatory parameters. Most functions in an actual application will be far more specific and, I believe, will tend to be loaded with mandatory parameters. This is the use case for which function currying would be very valuable.
According to your essays, a major paradigm of Arc is to let hackers do as much as possible, without raising errors when the compiler worries that things are "too dangerous". Partial application/function currying is my #1 request for Arc right now... I would love it if any Arc function given too few non-optional parameters would perform an implicit curry.
In Haskell and the ML family, passing too few arguments to a function just yields a curried version. Of course, those languages have static type systems to reduce the errors caused by passing too few arguments, but I guess Arc's goal is to be good for quick prototyping, not good at eliminating run-time errors...
Implicit currying is another "new possibility".
In CL, applying a list to a number causes an error, but in Arc it's valid (it indexes into the list).
And in CL, passing fewer arguments than a fixed-arity function expects causes an error, but with implicit currying it's not an error any more.
I wholeheartedly agree with your suggestion that implicit currying be added to Arc. Currying was always one of those things I felt was missing from Lisp, especially since Lisp was based on the lambda calculus from the beginning. Also, combining currying with composition provides tons of flexibility for creating new functionality in the language; just look at what can be done with it in Haskell.
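As a small illustration of that currying-plus-composition flexibility, here is a Python sketch (compose is defined here; it is not a Python builtin), with partial standing in for currying:

```python
from functools import partial, reduce

def compose(*fns):
    """Right-to-left composition: compose(f, g)(x) == f(g(x))."""
    return reduce(lambda f, g: lambda x: f(g(x)), fns)

add3   = partial(lambda a, b: a + b, 3)   # curry the first argument away
double = partial(lambda a, b: a * b, 2)
pipeline = compose(double, add3)          # double(add3(x)) == 2 * (x + 3)
print(pipeline(4))                        # 14
```

With implicit currying, the two partial calls would disappear entirely, which is much of the appeal.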