Some time ago I started a GTK+ binding, now "paused". It turned out to be more boring than I initially thought. If you wish, look at it for a starting point (file gtk.arc in Anarki). These days I think a binding to Tcl/Tk would look nicer and be easier to use, though.
Specifically, MzScheme 4+ uses immutable lists, which don't work well with Arc. You might be able to try http://arclanguage.org/item?id=3954, but I have no personal experience with it.
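To see why immutability bites, note that Arc's scar and scdr mutate the underlying Scheme pairs in place, which MzScheme 4's immutable lists forbid. A minimal sketch of the kind of code that breaks (REPL session, assuming arc2 on MzScheme 372):

  arc> (= xs (list 1 2 3))
  (1 2 3)
  arc> (scar xs 99)   ; destructively replaces the car: needs mutable pairs
  99
  arc> xs
  (99 2 3)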
> That is much more complicated to implement, though, and requires environments as first-class objects.
Given the interpreted nature of Arc, first-class environments shouldn't be too hard to implement, but in a compiled implementation it would be a lot more difficult.
They could not be implemented on top of the existing Arc-to-Scheme translator, because that's all it is: a translator, not an interpreter. Scheme doesn't have first-class environments, so we can't shove them into Arc without writing a full-fledged Arc interpreter.
If the environments are temporaries created only while processing macros, and are discarded afterwards, they don't necessarily have to be difficult for compilers.
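A purely hypothetical sketch of what that could look like: suppose the expander handed each macro its expansion environment as an extra value, alive only during expansion. Nothing below exists in Arc; the env argument and bound-in are invented just for illustration:

  ; hypothetical: env is the macro's expansion environment,
  ; created by the expander and discarded right after expansion
  (mac m (env a)
    (if (bound-in env 'f)   ; invented helper: is f lexically bound here?
        `(f ,a)
        (err "f not bound at the expansion site")))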
Which is a bit hard not to know... I mean, in order to install Arc you have to install Scheme, and the install link right next to the tutorial link makes that clear.
This depends on the implementation, not the language. With the great majority of implementations this is true, though. LispWorks and Allegro CL have support for this, but I don't know the real size of the executables, because I've never used them. The problem is that you have to ship the runtime together with the application: if your application uses 'eval, then to execute it you need the compiler and the whole runtime. The comparison with grep isn't fair: grep uses the C runtime that comes pre-installed on the system. You don't notice it, but it is there and it is quite big. If you had a pre-installed Lisp system, then you could deliver small, fast-starting executables.
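For instance, any program with a line like the following forces the whole compiler into the delivered image, because arbitrary code may need to be compiled at runtime (a minimal sketch; run-user-code is just an invented name):

  ; reads an expression from a string at runtime and evaluates it,
  ; so the delivered image must carry the reader, the compiler
  ; and the full runtime along
  (def run-user-code (s)
    (eval (read s)))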
> grep uses the C runtime that comes pre-installed on the system.
Quite right. The fact that the OS itself is (usually) written in C means that nearly every computer running an OS has a C runtime.
As an aside, consider executable file sizes in the Windows world, where the OS does not provide its C library to other programs. Many programs in Windows include their own versions of the C library, increasing their sizes.
Functions can be passed around, so lexical scope is really handy in their case. Macros, instead, are expanded in place. For example, in the macro

  (mac m (a)
    `(f ,a))
"f" is just a symbol. In hygienic macros f would be changed to some unique symbol (actually you wouldn't write such code in a hygienic macro if your intent was to make the expansion call f). In a standard macro system, you voluntarily capture the symbol 'f, maybe because it will be bound to a function at runtime. In that example f doesn't have dynamic scope nor lexical scope: it is just a symbol that the macro has put in the result of its computation. Its meaning (variable or not, lexical scope or not) will be decided on the successive compilation pass.
Yes, exactly. I see why that is true. But think about a parallel function:
  (def n (f a)
    (f a))
In the function, you can pass f as an argument to call. In a macro, f is passed implicitly in the environment the macro is expanded in. The method of passing f to the macro m seems very much like using dynamic scoping to pass arguments to functions. My question is, what about a macro system where you could pass real arguments to macros? I.e., other macros (or functions, etc.)? What makes these things different?
> what about a macro system where you could pass real arguments to macros?
Like this:

  (mac n (f a)
    `(,f ,a))

  (n [+ _ 1] 9)
  ==> 10
where you pass a form that the macro then splices into its expansion, or did you mean something else? Macros' arguments aren't evaluated, so you can only pass forms. To pass the result of a computation, in CL (not in Arc) you can use read macros, which are evaluated before macro expansion time:
  (n #.(computation) 9)
This is quite unusual, because macros are intended mainly to modify the syntax, so it's quite natural to make them work on the syntax itself (i.e. the forms).
Ah, I see. I think I have been thinking about macros differently than you have (and probably wrongly). I suppose lexical scoping for macros would make the most difference when a macro expands to a function call, like this:
  (let x 0 ; call (count) to keep a count of things
    (def count () (set x (+ x 1))))

  ; count-ops: count how many lines of code you run.
  (mac count-ops body
    (if body
        `(do ,(car body)
             (count)
             (count-ops ,@(cdr body)))
        nil))

  (def foo ()
    (count-ops ; this count-ops call fails
      (with (count 1 step 2)
        ; do some loopy stuff here
        )))
If that's not a compelling example, pretend that count-ops is inserting an interrupt check between every two lines of code. Why is dynamic scoping better for cases like these?
As for real arguments to macros, yes, I meant something like the CL stuff. You're right, though, that macros modify syntax, and I wasn't thinking about them that way. Two posts up you said that macros are "expanded in place". I think that you were thinking about the effect macros have on the code they're called on, whereas I was thinking about the call to the actual macro procedure, and passing arguments to it.
Regexps are a specialized tool. They are useful in many cases, but I don't think they are general enough to belong in the core. They should be in the standard library, though.
There are two big problems when learning Arc: the lack of documentation and the quite cryptic error messages. For these two reasons I would recommend knowing a little Lisp before learning Arc. A really good book is Practical Common Lisp (http://www.gigamonkeys.com/book/). I suggest reading at least the first ten chapters.
If you want to learn Scheme first, try How to Design Programs or Structure and Interpretation of Computer Programs (both free online). HtDP is more beginner-oriented, so if you can already program I recommend SICP. There are valuable lessons in HtDP even for an experienced programmer, but it's not as challenging as SICP, so it might be boring.
Besides the fact that he used 212 different modules (I wonder what kind of things his blog does behind the curtains...), the point of the post is that the Perl community has a real library culture: when someone solves a task, he/she puts the solution inside a module and usually documents it. This is a great thing, because it encourages code reuse more than any other programming technique (e.g. OO programming), and it is something that the Lisp community has always lacked, in particular the documentation part.
If the Arc community could learn that important lesson from Perl, that would be a huge leap forward for the Lisp world.