It's a bug. It probably comes from the fact that Arc's variables are prefixed with _ (for Arc2) or __ (Anarki).
arc> an-unknown-variable
Error: "reference to undefined identifier: __an-unknown-variable"
                                           ^^
                                           this __ prefix
Scheme-side, 'ac converts quasiquotes by scanning for escaped (unquoted) expressions and prefixing the symbols in those expressions with __, but apparently there's a bug in that step.
> The source of this problem is not only in chaining itself: redefining standard functions will almost always lead to problems when distinct libraries redefine the same function.
But this should be supported. For instance, in C++, you can overload the "standard function" operator+ for your own type:

Foo operator+(Foo a, Foo b);  // user-supplied overload for Foo

Foo something(Foo a, Foo b) {
  return a + b;  // resolves to the Foo overload at compile time
}
Or in Ruby, you can redefine the "standard function" each:

a = Foo.new      # assumes class Foo defines its own each
a.each { |i|
  puts i
}
C++ does it by strict static typing, while Ruby does it by attaching to the object. Neither way feels very Arcish; what is a better Arc solution?
> Were you thinking about multimethod dispatch such as that of CLOS?
The CLOS solution is pretty similar to the Ruby solution: it checks the types of the arguments passed to the function and dispatches to the correct function. The big difference is that CLOS looks at all the arguments, while Ruby looks only at the first argument (the object). The CLOS approach seems to me the best way to solve the problem, but I think that applying it to functions as basic as car would kill performance without a really good optimizer.
> The CLOS solution is pretty similar to the Ruby solution: it checks the types of the arguments passed to the function and dispatches to the correct function. The big difference is that CLOS looks at all the arguments, while Ruby looks only at the first argument (the object)
IMO the difference is big enough for CLOS Way != Ruby Way. ^^
Still, I wonder - how does CLOS implement this? How about for variadic functions?
> but I think that applying it to functions as basic as car would kill performance without a really good optimizer.
Hmm. I've been trying to grok dynamic-dispatch speedup techniques - the Sun Self papers are pretty good; Sun's index doesn't have the papers themselves, but you can ask Google for them.
I think that for every set of methods sharing a name, a hash table indexed on the types of the arguments is created, and when the method is called, a lookup is made on that table.
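That idea is straightforward to sketch in Arc itself. Here's a minimal sketch, assuming binary methods only; dispatch-table*, add-method2, and call-method2 are hypothetical names, not anything CLOS or Anarki actually defines:

(= dispatch-table* (table))  ; keys: (name type1 type2), values: functions

(def add-method2 (name t1 t2 f)
  (= (dispatch-table* (list name t1 t2)) f))

(def call-method2 (name x y)
  (aif (dispatch-table* (list name (type x) (type y)))
       (it x y)
       (err "no applicable method:" name)))

For example, after (add-method2 'add 'string 'string (fn (x y) (string x y))), the call (call-method2 'add "foo" "bar") returns "foobar". Every call pays a hash lookup, which is exactly the performance worry above.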
> Hmm. I've been trying to grok dynamic-dispatch speedup techniques - the Sun Self papers are pretty good; Sun's index doesn't have the papers themselves, but you can ask Google for them.
I've found those papers in the past (a few weeks ago), but I've never found the will to read them; they're written for people already in the compiler world, and I'm still learning the basics about compilers.
BTW, a good type inferencer + a good function inliner could solve the problem, but I wouldn't even know where to start implementing them :(.
Use a "Polymorphic Inline Cache" (PIC). Basically if a piece of code could call several different methods, we figure out which one it is, then we create a copy of the calling function which does the type checking at the top and specialize all method calls to that type:
> Still, I wonder - how does CLOS implement this? How about for variadic functions?
I don't think CLOS lets you check types on &rest, &optional, or &key parameters. So you couldn't use CLOS for the current behavior of '+.
Also note that CLOS only works on methods with "congruent lambda lists", that is, methods with the same number of required, optional, and keyword arguments. So you can't have, say, a one-argument method and a two-argument method on the same generic function.
ah, I see. So I suppose this greatly simplifies things then.
Hmm. This certainly seems easier to hack into arc. We could have each method start with a generic function whose lambda list we have to match.
As an aside, the base of Arc lambda lists is currently just a rest parameter (i.e. optional parameters are converted into rest parameters with destructuring). Should we match on the plain rest parameters, or should we properly support optionals?
> The easiest solution to your problem might be a callable table whose keys are the types of arguments and whose values are different function implementations.
Yes. But there's a problem: how do you define the table for, say, the variadic function '+ ?
Your solution(s) are approximately what I had in mind, although I think I found some problems with it last night, and I just woke up and can't remember quite yet (brain is still booting or something).
(def + rest
  (reduce <base>+ rest))

(defm <base>+ ((t x string) (t y string))
  (join x y))
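With the string method above installed, a variadic call reduces pairwise, e.g. (assuming this sketch):

arc> (+ "foo" "bar" "baz")  ; (<base>+ (<base>+ "foo" "bar") "baz")
"foobarbaz"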
Further, optional parameters are simply ignored and considered part of the rest parameter (i.e. they can't be typed). Basically, the type system matches on the first N parameters, where N is how many typed parameters you care to define.
Why the first N parameters? So that we can protect against optional parameters defaulting to values that are inappropriate for other types, such as the 'scanner example I gave above.
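So, under that scheme, a method typed only on its first parameter leaves the rest unchecked; a hypothetical sketch (<base>pad is an invented name):

; only the first parameter is typed; y matches any type
(defm <base>pad ((t x string) y)
  (string x y))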
Yes, but: that only holds if, in (annotate t1 (annotate t2 data)), we have (isnt t1 t2) (there was a thread about this somewhere, but I can't find it). Try it!
arc> (annotate 'foo (annotate 'bar obj))
#3(tagged foo #3(tagged bar #3(tagged mac #<procedure>)))
arc> (annotate 'foo obj)
#3(tagged foo #3(tagged mac #<procedure>))
arc> (annotate 'foo (annotate 'foo obj))
#3(tagged foo #3(tagged mac #<procedure>))
I was surprised too; that's why this was a bug I had to fix. I suppose this behaviour makes sense, but it does break the second of these two identities:
(is (type (annotate t r)) t)
(is (rep (annotate t r)) r)
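A minimal REPL illustration of the breakage, in the same output format as the transcript above:

arc> (= r (annotate 'foo 42))
#3(tagged foo 42)
arc> (rep (annotate 'foo r))  ; the second identity says this should be r
42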
Huh. Have you tried it on PG's ArcN? It might have been me hacking this onto Anarki (I have multiple personalities. No, don't believe what I just said, that wasn't me. LOL). Don't have access to an arc right now, sorry ^_^
1) No, you should implement the math in the underlying machine instructions, which are guaranteed to be as precise and as fast as the manufacturer can make them. The underlying machine instructions are fortunately accessible through the standard C library, and the standard C library functions are wrapped by mzscheme, which we then import into arc. (A small illustration follows point 3 below.)
2) It should be, and it isn't.
(defmemo fac (n)
  ((afn (n a)
     (if (> n 1)
         (self (- n 1) (* a n))
         a))
   n 1))
3) Yes, arc-on-mzscheme handles this automagically. arc2c does not (I think it'll overflow)
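Regarding (1): a small illustration, assuming Anarki's '$ escape into the underlying mzscheme (so the sin here is mzscheme's, which in turn wraps the C library):

arc> ($ (sin 1.0))
0.8414709848078965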
Implementing numerically stable and accurate transcendental functions is rather difficult. If you're going down that road, please don't just use Taylor series, but look up good algorithms that others have implemented. One source is http://developer.intel.com/technology/itj/q41999/pdf/transen...
That said, I don't see much value in re-implementing math libraries in Arc, given that Arc is almost certainly going to be running on a platform that already has good native math libraries.
I figured that being close to machine instructions was a good thing, but I thought that we should do that via some other method, not necessarily scheme, which may or may not remain the base of arc in the future.
That being said, if you think that pulling from scheme is a good idea, why don't we just pull all of the other math functions from there as well?
Actually I think it might be better if we had a spec which says "A Good Arc Implementation (TM) includes the following functions when you (require "lib/math.arc"): ...." Then the programmer doesn't even have to care about "scheme functions" or "java functions" or "c functions" or "machine language functions" or "SKI functions" - the implementation imports it by whatever means it wants.
Maybe also spec that the implementation can reserve the plain '$ for implementation-specific stuff.
Looks good, although probably needs more examples, maybe some actual use cases.
On the implementation side... I notice you use (apply list 'do ...). I suspect you could make it one big `(do ...) form, although you may have felt that it was getting a bit complicated, I guess ^^;.
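For illustration, the two spellings build the same form (body-forms here is a hypothetical variable holding the forms to splice):

(apply list 'do body-forms)  ; builds (do form1 form2 ...)
`(do ,@body-forms)           ; the same list, via quasiquote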
For use cases: see Haskell code? :P I wrote this because I really liked them in other languages and in EOPL, and thought they might come in handy. If I come across a place in my Arc code where I use them (I haven't written much new Arc recently), I'll report back.
I also wrote most of this a while ago (I think I started in March), and my coding has improved quite a bit since then, so the implementation could probably get cleaned up... :)
> If I come across a place in my Arc code where I use them (I haven't written much new Arc recently), I'll report back.
Please do ^^. As an aside, it may be possible to transform the AST data structures in arc2c into tagged unions; we could transform the (if (alam ast) ... (alit ast) ...) forms into tcase forms. Care to try? It strikes me as a good demonstration of your code, and it may be useful for shaking out bugs.
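A hypothetical sketch of that rewrite (the exact tcase syntax is tagged-union.arc's, so the variant names, bound fields, and compile-* helpers below are all assumptions):

(tcase ast
  (lam (args body) (compile-lam args body))  ; lambda node
  (lit (val)       (compile-lit val)))       ; literal node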
I like maybe :) Also, it's simple and good for testing (e.g. it has a zero-ary type).
The arc2c thing sounds sensible; I have to fix a bug in tagged-union.arc first (you can't currently have a constructor with the same name as the datatype), but then I'll try to work on it.
I looked into this some more, and it looks like this is a difference between arc and scheme. I just wanted to put this out there for anyone else looking at Scheme materials through an Arc lens.