Hmm. In any case I think our disagreement here has more to do with how we view types. It appears that you have the view of "type" as similar to non-abstract classes in C++: each class defines a set of static functions for accessing objects of that class, and applying a function to an object dispatches according to the object's type. An object of one class cannot be used as an object of a different class.
On the other hand, I view "type" as similar to abstract base classes in C++ (or type classes in Haskell). That is, a "type" defines what the object's interface is. The type defines a set of virtual functions which actual implementations of the type must have. So a 'table type must provide a 'keys virtual function and a '= virtual setter function. Applying a function to the object dispatches according to the actual object, instead of the object's proclaimed type.
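A rough Python analogue of this view (the class and method names here are illustrative, not from Arc): the 'table type is an interface, any implementation must supply 'keys and a setter, and calls dispatch on the actual object rather than on a declared type.

```python
from abc import ABC, abstractmethod

# The 'table "type" as an interface: implementations must
# provide keys and a setter; calls dispatch on the actual
# object, whatever its concrete class is.
class TableLike(ABC):
    @abstractmethod
    def keys(self): ...
    @abstractmethod
    def set(self, k, v): ...

class AlistTable(TableLike):
    """A table backed by a list of (key, value) pairs."""
    def __init__(self):
        self.pairs = []
    def keys(self):
        return [k for k, _ in self.pairs]
    def set(self, k, v):
        self.pairs.append((k, v))

t = AlistTable()
t.set('x', 1)
print(t.keys())  # ['x']
```

Any other class providing the same two methods would be usable wherever a 'table is expected.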
That sounds about right. My thought process was more along the lines of CLOS-style generic functions, but I think these are roughly equivalent to a more generalized notion of C++'s instance methods.
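For a concrete picture of the generic-function style, Python's functools.singledispatch gives a rough model (a sketch only; the 'keys generic here is an illustrative name, not part of any actual system under discussion): one function name, with the implementation chosen by the argument's type.

```python
from functools import singledispatch

# A CLOS-style generic function: one name, many methods,
# selected by the type of the first argument.
@singledispatch
def keys(obj):
    raise TypeError(f"no 'keys method for {type(obj)}")

@keys.register(dict)
def _(obj):
    return list(obj)

@keys.register(list)
def _(obj):
    # a list's "keys" are its indices
    return list(range(len(obj)))

print(keys({'a': 1}))    # ['a']
print(keys(['x', 'y']))  # [0, 1]
```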
As vehemently as I may argue, I'm not actually entirely convinced that this way is best. I come from a Ruby tradition, which is deeply based on the message-passing object model, which ends up behaving a lot like type classes. I've seen this end up working very nicely in practice, facilitating duck typing and allowing all sorts of cool tricks via encapsulation.
At the same time, though, CLOS is supposed to be very excellent. And the one thing, more than any other, that appeals to me about a generic-function style object system is that it can be implemented in pure Arc. It doesn't require any more axioms than the very-simple type, rep, and annotate (née tag).
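To make that claim concrete, here is a minimal Python sketch of annotate/rep/type-style tagging with a generic function dispatching on the tag. The dispatch table and the defmethod/call helpers are illustrative assumptions, not Arc's actual machinery; the point is only that nothing beyond tagging is needed.

```python
# Minimal sketch of annotate/rep/type-style primitives, plus a
# generic-function layer built on top of them. 'methods',
# 'defmethod', and 'call' are hypothetical names for this sketch.
class Tagged:
    def __init__(self, t, r):
        self.t, self.r = t, r

def annotate(t, r):
    return Tagged(t, r)

def rep(x):
    return x.r if isinstance(x, Tagged) else x

def type_of(x):
    return x.t if isinstance(x, Tagged) else type(x).__name__

methods = {}  # (generic name, type tag) -> implementation

def defmethod(name, t, f):
    methods[(name, t)] = f

def call(name, x, *args):
    # dispatch on the object's (possibly annotated) type
    return methods[(name, type_of(x))](rep(x), *args)

defmethod('keys', 'table', lambda d: list(d))
tbl = annotate('table', {'a': 1, 'b': 2})
print(call('keys', tbl))  # ['a', 'b']
```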
The message-passing/type-class model, on the other hand, requires what seems to me to be an incredibly radical change to the core of the language: built-in per-object tables. This just strikes me as fundamentally un-Lispy. In a sense, it eclipses lists as the fundamental data type - they're not much more than tables with "car" and "cdr" keys.
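To illustrate that last point, a tiny sketch (in Python, with the obvious names) of a cons cell as nothing more than a two-entry table:

```python
# If per-object tables were primitive, a cons cell would be
# just a table with 'car and 'cdr keys.
def cons(a, d):
    return {'car': a, 'cdr': d}

def car(p):
    return p['car']

def cdr(p):
    return p['cdr']

lst = cons(1, cons(2, None))   # the list (1 2)
print(car(lst), car(cdr(lst)))  # 1 2
```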
Also, from a more practical sense, I'm not convinced that the message-passing/type-class model offers anything that generic functions don't. The main benefit that I've seen is duck typing - the ability of a function to rely on its input conforming to a given interface (in this case, having functions work properly on it), rather than being of a given type.
But I think this works just as well whether or not the core functions are implemented with a type-class-style system or a generic-function-style system.
Consider, for example, Ruby's favorite duck type: enumerable objects. In Ruby, any class that implements an each method (yielding each element in turn to a block) can mix in the Enumerable module and get various methods like map for free. This can be done using attachments like so:
  ; reconstructed sketch -- 'attach and 'get-attachment are the
  ; hypothetical attachment primitives under discussion
  (let c '("foo" "bar" "baz")
    (attach c 'each
            (fn (f) (map f c))))

  (def each (obj f)
    ((get-attachment 'each obj) f))
The point of all this is that duck typing (or whatever you want to call the versatility granted by assuming a type implements an interface rather than specifically checking its type) is independent of whether the interface is implemented by attaching functions to objects or by defining generic methods.
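For comparison, the Ruby Enumerable pattern described earlier can be sketched in Python (the class names here are illustrative): implement each, and map comes for free from the mixin.

```python
# A Ruby-style mixin: any class providing 'each' gets 'map'
# for free, with no check on the object's declared type.
class Enumerable:
    def map(self, f):
        out = []
        self.each(lambda x: out.append(f(x)))
        return out

class Words(Enumerable):
    def __init__(self, *ws):
        self.ws = ws
    def each(self, f):
        for w in self.ws:
            f(w)

print(Words('foo', 'bar').map(str.upper))  # ['FOO', 'BAR']
```

Nothing in map inspects the receiver's type; it only assumes each works, which is exactly the interface-over-type point made above.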
I'd tend to agree with your vision, almkglor. It is the more generic of the two: the former can be modeled with the latter, but the opposite is not true. I think Arc should remain generic and thus leave us the choice.
The problem is that Arc is not currently generic enough to allow both models. Only generic functions can be implemented with the primitives we're given - the core language has to be modified to allow arbitrary attachments.