Neither do I. Just trying to be terse (not that I am anyways; I have a problem with conciseness). :)
I never said it needed to be transparent
No, but I was. In my limited experience, using ByteStrings in Haskell is still too much work, as opposed to having the standard Prelude functions work "out of the box". Instead of being able to say
map (\x -> 'x') "abc"
you wind up needing to
import qualified Data.ByteString.Char8 as B
B.map (\x -> 'x') (B.pack "abc")
When you need to start changing every string-related function, the reader, the writer, blah blah blah, it gets to be a hassle. Perhaps it's less of a pain in dynamically-typed languages like Arc. I don't know.
Why another sequence type? Any sequence could be coerced to a simple list, which works with the core functions.
Not everything is a linked list. And forcefully coercing every data structure into a linked list wrecks the time & space complexities that give many data structures their purpose. You'd force ranges (virtual sequences, a la Python) to eat up memory, though you certainly want to (say) map over them. You force arrays to have O(n) random access times. You couldn't have immutable sequences because linked lists are mutable. Etc.
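The range point can be made concrete in Python (just an illustration of the cost model, nothing to do with Arc's internals): a range is a virtual sequence storing only (start, stop, step), so it stays tiny no matter its length, and mapping over it lazily builds nothing up front -- but coerce it to a "simple list" and you materialize every element.

```python
import sys

# A range stores only (start, stop, step): constant size regardless of length.
r = range(10**6)
range_size = sys.getsizeof(r)      # a few dozen bytes, not a million elements

# map is lazy in Python 3: mapping over the range allocates nothing up front.
squares = map(lambda x: x * x, r)
first = next(squares)              # 0

# Coercing to a plain list forces all million elements into memory.
materialized = list(range(10**6))
```

The same trade-off applies to arrays (O(1) indexing lost) and immutable sequences (immutability lost) when everything is funneled through one list type.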
Use map-hash, or map-file, or map-whatever if you need a specialized version for other sequences (or do as I said in the last example).
Frankly, ad-hoc polymorphism is ugly. Take Scheme, for instance, whose (R5RS) standard includes a separate family of comparison operators for each type: = for numbers, char=? and char-ci=? for characters, string=? and string-ci=? for strings.
"When you need to start changing every string-related function, the reader, the writer, blah blah blah, it gets to be a hassle. Perhaps it's less of a pain in dynamically-typed languages like Arc. I don't know."
I also don't know :) Maybe advanced arc users could share their opinion on that?
"Not everything is a linked list. And forcefully coercing every data structure into a linked list wrecks the time & space complexities that give many data structures their purpose."
Yes, this is true.
In fact, it's a decision that needs to be made: should the core high-level functions work across different data structures (as they do in Clojure)?
As of now, map does work with strings (though with behavior I find weird) and doesn't work with hash tables. Is this what we want?
Also, what do you think about multimethods? Do you think it's something useful to be added in Arc?
---------
About the map input versus output, I get what you mean. However, (map [string "test" _] "123") should work :-/ Maybe the problem lies in the concatenation operator used while constructing the new string. (I know it's not a good example... however, look at this one:)
(map [coerce _ 'int] "test") could give "116101115116"
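In Python terms (a hypothetical sketch of the behavior under discussion, not how Arc actually implements map), the "concatenate the string results" reading of mapping over a string looks like this; string_map is my name, not an Arc function:

```python
def string_map(f, s):
    # Map f over each character and concatenate the string results
    # back into a string, preserving the input type (string in, string out).
    return "".join(f(c) for c in s)

string_map(lambda c: "test" + c, "123")    # the (map [string "test" _] "123") case
string_map(lambda c: str(ord(c)), "test")  # the coerce-to-int case: "116101115116"
```

Under this reading, both examples from the thread have a sensible answer: the first yields "test1test2test3", the second flattens the character codes into "116101115116".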
should the core high-level functions work across different data structures?
The answer seems to be a resounding yes. Polymorphism was one of Arc's main principles when it was 3 weeks old (http://paulgraham.com/arcll1.html), and one that's been more-or-less preserved to this day -- just clunkily.
doesn't work with hash tables
I don't care for maptable in Arc. Seems like something that should be done by map. I address the point about hash-tables-as-sequences more in http://arclanguage.org/item?id=12341.
Also, what do you think about multimethods? Do you think it's something useful to be added in Arc?
From what I've seen, generic functions (single dispatch or multimethods) seem a "Lisp-y" way of solving the type-dispatch problem, and you don't really need to go full-blown OO about it. But I don't know much about other options.
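For a feel of what single dispatch buys you, here is a sketch using Python's functools.singledispatch (the names gmap and the per-type methods are mine, purely illustrative): the method is chosen by the runtime type of the first argument, which is the generic-function idea without going full-blown OO.

```python
from functools import singledispatch

# Generic function: dispatch on the type of the first argument.
@singledispatch
def gmap(seq, f):
    raise TypeError("no gmap method for %r" % type(seq))

@gmap.register
def _(seq: list, f):
    return [f(x) for x in seq]

@gmap.register
def _(seq: str, f):
    # Preserve the input type: string in, string out.
    return "".join(f(c) for c in seq)
```

Multimethods generalize this to dispatch on more than one argument, but the flavor is the same: new types plug in by registering a method, without touching the core function.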
However, (map [string "test" _] "123") should work :-/
Oh, I agree. I was just saying, the case could be made. :P
I think my map-as (from your original thread: http://arclanguage.org/item?id=12341) reflects map's intended behavior best, since coerce is really our standard for how types should interact. Not that it should be implemented that way, necessarily. But since coerce will flatten strings, it seems fair to say that the map-as approach would still preserve the input type == output type behavior, but do it by coerce's rules, which happen to play nicely with strings here.
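A rough Python rendering of the map-as idea (map_as is my name for this sketch; it only mimics Arc's coerce rules for the string case, where results are flattened into one string):

```python
def map_as(typ, f, seq):
    # Map f over seq, then coerce the result list to the requested type.
    results = [f(x) for x in seq]
    if typ is str:
        # Mimic coerce's string-flattening: stringify each result and join.
        return "".join(str(r) for r in results)
    return typ(results)

map_as(str, lambda c: c.upper(), "abc")   # "ABC"
map_as(list, lambda c: ord(c), "abc")     # [97, 98, 99]
map_as(str, lambda c: ord(c), "test")     # "116101115116", as in the thread
```

So asking for a string back gives the flattened behavior discussed above, while asking for a list gives you the raw results.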
I've been thinking about the semantics of maptable. Right now it seems misnamed; it iterates over the table before returning it unmodified. But what should the right semantics be? Should it return a table with modified values for the same set of keys? Or should it return a list? Or should each iteration return a (k v) pair so you can get a new table with entirely new keys (I think map-as does this)? All of these could be useful; I think we need a more elaborate language than just map/fold to describe them.
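For concreteness, the three candidate semantics can be sketched over a Python dict standing in for an Arc table (the function names here are mine, not Arc's):

```python
def map_values(f, table):
    # Semantics 1: same keys, transformed values.
    return {k: f(v) for k, v in table.items()}

def map_to_list(f, table):
    # Semantics 2: flatten the table into a list of f(k, v) results.
    return [f(k, v) for k, v in table.items()]

def map_pairs(f, table):
    # Semantics 3: f returns a (key, value) pair, so keys can change too.
    return dict(f(k, v) for k, v in table.items())

t = {"a": 1, "b": 2}
map_values(lambda v: v * 10, t)                # {'a': 10, 'b': 20}
map_to_list(lambda k, v: (k, v), t)            # [('a', 1), ('b', 2)]
map_pairs(lambda k, v: (k.upper(), v + 1), t)  # {'A': 2, 'B': 3}
```

Each is a different answer to "what does map mean for a table," which is exactly why a single map/fold vocabulary feels too coarse here.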