Arc Forum | fallintothis's comments

Good discussion. :)

shader already said the first thing I thought of:

Ideally, the 'results' in the documentation would still be computed dynamically, so you can always see whatever the current output would be, but also have the expected results available for separate regression testing.

All the talk about stdin/stdout/randomness got me realizing that, really, any state is going to be an issue (as state is wont to be). Not that it's often a problem in a mostly-functional language like Arc.

On the one (liberating) hand, Arc's philosophy clearly guides us to not lose sleep over the potential problems of state---just be careful. I see your destructive examples are careful to use ret, for instance, rather than global assignment. On the other hand, we are losing the ability to express certain kinds of tests here. Ideally, we could eval the examples in their own "sandbox" and capture their output, just as if it had been at a REPL.

This is an interesting problem, because it feels so familiar. After some slow, tired thoughts (been a long day for me), it hit me: "Oh! We're doing doctests, except in the other direction."

In Python, doctests work like this:

1. Copy/paste a REPL interaction into a docstring.

2. Looking up the documentation shows the static string.

3. A test function (doctest.testmod()) combs through docstrings, parses them apart, runs them in isolated environments, and checks their results.

With your proposal, it's reversed:

1. You give some literal code as examples (yeah, Lisp!).

2. Looking up the documentation evaluates the example in the current environment.

3. Running the example generates some output that we print out as though it had been pasted from a REPL interaction.

For point 1, using sexps seems more befitting to Arc than parsing out docstrings. Yet we do lose something here: writing literal strings of expected output is cumbersome, as you mentioned. Much better would be to just copy & paste an entire REPL interaction (inputs & outputs) into a multiline string and call it a day.

Point 3 is a small detail: to run the tests, you "have" to look at the docs. Except writing a doctest.testmod() equivalent isn't hard---just evaluate all the examples without printing their docs.

But Point 2 is an interesting one. Arc doesn't have much support for environment-mangling: eval doesn't like lexical variables, lack of modules/namespaces, the difficulties of environment introspection, the weirdnesses that macros introduce, etc. About the only way I see to have a guaranteed clean slate (without requiring us to just "be careful" with the examples) is if we could fork off another Arc interpreter, run the example in that process, and capture the output (including stderr!) for comparison with the docstring. I reckon it would be slow as hell (one interpreter for each block of examples), but is that otherwise feasible?
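The fork-and-capture idea is easy to sketch in Python standing in for Arc (a hypothetical stand-in; a real version would exec the Arc binary instead of sys.executable):

```python
import subprocess
import sys

def run_sandboxed(code):
    # Run the example in a brand-new interpreter process, so no state
    # leaks between examples, and capture stdout *and* stderr.
    proc = subprocess.run([sys.executable, "-c", code],
                          capture_output=True, text=True, timeout=30)
    return proc.stdout + proc.stderr

print(run_sandboxed("print(1 + 1)"), end="")
```

Slow, as suspected: every call pays full interpreter startup.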

I like the idea of just having to compare output strings universally (the repl-output comment I made), so that we don't really have to decide which expecteds to selectively evaluate. Instead, just look at the strings: (is "#hash(...)" "#hash(...)"). But there are probably ways that just comparing the output strings could break. Obvious things like random number generation, race conditions, user input. But even tables: are two tables with the same contents guaranteed to serialize the same way? I'm too tired to effectively think through or look up an answer.
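Python's dicts make the serialization worry concrete: two equal tables need not print identically, since (from 3.7 on) they print in insertion order:

```python
# Two tables with the same contents, built in different orders.
a = {"x": 1, "y": 2}
b = {"y": 2, "x": 1}

print(a == b)            # equal as tables...
print(str(a) == str(b))  # ...but their printed forms differ
```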

Pleasant dreams, me.


1 point by akkartik 3638 days ago | link

"We're doing doctests, except in the other direction."

Yeah. Another key bit of prior art is Go's documentation, for example.


Actually, stdout should be easy to test with tostring, no?

In fact, if you're just testing REPL interactions...

  (mac repl-output (expr)
    `(write (eval ',expr)))

  arc> (repl-output (prn 'hi))
  arc> (repl-output (prn "hi"))
  arc> (repl-output 'hi)
  arc> (repl-output (do prn.1 prn.2 prn.3))
(Any gotchas with the above?)

It makes sense to specify the example expr / expected pairs as sexps & strings: run repl-output on exprs, compare those strings to the expected strings.
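The same capture-everything trick, transplanted to Python for illustration (a sketch, not the Arc macro): printed output concatenated with the repr of the return value is exactly what a REPL transcript shows.

```python
import io
from contextlib import redirect_stdout

def repl_output(thunk):
    # Capture everything the thunk prints, then append the repr of
    # its return value, mimicking a REPL transcript.
    buf = io.StringIO()
    with redirect_stdout(buf):
        value = thunk()
    return buf.getvalue() + repr(value)

print(repl_output(lambda: 1 + 1))
```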


2 points by akkartik 3638 days ago | link

Thanks for thinking through that! Yes, I'd had a similar, though not as fully-formed, idea, but was stuck on distinguishing stdout from return value in the examples. But yeah, that's overthinking it -- we have to mentally parse the two at the repl anyway.

Only other quibble: a string isn't very nice to read next to code, even if it looks nice at the repl. Hmm..


Short answer: to reproduce the exact example you have...

  arc> (n-of 3 (range 0 9))
  ((0 1 2 3 4 5 6 7 8 9) (0 1 2 3 4 5 6 7 8 9) (0 1 2 3 4 5 6 7 8 9))
Long answer:

Depends on what you're trying to do. :)

I could sit here and rattle off builtin Arc helper functions & macros, but there's a bigger lesson to be learned here about functional programming. Specifically, the list comprehensions you naturally gravitate towards are just syntactic sugar for the famous map & filter higher-order functions.

In Python (at least in Python 2.x, where map and filter return lists instead of generators), the equivalence looks like this:

  [expr_involving_x for x in xs]      == map(lambda x: expr_involving_x, xs)
  [x for x in xs if expr_involving_x] == filter(lambda x: expr_involving_x, xs)
For example,

  >>> [x + 1 for x in [1,2,3]] == map(lambda x: x + 1, [1,2,3])
  True
  >>> [x for x in [1,2,3] if x == 2] == filter(lambda x: x == 2, [1,2,3])
  True
You can apply these transformations recursively until you remove the list comprehension notation altogether. E.g.,

  [x + 1 for x in [1,2,3] if x == 2] -> map(lambda x: x + 1, [x for x in [1,2,3] if x == 2])
                                     -> map(lambda x: x + 1, filter(lambda x: x == 2, [1,2,3]))
Nested comprehensions have a transformation rule, too, but it's somewhat more complicated:

  [expr_involving_y for x in xs for y in expr_involving_x] == sum([[expr_involving_y for y in expr_involving_x] for x in xs], [])
For example,

  >>> [y+1 for x in [1,2,3] for y in [x+1,x+2,x+3]] \
  ... == sum([[y+1 for y in [x+1,x+2,x+3]] for x in [1,2,3]], []) \
  ... == sum([map(lambda y: y+1, [x+1,x+2,x+3]) for x in [1,2,3]], []) \
  ... == sum(map(lambda x: map(lambda y: y+1, [x+1,x+2,x+3]), [1,2,3]), [])
  True
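(In Python 3, where map and filter return lazy iterators, the same equivalences hold once the results are wrapped in list(...); a quick check:)

```python
xs = [1, 2, 3]

assert [x + 1 for x in xs] == list(map(lambda x: x + 1, xs))
assert [x for x in xs if x == 2] == list(filter(lambda x: x == 2, xs))

# The nested-comprehension rule, checked end to end:
nested = [y + 1 for x in xs for y in [x + 1, x + 2, x + 3]]
flattened = sum([[y + 1 for y in [x + 1, x + 2, x + 3]] for x in xs], [])
assert nested == flattened == [3, 4, 5, 4, 5, 6, 5, 6, 7]
print("ok")
```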
So, list comprehensions are just a way of saying the same things as higher-order functions. Arc generally favors the higher-order function flavor of this idiom. map is still called map, filter is called keep, and Python's sum(..., []) is spelled (apply join ...):

  arc> (map (fn (x) (+ x 1)) (list 1 2 3))
  (2 3 4)
  arc> (keep (fn (x) (is x 2)) (list 1 2 3))
  (2)
  arc> (apply join (map (fn (x) (map (fn (y) (+ y 1)) (list (+ x 1) (+ x 2) (+ x 3)))) (list 1 2 3)))
  (3 4 5 4 5 6 5 6 7)
You might think Arc's preference for these comes from Lisp dialects naturally eschewing specialized syntax. But, Arc does like syntactic shortcuts; just ones that apply more generally to functions, like bracket notation (e.g., [+ _ 1]) and ssyntax (e.g., ~f and f:g come up a lot in higher-order calls).

It's also interesting to note that due to the previous rules, you can easily write a macro that turns list comprehensions into the equivalent chain of higher-order function calls.
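Here is what such a transformer might look like, sketched over Python's ast module rather than as an Arc macro (handles only single-generator comprehensions, and assumes Python 3.9+ for ast.unparse):

```python
import ast

def comp_to_calls(src):
    # Rewrite "[expr for x in xs if cond]" into the equivalent
    # map/filter chain, returned as a source string.
    comp = ast.parse(src, mode="eval").body
    assert isinstance(comp, ast.ListComp) and len(comp.generators) == 1
    gen = comp.generators[0]
    var = ast.unparse(gen.target)
    seq = ast.unparse(gen.iter)
    for cond in gen.ifs:  # each "if" becomes a filter around the sequence
        seq = "filter(lambda %s: %s, %s)" % (var, ast.unparse(cond), seq)
    body = ast.unparse(comp.elt)
    if body != var:       # a nontrivial body becomes a map
        seq = "map(lambda %s: %s, %s)" % (var, body, seq)
    return "list(%s)" % seq

src = "[x + 1 for x in [1, 2, 3] if x == 2]"
print(comp_to_calls(src))
print(eval(comp_to_calls(src)) == eval(src))
```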

There are many Arc builtins that can help build lists, too. They're all defined in Arc itself, so just take a look through arc.arc to figure out how they work. Functions I spy that may help for list building (list of lists or otherwise): rem, mappend, trues, firstn, nthcdr, cut, tuples, pair, n-of, bestn, split, retrieve, dedup, reduce, rreduce, range, ... Really just depends on what you're looking to do!


2 points by bh 3703 days ago | link

Wow, awesome answer. Thanks!


2 points by fallintothis 3718 days ago | link | parent | on: How to read mutable hash tables

Disk serialization of variables & tables is already part of Arc 3.1's fromdisk, diskvar, disktable, and todisk:

  $ arc
  Use (quit) to quit, (tl) to return here after an interrupt.
  arc> (disktable h "/tmp/nested")
  arc> (= h (obj this "this" that 3 nested-hash-table (obj one "one" two "two" three 3) some-list (list 1 2 "three" 4)))
  #hash((that . 3) (some-list . (1 2 "three" 4 . nil)) (this . "this") (nested-hash-table . #hash((one . "one") (two . "two") (three . 3))))
  arc> (todisk h)
  ((nested-hash-table #hash((one . "one") (two . "two") (three . 3))) (this "this") (some-list (1 2 "three" 4)) (that 3))
  arc> (quit)
  $ cat /tmp/nested && echo
  ((nested-hash-table #hash((one . "one") (two . "two") (three . 3))) (this "this") (some-list (1 2 "three" 4)) (that 3))
  $ arc
  Use (quit) to quit, (tl) to return here after an interrupt.
  arc> (disktable nt "/tmp/nested")
  #hash((that . 3) (some-list . (1 2 "three" 4)) (this . "this") (nested-hash-table . #hash((one . "one") (two . "two") (three . 3))))
But, as I'm sure you're aware, the error with the above is that the serialized table (returned by todisk) has a literal Scheme #hash object nested inside, and when you read that back in, it's immutable:

  arc> (= (nt "this") "THIS")
  "THIS"
  arc> (nt "this")
  "THIS"
  arc> (= (nt 'some-list) (list 5 6 "seven"))
  (5 6 "seven")
  arc> (nt 'some-list)
  (5 6 "seven")
  arc> nt
  #hash((that . 3) (some-list . (5 6 "seven" . nil)) (this . "this") (nested-hash-table . #hash((one . "one") (two . "two") (three . 3))) ("this" . "THIS"))
  arc> (= ((nt 'nested-hash-table) 'one) "ONE")
  Error: "hash-set!: contract violation\n  expected: (and/c hash? (not/c immutable?))\n  given: #hash((one . \"one\") (two . \"two\") (three . 3))\n  argument position: 1st\n  other arguments...:\n   one\n   \"ONE\""
So, the actual salient operation here has little to do with immutable tables, but rather "deep copying" them, an operator that vanilla Arc lacks. It only has copy:

  (def copy (x . args)
    (let x2 (case (type x)
              sym    x
              cons   (copylist x) ; (apply (fn args args) x)
              string (let new (newstring (len x))
                       (forlen i x
                         (= (new i) (x i)))
                       new)
              table  (let new (table)
                       (each (k v) x
                         (= (new k) v))
                       new)
                     (err "Can't copy " x))
      (map (fn ((k v)) (= (x2 k) v))
           (pair args))
      x2))
But we could model a deep-copy off of that:

  ; (map deep-copy xs) wouldn't work on dotted lists, hence this helper
  (def deep-copylist (xs)
    (if (no xs)
        nil
        (atom xs)
        (deep-copy xs)
        (cons (deep-copy (car xs))
              (deep-copylist (cdr xs)))))

  (def deep-copy (x . args)
    (let x2 (case (type x)
              sym    x
              char   x
              int    x
              num    x
              cons   (deep-copylist x)
              string (copy x)
              table  (let new (table)
                       (each (k v) x
                         (= (new (deep-copy k)) (deep-copy v)))
                       new)
                     (err "Can't deep copy " x))
      (map (fn ((k v)) (= (x2 (deep-copy k)) (deep-copy v)))
           (pair args))
      x2))
Then we have

  arc> (disktable nt "/tmp/nested")
  #hash((some-list . (1 2 "three" 4)) (this . "this") (nested-hash-table . #hash((one . "one") (two . "two") (three . 3))) (that . 3))
  arc> (= nt (deep-copy nt))
  #hash((nested-hash-table . #hash((three . 3) (one . "one") (two . "two"))) (this . "this") (that . 3) (some-list . (1 2 "three" 4 . nil)))
  arc> (= (nt "this") "THIS")
  "THIS"
  arc> (= ((nt 'nested-hash-table) 'one) "ONE")
  "ONE"
  arc> (= (nt 'some-list) (list 5 6 "seven"))
  (5 6 "seven")
  arc> nt
  #hash((nested-hash-table . #hash((three . 3) (one . "ONE") (two . "two"))) (this . "this") (that . 3) ("this" . "THIS") (some-list . (5 6 "seven" . nil)))
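This shallow-vs-deep distinction is the same one Python draws in its copy module, which makes for a compact illustration:

```python
import copy

nested = {"nested": {"one": "one"}, "some_list": [1, 2, "three", 4]}

shallow = copy.copy(nested)      # like Arc's copy: inner table is shared
deep = copy.deepcopy(nested)     # like deep-copy: inner structure rebuilt

shallow["nested"]["one"] = "ONE"
print(nested["nested"]["one"])   # mutating the shared inner dict shows here
print(deep["nested"]["one"])     # but the deep copy is untouched
```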
You could build a macro atop of fromdisk to handle deep copying automatically.

I seem to remember discussion about deep copying before, but all I could find was


This is an interesting approach.

I'm using arc's type system here, but I'm not creating new datatypes or 'objects'.

And here my first thought was "we've come full-circle to object-orientation". :) I mean, inasmuch as OO amounts to type dispatch, I guess.

Where variables of a user-defined datatype are created and used over potentially long lifetimes, the goal with these control-abstraction tags is to briefly wrap an underlying value in a navigation policy that the callee immediately takes off.

I totally get this, though (despite my comment above). In effect, it's like what a more OO language might do with iteration types: Python's generators, Ruby's Enumerator, Factor's virtual sequence protocol (honestly the most similar, because of its generic-based object system), Java's Iterable, etc.

The focus in those languages is on having a specific pattern in which iteration happens over an arbitrary sequence. Then you implement the "hooks" into this pattern in whatever specific way you need to. Which is really what's going on here, with the different walk implementations. It's just that walk is a bit coarser-grained than the others. In Python/Ruby/Java, you implement a specific method that just returns an object of a particular type---then that object implements an interface saying "hey, primitive iteration knows what to do with me". E.g.,

  Python 2.7.3 (default, Jan  2 2013, 16:53:07)
  [GCC 4.7.2] on linux2
  Type "help", "copyright", "credits" or "license" for more information.
  >>> class c:
  ...     def __init__(self): pass
  ...     def __iter__(self): return (x for x in range(5))
  >>> for x in c(): print x
  ...
  0
  1
  2
  3
  4
Factor's a bit different, and I think we might learn something from it. Like other CLOS-alikes, its object system isn't based on classes that own methods. So rather than returning a specific "iterable" object, the sequence iteration functions are written to use a handful of very specific methods. Then, any object that implements those methods works with map, reduce, each, and the whole gang.

There's a difference in granularity. In Factor, the required methods are more primitive than walk: just length and nth. Then map/each/etc. simply loop over integers, indexing into the sequences. So, where Arc would have some ad hoc thing like

  (def map (f seq)
    (if (isa seq 'string)
        (map-over-strings ...)
        (map-over-conses ...)))
Factor would be more like

  ; rely on (len seq) & (seq i) being well-defined

  (def map (f seq)
    (accum acc
      (for i 0 (- (len seq) 1)
        (let x (seq i)
          (acc (f x))))))
With walk, you have to implement the whole iteration shebang---and even then it's only used in each right now, not the whole suite of sequence functions.
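The length/nth protocol is easy to mimic in Python, where anything with __len__ and __getitem__ can play along (an illustrative sketch, with a virtual reversed sequence in the spirit of Factor's <reversed>):

```python
class Reversed:
    # A "virtual sequence": presents another sequence reversed,
    # without copying it.
    def __init__(self, seq):
        self.seq = seq
    def __len__(self):
        return len(self.seq)
    def __getitem__(self, i):
        return self.seq[len(self.seq) - 1 - i]

def generic_map(f, seq):
    # Loops over integers, relying only on len() and indexing.
    return [f(seq[i]) for i in range(len(seq))]

print(generic_map(lambda x: x + 1, [1, 2, 3]))
print(generic_map(lambda x: x, Reversed([1, 2, 3])))
```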

The obvious problem with Factor's exact approach as applied to Arc is Arc's reliance on linked lists. Each index access is O(n), which would turn any iteration (map, each, whatever) over a linked list into an O(n^2) affair. So just defining something like walk might ultimately be preferable; and you could still surely implement map/reduce/keep/etc. in terms of it. (Even the string versions could be reworked to walk over the string characters. And for the love of everything, do away with general functions that hard-code recursion on cdrs, like vanilla Arc's counts! I see Anarki's already done that, but still.)

That whole digression aside, I think the actual pattern you demonstrate here kind of gives me some warm fuzzies. With the realization that ontree & company are just higher-order functions like map at heart, it liberates us from writing a million names for the same actual operation. To keep harping on Factor (it's the most Lisp-like non-Lisp that I know...), the same thing happens, where instead of a reverse-each function, you just have

  IN: scratchpad { 1 2 3 } <reversed> [ . ] each
and instead of using some each-integer looping function (though that does exist, it's an implementation detail), you just have

  IN: scratchpad 5 iota [ . ] each
I'm liking this idea even more now, since I've realized that it has parallels with my feelings about the Scheme standard using <, char-ci<?, char<?, string-ci<?, and string<? instead of just <. It's just that lists vs trees was a more subtle distinction.


1 point by akkartik 3761 days ago | link

Thanks for the parallels with Factor! I love Factor too, though I'm less knowledgeable than you.

I'm going to keep an eye out for lower-level primitives to genericize.



3 points by akkartik 3763 days ago | link

Holy crap, how have I never heard of noparen.arc? I'm gonna see how PG's design choices were different from mine. Did I do something he didn't consider, or did he dismiss my approach for aesthetic reasons?

Edit 11 minutes later: Ugh, he's relying on keyword names like a vim or emacs mode. So any new macros would need to be manually added to the right global variable. Contrast

And he seems to also have been trying to translate f(a) syntax.


2 points by fallintothis 3763 days ago | link

Ugh, he's relying on keyword names like a vim or emacs mode. So any new macros would need to be manually added to the right global variable.

"Ugh" is right. :) I think this is also a major problem with pprint.arc. A similar issue arose in

In the past I've tried to think about ways to handle it automatically, but I usually get hung up on macros and their pesky propensity for undoing assumptions. E.g., for my Vim ftplugin, I used the heuristic "is the macro's rest parameter named body?" to figure out the indentation of so-called "bodops"...except that I had to manually maintain white/blacklists of troublesome macros with code similar to


2 points by fallintothis 3764 days ago | link | parent | on: Clojure's syntax-quote in arc

If you want to do substitution using a function, it seems to me like the operation is more of a roundabout map. Particularly if it's just over the atoms of a tree, it's easy to think of the higher-order functions from a different angle, instead of trying to make tree-subst do everything:

  (def leafmap (f tree)
    ; really, just (treewise cons f tree)
    (if (atom tree)
        (f tree)
        (cons (leafmap f (car tree))
              (leafmap f (cdr tree)))))

  (def whenf (test f)
    [if (test _) (f _) _])

  arc> (leafmap (whenf even [+ _ 1]) '((1 . 2) . (3 . 4)))
  ((1 . 3) 3 . 5)
Other name ideas: treemap, maptree, hyphenated versions of any of those, map-leaves.
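The same leaf-mapping idea in Python, for comparison (a sketch over nested lists/tuples standing in for conses):

```python
def leafmap(f, tree):
    # Apply f to every leaf of a nested list/tuple "tree",
    # preserving the structure (and the container types).
    if isinstance(tree, (list, tuple)):
        return type(tree)(leafmap(f, t) for t in tree)
    return f(tree)

print(leafmap(lambda x: x + 1 if x % 2 == 0 else x, ((1, 2), (3, 4))))
```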

Digression: the issue I've been more annoyed by in Arc's suite of "tree" utilities is sloppiness with direct cdr recursion. Functions have no way of discerning whether some list (a b c) they're looking at is a "top-level" form or if it's just from recursing on cdrs of (x y z a b c). E.g., using ontree in

  arc> (let if 2 (+ if (do 2)))
  4
  arc> (count-ifs-with-do (list '(let if 2 (+ if (do 2)))))
  1
  arc> (ifs-with-do (list '(let if 2 (+ if (do 2)))))
  ((if (do 2)))
  arc> (ifs (list '(let if 2 (+ if (do 2)))))
  ((if 2 (+ if (do 2))) (if (do 2)))
I usually want the recursion to instead occur more like

  (def ontree (f tree)
    (f tree)
    (unless (atom tree)
      (each subexpr tree
        (ontree f subexpr))))
Or even just have the above as a separate function, like on-exprs or something.

Maybe a similar maptree would be helpful:

  (def maptree (f tree)
    (let newtree (f tree)
      (if (atom newtree)
          newtree
          (map [maptree f _] newtree))))

  arc> (let uniqs (table)
         (maptree [if (caris _ 'gensym) (or= uniqs._ (uniq)) _]
                  '(let (gensym a) 5 gensym <--on-its-own (+ (gensym a) 10))))
  (let gs1736 5 gensym <--on-its-own (+ gs1736 10))
Although the above formulation has the potential for infinite looping; e.g., (maptree [cons 'a _] '(a))...maybe better to do something like

  (def maptree (f tree)
    (if (atom tree)
        (f tree)
        (let newtree (f tree)
          (check newtree atom (map [maptree f _] newtree)))))

  arc> (maptree [cons 'a _] '(a))
  ((a . a) (a . a))


2 points by malisper 3764 days ago | link

I'm pretty sure we have to do the car/cdr recursion because functions like map and each don't work on pairs, they only work on proper lists and most trees are not proper lists. Code such as the following wouldn't work mainly because of the use of map/each.

  (treemap [cons 'a _] '(a . a))
I'm pretty sure we would want treemap to work only on atoms (your first example); it doesn't make much sense to apply the function and then recur on the new tree (I find it very hard to figure out what is going to happen). The problem with this is we can no longer call functions on subtrees. I do like the idea of treemap, which could be written more cleanly as:

  (def treemap (f tree)
     (treewise cons f tree))
I'm pretty sure we should stick to akkartik's implementation of tree-subst (we can actually remove the first clause because it is covered in the second clause) because it works just like all of the other higher order functions where they testify their input and it works on entire subtrees.

Note: you can use iff instead of writing a new function whenf


2 points by fallintothis 3764 days ago | link

I'm pretty sure we have to do the car/cdr recursion because functions like map and each don't work on pairs, they only work on proper lists and most trees are not proper lists.

I think the obvious problem with that reasoning is it reinforces the notion that map, each, etc. "should" break on dotted lists. Even if that's the case, it's not like ontree & pals couldn't open-code a recursive function that worked on dotted tails; I merely used each/map in my example definitions as a convenience. If you prefer:

  ; Again, not that this has to replace the standard 'ontree...just can't think
  ; of a better name.

  (def ontree (f expr)
    (f expr)
    (unless (atom expr)
      ((afn (subexpr)
         (if (atom subexpr)
             (and subexpr (ontree f subexpr))
             (do (ontree f (car subexpr))
                 (self (cdr subexpr)))))
       expr)))

  arc> (ontree prn '(def f (x . xs) (cons nil xs)))
  (def f (x . xs) (cons nil xs))
  (x . xs)
  (cons nil xs)
This versus the standard ontree:

  arc> (ontree prn '(def f (x . xs) (cons nil xs)))
  (def f (x . xs) (cons nil xs))
  (f (x . xs) (cons nil xs)) ; <-- this potentially looks like a call to f!
  ((x . xs) (cons nil xs))
  (x . xs)
  ((cons nil xs))
  (cons nil xs)
  (nil xs)
On that note, I certainly find myself working around Arc's rampant disregard for dotted lists in my projects:

Then some trouble spots I found with avoiding sloppy cdr-recursion:

Note: you can use iff instead of writing a new function whenf

Not in vanilla Arc, I'm afraid. :) And I've always thought Anarki's iff is poorly named, due to the English collision with the "if and only if" abbreviation.


3 points by malisper 3763 days ago | link

It does seem like a good idea, abstracting as a list of lists instead of a binary tree. The best name I can think of is onlol. I really don't think that we should extend map and each to work on dotted lists, instead we should create other functions treemap and treeach since dotted lists are actually trees and no longer lists.


1 point by akkartik 3764 days ago | link

Hmm, it sounds like we need three sets of primitives: for operating on lists, trees and code (like trees, but aware of call vs arg positions). Perhaps we could use the same names like map and count but first tag the arg as tree or code..?


3 points by fallintothis 3765 days ago | link | parent | on: Clojure's syntax-quote in arc

Hey, this is promising work so far. Couple of observations:

1. FYI, uniq doesn't take any arguments in vanilla Arc:

  arc> (syntax-quote (a@ b c))
  Error: "procedure ar-gensym: expects no arguments, given 1: a@"
I'm apparently the sole weirdo who doesn't use Anarki at all, so I had to change this. It's not really necessary for the functioning of this macro, so no biggie.

2. In Arc, the t would be implied by if having an odd number of clauses:

  (if test1 then1
      test2 then2
            else)
In fact, the recent discussion here is related.

3. The uniqs!exp ssyntax stands for (uniqs 'exp) (quoted argument), when you want (uniqs exp) (unquoted argument). You could either write (uniqs exp) by hand, or else use uniqs.exp. As it stands now, this is a bug because you're always looking up the same hash table key, namely 'exp:

  arc> (syntax-quote (a@ b c))
  (gs1745 b c)
  arc> (syntax-quote (a@ b@ c))
  (gs1746 gs1746 c)
  arc> (ssexpand 'uniqs!exp)
  (uniqs (quote exp))
  arc> (ssexpand 'uniqs.exp)
  (uniqs exp)
4. It has several problems including relying on quasiquote

If it helps, here's a pure-Arc implementation of quasiquote:

5. it autogensyms every symbol that ends with "@" within exp, even if it was unquoted first

Simple to fix, if it is an issue:

  (mac syntax-quote (exp)
    (let uniqs (table)
      (list 'quasiquote
            ((afn (exp)
               (if (auto exp)     (or= uniqs.exp (uniq))
                   (atom exp)     exp
                   (unquoted exp) exp
                                  (map self exp)))
             exp))))

  (def auto (exp)
    (and (atom exp) (endmatch "@" (string exp))))

  (def unquoted (exp)
    (or (caris exp 'unquote)
        (caris exp 'unquote-splicing)))

  arc> (let a@ 'd (syntax-quote (a@ b c)))
  (gs1758 b c)
  arc> (let a@ 'd (syntax-quote (,a@ b c)))
  (d b c)
6. I was hoping that by fixing these problems we could easily replace quasiquote with syntax-quote (make ` be syntax-quote instead of quasiquote) because it should make writing macros easier.

Could also just hook into mac instead of quasiquote. This would be an easier way to extend vanilla Arc: just redefine the mac macro, or else define a variant of it if you need to preserve the old "raw" behavior.
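The auto-gensym renaming itself is independent of macros; here it is sketched in Python with forms modeled as nested lists of strings (the gs1-style names are generated; everything else is hypothetical scaffolding):

```python
import itertools

_counter = itertools.count(1)

def auto_gensym(form, uniqs=None):
    # Rename every symbol ending in "@" to a fresh name, reusing the
    # same fresh name for repeated occurrences within one form.
    if uniqs is None:
        uniqs = {}
    if isinstance(form, list):
        return [auto_gensym(f, uniqs) for f in form]
    if isinstance(form, str) and form.endswith("@"):
        return uniqs.setdefault(form, "gs%d" % next(_counter))
    return form

print(auto_gensym(["let", "a@", 5, ["+", "a@", 1]]))
```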


2 points by malisper 3765 days ago | link

I managed to implement a version that redefines mac. If people want mac to stay the same we could just switch them so defmacro becomes the new version and mac remains the same. I realized that I actually liked the fact that it autogensyms the symbols that are within an unquote because otherwise you would still need to use uniq in cases where you quote inside of an unquote and you need the symbols to be the same.

  (= defmacro mac)

  (defmacro mac (name args . body)
    (let uniqs (table)
      `(defmacro ,name ,args ,@((afn (exp)
				  (if (auto exp) (or= uniqs.exp (uniq exp))
				      (atom exp) exp
				      t          (map self exp)))  

  (def auto (exp)
    (and (atom exp) (endmatch "@" (string exp))))


2 points by malisper 3765 days ago | link

First of all, thanks for finding 3; I only tested the code in very simple cases.

In response to 2: when I am reading lisp code in the format you suggested, I cannot tell whether the final expression is part of the clause above it or is actually an else case. I just use t to make it explicit that this is the else case.

And I agree with you for 6. We should just extend mac so that it autogensyms the code it is given. There is no harm in doing this since all of the code previously written should still work.


1 point by akkartik 3765 days ago | link

Yeah, I too like not relying on an implicit 'else'. In fact, I use :else instead of t to be even more clear.

This is all most excellent. Don't be afraid to commit and push! Feel free to override the default mac if there aren't any obvious problems. There aren't very many people using anarki, and we can always roll it back later if necessary.


2 points by malisper 3765 days ago | link

I'm having issues figuring out where to put it in anarki since it has to be defined after "endmatch". I'm going to have to commit it later.


2 points by fallintothis 3767 days ago | link | parent | on: The trouble with arc's if statement

I can't think of a single non-lisp that allows adjacent condition and action without an intervening token.

I mean, isn't this requirement mostly because those languages are infix anyway? Parsing gets easier with explicit ways of separating things. I could easily imagine shift/reduce conflicts or what-have-you cropping up when you try to eliminate the requirement for, say, parentheses around conditionals in Java.

For example, in some hypothetical infix language that doesn't require conditional separators (parens, braces, then, etc.), would

  if x==10-x ...

parse as

  if (x == 10) -x ... // maybe this language doesn't use the "return" keyword,
                      // so the conditional is returning negative ten.

or as

  if (x == (10 - x)) ...

Because Lisps use s-expressions, "intervening tokens" (per se) are unnecessary. As you say, Arc's choices don't seem so unreasonable, considering that.


2 points by akkartik 3767 days ago | link

Yeah, that's a good point. Non-lisps use keywords and punctuation for readability and to make parsing tractable, and the two reasons are often hard to separate in any single design decision.

To summarize my position: I have some sympathy for the specific argument that multi-branch 'if's are harder to read in lisp than in traditional languages[1]. But this affects any arrangement of parens, whether traditional 'cond' or arc 'if'.

[1] I see I said so to you before, at the end of


3 points by fallintothis 3767 days ago | link | parent | on: The trouble with arc's if statement

Most of my if blocks _are_ a single s-expression for each branch.

I was going to say...

  (def if-with-do (expr)
    (and (caris expr 'if)
         (some [caris _ 'do] expr)))

  (def accum-expr (accumf testf exprs)
    (map [ontree (andf testf accumf) _] exprs))

  (def keep-expr (testf exprs)
    (accum a (accum-expr a testf exprs)))

  (def count-expr (testf exprs)
    (summing s (accum-expr s testf exprs)))

  (defs count-ifs-with-do (exprs) (count-expr if-with-do exprs)
        count-ifs         (exprs) (count-expr [caris _ 'if] exprs)
        ifs-with-do       (exprs) (keep-expr if-with-do exprs)
        ifs               (exprs) (keep-expr [caris _ 'if] exprs))

  arc> (each file '("arc.arc" "strings.arc" "pprint.arc" "code.arc"
                    "html.arc" "srv.arc" "app.arc" "prompt.arc")
         (prn "=== " file)
         (let exprs (readfile file)
           (prn (count-ifs-with-do exprs) "/" (count-ifs exprs))))
  === arc.arc
  === strings.arc
  === pprint.arc
  === code.arc
  === html.arc
  === srv.arc
  === app.arc
  === prompt.arc
I also find it very interesting that the author doesn't indent the code at all.

I have a feeling (hope) that that was unintentional---like the blog formatting is just eating whitespace or something. Also, the first "Arc if" has a missing right parenthesis and uses = for comparison. :)


3 points by zck 3767 days ago | link

Nice code. Some of my lying-around arc files:

unit-test.arc 2/12

All of my project euler solutions: 4/98


2 points by fallintothis 3767 days ago | link

I was hoping other people would investigate their code! :) My personal arc directory filtered by hand for completeness (I have a lot of half-finished projects lying around), plus some stock Arc stuff I forgot in the last post:

  (each file (lines:tostring:system "find . -name \\*.arc")
    (prn "=== " file)
    (let exprs (errsafe:readfile file)
      (prn (count-ifs-with-do exprs) "/" (count-ifs exprs))))

  === ./transcribe/transcribe.arc
  === ./qq/qq-test.arc
  === ./qq/qq.arc
  === ./macdebug/macdebug.arc
  === ./bench/nbody.arc
  === ./bench/pidigits.arc
  === ./ansi.arc
  === ./news.arc
  === ./sscontract.arc
  === ./blog.arc
  === ./libs.arc
  === ./profiler/profiler.arc
  === ./vim/gen.arc
  === ./hygienic.arc
  === ./trace/trace.arc
  === ./contract/test.arc
  === ./contract/contract.arc
  === ./contract/special-cases.arc
  === ./contract/reconstruct.arc
  === ./contract/defcase.arc
  === ./contract/test/canonical.arc
  === ./contract/test/ssyntax.arc
  === ./contract/ssyntax.arc
  === ./contract/util.arc
  === ./quick-check/quick-check.arc
  === ./arity/arity.arc


1 point by akkartik 3767 days ago | link

Nice trick, passing accum functions around!