Arc Forum | rntz's comments
1 point by rntz 5783 days ago | link | parent | on: Bug in atomic when using kill-thread

That doesn't help if you want to kill a thread which is stuck (or just taking a long time) within an (atomic ...) expression, which seems to me an important use case. Unfortunately the PLT doc website appears to be down right now; otherwise I'd look into solving it myself.

-----

1 point by akkartik 5783 days ago | link

True, but that sounds like an enhancement to the semantics of atomic (a way to interrupt an atomic operation) rather than a bug in kill-thread.

-----

1 point by aw 5783 days ago | link

There shouldn't be a way to interrupt an atomic operation, because then it won't be atomic.

Your elegant solution to make kill-thread atomic is a good one, provided the "killed" thread is in fact guaranteed to be terminated instantly... I'll have to look into whether that's the case.
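(The suggestion isn't quoted above, but presumably it's something along these lines; a sketch only, assuming 'atomic is built on a single global lock, so a thread holding that lock can't be killed mid-update:)

  ; wrap the kill itself in atomic, so it blocks while the target
  ; thread is inside an (atomic ...) block
  (let orig kill-thread
    (def kill-thread (th)
      (atomic (orig th))))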

-----

1 point by akkartik 5783 days ago | link

There shouldn't be a way to interrupt an atomic operation, because then it won't be atomic.

An interruptable atomic is basically a transaction that can be rolled back. If it gets interrupted nothing is changed. Probably doesn't make sense to run it from anywhere but kill-thread and similar operations.

To restate my original point: reasonable semantics for atomic are that threads inside atomic can't be killed. How to kill long-running atomics is a separate issue, needing a lot more engineering.

-----

1 point by aw 5783 days ago | link

reasonable semantics for atomic are that threads inside atomic can't be killed

We agree.

-----

1 point by rntz 5791 days ago | link | parent | on: Function overloading

Take a look at aw's 'extend macro: http://awwx.ws/extend

It takes a less efficient but more general approach than type-based dispatch; I find it more useful, at least in a language like arc. An older, somewhat modified version is on anarki at lib/extend.arc (http://github.com/nex3/arc/blob/master/lib/extend.arc); it's unfortunately incompatible, since it takes a label argument. I may fix that at some point, but I'm not convinced that labels weren't a good feature.
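To give the flavor, here's a rough sketch of the idea (not aw's actual code; see the link above for that): a runtime test decides whether the new clause applies, and otherwise the call falls through to the previous definition. That chain of runtime tests is also where the "less efficient" part comes from, compared with a dispatch table keyed on type.

  ; sketch only -- aw's real macro is at http://awwx.ws/extend
  (mac extend (name args test . body)
    `(let orig ,name
       (def ,name ,args
         (if ,test
             (do ,@body)
             (apply orig (list ,@args))))))

  ; e.g. teach 'len about a hypothetical 'queue type:
  (extend len (x) (isa x 'queue) (qlen x))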

-----

1 point by aw 5791 days ago | link

Say, have you ever used labels? When I wrote my first version, I said, "ah, labels, good idea, I'll put them in". Then after using it for months I noticed that I wasn't ever using the label. So I took it out on my next iteration.

-----

2 points by rntz 5791 days ago | link

I have used them. Mostly I used them for debugging, rewriting, "code sketching" as it were. Without them, extending or reextending a function at the REPL is just annoying; if you ever make a mistake in the test function you have to reload the original function. I suppose it might make sense to have two macros - extend and extendl, maybe - one which doesn't take a label, and one which does.

-----

1 point by aw 5791 days ago | link

How about an undo, to undo the last extend you did?

Or a reextend macro, which undoes the last extend and then applies the new extend?

What I found annoying myself about the label was that I had to type it every time, even if it turned out that I didn't need to rewrite it.

-----

1 point by rntz 5791 days ago | link

Undo & reextend could easily be implemented on top of labeled extend - just store, in a table, the label last used when extending a given function, along with whatever information is needed to undo that extension, and have an unlabeled extension macro that uses a gensym as the label. That way we get labeled & unlabeled extend with undo & replacement.
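A sketch of what that might look like, building on the toy 'extend above (the table name and helper names here are made up for illustration):

  ; maps a function's name to a stack of (label . previous-definition)
  (= extend-history* (table))

  (mac extendl (name label args test . body)
    `(do (push (cons ',label ,name) (extend-history* ',name))
         (extend ,name ,args ,test ,@body)))

  (mac unextend (name)          ; undo the most recent extension of name
    `(= ,name (cdr (pop (extend-history* ',name)))))

  ; unlabeled form: just invent a label with a gensym
  (mac extendu (name args test . body)
    `(extendl ,name ,(uniq) ,args ,test ,@body))

A reextend would then just be an unextend followed by an extendl with the same label.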

-----

1 point by adm 5784 days ago | link

extend looks good, but I don't understand where one can use it apart from extending a function for different types.

-----

1 point by aw 5784 days ago | link

In the Arc server, I moved creating the request object out of 'respond (http://awwx.ws/srv-misc), so that 'respond is now being passed the request object and does the usual Arc srv action to respond to a request: look to see if there is a defop etc. defined for the request path.

Now I can extend respond to implement my own ways of responding to requests: I can easily have my own way of choosing which action to take (I can have http://myserver.com/1234 display the "1234" item page, for example), or implement my own kinds of defop's.
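For instance, something along these lines (a sketch only: the req!path and item-page names are hypothetical, and it assumes an 'extend along the lines sketched above):

  ; if the path looks like "/1234", serve that item's page instead of
  ; going through the usual defop lookup
  (extend respond (str req)
          (errsafe (coerce (cut req!path 1) 'int))
    (item-page str (coerce (cut req!path 1) 'int)))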

-----

1 point by adm 5784 days ago | link

I see it now. So type-based dispatch is just one of the cases where it can be used. Thanks for the explanation. But why does rntz say it's less efficient?

-----

2 points by rntz 5791 days ago | link | parent | on: Arc wiki?

That's from back before the transition to arc3, so it'll now be on the arc2.master branch rather than anarki master. (Obviously, if anyone would care to translate it and push it, that would be great.)

-----


GHC (the de facto standard Haskell compiler) uses "implicit parameters" as the name of an extension that is almost, but not quite, equivalent to dynamic binding. Namely, in Common Lisp, you have the following:

  CL-USER> (defvar *foo*)
  *FOO*
  CL-USER> (let ((f (let ((*foo* 0)) (lambda () *foo*)))) 
             (let ((*foo* 1)) (funcall f)))
  1
Whereas in Haskell with GHC extensions, you have:

  Prelude> :set -XImplicitParams
  Prelude> let f = (let ?x = 0 in \() -> ?x) in let ?x = 1 in f ()
  0
I think that, since "dynamic variables/binding" is the name that has always been used for this, and "implicit variables/parameters" refers to something subtly different, it might be better to use the former; but it doesn't matter all that much, and "implicit" does do a better job of getting across the purpose of dynamic binding than just calling it "dynamic".

-----

1 point by aw 5810 days ago | link

It's interesting that Haskell also uses the term "implicit". The differences seem mostly related to the type system: for example, since the variables are statically typed, it's easy for them to overload "let" to do dynamic binding with implicit parameters. Is there some other subtle difference that I'm missing?

The "it's always been called X before" argument would prevent us from ever improving the language by coming up with better names; we'd still be calling anonymous functions "lambda" instead of "fn" etc.

-----

1 point by rntz 5794 days ago | link

Er, see the post you replied to: the code I gave is precisely the same modulo syntax, but in Common Lisp it evaluates to 1 and in Haskell to 0. Essentially, Haskell's implicit variables don't allow rebinding/shadowing. It's a bit hard to explain, but look at the example and play around a bit and you'll see what I mean.

-----

1 point by aw 5793 days ago | link

Ah, I don't know enough Haskell to really be able to see what's going on. Thank you for the description though.

-----


It's consistent, but it means that passing special forms to higher-order functions is essentially just a form of punnery - it lends you no more expressiveness. There is no way (unless I'm mistaken) to get (map if '(t nil) X), where X is some expression, to evaluate the first element of X (or its evaluation) but not the second. So I might as well just define a function:

    (def iffn a
      (iflet (c . a) a
        (iflet (x . a) a
          (if c x (apply iffn a))
          c)))
    
    (map iffn '(t nil) '(1 2) '(5 6))
    => (1 6)

-----

1 point by rntz 5826 days ago | link | parent | on: Multiple cases in 'case branches

It may not matter to you, but IMO it matters to the community (what little of it there is) that anarki and arc not break compatibility in such a simple case as this.

-----

2 points by rntz 5826 days ago | link | parent | on: Multiple cases in 'case branches

I very much expect the current behavior. I can't really say why; it just seems natural to me that a list as a case means "pattern-match", not "any one of these". Changing it would also break compatibility with pg's arc. Adding a new macro, say, 'mcase ("multi-case") or 'orcase, would be totally fine though.
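Something like this might do (a rough sketch, untested): a list key means "any one of these", an atom key behaves like plain case, and a trailing lone expression is the default.

  (mac mcase (expr . clauses)
    (w/uniq gv
      `(let ,gv ,expr
         ,((afn (cs)
             (if (no cs)        nil
                 (no (cdr cs))  (car cs)      ; lone trailing expr = default
                 `(if ,(let k (car cs)
                         (if (acons k)
                             `(in ,gv ,@(map [list 'quote _] k))
                             `(is ,gv ',k)))
                      ,(cadr cs)
                      ,(self (cddr cs)))))
           clauses))))

  (let x 3
    (mcase x  (1 3 5) 'odd  (2 4 6) 'even  'neither))  ; => odd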

-----

1 point by akkartik 5826 days ago | link

Or add a sym to the case itself.

    (case x (in 1 3 5) 'odd ..)
Harder to implement, though.

-----

1 point by aw 5826 days ago | link

That's a clever idea. I think I'd probably use a macro that lets me check a value against a series of expressions (perhaps using some form of currying). I don't think I'd call it "case" though :-)

-----

1 point by akkartik 5826 days ago | link

How about allowing cases to be functions?

  (case x
    [in _ '(1 3 5)] 'odd
    ..)

-----

1 point by twilightsentry 5826 days ago | link

I've got a macro like that, which I call 'test. The problem with combining that and 'case is distinguishing functions and constants before the cases are eval'ed.

  (mac testlet (var expr . args)
    (letf ex (args)                        ; expand clauses into nested ifs
           (if cdr.args
               `(if (,car.args ,var) ,cadr.args
                    ,(ex cddr.args))
               car.args)
      `(let ,var ,expr ,(ex args))))
Anyway, I'll probably just define 'casein as aw and rntz suggested.

-----

1 point by aw 5826 days ago | link

'casein occurs to me, reminiscent of 'in

-----

2 points by rntz 5857 days ago | link | parent | on: "Magical" on-disk persistence

You are correct. 'assign doesn't depend on 'sref. '= does.
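A quick illustration of the difference:

  (assign x 5)        ; bare assignment; never calls sref
  (= (tbl "k") 5)     ; '= on a compound place expands into an sref call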

-----

2 points by rntz 5861 days ago | link | parent | on: The power of function combinators

Have you used Parsec? It's Haskell's parser-combinator library, and I noticed quite a few similarities. Haskell's typeclasses also turn out to provide a lot of general abstractions that can be applied to parser-combinators as well; for example, your 'on-result is approximately equivalent to Haskell's 'fmap, which is the function for genericised mapping. (And of course there's the fact that the parsec parser type is a monad, but I won't get into that...)

On a side note, Parsec is a little different in nature, because it distinguishes between a failure without consuming input (soft) and failure after consuming input (hard); soft failures fall back to the closest enclosing alternation, while hard failures fall back to the nearest enclosing "try" block, and then become soft failures. This means that, if you're careful, you can write with Parsec without having to worry about the exponential blowup typical of the parser-combinator approach. I'd be interested to see whether your JSON parser has any such pathological cases.

-----

2 points by aw 5861 days ago | link

I have used Parsec. One of the most dramatic differences is that Parsec supports backtracking, and so it's a more powerful parser. Since parsing JSON doesn't need backtracking, I got to avoid both the complexity of implementing backtracking and the need to carefully avoid exponential backtracking blowups :D

-----

2 points by rntz 5895 days ago | link | parent | on: Lib/net.arc

No, pushing doesn't require permission; that's more or less the point of a world-writable repository. You don't really need to put info on libraries in CHANGES/, though, as is said in CHANGES/README; it's more for changes to the core language.

-----
