Prototype dataflow framework for arc
5 points by rntz 5352 days ago
I'm working on a library/framework implementing vaguely dataflow-like abilities. It is extremely unfinished at this point, but is available as a darcs repository at http://www.rntz.net/darcs/arc/flow/. It expects to live in its own subdirectory of arc's lib/ directory, and it will only work in anarki (it requires a lot of hacking with plt scheme under the hood, and also makes heavy use of the utility functions in anarki's lib/util.arc).

The basic idea is that you define "dataflow-aware" functions using the special form 'flowdef. These functions cache their values (as with 'defmemo) until any of the computations they depend on changes; the dependencies are determined automatically, from which other dataflow-aware functions each function calls.

You can then explicitly force a given computation to be updated in various ways, and any function that depends on it, even transitively, will be recomputed the next time it is called.

A simple example:

  arc> (flowdef test (x) (prn "test: " x) x)
  #<procedure: test>
  arc> (flowdef incby () 1)
  #<procedure: incby>
  arc> (flowdef inctest (x) (prn "inctest: " x) (inc (test x) (incby)))
  #<procedure: inctest>
  arc> (test 0)
  test: 0
  0
  arc> (test 0)
  0 ; the value is cached
  arc> (inctest 1)
  inctest: 1
  test: 1
  2
  arc> (inctest 1)
  2 ; again, cached
  arc> (flowdef incby () 2)
  *** redefining incby
  #<procedure: incby>
  arc> (inctest 1)
  inctest: 1 ; no longer cached, because 'incby changed
  3
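
The same caching and invalidation apply transitively. Here is a hypothetical continuation of the session above ('base, 'middle and 'top are names made up for illustration; the behaviour assumed is exactly what the transcript demonstrates, applied across two levels of dependency):

  arc> (flowdef base () 10)
  #<procedure: base>
  arc> (flowdef middle () (prn "middle") (* (base) 2))
  #<procedure: middle>
  arc> (flowdef top () (prn "top") (+ (middle) 1))
  #<procedure: top>
  arc> (top)
  top
  middle
  21
  arc> (top)
  21 ; fully cached
  arc> (flowdef base () 20)
  *** redefining base
  #<procedure: base>
  arc> (top)
  top
  middle
  41 ; both 'middle and 'top recomputed, since both depend on 'base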
The repository also contains build.arc, a simple and barely functional example of using this framework to implement a make-like system.
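
build.arc isn't reproduced here, but the general shape of such a system might look something like the rough sketch below. The target names, the compiler commands, and the use of 'system are illustrative assumptions, not the library's actual interface:

  ; illustrative sketch only: make-like rules built on 'flowdef
  (flowdef src-files () '("a.c" "b.c"))

  (flowdef compile ()
    (prn "compiling: " (src-files))
    (map [system (string "cc -c " _)] (src-files))
    'compiled)

  (flowdef link ()
    (compile)
    (prn "linking")
    (system "cc -o prog a.o b.o")
    'linked)

Calling (link) the first time runs both steps; afterwards it returns the cached result until something it depends on changes - for instance, when 'src-files is redefined, as 'incby was in the transcript above.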