Arc Forum | applepie's comments
1 point by applepie 5674 days ago | link | parent | on: Last call for Arc bugs

I think I may have lost my faith in Arc.

But the comparison with math notation was enlightening.

You definitely have a point.

-----

3 points by applepie 5689 days ago | link | parent | on: Request for Bugs

I sincerely don't understand.

If you want to make a good abstract language, you shouldn't care too much about the bugs of this particular implementation. Fixing the bugs of the current version is kind of useless.

If you want to make a good practical language, don't you think it's a bit late to fix the bugs? Why don't you, for instance, just make Anarki the official version?

-----

5 points by thaddeus 5689 days ago | link

Anarki has a lot of great stuff in it, but it also has a lot of crap in it. I spent two weeks just going through the code, pulling out the pieces I liked from Anarki and leaving the rest. I certainly hope pg keeps the Arc code down to the dozen or so core files, forcing the community to build libraries on top rather than integrate ad hoc extras at such a low level that it's brutal to separate the good from the bad.

-----

5 points by pg 5679 days ago | link

I care because to do a good job at language design you have to use the language to write applications.

-----

2 points by conanite 5688 days ago | link

> I sincerely don't understand.

Version n of anything usually consists of incremental improvements to version n-1, including bug fixes as well as new features. To deliver a new version written from scratch would be to deliver a new set of bugs written from scratch.

-----


Should we really be so worried about efficiency?

Of course you are free to design and implement any language you want, and I know you've probably been the main contributor.

But I think the original spirit of Arc was to design a simpler language, even if it couldn't be implemented as efficiently. Quoting pg in "The Hundred-Year Language":

> There's good waste, and bad waste. I'm interested in good waste-- the kind where, by spending more, we can get simpler designs. How will we take advantage of the opportunities to waste cycles that we'll get from new, faster hardware?

-----

2 points by almkglor 5898 days ago | link

Of course not. Conceptually, Arc-F is just as clean as Arc. You can implement CLOS-style monomethods in Arc-F from the Arc-F side. You can implement the '+, '-, '* and '/ operators on the Arc-F side (you just need to provide the basic '<base>+ etc. operators on the underlying base-system side). And I'm willing to bet most of you would never have imagined that not all functions in Arc-F are functions (some of them are special objects interpreted by Scheme) if I hadn't told you about it.

For example:

  <User>tl: +
  #<procedure>
  <User>tl: (list +)
  (#4(l-reductor #<procedure> #<procedure> #<procedure>))

Arc-F divides itself into the "IDEAL" and the "REAL". The "IDEAL" is how the language would be implemented in terms of the most basic axiomatic operators. The "REAL" is how it has to be implemented to keep it from being 10x slower than Anarki (yes, I actually did the IDEAL part first, and only later ported it to the REAL part, where bits of it are in Scheme. We're talking 12-second load times for arc.arc, and arc.arc only).

Even pg himself abandoned the purely axiomatic approach when he found that performance was very, very bad. My aim is to return to a purely axiomatic approach in the IDEAL, and to make appropriate hooks for the REAL part to optimize everything. As it is, Arc-F is about 2x-3x slower than Anarki. For me, that's acceptable. And so far, most of the speed loss is from function calls.

-----

1 point by almkglor 5898 days ago | link

As a concrete example: in Arc-F, it's possible to overload '+ by simply overloading <base>+. If you have a new type foo and want to define how to add two foos together, you just overload <base>+. The neat thing is that this won't slow down arithmetic very much, if at all; if you were to write the equivalent overload '+ in Anarki (Anarki doesn't have the <base>+ hook, after all), arithmetic performance suddenly drops precipitously.

Also, overloading <base>+ is simpler: <base>+ only requires that you consider the case of adding two objects. + determines how to handle things when you do (+ foo1 foo2 foo3 foo4); specifically, it converts the call to (<base>+ (<base>+ (<base>+ foo1 foo2) foo3) foo4).
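
To illustrate, here's a minimal sketch (not Arc-F's actual definition) of how a variadic '+ could be written as a left fold over a two-argument '<base>+:

  ; sketch only: assumes a two-argument <base>+ is already defined
  (def + args
    (if (no args)       0            ; identity for an empty call
        (no (cdr args)) (car args)   ; single argument returns itself
        (reduce <base>+ args)))      ; (+ a b c) => (<base>+ (<base>+ a b) c)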

So yes: my position is, by focusing on efficiency in implementation, it frees us to actually use the axiomatic approach without throwing in the towel and going non-axiomatic.

-----

2 points by stefano 5896 days ago | link

I like the REAL/IDEAL distinction. Always following the IDEAL path to build the "one hundred year language" without considering the REAL easily leads to unacceptable performance. Your experience with 'load clearly shows that.

-----

1 point by applepie 6014 days ago | link | parent | on: Rethinking macros

Maybe you'd like them more if you called them "macros" instead of "unhygienic macros" ;)

No, really: macros, being essentially compilers, give you enough power to build everything you'd ever want into Lisp, including "hygienic" macros, and even facilities for writing HOF-like things in less painful ways.

Maybe they're a pain in the ass if you don't "go meta", in the same way computers are a pain in the ass if you don't build OSes and compilers first.
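
For instance, two minimal sketches: 'aif deliberately captures it (the classic "unhygienic" move), while w/uniq buys hygiene back where you want it. aif here is a simplified version of the one in arc.arc; swap2 is a made-up name chosen to avoid shadowing arc.arc's own swap:

  ; anaphoric if: intentionally captures the variable `it'
  (mac aif (test then (o else))
    `(let it ,test
       (if it ,then ,else)))

  ; gensym-based hygiene on demand: `tmp' can't collide with a or b
  (mac swap2 (a b)
    (w/uniq tmp
      `(let ,tmp ,a
         (= ,a ,b)
         (= ,b ,tmp))))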

-----


I wonder if the aim of the VM is to be faster than the standard Arc implementation, or just to have a means of reifying the continuation and thus being able to stop a process to do process management.

I would go for simplicity and always copy, unless Arc had some kind of immutable data structure and the VM were able to copy just the mutable parts.

-----

3 points by almkglor 6019 days ago | link

> I wonder if the aim of the VM is to be faster than the standard Arc implementation, or just to have a means of reifying the continuation and thus being able to stop a process to do process management.

More of a way of taking advantage of multiple processors/cores while not having to do a lot of process management. Of course, there's not much point in using multiple processors anyway if the VM itself is slow ^^

> I would go for simplicity and always copy, unless Arc had some kind of immutable data structure and the VM were able to copy just the mutable parts.

I agree. I did have a plan for a sometimes-sharing, copy-on-write-and-if-receiver-has-a-shared-copy VM, but it felt more complex than necessary.

It would probably be useful to also have binary blobs like in Erlang.

-----

2 points by applepie 6027 days ago | link | parent | on: What's next?

Well, this is not a game where we have to follow rules.

This is real life, so if we can make a better language, we don't have to wait for pg to come back. (And I've lost hope that he will.)

-----

2 points by almkglor 6027 days ago | link

Anarki! Anarki! Go go Anarki!

-----

5 points by applepie 6028 days ago | link | parent | on: What's next?

The sad part about Lisp dialects is that most other programming constructs can be faked more or less decently with macros...

-----

3 points by stefano 6027 days ago | link

I would say this is the funny part...

-----

2 points by applepie 6027 days ago | link

This is fun in that one can do a lot with a small core language. But I also think that core languages may need to evolve, and Lisp's expressiveness somehow makes that difficult.

E.g. I think we have a lot to learn from Prolog, Icon, Haskell, Dylan, Erlang, and many others.

-----

3 points by almkglor 6027 days ago | link

> E.g. I think we have a lot to learn from Prolog, Icon, Haskell, Dylan, Erlang, and many others.

Such as?

I know we need to have more Erlang-style VMs. In particular, I think we need VMs that pretty much enforce shared-nothing by causing all messages to be copied at the VM level. Hence my current project, which will also tie in some code from the AST-based compiler of arc2c.

As for the other languages... what lessons do we need to learn? I know Haskell makes static type checking really, really cool, but I'm not certain we can bash that into a Lisp. Lisp macros can automagically give you lazy evaluation (you just need to insert special syntax to denote lazy evaluation, in much the same way that Haskell requires special syntax for forced evaluation).
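
For example, a minimal memoizing delay/force sketch in Arc (delay and force as written here are made-up names, not arc.arc built-ins):

  ; delay wraps an expression in a thunk that evaluates it at most once
  (mac delay (x)
    (w/uniq (forced val)
      `(with (,forced nil ,val nil)
         (fn ()
           (unless ,forced
             (= ,val ,x ,forced t))
           ,val))))

  ; force just calls the thunk
  (def force (promise) (promise))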

-----

8 points by applepie 6027 days ago | link

I'm not necessarily thinking of the current Arc, but of the hypothetical hundred-year Arc.

- Could we make pattern matching through unification a part of the core language?

- Could we make some tasks easier (I mean terser) using Icon or Prolog-like goal-oriented programming?

- Could Haskell typeclasses be adapted to a dynamic language? (I know that is not possible in general, but maybe in the most common cases.) Is CLOS-like OOP the way?

- Easy defstruct as in Haskell is also a win.

- Which would be the best module system? Python's first-class modules are cool, but Haskell's module system seems better for building ADTs.

- Could we use generators through a lazy list-like interface?

- To what extent could the system be reflective? In the past it was considered that "flambda" (user-defined, first-class special forms) and redefinition of eval with "reflective towers" were not such good ideas, because of efficiency. Should we spend more cycles to allow such things?

- Should the read s-expressions also have associated meta-information?

- Could Arc live in an image?

- Could Arc be just the specification of its own compiler, making the bytecode an official part of the language (as pg once said)?

- Could Arc be blazingly fast?

- Could Arc be prepared for multiprocessor architectures like the guys (pun intended ;)) in Fortress want it to be? Or to intensive scalability like Erlang seems to have?

Of course, all of those questions are implementable. There is probably a library for every one of them on CLiki.

I am just asking which of them we should adopt and encourage as part of "Arc's core". The commonality and practice of an idiom, more than its mere possibility, is what actually defines a language.

Currently, Arc's core is not much more than sugar for Scheme (modulo defcall, mostly unseen and great). I don't mean this to be insulting; I am happy that people find Arc useful.

Maybe it is just that I am looking for a revolution, and Arc is Lisp, which is good but not revolutionary.

-----

2 points by almkglor 6026 days ago | link

> - Should the read s-expressions also have associated meta-information?

IMO yes. It might even be possible to have the meta-information attached to the nearest cons cell instead, so a cons cell might have, say, (cons a d (o line-number) (o file-name)). This makes symbols always equal other symbols without having to worry about attached meta-information for each "symbol".

Might be useful to add in my VM then ^^.
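
A minimal sketch of that idea, using annotate to fake a metadata-carrying cell (mcons, mcar, mcdr, and mline are made-up names):

  ; a cons-like cell with optional source position attached
  (def mcons (a d (o line) (o file))
    (annotate 'mcons (list a d line file)))

  (def mcar  (c) ((rep c) 0))
  (def mcdr  (c) ((rep c) 1))
  (def mline (c) ((rep c) 2))   ; nil when no position was recorded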

> - Could Arc be just the specification of its own compiler, making the bytecode an official part of the language (as pg once said)?

Interesting. It might be useful to define Arc as a set of macros that just expand to a bunch of bytecode.

Anyway, I'm thinking my VM should be "bytecode"-based. The bytecode won't have a numeric representation, but it will have a symbolic one, like ((localvar 0) (globalvar foo) (plus)).
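
As a hypothetical sketch of that representation (compile-expr and the const opcode are assumptions, not part of any actual VM here):

  ; compile an expression to symbolic bytecode for a stack machine;
  ; env is the list of local variable names
  (def compile-expr (expr env)
    (if (isa expr 'sym)
         (iflet i (pos expr env)
           `((localvar ,i))
           `((globalvar ,expr)))
        (acons expr)
         ; assume a two-argument (+ a b) call for brevity
         (join (compile-expr (expr 1) env)
               (compile-expr (expr 2) env)
               '((plus)))
         `((const ,expr))))      ; literals

  arc> (compile-expr '(+ x foo) '(x))
  ((localvar 0) (globalvar foo) (plus))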

> - Could Arc be prepared for multiprocessor architectures like the guys (pun intended ;)) in Fortress want it to be? Or to intensive scalability like Erlang seems to have?

Well, this is what I want to do ^^.

> - Could we use generators through a lazy list-like interface?

Scanners? I've actually used scanners for something like this in Arki, the wiki in Anarki.

-----

1 point by rincewind 6027 days ago | link

The problem with unification and logical variables would be that they have to be dynamically scoped and work with a call-by-name evaluation strategy, at least in Prolog.

Anyway, you could build a Prolog-like DSL with pat-m and amb:

  http://arclanguage.org/item?id=2556
  http://arclanguage.org/item?id=6669

While logic programming in Arc can make problem-solving easier, I think it introduces problems for libraries:

- How should you call functional functions from logical functions?

- How should you call logical functions from functional functions without violating referential transparency?

- Do closures over logical variables make any sense?

- Should goals be limited in scope? Lexical or dynamic? If I call two logical functions from different modules, I may or may not want the first to be retried when the second fails, depending on the context.

-----

1 point by tung 6027 days ago | link

How do Haskell's modules differ from Python's?

-----

2 points by applepie 6027 days ago | link

Python's modules are first class. For example:

  import module

  x = module
  x.function(...)

In Haskell, names are resolved at compile time...

Also, Haskell modules can define which names to export; in Python, everything is visible.
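
In Arc terms, a plain table can fake a first-class module. A sketch only (mymodule, m, and double are made-up names, not an actual Arc module system):

  ; a 'module' is just a table mapping names to values
  (= mymodule (obj double (fn (x) (* 2 x))))

  (= m mymodule)       ; modules are plain values: assign and pass them around
  ((m 'double) 21)     ; => 42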

-----

1 point by applepie 6028 days ago | link | parent | on: Ask AL: pprint + vim

You've obviously taken the lead!

-----


Because Arc is for smart programmers and pg wants to be second-guessed.

-----

5 points by absz 6030 days ago | link

Or maybe there's a bug somewhere? The default assumption should not be "pg is an elitist jerk," but "pg is a fallible human being."[1] Whether or not pg is an elitist jerk is irrelevant, as the answer is probably that there's a bug/some unexpected behaviour hiding somewhere in arc.arc or ac.scm. In fact, I think that is the case, and the problem has to do with translating lists between Arc and mzscheme: see http://arclanguage.org/item?id=6993. But it's always best to analyze first, since you get more done that way.

[1]: In fact, "so-and-so is a fallible human being" is almost always the best first stop: Hanlon's Razor ("Never attribute to malice that which is adequately explained by stupidity.") and all that.

-----

1 point by sacado 6029 days ago | link

There are bugs in the implementation of hash tables; he confirmed this. Don't try them with strings as keys, for example (I mean, strings you modify).

-----

2 points by applepie 6029 days ago | link

I don't think pg is an elitist at all.

-----

1 point by absz 6028 days ago | link

My apologies then—I misread the sentiment behind your comment. In that case, I direct the sentiment of my previous comment away from you :)

-----

2 points by applepie 6039 days ago | link | parent | on: Don't understand how macros work

Spirit of Lisp?

Lisp is what you want it to be, not pg's fundamentalist view on it.

-----

6 points by absz 6039 days ago | link

That's not quite true. You would not see Lisp code written as though it were assembly, full of gotos and working with a finite number of registers, to take an extreme example. Lisp is a multiparadigm language, but it is heavily functional. As such, writing imperative code is not generally advisable ("not in the spirit of Lisp"), but is possible. Likewise, code with such heavy use of variables is likely not functional, and thus falls outside the "spirit of Lisp". Yes, you can do anything you want in Lisp (cf. the Church-Turing thesis), but it may not be advisable ("in the spirit of Lisp"). Again, for instance, if you want to work in a stack-based manner, it would probably be advisable to either (1) rework your code, or (2) switch to a stack-based language like Factor.

This is not unique to Lisp: if you want to write a tail-recursive, functional, list-based program in C, that's probably not a good idea. You can, but since C is optimized for iteration (with many implementations not even supporting tail call elimination) and for imperative programming (it lacks closures and anonymous functions), and since memory management in C is…clunky…you would be better off (1) reworking your code, or (2) switching to a more functional language with lists and tail call elimination, such as a Lisp.

In short, yes, as a Turing-complete language, Lisp can do anything. And as a multi-paradigm language (with macros), it can do a good job at performing a given task in any way. But it has strengths, inclinations, and intentions, which together do comprise what could be called a "spirit of Lisp".

-----

5 points by almkglor 6039 days ago | link

> Lisp is what you want it to be, not pg's fundamentalist view on it.

The Turing Machine can be anything you want it to be, but it's not always easy - or desirable - to force the machine into doing what you want.

You can use hedge clippers to cut your toenails, but I doubt you'd want to do that.

-----
