Arc Forum
7 points by andreyf 5720 days ago

What about "100 year language" don't people understand? PG's point isn't to make the language popular now, or in a decade... it's to make it popular 100 years from now.

Personally, I think it underestimates how different the world will be 100 years from now to talk about how programming languages will be then (just imagine what kind of "computers" we'll have then!), but I suppose the "100" number is arbitrary... I think PG just means generally looking at "long term", not "short term", and having a community is a short term advantage.

7 points by bOR_ 5720 days ago

PG won't be alive in 100 years, and there's a difference between slowly searching for an optimal language and not working on Arc at all.

It is a bit sour for the community if their efforts are not affecting the progress of the language toward its goal.

To be fair, PG has a point that n months isn't that much if Arc is supposed to be 'finished' about 30 years from now. We might just have wrongly expected things to move at a somewhat faster pace.


2 points by shader 5719 days ago

Speaking of optimizing for "100 years", what are some design decisions which would make the language more flexible/easy to use that, while possibly sacrificing speed or efficiency, would make the language more useful in the long run?

It seems to me that computers (currently, though it may not last) are doubling in speed rather frequently, so any efficiency problems in the language will quickly be outweighed by its usability. That's partly why one of the most popular tools for making websites is Ruby. It may not be fast, but it's cheaper to double the server capacity than to hire more programmers.



3 points by nlavine 5719 days ago

If you're looking at long-term prospects, my understanding is that the idea that computers have "speed", which "matters", is actually temporary (should stop around 2040 if current trends continue). At that point you run into the thermodynamic limits of computation, where the question is how much energy you're willing to put into your computation to get it done.
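For concreteness, the thermodynamic limit in question is Landauer's bound: irreversibly erasing one bit of information dissipates at least

```latex
E_{\min} = k_B T \ln 2 \approx (1.38\times10^{-23}\,\mathrm{J/K})(300\,\mathrm{K})(\ln 2) \approx 2.9\times10^{-21}\,\mathrm{J\ per\ bit}
```

at room temperature. Past that point, getting more computation per joule requires either lower temperatures or reversible computing.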

I don't know quite enough about that yet to talk about details, but see for some more information (haven't read all of it yet). The paragraph right under Figure 1 seems to be crucial.

However, I'm not sure that affects language design a great deal. It seems to me the goal of a language is to let you express algorithms in a concise and efficient manner, and I don't see that becoming unimportant any time soon.


2 points by shader 5703 days ago

Well, I think we're sort of agreeing with each other. You said that speed should be irrelevant, because language design was about describing algorithms concisely and efficiently; I said that we should try and think outside the box, by ignoring our sense of efficiency. If it didn't matter how the feature was implemented, what features could we come up with that could radically change how easy it is to code? Maybe there aren't any, but it is an interesting idea. That is not to say that I'm only interested in inefficient improvements ;)

Another possibility: Lisp is an awesome family of languages partly because they have the ability to create higher and higher levels of abstraction via macros. "Complete abstractions" they were called in Practical Common Lisp. Are there any other means that could create "complete abstractions"? Or easily allow higher level abstraction?
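For readers unfamiliar with the term: a Lisp macro receives code and returns new code, which is then evaluated; the abstraction is "complete" because the expansion can be anything the language can express. A very rough Python analogy (a toy sketch only; real macros transform structured expressions, not source strings, and the `unless` name here is illustrative):

```python
# Toy "macro": a function that returns new SOURCE CODE, which is then
# evaluated -- a crude analogy for how a Lisp macro expands before eval.
def unless(test_expr, body_stmt):
    return f"if not ({test_expr}):\n    {body_stmt}"

x, result = -5, []
expansion = unless("x > 0", "result.append(x)")
exec(expansion)  # runs: if not (x > 0): result.append(x)
print(result)    # x is negative, so the body ran
```

The point of the analogy is only that the abstraction operates on code itself, one level up from functions, which operate on values.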


2 points by rntz 5719 days ago

Moore's law has actually been failing as of late. I suspect that exponential increase in computer capability is a feature of the yet-nascent state of computer technology, and assuming it will continue into the foreseeable future is a bad idea. Performance is important; it's just not as important as some think. In particular, it's not so much all-around performance as the ability to avoid bottlenecks that matters. This is why a profiler for Arc would be nice, as would some easy and standard way to hook into a lower-level language for speed boosts.
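The kind of profiler described here ships standard with other dynamic languages; Python's `cProfile` is one example of what such a tool looks like (the `hotspot` function is a made-up workload):

```python
import cProfile
import io
import pstats

# Made-up workload standing in for a real bottleneck.
def hotspot(n):
    return sum(i * i for i in range(n))

profiler = cProfile.Profile()
profiler.enable()
hotspot(200_000)
profiler.disable()

# Report the top functions by time spent in them.
out = io.StringIO()
pstats.Stats(profiler, stream=out).sort_stats("tottime").print_stats(5)
print(out.getvalue())
```

The report identifies where time actually goes, which is exactly the "avoid bottlenecks" workflow: profile first, then drop only the hot spots down to a lower-level language.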


2 points by andreyf 5719 days ago

If you generalize Moore's law to computations per second per dollar, it isn't failing. In 100 years, network availability, speed, and capacity will lead to yet-unimagined changes.

How will programming languages take advantage of a thousand or a million CPUs?


3 points by jimbokun 5719 days ago

Isn't Google (and others) already answering that question today?
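Google's published answer is the MapReduce model: split the data into shards, map a function over the shards in parallel, then reduce the partial results. A minimal sketch in Python, assuming a word-count task over made-up shards (a process pool stands in for a real cluster):

```python
from collections import Counter
from multiprocessing import Pool

# Map step: count words within one shard of the input.
def map_shard(shard):
    return Counter(shard.split())

# Reduce step: merge the per-shard partial counts.
def reduce_counts(partials):
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

if __name__ == "__main__":
    shards = ["a b a", "b c", "a c c"]  # made-up input data
    with Pool(3) as pool:
        partials = pool.map(map_shard, shards)
    print(reduce_counts(partials))  # total counts: a=3, b=2, c=3
```

The language-design question is how much of this split/merge plumbing a language can absorb so the programmer writes only the map and reduce functions.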


3 points by andreyf 5718 days ago

OK, so let's rephrase it practically - how do you design a programming language if the IDE is running on Google's infrastructure?