A failure that is unexpected and unplanned for is a bug. Thus it's a bug if a file doesn't exist and my code doesn't handle that situation.
The boundary is what I want to happen in response to a bug vs. a failure. When I hit a bug, an actual bug, I want to capture the entire state and history of my program, to the fullest extent possible, so that I can find out why the bug occurred. I don't care if that core dump is gigabytes in size or expensive to generate. If a bug occurs I want all possible information that might help me, everything that the language runtime can produce.
For failures, for expected failures, for failures I handle, I don't need to capture anything. I don't even need a stack trace. I don't need the language runtime to generate a stack trace every time I hit an expected, handled failure.
Existing languages don't allow me to do this. At the point where, for example, a "file not found" exception is being thrown, there isn't enough information to tell whether it's a failure or a bug, so both have to be handled the same way.
Especially for modules, I do like using function and macro values in macro expansions.
Consider a module which contains a macro such as `for`. I can import the macro into my module simply by copying the macro value into my module, e.g.: `(= for other-module!for)`. By using function and macro values in the macro expansion (as you do in `for2`), the functions and macros that the imported macro uses don't also need to be copied into my module.
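A tiny sketch of that import, assuming a module is something you can index with `!` ssyntax (e.g. a table), so that `other-module!for` means `(other-module 'for)`:

    (= for other-module!for)  ; copy the macro value into this module
    (for i 1 3 (prn i))       ; works immediately; the helpers `for` expands
                              ; into are embedded as values, not looked up here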
I haven't worried myself about how most terms end up getting a comma prefix, but I agree there could be an alternative syntax where injecting the value of a symbol (instead of the symbol itself) is the default.
To handle special forms, a solution is to come up with a globally unique random prefix long enough that no one could generate it accidentally (e.g. "xVrP8JItk2Ot"), and then rename the special forms `assign`, `fn`, etc. to `xVrP8JItk2Ot-assign`, `xVrP8JItk2Ot-fn` etc; and then add `assign` and `fn` macros etc. that expand into the prefixed special form. Now `assign`, `fn`, etc. are macros that can be used as values in a macro expansion.
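For concreteness, a minimal sketch of those wrapper macros, assuming the underlying implementation has already renamed the special forms (the prefixed names below are those hypothetical renamed forms):

    ; Assumes the runtime now provides xVrP8JItk2Ot-assign and xVrP8JItk2Ot-fn.
    (mac assign args
      `(xVrP8JItk2Ot-assign ,@args))

    (mac fn (parms . body)
      `(xVrP8JItk2Ot-fn ,parms ,@body))

Since `assign` and `fn` are now ordinary macros, their values can be embedded in another macro's expansion like any other macro value.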
(Making `fn` a macro also has the advantage that features such as default arguments can be implemented in Arc).
I was at first concerned that using such a random prefix was ugly; eventually, however, I realized that no one ever needs to see the prefixed versions :-)
> (aka for them to not be "hackable")
I was indeed curious about maximizing hackability as a general principle... but with modules, even if a macro such as `for` is imported into my default namespace (and thus its expansion wouldn't use my version of `with`), in practice it's easy enough to create a new module containing my version of `with` etc. and simply load the `for` source into that new module. (Or simply reload the `for` source into my default module.)
> (eval (list + 1 2))
3
This is unusual in Lisp implementations, and I was quite pleased when I discovered that Racket supported this.
(#M:with (i nil gs3342 1 gs3343 (#F:+ 10 1))
This is a nifty idea... hmm, is there a reason to use different prefixes for macros and functions? We could have a read syntax which evaluated to the value of the symbol (whatever it is), and that would work for both functions and macros.
> To handle special forms, a solution is to come up with a globally unique random prefix long enough that no one could generate it accidentally (e.g. "xVrP8JItk2Ot"), and then rename the special forms `assign`, `fn`, etc. to `xVrP8JItk2Ot-assign`, `xVrP8JItk2Ot-fn` etc;
Would this prefix be present in the source files of the language implementation? Or generated at runtime, or something else? If the latter, it seems like gensyms are the right approach. If the former, what's the use case—someone writing programs that assume someone might have redefined "assign" and want to work with the builtin? (If it's just hacking arc3.1 to do what we want, I think you can create more or less arbitrary unique objects (vectors, cons cells, a new kind of struct), give them bindings in the base Arc environment, and replace e.g. '(eq? (xcar s) 'quote) with '(eq? (xcar s) the-quote-object).)
Speaking of gensyms, I had a plan recently. I like PG's approach of just sequentially named variables, but (a) my ideal language would probably expose its interned-symbol table, and it'd therefore be more orthogonal and simpler for it to directly expose the "create a symbol without interning it" function, so uninterned symbols should be available for free; (b) it is possible that people will name a variable "gs1", so that's another reason to use true gensyms; (c) many macros use gensyms, but I might like to bootstrap a system that favors macros whose expansion incurs zero side effects, and incrementing a gensym counter is a side effect. For (c) there might be another approach, but I came up with this one: Have the printer assign sequential numbers to each uninterned symbol it encounters (with a weak hash table). This has the nice effect that, when you do print macroexpansions, the numbers you see will begin at 1, rather than in the hundreds or thousands and varying depending on how many libraries have been loaded.
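A minimal Arc-flavored sketch of that numbering scheme (the names are made up, and it uses an ordinary table where a real version would want the weak table mentioned above, so dead gensyms could still be collected):

    (= gensym-nums* (table) gensym-count* 0)

    ; Return sym's display number, assigning the next one on first encounter.
    (def gensym-num (sym)
      (or (gensym-nums* sym)
          (= (gensym-nums* sym) (++ gensym-count*))))

The printer would then call something like `gensym-num` whenever it meets an uninterned symbol and render it as gs1, gs2, ... in order of first appearance.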
> (Making `fn` a macro also has the advantage that features such as default arguments can be implemented in Arc).
Yes, that is nice. I actually did this in my Lisp in Summer Projects submission[1] many years ago.
> is there a reason to use different prefixes for macros and functions? We could have a read syntax which evaluated to the value of the symbol (whatever it is), and that would work for both functions and macros.
Eh, no strong reason. I'd thought about doing a reverse variable lookup for every value and printing the variable name if it existed, but it makes less sense if e.g. the value 10 gets printed as #V:<some global variable that happened to be bound to 10>. Also, I figured an IDE-type setup (such as DrRacket) might use different colors or fonts to distinguish macros/functions/special forms/other. Last, you do need a way to print lambdas that don't have a global name (and might not even have a local name), and I'm guessing you'd want that to appear as #f:<some attempt at indicating line number / REPL input>, looking analogous to but different from the global case.
[1] https://github.com/waterhouse/emiya : "[O]ptional arguments are implemented by redefining "fn" (Arc's equivalent of "lambda") as a macro that expands to a call to "underlying-fn", which is bound to the fn special object--again, by the user. Then everything is recompiled, so that this redefinition does not cost anything at runtime; some care is needed here, to avoid an infinite loop, if a function used to recompile "fn" itself uses "fn"."
It's reading the invalid sequence as � U+FFFD REPLACEMENT CHARACTER, which translates back to UTF-8 as EF BF BD (as we can see in the actual results above). The replacement character is what Unicode offers as a placeholder for corrupt sequences in encoded Unicode text, which is exactly how it's being used here.
Looks like Arc vs. Clojure has been pretty well covered by the other comments. To take a step back and look at your goals, in case you might find it useful... consider separating the need to eat from your other ambitions.
The demand for hackers is very high right now, so it's easy to find work.
Most people, when they get a job and make more money, immediately raise their standard of living. I.e. they find a nicer place to live, they eat more expensive food, maybe buy a car or get a nicer one, etc. But you don't have to do that if you don't want to. You can, if you choose, keep your expenses low while working, and save a lot of money instead.
When your expenses are lower than your income, you don't need to work full time. For example, you could work part time. Or, you could work full time for part of a year and not work the rest of the year.
With "I need to eat" covered, then you have time to pursue your interests without fearing that you're going to starve if you don't get things going quickly enough.
YC has a less than 3% acceptance rate (https://blog.ycombinator.com/yc-portfolio-stats), so applying to YC isn't a great strategy for keeping yourself from starving. (YC is great if what you want to do is build a world-changing startup. For meeting your own basic income needs there are many far easier and much more certain ways to do that.)
I don't mean to discourage you in any way from applying to YC if you want to do a startup. Just suggesting you have a plan B for the "so I can eat" part :)
I think you might find TripleByte's blog post on what kinds of programmers YC companies want to hire interesting for several reasons:
First, if you want to do a startup, it's interesting to see what kinds of technical skills have turned out to be important for startups.
Second, if you want to create an app, it's interesting to see what kinds of technical skills have turned out to be important for startups creating apps.
And third, if you want to get a job, it's interesting to see what kinds of technical skills are most in demand.
A highlight is that the most demand is for product-focused programmers.
Thus, if I were looking for something to study for the purpose of starting a startup, or creating an app that lots of people use/love, or finding work, I'd consider focusing on the product-focused skills the post describes.
But keep in mind that for most apps, for most startups, you don't need Lisp. Reddit, for example, started in Lisp and switched to Python because the libraries were better. Nor are most of the YC companies using Lisp.
Of course, every startup is different, and every app is different. For a particular app, or for a particular startup, Lisp might be a strong advantage. For Paul Graham's original startup ViaWeb, for example, Lisp was a decisive advantage.
Lisp is a programmable programming language. When do you need to program your programming language? When your programming language isn't doing enough for you on its own.
As other programming languages have gotten better, there's less of a gap between them and Lisp. ViaWeb was using Lisp while competing against companies writing in C++. Nowadays the mainstream languages are higher level.
Lisp is a useful skill to learn because if you ever do get into a situation where it would be helpful to be able to program your programming language you can say "aha! A macro would make this easier!"
And yet, to get into YC, or to write an app that lots of people use/love, most of the time, in most cases, that's not necessary. (Or else startups would be looking for Lisp programmers).
I think for it to be a good tutorial, it ought to do something that's useful on its own. Maybe an implementation of generators or something like that. `capture-dyn` is a useful tool for working with continuations (it seems like), but I think as a tutorial it may be a bit esoteric since you'd already need to know about the interaction of continuations with dynamic scope to see when and why you'd want to use `capture-dyn`.
Sounds like it. The idea is that you have some irreducible amount of imperative state out in the "real world" (files in the OS, your hardware screen, etc), and the goal is to keep that as small a part of your language as possible -- so that the rest of your language can be as functional as possible.
Arc handles HTTP requests character by character, which is fine for form submits or REST APIs, but not what you want for handling large binaries.
To be able to handle image uploads directly in Arc, you'd want an HTTP server implementation which supported binary buffers.
But not everything needs to be implemented directly in Arc. Instead you could use e.g. Nginx or Apache to handle the image uploads, or upload the images directly to AWS S3 from the browser.
Interesting! Arc is reading the request one character at a time. I think I see where it's happening, in the handle-request-thread function where readc is called.
So, for Nginx/Apache, are you suggesting Arc might be able to handle uploads if it's sitting behind Nginx/Apache acting as reverse proxy doing buffering?
Or is it more like doing anything with big files in Arc is a no-go, so it's better to have a separate setup of Nginx/Apache (with PHP or whichever) handling the file uploads?
> So, for Nginx/Apache, are you suggesting Arc might be able to handle uploads if it's sitting behind Nginx/Apache acting as reverse proxy doing buffering?
Nope, that won't help.
> so it's better to have a separate setup of Nginx/Apache (with PHP or whichever) handling the file uploads
That's the easiest approach. Not that you'd need PHP; just pop in a module to handle the file uploads for you.
> Or is it more like doing anything with big files in Arc is a no-go
Not Arc as such, but the implementation in srv.arc isn't designed for large files. You could read the old MzScheme documentation at http://docs.racket-lang.org/mzscheme/index.html and figure out how to read an input stream into a binary buffer, but it would be more work.
Thanks a lot for helping clear that up. Those limitations of Arc are definitely not immediately obvious from reading the docs and essays available on the language.