In a continuing effort to teach myself arc (and programming in general), I am playing around with news.arc.
Currently, I have run into an issue that seems like it should be straightforward, but I'm stumped.
I have allowed users to select other users as friends. In their prof, they have a list of friends, e.g. (friends ("bob" "mary")). That's all well and good. However, when visiting a friend's profile page, I want to change some things conditional on their friend status. I figured I could simply test whether the subject of the profile page was among the user's friends, so I tried:
(if (in subject (uvar user friends))
    (do such and such...))
I've tried several variations on the theme, but can't seem to get a true case. Is there a way to test if a string is in a list of strings that I am missing? This must be easier than it seems. :/
The benefit of using 'find is that it returns the value if found and nil otherwise. Depending on how you're implementing your code, you may discover 'find works better, e.g. what if you're going to pass the result into another function?
arc> (some 'c '(a b c d c b a))
t
arc> (find 'c '(a b c d c b a))
c
arc> (mem 'c '(a b c d c b a))
(c d c b a)
arc> (pos 'c '(a b c d c b a))
2
arc> (keep 'c '(a b c d c b a)) ; significantly less efficient
(c c)
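To tie this back to the friends list: here's a rough, untested sketch of passing 'find's return value along to something else. (friend-note is a made-up helper here, not anything in news.arc; 'subject and (uvar user friends) are from the original question.)

  (aif (find subject (uvar user friends))
       (friend-note it)      ; 'it is bound to the matching string, e.g. "bob"
       (prn "not a friend"))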
All of these ('some, 'find, 'mem, 'pos, and 'keep) use 'testify, so if the thing you're searching for is a function, it will actually use that function as a predicate instead of searching for the function value itself.
arc> (pos no '(a b c nil e))
3
arc> (pos no (list idfn no 72))
nil
arc> (pos [is no _] (list idfn no 72))
1
By the way, there's a gotcha specifically with 'find that makes it the odd one out: you can't find nil with it. The test will be called with nil (if it's a function), but a success just results in an intermediate return value of nil (the found value), which is then treated as a failure, and the loop keeps going.
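If I have the semantics right, a quick REPL comparison looks something like this:

arc> (find nil '(a nil b))
nil
arc> (mem nil '(a nil b))
(nil b)
arc> (some nil '(a nil b))
t

So 'mem or 'some is the way to go if nil might be the thing you're searching for.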
Thanks, rocketnia. That really does clear some things up. Here, 'some works great for me, since I am passing the result to 'case with two instances.
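Something along these lines, roughly (friend-stuff and stranger-stuff are just placeholder names, not anything in news.arc):

  (case (some subject (uvar user friends))
    t (friend-stuff)
      (stranger-stuff))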
I can't say I've got a feel for Arc yet, but I think I'm starting to. The more I work with it, the more it seems to open up. I don't get that feeling so much with the HTML definitions, though. But I'm not going down that road until I have a much better handle on the language.
BTW, are there any good editors other than vi and emacs? I used vi years ago for FORTRAN, didn't like it, and have never used emacs. Currently I am using Wordpad. :p
> BTW, are there any good editors other than vi and emacs? I used vi years ago for FORTRAN, didn't like it, and have never used emacs. Currently I am using Wordpad. :p
Not sure how anyone would want something besides vi/vim or emacs, but I use emacs with vim key bindings, so maybe I'm biased. ;)
vi and emacs are great editors, but each has a mammoth learning curve. I think this is why they often make bad first impressions. I'd be interested to know more about your experience with vi and what you disliked about it.
Other editors I see people using include Notepad++, Textmate, jEdit, nano, NetBeans and DrRacket (which you might like for Arc and Racket code). Unfortunately, I don't know much about these firsthand.
I guess I expected this answer. :) Although I hadn't heard of nano. I should probably bite the bullet and start to familiarize myself with emacs or vi since they are so universal. Probably emacs.
My experience with vi involved some physics modeling years ago in undergrad. I recall 'zz'? I kept forgetting to toggle the input mode; I don't know why. Maybe I am a bit right-brained, but I have a sloppy way of working. Something about vi felt so 'tight', and I prefer a canvas feel. Not really the best mindset for programming, no doubt, but I have learned it's ok to be a bit sloppy and get things done rather than be very neat and unproductive.
For what it's worth, you're not the only one. I haven't gotten the hang of anything but the Windows text box conventions. :)
Pretty much all I want is an editor that loads quickly, shows whitespace, highlights matching parens, and does regex search-and-replace and find-in-files. Usually my choice has been EditPlus, but Notepad++ is a great open-source alternative, and gedit's another good one (although I think I've had to use grep separately with gedit).
I turn to jEdit for editing Unicode documents, NetBeans for editing C/C++, and DrRacket for editing big Racket projects, but those are extremely rare situations for me. Most of the time I avoid all three of them because of how long they take to load; I can almost always get in and out faster with plain Notepad.
> Something about vi felt so 'tight', and I prefer a canvas feel. Not really the best mindset for programming, no doubt, but I have learned it's ok to be a bit sloppy and get things done rather than be very neat and unproductive.
As hasenj said, 'some is the right choice here. A slightly worse choice is 'apply, as in (apply in (uvar user friends)).
I say it only so you'll know about it when you need it; when beginning Lisp myself, I kept trying to figure out how to "unbox" lists to apply the values within as separate arguments to a function.
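For example, 'apply "unboxes" a list so each element becomes its own argument:

arc> (apply + (list 1 2 3))
6
arc> (+ 1 2 3)
6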
I heard pg & rtm tried first-class macros but the performance was unacceptable. I would really like to have tried them out, seen results from some performance tests... or something!
How difficult of a hack would it be to give Arc first-class macros again?
If you want to try out first-class macros to see what they could do for you, that's easy enough: write an interpreter for Arc. It'd be slow, of course, but fast enough that you could try out some different kinds of expressions and see whether you liked what you could do with them.
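Just to gesture at the shape of it, here's a rough, untested sketch (my-eval is a made-up name, env is just a function from symbols to values, and special forms like fn and quote are left out entirely) where the operator position gets evaluated like anything else, so a macro can be passed around as a first-class value:

  (def my-eval (expr env)
    (if (isa expr 'sym)    (env expr)     ; variable lookup
        (no (acons expr))  expr           ; self-evaluating literals
        (let op (my-eval (car expr) env)
          (if (isa op 'mac)
              ; first-class macro: the expander sees the unevaluated args
              (my-eval (apply rep.op (cdr expr)) env)
              (apply op (map [my-eval _ env] (cdr expr)))))))

The interesting branch is the 'mac one: the args reach the expander unevaluated, even though the macro itself arrived as an ordinary value.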
I was a fan of fexprs not too long ago, and I still kinda am, but they lost their luster for me at about this point: http://arclanguage.org/item?id=11684
Quoting from myself,
Quote syntax (as well as fexprs in general) lets you take code you've written and use it as data, but it does little to assure you that any of that computation will happen at compile time. If you want some intensive calculation to happen at compile time, you have to do it in a way you know the compiler (as well as any compiler-like functionality you've defined) will be nice enough to constant-propagate and inline for you.
I've realized "compiler-like functionality you've defined" is much easier to create in a compiled language where the code-walking framework already exists than in an interpreted language where you have to make your own.
If part of a language's goal is to be great at syntax, it has a conflict of interest when it comes to fexprs. They're extremely elegant, but user libraries can't get very far beyond them (at least, without making isolated sublanguages). On the other hand, the dilemma can be resolved by seeing that an fexpr call can compile into a call to the fexpr interpreter. The compiler at the core may be less elegant, but the language code can have the best of both worlds.
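As a sketch of that last point (fexpr-eval and current-env are hypothetical names, not anything in Arc or Penknife): when the compiler knows the operator is an fexpr, the call can just become an ordinary call into the fexpr interpreter.

  (mac fexpr-call (op . body)
    `(fexpr-eval ,op ',body (current-env)))

  ; so (fexpr-call my-fexpr (+ 1 2) x) expands to
  ; (fexpr-eval my-fexpr '((+ 1 2) x) (current-env))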
This is an approach I hope will work for Penknife. In a way, Penknife's a compiled language in order to support fexpr libraries. I don't actually expect to support fexprs in the core, but I may write a library. Kernel-style fexprs really are elegant. ^_^
Speaking of such Kernel-like libraries, I've thrown together a sketch of a Kernel-like interpreter written in Arc. It's totally untested, but if the stars have aligned, it may only have a few crippling typos and omissions. :-p https://gist.github.com/778492
Can't say I'm a fan of PicoLisp yet, though. No local variables at all? Come on! ^_^
Speaking of speaking too soon, I may have said "user libraries can't get very far beyond [an fexpr language's core syntax]," but I want to add the disclaimer that there's no way I actually know that.
In fact, I was noticing that Penknife's parse/compile phase is a lot like fexpr evaluation. The operator's behavior is called with the form body, and that operator takes care of parsing the rest, just like an fexpr takes care of evaluating the rest. So I think a natural fexpr take on compiler techniques is just to eval code in an environment full of fexprs that calculate compiled expressions or static types. That approach sounds really familiar to me, so it probably isn't my idea. :-p
No harm done. :) PicoLisp appears to have lexical scoping but dynamic binding, although my PLT is too weak to understand all the implications of that. From the FAQ:
> This is a form of lexical scoping - though we still have dynamic binding - of symbols, similar to the static keyword in C. [1]
> "But with dynamic binding I cannot implement closures!" This is not true. Closures are a matter of scope, not of binding. [2]
Sounds like transient symbols are essentially in a file-local namespace, which makes them lexically scoped (the lexical context being the file!), and they're bound in the dynamic environment just like internal symbols are. So whenever lexical scope is needed, another file is used. Meanwhile, (====) can simulate a file break, making it a little less troublesome.
But the let example I gave a few comments ago didn't use a transient symbol. Why does it work?
I chatted with PicoLisp's author, Alexander Burger, yesterday on IRC. If I catch him again, I can ask for clarification about the scoping/binding quirks.
I think it works because while you're inside the let, you don't call anything that depends on a global function named x. :) That's in the FAQ too:
> What happens when I locally bind a symbol which has a function definition?

> That's not a good idea. The next time that function gets executed within the dynamic context the system may crash. Therefore we have a convention to use an upper case first letter for locally bound symbols:
(de findCar (Car List)
   (when (member Car (cdr List))
      (list Car (car List)) ) )