Hello,
I am currently learning Lisp, so this is a basic novice question, and I
apologize for any wrong use of words; I am still not used to some terms
(hobby coder in C/C++ and Tcl/Tk).
It is about destructive functions. As far as I understand, destructive
functions "destroy" their arguments, right?
So, if one wants to use their result, the argument should be a variable
(somehow unsure which vocabulary to use, is symbol in lisp the same (in
this regard) as variable in c?)?
The book I am reading gives this as an example of the use of mapcan (a
filter for numbers):
CL-USER> (mapcan #'(lambda (x)
                     (cond ((numberp x) (list x))
                           (t nil)))
                 '(a 2 b c 3 4 d 5)) ;; my problem
(2 3 4 5)
(sorry, I still have to figure out how indentation is done with spaces
in the REPL of clisp (SLIME))
I do have problems with this.
What happens to '(a 2 b c 3 4 d 5)?
Is this a temporary object which will be changed?
If such a code would be in a source file, the code (the argument) would
be changed?
Where are those objects stored?
When will their space be freed (by GC?)?
If my question shows deep misunderstanding of basic concepts of lisp,
please feel free to give me the keyword I should have a closer look at.
Thanks for replies,
Bernd
--
https://gna.org/projects/mipisti - (microscope) picture stitching
T_a_k_e__c_a_r_e__o_f__y_o_u_r__R_I_G_H_T_S.
P_r_e_v_e_n_t__L_O_G_I_C--P_A_T_E_N_T_S
http://www.ffii.org, http://www.nosoftwarepatents.org
Bernd Schmitt <··················@gmx.net> writes:
> Hello,
>
> I am currently learning lisp,
Welcome to the community.
> so this is a novice basic question and I
> apologize for wrong use of words, I am still not used to some terms
> (hobby-coder in c/c++ tcl/tk).
That's ok. We learn a lot from the intuitions of outsiders coming to
our language and taking the time to relate their confusions. Sometimes
we've been too long away from those confusions ourselves to remember
without occasional reminders, and I felt I learned some interesting things
from your taking the time to ask these questions.
> It is about destructive functions. As far as I understand, destructive
> functions "destroy" their arguments, right?
You'd think. But that's not the right model. It really means "able to modify".
> So, if one wants to use their result, the argument should be a
> variable (somehow unsure which vocabulary to use, is symbol in lisp
> the same (in this regard) as variable in c?)?
A symbol is a data structure.
A variable is a program element.
C programs are text.
Lisp programs are objects.
Lisp parses text into objects.
C programs have no representation as objects.
C macros yield more text.
Lisp macros transform objects into other objects.
C semantics is defined directly as text.
Lisp semantics is defined not on text, but on objects.
Some objects are constructed by program, not by the Lisp reader (i.e., not by its parser).
Consequently, some Lisp programs have no source text.
It is not routinely possible nor convenient to run a C program that
results in text output that is subsequently compiled and loaded into the
same running image as produced that output.
In Lisp, it is a routine occurrence that programs are compiled either
directly into the current image (using the COMPILE function) or into a
file and then loaded directly back into that same image (using
COMPILE-FILE and LOAD).
So, back to your original statement, it is sometimes useful to refer to a symbol
as if it were on paper or in a file, since it maps directly to a data structure
that can be written back the same way. But technically, a symbol is an in-core,
constructed object.
Lisp variables are symbols, but not all symbols are variables. You can
have symbols that are used for other purposes, or for dual use. A
symbol is a variable when it is used as a variable, just as you are a
cook when you are cooking, but not when you are programming. Whether
you consider yourself a cook, a programmer, or something else is a
subjective judgment, not an issue of representation type. A symbol is
a representation type. A variable is not.
In Lisp, a variable is a symbol while in C it is a region of text.
But all that said, the other thing you should know is that in C/C++,
data that live in variables use storage that is tightly bound to the
variable. That is, there is no identity to that data other than in
the variable. In Lisp, objects are all in the heap and variables are
only ever pointers to those objects.
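A tiny illustration of that pointer model (the bindings here are my own
example, not from the thread):

```lisp
;; Two variables, one heap object: both "see" the same modification,
;; because each variable is only a pointer to the cons cells.
(let* ((x (list 1 2))
       (y x))              ; Y now points to the SAME cells as X
  (setf (first x) 99)      ; modify the object, not the variable
  y)                       ; => (99 2)
```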
> The book, I am reading gives this as an example for the use of mapcan
> (a filter for numbers):
>
> CL-USER> (mapcan #'(lambda (x)
>                      (cond ((numberp x) (list x))
>                            (t nil)))
>                  '(a 2 b c 3 4 d 5)) ;; my problem
> (2 3 4 5)
>
> (sorry, I have to figure out how indentation is done with spaces in
> rpl of clisp (slime))
>
> I do have problems with this.
> What happens to '(a 2 b c 3 4 d 5)?
Nothing. (It is around for another call to MAPCAN were you to re-evaluate
the expression. e.g., if this had happened within a program. At toplevel,
it will go away quickly, but you might want to read about the +, ++, and +++
variables in the interactive environment.)
> Is this a temporary object which will be changed?
No. There is (almost) no such thing as a temporary object. (You can make
a declaration to the compiler specially that you are willing to allow
something to be temporarily allocated on the stack, using DYNAMIC-EXTENT,
and this can create bizarre situations in which you can still have a pointer
into the stack after the lifetime of something. As a result, you should be
very careful not to use the DYNAMIC-EXTENT declaration until you understand
the regular behavior of the language thoroughly. The ordinary model of
Lisp is that all objects persist forever and are reclaimed by the GC as
an optimization only when it can prove they can no longer be referenced.)
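A hedged sketch of such a declaration (a toy function of my own, not
from the thread); the implementation is merely *permitted* to
stack-allocate here, and it is safe only because the list never escapes:

```lisp
;; DYNAMIC-EXTENT promises the compiler that TMP will not outlive
;; the LET; the implementation MAY then allocate it on the stack.
;; Safe here only because TMP never escapes SUM3.
(defun sum3 (a b c)
  (let ((tmp (list a b c)))
    (declare (dynamic-extent tmp))
    (reduce #'+ tmp)))
```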
> If such a code would be in a source file, the code (the argument)
> would be changed?
No.
> Where are those objects stored?
In the heap, not that it matters. CL is not about "where" objects are.
It is about what they do. The model is that they persist until not needed,
and then they go away. That's what automatic memory management does.
It's more like Java than C or C++.
> When will their space be freed (by GC?)?
Yes, the GC will free it when you have lost all pointers to it. Which means
worrying about when it goes away is not something you should do. If you
can still do something to the object, it is still around.
If the MAPCAN expression is your only pointer, as interactively
in an interpreter, then after 3 REPL iterations when the +, ++, and +++
variables are no longer pointing to it.
In the compiler, though, the '(A 2 ...) may be written out to a file
and loaded into some kind of pure space owned by your program. In that
case, whether it goes away or not is an implementation detail that may
depend on how your implementation manages quoted structure, functions,
modules of compiled functions, etc.
> If my question shows deep misunderstanding of basic concepts of lisp,
> please feel free to give me the keyword I should have a closer look at.
Your question shows awareness of a set of issues you should be concerned
about.
You should NOT think of destructive functions, though, as functions that
destroy their arguments. They are functions that MIGHT.
e.g., (NREVERSE NIL) does not destroy NIL. In fact, an implementation
is not required to destroy the argument even for (NREVERSE (LIST 'A 'B 'C)).
This is permitted to do the same as REVERSE would do, though most
production implementations don't do it that way.
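The portable idiom, then, is to capture the return value rather than
rely on the argument being updated in place (a minimal sketch):

```lisp
;; NREVERSE may recycle the argument's cells, or it may behave like
;; REVERSE; only the RETURN VALUE is guaranteed to be the reversed list.
(let ((xs (list 1 2 3)))
  (setq xs (nreverse xs))   ; never trust XS without the SETQ
  xs)                       ; => (3 2 1)
```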
The reason MAPCAN is destructive is that in the case of
(MAPCAN #'IDENTITY '((1 2) (3 4)))
you have to be careful you are not destroying your argument. MAPCAN puts
data into the function which is all-too-easily destroyed if you don't
know what you are doing, but in the case of the normal 'filter idiom',
(MAPCAN #'(LAMBDA (X) (IF (filter-test X) (LIST X) '())) ...)
the only thing "destroyed" is the (LIST X), and it's not destroyed in the
sense of C 'free' nor C++ 'delete'. It's destroyed in the sense of
modified by destructive action, to make its cdr point to something else.
Destruction in that sense is always subjective because if what you wanted
is the modified list, it's been "enhanced" not "destroyed".
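To see that the input really survives, one can inspect it after the
call (a sketch; the bindings are mine):

```lisp
;; Each (LIST X) is freshly consed, so MAPCAN's destructive splicing
;; only ever modifies cells created inside the lambda.  The input
;; list comes through untouched.
(let ((input (list 'a 2 'b 'c 3 4 'd 5)))
  (list (mapcan #'(lambda (x) (if (numberp x) (list x) '())) input)
        input))
;; => ((2 3 4 5) (A 2 B C 3 4 D 5))
```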
This is one reason I really find it irritating that people put "!" on
the end of operator names to make it scary. To me, scary is when your
program doesn't do what you want, not when you call an operator that
performs its well-defined meaning and that you wouldn't be calling if
you didn't need that meaning.
Geez, logging in to a computer changes the state of the computer.
Is that a destructive act? Maybe it should be called "login!".
Maybe it should be prosecuted by the FBI as a destructive act.
You get my point, I hope.
Anyway, in spite of some stumbling on terminology it sounded to me
like you had gotten the essential points, and that if anything, you
were just being confused by some sloppy terminology that we as a
community sometimes use without thinking about the misconceptions
we might be promoting.
Does any of this explanation help you get a better sense of it?
Kent M Pitman wrote:
> Bernd Schmitt <··················@gmx.net> writes:
>[... lots of information guiding to deeper insight..]
> Does any of this explanation help you get a better sense of it?
Yes. Thank you very much for taking the time to write this. It made
much clearer the details that had been preventing a deeper understanding
of how data "lives" in Lisp (as objects). This had been my main problem,
and it is what made me post my question even though I had not finished
the book, because I felt I had somehow lost contact ...
The second thing I am keen to learn about is closures (I had only heard
of them before, in a discussion in comp.lang.tcl). The book covered them
only as a (function <lambda...>). But I think I have to be patient and
first finish this book before reading others.
Thanks,
Bernd
Bernd Schmitt wrote:
> Hello,
>
> I am currently learning lisp, so this is a novice basic question and I
> apologize for wrong use of words, I am still not used to some terms
> (hobby-coder in c/c++ tcl/tk).
>
> It is about destructive functions.
First comment: Destructive functions aren't really important in the
beginning, because they are used for optimizing space usage. In the
beginning, it's better to learn about the higher level ways to express
solutions in Lisp, and only care about optimization when you detect
overheads in your programs that you want to avoid.
> As far as I understand, destructive
> functions "destroy" their arguments, right?
> So, if one wants to use their result, the argument should be a variable
> (somehow unsure which vocabulary to use, is symbol in lisp the same (in
> this regard) as variable in c?)?
It's correct to say variable in this context; symbol has a somewhat
different meaning in Lisp.
However, destructive functions are "destructive" to the actual data
structures, which means that they are allowed to side-effect them - for
example, to delete elements from lists. They are used when you are sure
that you don't need the original structures anymore afterwards, and you
should be very careful in that regard.
What you have in mind is that variables are updated to see new values.
However, destructive functions in Lisp are allowed _not_ to update the
corresponding variables. (There are good reasons for this.) So in
general, you still need to use assignment via setq or setf to store the
results in the right places. That's why, in general, it also doesn't
really matter whether you use the destructive or the non-destructive
functions, except for optimization purposes.
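For example, DELETE (the destructive counterpart of REMOVE) is written
exactly like the non-destructive version (a minimal sketch):

```lisp
;; Only the return value of DELETE is reliable; write it exactly as
;; you would write the non-destructive REMOVE.
(let ((xs (list 1 2 3 4)))
  (setq xs (delete 1 xs))   ; same shape as (setq xs (remove 1 xs))
  xs)                       ; => (2 3 4)
```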
> The book, I am reading gives this as an example for the use of mapcan (a
> filter for numbers):
>
> CL-USER> (mapcan #'(lambda (x)
>                      (cond ((numberp x) (list x))
>                            (t nil)))
>                  '(a 2 b c 3 4 d 5)) ;; my problem
> (2 3 4 5)
[...]
> I do have problems with this.
> What happens to '(a 2 b c 3 4 d 5)?
> Is this a temporary object which will be changed?
The latter. As mapcan iterates over the input list, it generates an
intermediate list for the result, and that list is destructively
changed. mapcan is interesting because it can be used as a filter, such
that only certain elements of the input list are returned in the output
list. That's something you cannot do with mapcar.
However, mapcan is somewhat old-fashioned. It's especially odd that the
filtering function has to return the elements wrapped in lists.
It's clearer (IMHO) to use, for example, the LOOP macro instead:
(loop for x in '(a 2 b c 3 4 d 5)
      when (numberp x)
      collect x)
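For the same filter there is also REMOVE-IF-NOT, which is
non-destructive and arguably says "filter" most directly:

```lisp
;; Keep only the elements satisfying the predicate; the input list
;; is not modified.
(remove-if-not #'numberp '(a 2 b c 3 4 d 5))   ; => (2 3 4 5)
```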
It seems to me that you are using an "old-fashioned" book. It may be
better to go for something more recent, like Peter Seibel's "Practical
Common Lisp", David Lamkins's "Successful Lisp" or Peter Norvig's
"Paradigms of Artificial Intelligence Programming".
> If such a code would be in a source file, the code (the argument) would
> be changed?
Not for mapcan. But indeed, destructive functions are in danger of being
used for literal data if you're not careful enough, and ANSI Common Lisp
merely states that the consequences are undefined. As I said, don't
worry about those functions in the beginning, they're for advanced uses
of Lisp.
Pascal
--
OOPSLA'05 tutorial on generic functions & the CLOS Metaobject Protocol
++++ see http://p-cos.net/oopsla05-tutorial.html for more details ++++
Hello Pascal,
Pascal Costanza wrote:
> Bernd Schmitt wrote:
> What you have in mind is that variables are updated to see new values.
> However, destructive functions in Lisp are allowed _not_ to update the
> corresponding variables. (There are good reasons for this.) So in
> general, you still need to use assignment via setq or setf to store the
> results in the right places. That's why it in general also doesn't
> really matter whether you use the destructive or non-destructive
> functions, except for optimization purposes.
As far as I understand (is there no abbrev. for this? AFAIU? I have the
feeling I will use this term frequently ...), nconc is a destructive
version of append, isn't it? Nconc does change variables, if I
understand the following in the right way:
CL-USER> (setq x '(1 2) y '(3 4))
(3 4)
CL-USER> (nconc x y)
(1 2 3 4)
CL-USER> x
(1 2 3 4)
So, when do I know, whether a variable can be used to catch the result
of a destructive function (or was my example implementation dependent)?
> [...] As mapcan iterates over the input list, it generates an
> intermediate list for the result, and that list is destructively
> changed.
Hm. So the result of
CL-USER> (nconc '(1 2) '(3 4))
(1 2 3 4)
is intermediate as well, right?
> However, mapcan is somewhat old-fashioned. It's especially odd that the
> filtering function has to return the elements wrapped in lists.
>
> It's clearer (IMHO) to use, for example, the LOOP macro instead:
>
> (loop for x in '(a 2 b c 3 4 d 5)
>       when (numberp x)
>       collect x)
Ok. Loop is 90 pages away ... But as I see in the index, there is no
collect ...
> It seems to me that you are using an "old-fashioned" book. It may be
> better to go for something more recent, like Peter Seibel's "Practical
> Common Lisp", David Lamkins's "Successful Lisp" or Peter Norvig's
> "Paradigms of Artificial Intelligence Programming".
The reason why I read this book ("Programmieren in Common Lisp", O.
Mayer) is that I need to build up a basic vocabulary in my mother
language (German) to understand English computer texts (I am no computer
science student, just a hobby coder in C/C++ and Tcl/Tk). Otherwise things
like "side-effect", "slot" ... would have no/wrong meaning to me. That is
the main reason why I somehow cannot read PCL now.
Thank you for your reply,
Bernd
Bernd Schmitt <··················@gmx.net> writes:
> ... nconc is a destructive version of append, isn't it?
> Nconc does change variables,
No, it changes objects, not variables.
> if i
> understand the following in the right way:
>
> CL-USER> (setq x '(1 2) y '(3 4))
> (3 4)
> CL-USER> (nconc x y)
> (1 2 3 4)
> CL-USER> x
> (1 2 3 4)
Before, there is a variable X and the value associated with it is a certain
cons cell, call it #0#, which looks like ( 1 . #1# ) where the #1# is
another cons that is ( 2 . () ) There is also a variable Y which has a
cons cell we'll call #2# and which looks like ( 3 . #3# ) where #3# is
another cons cell that looks like ( 4 . () ).
After, all the same is true except that the cons cell #1# now looks like
( 2 . #2# ), that is, its cdr points to the cons cell that is still
held by Y.
Nothing about the storage of X has changed. It still points to #0#.
Nothing about the storage of Y has changed. It still points to #2#.
But when X is viewed, it will point to ( 1 2 3 4 ) because it points
to #0=( 1 . #1=( 2 . #2=( 3 . #3=( 4 . () )))), while Y still points
to #2=( 3 . #3=( 4 . () )).
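That sharing can be observed directly with EQ (fresh lists built with
LIST, to stay away from literals):

```lisp
;; After NCONC, the tail of X *is* Y -- the very same cons cells,
;; not a copy.
(let ((x (list 1 2))
      (y (list 3 4)))
  (nconc x y)
  (list x (eq (cddr x) y)))   ; => ((1 2 3 4) T)
```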
> So, when do I know, whether a variable can be used to catch the result
> of a destructive function (or was my example implementation dependent)?
You're using bad terminology because "the result" is what NCONC returns,
and most people will think by "catch the result" you mean
(let ((z (nconc ....))) ...)
where z "catches the result". A side-effect is not a result, it is an effect.
The question you want to ask is when a variable you've used will observe
a side-effect that has been made, and the answer is that it will see that
side-effect when the function is either permitted to make it and happens
to make it, or when the function is required to make it.
NCONC makes a side-effect to the object that is its first arg when the
first arg is not NIL. Any variable pointing to such a non-NULL object
will see that.
But you should pick up the result so you aren't confused by what you'll
perceive as a failure in the case of NIL. e.g.,
(let ((x '()) (y '(a b))) (nconc x y) x) => NIL ; not (A B)
even though
(let ((x '()) (y '(a b))) (nconc x y)) => (A B)
so you should generally prefer to do
(let ((x '()) (y '(a b))) (setq x (nconc x y)) ...)
since then x will really have the "result" of the NCONC. The result is
what's returned. The rest is just an effect.
If you're looking for a rule, focus on results as being returned.
Unlike other languages, which use reference variables to achieve multiple
results, Lisp will return more than one value if it needs to. It doesn't
resort to effects as cheating ways to return multiple values because it
has a first class way to do that.
(floor 5 3) => 1, 2
The values are returned, you don't have to write some goofy
void myfloor(int num, int denom, int& quo, int& rem)
in order to "pick up the result" in quo and rem. You will sometimes see
side-effects on given structures, but it's best early on if you don't
focus on those effects as your principle way of seeing results from
most operations.
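The two-values style above is picked up with MULTIPLE-VALUE-BIND, for
example:

```lisp
;; FLOOR returns quotient and remainder as two values; no reference
;; parameters are needed to receive both.
(multiple-value-bind (quo rem) (floor 5 3)
  (list quo rem))   ; => (1 2)
```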
Bernd Schmitt wrote:
> Hello Pascal,
>
> Pascal Costanza wrote:
>
>> Bernd Schmitt wrote:
>
>> What you have in mind is that variables are updated to see new values.
>> However, destructive functions in Lisp are allowed _not_ to update the
>> corresponding variables. (There are good reasons for this.) So in
>> general, you still need to use assignment via setq or setf to store
>> the results in the right places. That's why it in general also doesn't
>> really matter whether you use the destructive or non-destructive
>> functions, except for optimization purposes.
>
> As far as i understand (is there no apprev. for this AFAIU?
IIUC - If I understand correctly...
> I have the
> feeling I will use this term frequently ...), nconc is a destructive
> version of append, isnt't it? Nconc does change variables, if i
> understand the following in the right way:
>
> CL-USER> (setq x '(1 2) y '(3 4))
> (3 4)
> CL-USER> (nconc x y)
> (1 2 3 4)
> CL-USER> x
> (1 2 3 4)
>
> So, when do I know, whether a variable can be used to catch the result
> of a destructive function (or was my example implementation dependent)?
It works only accidentally in your example. You should _always_ say this:
(setq x (nconc x y))
...as you would say...
(setq x (append x y))
For example, try this:
(setq x '() y '(1 2))
(nconc x y)
And, additionally, don't use nconc on literal lists, as you did in your
example. Instead, generate the lists at runtime:
(setq x (list 1 2) y (list 3 4))
(setq x (nconc x y))
Otherwise, you'll get undefined behavior.
It cannot be stressed enough: Don't use the destructive functions as
your regular tools, and better forget completely about them in the
beginning. They should be used exactly like the non-destructive
versions, and only ever for optimizations. There's no other advantage in
using them, and you should only optimize those parts of your code that
you have measured before, so that you know that the optimization actually
buys you anything. Everything else is a waste of your (valuable!)
development time.
>> [...] As mapcan iterates over the input list, it generates an
>> intermediate list for the result, and that list is destructively changed.
>
> Hm. So the result of
> CL-USER> (nconc '(1 2) '(3 4))
> (1 2 3 4)
> is intermediate as well, right?
It should be regarded as intermediate, and needs to be bound or assigned
to something if you want to get hold of it.
>> However, mapcan is somewhat old-fashioned. It's especially odd that
>> the filtering function has to return the elements wrapped in lists.
>>
>> It's clearer (IMHO) to use, for example, the LOOP macro instead:
>>
>> (loop for x in '(a 2 b c 3 4 d 5)
>>       when (numberp x)
>>       collect x)
>
> Ok. Loop is 90 pages away ... But as i see in the index, there is no
> collect ...
[...]
> > It seems to me that you are using an "old-fashioned" book. It may be
> > better to go for something more recent, like Peter Seibel's "Practical
> > Common Lisp", David Lamkins's "Successful Lisp" or Peter Norvig's
> > "Paradigms of Artificial Intelligence Programming".
>
> The reason why i read this book ("Programmieren in Common Lisp" O.
> Mayer) is, that I need a basic vocabulary build up in my mother language
> (german) to understand english computer texts (i am no computer science
> student, just a hobby coder in c/c++ tcl/tk). Otherwise things like
> "side-effect", "slot" ... would have no/wrong meaning to me. That is the
> main reason why somehow can not read pcl now.
OK, I have checked that book at amazon.de, and it seems to be indeed
somewhat old-fashioned. There are seemingly not enough good books about
Common Lisp in German. I am aware of a German translation of Paul
Graham's "ANSI Common Lisp" - maybe you can find a copy at ebay.
Somewhat should translate Peter's book into German... ;)
Viele Gruesse,
Pascal
Pascal Costanza wrote:
> Bernd Schmitt wrote:
>> Pascal Costanza wrote:
>>> Bernd Schmitt wrote:
>> CL-USER> (setq x '(1 2) y '(3 4))
>> (3 4)
>> CL-USER> (nconc x y)
>> (1 2 3 4)
>> CL-USER> x
>> (1 2 3 4)
>>
>> So, when do I know, whether a variable can be used to catch the result
>> of a destructive function (or was my example implementation dependent)?
>
>
> It works just accidentally in your example.
Oh, this is an example of the book. Bad.
> And, additionally, don't use nconc on literal lists, as you did in your
> example. Instead, generate the lists at runtime:
>
> (setq x (list 1 2) y (list 3 4))
> (setq x (nconc x y))
>
> Otherwise, you'll get undefined behavior.
Ok, I will do so (even if I do not understand the difference between '(1
2) and (list 1 2) right now).
> It cannot be stressed enough: Don't use the destructive functions as
> your regular tools, and better forget completely about them in the
> beginning. They should be used exactly like the non-destructive
> versions, and only ever for optimizations. There's no other advantage in
> using them, and you should only optimize those parts of your code that
> you have measured before so that you know that optimizations actually
> buys you anything. Everything else is a waste of your (valuable!)
> development time.
Ok. I just finished reading the chapter. Thank you for this hint, so I
will know that they exist if I need to speed up parts of my program, but
I will only use non-destructive functions for first implementations.
>> [... ("Programmieren in Common Lisp" (O. Mayer) ...]
> OK, I have checked that book at amazon.de, and it seems to be indeed
> somewhat old-fashioned. There are seemingly not enough good books about
> Common Lisp in German. I am aware of a German translation of Paul
> Grahamn's "ANSI Common Lisp" - maybe you can find a copy at ebay.
A very valuable tip, thank you.
I tried; neither amazon nor ebay has it - I will try harder (the
remarks on amazon: "perfect") ...
> Somewhat should translate Peter's book into German... ;)
That would be it!
Thanks,
Bernd
Bernd Schmitt wrote:
> Ok, I will do so (even if I do not understand the difference between '(1
> 2) and (list 1 2) right now).
All programming languages have literal values. It's so common that it's
hard to understand why that's something special. But it really is.
For example, when you bind 42 to a variable, the "42" is a value that
you have literally written as part of your source code:
(setq x 42)
Whereas when you increment x afterwards you will have a number stored in
x that you have never written down:
(setq x (+ x 1))
x => 43
The latter output ("43") was not part of your program source code.
Lisp is special in many ways, including the fact that anything can be
made part of some source code (as Kent already explained). When you type
in '(1 2), the parser (called reader in Lisp terminology) parses the
list consisting of the elements 1 and 2, and makes it part of your
source code. The quote ("'") takes care that this will be treated as a
literal datum in your program.
Now think about it: Assume you would change the contents of that list at
runtime - this would theoretically mean that you would change the source
code, since that list is part of that source code! That doesn't make
sense, so ANSI Common Lisp states that the result is undefined when you
try to do this.
On the other hand, list is a function that takes some elements and
creates a list at runtime. Since that list is not part of some source
code, it is perfectly legal to change its contents. So, as an analogue,
(setq l '(1 2)) is similar to (setq x 42) whereas (setq l (list 1 2)) is
similar to (setq x (+ x 1)), that is, the former value is taken from the
source code whereas the latter value is generated at runtime.
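The practical rule falls out directly (a sketch with my own binding
names):

```lisp
;; '(1 2) is a literal: part of the source, undefined to modify.
;; (LIST 1 2) is built at run time: yours to modify freely.
(let ((fresh (list 1 2)))   ; analogous to (+ x 1): created at run time
  (setf (first fresh) 42)   ; perfectly legal on a fresh list
  fresh)                    ; => (42 2)
```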
I hope this helps.
Pascal
Pascal Costanza wrote:
> Bernd Schmitt wrote:
> On the other hand, list is a function that takes some elements and
> creates a list at runtime. Since that list is not part of some source
> code, it is perfectly legal to change its contents. So, as an analogue,
> (setq l '(1 2)) is similar to (setq x 42) whereas (setq l (list 1 2)) is
> similar to (setq x (+ x 1)), that is, the former value is taken from the
> source code whereas the latter value is generated at runtime.
>
> I hope this helps.
It does. Thank you very much.
Now all questions I asked and those I had in mind are answered and I
/think/ I understand those answers.
Very nice.
I hope that I can finish this book ASAP in order to read & understand PCL.
Thanks,
Bernd
Pascal Costanza <··@p-cos.net> writes:
> Somewhat should translate Peter's book into German... ;)
IIUC, you speak some German...
Lemonodor-fame is but a translation away! Just send your proposal to
Apress and Peter.
--
My .sig file sucks. Can anyone recommend a better one?
David Steuber wrote:
> Pascal Costanza <··@p-cos.net> writes:
>
>>Somewhat should translate Peter's book into German... ;)
That should, of course, be "someone"...
>
> IIUC, you speak some German...
>
> Lemonodor-fame is but a translation away! Just send your proposal to
> Apress and Peter.
;) Yep, German is my native language, but no, I don't have the time for
that...
Pascal
Pascal Costanza wrote:
> It cannot be stressed enough: Don't use the destructive functions as
> your regular tools, and better forget completely about them in the
> beginning. They should be used exactly like the non-destructive
> versions, and only ever for optimizations. There's no other advantage in
> using them, and you should only optimize those parts of your code that
> you have measured before so that you know that optimizations actually
> buys you anything. Everything else is a waste of your (valuable!)
> development time.
Rubbish. That is a defense of ignorance, and a recipe for newbies
writing code so painfully slow they abandon Lisp.
/Always/ use the destructive version if one can get away with it.
Understanding when one can get away with it is not so hard, and requires
no more than the same understanding one needs anyway to program in Lisp.
We should place a Graham Tax on remove, append, and their ilk.
--
Kenny
Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film
"I've wrestled with reality for 35 years, Doctor, and I'm happy to state
I finally won out over it."
Elwood P. Dowd, "Harvey", 1950
Kenny Tilton wrote:
>
> Pascal Costanza wrote:
>
>> It cannot be stressed enough: Don't use the destructive functions as
>> your regular tools, and better forget completely about them in the
>> beginning. They should be used exactly like the non-destructive
>> versions, and only ever for optimizations. There's no other advantage
>> in using them, and you should only optimize those parts of your code
>> that you have measured before so that you know that optimizations
>> actually buys you anything. Everything else is a waste of your
>> (valuable!) development time.
>
> Rubbish. That is a defense of ignorance, and a recipe for newbies
> writing code so painfully slow they abandon Lisp.
>
> /Always/ use the destructive version if one can get away with it.
> Understanding when one can get away with it is not so hard, and requires
> no more than the same understanding one needs anyway to program in Lisp.
Code doesn't get automagically faster by replacing non-destructive
functions with destructive ones. Consider append vs. nconc: both
probably suck when it would actually be better to rearrange the code
so that you make only one pass through your data.
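A sketch of that contrast (the function names are mine): growing a
result with NCONC inside a loop re-walks the accumulator on every
iteration, while a single COLLECT pass does not:

```lisp
;; O(n^2): each NCONC walks to the end of ACC before splicing.
(defun evens-slow (xs)
  (let ((acc '()))
    (dolist (x xs acc)
      (when (evenp x)
        (setq acc (nconc acc (list x)))))))

;; O(n): LOOP's COLLECT keeps a tail pointer internally.
(defun evens-fast (xs)
  (loop for x in xs when (evenp x) collect x))
```

Both return the same list; only the second scales.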
Furthermore, the majority of the destructive functions are defined on
conses/lists. It's better to get the whole picture: Lisp isn't
"just" about list processing, but also has other data structures that
are probably more important for creating efficient programs than
destructive functions.
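One example of what those other data structures buy you (a toy word
counter of my own): hash tables are updated in place as a matter of
course, with no special naming convention attached:

```lisp
;; Counting occurrences with a hash table: destructive update is the
;; normal, unremarkable mode of use here.
(let ((counts (make-hash-table :test 'eq)))
  (dolist (w '(a b a c a))
    (incf (gethash w counts 0)))
  (gethash 'a counts))   ; => 3 (and a second value, T)
```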
Pascal
Pascal Costanza <··@p-cos.net> writes:
> Furthermore, the majority of the destructive functions are defined on
> conses/lists. It's better to get the whole picture that Lisp isn't
> "just" about list processing, but also has other data structures that
> are probably more important for creating efficient programs than
> destructive functions.
I'm not clear on what observation you're making here.
Do you mean "the majority of functions that are ever called destructive
are among this set" or do you actually mean there are more ways to (sigh)
"destroy" (i.e., "modify") a cons/list than there are other data structures.
Certainly there are many destructive operations on other data structures,
and they are quite important. How many people seriously expect array
operations to be non-destructive, yet we do not call AREF by some goofy
name like AREF! nor even NAREF.
(The history of "N" for destruction is a pretty silly story itself. At
least Interlisp had the good sense to use "D".)
Kent M Pitman <······@nhplace.com> writes:
> (The history of "N" for destruction is a pretty silly story itself. At
> least Interlisp had the good sense to use "D".)
Please share.
-Tom.
--
Thomas A. Russ, USC/Information Sciences Institute
Kent M Pitman <······@nhplace.com> wrote:
> Certainly there are many destructive operations on other data structures,
> and they are quite important. How many people seriously expect array
> operations to be non-destructive, yet we do not call AREF by some goofy
> name like AREF! nor even NAREF.
For exactly that reason, methinks. I do not seriously expect array
operations to be non-destructive, but I *do* intuitively expect list
operations to be non-destructive. The other difference is that
destructive list operators destroy the structure of the list, not just
the data inside. This is a fuzzy concept obviously, since a CDR pointer
is "data" just as much as an atom or some class object would be, but
hopefully you get my point about intuition.
Also, unless I'm missing something, Aref *itself* isn't destructive, it
just offers the occasion of destruction by being a setf-able place. If
you only read the aref, nothing is destroyed. I always expect setf to
be destructive.
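A minimal sketch of that distinction, using nothing beyond standard
Common Lisp:

```lisp
(let ((v (vector 1 2 3)))
  (aref v 0)            ; pure read: nothing is modified
  (setf (aref v 0) 99)  ; the mutation comes from SETF, not AREF
  v)                    ; => #(99 2 3)
```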
> (The history of "N" for destruction is a pretty silly story itself. At
> least Interlisp had the good sense to use "D".)
I've always wondered where the 'n' comes from.
Michael
Kent M Pitman <······@nhplace.com> writes:
> Pascal Costanza <··@p-cos.net> writes:
>
>> Furthermore, the majority of the destructive functions are defined on
>> conses/lists. It's better to get the whole picture that Lisp isn't
>> "just" about list processing, but also has other data structures that
>> are probably more important for creating efficient programs than
>> destructive functions.
>
> I'm not clear on what observation you're making here.
>
> Do you mean "the majority of functions that are ever called destructive
> are among this set" or do you actually mean there are more ways to (sigh)
> "destroy" (i.e., "modify") a cons/list than there are other data structures.
I'm not Pascal, but my observation is that many people use
"destructive" to describe operations that may leave one or more of the
arguments in an unusable state, i.e. destroyed. And by this meaning
of "destructive", most of the functions so called do indeed operate on
conses.
> Certainly there are many destructive operations on other data structures,
> and they are quite important. How many people seriously expect array
> operations to be non-destructive, yet we do not call AREF by some goofy
> name like AREF! nor even NAREF.
So I think it's a misuse of the term "destructive" to use it to
describe (SETF AREF) (what I assume you meant) because it doesn't
"destroy" it's argument--it merely changes it's state. Of course, as
you know, that does "destroy" it as far as being a functional data
type but, as you also of course know, Common Lisp is not a purely
functional language so we don't get all worked up about it. (Also the
SETF is a pretty good clue that state is being mutated.)
I do recognize that there are folks (Schemers often) who use
"destructive" as a synonym for "has side effects". Which is a sign of
their functional bias. In an attempt to avoid this confusion I
introduced a new (as far as I know) term in my book, "recycling" to
refer to what *I* would mean by "destructive" if I used the
term--operations that may leave their arguments in an unusable
state. I picked this term because most of the "recylcing" ops are the
ones that recycle the cons cells in their argument or arguments in
order to avoid creating excessive garbage. (Of course nothing is that
simple because some "recycling" operations, such as NCONC, while
recycling their arguments also have reliable side effects and can be
used purely for those side effects.
> (The history of "N" for destruction is a pretty silly story itself. At
> least Interlisp had the good sense to use "D".)
Didn't it stand for "Non-consing". That at least gets at the
distinction that seems to me to be the key one.
-Peter
--
Peter Seibel * ·····@gigamonkeys.com
Gigamonkeys Consulting * http://www.gigamonkeys.com/
Practical Common Lisp * http://www.gigamonkeys.com/book/
Kent M Pitman wrote:
> Pascal Costanza <··@p-cos.net> writes:
>
>>Furthermore, the majority of the destructive functions are defined on
>>conses/lists. It's better to get the whole picture that Lisp isn't
>>"just" about list processing, but also has other data structures that
>>are probably more important for creating efficient programs than
>>destructive functions.
>
> I'm not clear on what observation you're making here.
The observation is that the OP is apparently reading a book about
Common Lisp that makes the mistake of giving the impression that Lisp
is indeed mostly about list processing, and so discusses technical
details of list processing functions too early--details that are
probably not as relevant as the stuff that is discussed later, or that
is maybe even left out completely.
> Do you mean "the majority of functions that are ever called destructive
> are among this set" or do you actually mean there are more ways to (sigh)
> "destroy" (i.e., "modify") a cons/list than there are other data structures.
I mean "the majority of functions that are described as destructive
versions of otherwise non-destructive counterparts that are defined in
some pre-ANSI dialect of Common Lisp, and that typically begin with an
N". ;)
Pascal
Pascal Costanza wrote:
> Kenny Tilton wrote:
>
>>
>> Pascal Costanza wrote:
>>
>>> It cannot be stressed enough: Don't use the destructive functions as
>>> your regular tools, and better forget completely about them in the
>>> beginning. They should be used exactly like the non-destructive
>>> versions, and only ever for optimizations. There's no other advantage
>>> in using them, and you should only optimize those parts of your code
>>> that you have measured before so that you know that optimizations
>>> actually buys you anything. Everything else is a waste of your
>>> (valuable!) development time.
>>
>>
>> Rubbish. That is a defense of ignorance, and a recipe for newbies
>> writing code so painfully slow they abandon Lisp.
>>
>> /Always/ use the destructive version if one can get away with it.
>> Understanding when one can get away with it is not so hard, and
>> requires no more than the same understanding one needs anyway to
>> program in Lisp.
>
>
> Code doesn't get automagically faster by replacing non-destructive
> functions with destructive ones.
Uh, yeah, actually it does. You are simply defending ignorance. In any
given situation I can safely use delete or I cannot. If I can use
delete, then using remove conses for no better reason than to save
me from thinking. Or, in the case of newbies, from learning Lisp.
btw, this is not about optimization. Optimization is doing extra work
and maybe even obfuscating things a little to get better performance
than is possible with straightforward code. There is nothing
un-straightforward about destructive functions.
Except I am starting to think /you/ are not sure when destructive
functions can be safely used. Hey, good question. Do /you/ follow your
own recommended practice of always using non-destructive functions? Is
the only time you use delete when you have a performance problem and are
trying to fix it?
--
Kenny
Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film
"I've wrestled with reality for 35 years, Doctor, and I'm happy to state
I finally won out over it."
Elwood P. Dowd, "Harvey", 1950
Kenny Tilton <·······@nyc.rr.com> writes:
> Pascal Costanza wrote:
>> Kenny Tilton wrote:
>>
>>>
>>> Pascal Costanza wrote:
>>>
>>>> It cannot be stressed enough: Don't use the destructive functions
>>>> as your regular tools, and better forget completely about them in
>>>> the beginning. They should be used exactly like the
>>>> non-destructive versions, and only ever for optimizations. There's
>>>> no other advantage in using them, and you should only optimize
>>>> those parts of your code that you have measured before so that you
>>>> know that optimizations actually buys you anything. Everything
>>>> else is a waste of your (valuable!) development time.
>>>
>>>
>>> Rubbish. That is a defense of ignorance, and a recipe for newbies
>>> writing code so painfully slow they abandon Lisp.
>>>
>>> /Always/ use the destructive version if one can get away with
>>> it. Understanding when one can get away with it is not so hard, and
>>> requires no more than the same understanding one needs anyway to
>>> program in Lisp.
>> Code doesn't get automagically faster by replacing non-destructive
>> functions with destructive ones.
>
> Uh, yeah, actually it does. You are simply defending ignorance. In any
> given situation I can safely use delete or I cannnot. If I can use
> delete, then using remove conses for no better reason other than to
> save me from thinking. Or, in the case of newbies, from learning Lisp.
Well it's possible that using a destructive (or, as I prefer,
"recycling") function will slow things down--by recycling old conses
you may move them out of the youngest generation in a generational
garbage collector where they could have been collected much more
efficiently than they will now. Or so I understand but IANAGCI.
At any rate, one interesting (to me anyway) stat I came across while
working on my book was the use of recycling functions in CLOCC: I
examined all the uses of recycling functions in CLOCC and found that
the PUSH, PUSH, NREVERSE idiom accounted for nearly half of all uses
of recycling functions and the next 30% were cases of immediately
reassigning a value back to the place containing the potentially
recycled value such as:
(setf foo (delete nil foo))
For whatever that's worth. Of course the real savings of PUSH/NREVERSE
comes from the PUSH not the NREVERSE. (I mean as opposed to doing
something awful like we've seen so many newbies do such as:
(do ((x ())) ((whatever) x) (setf x (append x (list (new-thing)))))
That is, the Big-O complexity of PUSH, PUSH, NREVERSE would be the same
if you used REVERSE instead of NREVERSE.
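For concreteness, a sketch of the two idioms just described (SQUARES
and DROP-NILS are invented names): PUSH/NREVERSE is safe because the
list being reversed in place was freshly consed inside the function,
and the second pattern immediately stores the possibly-recycled result
back into the place that held the argument.

```lisp
;; PUSH builds the result backwards; NREVERSE then reverses it in
;; place -- safe here because ACC is freshly consed and unshared.
(defun squares (n)
  (let ((acc '()))
    (dotimes (i n (nreverse acc))
      (push (* i i) acc))))
;; (squares 4) => (0 1 4 9)

;; The (setf foo (delete nil foo)) idiom: reassign the recycled
;; result back to the place that held the argument.
(defun drop-nils (items)
  (setf items (delete nil items)) ; note: ITEMS may share structure
  items)                          ; with the caller's list
```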
-Peter
Peter Seibel wrote:
> Kenny Tilton <·······@nyc.rr.com> writes:
> > Uh, yeah, actually it does. You are simply defending ignorance. In any
> > given situation I can safely use delete or I cannnot. If I can use
> > delete, then using remove conses for no better reason other than to
> > save me from thinking.
Saving me from thinking is always a good thing.
Structure-sharing is never on the feature list. Certain cases are
performance enablers, but those are the only ones worth thought, at
least for those of us who are overcommitted.
> Well it's possible that using a destructive (or, as I prefer,
> "recycling") function will slow things down--by recycling old conses
> you may move them out of the youngest generation in a generational
> garbage collector where they could have been collected much more
> efficiently than they will now. Or so I understand but IANAGCI.
A copy can be more cache-friendly, which is often more important.
On reasonable-performance systems, floating-point cosine is much
cheaper than an L2 cache miss, so if you're optimizing for operation
count, you're not optimizing for performance. It's the software analog
to "it's not the gates, it's the wires".
······@earthlink.net wrote:
> It's the software analog
> to "it's not the gates, it's the wires".
I always thought it was the ballmer.
--
My ideal for the future is to develop a filesystem remote interface
(a la Plan 9) and then have it implemented across the Internet as
the standard rather than HTML. That would be ultimate cool.
Ken Thompson
Peter Seibel wrote:
> Well it's possible that using a destructive (or, as I prefer,
> "recycling") function will slow things down--by recycling old conses
> you may move them out of the youngest generation in a generational
> garbage collector where they could have been collected much more
> efficiently than they will now. Or so I understand but IANAGCI.
I would think so too. BTW, is there a good comparison of the different
Lisps as to which GC techniques they use (just curious)? IIRC, CLisp
has a generational GC and SBCL is working on one.
> At any rate, one interesting (to me anyway) stat I came across while
> working on my book was the use of recycling functions in CLOCC: I
> examined all the uses of recycling functions in CLOCC and found that
> the PUSH, PUSH, NREVERSE idiom accounted for nearly half of all uses
I'd call PUSH more imperative than recycling. It changes the variable
binding, but not anything on the heap.
> That is the Big-O complexity of PUSH, PUSH, NREVERSE would be the same
> if you used REVERSE instead of NREVERSE.
Sure, but you create a new spine for the list (which may not be bad at
all if cons is fast and garbage cheap).
Peter Seibel <·····@gigamonkeys.com> writes:
> Kenny Tilton <·······@nyc.rr.com> writes:
>
> > Pascal Costanza wrote:
> >> Kenny Tilton wrote:
> >>
> >>>
> >>> Pascal Costanza wrote:
> >>>
> >>>> It cannot be stressed enough: Don't use the destructive functions
> >>>> as your regular tools, and better forget completely about them in
> >>>> the beginning. They should be used exactly like the
> >>>> non-destructive versions, and only ever for optimizations. There's
> >>>> no other advantage in using them, and you should only optimize
> >>>> those parts of your code that you have measured before so that you
> >>>> know that optimizations actually buys you anything. Everything
> >>>> else is a waste of your (valuable!) development time.
> >>>
> >>>
> >>> Rubbish. That is a defense of ignorance, and a recipe for newbies
> >>> writing code so painfully slow they abandon Lisp.
> >>>
> >>> /Always/ use the destructive version if one can get away with
> >>> it. Understanding when one can get away with it is not so hard, and
> >>> requires no more than the same understanding one needs anyway to
> >>> program in Lisp.
> >> Code doesn't get automagically faster by replacing non-destructive
> >> functions with destructive ones.
> >
> > Uh, yeah, actually it does. You are simply defending ignorance. In any
> > given situation I can safely use delete or I cannnot. If I can use
> > delete, then using remove conses for no better reason other than to
> > save me from thinking. Or, in the case of newbies, from learning Lisp.
>
> Well it's possible that using a destructive (or, as I prefer,
> "recycling") function will slow things down
I've actually seen this in the wild - though I never investigated it
enough to see exactly why it happened in the particular set of cases
involved. This concerned a use of reverse instead of nreverse which
could have been safely used. Changing to nreverse actually resulted
in slightly _longer_ runtimes. This seemed a bit puzzling, but
fooling around with the code a bit never changed this slight
differential. Implementation was Allegro 6.0.
I suppose what this really highlights is the old saw about never
optimizing by intuition alone: use profiling to pin things down
when it's important.
/Jon
--
'j' - a n t h o n y at romeo/charley/november com
jayessay wrote:
> I've actually seen this in the wild - though I never investigated it
> enough to see exactly why it happened in the particular set of cases
> involved. This concerned a use of reverse instead of nreverse which
> could have been safely used. Changing to nreverse actually resulted
> in slightly _longer_ runtimes. This seemed a bit puzzling, but
> fooling around with the code a bit never changed this slight
> differential. Implementation was Allegro 6.0.
If the lisp has a generational garbage collector, destructive
operations that may introduce links from older to newer generations
will require some kind of write barrier. IIRC, Allegro implements
the write barrier with a (highly optimized) subroutine call.
I imagine one could optimize this in various ways, for example
if one knew that the cons cells in the list being reversed
were allocated in the order they appear in the list (so if one
is in newspace, so must all the following ones.)
Paul
Christophe Rhodes wrote:
> Kenny Tilton <·······@nyc.rr.com> writes:
>
>
>>Pascal Costanza wrote:
>>
>>>Code doesn't get automagically faster by replacing non-destructive
>>>functions with destructive ones.
>>
>>Uh, yeah, actually it does.
>
>
> Uh, in general, actually it doesn't.
Uh, generally does not contradict actually, does it?
Anyway, this is just another argument for ignorance. Golly, maybe an
efficient Lisp compiler will save my sorry ass from having to understand
what I am doing!
So your advice to newbies is to avoid learning when it is safe to delete
by instead learning when the compiler and GC mechanism will avoid excess
consing? Portably?
When is that, btw? I have been programming Lisp heads-down for ten years
and I do not know.
Kenny Tilton <·······@nyc.rr.com> writes:
> Christophe Rhodes wrote:
>
>> Kenny Tilton <·······@nyc.rr.com> writes:
>>
>>>Pascal Costanza wrote:
>>>
>>>>Code doesn't get automagically faster by replacing non-destructive
>>>>functions with destructive ones.
>>>
>>> Uh, yeah, actually it does.
>> Uh, in general, actually it doesn't.
>
> Uh, generally does not contradict actually, does it?
My assertion, that of the negation of the universal quantifier, is in
direct opposition to yours, the universal quantifier. Saying that
code automagically becomes faster by replacing non-destructive
functions with destructive ones is just silly, but sadly I can't give
you the benefit of the doubt because your persona here seems to
involve a lot of silliness. (If you were saying "some code in some
circumstances gets faster by replacing non-destructive functions with
destructive ones", then yes, that's true.)
> So your advice to newbies is to avoid learning when it is safe to
> delete by instead learning when the compiler and GC mechanism will
> avoid excess consing? Portably?
When did this become about _my_ advice to newbies? Unless you're
casting yourself in that role, in which case my advice is something
like "don't make general assertions that aren't true". For what it's
worth, I would suggest that the newcomer learn the consing variants of
all the operators until they have a thorough grasp of the underlying
representation, at which point the semantics of the non-consing /
destructive variants will be obvious. (Maybe not least because the
newbie is sure to ignore my suggestion and experiment anyway, and be
bitten by unexpected structure-sharing or similar. Making mistakes is
still the best way of learning, as long as those mistakes don't kill
you.)
The argument you level against the "compiler and GC mechanism" is one
that has been levelled at garbage-collected memory management
strategies before; this doesn't stop GC being a generally good idea,
as long as the facility is there to do manual memory management for
applications which need it.
Christophe
Kenny Tilton <·······@nyc.rr.com> writes:
> Except I am starting to think /you/ are not sure when destructive
> functions can be safely used. Hey, good question. Do /you/ follow your
> own recommended practice of always using non-destructive functions? Is
> the only time you use delete when you have a performance problem and
> are trying to fix it?
Sometimes I've had shared structure where I did not expect it. This
is what makes destructive operators dangerous.
Of course that does not stop me from using the destructive forms of
operators when I am sure there is no shared structure involved.
--
My .sig file sucks. Can anyone recommend a better one?
In article <··············@david-steuber.com>,
David Steuber <·····@david-steuber.com> wrote:
> Kenny Tilton <·······@nyc.rr.com> writes:
>
> > Except I am starting to think /you/ are not sure when destructive
> > functions can be safely used. Hey, good question. Do /you/ follow your
> > own recommended practice of always using non-destructive functions? Is
> > the only time you use delete when you have a performance problem and
> > are trying to fix it?
>
> Sometimes I've had shared structure where I did not expect it. This
> is what makes destructive operators dangerous.
>
> Of course that does not stop me from using the destructive forms of
> operators when I am sure there is no shared structure involved.
And novices typically are terrible at figuring out when
shared structure is involved.
It's even worse when you have a Lisp that lets you modify
quoted lists. Students in my Lisp classes can never figure
out why a function works the first time but not the second,
and, 90% of the time, it's because they're modifying a quoted
list in the code.
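A sketch of the trap being described (BROKEN is an invented name, and
the exact misbehavior is implementation-dependent, since modifying
literal data has undefined consequences):

```lisp
;; '(a b c) is literal data; NCONC may modify it in place, so the
;; "fresh" list isn't fresh on the second call.  A possible session:
;;   (broken) => (A B C D)
;;   (broken) => (A B C D D)   ; the literal itself grew!
(defun broken ()
  (nconc '(a b c) (list 'd)))

;; Fix: cons fresh structure on every call.
(defun fixed ()
  (nconc (list 'a 'b 'c) (list 'd)))
```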
Chris Riesbeck wrote:
> In article <··············@david-steuber.com>,
> David Steuber <·····@david-steuber.com> wrote:
>
>
>>Kenny Tilton <·······@nyc.rr.com> writes:
>>
>>
>>>Except I am starting to think /you/ are not sure when destructive
>>>functions can be safely used. Hey, good question. Do /you/ follow your
>>>own recommended practice of always using non-destructive functions? Is
>>>the only time you use delete when you have a performance problem and
>>>are trying to fix it?
>>
>>Sometimes I've had shared structure where I did not expect it. This
>>is what makes destructive operators dangerous.
>>
>>Of course that does not stop me from using the destructive forms of
>>operators when I am sure there is no shared structure involved.
>
>
> And novices typically are terrible at figuring out when
> shared structure is involved.
The problem is that even an interesting homework problem can lead to
exponential consing explosions, as we sometimes see here on c.l.l. So I
think the solution is to learn when destructives are cool. How about a
simple lesson with one example of a consing explosion (in as simple a
bit of code as possible) and one or more examples of mistaken destruction?
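One minimal instance of such a lesson, as a sketch: the
quadratic-consing version copies the whole accumulated list on every
step, while the PUSH/NREVERSE version conses each cell exactly once.

```lisp
;; O(n^2) conses: each APPEND recopies everything collected so far.
(defun iota/slow (n)
  (let ((acc '()))
    (dotimes (i n acc)
      (setf acc (append acc (list i))))))

;; O(n) conses: build backwards, then reverse the fresh list in place.
(defun iota/fast (n)
  (let ((acc '()))
    (dotimes (i n (nreverse acc))
      (push i acc))))

;; Both return (0 1 ... n-1); only the first explodes as N grows.
```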
Then again, I happen to think it is important to understand list
programming when using Lisp, so maybe I am a dinosaur.
>
> It's even worse when you have a Lisp that lets you modify
> quoted lists. Students in my Lisp classes can never figure
> out why a function works the first time but not the second,
> and, 90% of the time, it's because they're modifying a quoted
> list in the code.
I am on record against '(a b c) in newbie examples, though I understand
the motivation (not wanting to offer (list 'a 'b 'c) instead).
Kenny Tilton wrote:
> Chris Riesbeck wrote:
> > It's even worse when you have a Lisp that lets you modify
> > quoted lists. Students in my Lisp classes can never figure
> > out why a function works the first time but not the second,
> > and, 90% of the time, it's because they're modifying a quoted
> > list in the code.
>
> I am on record against '(a b c) in newby examples, though I understand
> the motivation (not wanting to offer instead (list 'a 'b 'c)).
Why not use COPY-TREE? I see it so rarely in examples.
Maybe there could be a nice read macro, like Marco Baringer mentioned.
http://groups.google.com/group/comp.lang.lisp/msg/110f2819b121039d?hl=en&
David Steuber wrote:
> Kenny Tilton <·······@nyc.rr.com> writes:
>
>
>>Except I am starting to think /you/ are not sure when destructive
>>functions can be safely used. Hey, good question. Do /you/ follow your
>>own recommended practice of always using non-destructive functions? Is
>>the only time you use delete when you have a performance problem and
>>are trying to fix it?
>
>
> Sometimes I've had shared structure where I did not expect it. This
> is what makes destructive operators dangerous.
"did not expect it"? Sounds like you went into a usage space I am not
recommending. There are times when it is impossible to hit shared
structure, because the code at hand either generated the structure or
because the application semantics intend a certain structure to be truly
shared, ie, after I whack it I positively want any and all other code to
see the whacked form.
>
> Of course that does not stop me from using the destructive forms of
> operators when I am sure there is no shared structure involved.
>
Thank you. :)
Kenny Tilton <·······@nyc.rr.com> writes:
> Pascal Costanza wrote:
> > Code doesn't get automagically faster by replacing non-destructive
> > functions with destructive ones.
>
> Uh, yeah, actually it does. You are simply defending ignorance. In any
> given situation I can safely use delete or I cannnot. If I can use
> delete, then using remove conses for no better reason other than to
> save me from thinking. Or, in the case of newbies, from learning Lisp.
Actually, I wrote some really ad hoc tests to poke at this claim and found
that it's sometimes one way and sometimes the other, probably
depending a lot on the implementation, the gc settings, the particulars
of the code, etc. Like everything else in performance tuning, the answer
is that you should probably
(a) write in a style that you personally consider clear
(b) carefully tune your actual program on actual examples after you're
sure it's correct, it has known-typical data, it has a known platform,
etc.
Premature adherence to one or the other code style on the belief it will
make you faster is probably bogus.
Moreover, and perhaps more importantly, prematurely believing that any
given program you write will ever be used in a program where its
performance matters is probably the single biggest waste of time I've
seen programmers do. Programs mostly get thrown away. Of the few
that don't, many have adequate performance with no tuning. Of the few
that don't from that tiny set, THOSE are the ones to which you should
be applying your tuning energies--on a particular platform, particular data,
etc.
However, I think it's pretty clearly wrong to think that consing doesn't
matter, so even if you start thinking that you want to do some style that
prefers non-destructive over destructive, I don't think that means you're
suddenly free of the cost of gc and consing. In both styles, you still
have to worry about storage wastage, about intermediate expression swell,
etc.
I'll try to post the example I was using later, though it's probably
easy enough to create your own. Mine is mostly interesting because the
same program shows different results (with respect to the above claim,
not just raw number differences) on different platforms.
Kent M Pitman <······@nhplace.com> writes:
> However, I think it's pretty clearly wrong to think that consing doesn't
> matter, so even if you start thinking that you want to do some style that
> prefers non-destructive over destructive, I don't think that means you're
> suddenly free of the cost of gc and consing. In both styles, you still
> have to worry about storage wastage, about intermediate expression swell,
> etc.
Consing doesn't matter as much as it used to. There's no longer a
need to watch your allocation like a hawk(inson).
Of course, you can always slow things down by *excessively* depending
on the GC.
Joe Marshall <···@ccs.neu.edu> wrote:
+---------------
| Kent M Pitman <······@nhplace.com> writes:
| > However, I think it's pretty clearly wrong to think that consing doesn't
| > matter, so even if you start thinking that you want to do some style that
| > prefers non-destructive over destructive, I don't think that means you're
| > suddenly free of the cost of gc and consing. In both styles, you still
| > have to worry about storage wastage, about intermediate expression swell,
| > etc.
|
| Consing doesn't matter as much as it used to. There's no longer a
| need to watch your allocation like a hawk(inson).
|
| Of course, you can always slow things down by *excessively* depending
| on the GC.
+---------------
Generational GCs that use Ungar's trick of a fixed-location "nursery"
typically find that the best nursery size is roughly the size of the
secondary cache [or in more recent CPUs, the tertiary cache or whichever
one is just before main memory!]. One assumes that the GC designer will
have done their job right, and initialized the size of the nursery
dynamically based on the cache size of the currently-running platform.
In which case, the best strategy for the user [meaning programmer]
is to try to arrange that the vast majority of intermediate storage
consed is garbage by the time a nursery-size worth's has been consed,
whereupon almost none of the data in the nursery is still "live" at
the next minor GC. Hence allocations & minor-GCs keep using the same,
cached(!) address range over & over...
In that context, note that "fresh" consing is often cheaper than
"mutating" recently-consed objects [the latter requiring a range
check to consider (and reject) adjusting the "remembered set"], but
the former will use up space in the nursery faster and trigger more
minor GCs. Hence which one is better "just depends"... as always. ;-}
-Rob
-----
Rob Warnock <····@rpw3.org>
627 26th Avenue <URL:http://rpw3.org/>
San Mateo, CA 94403 (650)572-2607
Kenny Tilton wrote:
>> Code doesn't get automagically faster by replacing non-destructive
>> functions with destructive ones.
>
> Uh, yeah, actually it does.
No, it doesn't. Program efficiency is hard to predict and can be highly
counter-intuitive. AFAICT, this is a well-known fact.
For example, destructive operations may invalidate caches while
non-destructive ones leave caches intact which can be beneficial.
> Except I am starting to think /you/ are not sure when destructive
> functions can be safely used. Hey, good question. Do /you/ follow your
> own recommended practice of always using non-destructive functions? Is
> the only time you use delete when you have a performance problem and are
> trying to fix it?
I try to use destructive list processing functions when I expect that
otherwise too much memory will be allocated. However, I typically first
try to rearrange the code so that several passes through the same data
are not necessary. I only use destructive functions when I am very
certain that it's safe to do so. When I am not sure - which is probably
more often than for you - I try not to waste too much time on the
issue and just use the non-destructive one. The (unexpected) overhead
will be somewhere else anyway.
In my code, list processing functions are typically used in macros.
Performance of the macro expansion is typically not that important in
compiled Common Lisp AFAICT.
Pascal
--
OOPSLA'05 tutorial on generic functions & the CLOS Metaobject Protocol
++++ see http://p-cos.net/oopsla05-tutorial.html for more details ++++
Pascal Costanza <··@p-cos.net> writes:
> In my code, list processing functions are typically used in
> macros. Performance of the macro expansion is typically not that
> important in compiled Common Lisp AFAICT.
One thing I really like about macros is that I don't have to worry
about the macro expansion time. I use compiler only Lisps so the
macro expansion code doesn't run at run time.
When I'm writing macros, I'm actually more concerned that I'm not
breaking some rule that causes a macro function to require knowledge
of the runtime environment (which it obviously can't have).
This has been a useful thread for me. I've always just assumed that
the non-consing/destructive forms were intended to be faster in all
cases where they were safe to use. Now that notion is broken I can
just not worry about that particular aspect when writing code. The
most effective optimization seems to come from the choice of data
structures and algorithms anyway, not micro-efficiency details.
I had some interesting experiences writing code for the Apress fractal
contest. I wish I took notes while I was doing it because there were
some things that made a big difference in speed. One was using a
single dimension array for data and displacing a two dimensional array
to that rather than the other way around. That may have been an SBCL
specific detail, but it only took one little change to make a big
improvement in the speed of the code.
Also sb-sprof is very useful.
--
My .sig file sucks. Can anyone recommend a better one?
David Steuber <·····@david-steuber.com> writes:
> Pascal Costanza <··@p-cos.net> writes:
>
> > In my code, list processing functions are typically used in
> > macros. Performance of the macro expansion is typically not that
> > important in compiled Common Lisp AFAICT.
>
> One thing I really like about macros is that I don't have to worry
> about the macro expansion time. I use compiler only Lisps so the
> macro expansion code doesn't run at run time.
Well, to every rule there is an exception.
Well, I _have_ seen situations where you do have to worry. It depends
on how "clever" your macros are trying to be. Simple macros that do
little more than tree-walk their input often take computation time
no worse than proportional to the size of the input, and this is fine.
But macros that "reason" about their input, performing recursive
macro expansions, making deductions, doing rule processing, and other
things can still need to worry.
The way I think of it is that compile time work is a "loop invariant"
being lifted out of an iteration. The iteration out of which it is
lifted is the repeated calling of the code at runtime. But the point
to which you lift the code is not a magic place where ordinary physics
doesn't apply. Rather, it's just a thing on which there is, relatively
speaking, less stress.
In iteration terms, something like:
(dotimes (i ...)
  (dotimes (j ...)
    (let ((k (pure-function-of i)))
      ...)))
is better written:
(dotimes (i ...)
  (let ((k (pure-function-of i)))
    (dotimes (j ...)
      ...)))
But that doesn't imply that PURE-FUNCTION-OF's runtime doesn't
matter in the new home. It still matters. It just matters less.
If you think of the inner dotimes as the repeated calling of the code
and the outer dotimes as the repeated compilation of the code, the
time it takes still matters. Just less.
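The hoisting above can be made concrete with a small runnable sketch (the
function names here are illustrative, not from the original post;
PURE-FUNCTION-OF stands in for any expensive side-effect-free computation).
Both versions compute the same result, but the hoisted one calls the
expensive function N times instead of N*M times:

```lisp
;; A stand-in for an expensive pure function of I alone.
(defun pure-function-of (i)
  (expt (+ i 1) 3))

(defun sum-naive (n m)
  ;; K is recomputed M times for each I, although it depends only on I.
  (let ((sum 0))
    (dotimes (i n sum)
      (dotimes (j m)
        (let ((k (pure-function-of i)))
          (incf sum (* k j)))))))

(defun sum-hoisted (n m)
  ;; The loop-invariant K is lifted out of the inner DOTIMES.
  (let ((sum 0))
    (dotimes (i n sum)
      (let ((k (pure-function-of i)))
        (dotimes (j m)
          (incf sum (* k j)))))))
```

As Kent says, the hoisted call still costs something; it just costs less
often.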
The CL cleanup issues AREF-1D and DECLARE-MACROS were a direct result
of my having run into a situation [the one described in AREF-1D of
porting LISTARRAY and FILLARRAY for Macsyma]. Out of 100,000 lines of
code in Macsyma taking about a half hour to compile on the Lisp
Machine in 1986 or so, I think each of these two functions (which were
only a few tens of lines long) took 5 minutes each (that is, these two
functions alone took 10 of the 30 minutes it took to compile the whole
system), partly because of the improper attention I paid to algorithmic
complexity issues and partly because some of the complexity issues were
not obvious to me for reasons explained in the writeup of DECLARE-MACROS.
(Let's just say it wasn't one of the smartest pieces of code I ever
wrote... clever? yeah, maybe. But not smart. You all probably know
what they say about how it's best not to try to be overly clever. It
comes back to haunt you. And it did me in this case.)
In effect, the number of macro expansions it took to macroexpand a
set of binding macros nested n deep in CLTL1 was, at minimum, 2^n
because you had to pre-expand the first form in the body to see if it
was a macro. If the result of the expansion was itself a macro, the
effect was grossly worse and the total work could become more like
m^n. It wasn't pretty.
It was possible to work around this problem by replacing
(dotimes (i ...) (dotimes (j ...) ...))
with
(dotimes (i ...) nil (dotimes (j ...) nil ...))
But it was also ugly and hard to explain to people what the
purpose of the NIL was.
And the macro I had written did the even more egregious thing of
iterating from 0 to ARRAY-RANK-LIMIT (which was only something like 7
in Symbolics Common Lisp, and which I had made the mistake of thinking
wouldn't be much bigger in other Lisps. In some, it turned out to be
in the thousands or higher...). So the effect I'm talking about was
bad in that implementation, but would have been just impossibly worse
in other implementations had it gotten that far.
So it was for this performance reason that we took out the ability of
a macro to expand to a declaration, losing some useful functionality,
but in order to not surprise people with awful compilation time.
Even so, though this particular situation resulted in changes to the
language, that can't be done in general. Since arbitrary code is run
at macroexpansion time, it's impossible to completely shield users from
the risk that a computation will take forever (or annoyingly long).
And so, users do have to care about macro compile time, because it DOES
take time to compile. Often not much. But sometimes a lot.
I'll go one farther, though, and spin it all a different way:
The really good thing about Lisp is that you CAN write macros of
enough complexity that you have to sometimes worry about efficiency.
Maybe most of the time, you don't have to watch very hard. But at
least you can, if you want, do a hard enough computation that you can
mire yourself. To me, it seems a worthwhile trade.
Pascal Costanza wrote:
> Kenny Tilton wrote:
>
>>> Code doesn't get automagically faster by replacing non-destructive
>>> functions with destructive ones.
>>
>>
>> Uh, yeah, actually it does.
>
>
> No, it doesn't. Program efficiency is hard to predict and can be highly
> counter-intuitive. AFAICT, this is a well-known fact.
Rubbish, nonsense, and crap. Let me help you all. Here is where I am
coming from. We often get newbies posting code here saying, Whoa, this
is slow. We find that an innocent looking bit of recursive code is
allocating ten million cons cells to get not much work done, thanks to
an exponential explosion of structure copying. We turn the code around
running a couple of orders of magnitude faster thx to one or more
straightforward changes and away goes a happy newby, who might have
ended up here thx to Costanza's Law. ("Thou shalt not use destructive
functions.")
OK, trust me, there may be edge cases where the effect on GC makes
destructive infinitesimally slower than non-destructive, but does anyone
think we will ever see a newby here with dead slow code which gets fixed
by substituting a non-destructive func for a destructive one? I await
the answer to this question eagerly, because it seems the
anti-destructive crowd is clinging to this argument as if it somehow
justifies Costanza's injunction to newbies never to use destructive
functions. Methinks it justifies nothing. Raise your hand if you ever
coded a non-destructive function thinking, "Gosh, this will run faster."
Raise your hand if you ever tuned code by changing a destructive
function to a non-destructive one. Can't wait for the results on this
one either.
As for efficiency being unpredictable, OK, now you are talking about
something completely different. But I have already addressed this silly
muddying of the waters so I will not bother again. (OK, a hint: no one
is talking about code optimization except you and others trying to
change the subject away from the need to learn about lists if you want
to program Lisp.)
Finally, let's talk about your latest bit of misinformation: lists are
not all that big a deal for Lisp programming. Perhaps you need a few
more years Lisp under your belt, because I think you still do not get
it. One of the things that makes Lisp superior to other languages is
precisely its orientation around lists, and its rich library of
list-manipulating functions. The surprise is that such a simple data
structure can be so powerful. It is. Newbies need to learn it inside and
out. Period. (Full stop for the yobbos.)
--
Kenny
Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film
"I've wrestled with reality for 35 years, Doctor, and I'm happy to state
I finally won out over it."
Elwood P. Dowd, "Harvey", 1950
Kenny Tilton wrote:
> it seems the
> anti-destructive crowd is clinging to this argument as if it somehow
> justifies Costanza's injunction to newbies never to use destructive
> functions.
I have never claimed that newbies should never use destructive
functions. I just said that other things are more important to learn first.
It's very unclear to me what your alternative suggestion is. Learn all
of Common Lisp at once in one day? Spend one year on the list processing
functions, and learn everything else later on?
Do you really think that nreconc is more important than defmethod? ;)
Pascal
--
OOPSLA'05 tutorial on generic functions & the CLOS Metaobject Protocol
++++ see http://p-cos.net/oopsla05-tutorial.html for more details ++++
Pascal Costanza wrote:
> It's very unclear to me what your alternative suggestion is. Learn all
> of Common Lisp at once in one day?
How is understanding linked lists "learning all of Common Lisp"? (I
know, you just do not want to give up.)
Spend one year on the list processing
> functions, and learn everything else later on?
>
> Do you really think that nreconc is more important than defmethod? ;)
Did it really take you a year to learn list processing functions? I had
a feeling something like this lay behind your silly advice to newbies.
I am reminded of J. Edgar Hoover, whose limo driver was not allowed to
make left turns because one time they got t-boned making a left turn
(driving on the right as we do in the US, one must cross a lane of
oncoming traffic to turn left). I guess the impact gave our fearless FBI
leader a scare he never got over. hmmm...
--
Kenny
Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film
"I've wrestled with reality for 35 years, Doctor, and I'm happy to state
I finally won out over it."
Elwood P. Dowd, "Harvey", 1950
Kenny Tilton <·······@nyc.rr.com> writes:
> Pascal Costanza wrote:
>
>> It's very unclear to me what your alternative suggestion is. Learn
>> all of Common Lisp at once in one day?
>
> How is understanding linked lists "learning all of Common Lisp"? (I
> know, you just do not want to give up.)
Learning about linked lists is straightforward (though still doesn't
need to be tackled in any serious way until, oh, around about Chapter
12 of a good Lisp book). Learning all the details of which functions
are recycling (destructive) *and* which functions can return shared
structure is not. Also, unless I'm thinking of different typical
examples than you, the typical exponentially exploding algorithms that
we see from newbies because they don't understand lists wouldn't be
solved by simply replacing a non-recycling function with its
recycling counterpart. I.e. this:
(loop until (whatever) do (setf my-list (nconc my-list (list (new-thing)))))
may cons less than this:
(loop until (whatever) do (setf my-list (append my-list (list (new-thing)))))
but the big-O complexity--which is the thing that's killing the
newbie--is the same for both. I'd say that either of the following
would be an appropriate fix (leaving out just using COLLECT in the LOOP):
(loop until (whatever) do (push (new-thing) my-list) finally (return (reverse my-list)))
or:
(loop until (whatever) do (push (new-thing) my-list) finally (return (nreverse my-list)))
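Peter's point can be checked directly. The following sketch (the
COLLECT-* names are mine, for illustration) wraps the three accumulation
styles as functions; all three build the same list, but only the
PUSH/NREVERSE version avoids re-walking the accumulated list on every
iteration:

```lisp
(defun collect-append (n)
  ;; Quadratic: APPEND copies MY-LIST on every iteration.
  (let ((my-list '()))
    (dotimes (i n my-list)
      (setf my-list (append my-list (list i))))))

(defun collect-nconc (n)
  ;; Less consing, but still quadratic: NCONC doesn't copy,
  ;; yet it must walk to the tail of MY-LIST each time.
  (let ((my-list '()))
    (dotimes (i n my-list)
      (setf my-list (nconc my-list (list i))))))

(defun collect-push (n)
  ;; Linear: PUSH is O(1); one NREVERSE at the end fixes the order.
  (let ((my-list '()))
    (dotimes (i n (nreverse my-list))
      (push i my-list))))
```

The NCONC in COLLECT-NCONC is safe here only because every cell in
MY-LIST was freshly consed inside the function.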
-Peter
--
Peter Seibel * ·····@gigamonkeys.com
Gigamonkeys Consulting * http://www.gigamonkeys.com/
Practical Common Lisp * http://www.gigamonkeys.com/book/
Peter Seibel wrote:
> Kenny Tilton <·······@nyc.rr.com> writes:
>
>
>>Pascal Costanza wrote:
>>
>>
>>>It's very unclear to me what your alternative suggestion is. Learn
>>>all of Common Lisp at once in one day?
>>
>>How is understanding linked lists "learning all of Common Lisp"? (I
>>know, you just do not want to give up.)
>
>
> Learning about linked lists is straightforward (though still doesn't
> need to be tackled in any serious way until, oh, around about Chapter
> 12 of a good Lisp book).
Only if you define a good Lisp book as "Practical Common Lisp" (there's
a fat pitch), in which the focus is on real-world tasks in specifically
today's real world. If, otoh, one were writing a book about Lisp being a
great language for programming any task in any world, the Unbearable
Powerfulness of native support for linked lists would be section one of
chapter one.
I mean, youse guys were at ILC 2005 where McCarthy explained what "lisp"
stands for, right?
And when it comes to greenspunning, can anyone name a feature more often
re-invented than list handling?
I guess you 21st century Lispniks look at list processing as an amusing
relic, safely ignored until Chapter 12 or Pascal's graduate-level
seminar on Advanced Lisp (where he is staying one chapter ahead of the
students). You'll learn, but not before crippling a generation of
newbies with your poor understanding of why you like the language you like.
Hey, any chance we can get &rest args presented in a vector instead of
one of those tricky list thingys?
--
Kenny
Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film
"I've wrestled with reality for 35 years, Doctor, and I'm happy to state
I finally won out over it."
Elwood P. Dowd, "Harvey", 1950
Kenny Tilton <·······@nyc.rr.com> writes:
> Peter Seibel wrote:
>> Kenny Tilton <·······@nyc.rr.com> writes:
>>
>>>Pascal Costanza wrote:
>>>
>>>
>>>>It's very unclear to me what your alternative suggestion is. Learn
>>>>all of Common Lisp at once in one day?
>>>
>>>How is understanding linked lists "learning all of Common Lisp"? (I
>>>know, you just do not want to give up.)
>> Learning about linked lists is straightforward (though still doesn't
>> need to be tackled in any serious way until, oh, around about Chapter
>> 12 of a good Lisp book).
>
> Only if you define a good Lisp book as "Practical Common Lisp"
> (there's a fat pitch), in which the focus is on real-world tasks in
> specifically today's real world. If, otoh, one were writing a book
> about Lisp being a great language for programming any task in any
> world, the Unbearable Powerfulness of native support for linked
> lists would be section one of chapter one.
Hmmm. Maybe someone can write that first chapter--I'd love to see
it. I just don't think that native support for linked lists, by
itself, is that big a deal and I'd be interested to see someone try
and make the case to a Lisp newbie in a way that is actually
compelling.
At the very least you also need the reader and printer. And it really
doesn't start to get interesting until you also have symbols as a
first class data type. And none of that is really interesting until
you've deeply immersed yourself in the idea that often the best way to
write a program is to write a language for writing the program in
(either using macros or a DSL that you compile to some other target
language, possibly not even Lisp.)
> And when it comes to greenspunning, can anyone name a feature more
> often re-invented than list handling?
Garbage collection (via ref counting, etc. That's where the "buggy",
"slow", and "poorly specified" part of Greenspun's comes in.)
Multi-methods and the ability to add new methods to existing classes
(both via the Visitor pattern).
Bignums.
>
> I guess you 21st century Lispniks look at list processing as an
> amusing relic, safely ignored until Chapter 12 or Pascal's
> graduate-level seminar on Advanced Lisp (where he is staying one
> chapter ahead of the students). You'll learn, but not before crippling
> a generation of newbies with your poor understanding of why you like
> the language you like.
Hmmm. What *do* you use lists for, outside of macros, that they're so
important to you.
-Peter
--
Peter Seibel * ·····@gigamonkeys.com
Gigamonkeys Consulting * http://www.gigamonkeys.com/
Practical Common Lisp * http://www.gigamonkeys.com/book/
+ Peter Seibel <·····@gigamonkeys.com>:
| Er, Gnus explosion. Sorry about that.
Gnus has a nice cancel feature, you know: gnus-summary-cancel-article
(C in summary mode).
--
* Harald Hanche-Olsen <URL:http://www.math.ntnu.no/~hanche/>
- Debating gives most of us much more psychological satisfaction
than thinking does: but it deprives us of whatever chance there is
of getting closer to the truth. -- C.P. Snow
Peter Seibel <·····@gigamonkeys.com> writes:
> Kenny Tilton <·······@nyc.rr.com> writes:
>
> > world, the Unbearable Powerfulness of native support for linked
> > lists would be section one of chapter one.
>
> Hmmm. Maybe someone can write that first chapter--I'd love to see
> it. I just don't think that native support for linked lists, by
> itself, is that big a deal and I'd be interested to see someone try
> and make the case to a Lisp newbie in a way that is actually
> compelling.
>
> At the very least you also need the reader and printer. And it really
> doesn't start to get interesting until you also have symbols as a
I would say that the key for list processing is true automatic memory
mgt (GC). That is the single most important thing that makes list use
_simple and natural_.
/Jon
> Hmmm. What *do* you use lists for, outside of macros, that they're so
> important to you.
Many things actually, but I think KT is using hyperbole above. As a
side note, if you are doing full traversals they are typically faster
than arrays since the next address is just a dereference.
/Jon
--
'j' - a n t h o n y at romeo/charley/november com
jayessay wrote:
> Many things actually, but I think KT is using hyperbole above. As a
> side note, if you are doing full traversals they are typically faster
> than arrays since the next address is just a dereference.
Uhm, why? On the architectures I know it doesn't matter if you
dereference an address (the cdr), or if you dereference your base
register + 4 (and probably increment it). The incrementation can take
place for free, if your CPU can do more than one operation at once, and
the next array value is almost guaranteed to already be in the cache.
--
My ideal for the future is to develop a filesystem remote interface
(a la Plan 9) and then have it implemented across the Internet as
the standard rather than HTML. That would be ultimate cool.
Ken Thompson
Ulrich Hobelmann <···········@web.de> writes:
> jayessay wrote:
> > Many things actually, but I think KT is using hyperbole above. As a
> > side note, if you are doing full traversals they are typically faster
> > than arrays since the next address is just a dereference.
>
> Uhm, why?
The incrementing, but
> On the architectures I know it doesn't matter if you dereference an
> address (the cdr), or if you dereference your base register + 4 (and
> probably increment it). The incrementation can take place for free,
> if your CPU can do more than one operation at once, and the next
> array value is almost guaranteed to already be in the cache.
certainly sounds plausible. OTOH, I've seen this difference for sure
on SPARC and some X86 stuff. So, like most "performance issues", it
probably "depends"...
/Jon
--
'j' - a n t h o n y at romeo/charley/november com
Kenny Tilton wrote:
> And when it comes to greenspunning, can anyone name a feature more often
> re-invented than list handling?
This wasn't invented for Lisp, but John McCarthy got the idea from IPL.
> I guess you 21st century Lispniks look at list processing as an amusing
> relic, safely ignored until Chapter 12 or Pascal's graduate-level
> seminar on Advanced Lisp (where he is staying one chapter ahead of the
> students). You'll learn, but not before crippling a generation of
> newbies with your poor understanding of why you like the language you like.
So why don't you use property lists instead of CLOS in Cells? They are
even more flexible because you can extend them with arbitrary properties
at runtime, unlike CLOS objects. Is it because of efficiency? Well just
follow Tilton's rule: Use destructive functions, and suddenly you will
get lightning speed. ;) (That's what you said!)
> Hey, any chance we can get &rest args presented in a vector instead of
> one of those tricky list thingys?
One shouldn't use destructive functions on &rest arguments.
Or was your statement an off topic remark? ;P
Pascal
--
OOPSLA'05 tutorial on generic functions & the CLOS Metaobject Protocol
++++ see http://p-cos.net/oopsla05-tutorial.html for more details ++++
Kenny Tilton wrote:
> Pascal Costanza wrote:
>
>> It's very unclear to me what your alternative suggestion is. Learn all
>> of Common Lisp at once in one day?
>
> How is understanding linked lists "learning all of Common Lisp"?
Where do you think I have suggested that "understanding linked lists" is
"learning all of Common Lisp"?!?
>> Spend one year on the list processing functions, and learn
>> everything else later on?
>>
>> Do you really think that nreconc is more important than defmethod?
>> ;)
> Did it really take you a year to learn list processing functions?
Guess what? I still don't understand some of them. Do you understand
what nreconc is there for? Have you ever used it?
> I had a feeling something like this lay behind your silly advice to newbies.
OK, final attempt: Assume you were supposed to write a book about Common
Lisp. In which chapter would you discuss nconc, nreconc, and the like?
Before or after, say, the chapters about arrays or CLOS?
Note that the book this thread was originally about discusses the
destructive functions in Chapter 12 and arrays in Chapter 18. (It has 22
chapters in total.) For example, CLOS isn't mentioned at all.
Apparently, it even contains serious bugs, like attempts to call quoted
lambda expressions as functions, or attempts to side effect literal
lists. BTW, the book's title in English is "Programming in Common Lisp"
and is from 1995. (You can find its table of contents in amazon.de - the
author is Otto Mayer.)
The OP explicitly said that he wanted to read a German book before
switching to an English one in order to get used to the terminology. I
have just tried to encourage him to not spend too much time on topics
that are discussed in (IMHO) too much detail in that book while
completely skipping much more fundamental ones.
If you have a better idea how to get someone read about the really
important features of Common Lisp without actually explaining all the
details of the less important ones, I would be happy to hear about it.
Pascal
--
OOPSLA'05 tutorial on generic functions & the CLOS Metaobject Protocol
++++ see http://p-cos.net/oopsla05-tutorial.html for more details ++++
Pascal Costanza <··@p-cos.net> writes:
> Guess what? I still don't understand some of them. Do you understand
> what nreconc is there for? Have you ever used it?
You didn't think you'd get away with a rhetorical question like that,
did you?
I have used NRECONC. When you are processing a list and the
interesting stuff is at the head of the list, you write a loop like
this:
(do ((tail list (cdr tail))
     (processed '() (cons (do-something (car tail))
                          processed)))
    ((done? ...) (nreconc processed tail)))
The nreconc reverses the processed stuff (which is accumulated
backwards) and pastes it on to the remaining elements of LIST.
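A minimal runnable instance of that pattern (DOUBLE-UNTIL-ODD is a
made-up example, not from the post above): double each element from the
head of the list until an odd one is found, then splice the untouched
remainder back on with NRECONC:

```lisp
(defun double-until-odd (list)
  ;; Process elements from the head until an odd one appears; PROCESSED
  ;; accumulates them in reverse. NRECONC then reverses the freshly
  ;; consed prefix in place and attaches the unprocessed tail.
  (do ((tail list (cdr tail))
       (processed '() (cons (* 2 (car tail)) processed)))
      ((or (null tail) (oddp (car tail)))
       (nreconc processed tail))))
```

NRECONC is safe here because PROCESSED consists entirely of fresh cells;
the shared tail is never modified.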
~jrm
Joe Marshall wrote:
> Pascal Costanza <··@p-cos.net> writes:
>
>>Guess what? I still don't understand some of them. Do you understand
>>what nreconc is there for? Have you ever used it?
>
> You didn't think you'd get away with a rhetorical question like that,
> did you?
Not on c.l.l... ;)
> I have used NRECONC. When you are processing a list and the
> interesting stuff is at the head of the list, you write a loop like
> this:
>
> (do ((tail list (cdr tail))
>      (processed '() (cons (do-something (car tail))
>                           processed)))
>     ((done? ...) (nreconc processed tail)))
>
> The nreconc reverses the processed stuff (which is accumulated
> backwards) and pastes it on to the remaining element of LIST.
Ah, finally a clue. Thanks for that!
Indeed, I could have used something like that before, but came up with a
solution with LOOP that looks like this:
(loop for (car . cdr) on list
      collect (do-something car) into processed
      until done
      finally (return (nconc processed cdr)))
It's a pity that both solutions have to do an extra pass through the
processed list as a last step. It would be nice if there were a way to
avoid that...
Pascal
--
OOPSLA'05 tutorial on generic functions & the CLOS Metaobject Protocol
++++ see http://p-cos.net/oopsla05-tutorial.html for more details ++++
Pascal Costanza <··@p-cos.net> wrote:
+---------------
| It's a pity that both solutions have to do an extra pass through the
| processed list as a last step. It would be nice if there were a way to
| avoid that...
+---------------
There is, but it involves keeping your own tail-pointer to
the list you're (manually) COLLECT'ing, and then RPLACD the
tail to point to the rest...
-Rob
-----
Rob Warnock <····@rpw3.org>
627 26th Avenue <URL:http://rpw3.org/>
San Mateo, CA 94403 (650)572-2607
On Sat, 03 Sep 2005 20:21:08 +0200, Pascal Costanza <··@p-cos.net> wrote:
> Do you understand what nreconc is there for? Have you ever used it?
Here's an example:
<http://groups.google.com/group/comp.lang.lisp/msg/2520fe9bc7749328>
Cheers,
Edi.
--
Lisp is not dead, it just smells funny.
Real email: (replace (subseq ·········@agharta.de" 5) "edi")
Edi Weitz wrote:
> On Sat, 03 Sep 2005 20:21:08 +0200, Pascal Costanza <··@p-cos.net> wrote:
>
>>Do you understand what nreconc is there for? Have you ever used it?
>
> Here's an example:
>
> <http://groups.google.com/group/comp.lang.lisp/msg/2520fe9bc7749328>
Yes, I am aware of that example, but I find it too obfuscated. I prefer
this:
(loop for (key value) on plist by #'cddr
unless (member key keys-to-be-removed)
nconc (list key value))
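A quick usage example of that LOOP (the plist and the keys are made up
for illustration):

```lisp
;; Remove :B from a property list. The NCONC clause is safe here
;; because it only splices together freshly consed two-element lists;
;; the original plist is never modified.
(let ((plist '(:a 1 :b 2 :c 3))
      (keys-to-be-removed '(:b)))
  (loop for (key value) on plist by #'cddr
        unless (member key keys-to-be-removed)
        nconc (list key value)))
;; => (:A 1 :C 3)
```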
Pascal
--
OOPSLA'05 tutorial on generic functions & the CLOS Metaobject Protocol
++++ see http://p-cos.net/oopsla05-tutorial.html for more details ++++
Pascal Costanza wrote:
> Kenny Tilton wrote:
> > Pascal Costanza wrote:
...
> If you have a better idea how to get someone read about the really
> important features of Common Lisp without actually explaining all the
> details of the less important ones, I would be happy to hear about it.
Oh. Reading these notes is ...
You know, Lisp is very old (1959 or so). Over time, more and more has
been added, and that continues to this day.
(I don't want to call it feature-overloaded, because, you know, that is
all important stuff, but anyway, remove that.)
It is perhaps not possible for you to see Lisp the way a newbie does.
But that should be good for Lisp. Seen with newbie eyes it is all
greatly confusing, and in the end there is only one right way to see
things.
(Please see this point clearly: the interpreter offers you a hundred
ways to do something, but there is only one way to do it fast, so why
are the other possibilities necessary? Only to play around, or to say
we have that too? -> Throw it away. Means (do (tabula-rasa :all)))
Is it ok to have all the nice methods other languages have and to say
at the same time: Lisp is different?
No, I say. And I say it for one reason: LISt-oriented Programming. Right?
No word about objects, symbols, lambdas or whatever was added later.
Remember this kind of point: the academic way to do things vs. the
practical way to do things. (Why do we need more than one LET? Or why
is there -traditional and -modern? More examples could stand here.)
There are things that make life easy for experienced coders, but these
things will confuse new followers.
A point has been reached with that relic. You can have a universal
tool, like you have today, ready to manage everything the software
world offers, and all of it in one application. Is that what you want?
(You can go down in dependencies with that.)
Otherwise, go back to the roots, move all that modern stuff out into
loadable packages, and get a nice, small and clean interpreter. (That's
what I want to see; it should solve the speed problems: load only what
you need.)
Or, what do you think will happen when command-line freaks meet
clicky-bunty-jupies? That is what happens here, I might say.
Meanwhile you teach me: Lisp is the best tool ever found. Well, use it
yourself.
-----------
clisp [[-h] | [--help]] [--version] [--license] [-B lisp-lib-dir]
      [-M mem-file] [-m mem-size] [-L language] [-N locale-dir]
      [-Edomain encoding]
      [[-q] | [--quiet] | [--silent] | [-v] | [--verbose]]
      [-on-error action] [-repl] [-w] [-I]
      [[-ansi] | [-traditional]] [-modern] [-p package] [-C] [-norc]
      [-i init-file...] [-c [-l] lisp-file [-o output-file]...]
      [-x expressions...] [lisp-file [argument...]]
What do you think a newbie thinks when looking at this? For historical
reasons ... one ends up with a 3-line start command (no clickable
icon(s)). Great luck here: the max is 256 chars, but that can be
changed, sure.
(Note: this is a clisp example; other Lisps may have eq things.)
(DEFCLASS class-name (superclass-name*)
Ok, sounds like Java. Yes, but the Java people took this from the
Lisp people, and Lisp is still the better way to do it. But in Java it
is much faster ... blah.
(You can put here whatever you want instead of "Java"; it's an
example, so judge for yourself what it is good for: to have it
included in the interpreter, or to exclude it into a separate package
and do some optimizing there, or so. MOP, yes, another relic from
dino days, just like the question about an editor, right? Meaning:
nothing small and quick like a speedy Firefox; the academic way to do
it. Should I look for a "house-wife mode"?)
(DEFGENERIC function-name lambda-list)
-uhh Yes. Great. That's it. Bye. apropos
(create-window 20 20 180 180 :content "Bye" :execute bye)
[This isn't available in LISP.]
Aha, when I can do it in C, why should I do it in Lisp?
(see online paper XY (pscht, dated 1978)) ... and that's how the story
goes on.
-----------------
I can only hope you see what I am trying to point out here.
Maybe you know my favorite: it's newLISP v6.22, 168 448 bytes (runs on
Win and *nix (Wine did it)). CLISP I have with mathematics (for all the
other goodies my time is too short).
stefan
Kenny Tilton wrote:
> coming from. We often get newbies posting code here saying, Whoa, this
> is slow. We find that an innocent looking bit of recursive code is
The couple of newbies who find their way into the Lisp world usually
have quite some programming experience (including optimization,
debugging, etc.) in other languages, are known to like experimenting
with different languages and programming concepts (otherwise, how would
they have found out about Lisp?), and are normally not so incompetent
as to not know when to do optimization, and how to use profiling tools,
when available. In fact, when they come to Usenet or mailing lists and
ask how to make things faster, is that not a good indicator that
they're trying to find out about these things and learn about them?
mkb.
Matthias Buelow wrote:
> Kenny Tilton wrote:
>
>> coming from. We often get newbies posting code here saying, Whoa, this
>> is slow. We find that an innocent looking bit of recursive code is
>
>
> The couple newbies who find there way into the Lisp world usually have
> quite some programming experience (including optimization, debugging,
> etc.) in other languages, are known to like experimenting with different
> languages and programming concepts (otherwise, how would they have found
> out about Lisp?) and normally are not so incompetent as to not know when
> to do optimization, and how to use profiling tools,...
<PWUAAHAHAHAHAAH> So your idea is to teach Lisp incorrectly then let
them use profiling tools to find out that Costanza was wrong to say
newbies should eschew destructive functions as tools only experts should
risk using? <\PWUAHAHHAHAHAHAHHAHA>
After the newbies find out about the dangers of unnecessary structure
duplication are they allowed to use destructive functions, or is
Costanza's Ban contagious: should they instead avoid algorithms which
require destructive functions?
You guys have one thing going for you. My newbies will be able to write
tricky algorithms efficiently, but yours will have mastered profilers,
GC internals, and compiler internals. I am humbled.
PWUUAHAHAHAHAHHAA. Now I remember why I had stopped posting here.
--
Kenny
Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film
"I've wrestled with reality for 35 years, Doctor, and I'm happy to state
I finally won out over it."
Elwood P. Dowd, "Harvey", 1950
Kenny Tilton wrote:
> <PWUAAHAHAHAHAAH> So your idea is to teach Lisp incorrectly then let
> them use profiling tools to find out that Costanza was wrong to say
> newbies should eschew destructive functions as tools only experts should
> risk using? <\PWUAHAHHAHAHAHAHHAHA>
I have never said that destructive functions are tools that only experts
should use.
> After the newbies find out about the dangers of unnecessary structure
> duplication are they allowed to use destructive functions, or is
> Costanza's Ban contagious: should they instead avoid algorithms which
> require destructive functions?
I have never said that people should be disallowed from using
destructive functions.
> PWUUAHAHAHAHAHHAA. Now I remember why I had stopped posting here.
It may be a good idea to take a deep breath, calm down and reread what I
and others have actually said.
Pascal
--
OOPSLA'05 tutorial on generic functions & the CLOS Metaobject Protocol
++++ see http://p-cos.net/oopsla05-tutorial.html for more details ++++
Pascal Costanza wrote:
> Kenny Tilton wrote:
>> /Always/ use the destructive version if one can get away with it.
>> Understanding when one can get away with it is not so hard, and
>> requires no more than the same understanding one needs anyway to
>> program in Lisp.
> Code doesn't get automagically faster by replacing non-destructive
> functions with destructive ones.
FWIW, I'm diving back into compiler code optimizations now,
and I find that for many programs it is possible to richly
reward programmers who avoid destructive functions.
Aggressive memoization, evaluation reordering, etc, all
save lots of CPU time, and the instant there's a side effect
you can't do most of them any more.
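To make that concrete, here is a hedged sketch (MEMOIZE below is my own
illustrative definition, not a standard CL function and not what any
real compiler's optimizer does internally):

```lisp
;; A pure function can be memoized safely, because equal arguments
;; always give equal results. Side effects break this guarantee.
(defun memoize (fn)
  (let ((cache (make-hash-table :test #'equal)))
    (lambda (&rest args)
      (multiple-value-bind (value present-p) (gethash args cache)
        (if present-p
            value
            (setf (gethash args cache) (apply fn args)))))))

;; Safe for a side-effect-free function:
(defvar *fast-length* (memoize #'length))
;; (funcall *fast-length* '(a b c)) => 3, computed once per
;; distinct argument list thereafter.

;; If the memoized function mutated its argument (say, with NCONC),
;; the cached result could silently go stale -- exactly why
;; destructive code defeats this kind of optimization.
```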
Bear
Bernd Schmitt <··················@gmx.net> writes:
>> What you have in mind is that variables are updated to see new
>> values. However, destructive functions in Lisp are allowed _not_ to
>> update the corresponding variables. (There are good reasons for
>> this.) So in general, you still need to use assignment via setq or
>> setf to store the results in the right places. That's why it in
>> general also doesn't really matter whether you use the destructive
>> or non-destructive functions, except for optimization purposes.
>
> As far as I understand (is there no abbrev. for this, AFAIU? I have the
> feeling I will use this term frequently ...), nconc is a destructive
> version of append, isn't it? Nconc does change variables, if I
> understand the following in the right way:
No. nconc doesn't change variables. It changes values. That's what is
meant by "destructive".
> CL-USER> (setq x '(1 2) y '(3 4))
> (3 4)
> CL-USER> (nconc x y)
> (1 2 3 4)
> CL-USER> x
> (1 2 3 4)
>
> So, how do I know whether a variable can be used to catch the result
> of a destructive function (or was my example implementation-dependent)?
It was the wrong experiment to do, because you still have the wrong model.
(setq x (list 1 2) y (list 3 4)) ; better to create mutable lists instead
; of immutable quoted lists, since we'll
; use the destructive nconc.
(setq ox x oy y) ; let's keep a reference to the original values;
(and (eq ox x) (eq oy y)) --> T ; see, ox and x refer to the same value.
; and oy and y too.
(nconc x y) --> (1 2 3 4)
x --> (1 2 3 4)
but what you don't see is that the cons referred to by x is still the
same as before, and has not even been modified by nconc!
(and (eq ox x) (eq oy y)) --> T
ox --> (1 2 3 4)
(eq (cddr x) y) --> T
Let's do some drawings:
+-----------------------------------------------+
| Before nconc: |
| |
| +---+ +----+ +---+ +----+ |
| | X | | OX | | Y | | OY | |
| +---+ +----+ +---+ +----+ |
| | | | | |
| | +------+ | +------+ |
| | | | | |
| v v v v |
| +---+---+ +---+---+ +---+---+ +---+---+ |
| | * | * |-->| * |NIL| | * | * |-->| * |NIL| |
| +---+---+ +---+---+ +---+---+ +---+---+ |
| | | | | |
| v v v v |
| +---+ +---+ +---+ +---+ |
| | 1 | | 2 | | 3 | | 4 | |
| +---+ +---+ +---+ +---+ |
+-----------------------------------------------+
+-----------------------------------------------+
| After nconc: |
| |
| +---+ +----+ +---+ +----+ |
| | X | | OX | | Y | | OY | |
| +---+ +----+ +---+ +----+ |
| | | | | |
| | +------+ | +------+ |
| | | | | |
| v v v v |
| +---+---+ +---+---+ +---+---+ +---+---+ |
| | * | * |-->| * | * |-->| * | * |-->| * |NIL| |
| +---+---+ +---+---+ +---+---+ +---+---+ |
| | | | | |
| v v v v |
| +---+ +---+ +---+ +---+ |
| | 1 | | 2 | | 3 | | 4 | |
| +---+ +---+ +---+ +---+ |
+-----------------------------------------------+
So you see, the only thing modified by nconc has been the (cdr (cdr x)).
No variable (binding) has been modified.
nconc is a function, and function arguments are only passed by _value_.
It's just that some values are references. A macro, like push, can
modify a variable (binding) because it's given an unevaluated symbol,
or in general a _place_.
Now, as explained, if the first argument given to nconc is nil, then
since nil is not a cons cell it has no cdr to be modified, so nconc
doesn't do anything and just returns the second argument.
It is to handle this case that you need to write:
(setf x (nconc x y))
and similarly with other destructive functions.
(delete 1 (list 1)) --> NIL
So (setf list (list 1)) followed by (delete 1 list) doesn't "destroy"
or modify anything.
You need to write (setf list (delete 1 list)) to handle this case.
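Spelled out as a minimal runnable example (the variable names are just
for illustration):

```lisp
;; DELETE may or may not modify its argument; only its return value
;; is reliable, so always capture the result with SETF.
(let ((l (list 1 2 3)))
  (setf l (delete 1 l))
  l)
;; => (2 3)

;; The edge case discussed above: deleting the only element yields NIL,
;; which no amount of destructive modification of the old cons cell
;; could have produced.
(let ((l (list 1)))
  (setf l (delete 1 l))
  l)
;; => NIL
```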
But some macros like push or pop do this for you:
(setf list (list 1))
(push 2 list)
list --> (2 1)
See it with:
(macroexpand '(push 2 list)) --> (SETQ LIST (CONS 2 LIST)) ; T
--
__Pascal Bourguignon__ http://www.informatimago.com/
Kitty like plastic.
Confuses for litter box.
Don't leave tarp around.
Pascal Bourguignon wrote:
> Bernd Schmitt <··················@gmx.net> writes:
> [explanations and drawings]
> So you see, the only thing modified by nconc, has been the (cdr (cdr x)).
> No variable (binding) has been modified.
> nconc is a function and function arguments are only passed by _value_.
Another thing I was unsure about in lisp, thanks.
> Only, some values are references. A macro, like push, could modify a
> variable (binding) when it's given an unevaluated symbol or in general
> a _place_.
Ok, I will remember this, when I reach the chapter in the book.
> Now, as explained, if the first argument given to nconc is nil, then
> since nil is not a cons cell it has no cdr to be modified, so nconc
> doesn't do anything and just returns the second argument.
> It is to handle this case that you need to write:
> (setf x (nconc x y))
Ok. (setf is 40 pages away :)
> and similarly with other destructive functions.
> (delete 1 (list 1)) --> NIL
> So: (setf list (list 1)) (delete 1 list) doesn't "destroy" or modify anything.
> You need to write (setf list (delete 1 list)) to handle this case.
> But some macros like push or pop do this for you:
> (setf list (list 1))
> (push 2 list)
> list --> (2 1)
> See it with;
> (macroexpand '(push 2 list)) --> (SETQ LIST (CONS 2 LIST)) ; T
Thank you for your detailed explanations, illustrations and valuable
information (which I will remember when I reach the appropriate
chapters in the book).
I am absolutely surprised how helpful this group is.
Thanks,
Bernd
--
https://gna.org/projects/mipisti - (microscope) picture stitching
T_a_k_e__c_a_r_e__o_f__y_o_u_r__R_I_G_H_T_S.
P_r_e_v_e_n_t__L_O_G_I_C--P_A_T_E_N_T_S
http://www.ffii.org, http://www.nosoftwarepatents.org
Bernd Schmitt <··················@gmx.net> wrote:
+---------------
| Pascal Costanza wrote:
| > However, mapcan is somewhat old-fashioned. It's especially odd that the
| > filtering function has to return the elements wrapped in lists.
| > It's clearer (IMHO) to use, for example, the LOOP macro instead:
| > (loop for x in '(a 2 b c 3 4 d 5)
| > when (numberp x)
| > collect x)
|
| Ok. Loop is 90 pages away ...
+---------------
O.k, try this then: ;-} ;-}
> (remove-if-not #'numberp '(a 2 b c 3 4 d 5))
(2 3 4 5)
>
+---------------
| But as i see in the index, there is no collect ...
+---------------
The COLLECT above is a "loop keyword". Don't worry about it until
you start studying LOOP for real...
-Rob
-----
Rob Warnock <····@rpw3.org>
627 26th Avenue <URL:http://rpw3.org/>
San Mateo, CA 94403 (650)572-2607
Bernd Schmitt <··················@gmx.net> writes:
> It is about destructive functions. As far as I understand, destructive
> functions "destroy" their arguments, right?
Yes. "destroy" meaning "modify" here.
> So, if one wants to use their result, the argument should be a
> variable (somehow unsure which vocabulary to use,
Not variable, mutable.
In Lisp, what is variable or constant is the binding between a symbol
and a value. The values themselves can be mutable or immutable.
(Literal) values given to the REPL, or present in sources, should be
considered immutable (but they may not be, and in any case when you try
to modify them the results are implementation-dependent).
> is symbol in lisp the same (in this regard) as variable in c?)?
No, not exactly.
Expressed in C, a symbol would be a structure:
typedef struct {
const char* name;
object_t* value;
function_t* function;
cons_t* plist;
package_t* package;
} symbol_t;
The important point is that a C compiler has a similar structure to
keep the identifiers found in the C source. Any C program can define
such a structure and dynamically manipulate such "symbols". But in C
they're not the same, while in Lisp they are the same: the compiler
and the user program use the same symbols, both to identify variables
in the source of the program and to identify other symbolic information.
This allows for a high degree of introspection, and the interweaving
of run-time and compilation-time needed to efficiently implement
artificial intelligence (to let computers learn new tricks while
working).
And now, for something completely different...
> The book, I am reading gives this as an example for the use of mapcan
> (a filter for numbers):
>
> CL-USER> (mapcan #'(lambda (x)
> (cond ((numberp x) (list x)) (t nil)))
> '(a 2 b c 3 4 d 5)) ;; my problem
> (2 3 4 5)
> I do have problems with this.
> What happens to '(a 2 b c 3 4 d 5)?
Nothing. mapcan is not "destructive".
> Is this a temporary object which will be changed?
No.
> If such a code would be in a source file, the code (the argument)
> would be changed?
No.
> Where are those objects stored?
On the heap.
The reader keeps a reference to the expressions it reads in the
variables named +, ++ and +++. Older expressions read are no longer
referred to by the reader, and can be garbage collected if no other
reference to them is kept.
> When will their space be freed (by GC?)?
When the GC runs. Common Lisp doesn't specify when this happens.
Some implementations provide a way to invoke the garbage collector.
In clisp, you can call: (EXT:GC)
--
__Pascal Bourguignon__ http://www.informatimago.com/
You're always typing.
Well, let's see you ignore my
sitting on your hands.
Hi All,
Pascal Bourguignon <····@mouse-potato.com> writes:
> Bernd Schmitt <··················@gmx.net> writes:
[deletia]
>> The book, I am reading gives this as an example for the use of mapcan
>> (a filter for numbers):
>>
>> CL-USER> (mapcan #'(lambda (x)
>> (cond ((numberp x) (list x)) (t nil)))
>> '(a 2 b c 3 4 d 5)) ;; my problem
>> (2 3 4 5)
>
>> I do have problems with this.
>> What happens to '(a 2 b c 3 4 d 5)?
>
> Nothing. mapcan is not "destructive".
I beg to differ: MAPCAN is not destructive with regard to its direct
arguments, but it will certainly modify the lists returned by the
mapped function, which is fine in this case since they are freshly consed.
Cu,
Damien.
--
http://foobox.net/~dash/
I can resist everything except temptation.
--Oscar Wilde