From: Joshua Eckroth
Subject: quick newbie question
Date: 
Message-ID: <bo70ti$191fik$1@ID-212781.news.uni-berlin.de>
What does "...can do in O(n log n) steps, and that in some cases [omega](n
log n) steps are necessary..." mean?  What's O(n log n) and [omega](n log
n)?

Thanks,
-josh

From: Kenneth P. Turvey
Subject: Re: quick newbie question
Date: 
Message-ID: <slrnbqe5kq.iv3.kt@geo.premiumservices.yahoo.com>
On Mon, 03 Nov 2003 17:54:57 -0800, Joshua Eckroth <····@humboldt.edu> wrote:
> What does "...can do in O(n log n) steps, and that in some cases [omega](n
> log n) steps are necessary..." mean?  What's O(n log n) and [omega](n log
> n)?

Look up "theory of computation".  This is usually referred to as Big O
notation.  I believe there is a page on it in the Wikipedia
(http://www.wikipedia.org).

-- 
Kenneth P. Turvey <··@squeakydolphin.com>

  Artificial Intelligence Algorithms Wiki 
  http://ai.squeakydolphin.com
From: Joshua Eckroth
Subject: Re: quick newbie question
Date: 
Message-ID: <bo7gg9$1a4hn3$1@ID-212781.news.uni-berlin.de>
Thanks all for the explanations.

As for its Lisp-ness (that is, what it has to do with Lisp): I found it
reading McCarthy's 1960 paper.

And yes, I could have eventually found the answer in google, probably, but
it's sometimes nice (for both parties) to be able to "ask a friend", even
if that friend is a near-anonymous group.

I would ask what STFW means, but I think I'll head to google for that.

-josh
From: Coby Beck
Subject: Re: quick newbie question
Date: 
Message-ID: <bo7m6h$2j2k$1@otis.netspace.net.au>
"Joshua Eckroth" <····@humboldt.edu> wrote in message
····················@ID-212781.news.uni-berlin.de...
> I would ask what STFW means, but I think I'll head to google for that.

LOL!  Good choice...

you might also try www.acronymfinder.com

(sorry, kenny, I checked but JMUML is not in there yet... ;)

-- 
Coby Beck
(remove #\Space "coby 101 @ big pond . com")
From: Joshua Eckroth
Subject: Re: quick newbie question
Date: 
Message-ID: <bo7t5m$1a376a$1@ID-212781.news.uni-berlin.de>
Coby Beck wrote:

> 
> "Joshua Eckroth" <····@humboldt.edu> wrote in message
> ····················@ID-212781.news.uni-berlin.de...
>> I would ask what STFW means, but I think I'll head to google for that.
> 
> LOL!  Good choice...
> 

Yeah, go figure, STFW = Search The Fucking Web.  Had I known, I probably
wouldn't have been able to form a joke out of it.

-josh
From: Artie Gold
Subject: Re: quick newbie question
Date: 
Message-ID: <3FA7ED20.3060803@austin.rr.com>
Joshua Eckroth wrote:
> Coby Beck wrote:
> 
> 
>>"Joshua Eckroth" <····@humboldt.edu> wrote in message
>>····················@ID-212781.news.uni-berlin.de...
>>
>>>I would ask what STFW means, but I think I'll head to google for that.
>>
>>LOL!  Good choice...
>>
> 
> 
> Yeah, go figure, STFW = Search The Fucking Web.  Had I known, I probably
> wouldn't have been able to form a joke out of it.
> 

Ah, but as it happened, it's a great line! Certainly sig-worthy!

Anyway, please accept my apologies for any offense...and, in
retrospect, I should have at least added a smiley!

[There was no insult intended.]

Cheers,
--ag




-- 
Artie Gold -- Austin, Texas
Oh, for the good old days of regular old SPAM.
From: Joshua Eckroth
Subject: Re: quick newbie question
Date: 
Message-ID: <bo9nib$1bc22j$1@ID-212781.news.uni-berlin.de>
Artie Gold wrote:
> 
> Ah, but as it happened, it's a great line! Certainly sig-worthy!
> 
> Anyway, please accept my apologies for any offense...and, in
> retrospect, I should have at least added a smiley!
> 
> [There was no insult intended.]

All is good, and thank you for the resource.

-josh
From: Peter Seibel
Subject: Re: quick newbie question
Date: 
Message-ID: <m3brrsg9fk.fsf@javamonkey.com>
Joshua Eckroth <····@humboldt.edu> writes:

> What does "...can do in O(n log n) steps, and that in some cases [omega](n
> log n) steps are necessary..." mean?  What's O(n log n) and [omega](n log
> n)?

This isn't really a Lisp question; you'll probably do well to grab any
text on algorithms, which should explain this. But here's a short
version:

Suppose you have some problem, like sorting a list of numbers or
searching for a record in a database. We use N to refer to the "size"
of the problem. In the case of sorting numbers it would be the length
of the list; in the case of the database search, it would be the
number of records in the database. Big-O notation says roughly how
the amount of work changes as N changes. Thus we have the following:

  O(1) == it doesn't change at all. The work takes a constant amount
  of time regardless of the size (N) of the problem. For instance,
  indexing into an array is generally thought of as being an O(1)
  operation because getting the Nth element of an array takes the same
  amount of time as getting the 0th element. (Of course when you look
  more carefully at reality, caching and then paging effects make this
  not quite true.)

  O(log n) == the amount of work is proportional to the log of the
  size of the problem. Roughly speaking, each time you double the size
  of the problem you add one more unit of work. Finding an element in
  a balanced binary tree is an O(log N) operation because you can
  double the number of leaves in the tree while only adding one level
  to the tree and thus adding one more comparison to the number
  required to find any element.

  O(N) == the amount of work is proportional to the size of the
  problem. Searching for an arbitrary particular element in a list is
  O(N) because the longer the list gets the longer (on average) it
  takes to find.

  O(N log N) == the amount of work is proportional to the size of the
  problem times the log of the size of the problem. Many sorting
  algorithms have this kind of complexity since they divide the
  sequence in half at each step and then sort each half. Thus the
  number of partitions is log N (since each doubling of N requires
  only one more partition) but the number of comparisons is
  ultimately going to be tied to the number of elements in the list,
  i.e. N. Other sorting algorithms do worse, leading us to ...

  O(N^2) == the amount of work grows as the square of the size of the
  problem. For instance if you write a naive sorting algorithm that
  ends up comparing every element of a sequence with every other
  element then you're doing N^2 comparisons and that algorithm has
  O(N^2) complexity.
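
To make this concrete, here's a quick sketch (my own throwaway
definitions, in Lisp since we're here; untested, but I believe
correct) contrasting an O(N) linear search with an O(log N) binary
search:

  (defun linear-search (item vector)
    "O(N): in the worst case we look at every element."
    (loop for x across vector
          for i from 0
          when (eql x item) return i))

  (defun binary-search (item sorted-vector)
    "O(log N): each comparison halves the remaining range.
  Assumes SORTED-VECTOR holds numbers in ascending order."
    (let ((lo 0)
          (hi (length sorted-vector)))
      (loop while (< lo hi)
            do (let* ((mid (floor (+ lo hi) 2))
                      (x (aref sorted-vector mid)))
                 (cond ((= x item) (return mid))
                       ((< x item) (setf lo (1+ mid)))
                       (t (setf hi mid)))))))

Double the length of the input and LINEAR-SEARCH does twice as many
comparisons in the worst case, while BINARY-SEARCH does only one more.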

Obviously actual algorithms will also have different constant
factors--the overhead that's the same regardless of the size of the
problem. They are typically ignored in doing this kind of analysis,
but in reality you should not totally ignore them--if in an actual
program you have to choose between two algorithms, one with O(log N)
complexity and the other with O(N), and N is sufficiently small,
the constant factors might make a real difference. ObLisp: this is why
sometimes alists could be more efficient than hash tables even though
an alist lookup is O(N) while a hashtable lookup is usually O(log N):
the constant factor on alists may well be much smaller.
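
For example (a minimal sketch; where the crossover falls is
implementation-dependent, so time it yourself before believing me):

  ;; The same small mapping, two ways:
  (defparameter *alist* '((:a . 1) (:b . 2) (:c . 3)))
  (defparameter *table*
    (let ((h (make-hash-table)))       ; default test is EQL
      (setf (gethash :a h) 1
            (gethash :b h) 2
            (gethash :c h) 3)
      h))

  (cdr (assoc :b *alist*))  ; O(N), but the constant factor is tiny
  (gethash :b *table*)      ; hashing overhead may dominate at this size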

If you need more than that you might start with this web page:

  <http://www.cs.strath.ac.uk/~mdd/teaching/alg&comp/big_oh.html>

-Peter

-- 
Peter Seibel                                      ·····@javamonkey.com

         Lisp is the red pill. -- John Fraser, comp.lang.lisp
From: Rob Warnock
Subject: Re: quick newbie question
Date: 
Message-ID: <Q_icnccoXMbwXjqiXTWc-w@speakeasy.net>
Peter Seibel  <·····@javamonkey.com> wrote:
+---------------
| ...the constant factors might make a real difference. ObLisp: this is why
| sometimes alists could be more efficient than hash tables even though
| an alist lookup is O(N) while a hashtable lookup is usually O(log N):
| the constant factor on alists may well be much smaller.
+---------------

Actually, hash table lookup [with a well-designed hash and good
resizing of the hash table when necessary] can be as good as O(1).
Which is why people use hash tables...
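
In CL the standard MAKE-HASH-TABLE even exposes the resizing knobs
[though implementations are free to treat them as hints]:

  ;; A table that starts small and doubles when it gets 3/4 full,
  ;; keeping lookups amortized O(1) [given a decent hash function]:
  (make-hash-table :test #'eql
                   :size 16
                   :rehash-size 2.0        ; grow by a factor of 2.0
                   :rehash-threshold 0.75) ; resize at 75% full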


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Peter Seibel
Subject: Re: quick newbie question
Date: 
Message-ID: <m3ekwoduet.fsf@javamonkey.com>
····@rpw3.org (Rob Warnock) writes:

> Peter Seibel  <·····@javamonkey.com> wrote:
> +---------------
> | ...the constant factors might make a real difference. ObLisp: this is why
> | sometimes alists could be more efficient than hash tables even though
> | an alist lookup is O(N) while a hashtable lookup is usually O(log N):
> | the constant factor on alists may well be much smaller.
> +---------------
> 
> Actually, hash table lookup [with a well-designed hash and good
> resizing of the hash table when necessary] can be as good as O(1).
> Which is why people use hash tables...

Of course. Duh. I knew that. Thanks.

-Peter

-- 
Peter Seibel                                      ·····@javamonkey.com

         Lisp is the red pill. -- John Fraser, comp.lang.lisp
From: Jon S. Anthony
Subject: Re: quick newbie question
Date: 
Message-ID: <m3fzh4qe81.fsf@rigel.goldenthreadtech.com>
Peter Seibel <·····@javamonkey.com> writes:

> an alist lookup is O(N) while a hashtable lookup is usually O(log N):

??? Hashtables are (or at least should be) O(1).  Certainly perfect
hashtables are that.  For others, there are plenty of techniques that
keep the total complexity near enough to O(1) to call it that in
practice.

/Jon
From: rif
Subject: Re: quick newbie question
Date: 
Message-ID: <wj0ism011q2.fsf@five-percent-nation.mit.edu>
·········@rcn.com (Jon S. Anthony) writes:

> Peter Seibel <·····@javamonkey.com> writes:
> 
> > an alist lookup is O(N) while a hashtable lookup is usually O(log N):
> 
> ??? Hashtables are (or at least should be) O(1).  Certainly perfect
> hashtables are that.  For others, there are plenty of techniques that
> keep the total complexity near enough to O(1) to call it that in
> practice.
> 
> /Jon

One possible newbie gotcha (I mention this only because it happened to
me, and I don't mean to insult anyone for whom this is "obvious") is that
you can take a speed hit (and lose your "O(1)" performance) if you
don't use #'eql to compare keys.  In particular, coming from Perl, it
seemed very natural to make the keys strings and to use #'equal to
compare them.  If you do this, then the lookup time becomes O(length
of the string) rather than O(1).  If you make your keys things that
require even more complicated comparisons, this can turn into a real
problem.  You're best off if you can use keys you can compare with
#'eql.

Note that the same observation of course applies to alists or to
comparisons in general.
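
Concretely, this is the difference that bit me (key names made up):

  ;; String keys force an EQUAL table; each lookup hashes and then
  ;; compares the whole string, so cost grows with key length:
  (defparameter *by-string* (make-hash-table :test #'equal))
  (setf (gethash "very-long-key-name" *by-string*) 42)

  ;; Symbol keys work with the default EQL test; the comparison is
  ;; a constant-time identity check:
  (defparameter *by-symbol* (make-hash-table))
  (setf (gethash 'very-long-key-name *by-symbol*) 42)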

rif
From: Jon S. Anthony
Subject: Re: quick newbie question
Date: 
Message-ID: <m3brrsqakq.fsf@rigel.goldenthreadtech.com>
rif <···@mit.edu> writes:

> ·········@rcn.com (Jon S. Anthony) writes:
> 
> > Peter Seibel <·····@javamonkey.com> writes:
> > 
> > > an alist lookup is O(N) while a hashtable lookup is usually O(log N):
> > 
> > ??? Hashtables are (or at least should be) O(1).  Certainly perfect
> > hashtables are that.  For others, there are plenty of techniques that
> > keep the total complexity near enough to O(1) to call it that in
> > practice.
> > 
> > /Jon
> 
> One possible newbie gotcha (I mention this only because it happened to
> me, and I don't mean to insult anyone for whom this "obvious") is that
> you can take a speed hit (and lose your "O(1)" performance) if you
> don't use #'eql to compare keys.

No, that will not affect whether the hashtable lookup is O(1), i.e.,
how many hash table elements you need to look at.  OTOH, it could
make the O(1) complexity of the hashtable irrelevant, as in:


> In particular, coming from Perl, it seemed very natural to make the
> keys strings and to use #'equal to compare them.  If you do this,
> then the lookup time becomes O(length of the string) rather than
> O(1).

The _comparison_ time is O(N) on the length of the string, which is
independent of the number of elements (whose _keys_ are strings).


> If you make your keys things that require even more complicated
> comparisons, this can turn into a real problem.  You're best off if
> you can use keys you can compare with #'eql.

Indeed.  Presumably you switched to symbols for this.
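
Something along these lines (a hypothetical helper; the choice of the
KEYWORD package is arbitrary) pays the string cost once, after which
every lookup is a constant-time EQL comparison:

  (defun key-for (string)
    "Intern STRING once; the resulting keyword is EQL-comparable."
    (intern (string-upcase string) :keyword))

  (key-for "customer-id")  ; => :CUSTOMER-ID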


/Jon
From: Artie Gold
Subject: Re: quick newbie question
Date: 
Message-ID: <3FA70E2B.3020208@austin.rr.com>
Joshua Eckroth wrote:
> What does "...can do in O(n log n) steps, and that in some cases [omega](n
> log n) steps are necessary..." mean?  What's O(n log n) and [omega](n log
> n)?
> 
See, for example:

http://en.wikipedia.org/wiki/Big_O_notation

Next time, however, STFW -- or, even better, get a good book on 
algorithms.

!
--ag

-- 
Artie Gold -- Austin, Texas
Oh, for the good old days of regular old SPAM.
From: Kenny Tilton
Subject: Re: quick newbie question
Date: 
Message-ID: <lOFpb.74216$pT1.72785@twister.nyc.rr.com>
Artie Gold wrote:
> Joshua Eckroth wrote:
> 
>> What does "...can do in O(n log n) steps, and that in some cases 
>> [omega](n
>> log n) steps are necessary..." mean?  What's O(n log n) and [omega](n log
>> n)?
>>
> See, for example:
> 
> http://en.wikipedia.org/wiki/Big_O_notation
> 
> Next time, however, STFW -- or, even better, get a good book on algorithms.
> 

OK, that's a good link you offered, and to the extent that STFW falls 
into the "teach me to fish" category it is sound advice, but as good as 
Google is, it sure as hell cannot compare with the direction possible 
from cll-ers, and it sure as hell is not as 
human/collegial/community-building. I'd rather have newbies here 
pestering us than wandering about in the wilderness--being a Lispnik is 
solitary enough as is.

Whenever I ask an STFW question, I am looking for the people who can 
answer without thinking--if they exist and if the answer exists.

Someone curious as to good books on "C" will do a lot better if they run 
into me than if they spend 40min on the Web. And it won't take me more 
time than it takes to type (a) K&R and (b) "C Puzzle Book".

my2. oh, that's studly. my-2.

kenny

-- 
JMcCUML

http://tilton-technology.com

Why Lisp? http://alu.cliki.net/RtL%20Highlight%20Film

Your Project Here! http://alu.cliki.net/Industry%20Application
From: Kenneth Rose
Subject: Re: quick newbie question
Date: 
Message-ID: <bo7b57$pja$1@rumours.uwaterloo.ca>
Kenny Tilton wrote:
> 
> 
> Artie Gold wrote:
> 
>> Joshua Eckroth wrote:
>> Next time, however, STFW -- or, even better, get a good book on 
>> algorithms.
> 
> OK, that's a good link you offered, and to the extent that STFW falls 
> into the "teach me to fish" category it is sound advice, but as good as 
> Google is, it sure as hell cannot compare with the direction possible 
> from cll-ers, and it sure as hell is not as 
> human/collegial/community-building. I'd rather have newbies here 
> pestering us than wandering about in the wilderness--being a Lispnik is 
> solitary enough as is.

Well said Artie.

By the way, searching the web (i.e., Google :-), with various queries 
such as "O(nlogn)" and "what is O(nlogn)" brings up lots of material on 
other algorithms that are also O(n*log(n)) (and even a link on the O 
blood type), but nothing on *explaining* what O(n*log(n)) is.  And of 
course, we can't expect the reader to think <<Hey, that looks like a 
big-oh... I think I'll query "big-oh notation".>>.

/<en
From: Kenny Tilton
Subject: Re: quick newbie question
Date: 
Message-ID: <puHpb.74851$pT1.50117@twister.nyc.rr.com>
Kenneth Rose wrote:

> Kenny Tilton wrote:
> 
>>
>>
>> Artie Gold wrote:
>>
>>> Joshua Eckroth wrote:
>>> Next time, however, STFW -- or, even better, get a good book on 
>>> algorithms.
>>
>>
>> OK, that's a good link you offered, and to the extent that STFW falls 
>> into the "teach me to fish" category it is sound advice, but as good 
>> as Google is, it sure as hell cannot compare with the direction 
>> possible from cll-ers, and it sure as hell is not as 
>> human/collegial/community-building. I'd rather have newbies here 
>> pestering us than wandering about in the wilderness--being a Lispnik 
>> is solitary enough as is.
> 
> 
> Well said Artie.
> 
> By the way, searching the web (i.e., Google :-), with various queries 
> such as "O(nlogn)" and "what is O(nlogn)" brings up lots of material on 
> other algorithms that are also O(n*log(n)) (and even a link on the O 
> blood type), but nothing on *explaining* what O(n*log(n)) is.  And of 
> course, we can't expect the reader to think <<Hey, that looks like a 
> big-oh... I think I'll query "big-oh notation".>>.

right, but Artie (I think you were agreeing with me, not Artie) wins big 
on this otherwise valid point that Google can (on rare occasion and in 
special cases) give one a fight. I typed in the OP's exact phrasing "O(n 
log n)" (with the quotes) and the first link Google offered was:

   http://www.nist.gov/dads/HTML/bigOnotation.html

My bottom line is that it is great having all these newbies around. 
Artie did the right thing, offering a solid link and a "teach me to 
fish" reminder, but I happen to be obsessed with Lisp advocacy so I'd 
rather see Lispniks simply telling newbies what they know when newbies 
bring up good questions (not homework!) which could also be answered 
(more arduously and less socially) by reading books and web pages.

kenny

-- 
http://tilton-technology.com

Why Lisp? http://alu.cliki.net/RtL%20Highlight%20Film

Your Project Here! http://alu.cliki.net/Industry%20Application
From: Artie Gold
Subject: Re: quick newbie question
Date: 
Message-ID: <3FA7EB0C.8040909@austin.rr.com>
Kenny Tilton wrote:
> 
> 
> Kenneth Rose wrote:
> 
>> Kenny Tilton wrote:
>>
>>>
>>>
>>> Artie Gold wrote:
>>>
>>>> Joshua Eckroth wrote:
>>>> Next time, however, STFW -- or, even better, get a good book on 
>>>> algorithms.
>>>
>>>
>>>
>>> OK, that's a good link you offered, and to the extent that STFW falls 
>>> into the "teach me to fish" category it is sound advice, but as good 
>>> as Google is, it sure as hell cannot compare with the direction 
>>> possible from cll-ers, and it sure as hell is not as 
>>> human/collegial/community-building. I'd rather have newbies here 
>>> pestering us than wandering about in the wilderness--being a Lispnik 
>>> is solitary enough as is.
>>
>>
>>
>> Well said Artie.
>>
>> By the way, searching the web (i.e., Google :-), with various queries 
>> such as "O(nlogn)" and "what is O(nlogn)" brings up lots of material 
>> on other algorithms that are also O(n*log(n)) (and even a link on the 
>> O blood type), but nothing on *explaining* what O(n*log(n)) is.  And 
>> of course, we can't expect the reader to think <<Hey, that looks like 
>> a big-oh... I think I'll query "big-oh notation".>>.
> 
> 
> right, but Artie (I think you were agreeing with me, not Artie) wins big 
> on this otherwise valid point that Google can (on rare occasion and in 
> special cases) give one a fight. I typed in the OP's exact phrasing "O(n 
> log n)" (with the quotes) and the first link Google offered was:
> 
>   http://www.nist.gov/dads/HTML/bigOnotation.html
> 
> My bottom line is that it is great having all these newbies around. 
> Artie did the right thing, offering a solid link and a "teach me to 
> fish" reminder, but I happen to be obsessed with Lisp advocacy so I'd 
> rather see Lispniks simply telling newbies what they know when newbies 
> bring up good questions (not homework!) which could also be answered 
> (more arduously and less socially) by reading books and web pages.
> 
> kenny
> 

Valid points all; as a fairly peripheral fella 'round these parts, 
I'll have no problem adapting myself to the local culture. ;-)

It is quite obvious that almost any question could be answered with 
STFW (which, of course, would be counterproductive; I *did* give the 
OP a starting point), but I wonder where the line is drawn. Questions 
directly relating to the use of Lisp or the theory (am I allowed to 
use the `T' word here? ;-)) of Lisp are obviously worthy of 
generating long, typically very illuminating, threads populated by 
many posters who know an awful lot. Even when they wander -- as they 
inevitably do -- such threads tend to be quite interesting at the 
very worst.

In this case, given the nature of the question, providing a direct 
link to the information seemed to be a service, both from the 
`teaching to fish' standpoint and the meta standpoint of `how to 
learn to X'.

Again, if I'm wrong here, please let me know the best way to 
contribute -- and I shall be happy to do so with whatever knowledge 
I might happen to possess.

Thanks to all.

Cheers,
--ag

-- 
Artie Gold -- Austin, Texas
Oh, for the good old days of regular old SPAM.
From: Kenny Tilton
Subject: Re: quick newbie question
Date: 
Message-ID: <9%Tpb.35042$Gq.9576138@twister.nyc.rr.com>
Artie Gold wrote:
> 
> In this case, given the nature of the question, providing a direct link 
> to the information seemed to be a service, both from the `teaching to 
> fish' standpoint and the meta standpoint of `how to learn to X'.
> 
> Again, if I'm wrong here, please let me know the best way to contribute 
> -- and I shall be happy to do so with whatever knowledge I might happen 
> to possess.
> 

Well, no one died and left me in charge, so I am just speaking for 
little ol' JMcCUML me. Hey, you provided a good link and good advice in 
re Google, too. Don't mind me, I'm just a bleeding heart newbie-hugger.

kenny

-- 
http://tilton-technology.com

Why Lisp? http://alu.cliki.net/RtL%20Highlight%20Film

Your Project Here! http://alu.cliki.net/Industry%20Application
From: Ray Dillinger
Subject: Re: quick newbie question
Date: 
Message-ID: <3FA7367C.1D088F49@sonic.net>
Joshua Eckroth wrote:
> 
> What does "...can do in O(n log n) steps, and that in some cases [omega](n
> log n) steps are necessary..." mean?  What's O(n log n) and [omega](n log
> n)?
> 

Big O notation (as in O(n), O(logN), O(n^2), etc) measures 
how long it takes to do something as a function of how big 
the problem is.  

The best comparison-based sorting algorithms are O(N log N), which
means that sorting a list of N elements can, in general, be done in at
most some constant times N log N time.

Big Omega notation (as in [Omega](n), [Omega](logN), etc.) measures 
how much memory it takes to solve the problem as a function of the 
size of the problem.  Otherwise it works the same as Big O notation. 

				Bear
From: Thomas F. Burdick
Subject: Re: quick newbie question
Date: 
Message-ID: <xcv8ymw39sb.fsf@famine.OCF.Berkeley.EDU>
Ray Dillinger <····@sonic.net> writes:

> Joshua Eckroth wrote:
> > 
> > What does "...can do in O(n log n) steps, and that in some cases [omega](n
> > log n) steps are necessary..." mean?  What's O(n log n) and [omega](n log
> > n)?
> > 
> 
> Big O notation (as in O(n), O(logN), O(n^2), etc) measures 
> how long it takes to do something as a function of how big 
> the problem is.  
> 
> The best comparison-based sorting algorithms are O(N log N), which
> means that sorting a list of N elements can, in general, be done in at
> most some constant times N log N time.
> 
> Big Omega notation (as in [Omega](n), [Omega](logN), etc.) measures 
> how much memory it takes to solve the problem as a function of the 
> size of the problem.  Otherwise it works the same as Big O notation. 

I wasn't going to respond, because Artie gave a great link; but the
above explanation is wrong.  Knuth's formalizations of
big-Oh/Omega/Theta can be summarized as:

      O( f(n) ) => Complexity is *no* *worse* than (on the order of) f(n)
                   (ie, f(n) is the worst case)
  Omega( f(n) ) => Complexity is *no* *better* than (on the order of) f(n)
                   (ie, f(n) is the best case)
  Theta( f(n) ) => Complexity is *exactly* (on the order of) f(n)
                   (ie, f(n) covers all cases)
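
Spelled out, the standard formal definitions behind those summaries
are:

      f(n) =     O(g(n))  iff  there exist c > 0 and n0 such that
                               f(n) <= c*g(n)  for all n >= n0
      f(n) = Omega(g(n))  iff  there exist c > 0 and n0 such that
                               f(n) >= c*g(n)  for all n >= n0
      f(n) = Theta(g(n))  iff  f(n) is both O(g(n)) and Omega(g(n))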

And, just to be contrarian, unlike most everyone else, I won't cite
Knuth AoCP vol 1, I'll pick a math book: _Concrete Mathematics_, by
Graham, Patashnik, and, yes, Knuth.

-- 
           /|_     .-----------------------.                        
         ,'  .\  / | No to Imperialist war |                        
     ,--'    _,'   | Wage class war!       |                        
    /       /      `-----------------------'                        
   (   -.  |                               
   |     ) |                               
  (`-.  '--.)                              
   `. )----'