From: ··········@gmail.com
Subject: getting Segmentation Violation
Date: 
Message-ID: <fa959d54-99f0-4812-bcb0-ccf5cb045766@n20g2000hsh.googlegroups.com>
hello all,

I am developing a Lisp application which analyzes source code. During
analysis it creates many objects using make-instance. The problem is
that the program consumes all available memory, PF usage reaches
1.90 GB, and the program dies with an exception: 'Segmentation
violation: error no 11'.
To optimize it, I am now using secondary storage to dump the
intermediate objects and retrieve them whenever required. But I am
still not rid of the problem (even though many make-instance calls
were removed).

So, can anyone help me, or at least give some pointers toward the
memory model of Lisp? Thanks in advance.

From: Ken Tilton
Subject: Re: getting Segmentation Violation
Date: 
Message-ID: <4782162c$0$9104$607ed4bc@cv.net>
··········@gmail.com wrote:
> hello all,
> 
> I am developing a lisp application which analyzes the source code.

How many lines of code are you analyzing?

> During analysis  it creates many objects using make-instance.

How many? Print out a count every so often until you crash.
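One low-effort way to get that count (a sketch; NODE stands in for 
whatever class you are actually instantiating) is an :after method on 
initialize-instance:

```lisp
;; Sketch: count instantiations of a hypothetical class NODE and
;; report every 100,000 objects. Substitute your own class name.
(defvar *instance-count* 0)

(defclass node () ())

(defmethod initialize-instance :after ((obj node) &rest initargs)
  (declare (ignore initargs))
  (when (zerop (mod (incf *instance-count* ) 100000))
    (format t "~&~D instances created so far~%" *instance-count*)))
```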

Why are you using such a heavyweight representation as CLOS classes? 
Lisp has a lot of ways to encode data, and CLOS sounds too heavy for 
this from what little you have said. <hint>
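For instance (a sketch; the slot names are made up), a plain defstruct 
is usually a much cheaper representation than a full CLOS class:

```lisp
;; Heavyweight: a CLOS class -- flexible (method dispatch, redefinition,
;; MOP), but each instance and each accessor call costs more.
(defclass heavy-node ()
  ((kind     :initarg :kind     :accessor heavy-node-kind)
   (children :initarg :children :accessor heavy-node-children)))

;; Lighter: a structure -- fixed layout, cheap slot access, and a
;; smaller per-instance footprint in most implementations.
(defstruct light-node
  kind
  children)
```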

> The
> problem is ....the program consumes available memory and PF usage
> reaches to 1.90 GB and program emits an exception 'Segmentation
> violation: error no 11'

What program? ie, which Lisp implementation on which OS?

> To optimize it,  I am now using secondary storage to dump the
> intermediate objects and retrieving them back whenever required. But
> still...I am not getting rid of the problem(even though many make-
> instance are removed).
> 
> So ... can anyone help me. or atleast  some pointers towards memory
> model of lisp. thanks in advance.

Your concern is not the Lisp memory model per se, because the only way 
I know to get seg violations is to mess things up doing C via FFI. But 
then again, I do not know what Lisp you are using; maybe it falls over 
when things get ugly. Are you seeing a lot of GC messages before the 
failure?

The very bestest pointer I can give you is to look at all the questions 
I had to ask which you should have answered in your original post and 
then stare at your navel until you can figure out how to make better 
requests for help. A better, fuller request will lead to more and better 
responses and get things sorted out in fewer exchanges. If you cannot 
figure out what better means, just err on the side of telling us 
everything you can think of.

kt

-- 
http://www.theoryyalgebra.com/

"In the morning, hear the Way;
  in the evening, die content!"
                     -- Confucius
From: vanekl
Subject: Re: getting Segmentation Violation
Date: 
Message-ID: <c3c6cec5-73be-4f01-b2b6-51dc8d04e088@l6g2000prm.googlegroups.com>
On Jan 7, 3:18 pm, ···········@gmail.com" <··········@gmail.com>
wrote:
> hello all,
>
> I am developing a lisp application which analyzes the source code.
> During analysis  it creates many objects using make-instance. The
> problem is ....the program consumes available memory and PF usage
> reaches to 1.90 GB and program emits an exception 'Segmentation
> violation: error no 11'
> To optimize it,  I am now using secondary storage to dump the
> intermediate objects and retrieving them back whenever required. But
> still...I am not getting rid of the problem(even though many make-
> instance are removed).
>
> So ... can anyone help me. or atleast  some pointers towards memory
> model of lisp. thanks in advance.

1. Make sure your code works with smaller test samples.

2. Are you making heavy use of recursion? Some recursive forms may not
be tail-call optimized and can make heavy use of the stack, breaking
it. To make sure you aren't accidentally getting into a runaway
recursive situation, you can keep track of the depth and signal an
error if it gets too large.
Also keep in mind that this can also occur with a set of functions
that call one another and get trapped in a never-ending cycle, also
possibly busting the stack.

3. Instrumenting/profiling your code will probably give insight. If
there's one thing I've learned, it's that the pilot makes more errors
than the plane, and it's more productive to investigate there first.
When I say "instrument" I mean something as simple as keeping a
running log file, or just streaming messages to standard output about
where the program is as it runs.

4. Some OSs allow the sysadmin to set hard upper limits on the number
of files one is allowed to open and the amount of memory that may be
consumed. Since you specified neither OS nor system limits, you're on
your own there.

5. If you compiled the Lisp yourself, you may want to think back
whether it passed all tests, or whether the tests were even run.

6. Is it possible you are calling an external C library when the error
occurs?

7. If you think you've run out of main memory, is it possible to map/
reduce the problem in a lazy manner if you aren't already, or merge
results from smaller problem sets?

8. Reboot.

9. Is the problem repeatable on other hardware?

10. Go to the bathroom and think about it some more. I do my best work
in the bathroom; it might help you, too.
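Point 2 above can be sketched like this (hypothetical code; walk and 
*max-depth* are made-up names for your own tree walker and limit):

```lisp
;; Sketch: signal a catchable error instead of blowing the control
;; stack when a tree walk recurses suspiciously deep.
(defparameter *max-depth* 10000)

(defun walk (form &optional (depth 0))
  (when (> depth *max-depth*)
    (error "Recursion deeper than ~D -- probable runaway cycle"
           *max-depth*))
  (if (consp form)
      (progn (walk (car form) (1+ depth))
             (walk (cdr form) (1+ depth)))
      form))
```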
From: ··········@gmail.com
Subject: Re: getting Segmentation Violation
Date: 
Message-ID: <a83f3c72-54cf-4b2a-8192-352368197b27@v29g2000hsf.googlegroups.com>
thanks for replying. I need to specify a few more details to make the
problem clearer. I am using Allegro CL 7 on a Windows machine with 512
MB RAM.

On Jan 8, 2:02 am, vanekl <·····@acd.net> wrote:
> On Jan 7, 3:18 pm, ···········@gmail.com" <··········@gmail.com>
> wrote:
>
> > hello all,
>
> > I am developing a lisp application which analyzes the source code.
> > During analysis  it creates many objects using make-instance. The
> > problem is ....the program consumes available memory and PF usage
> > reaches to 1.90 GB and program emits an exception 'Segmentation
> > violation: error no 11'
> > To optimize it,  I am now using secondary storage to dump the
> > intermediate objects and retrieving them back whenever required. But
> > still...I am not getting rid of the problem(even though many make-
> > instance are removed).
>
> > So ... can anyone help me. or atleast  some pointers towards memory
> > model of lisp. thanks in advance.
>
> 1. Make sure your code works with smaller test samples.
>
yes, it works fine for smaller test samples.

> 2. Are you making heavy use of recursion? Some recursive forms may not
> be optimized and make heavy use of the stack, breaking it. To make
> sure you aren't accidentally getting into a run-away recursive
> situation you can keep track of the depth and throw an error if it
> gets too large.

I'm using recursion in a few places, but not heavily.

> Also keep in mind that this situation can also occur with a set of
> functions that call one another and get trapped in a never-ending
> cycle, also possibly busting the stack.
>
> 3. Instrumenting/profiling your code will probably give insight. If
> there's one thing I've learned, the pilot makes more errors than the
> plane, and it's more productive to investigate there first.
> When I say "instrument" I mean something as simple as keeping a
> running log file or just stream to standard output with messages about
> where the program is as it is running.
>
> 4. Some OSs allow the sysadmin to specify hard upper limits on the
> amount of files one is allowed to open and the amount of memory that
> may be consumed. Since you specify neither OS nor system limits,
> you're on your own there.
>
> 5. If you compiled the Lisp yourself, you may want to think back
> whether it passed all tests, or whether the tests were even run.
>
> 6. Is it possible you are calling an external C library when the error
> occurs?
>
No, I'm not calling any C lib.

> 7. If you think you've run out of main memory, is it possible to map/
> reduce the problem in a lazy manner if you aren't already, or merge
> results from smaller problem sets?
>
> 8. Reboot.
>
> 9. Is the problem repeatable on other hardware?
>

yes, it is.

> 10. Go to the bathroom and think about it some more. I do my best work
> in the bathroom; it might help you, too.

Apart from these, I also tried the following:
1. I am calling the GC explicitly.
2. I was previously passing heavy hash tables from one function to
another; I identified them and made them global.

Still, there is almost zero performance gain. :(
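For what it's worth, the explicit GC call looks roughly like this (a 
sketch; excl:gc is the Allegro CL entry point, and the standard room 
report shows how much is allocated):

```lisp
;; Force a global GC in Allegro CL, then print allocation statistics.
;; (excl:gc t) requests a full (global) collection rather than a
;; scavenge of the newest generation only.
(excl:gc t)
(room t)
```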

-Tushar
From: vanekl
Subject: Re: getting Segmentation Violation
Date: 
Message-ID: <013ad560-eb72-4d72-b19d-e0a6ebf10652@q39g2000hsf.googlegroups.com>
On Jan 10, 8:04 am, ···········@gmail.com" <··········@gmail.com>
wrote:
> thanks for replying. I need to specify few more details to make the
> problem clearer. I am using Allegro CL 7 on windows machine with 512
> mb RAM.
>
> Apart from these, I also tried these:-
> 1. I am calling GC explicitly.
> 2. I was passing heavy hash tables previously from one function to
> other, i identified them and made them global.
>
> Still...there is almost zero performance gain. :(
>
> -Tushar

I'd wager your problem would go away if you were to spend $50 on 2 GB
of memory. Running a 2 GB Lisp image on a machine one fourth that size
is bound to be close to unbearably slow if you are accessing all
portions of the image. Also keep in mind that Windows consumes much
(if not all) of that 512 MB; my Windows box consumes +/- 600 MB upon
startup, but I have a few services running in the background.

Aside from that, you could try upping your virtual memory page file
size:
http://www.petri.co.il/pagefile_optimization.htm

You can also try shutting down all services and external programs that
you don't absolutely need running at the same time as your Lisp
program. But this is just putting a bandaid on the problem: you are
over-stressing the hardware/OS, and your algorithm is not dividing the
problem up into manageable portions.

Oh, and this should go without saying: I hope you aren't running
Vista.

In any event, processing will be much faster if you can change your
algorithm so that the image can largely fit in main memory.
From: Thomas A. Russ
Subject: Re: getting Segmentation Violation
Date: 
Message-ID: <ymi3at5selr.fsf@blackcat.isi.edu>
···········@gmail.com" <··········@gmail.com> writes:

> thanks for replying. I need to specify few more details to make the
> problem clearer. I am using Allegro CL 7 on windows machine with 512
> mb RAM.
> 
> 
> Apart from these, I also tried these:-
> 1. I am calling GC explicitly.

This normally shouldn't matter, unless you have an implementation with
a generational collector and some feature of your GC allows explicit
calls to examine and reclaim storage from what would otherwise be
deemed permanent generations.

> 2. I was passing heavy hash tables previously from one function to
> other, i identified them and made them global.

That shouldn't matter at all.  Values passed as arguments in Lisp are
not copied; only a pointer to the hash table is passed.  So you
wouldn't expect there to be any change.

> Still...there is almost zero performance gain. :(

If you are running out of heap space, then you must have some objects
being kept around.  Lisp can't GC any object to which there is a live
pointer.  You mention hash tables, especially big ones.  Any objects
that are referenced (stored) in those hash tables will continue to
consume memory; they can't go away, because the hash table in question
provides a means of accessing them.  If you make those hash tables
global, the problem is likely only to get worse, since then they never
go away.

If I had to guess, I would suppose that you are keeping references to
those intermediate objects somewhere in your code.  That prevents them
from becoming garbage, and thus the storage never gets freed.

As for how to track this down, it is somewhat tricky.  Check the GC and
memory space information documentation that comes with your Lisp
implementation.  Perhaps there is a way of seeing how much memory is
allocated and to which objects.  That could give you some clues as to
where you are hanging onto objects that you no longer need.

Using global variables is one way to keep references to objects.  Make
sure that you use lexical variables instead, so that references to
objects and tables are released when they are no longer needed.
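A sketch of the difference (the names *symbol-table*, process, and
summarize are made up for illustration):

```lisp
;; A global (special) variable pins its hash table for the life of
;; the image -- the GC can always reach it.
(defvar *symbol-table* (make-hash-table :test #'equal))

;; A lexical binding lets the table become garbage as soon as the
;; function returns, provided nothing else holds a reference to it.
(defun analyze-file (file)
  (let ((table (make-hash-table :test #'equal)))
    (process file table)     ; hypothetical worker
    (summarize table)))      ; table is collectible after this returns
```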

You mention that you now use secondary storage.  After you save the
objects to secondary storage, are you sure that you are getting rid of
the references to those objects?  If you continue to maintain
references, then secondary storage alone won't help you.
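The kind of thing to check, as a sketch (dump-object is a made-up
placeholder for whatever serializer you are using):

```lisp
;; After writing the intermediate objects out, drop the in-memory
;; references so the GC can reclaim them.
(defun flush-to-disk (table stream)
  (maphash (lambda (key object)
             (declare (ignore key))
             (dump-object object stream))  ; hypothetical serializer
           table)
  (clrhash table))  ; without this, every object stays reachable
```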


-- 
Thomas A. Russ,  USC/Information Sciences Institute