- Path: convex!cs.utexas.edu!wupost!uunet!ogicse!ucsd!nic!netlabs!lwall
- From: lwall@netlabs.com (Larry Wall)
- Newsgroups: comp.lang.perl
- Subject: Re: Does Perl have known memory leaks?
- Message-ID: <1991Nov6.155147.8938@netlabs.com>
- Date: 6 Nov 91 15:51:47 GMT
- Article-I.D.: netlabs.1991Nov6.155147.8938
- References: <1991Nov5.154907.5189@ms.uky.edu>
- Organization: NetLabs, Inc.
- Lines: 29
-
- In article <1991Nov5.154907.5189@ms.uky.edu> sean@ms.uky.edu (Sean Casey) writes:
- : I've been having trouble running some perl scripts on a DOS machine
- : (the dos port of perl 4) because it runs out of memory mid-run. Thing
- : is, it's just iterating on a very large file. I'm not allocating new
- : space or doing anything that would obviously tie up memory. When I
- : looked on the Unix box (also perl 4), sure enough, the same script was
- : using well over a megabyte of memory.
- :
- : Does perl have known memory leaks? Does scanning for patterns perform
- : mallocs in perl that aren't ever freed?
-
- A couple of small leaks are fixed in 4.018. One involved foreach loops
- with null lists, and the other involved local(*FILEHANDLE).
-
- You can test for leaks by compiling with -DLEAKTEST, inserting
-
- warn "New allocations:\n" unless $counter++ % 100;
-
- in the middle of your loop, and running your script with -D4096, which
- makes the warn command dump out a list of newly allocated memory. The
- numbers reported correspond to arguments to the New() macro or, in the
- 700's, the Str_new() macro.
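- 
- A minimal sketch of the loop Larry describes, with the periodic warn
- dropped in (the filename and the per-line work are placeholders; this
- assumes a perl binary built with -DLEAKTEST, invoked as
- "perl -D4096 leakcheck.pl bigfile"):
- 
- ```perl
- #!/usr/bin/perl
- # leakcheck.pl -- hypothetical name; iterate a large file line by line.
- 
- while (<>) {
-     # Every 100th line, fire a warn; under -D4096 on a -DLEAKTEST
-     # build, each warn dumps the memory allocated since the last one.
-     warn "New allocations:\n" unless $counter++ % 100;
- 
-     # ... the script's normal per-line work goes here ...
- }
- ```
- 
- Between two consecutive dumps the list of new allocations should
- stabilize; entries that keep growing from dump to dump are the leak
- candidates to trace back through New()/Str_new().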
-
- If it's not a leak, look in the Space Efficiency section of Chapter 7.
-
- It would be fun to write a profiler that added up the amount of memory
- allocated for each statement.
-
- Larry
-