Anthony Atkins wrote:
> Unfortunately, the fact that the script dies when it first tries to allocate
> memory for the CGI data is not the best sign, in that you probably won't be
> able to improve the memory usage by changing the script itself, short of
> completely abandoning CGI.pm for your file upload scripts.
Ahh, success. By simply installing a newer version of Perl (5.005-03) as
well as its friend CGI.pm (2.51), I was able to upload a 20MB file. I
will now have our Graduate School try to upload the ETD again.
> I did a little experiment on our equipment a while back, and watched the
> memory usage (with "top") as the file upload script tried to process a
> large file. Try this with a file smaller than the limit, then see how much
> real memory is being used over the course of the upload.
I downloaded top and installed it on one of my computers. It worked as
advertised. Unfortunately, the machine hosting the ETDs runs OSF. Duh.
> Also, I've attached a script that's designed to overload perl's memory,
> it's useful in that you get a real-world estimate (within 1 Mb) of the
> total amount of memory perl is going to be willing to use....
The attached script was pretty cool and enlightening. Thank you. I will
now explore the possibilities of using the database-driven ETD process...
Thank you for your prompt replies.
Eric Lease Morgan
Digital Library Initiatives Department, NCSU Libraries