Posted by Jerry Stuckle on 11/16/07 15:12
deciacco wrote:
> "The Natural Philosopher" <a@b.c> wrote in message
> news:1195209624.8024.5@proxy00.news.clara.net...
>> deciacco wrote:
>>> thanks for the reply steve...
>>>
>>> Basically, I want to collect the file information into memory so that I
>>> can then do analysis, like comparing file times and sizes. It's much
>>> faster to do this in memory than from disk. I should have mentioned
>>> this earlier, as you said...
>>>
>> Why do you care how much memory it takes?
>>
>> 1.7MB is not very much.
>
> These days memory is not an issue, but that does not mean we shouldn't
> write good, efficient code that utilizes memory well.
>
There is also something known as "premature optimization".
> While 1.7MB is not much, that is what is generated when I look at
> ~2500 files. I have approximately 175000 files to look at and my
> script uses up about 130MB. I was simply wondering if someone out
> there with more experience, had a better way of doing this that would
> utilize less memory.
>
(Top posting fixed)
How are you figuring your 1.7MB? If you're just looking at how much
memory the process is using, for instance, that number includes a lot
of other things as well - like your code itself.
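A more direct check - just a sketch, with '/some/dir' as a placeholder
for whatever you actually scan - is to compare memory_get_usage() before
and after building the array, which isolates the array's cost from
everything else in the process:

<?php
// Sketch only: '/some/dir' stands in for the directory you scan.
$before = memory_get_usage();

$files = array();
foreach (glob('/some/dir/*') as $path) {
    if (is_file($path)) {
        // path, size in bytes, mtime as a Unix timestamp
        $files[] = array($path, filesize($path), filemtime($path));
    }
}

$after = memory_get_usage();
printf("%d entries, roughly %d bytes in the array\n",
       count($files), $after - $before);
?>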
1.7MB for 2500 files comes out to just under 700 bytes per entry, which
seems rather large to me. But it also depends on just how much
you're storing in the array (e.g. how long your path names are).
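If each entry is a keyed associative array - or worse, a full stat()
result, which carries 26 elements per file - that's where the bytes go.
A sketch of a leaner layout, assuming size and mtime are all the later
comparison needs:

<?php
// Sketch, not your code: path as the key (stored once), and just
// two integers per file instead of a large keyed structure.
function collect_lean($dir) {
    $files = array();
    foreach (scandir($dir) as $name) {
        $path = $dir . '/' . $name;
        if (is_file($path)) {
            $files[$path] = array(filesize($path), filemtime($path));
        }
    }
    return $files;
}
?>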
I also wonder why you feel a need to store so much info in memory, but
I'm sure you have a good reason.
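For what it's worth, if the analysis boils down to comparing two sets of
files, a hypothetical approach (the function name and output format are
my own, not from your script) is to hold only one side in memory and
stream the other past it, so peak usage is one set's worth, not both:

<?php
// Sketch: compares one directory level; recursion omitted for brevity.
function compare_dirs($dirA, $dirB) {
    $seen = array();
    foreach (scandir($dirA) as $name) {
        $path = $dirA . '/' . $name;
        if (is_file($path)) {
            $seen[$name] = array(filesize($path), filemtime($path));
        }
    }
    foreach (scandir($dirB) as $name) {
        $path = $dirB . '/' . $name;
        if (is_file($path) && isset($seen[$name])) {
            list($size, $mtime) = $seen[$name];
            if ($size !== filesize($path) || $mtime !== filemtime($path)) {
                echo "$name differs\n";
            }
            unset($seen[$name]); // free the entry once compared
        }
    }
    // Whatever is left in $seen exists only in $dirA.
}
?>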
P.S. Please don't top post. Thanks.
--
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
jstucklex@attglobal.net
==================