Posted by deciacco on 11/16/07 15:50
"Jerry Stuckle" <jstucklex@attglobal.net> wrote in messagenews:OaSdnU7vl-_ELqDanZ2dnUVZ_hadnZ2d@comcast.com...
 > deciacco wrote:
 >> "The Natural Philosopher" <a@b.c> wrote in message
 >> news:1195209624.8024.5@proxy00.news.clara.net...
 >>> deciacco wrote:
 >>>> thanks for the reply steve...
 >>>> basically, i want to collect the file information into memory so
 >>>> that I  can then do analysis, like compare file times and sizes.
 >>>> it's much faster  to do this in memory than to do it from disk.
 >>>> should have mentioned this  earlier as you said...
 >>> Why do you care how much memory it takes?
 >>> 1.7MB is not very much.
 >> These days memory is not an issue, but that does not mean we shouldn't
 >> write good, efficient code that utilizes memory well.
 > There is also something known as "premature optimization".
 >> While 1.7MB is not much, that is what is generated when I look at
 >> ~2500 files. I have approximately 175000 files to look at and my
 >> script uses up about 130MB. I was simply wondering if someone out
 >> there with more experience, had a better way of doing this that would
 >> utilize less memory.
 > (Top posting fixed)
 > How are you figuring your 1.7Mb?  If you're just looking at how much
 > memory is being used by the process, for instance, there will be a lot of
 > other things in there, also - like your code.
 > 1.7Mb for 2500 files comes out to just under 700 bytes per entry, which
 > seems rather a bit large to me.  But it also depends on just how much
 > you're storing in the array (i.e. how long are your path names).
 > I also wonder why you feel a need to store so much info in memory, but I'm
 > sure you have a good reason.
 > P.S. Please don't top post.  Thanks.
 
 Jerry...
 
 I use Outlook Express and it does top-posting by default. Didn't realize
 top-posting was bad.
 
 To answer your questions:
 
 "Premature Optimization"
I first noticed this problem in my first program: it was running much slower
and taking up five times as much memory, so I realized I needed to rethink my
code.
 
 "Figuring Memory Use"
To get the amount of memory used, I take a reading with memory_get_usage()
at the start of the code in question and another at the end of the snippet.
The difference between the two readings should give me a good idea of how
much memory my code is using.
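 
 For example, something along these lines (the loop is just a stand-in for
 the real code being measured):
 
 <?php
 // Take a reading before and after the code in question and report the
 // difference.
 $before = memory_get_usage();
 
 // --- code being measured (placeholder: build a small array of arrays) ---
 $data = array();
 for ($i = 0; $i < 2500; $i++) {
     $data[] = array('name' => "file$i.txt", 'size' => $i, 'mtime' => time());
 }
 // --- end of code being measured ---
 
 $after = memory_get_usage();
 echo "Approximate memory used: " . ($after - $before) . " bytes\n";
 ?>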
 
 "Feel the Need"
 The first post shows an array of the type of data I store. This array
 gets created for each file and added as an item to another array. In other
 words, an array of arrays. As I mentioned in a follow-up posting, the reason
 I'm doing this is that I want to do some analysis of the file information,
 like comparing file times and sizes from two separate directories. This is
 much faster in memory than on disk.
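 
 Roughly like this; this is just a sketch and not my actual script, and the
 directory paths are placeholders:
 
 <?php
 // One entry per file (keyed by file name), then a comparison of two
 // directories done entirely in memory.
 function collectInfo($dir)
 {
     $info = array();
     foreach (scandir($dir) as $name) {
         $path = $dir . '/' . $name;
         if (!is_file($path)) {
             continue;
         }
         $info[$name] = array(
             'size'  => filesize($path),
             'mtime' => filemtime($path),
         );
     }
     return $info;
 }
 
 $dirA = collectInfo('/path/to/first/dir');
 $dirB = collectInfo('/path/to/second/dir');
 
 foreach ($dirA as $name => $entry) {
     if (isset($dirB[$name])
         && ($entry['size'] != $dirB[$name]['size']
             || $entry['mtime'] != $dirB[$name]['mtime'])) {
         echo "$name differs between the two directories\n";
     }
 }
 ?>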