Posted by Jochem Maas on 11/02/05 22:40
Manish Marathe wrote:
> Sorry for spamming but I didn't mean to reply to you privately :(
I did it myself by accident 2 minutes ago :-)
no probs.
....
>
> Well, it's not me who is going to run the unit tests but the user. As
> I said, I am developing an automatic unit test "generator", which will
> generate tests for the user, and then the user can run them. So it's
> totally up to the user to decide how many classes to load, and how
> frequently to load them.
I assumed as much. It sounds very interesting!
>
> you might consider using forking in a master process and letting a
> series of
> child processes each handle a chunk of all the classes to generate
> tests for,
> basically running in parallel.
>
> I don't see how you can avoid loading many classes 'at once',
> why do you
> think this is foolish [by definition]?
>
>
> I am sorry if I have hurt anyone's feelings here, but by foolish
au contraire. I don't think anyone was hurt.
> I was simply pointing to the same problem you stated, which is
> allocating the memory, and which I definitely foresee.
>
> Thank you anyway, forking should help, but my problem was - I guess I
> didn't explain it well - that to use the ReflectionClass like above I
> have to include the class file into my script. Is it possible to keep
> all the class files somewhere in a path where PHP can look for them,
> so that I don't have to include them in my script?
php.ini setting include_path - it's a simple concept, /^RT(?:F)?M$/.
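for example (a minimal sketch - the path, file layout and class name are
all made up; assumes one class per file, named <ClassName>.php):

<?php
// make PHP search /path/to/classes when resolving relative includes
set_include_path(get_include_path() . PATH_SEPARATOR . '/path/to/classes');

// called automatically the first time an unknown class is referenced
function __autoload($className)
{
    require_once $className . '.php'; // resolved via include_path
}

// no explicit include needed - __autoload() pulls the file in
$reflector = new ReflectionClass('SomeProjectClass');
var_dump($reflector->getMethods());
?>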
but your problem is more complex than that: you are building a tool that
seems intended to form [part of] a php development/project environment,
which means essentially you need to be able to hook arbitrary projects
into an installed test system which consists of your generation tool
(and other modules). I envisage a simple interface (cmdline) to register
projects ... and a search engine that's capable of finding php code
(tokenizer?), locating all class defs in registered projects - and
keeping track of located files.
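e.g. a rough sketch of tokenizer-based class discovery (the file path is
made up):

<?php
// scan one file for class definitions using the tokenizer extension
$file = '/path/to/proj1/SomeClass.php'; // made-up path
$tokens = token_get_all(file_get_contents($file));

$classes = array();
$count = count($tokens);
for ($i = 0; $i < $count; $i++) {
    if (is_array($tokens[$i]) && $tokens[$i][0] == T_CLASS) {
        // the class name is the next T_STRING token after T_CLASS
        for ($j = $i + 1; $j < $count; $j++) {
            if (is_array($tokens[$j]) && $tokens[$j][0] == T_STRING) {
                $classes[] = $tokens[$j][1];
                break;
            }
        }
    }
}
var_dump($classes); // e.g. array('SomeClass')
?>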
now imagine a generator daemon that just sits running, occasionally doing
the following (triggered by e.g.: regular timeout, cvs commit msg, soap msg,
etc):
scan the projects for new/changed files.
examine the found files.
(re)generate tests for each changed/new class found.
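something like this perhaps (a rough sketch only - the project list, glob
pattern and generateTestsFor() are all hypothetical stand-ins):

<?php
$projects = array('/path/to/proj1', '/path/to/proj2'); // registered projects
$seen = array(); // file => mtime at the last generation run

function generateTestsFor($file) { /* the real generator goes here */ }

while (true) {
    clearstatcache(); // don't trust cached mtimes in a long-running loop
    foreach ($projects as $dir) {
        foreach (glob($dir . '/*.php') as $file) {           // 1. scan
            $mtime = filemtime($file);
            if (!isset($seen[$file]) || $seen[$file] < $mtime) { // 2. examine
                generateTestsFor($file);                     // 3. (re)generate
                $seen[$file] = $mtime;
            }
        }
    }
    sleep(60); // the 'regular timeout' trigger; cvs/soap hooks could replace it
}
?>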
the trick indeed is to take the monolithic process above
and parallelize it (made that word up, I think; heck, C.S.Lewis did it).
the master daemon could track which project(s) are showing the most [recent!]
activity and could for instance spawn 3 children:
1. SCANNER
2. PARSER
3. GENERATOR
each of the above spawns a child for each [active?] project (accounting for
activity stats in the order they are spawned and, more interestingly, in the
amount of CPU you give each process?) [can you even 'nice' a child process
in php? note to self: rtfm]
let all processes log/read to/from a central place (I like firebird :-)
and act accordingly...
you might consider conditionally (no idea what conditions though!) spawning
children to handle the scanning/parsing/generation of files
in [delayed] chunks.
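spawning per chunk might look roughly like this (needs the pcntl extension,
CLI only; the file location and chunk size are arbitrary):

<?php
$allClassFiles = glob('/path/to/classes/*.php'); // made-up location
$chunks = array_chunk($allClassFiles, 50);       // 50 files per child

foreach ($chunks as $chunk) {
    $pid = pcntl_fork();
    if ($pid == -1) {
        die("fork failed\n");
    } elseif ($pid == 0) {
        // child: be nice to the box (re the note above: proc_nice()
        // does exist - PHP 5+, Unix only)
        proc_nice(10);
        foreach ($chunk as $file) {
            // include the file and generate tests for its classes here
        }
        exit(0);
    }
    // parent falls through and forks the next child
}

// parent: reap all children before exiting
while (pcntl_waitpid(-1, $status) > 0) {
    // $status could be inspected here for per-child results
}
?>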
again, central logging could allow you to control how fast children are
spawned, max memory usage, number of children.... the AUTGenerator family
tree (sideways):
1. SCANNER
    proj1
    proj2
        kid1
        kid2
        kid3
    proj3
    proj4
2. PARSER
    proj1
    proj2
        kid1
        kid2
        kid3
    proj3
    proj4
3. GENERATOR
    proj1
    proj2
        kid1
        kid2
        kid3
    proj3
    proj4
the master daemon could even be rigged to respond to msgs
to change config values on the fly.
just a thought. keep us posted :-)
anyway, there is no shame in getting the CPU warm :-) if you are writing
compiled code, the continuous re-compiling amounts to the same kind of
thing, no?
>
>
> Manish Marathe
> SpikeSource, Inc.
> http://developer.spikesource.com