Posted by shimmyshack on 10/24/07 19:48
On Oct 24, 8:21 pm, jfizer <jfi...@vintara.com> wrote:
> > the fact that adding a header triples the load time suggests to me
> > that what you are annoyed about is the rendering time, rather than the
> > script time.
>
> Could very well be; I haven't tried to time the difference, if any,
> when the application consumes the XML.
>
> What's more, it turns out that the resulting XML file is over five
> megs, which would be the major source of the performance bottleneck.
>
> Other than that, any major issues with my PHP code? I'm new to the
> language (this is my first project with it) and I'm not sure I'm doing
> things in the optimal manner.
Well, I might concatenate everything into one variable,
$tempXML .= "<el_1>$var_1</el_1><el_2>$var_2</el_2>...";
and then print it once at the end.
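Fleshed out a little (untested sketch; $result, el_1 and el_2 are
placeholders for your actual result resource and column names):

$tempXML = '';
while ($row = mysql_fetch_assoc($result)) {
    // build the whole document in one string instead of echoing per row
    $tempXML .= '<el_1>' . htmlspecialchars($row['el_1']) . '</el_1>'
              . '<el_2>' . htmlspecialchars($row['el_2']) . '</el_2>';
}
echo $tempXML; // a single print at the end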
However, have you thought of calling mysqldump directly? (Note it's
-ppassword with no space, and you need to name the database.)
$command = 'mysqldump -q -X -u user -ppassword dbname';
passthru($command);
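For example (untested; the escapeshellarg() calls and the
content-type header are my own additions):

// stream mysqldump's XML output straight to the client
header('Content-Type: text/xml');
$command = 'mysqldump -q -X -u ' . escapeshellarg($user)
         . ' -p' . escapeshellarg($password)
         . ' ' . escapeshellarg($database);
passthru($command);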
It might be faster! Follow Jerry's advice and add some calls to
microtime() around certain lines, using the differences to see where
the bottlenecks are occurring. Since your file is so big, I would
also be tempted to use on-the-fly gzipping to get that data down to
roughly 1/8th of its size, depending on the data. I know that adds
compression and decompression overhead, but it will speed up the
actual download for a file that size.
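For the timing, something along these lines (my own untested sketch;
microtime(true) needs PHP 5, on PHP 4 you would have to add up the
two parts of the string plain microtime() returns):

$t0 = microtime(true);
// ... run the query ...
$t1 = microtime(true);
// ... build and print the XML ...
$t2 = microtime(true);
echo 'query: ' . ($t1 - $t0) . 's, output: ' . ($t2 - $t1) . 's';

And you don't have to gzip by hand; the output buffer can do it for
you, assuming PHP was built with zlib:

// compresses everything echoed after this point, provided the
// client sent an Accept-Encoding: gzip header
ob_start('ob_gzhandler');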
Also try removing the while loop: use
$num = mysql_num_rows($result);
(not mysql_num_fields(), which counts columns) to get the number of
rows returned, then use a for loop to concatenate a variable, and
print it once you exit the for loop.
Again untested, it might be faster; I seem to remember while loops
can be slow, but I haven't actually benchmarked that claim, so don't
trust it.
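A rough sketch of what I mean (again, $result and 'somecol' are
placeholders for your actual result resource and column name):

$num = mysql_num_rows($result);
$out = '';
for ($i = 0; $i < $num; $i++) {
    $row = mysql_fetch_assoc($result);
    $out .= '<row>' . htmlspecialchars($row['somecol']) . '</row>';
}
echo $out;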