Posted by _q_u_a_m_i_s's on 10/05/06 15:42
I won't hit the memory limit because I only read about 10 to 64 KB at a
time from the file, process it, and write it out to the other one.
I'm not sure that increasing the execution time is the solution:
this thing should be able to process files of about 100-200 MB, and if I
need 30 seconds for a 2-3 MB file, I can't imagine how long it will take
to process a 100 MB file. During that time the user won't get any
feedback...
I'm going to take your solution into consideration; a longer execution
time and a FAST server might do the trick...
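For reference, the read/process/write loop described above could be sketched as below. This is only a sketch under one assumption: that the per-byte work can be expressed as a 256-entry lookup table, which lets a single strtr() call transform each chunk in one pass instead of a PHP-level for loop calling ord() on every character (the per-call overhead is usually what makes that loop slow). The uppercasing transform and the stream handling are placeholders.

```php
<?php
// Sketch: process a stream chunk by chunk using a byte->string map built
// once, so strtr() does the per-character work in a single C-level pass.
function process_stream($in, $out) {
    static $map = null;
    if ($map === null) {
        $map = [];
        for ($b = 0; $b < 256; $b++) {
            // Placeholder transform: uppercase each byte.
            $map[chr($b)] = strtoupper(chr($b));
        }
    }
    while (!feof($in)) {
        $chunk = fread($in, 8192);   // same ~8KB buffer size as above
        if ($chunk !== false && $chunk !== '') {
            fwrite($out, strtr($chunk, $map));
        }
    }
}

// Demo on in-memory streams so the sketch runs without real files.
$in = fopen('php://memory', 'r+b');
fwrite($in, "hello, world");
rewind($in);
$out = fopen('php://memory', 'r+b');
process_stream($in, $out);
rewind($out);
echo stream_get_contents($out), "\n";  // HELLO, WORLD
?>
```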
Erwin Moller wrote:
> quamis@gmail.com wrote:
>
> > Hi,
> > I need to process every character in a file, so I open the file, read
> > it in buffers of about 8192 bytes, process each buffer, and then write
> > the output to another file.
> >
> > The problem is that with large files (>8 MB) I get a script error (Fatal
> > error: Maximum execution time of 30 seconds exceeded).
> > I access every character in the buffer with
> > $chr = ord($bufferIn{$i}); (where $i = 0...8192)
> > It seems like all the time the script consumes is in the for loop and the
> > chr/ord functions.
> >
> > Can I do something to speed things up?
> > Is there any other way of accessing a single character except
> > $bufferIn{$i} ?
>
> Hi,
>
> I am not sure if you can speed things up in your script, but why not simply
> increase max_execution_time?
>
> This can be done with
> ini_set("max_execution_time",60);
> for 60 seconds.
>
> Have a look here for more options:
> http://nl2.php.net/manual/en/ini.php#ini.list
>
> You might also hit the roof of your memory usage if you handle and process
> very large files. In that case, look at: memory_limit (defaults to 8MB)
>
> Hope that helps.
>
> Regards,
> Erwin Moller
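Erwin's suggestion as a minimal runnable fragment (set_time_limit() is an alternative that also restarts the timer each time it is called, which can be handy inside a long loop; the memory_limit line is commented out and only illustrative):

```php
<?php
// Raise the execution-time ceiling to 60 seconds for this script.
ini_set('max_execution_time', 60);
// set_time_limit(60);             // equivalent, and resets the counter

// memory_limit can be raised the same way if large buffers are kept around:
// ini_set('memory_limit', '64M');

echo ini_get('max_execution_time'), "\n";  // 60
?>
```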