Posted by cluthz on 05/07/07 23:18
Hi there,
I have written a script to update some tables. There are 15000 rows in my
table and the table has about 35 fields.
My PHP script first performs a SELECT on the main table, like so:
select * from maintable
Then I enter a foreach statement which, for each row in the main table,
updates another table with a value from maintable, e.g. (pseudo-ish code):
*************START*******
require_once 'DB.php';
$db = DB::connect(to db);
select name, email, id from maintable
foreach (row in maintable) {
    call a subroutine, which DB::connect()s to the same database
    again (inside the subroutine);
    the subroutine performs an UPDATE on another (new) table;
    all done and free to move on to the next maintable row.
}
********END**************
When I first tested it on a sample of 150 rows, everything worked perfectly.
However, when I ran it on the full 15000, I got this error:
Fatal error: Allowed memory size of 16777216 bytes exhausted (tried to
allocate 39 bytes) in /usr/share/pear/DB/common.php on line 682
Therefore I amended the code to echo a counter to the screen on each
iteration of the loop.
When I do this it does not give an error message, but the counter only gets
to about 5300 and then just seems to stop, without reporting any errors.
The browser just says DONE.
So what can I do? I did try unset()ing variables at the end of the
subroutine, to no avail.
OK, I'm thinking I could rewrite the code to only process perhaps 5000 rows
at a time. But I don't think I should have to, since each time the
subroutine returns, shouldn't its memory and connections be freed up?
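To be concrete, something like this is what I have in mind for the batch
approach, combined with reusing one connection instead of reconnecting per
row. This is only a sketch, untested on my data: $dsn, newtable, and the
column names are placeholders for my real setup, and the LIMIT/OFFSET
syntax assumes MySQL or PostgreSQL.

```php
<?php
require_once 'DB.php';

// $dsn is a placeholder, e.g. 'mysql://user:pass@localhost/mydb'
$db = DB::connect($dsn);   // connect ONCE, outside the loop
$batchSize = 5000;

for ($offset = 0; $offset < 15000; $offset += $batchSize) {
    $res = $db->query(
        "SELECT name, email, id FROM maintable LIMIT $batchSize OFFSET $offset"
    );
    while ($row = $res->fetchRow(DB_FETCHMODE_ASSOC)) {
        // reuse the same $db handle instead of reconnecting per row
        $db->query(
            'UPDATE newtable SET email = ? WHERE id = ?',
            array($row['email'], $row['id'])
        );
    }
    $res->free();          // release the result set's memory each batch
}
$db->disconnect();
?>
```

The $res->free() call is the part I suspect matters most, since each
un-freed result set would pile up against the 16 MB memory_limit.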
So maybe this means I have another problem with my code. Either way, I
thought I would seek an expert opinion. Experts... any help appreciated.
Thanks in advance.