Posted by Eric Anderson on 01/30/06 15:53
NC wrote:
> Eric Anderson wrote:
>> NC wrote:
>>> This is not necessarily a good thing. Because you want to minimize
>>> disk usage, you are missing out on MySQL's ability to process large
>>> files very quickly. I would suggest an alternative approach:
>>>
>>> 1. Copy the remote file to your local disk.
>>> 2. Use LOAD DATA INFILE to load the data into MySQL.
>>> 3. Delete the data file, if necessary.
>> Interesting approach. It would be fast and low on resources (although it
>> would require usage of the filesystem but perhaps that isn't too big of
>> a deal). The only downside is that it is MySQL specific. Currently this
>> application is database independent and it would be nice to keep it that
>> way.
>
> If memory serves, all SQL databases support import of text files. The
> query syntax may differ, but the concept is clearly there...
I went with a variation on this method. It occurred to me that PHP
variables are probably not designed to hold large amounts of data. So
even though Stream_Var is convenient, it is probably not efficient:
as new data is read in from the FTP server, PHP is probably
reallocating memory continuously, which really drags performance down.
So I instead have ftp_fget() write out to disk to whatever handle
tmpfile() gives me, then use fgetcsv() to read that back from disk and
insert into the database as before.
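For anyone curious, here is a rough sketch of what I mean (the FTP host, credentials, and remote path are placeholders, and the actual FTP calls are commented out since they need a live server):

```php
<?php
// Sketch of the tmpfile() approach. $ftpHost, $user, $pass and
// $remoteFile are hypothetical placeholders, not real values.

// 1. Download the remote file straight to an anonymous temp file on disk.
$tmp = tmpfile();   // auto-deleted when the handle is closed
/*
$conn = ftp_connect($ftpHost);
ftp_login($conn, $user, $pass);
ftp_fget($conn, $tmp, $remoteFile, FTP_ASCII);
ftp_close($conn);
*/
rewind($tmp);       // ftp_fget leaves the file pointer at the end

// 2. Read it back row by row with fgetcsv() and insert each row.
while (($row = fgetcsv($tmp)) !== false) {
    // e.g. execute a prepared INSERT statement with $row here
}

// 3. Closing the handle deletes the temporary file automatically.
fclose($tmp);
```

The nice part is that tmpfile() cleans up after itself, so step 3 of NC's recipe ("delete the data file") happens for free.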
This change cut the import from about 3 hours down to about 10
minutes! It still seems like the FTP library should offer a way to
stream FTP data directly into a consuming function such as fgetcsv()
without having to read the entire file into memory at once, but the
tmpfile() workaround performs well, so I am happy.
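For reference, the MySQL-specific route NC described would look roughly like this (table and file names are made up for illustration):

```sql
-- Hypothetical table and file names; LOAD DATA INFILE is MySQL-specific.
LOAD DATA LOCAL INFILE '/tmp/import.csv'
INTO TABLE my_table
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';
```

As NC says, most databases have an equivalent bulk-load statement (e.g. PostgreSQL's COPY), just with different syntax.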
Eric