Posted by Tyrone Slothrop on 06/29/07 17:52
I am writing a script to upload data to a delimited text file. The
data is in the format:
recordID \t the rest of the data ...
It is important that the recordIDs stay unique and that the newly
uploaded data replaces the old data in the event of duplicate
recordIDs.
The solution I came up with (trying to avoid looping through
all of the records of both files looking for dupes) was to:
* treat both files as arrays using file()
* key each line of each array with 'a' . recordID (the alpha prefix
makes the keys strings, which keeps array_merge from renumbering them)
* run array_merge($old_data, $new_data)
* run ksort() on the merged data to sort by recordID
* implode the data on \n and write it back to the data file
This seems to be working as planned. However, I am really concerned
that the old data might not be overwritten by the new data when
array_merge runs.
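For my own peace of mind, here is a minimal sketch of the behaviour I
am relying on (per the manual, array_merge() keeps the value from the
later array when two string keys collide; the records below are made up):

<?php
// Old data keyed by 'a' . recordID, then new data with one duplicate key.
$old = array('a101' => "101\told value", 'a102' => "102\tuntouched");
$new = array('a101' => "101\tnew value", 'a103' => "103\tbrand new");

// With string keys, the later array wins on a collision, so 'a101'
// should come out holding the newly uploaded line.
$merged = array_merge($old, $new);
print_r($merged);
?>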
Comments, suggestions, snark, flames? TIA!
The (simplified) code:
$data_file = "D:\\pathto\\data.txt";

// Existing data file and the uploaded file, one line per array element.
$dfile   = file($data_file);
$data_in = file(addslashes($_FILES['upload']['tmp_name']));
// I hate Windows servers!

// Key the existing records by 'a' . recordID; skip blank lines so they
// can't clobber the previous record's key.
foreach ($dfile as $line) {
    $line = trim($line);
    if (empty($line)) { continue; }
    list($id, $unused) = explode("\t", $line);
    $odata['a' . $id] = $line;
}

// Same thing for the newly uploaded records.
foreach ($data_in as $line) {
    $line = trim($line);
    if (empty($line)) { continue; }
    list($id, $unused) = explode("\t", $line);
    $ndata['a' . $id] = $line;
}

// Later array wins on duplicate keys, then sort by recordID.
$mdata = array_merge($odata, $ndata);
ksort($mdata);

// Write the merged data back out.
$fp = fopen($data_file, 'w');
fwrite($fp, implode("\n", $mdata));
fclose($fp);
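Tangent, since the two parsing loops are identical: I am thinking of
pulling them into a small helper like the one below (the name
parse_records is just a placeholder, not something in my script yet):

function parse_records($lines) {
    // Turn an array of raw lines into an array keyed by 'a' . recordID.
    $out = array();
    foreach ($lines as $line) {
        $line = trim($line);
        if (empty($line)) { continue; }
        list($id) = explode("\t", $line);
        $out['a' . $id] = $line;
    }
    return $out;
}

$odata = parse_records($dfile);
$ndata = parse_records($data_in);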