Posted by petersprc on 09/20/06 05:49
Hi,
One way to download files in PHP is to use the cURL extension
(php.net/curl). It lets you transfer a file with a time limit, so a slow
or stalled download is aborted automatically. The options are pretty
straightforward. Below is a function to download files over HTTP with
cURL. It can be called like so:
downloadUrl('http://www.yahoo.com', 'yahoo.html', 10);
The URL http://www.yahoo.com will be downloaded to the file yahoo.html.
The time limit is 10 seconds.
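As an aside, CURLOPT_TIMEOUT caps the total transfer time. If you also want
to limit how long cURL waits just to establish the connection, there is a
separate CURLOPT_CONNECTTIMEOUT option. Here is a minimal stand-alone sketch
(the URL is only a placeholder, and this snippet is separate from the full
function further down):

// Cap connection setup at 5 seconds and the whole transfer at 120 seconds
$ch = curl_init('http://www.example.com/somefile.zip');
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
curl_setopt($ch, CURLOPT_TIMEOUT, 120);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // return the body instead of printing it
$data = curl_exec($ch);
curl_close($ch);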
Source code is below:
function curlDownload($url, $destination, $timeout)
{
    // The result array will store the download status
    // and various statistics
    $result = array(
        'completed' => true,
        'timedOut' => false,
        'errors' => array(),
        'curlErrorNum' => 0,
        'curlStats' => null
    );

    // Open the local destination file
    if (!($fp = fopen($destination, 'w'))) {
        $result['completed'] = false;
        $result['errors'][] = "Failed to open file \"$destination\" for writing.";
        return $result;
    }

    // Initialize cURL with the URL
    $ch = curl_init($url);

    // Write the response body to the destination file handle
    curl_setopt($ch, CURLOPT_FILE, $fp);

    // Don't write the HTTP header to the output file
    curl_setopt($ch, CURLOPT_HEADER, 0);

    // Abort the transfer after $timeout seconds
    curl_setopt($ch, CURLOPT_TIMEOUT, $timeout);

    // Execute the request
    curl_exec($ch);

    // Close the destination file
    if (!fclose($fp)) {
        unlink($destination);
        $result['completed'] = false;
        $result['errors'][] = "Failed to close file \"$destination\".";
    }

    // Get the cURL error status
    $result['curlErrorNum'] = curl_errno($ch);
    if ($result['curlErrorNum'] != 0) {
        // Remove the partial file (it may already have been removed above)
        if (file_exists($destination)) {
            unlink($destination);
        }
        $result['completed'] = false;
        $result['errors'][] = curl_error($ch);

        // cURL error 28 means the request timed out
        if ($result['curlErrorNum'] == 28) {
            $result['timedOut'] = true;
        }
    }

    // Get download statistics
    $result['curlStats'] = curl_getinfo($ch);

    // Make sure the HTTP status code from the server is valid
    if ($result['curlStats']['http_code'] != 200) {
        if (file_exists($destination)) {
            unlink($destination);
        }
        $result['completed'] = false;
        $result['errors'][] = "Received HTTP code {$result['curlStats']['http_code']}.";
    }

    curl_close($ch);

    return $result;
}
function downloadUrl($url, $destination, $timeout)
{
    $result = curlDownload($url, $destination, $timeout);

    if ($result['timedOut']) {
        // Request timed out
        $errors = join(" ", $result['errors']);
        echo "Download of URL {$result['curlStats']['url']} timed out: $errors";
    } elseif (!$result['completed']) {
        // Another error occurred
        $errors = join(" ", $result['errors']);
        echo "Download of URL {$result['curlStats']['url']} failed: $errors";
    } else {
        // Download was successful
        echo "Downloaded " .
            number_format($result['curlStats']['size_download']) .
            " bytes from URL " . $result['curlStats']['url'] . " in " .
            number_format($result['curlStats']['total_time'], 2) .
            " seconds at a rate of " .
            number_format($result['curlStats']['speed_download']) .
            " bytes per second.";
    }

    print_r($result);

    return $result;
}
downloadUrl('http://www.yahoo.com', 'yahoo.com.html', 10);
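To handle the case in the original question (several files, skipping any
download that runs past two minutes), you can call downloadUrl() in a loop.
A failed or timed-out file only affects that iteration; the script moves on
to the next one. This is just a sketch, and the URLs and file names are
placeholders:

// Hypothetical list of files to fetch: remote URL => local destination
$files = array(
    'http://www.example.com/file1.zip' => 'file1.zip',
    'http://www.example.com/file2.zip' => 'file2.zip'
);

foreach ($files as $url => $destination) {
    // Each download gets its own two-minute (120-second) limit;
    // a timeout or error only skips this file, the loop keeps going
    $result = downloadUrl($url, $destination, 120);
    if (!$result['completed']) {
        continue; // skip to the next file
    }
}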
Best Regards,
John Peters
a.r.austin@gmail.com wrote:
> Hello,
>
> I am trying to download a few files one after another from a remote
> server. The problem is that I don't know how to, or whether I can at all,
> set a timeout for a download. I don't want to time out the whole script,
> just one part: if a file won't download within 2 minutes, skip to the
> next one. Previously I had a JavaScript implementation with AJAX; this
> time I thought of doing it in PHP since PHP has far better array
> functions, so I don't have to write most of it from scratch. But with
> the PHP implementation I ran into a problem where I am unable to cancel
> the download if it runs for more than so many minutes. As I said, I
> don't want to quit the script completely; all I need is to skip to the
> next file and try that.
>
> Any help? Please?