Posted by Krustov on 11/22/06 23:40
<comp.lang.php>
<Dave>
<Wed, 22 Nov 2006 14:13:06 -0000>
<456457ef$1_1@glkas0286.greenlnk.net>
> I'd like to be able to request a page from a server and then be able to
> analyse the content (rather than rendering it to the screen of my browser)
> in php. I guess this is a bit like how a robot works. I've got quite a lot
> of php knowledge already, but I can't think of how to do this.
>
> Has anybody any ideas on the types of functions or mechanisms I should be
> using for this?
>
<?php
// Fetch the remote page (requires allow_url_fopen to be enabled in php.ini)
$ganja = "http://www.yourdomain.com";
$handle = fopen($ganja, "rb");
$contents = '';
while (!feof($handle)) {
    $contents .= fread($handle, 8192);
}
fclose($handle);

// Strip the HTML tags and save the plain text to a file
$whatever = strip_tags($contents);
$filename = "store/demo.php";
$fp = fopen($filename, "w");
fwrite($fp, $whatever);
fwrite($fp, "\n");
fclose($fp);
?>
The above will grab the web page and strip the HTML tags before saving the
result as a text file.
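If allow_url_fopen is switched off on your server, the cURL extension does
the same job. A minimal sketch along those lines (the URL and the output
path are just placeholders, not anything from the original post):

<?php
// Fetch the page with cURL instead of fopen (assumes the cURL extension is installed).
$url = "http://www.yourdomain.com";          // placeholder URL
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // return the body as a string instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);   // follow any redirects
$contents = curl_exec($ch);
curl_close($ch);

// Analyse $contents however you like, e.g. strip the tags first
$text = strip_tags($contents);
file_put_contents("store/demo.txt", $text . "\n");  // placeholder output path
?>

Either way you end up with the raw page in a string, which is what you want
for analysing it rather than rendering it.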