Posted by Chris Hope on 04/17/07 00:56
Chris Hope wrote:
> Krustov wrote:
>
>> <comp.lang.php>
>> <Toby A Inkster>
>> <Mon, 16 Apr 2007 18:31:26 +0100>
>> <egocf4-4kp.ln1@ophelia.g5n.co.uk>
>>
>>> > In your opinion - what are the problems?
>>>
>>> There aren't any problems per se -- it's just far too big a problem
>>> for one programmer to complete in two hours. It's a task that's
>>> measured in "man-years" rather than "man-hours".
>>>
>>
>> Even if the end result is only to produce a small thumbnail?
>
> Of course. Whether it's a thumbnail or not, you're going to need to
> render the page much as it actually appears, which means you need to
> be able to parse table layouts, CSS positioning and so on. It takes a
> lot of time to get that working.
>
> I think it might have been Toby who posted a method here in the past
> for capturing screenshots of websites using a browser and some
> scripting. I have used this method myself and it works really well.
> An example is below.
>
> This is on a Linux system. You may or may not be able to do it on
> other systems, but it works nicely for me. I run Opera fullscreen on a
> 1024x768 screen:
>
> system("opera -remote 'openURL($url)'");
> sleep(30);
> system("import -display :0 -window root temp/$filename.jpg");
>
> The 30-second sleep gives the page time to finish downloading all the
> images and so on. I probably don't need to make it wait that long -
> I'm just being safe.
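
Put together as a self-contained PHP snippet, the same three steps look
roughly like this (the escapeshellarg() calls are an extra safety
measure I've added, and the $url and $filename values are just
placeholders for whatever the calling script supplies):

<?php
// Placeholder values - in practice these come from the calling script.
$url      = 'http://www.example.com/';
$filename = 'example';

// Ask the already-running Opera instance to load the page.
// escapeshellarg() guards against shell metacharacters in the URL.
system('opera -remote ' . escapeshellarg("openURL($url)"));

// Give the page time to finish loading images, stylesheets, etc.
sleep(30);

// Grab the whole root window of display :0 with ImageMagick's import.
system('import -display :0 -window root ' . escapeshellarg("temp/$filename.jpg"));
?>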
And of course you then need to resize it down to thumbnail size using
GD, ImageMagick or other such tools.
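
As a rough illustration of the GD route, here's a minimal sketch
assuming the 1024x768 screenshot captured above is sitting in
temp/$filename.jpg; the 200-pixel thumbnail width is an arbitrary
choice:

<?php
$filename = 'example';

// Load the captured screenshot and read its dimensions.
$src    = imagecreatefromjpeg("temp/$filename.jpg");
$width  = imagesx($src);
$height = imagesy($src);

// Scale the height to keep the original aspect ratio.
$thumbWidth  = 200;
$thumbHeight = (int) round($height * ($thumbWidth / $width));

// Create the thumbnail canvas and resample the screenshot onto it.
$thumb = imagecreatetruecolor($thumbWidth, $thumbHeight);
imagecopyresampled($thumb, $src, 0, 0, 0, 0,
                   $thumbWidth, $thumbHeight, $width, $height);

// Save as JPEG at quality 85 and free the image resources.
imagejpeg($thumb, "temp/$filename-thumb.jpg", 85);
imagedestroy($src);
imagedestroy($thumb);
?>

Shelling out to ImageMagick's convert with its -resize option would do
the same job in a single command.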
--
Chris Hope | www.electrictoolbox.com | www.linuxcdmall.com