Posted by J.O. Aho on 01/25/07 11:31
vjp2.at@at.BioStrategist.dot.dot.com wrote:
> I have an html file with like 3,000 links.
>
> Every two years (ca 1995-2000) I used to run it in pieces of 500 links
> on netmechanic. It would email you the broken links.
>
> I realise there are now ways to do this on your own machine (spider et al.).
> But I'm afraid it might slow down my BASH Unix Shell ISP.
You can use nice to give it a lower priority:
nice -n 20 your_script
If some other process needs the CPU more than your script, it will yield, and
hardly anyone will notice anything. You can also pick a time when the host is
less busy. I guess that script of yours uses wget, which is quite a
CPU-friendly application; you can also tell it to limit its bandwidth usage if
you are afraid it would eat up too much of that.
And it's nothing you need to run every day, so I don't think you will have any
problems.
--
//Aho