Recommendation for efficiently fetching headers from many URLs?

Posted by usenet on 08/01/07 21:47

I have a MySQL database which includes several thousand links to pages on
external sites. Naturally, over time some of those links go away, so I would
like to create a script that reads through the URL fields, accesses each link,
and performs an action based on each response code.

I can think of at least three ways to accomplish the header fetch -- http_head,
get_headers, and curl -- but I want to choose the most efficient option that
won't suck up more server resources & bandwidth than is absolutely necessary. I
imagine that I'll also need to throttle the number of requests per second, since
this low-priority task should interfere as little as possible with other
connections in & out of the server.
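For what it's worth, here is a minimal sketch of the curl variant I'm considering: CURLOPT_NOBODY turns the request into a HEAD so only the headers come back, and a usleep() between requests is the simplest throttle I can think of.

<?php
// Minimal sketch of the curl approach: HEAD requests only, with a crude throttle.
function check_url($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);             // HEAD request, skip the body
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);     // don't echo anything
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);     // follow redirects to the final code
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);              // don't hang on dead hosts
    curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);      // 0 if the connection failed
    curl_close($ch);
    return $code;
}

$urls = array('http://www.example.com/');               // in practice, pulled from the database
foreach ($urls as $url) {
    $code = check_url($url);
    // ... act on $code here ...
    usleep(500000);                                     // roughly two checks per second
}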

Does anyone have a recommendation about which method is most efficient?

Thanks for any and all advice.

 
