Posted by Virginner on 04/27/07 17:44
"Blagovist" <blag@ovist.com> wrote in message news:463097cb_1@x-privat.org...
 > Virginner wrote:
 >> "Blagovist" <blag@ovist.com> wrote in message
 >> news:462f0f0f_3@x-privat.org...
 >>> Hi.
 >>> Is there an easy way to "lift" data from HTML tables and enter that into
 >>> my database? I'm a total novice and so far my searches have yielded
 >>> little. I see Navicat has an import option, but that appears to be for
 >>> well structured data like Word, Excel or PDF...
 >>>
 >>> Thanks,
 >>>
 >>> Blago
 >>
 >> If you've got Excel, then you can "bounce" a table via that (copy /
 >> paste) then use that to import via Navicat....
 >>
 >> D.
 >
 > I found something called easywebsave (an IE add-on) that looks promising.
 > But still a long way from being automated.
 
 Ah! You didn't state "automated" in your OP, hence my suggestion about
 Excel -> Navicat.
 
 If you want it automated, then file_get_contents() the URL into a string,
 strip_tags() everything except the table-related tags, then use a few
 explodes or preg_splits to rip the remaining data into array(s).
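 
 A minimal sketch of that approach in PHP (the function names above are
 PHP's). The sample HTML, variable names, and regexes here are illustrative
 assumptions, not a drop-in tool — for messy real-world pages a proper
 parser such as DOMDocument would be more robust:
 
 ```php
 <?php
 // In real use you would fetch the page instead:
 //   $html = file_get_contents('http://example.com/page.html');
 $html = '<h1>Junk</h1><table>'
       . '<tr><td>a</td><td>b</td></tr>'
       . '<tr><td>c</td><td>d</td></tr>'
       . '</table>';
 
 // Strip everything except table-related tags.
 $stripped = strip_tags($html, '<table><tr><td><th>');
 
 // Pull out each row, then each cell within the row.
 $rows = array();
 preg_match_all('/<tr[^>]*>(.*?)<\/tr>/is', $stripped, $trMatches);
 foreach ($trMatches[1] as $tr) {
     preg_match_all('/<t[dh][^>]*>(.*?)<\/t[dh]>/is', $tr, $cellMatches);
     $rows[] = array_map('trim', $cellMatches[1]);
 }
 
 print_r($rows);
 // $rows is now array(array('a','b'), array('c','d')) — ready to loop
 // over and INSERT into your database.
 ```
 
 From there it is one loop of INSERT statements per row, which is the
 part Navicat was doing for you in the copy/paste route.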
 
 D.
 --
 googlegroups > /dev/nul