Posted by Ben C on 10/31/06 07:57
On 2006-10-30, howcanimissu@hotmail.com <howcanimissu@hotmail.com> wrote:
> Hi All,
>
> I am wondering what the simplest tool or best way is to go to a website
> using a program or tool and either:
> a) saving the file at the URL ( In this case a ***.txt file is the url)
> so it would be something like http://www.testsite.com/testfile.txt.
> This file is a simple pipe delimited text file with some rudimentary
> data on it. I would like to automate this process so that whatever tool
> I use goes to the site once every minute to get an update and saves the
> file to my ftp.
There are lots of ways. To get the page, wget or curl are suitable
programs; Python's httplib will also do it.
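To make that concrete, here is a minimal sketch in modern Python (the post's httplib is the Python 2 name; urllib.request is the current standard-library equivalent). The URL is the example one from the post, and fetch_to_file is just a name I made up for the helper:

```python
import urllib.request

# Example URL from the original post.
URL = "http://www.testsite.com/testfile.txt"

def fetch_to_file(url, dest_path):
    """Download the document at url and write it to dest_path."""
    with urllib.request.urlopen(url) as resp:
        data = resp.read()
    with open(dest_path, "wb") as f:
        f.write(data)
    return data

# To poll once a minute, wrap it in a loop, e.g.:
#     import time
#     while True:
#         fetch_to_file(URL, "testfile.txt")
#         time.sleep(60)
```

A cron job running wget or curl every minute does the same thing without any Python at all.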
> Following that I have to put this data into an xls file
>
> b) Mine the above data and delimit it and send it to an xls or html
> file.
>
> Any thoughts? Not looking to do much coding here, just a tool to use if
> possible to automate this process; will buy one if I need to.
You are going to have to do a little bit of coding, I think. The task is
just too specific; I don't see how anyone could sell a tool that does
exactly this unless you paid someone specifically to write it.
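The coding needn't be much, though. For the second part, a sketch using Python's standard csv module: convert the pipe-delimited text to comma-separated text, which Excel opens directly (writing a true .xls would need a third-party library). The function name and sample data are made up for illustration:

```python
import csv
import io

def pipe_to_csv(pipe_text):
    """Convert pipe-delimited text to comma-separated text Excel can open."""
    out = io.StringIO()
    reader = csv.reader(io.StringIO(pipe_text), delimiter="|")
    writer = csv.writer(out)
    for row in reader:
        writer.writerow(row)
    return out.getvalue()

# Hypothetical sample of the pipe-delimited file:
sample = "name|price|qty\nwidget|1.50|3\n"
print(pipe_to_csv(sample))
```

Save the result with a .csv extension and double-clicking it will open it in Excel.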