Posted by axel on 08/18/06 01:03
In uk.net.web.authoring John Bokma <john@castleamber.com> wrote:
> axel@white-eagle.invalid.uk wrote:
>>> See: <http://johnbokma.com/windows/apache-virtual-hosts-xp.html>
>>> on how to create local versions of your site(s). (Windows XP).
>> A good guide.
> Thanks Axel, I have been working today like crazy to update it to 2.0 and
> fix some minor issues with the 1.3.x version.
Don't mention it :)
>> Your approach makes more sense in using 127.0.0.1 as the one base
>> IP address when only using a single local machine, which I'm doing
>> at the moment... it saves having to edit configuration files when
>> moving to a different network (a couple of months ago I had to
>> switch to a 10.0.1 network).
> It depends a lot on what you want, I don't want most of my sites to
> become visible on the LAN :-)
I see your point.
>> Although I have a development site on my local machine which I use
>> to check out things before uploading... I do my real development a
>> stage before that by using Makefiles, the htp 1.15 HTML
>> pre-processor
>> (old, but it works just fine) and various perl scripts to create
>> the development site, or parts thereof. In other words I write as
>> little HTML as possible.
> Ditto. I use XML for the content, and parse and process it with
> Perl into HTML. All things that the Perl script can solve it does
> (like finding out the values for width and height attributes for
> the img element).
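Neat, letting the script measure the images. I would guess that bit
boils down to something along these lines... Image::Size and the file
name are my guesses, no doubt you do it differently:

use strict;
use warnings;
use Image::Size;

# Turn an image path into a complete img element; the alt text is
# supplied by the caller. Just a guess at the shape of it.
sub img_tag {
    my ($file, $alt) = @_;
    my ($width, $height) = imgsize($file);
    defined $width or die "can't read image size of $file";
    return qq{<img src="$file" width="$width" height="$height" alt="$alt">};
}

print img_tag('images/castle.jpg', 'Castle Amber'), "\n";

Beats ever typing a width attribute by hand.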
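On my side the Makefile mostly just drives htp and the helper scripts.
Stripped right down it amounts to something like this sketch... the
file names and the exact htp invocation are from memory, so don't hold
me to the details:

#!/usr/bin/perl
# build-dev.pl -- rough sketch of the preprocessing step; file names,
# directory layout and the exact htp invocation are from memory.
use strict;
use warnings;

my @sources = glob 'src/*.htt';

for my $src (@sources) {
    (my $out = $src) =~ s!^src/(.+)\.htt$!dev/$1.html!;

    # only rebuild when the source is newer than the generated page
    next if -e $out and -M $out < -M $src;

    system('htp', $src, $out) == 0
        or die "htp failed on $src: $?";
    print "built $out\n";
}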
I suppose my approach came from something I originally did some years ago.
We wanted to scan, cut and paste press releases from a specific market
area. I found the simplest approach for those who did it was to fill
in a blank file with two or three lines starting with '#' indicating the
title, possible subtitle and company. Such a brain-dead scan, copy and
paste thing that anyone could do it.
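From memory the parsing end of it was no more than this sort of thing
(field order and names are approximate):

#!/usr/bin/perl
# Sketch of the press-release parser, reconstructed from memory, so
# field order and names are approximate.
use strict;
use warnings;

my @headers;   # the leading '#' lines: title, optional subtitle, company
my @body;      # the pasted press release text

while (my $line = <DATA>) {
    chomp $line;
    if (!@body and $line =~ /^#\s*(.*)/) {
        push @headers, $1;
    }
    else {
        push @body, $line;
    }
}

my ($title, @rest) = @headers;
my $company  = pop @rest;             # the company is the last '#' line
my $subtitle = shift(@rest) || '';    # the middle line, if there was one

print "title:    $title\n";
print "subtitle: $subtitle\n" if $subtitle;
print "company:  $company\n";
print "body:\n", join("\n", @body), "\n";

__DATA__
# Example Title Of The Press Release
# Example Company Ltd
This is where the scanned and pasted text went.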
Obviously formatting and clickable URLs were lost, but it was a free
service and if more was wanted, it could be paid for.
Originally it had been an experimental thing with actual HTML files
being filled in (a 3-stage business process... test something in raw
HTML, if it is found to be of interest move to a script, and then finally
to a database if decided worthwhile).
That was interesting, especially when someone, ok, mea culpa, was
rushed/lazy and forgot to replace the 'XXX' placeholder text that had
been set up for cut-and-paste. The result: many hits from all over the
world on a press release of little interest to anyone outside a
limited audience, and offers to buy the URL... not the domain, but
specifically the individual deep URL.
We were bemused, but then found out why... on a search engine search
for 'XXX' (I think it was Yahoo... it might have been Alta Vista)
it was turning up in the top ten results.
> Another script creates the RSS feed (it extracts the title from the page,
> and uses it as the title for the feed, etc.).
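Sensible. I would guess the title extraction plus the feed writing
comes down to something like this... XML::RSS and the site details are
my guesses, you no doubt do it your own way:

use strict;
use warnings;
use XML::RSS;    # my guess; there are plenty of other ways to write a feed

# Pull the <title> out of a generated page so it can double as the
# title of the feed item.
sub page_title {
    my ($file) = @_;
    open my $fh, '<', $file or die "can't open $file: $!";
    local $/;                     # slurp the whole page
    my $html = <$fh>;
    my ($title) = $html =~ m{<title>\s*(.*?)\s*</title>}si;
    return $title;
}

my $rss = XML::RSS->new(version => '2.0');
$rss->channel(
    title       => 'Example site',               # channel details made up
    link        => 'http://www.example.com/',
    description => 'New and updated pages',
);

for my $page (glob 'dev/*.html') {
    (my $url = $page) =~ s!^dev/!http://www.example.com/!;
    $rss->add_item(title => page_title($page), link => $url);
}

print $rss->as_string;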
> And another script uploads the stuff using plink (part of PuTTY).
I use scp (well, we use different OS's)... I make an initial copy and
then ssh in and run an update (delete and move files) script.
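Stripped of the real host and paths (the ones below are made up, as is
the name of the remote script) it is hardly worth calling a script:

#!/usr/bin/perl
# upload.pl -- the two-step upload; host, user, and the name of the
# remote update script are made up here.
use strict;
use warnings;

my $host   = 'axel@www.example.com';
my $local  = 'dev/';
my $remote = 'public_html/incoming/';

# 1. copy the freshly built pages into a staging directory
system('scp', '-r', $local, "$host:$remote") == 0
    or die "scp failed: $?";

# 2. ssh in and run the update script (deletes old files and moves
#    the new ones into place)
system('ssh', $host, './update-site.sh') == 0
    or die "remote update failed: $?";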
> And all is kicked into action using ant :-)
Yes... that makes a lot of sense.
Although where do you find the time to do all this!? I'm still behind
on uploading a few cat photos taken over a month ago... ag, I'm just a
lazy toad (my tutor at university always called me that).
Axel