Posted by news on 11/11/16 11:28
I have a new situation I'm facing and could use a suggestion or two, as
I don't seem to be able to think in the abstract very well.
We have a local server which holds all of our image files.
We have a remote server that runs our public Web server and MySQL
database.
I need to be able to run a script that will:
Read the contents of a directory on the local server and:
a. make thumbnails of the files in it
b. query the database and pull information on each file based on the
filename
c. create a spreadsheet and e-mail it to third parties
Now, I know how to do each of those things; I don't need help there.
I just need a way to get it to span the two servers.
Here's what I was thinking:
1. Have the PHP script on the local server process the files and send
the filenames it processed...
2. ...via a GET in a wget to another PHP script on the remote server,
which pulls the database info for those filenames and creates the
spreadsheets/e-mails (rough sketch below)
or
1. Have the PHP script on the local server process the files and write
the filenames it processed to a local text file named with a timestamp.
2. Then have it initiate a wget to a PHP script on the remote server,
passing that text file's name in a GET, which tells the remote script
where to find the list of filenames it needs to pull the information
for from the DB before generating the sheets/e-mails (sketch below)
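Again very roughly, and assuming the local box is reachable over HTTP
from the remote one so the remote script can fetch the list back (all
paths and hostnames here are placeholders):

<?php
// Local side: write the list, then ping the remote script with its name.
$processed = array('IMG_0001.jpg', 'IMG_0002.jpg');
$list = 'processed_' . date('YmdHis') . '.txt';
file_put_contents('/var/www/lists/' . $list, implode("\n", $processed));

shell_exec('wget -q -O - ' . escapeshellarg(
    'http://www.example.com/report.php?list=' . urlencode($list)));
?>

<?php
// Remote side: fetch the named list back from the local server (needs
// allow_url_fopen), then do the same per-filename DB work as above.
$list  = basename($_GET['list']); // basename() so nobody can pass ../../etc/passwd
$files = file('http://local.example.com/lists/' . $list, FILE_IGNORE_NEW_LINES);
foreach ($files as $file) {
    // ... query the DB for $file, build the spreadsheet, e-mail it ...
}
?>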
or
1. Keep a regularly backed-up copy of the remote database on the local
server via scp of the exported database.
2. Have the PHP script on the local server process the files AND do all
the DB queries and sheets/e-mails against that locally backed-up copy
(sketch below)
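i.e. something like this on the local server, probably run from cron --
assuming passwordless scp keys are set up, and with made-up paths,
credentials, and an images_copy DB name:

<?php
// Pull the latest dump from the remote box and load it into a local
// MySQL instance, then run everything against the copy.
shell_exec('scp user@www.example.com:/backups/images.sql /tmp/images.sql');
shell_exec('mysql -u user -ppass images_copy < /tmp/images.sql');

$db  = mysqli_connect('localhost', 'user', 'pass', 'images_copy');
$res = mysqli_query($db, "SELECT * FROM images WHERE filename = 'IMG_0001.jpg'");
// ... thumbnails, spreadsheet, and e-mail all happen on this one machine,
// with the caveat that the copy is only as fresh as the last sync.
?>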
What do you think? Or is there a better way that I'm just not thinking
of?
Thanks for any feedback!
Liam