Posted by John Bell on 12/31/05 11:54
Hi Rey
You have not said enough about how the systems are connected and how the
data is stored to give a specific answer.
If you receive an XML file to process, you can copy it (or have the same file
sent) to the remote server, which can load it using the same application
that is on your local server. There is no need to export the data from the
local database unless the required processing would either put too much load
on the remote system, or the remote system does not have all the information
needed to process the file correctly.
If you do need to process the data locally and then send the modified data
to your remote system, then your time scales may indicate that replication
would be a suitable option. As the local and remote tables have the same
structure, there is no need to use XML as an intermediate format; you may
want to look at creating a DTS (Data Transformation Services) package to
export/import the data. If you only have FTP access to the remote system,
you will need a scheduled DTS task on each system: one on the local
system to export the data and FTP it across, and one on the remote system
to check for new data files and load them. Check out Books Online and
http://www.sqldts.com/default.aspx for more information on DTS.
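The export/FTP/load pattern above can be sketched outside of DTS as well. The following is a minimal Python sketch of the two halves: the local side names and uploads a batch file, and the remote side picks out files it has not loaded yet. The file prefix, host, credentials, and directory names are all placeholders, not anything from your systems:

```python
# Sketch of the scheduled export/FTP + import pattern (hypothetical names).
import ftplib
import os
import time

def batch_filename(prefix="orders", ts=None):
    """Timestamped file name so the remote poller can spot new batches."""
    ts = ts if ts is not None else time.gmtime()
    return "%s_%s.csv" % (prefix, time.strftime("%Y%m%d%H%M%S", ts))

def new_files(listing, processed):
    """Remote side: from a directory listing, pick files not yet loaded."""
    return sorted(f for f in listing if f not in processed)

def upload(path, host, user, password, remote_dir="incoming"):
    """Local side: push the exported batch to the remote FTP drop folder."""
    ftp = ftplib.FTP(host)
    ftp.login(user, password)
    ftp.cwd(remote_dir)
    with open(path, "rb") as fh:
        ftp.storbinary("STOR " + os.path.basename(path), fh)
    ftp.quit()
```

On the remote side a scheduled job would list the drop folder (e.g. `ftp.nlst()` or a plain directory scan), call `new_files` against a record of what has already been loaded, import each new file, and then mark it as processed.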
There are other options, such as writing a web service (SQL Server 2005
makes this easier to host), but that may be more than you actually need.
John
"_(d)IEGO" <rey_guerrero@hotmail.com> wrote in message
news:1136007732.925963.8950@g43g2000cwa.googlegroups.com...
> John,
>
> Thanks for the link.
>
> Please tell me if I am correct. First I have to export the data to XML
> on the local server then do an FTP to the remote server then process
> the uploaded XML file there. If so, will this approach be fast if I set
> the upload at a 30-minute interval and let's say that there is an
> average of 20 transactions per minute?
>
> Rey Guerrero
>