Posted by Tim Satterwhite on 12/08/06 07:50
Hi folks,
Thanks for your suggestions. A group of people I know gave this some
thought, and two suggested that I write a loop which polls the FTP'd file's
size every few seconds and compares successive readings. When the file size
stops growing, I can reasonably guess that the write is complete. I wrote
such a procedure in T-SQL today, and it seems to work in my testing
scenarios.
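For the curious, here's roughly what that loop looks like (a sketch only,
not my exact production code; the path, the 5-second delay, and the variable
names are illustrative, and the sp_OA* OLE Automation procedures must be
enabled on the server):

```sql
-- Poll the file's size until two successive readings match.
DECLARE @fso INT, @file INT, @size1 INT, @size2 INT, @hr INT,
        @path VARCHAR(260);
SET @path = 'C:\ftp\incoming\today.txt';   -- placeholder path
EXEC @hr = sp_OACreate 'Scripting.FileSystemObject', @fso OUT;
EXEC @hr = sp_OAMethod @fso, 'GetFile', @file OUT, @path;
EXEC sp_OAGetProperty @file, 'Size', @size1 OUT;
WAITFOR DELAY '00:00:05';
EXEC sp_OAGetProperty @file, 'Size', @size2 OUT;
WHILE @size1 <> @size2          -- still growing: keep waiting
BEGIN
    SET @size1 = @size2;
    WAITFOR DELAY '00:00:05';
    -- Note: if the File object caches Size, re-fetch it each poll
    -- with another GetFile call instead of re-reading the property.
    EXEC sp_OAGetProperty @file, 'Size', @size2 OUT;
END
-- Size unchanged across two polls; reasonable to assume the write is done.
EXEC sp_OADestroy @file;
EXEC sp_OADestroy @fso;
```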
In response to Roy, I'd love to have an end-of-job file sent, but I need to
process files as soon as they arrive. If I wait for all the files (I'm
contending with 500 files over a 45-minute period), then I'm inconveniencing
my customers. An EOJ file for each data file would inconvenience the COBOL
programmer who is sending all this data from the mainframe. I can start my
import process and pause for individual files that are in contention with
SQL Server, though.
In response to Jonathan, we can (and do) move the (older) files to another
directory after upload, but my problem is that the current day's file may
still be being written when my process, looping through the list of expected
files, reaches it.
Again, I think I might have nailed this by comparing file sizes over time.
I'll know when I move my code to production next week. I appreciate your
guys' time and energy!!!!
Regards,
Tim
"Tim Satterwhite" <timothy.satterwhite@nospamucsfmedctr.org> wrote in
message news:el7hon$1krq$1@itssrv1.ucsf.edu...
> Hi All,
>
> I think this is a thorny problem, and I'm hoping you can help. I've not
> found this exact issue described anywhere yet.
>
> I have a stored procedure that calls BULK INSERT on a set of text files.
> These files are FTP'd from a legacy system (a mainframe running MVS).
> Sometimes, the process steps on itself, whereby the bulk insert is
> attempted on a file whose FTP is still in progress; it's not fully
> written to disk on the SQL box (it's a 100MB file that takes a long time
> to cross the network to the share on the Windows box), and so the insert
> generates a fatal error and the procedure stops.
>
> I wrote a proc that calls the sp_OA* procs to use the
> Scripting.FileSystemObject to test for the file's readiness. It returns
> an error for me if the file does not exist. It also returns an error if
> I try to run the BULK INSERT against a file which is being copied via
> Windows to the SQL box. This is working just fine under those
> conditions; the sp_OAMethod call to OpenTextFile bombs appropriately if
> the file write is still in progress.
>
> That's great, but it doesn't do the same thing during an FTP in
> progress. It doesn't generate the same error (that is, OpenTextFile has
> no problem opening a partially written FTP'd file). I can also open the
> file in Notepad, even though it's not fully written to disk ...
> something I did not expect, but there we are. What is it about FTP
> that's different from a Windows file system copy operation that makes
> the file look available for reading?
>
> If BULK INSERT is capable of detecting that it cannot do its thing on a
> file whose FTP is in progress, then what can I write or call to emulate
> that detection? I tried writing a COM object in VB.NET and calling that
> from my SQL stored proc instead of the Scripting Engine's FSO methods.
> My code simply tries to run a FileOpen using an exclusive read lock;
> however, this doesn't seem to work, and I'm shooting in the dark now.
>
> Can anyone tell me what kind of file I/O request BULK INSERT makes, such
> that it is capable of abending during a run against an incompletely
> written file using FTP?
>
> Thanks!
> Tim
>
>
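P.S. For anyone finding this thread later: besides the size-polling check,
another option I considered is simply letting the BULK INSERT fail and
retrying it. This is a hypothetical sketch only (it needs SQL Server 2005's
TRY/CATCH; the table name, path, delay, and retry count are all placeholders
I made up for illustration):

```sql
-- Attempt the BULK INSERT; if it fails (e.g. because the FTP is still
-- writing the file), wait and retry a limited number of times.
DECLARE @path NVARCHAR(260), @sql NVARCHAR(1000), @tries INT, @done BIT;
SET @path  = N'C:\ftp\incoming\today.txt';   -- placeholder path
SET @tries = 0;
SET @done  = 0;
SET @sql   = N'BULK INSERT dbo.StagingTable FROM ''' + @path
           + N''' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')';
WHILE @done = 0 AND @tries < 10
BEGIN
    BEGIN TRY
        EXEC (@sql);          -- fails if the file is incomplete or locked
        SET @done = 1;
    END TRY
    BEGIN CATCH
        SET @tries = @tries + 1;
        WAITFOR DELAY '00:00:30';   -- give the transfer time to finish
    END CATCH
END
```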