Posted by Erland Sommarskog on 09/30/80 11:17
Richard Hollis (richard_hollis@hotmail.com) writes:
> Using BCP with the hints TABLOCK and ROWS_PER_BATCH, what is the best
> setting to give ROWS_PER_BATCH? Is it the size of the table you are
> importing in rows or a logical division of those rows?
I guess this depends on what you want to optimize. At least in previous
versions, you could keep down the transaction-log size in simple recovery
mode by specifying something like 1000. I have to confess that I'm
uncertain whether this still applies.
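For reference, this is roughly what such a load would look like from the
command line; the server, database, table and file names here are just
placeholders, so adjust them to your own environment:

    bcp MyDB.dbo.MyTable in C:\load\mytable.bcp -n -T -S MYSERVER ^
        -h "TABLOCK, ROWS_PER_BATCH = 1000"

That is, both hints go into the -h argument, and ROWS_PER_BATCH takes the
number directly.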
Anyway, my gut feeling is that as long as you don't set a very low
number, ROWS_PER_BATCH does not have that much influence on load times.
But I should immediately add the disclaimer that I don't load large
files often enough to have reason to benchmark this.
Gert Drapers wrote an article about bulk load in SQL Server Magazine
sometime back, and he did not discuss batch size at all! You can find
his article at
http://www.windowsitpro.com/Article/ArticleID/41681/41681.htm, but
you will need to be a subscriber to access the main body of the article.
--
Erland Sommarskog, SQL Server MVP, esquel@sommarskog.se
Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinfo/productdoc/2000/books.asp