Is it efficient to use MARS to update 7000 rows?
I have a service that updates/inserts 7000+ rows using MARS. Is there a better way to do this?
If you are doing this regularly, then for this volume I would use SqlBulkCopy to push the data into a staging table (same schema as the target table, but isolated), and then use a stored procedure to do the inserts/updates (from the staging table into the actual table), probably wrapped in a transaction.
This will minimize the round-trips, and use the bulk insert API for pushing the data.
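As a rough sketch of that approach (the table names `StagingOrders`/`Orders`, the procedure name `usp_MergeOrders`, and the connection string are all hypothetical stand-ins, and this assumes a live SQL Server):

```csharp
using System.Data;
using System.Data.SqlClient;

public static class BulkUpserter
{
    // Assumes a staging table dbo.StagingOrders with the same schema as dbo.Orders,
    // and a stored procedure dbo.usp_MergeOrders that upserts staging rows into
    // Orders inside a transaction. All names here are illustrative.
    public static void BulkUpsert(DataTable rows, string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // 1. Push all rows into the staging table via the bulk-insert API.
            using (var bulk = new SqlBulkCopy(conn))
            {
                bulk.DestinationTableName = "dbo.StagingOrders";
                bulk.WriteToServer(rows);
            }

            // 2. One more round-trip to merge staging rows into the real table.
            using (var cmd = new SqlCommand("dbo.usp_MergeOrders", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.ExecuteNonQuery();
            }
        }
    }
}
```

Inside the stored procedure, a T-SQL `MERGE` (or an `UPDATE ... JOIN` followed by an `INSERT ... WHERE NOT EXISTS`) handles the upsert, typically finishing by clearing the staging table.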
If you don't need everything in one batch, another option is to process the data in blocks of (say) 100 records. This should work without causing the transactions to take too long; it will be noticeably slower than the bulk-copy approach, but it has the advantage of being object-based and not requiring an extra set of tools/languages.
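A minimal sketch of the block-based alternative (again with hypothetical names; `Order` and `UpsertRow` stand in for whatever per-object insert/update logic the service already has):

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Linq;

public static class BlockUpserter
{
    // Process items in blocks of 100, each block in its own short transaction.
    // The connection string, Order type, and UpsertRow helper are placeholders.
    public static void UpsertInBlocks(IReadOnlyList<Order> items, string connectionString)
    {
        const int blockSize = 100;

        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            for (int offset = 0; offset < items.Count; offset += blockSize)
            {
                using (var tran = conn.BeginTransaction())
                {
                    foreach (var item in items.Skip(offset).Take(blockSize))
                    {
                        UpsertRow(conn, tran, item); // existing per-row insert/update
                    }
                    tran.Commit(); // commit per block to keep each transaction short
                }
            }
        }
    }
}
```

Keeping each transaction to one block bounds lock duration and log growth, which is the point of chunking in the first place.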