SQL: updating a large number of rows


21-Oct-2017 23:35

Next, I thought I should test this on a slightly larger database, so I made another database and created a new, larger copy of the data. Due to disk space limitations, I had to move off of my laptop's VM for this test (and chose a 40-core box, with 128 GB of RAM, that just happened to be sitting around quasi-idle :-)), and even then it was not a quick process by any means.

Note that I did not try any of these tests with compression enabled (possibly a future test!), and I left the log autogrow settings at the terrible defaults (10%), partly out of laziness and partly because many environments out there have retained this awful setting.

So one test would be to perform a one-shot delete of all the target rows in a single statement. I know this is going to require a massive scan and take a huge toll on the transaction log. :-) While that was running, I put together a different script that performs the same delete in chunks: 25,000, 50,000, 75,000 and 100,000 rows at a time. Both approaches are sketched below.
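The original scripts are not reproduced in this excerpt, so the sketches that follow are reconstructions under assumptions: the table name (dbo.SalesOrderDetailCopy) and the WHERE filter (a ModifiedDate cutoff) are placeholders, not the actual schema. The one-shot version is, in essence, a single DELETE running in one large transaction:

    -- One-shot delete: everything in a single statement, one giant transaction.
    -- Table name and filter are placeholders for the real script.
    DELETE FROM dbo.SalesOrderDetailCopy
    WHERE ModifiedDate < '20120101';

The chunked version repeats a fixed-size batch until there is nothing left to delete, which is what lets the log be truncated (or backed up) between batches instead of growing for the life of one huge transaction. A minimal sketch, with the same placeholder table and filter:

    -- Chunked delete: fixed-size batches until no rows remain.
    -- @BatchSize would be set to 25000, 50000, 75000 and 100000 in separate runs.
    DECLARE @BatchSize    int = 25000,
            @RowsAffected int = 1;

    WHILE @RowsAffected > 0
    BEGIN
        DELETE TOP (@BatchSize) FROM dbo.SalesOrderDetailCopy
        WHERE ModifiedDate < '20120101';

        -- Capture the count immediately; any other statement would reset @@ROWCOUNT.
        SET @RowsAffected = @@ROWCOUNT;

        -- Optionally CHECKPOINT or BACKUP LOG here, depending on the recovery model,
        -- so log space can be reused between batches.
    END

Each batch runs as its own transaction, so the trade-off is more round trips and a longer total runtime in exchange for a much smaller peak transaction log.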

You can use this kind of scale to determine whether it is more important to reduce the impact on disk space or to minimize the amount of time spent.