Tuesday, March 20, 2012

A tool...how long will this take...

Is there a tool, in SQL 2K or third party, that can look at an UPDATE, or even a DELETE, I'm about to run and "estimate" how long it will run? I started a massive UPDATE on 37 million rows, on one column, 14 hours ago, and it's still running. I can't run a query to determine how many records have already been updated, because the UPDATE has a lock on the table. I was just wondering whether there was something I could've done beforehand that could've told me how long this UPDATE would take. I've played with the Execution Plan feature, and it does give some useful information, but I'm specifically looking for costs in terms of "time".
Thanks
Roz

|||

Roz,
I would divide the long transaction into smaller ones. Have you checked the transaction log file? Did it grow?
SET ROWCOUNT 1000
WHILE 1 = 1
BEGIN
    UPDATE command -- your UPDATE statement goes here; its WHERE clause must
                   -- exclude rows already updated, or the loop never finishes
    IF @@ROWCOUNT = 0
    BEGIN
        BREAK
    END
    ELSE
    BEGIN
        CHECKPOINT
    END
END
SET ROWCOUNT 0
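One caveat with that loop: CHECKPOINT only lets SQL Server reuse the log space if the database is in simple recovery mode; under full recovery the log will keep growing until it is backed up. A quick way to check the mode in SQL 2000 (MyDb is a stand-in for the real database name):

SELECT DATABASEPROPERTYEX('MyDb', 'Recovery') -- returns SIMPLE, FULL, or BULK_LOGGED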
"Roz" <anonymous@.discussions.microsoft.com> wrote in message
news:2225101c45d0e$345d1fa0$a301280a@.phx
.gbl...
> Is there a tool, in SQL 2K or third party, that can look
> at an UPDATE, or even DELETE, I'm about to run,
> and "estimate" how long it'll run. I started a massive
> UPDATE on 37 million rows on one column. I stared the
> UPDATE 14 hours ago, and it's still running. I can't run
> a query to determine how many records have already been
> updated, because the update has a lock on the table.
> I was just wondering was there something I could've done
> beforehand, that could've told me how long this UPDATE
> would take. I've played with the Execution Plan feature,
> and it does give some useful information, but I'm
> specifically looking for costs in terms of "time".
> Thanks
> Roz
>|||Uri,
Yep, that's exactly what I did. I broke the UPDATE into 5000 records at a time. The T-log is small, as it should be, since I'm checkpointing quite frequently. But the UPDATE is still running. I guess it just takes this long...
Roz

|||

That will give you 7400 separate transactions. It could still take a while, but you will not cause your transaction log to grow uncontrollably. Using smaller chunks gives you more options. You could insert into (or update) a "logging" table after each pass through the WHILE loop. That would give you the ability to know how many rows you have done and how many more are left to process.
Keith
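A minimal sketch of that logging-table idea (untested; dbo.UpdateProgress, dbo.BigTable, and SomeColumn are made-up names standing in for the real schema):

-- One row is logged per pass through the loop.
CREATE TABLE dbo.UpdateProgress (
    LoggedAt   DATETIME NOT NULL DEFAULT GETDATE(),
    RowsInPass INT      NOT NULL
)

SET ROWCOUNT 5000
DECLARE @rows INT
SET @rows = 1

WHILE @rows > 0
BEGIN
    UPDATE dbo.BigTable             -- stand-in for the real UPDATE
    SET SomeColumn = 'new value'
    WHERE SomeColumn <> 'new value' -- skip rows that are already done

    SET @rows = @@ROWCOUNT          -- capture it before anything resets it

    IF @rows > 0
    BEGIN
        INSERT INTO dbo.UpdateProgress (RowsInPass) VALUES (@rows)
        CHECKPOINT
    END
END
SET ROWCOUNT 0

Each INSERT commits on its own, so a second connection can watch progress even while the big table stays locked:

SELECT SUM(RowsInPass) AS RowsDone, MAX(LoggedAt) AS LastPass
FROM dbo.UpdateProgress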
"Roz" <anonymous@.discussions.microsoft.com> wrote in message
news:2243501c45d11$05c5e8e0$a101280a@.phx
.gbl...[vbcol=seagreen]
> Uri,
> Yep, that's exactly what I did. I broke the Update into
> 5000 records at a time. The Tlog is small, as it should
> be since I'm Checkpointing quite frequently. But the
> Update is still running. I guess it just takes this
> long...
> Roz
>
> checked transaction
> message
> run
> feature,|||Roz
A good idea would be to display a running total of how many rows you have up
dated everytime you hit your 5000 transaction count. That way at least you w
ould have an idea how long it will take that way. (Too late now I know)
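Something along those lines, reusing the same batching loop (again with made-up names); RAISERROR ... WITH NOWAIT is used because it pushes each message to the client immediately, whereas PRINT output can sit in the buffer until the batch ends:

SET ROWCOUNT 5000
DECLARE @batch INT, @total INT
SET @batch = 1
SET @total = 0

WHILE @batch > 0
BEGIN
    UPDATE dbo.BigTable             -- stand-in for the real UPDATE
    SET SomeColumn = 'new value'
    WHERE SomeColumn <> 'new value'

    SET @batch = @@ROWCOUNT
    SET @total = @total + @batch

    -- severity 0 is informational; %d is filled in from @total
    RAISERROR ('Rows updated so far: %d', 0, 1, @total) WITH NOWAIT

    IF @batch > 0
        CHECKPOINT
END
SET ROWCOUNT 0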
Regards
John

|||

Beautiful! These are excellent ideas to try. I'll keep them in mind next time I need to do such a massive update.
Thanks very, very much to all.
Roz

|||

By the way, limiting the ROWCOUNT to 5000 updates seems a little light. I would probably try 50,000 or even 100,000. Heck, you could set it to 10 if you wanted to... it is probably a balancing act of time vs. resource usage.
One more idea: you could perform a BACKUP LOG <database> WITH NO_LOG inside the WHILE loop to clear the transaction log.
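For reference, a minimal version of that statement (MyDb stands in for the real database name). WITH NO_LOG discards the log records rather than saving them, so it breaks the log-backup chain until the next full backup:

-- Inside the WHILE loop, in place of (or in addition to) CHECKPOINT:
BACKUP LOG MyDb WITH NO_LOG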
Keith
"Roz" <anonymous@.discussions.microsoft.com> wrote in message
news:226eb01c45d23$f44297e0$a501280a@.phx
.gbl...[vbcol=seagreen]
> Beautiful. Very excellent ideas to try. I'll keep these
> in mind next time I need to do such a massive update.
> Thanks very, very much to all.
> Roz
>
> still take a while,
> uncontrollably. Using
> into (or update) a
> That would give you
> more rows are left
> message
> look
> massive
> the
> been
> done
> UPDATE
