A somewhat recursive post here. There are a number of forums around the internet which Oracle professionals increasingly use to research issues, discover new features and diagnose problems. One such is the relatively new Database Administrators Stack Exchange site. For various reasons I came across this thread, and in particular a piece of advice on logging long-running queries using an autonomous transaction.
Personally, I suggest using autonomous transactions to enable this feature — not on the transaction itself, but as a logging mechanism letting you know what is going on. For example, you could have PROCEDURE LONG_ACTION call PROCEDURE WRITE_LOG_ENTRY (defined as an autonomous transaction) that would write a VARCHAR2 to another table. Autonomous transactions do NOT interfere with your current transaction and so you can see what’s going on via your logging entries regardless of a COMMIT or ROLLBACK in your current transaction. That said, you can’t do that with one massive DML statement; you’d have to use a loop.
Now it seems to me that the caveats I have highlighted above are somewhat underplayed. There are two issues here:
- The original transaction is affected – in fact it is rewritten entirely as a series of single-row updates in a loop (a.k.a. "slow by slow") rather than a single set-based update statement.
- The result of this is that you will get logs, but the runtime can grow by an order of magnitude or more. This is unlikely to be what you want.
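The shape of this rewrite is language-independent. As a sketch only, here is the same transformation expressed against SQLite from Python (a hypothetical stand-in: SQLite has no autonomous transactions, so the log rows here are not commit-independent; the point is purely the structural difference between one set-based UPDATE and a per-row loop with periodic log writes):

```python
import sqlite3

# Hypothetical miniature of the scenario: a primes table updated
# set-based versus row-by-row with a log entry every 500 rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE primes (n INTEGER PRIMARY KEY, is_prime INTEGER)")
conn.execute("CREATE TABLE log_entries (log_entry TEXT)")
conn.executemany("INSERT INTO primes VALUES (?, 0)",
                 [(i,) for i in range(1, 10001)])

def set_based(conn):
    # one statement touches every row; the engine does the iteration
    conn.execute("UPDATE primes SET is_prime = n % 2")
    conn.commit()

def row_by_row(conn):
    # "slow by slow": one UPDATE per row, plus periodic log writes
    rows = conn.execute("SELECT n FROM primes").fetchall()
    for c, (n,) in enumerate(rows, start=1):
        conn.execute("UPDATE primes SET is_prime = ? WHERE n = ?",
                     (n % 2, n))
        if c % 500 == 0:
            conn.execute("INSERT INTO log_entries VALUES (?)",
                         (f"{c} rows processed.",))
    conn.commit()

row_by_row(conn)
print(conn.execute("SELECT COUNT(*) FROM log_entries").fetchone()[0])  # 20
```

Both functions leave the table in the same state; the second merely issues ten thousand statements instead of one, which is where the slowdown measured below comes from.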
So there you are: two pieces of advice on the internet. How to choose between them? I suggest testing them out. The script below tests the scenario above. I have a 'hairy calculation', namely a function to tell whether a given positive integer is prime, and I want to run it through the first ten thousand positive integers, updating a table of primes accordingly. As testing for primes is computationally intensive I expect this to take a measurable amount of time. Let's see, shall we?
/* Script to demonstrate problems with
   http://dba.stackexchange.com/questions/2994/oracle-any-way-to-view-uncommited-changes-to-a-particular-table/3051#3051 */

-- create a table of all numbers from 1 to 10,000
drop table primes purge;

create table primes (real, is_prime) as
  select rownum, 0 from dual connect by level <= 10000;

-- create a function to test 'primeness'
create or replace function is_prime (p in number) return number is
  -- returns 0 if not prime, 1 if prime
  l_limit number;
  i       number := 3;
  retval  number := 1;
begin
  l_limit := sqrt(p);
  if p = 1 then
    return 0;              -- 1 is not prime
  end if;
  if p = 2 then
    return 1;              -- 2 is the only even prime
  end if;
  if mod(p, 2) = 0 then
    return 0;
  end if;
  while (i <= l_limit) loop
    if mod(p, i) = 0 then
      retval := 0;
    end if;
    i := i + 2;            -- no need to test even divisors
  end loop;
  return retval;
end;
/

-- create the proposed logging procedure
drop table log_entries purge;

CREATE TABLE LOG_ENTRIES (
  activity_date date,
  log_entry     varchar2(2000)
);

create or replace PROCEDURE WRITE_LOG_ENTRY ( str VARCHAR2 ) IS
  PRAGMA AUTONOMOUS_TRANSACTION;
BEGIN
  INSERT INTO LOG_ENTRIES VALUES ( SYSDATE, str );
  COMMIT;
END;
/

-- now do the tests
create or replace PROCEDURE LONG_ACTION_SUGGESTED IS
  c NUMBER := 0;  -- must be initialised, or c stays null and no logs are written
BEGIN
  FOR r IN ( SELECT real FROM PRIMES ) LOOP
    c := c + 1;
    UPDATE PRIMES z
       SET IS_PRIME = IS_PRIME(real)
     WHERE z.real = r.real;
    IF MOD(c, 500) = 0 THEN
      WRITE_LOG_ENTRY ( c || ' rows processed.' );
    END IF;
  END LOOP;
  COMMIT;
END;
/

create or replace PROCEDURE LONG_ACTION IS
BEGIN
  UPDATE PRIMES SET IS_PRIME = IS_PRIME(real);
  COMMIT;
END;
/

set timing on

prompt ' run the update in sql without logging'
update primes set is_prime = is_prime(real);
commit;

prompt ' run the suggested logging approach'
exec LONG_ACTION_SUGGESTED

prompt ' run from pl/sql approach'
exec LONG_ACTION
In this example I time how long my update takes when run from SQL, how long when run from PL/SQL, and how long when modified in the manner suggested in the answer. My results are below.
' run the update in sql without logging'

10000 rows updated.

Elapsed: 00:00:01.08

Commit complete.

Elapsed: 00:00:00.00

' run the suggested logging approach'

PL/SQL procedure successfully completed.

Elapsed: 00:00:07.98

' run from pl/sql approach'

PL/SQL procedure successfully completed.

Elapsed: 00:00:01.26
In other words, what took just over a second without the autonomous-transaction logging took 8 seconds with it. The test is easily reconfigurable, though; running with 100,000 rows in the table produced the following results.
Elapsed: 00:00:00.01

' run the update in sql without logging'

100000 rows updated.

Elapsed: 00:00:21.55

Commit complete.

Elapsed: 00:00:00.01

' run the suggested logging approach'

PL/SQL procedure successfully completed.

Elapsed: 00:07:26.05

' run from pl/sql approach'

PL/SQL procedure successfully completed.

Elapsed: 00:00:20.50
That is, we jumped from around 20 seconds to seven and a half minutes, more than a 20-fold slowdown.
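The scale of the slowdown is easy to check from the timings above:

```python
# elapsed times from the 100,000-row run above
plsql_seconds  = 20.50            # plain PL/SQL update:        00:00:20.50
logged_seconds = 7 * 60 + 26.05   # suggested logging approach: 00:07:26.05

print(round(logged_seconds / plsql_seconds, 1), "times slower")
```

Roughly a 22x slowdown, and there is no reason to expect the ratio to improve as the table grows.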
So next time you see some advice on the internet, it might well be worth your while testing it out first, especially if the author provides you with the tools to allow you to do this.