Small shortcut in a large report
Posted: November 22, 2007
Came back from Oracle OpenWorld to a flurry of business demands. End of quarter is here, and with it come new demands for reports and data.
The main report that I’m writing is rather large: generating the data requires about 30 different queries on about 10 databases. Some run in a few seconds, while others, despite my best tuning efforts, can take an hour or more.
So, I wrote a PL/SQL procedure that runs all the queries on all the DBs and collects the results into a single table that I’ll use in the report. That’s pretty cool, but suppose there is an error in query number 23? After I fix the error, I have to run the whole report again, which takes hours. Quite annoying.
So, I created a small table:
CREATE TABLE PROGRESS
( "STAT_NAME"   VARCHAR2(30),
  "SCHEMA_NAME" VARCHAR2(30),
  "FINISH_TIME" DATE
);
And now, before each query runs, I check the progress table to see whether it already ran. If it did, I don’t need to run it again. If it didn’t, I run it, and once it finishes successfully I record it in the table.
Here’s the procedure that takes care of this process:
create or replace
PROCEDURE CONDITIONAL_EXEC ( in_stat_name   IN VARCHAR2,
                             in_schema_name IN VARCHAR2,
                             sql_stmt       IN VARCHAR2 ) AS
  stat_status NUMBER;
BEGIN
  select count(*) into stat_status
    from progress
   where schema_name = in_schema_name
     and stat_name   = in_stat_name;
  if stat_status = 0 then
    dbms_output.put_line ('About to run: ' || sql_stmt);
    execute immediate sql_stmt;
    insert into progress values (in_stat_name, in_schema_name, sysdate);
  end if;
END CONDITIONAL_EXEC;
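With that in place, each step of the main report procedure is just a call to CONDITIONAL_EXEC. Here’s a sketch of what one such call might look like — the statistic name, schema, and the INSERT statement are made-up placeholders, not my actual report queries:

```sql
-- Hypothetical example: names and the query text are placeholders.
BEGIN
  conditional_exec(
    in_stat_name   => 'ORDERS_PER_DAY',
    in_schema_name => 'SALES',
    sql_stmt       => 'insert into report_results
                         select ''SALES'', trunc(order_date), count(*)
                           from sales.orders
                          group by trunc(order_date)'
  );
END;
/
```

If the statement fails, the exception propagates out before the INSERT into PROGRESS, so the failed step is not marked done and will run again on the next pass.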
Advantages of this approach:
- I save time by not running the same stuff twice
- I can always “undo” something that already ran by deleting from that table
- I can watch the progress of my report in the table
- The table contains the finish time of each query. This means I have rough information on how long each query took to run and I can use this information to decide which queries need more work.
- I can also use the time information to show the business how much DB time each statistic they request costs them. Maybe they’ll decide that some statistics are not that important? Especially those that are carefully crafted so we can’t use any existing aggregated data to produce them. Unlikely, but who knows?
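For the record, the “undo” and the monitoring are both one-liners against the progress table. These are sketches, assuming the finish-time column is called FINISH_TIME and using a placeholder statistic name:

```sql
-- "Undo": forget that a query ran, so the next pass re-runs it.
DELETE FROM progress
 WHERE stat_name = 'ORDERS_PER_DAY' AND schema_name = 'SALES';

-- Watch progress, with a rough per-query duration: since only finish
-- times are stored, the gap between consecutive finish times
-- approximates how long each query ran.
SELECT stat_name, schema_name, finish_time,
       (finish_time - LAG(finish_time) OVER (ORDER BY finish_time))
         * 24 * 60 AS approx_minutes
  FROM progress
 ORDER BY finish_time;
```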
Not bad for 5 minutes of work.