I have written about this before, and at OpenWorld last year I talked to some of the presenters about it.
My goal is to create a performance data warehouse. What does this mean?
I want to take all the AWR data from all the databases in the corporate infrastructure, and create one giant repository.
Well I just got one step closer.
I have a repository server set up already. It is just an 11.2.0.2 database with lots of space (about 3 TB), running on a Linux blade server.
My first test was to take all the AWR data from a 10g database we had just retired, which ran on HP-UX. Since this database was migrated to Linux (a cross-endian move) using transportable tablespaces, the AWR data was not brought over.
First step: get the data out of the original source.
To do that you can use awrextr.sql in the $ORACLE_HOME/rdbms/admin directory. I noticed (and confirmed from other blogs) that this script just calls sys.dbms_swrf_internal.awr_extract, so you can call the procedure directly.
STEPS on SOURCE.
- Create user "awrextract" identified by xxxxx
- grant connect to awrextract
- grant execute on sys.dbms_swrf_internal to awrextract
- Login as awrextract
- execute the awr_extract procedure
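Put together, the source-side steps look roughly like this. This is a sketch: the password, directory path, and snapshot IDs are placeholders you would substitute for your own.

```sql
-- As a DBA on the SOURCE database: set up the extract user
CREATE USER awrextract IDENTIFIED BY xxxxx;
GRANT CONNECT TO awrextract;
GRANT EXECUTE ON sys.dbms_swrf_internal TO awrextract;

-- Directory object on a cooked (non-ASM) filesystem; path is a placeholder
CREATE OR REPLACE DIRECTORY dmpdir AS '/u01/awr_export';
GRANT READ, WRITE ON DIRECTORY dmpdir TO awrextract;

-- Then, logged in as awrextract: extract a snapshot range
-- (the procedure appends .dmp to the file name itself)
BEGIN
  sys.dbms_swrf_internal.awr_extract(
    dmpfile => 'awrdata',   -- becomes awrdata.dmp in DMPDIR
    dmpdir  => 'DMPDIR',
    bid     => 1000,        -- begin snapshot id (placeholder)
    eid     => 1200);       -- end snapshot id (placeholder)
END;
/
```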
Notes
- The procedure automatically appends .dmp to the dump file name
- You need to know the start and end snapshot you want to export
- The dmpdir needs to be a pre-created directory object, and it has to be on a cooked (non-ASM) filesystem. I found that the procedure also writes its logfile there, which is why it has to be cooked.
- You cannot use dbms_file_transfer, especially going across endian types
STEPS on the Destination
- Create schema to import the data into
- Import the file in
- Once imported, move the AWR data into the SYS catalogs:
DBMS_SWRF_INTERNAL.MOVE_TO_AWR ( SCHNAME => 'AWR');
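The destination-side flow as a sketch. The schema name, password, tablespace, directory path, and dump file name are all placeholders, and the impdp command assumes the dump file has already been copied to the repository server.

```sql
-- As a DBA on the DESTINATION (repository) database:
-- create a staging schema to import the dump into
CREATE USER awr IDENTIFIED BY xxxxx
  DEFAULT TABLESPACE awr_stage QUOTA UNLIMITED ON awr_stage;
GRANT CONNECT, RESOURCE TO awr;

-- Directory object pointing at where the dump file was copied
CREATE OR REPLACE DIRECTORY dmpdir AS '/u01/awr_import';
GRANT READ, WRITE ON DIRECTORY dmpdir TO awr;

-- From the OS shell, import the extract into the staging schema:
--   impdp awr/xxxxx directory=DMPDIR dumpfile=awrdata.dmp

-- Back in SQL*Plus as SYS: move the staged data into the AWR catalogs
BEGIN
  sys.dbms_swrf_internal.move_to_awr(schname => 'AWR');
END;
/
```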
FINALLY
All this worked flawlessly, and then I tried to run COE_PROFILE.
If you didn't know, COE_PROFILE.sql is part of SQLT and allows you to transfer SQL profiles from one database to another by creating a script that rebuilds the profile. It all works from my repository.
I will write about this again, but I am psyched to finally get this off the ground.