Golden Gate catch #1: replication performance when the replicated databases are partial


Recently, in a quick Golden Gate test setup, the export/import of data for replication was intentionally partial, just for prototyping.

In Golden Gate replication, changes are extracted from the online redo/archive logs and formed into full SQL statements.

Now, when the export/import of the data is partial and you run simulation tests with a real application load, you can end up in a situation where the application deletes rows from tables that were not included in the export/import (and therefore do not exist in the prototype database).

Golden Gate happily replays every delete statement the application has issued in order to keep all tables synchronized, even for tables that do not exist on the target. Because of this, performance degrades during the catch-up phase on an active system.


With partial export/imports, Golden Gate can therefore be configured to skip delete operations for certain tables, or to leave certain tables out of the replication entirely.


For this situation, there's a Golden Gate configuration parameter:


GETDELETES | IGNOREDELETES
Valid for Extract and Replicat
Use the GETDELETES and IGNOREDELETES parameters to control whether or not Oracle
GoldenGate processes delete operations. These parameters are table-specific. One
parameter remains in effect for all subsequent TABLE or MAP statements, until the other
parameter is encountered.
Default GETDELETES
Syntax GETDELETES | IGNOREDELETES
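As a sketch of how the toggle behaves in practice, an IGNOREDELETES/GETDELETES pair in a Replicat parameter file could look like this (the process name, user, and schema names here are hypothetical, not from the setup above):

```
REPLICAT rep1
USERID oragg, PASSWORD oragg
-- Deletes are ignored for the FIN schema mappings below...
IGNOREDELETES
MAP fin.*, TARGET fin.*;
-- ...and processed again for every MAP statement after this point.
GETDELETES
MAP hr.*, TARGET hr.*;
```

The parameter stays in effect for all subsequent MAP statements until the opposite one appears, which is exactly the table-specific behavior the documentation excerpt describes.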


You can also skip tables:

TABLEEXCLUDE
Valid for Extract
Use the TABLEEXCLUDE parameter with the TABLE and SEQUENCE parameters to explicitly
exclude tables and sequences from a wildcard specification. TABLEEXCLUDE must precede all
TABLE and SEQUENCE statements that contain the objects that you want to exclude.
Default None
Syntax TABLEEXCLUDE <exclude specification> [NORENAME]
Example In the following example, the TABLE statement retrieves all tables except for the table
named TEST.
TABLEEXCLUDE fin.TEST
TABLE fin.*;

Golden Gate monitoring framework

I was discussing with a few fellow DBAs how to monitor Golden Gate replication, as a starting point for widely deployed systems.

Sure, there's a chapter in the GG Administration Guide, and Oracle folks always refer to the GGSCI command-line tool, which you can use.

Yeah, you can...

But what about a large number of databases? There are plenty of other things to do, and time could be saved if you had a utility kit of monitoring scripts.

So why has nobody provided essential scripts for monitoring Golden Gate with the product itself?

  • Grid Control management pack support for Golden Gate is still a work in progress.
  • There's GoldenGate Director... but it's an extra application and needs a separate license.

There are some scripts from DBAs around the web:

  •  http://oracleabout.blogspot.com/2010/08/golden-gate-lag-monitoring.html

My idea:

One possible direction is to hook Golden Gate checks into an existing monitoring framework:
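As a starting point, here's a minimal sketch of such a check in Python. Everything in it is an assumption rather than a documented interface: the GG_HOME path is made up, the Nagios-style exit codes are a convention choice, and the "info all" column layout (Program, Status, Group, ...) is recalled from typical GGSCI sessions, so verify it against your version before relying on it.

```python
"""Sketch: flag Golden Gate processes that are not RUNNING.

Feeds "info all" to GGSCI and parses its status table, which is
assumed to contain lines such as:

    MANAGER     RUNNING
    EXTRACT     RUNNING     EXT1        00:00:00      00:00:05
"""
import os
import subprocess


def failed_processes(info_all_output):
    """Return (program, group, status) tuples for non-RUNNING processes."""
    bad = []
    for line in info_all_output.splitlines():
        fields = line.split()
        if fields and fields[0] in ("MANAGER", "EXTRACT", "REPLICAT"):
            status = fields[1]
            group = fields[2] if len(fields) > 2 else ""
            if status != "RUNNING":
                bad.append((fields[0], group, status))
    return bad


def main():
    # GG_HOME default is a placeholder; point it at your installation.
    gg_home = os.environ.get("GG_HOME", "/u01/app/goldengate")
    out = subprocess.run([os.path.join(gg_home, "ggsci")],
                         input="info all\n",
                         capture_output=True, text=True).stdout
    bad = failed_processes(out)
    if bad:
        print("CRITICAL: " + ", ".join("%s %s is %s" % b for b in bad))
        return 2  # Nagios-style CRITICAL exit code
    print("OK: all Golden Gate processes RUNNING")
    return 0

# Wire into cron or a monitoring agent, e.g.:
#   GG_HOME=/u01/app/goldengate python gg_check.py  (call sys.exit(main()))
```

The parsing is kept in its own function so the same logic can be reused across many databases, or dropped into whatever framework already polls your hosts.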

Not a bad idea?

Golden Gate DB user creation without the DBA role

Here are the minimum privileges for a Golden Gate user in the database.

SQL> create user oragg identified by oragee;
SQL> grant connect, resource to oragg;
SQL> grant select any dictionary, select any table to oragg;
SQL> grant create table to oragg;
SQL> grant flashback any table to oragg;
SQL> grant execute on dbms_flashback to oragg;
SQL> grant execute on utl_file to oragg;
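As a quick sanity check, you can then verify from the data dictionary what the new user actually received (a sketch, run as a privileged user; the standard DBA_* views are queried here):

```
SQL> select granted_role from dba_role_privs where grantee = 'ORAGG';
SQL> select privilege from dba_sys_privs where grantee = 'ORAGG';
SQL> select table_name, privilege from dba_tab_privs where grantee = 'ORAGG';
```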