Setting up a Classic Capture Extract process
A GoldenGate Classic Capture Extract process runs on the source system. It can be configured both for the initial load of the source data and for continuous replication. The process reads the redo logs of the source database and looks for changes to the tables defined in its configuration file. These changes are written to a buffer in memory. When the Extract reads a commit record in the redo logs, the changes for that transaction are flushed to the trail files on disk. If it encounters a rollback for a transaction in the redo log, it discards that transaction's changes from memory. This type of Extract process is available on all platforms that GoldenGate supports; however, it cannot capture changes for compressed objects. In this recipe you will learn how to set up a Classic Capture Extract process in a GoldenGate instance.
Getting ready
Before adding the Classic Capture Extract process, ensure that you have completed the following steps in the source database environment:
Enabled minimum supplemental logging at the database level.
Enabled supplemental logging for the tables to be replicated (a sketch of the commands involved follows this list).
Set up a manager instance.
Created a directory for the source trail files.
Decided on a two-character prefix for naming the source trail files.
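The exact commands depend on your environment, but the supplemental logging prerequisites are typically met with something like the following sketch; the GoldenGate administrator user, database name, and schema shown here are only assumed examples:
SQL> ALTER DATABASE ADD SUPPLEMENTAL LOG DATA;
GGSCI> DBLOGIN USERID GGATE_ADMIN@DBORATEST, PASSWORD ******
GGSCI> ADD TRANDATA scott.*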
How to do it…
The following are the steps to configure a Classic Capture Extract process in the source database:
From the GoldenGate Home directory, run the GoldenGate software command line interface (GGSCI) as follows:
./ggsci
Edit the Extract process configuration as follows:
EDIT PARAMS EGGTEST1
This command will open an editor window. You need to add the extract configuration parameters in this window as follows:
EXTRACT <EXTRACT_NAME>
USERID <SOURCE_GG_USER>@SOURCEDB, PASSWORD ******
EXTTRAIL <specification>
TABLE <replicated_table_specification>;
For example:
EXTRACT EGGTEST1
USERID GGATE_ADMIN@DBORATEST, PASSWORD ******
EXTTRAIL /u01/app/ggate/dirdat/st
TABLE scott.*;
Save the file and exit the editor window.
Add the Classic Capture Extract to the GoldenGate instance as follows (an alternative BEGIN specification is sketched after these steps):
ADD EXTRACT <EXTRACT_NAME>, TRANLOG, <BEGIN_SPEC>
For example:
ADD EXTRACT EGGTEST1, TRANLOG, BEGIN NOW
Add the local trail to the Classic Capture configuration as follows:
ADD EXTTRAIL /u01/app/ggate/dirdat/st, EXTRACT EGGTEST1
Start the Classic Capture Extract process as follows:
GGSCI> START EXTRACT EGGTEST1
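The BEGIN NOW specification starts capturing from the current point in time. If you need the Extract to start mining the logs from an earlier point, the BEGIN clause also accepts a timestamp; the date and time below are purely illustrative:
ADD EXTRACT EGGTEST1, TRANLOG, BEGIN 2013-04-08 10:30:00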
How it works…
In the preceding steps we configured a Classic Capture Extract process to replicate all tables owned by the SCOTT user. For this, we first created an Extract process parameter file and added the configuration parameters to it. Once the parameter file was created, we added the Extract process to the source Manager instance. This is done using the ADD EXTRACT command in step 5. In step 6, we associated a local trail file with the Extract process and then started it. When you start the Extract process you will see the following output:
GGSCI (prim1-ol6-112.localdomain) 11> start extract EGGTEST1
Sending START request to MANAGER ...
EXTRACT EGGTEST1 starting
You can check the status of the Extract process using the following command:
GGSCI (prim1-ol6-112.localdomain) 10> status extract EGGTEST1
EXTRACT EGGTEST1: STARTED
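If you need more detail than a simple status, for example the Extract's checkpoint positions, lag, and the trail file it is currently writing to, you can use the INFO command; the exact output depends on your environment:
INFO EXTRACT EGGTEST1, DETAIL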
There's more…
There are a few additional parameters that can be specified in the Extract configuration, as follows (a combined sketch follows this list):
EOFDELAY <secs>: This parameter controls how often GoldenGate checks the source database redo logs for new data.
MEGABYTES <n>: This parameter controls the maximum size of each extract trail file; it is specified with the ADD EXTTRAIL command rather than in the parameter file.
DYNAMICRESOLUTION: Use this parameter to make the Extract build the metadata for each table only when it first encounters changes for that table.
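As a minimal sketch, an Extract parameter file using EOFDELAY and DYNAMICRESOLUTION might look like the following; the delay value is an assumption, and the trail path and schema are carried over from the earlier example:
EXTRACT EGGTEST1
USERID GGATE_ADMIN@DBORATEST, PASSWORD ******
EOFDELAY 5
DYNAMICRESOLUTION
EXTTRAIL /u01/app/ggate/dirdat/st
TABLE scott.*;
The trail file size would then be set when the trail is added, for example:
ADD EXTTRAIL /u01/app/ggate/dirdat/st, EXTRACT EGGTEST1, MEGABYTES 500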
If your source database is a very busy OLTP production system and you cannot afford to add the extra load of the GoldenGate processes to it, you can offload GoldenGate processing to another server by adding some extra configuration. You will need to configure the source database to ship its redo logs to a standby site and set up a GoldenGate Manager instance on that server. The Extract processes are then configured to read from the archived logs on the standby system. For this you specify an additional parameter as follows:
TRANLOGOPTIONS ARCHIVEDLOGONLY ALTARCHIVEDLOGDEST <path>
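A minimal sketch of an Extract parameter file running in this Archived Log Only (ALO) mode on the standby server follows; the connection details and the archive log destination path are assumed examples you would adjust for your environment:
EXTRACT EGGTEST1
USERID GGATE_ADMIN@DBORATEST, PASSWORD ******
TRANLOGOPTIONS ARCHIVEDLOGONLY ALTARCHIVEDLOGDEST /u01/app/oracle/standby_arch
EXTTRAIL /u01/app/ggate/dirdat/st
TABLE scott.*;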
Tip
If you are using Classic Capture in ALO mode for a source database that uses ASM, you must store the archive log files on the standby server outside ASM so that the Classic Capture Extract can read them.
See also
The recipe Configuring an Extract process to read from an Oracle ASM instance
The recipe Setting up a GoldenGate replication with multiple process groups in Chapter 2, Setting up GoldenGate Replication