Best Practices for Implementing CDC
Performance Considerations with High-Volume CDC Streams
Each CDC session uses an API call that blocks until records are received from the server; these records are then processed sequentially in a single thread. If a single thread cannot process records as fast as they arrive, consider splitting the monitored tables into separate CDC sessions, each running on its own thread in the client program. This distributes the workload across threads for faster concurrent processing.
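The following minimal sketch illustrates the pattern: each group of monitored tables gets its own CDC session, processed concurrently (here as Go goroutines standing in for client threads). The CDC client types and calls (cdcSession, openSession, NextRecords, process) and the table names are hypothetical placeholders, not an actual API.

// Minimal sketch: one CDC session per table group, each processed concurrently.
// cdcSession, openSession, NextRecords, and process are hypothetical stand-ins
// for your CDC client API and application logic.
package main

import "sync"

// cdcSession stands in for the real CDC client session type (hypothetical).
type cdcSession struct{ tables []string }

func openSession(tables []string) *cdcSession { return &cdcSession{tables: tables} }

// NextRecords stands in for the blocking call that waits for server records.
func (s *cdcSession) NextRecords() []string { return nil }

func process(record string) { /* application-specific work */ }

func runSession(tables []string) {
	session := openSession(tables) // one session per table group
	for {
		// Blocks until the server delivers a batch of change records,
		// then processes that batch sequentially within this worker.
		for _, record := range session.NextRecords() {
			process(record)
		}
	}
}

func main() {
	// Hypothetical table groups; each group gets its own session and worker.
	groups := [][]string{
		{"orders", "order_lines"},
		{"customers", "addresses"},
	}
	var wg sync.WaitGroup
	for _, group := range groups {
		wg.Add(1)
		go func(tables []string) {
			defer wg.Done()
			runSession(tables)
		}(group)
	}
	wg.Wait()
}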
Journal File Cleanup and CDC Sessions
When CDC sessions fall behind or read historical data using a journal address, they may require access to older journal files. Database administrators must therefore coordinate journal file cleanup with CDC sessions.
To prevent the deletion of journal files still needed by CDC sessions, administrators should determine the lowest journal file read position among active CDC sessions. This can be done using the following SQL query:
echo "select min(lgs_jfa_fileseq) from ima_lgstream_sessions\\g" | sql imadb
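As an illustration of how a cleanup check might use this value, the sketch below pipes the same query into the sql terminal monitor and extracts the result. It assumes sql is on the PATH and can connect to imadb, and its output parsing is deliberately simplified; journal files with a sequence number at or above the returned value are still needed by at least one CDC session.

// Minimal sketch: query the lowest journal file sequence still in use by
// CDC sessions. Assumes the Ingres "sql" terminal monitor is on PATH and
// can reach imadb; output parsing is simplified for illustration only.
package main

import (
	"fmt"
	"os/exec"
	"strconv"
	"strings"
)

func lowestCdcJournalFileseq() (int, error) {
	cmd := exec.Command("sql", "imadb")
	cmd.Stdin = strings.NewReader(`select min(lgs_jfa_fileseq) from ima_lgstream_sessions\g`)
	out, err := cmd.Output()
	if err != nil {
		return 0, err
	}
	// The terminal monitor prints a formatted result set; take the first
	// integer token as the file sequence (simplified parsing).
	for _, token := range strings.Fields(string(out)) {
		if n, err := strconv.Atoi(token); err == nil {
			return n, nil
		}
	}
	return 0, fmt.Errorf("no file sequence found in output")
}

func main() {
	fileseq, err := lowestCdcJournalFileseq()
	if err != nil {
		panic(err)
	}
	fmt.Printf("Journal files with sequence >= %d are still needed by CDC sessions\n", fileseq)
}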
Log File Size Constraint
A sufficiently large transaction log file is required to avoid frequent switching between reading the transaction log and reading the journal files.
BLOB Memory Usage
When log records related to BLOB data are captured, the BLOB data is held in memory. Avoid capturing very large BLOBs, as this increases memory usage.
Last modified date: 01/27/2026