Batch Statement Execution
The IngresDataAdapter class supports batch processing through the UpdateBatchSize property. This property directs the data adapter to gather several INSERT, UPDATE, and DELETE statements and their parameter sets from a DataSet or DataTable into a single block and send it to the server instead of processing one operation at a time.
The IngresDataAdapter is the Ingres/Vector implementation of the .NET DbDataAdapter class and can be used to manage the flow of Ingres data between the .NET application and the Ingres data source. The data adapter contains the SELECT command used to fill a DataTable with DataRows, and also contains the INSERT, UPDATE, and DELETE commands used to update the database. As the application makes changes in the DataTable, each DataRow records its new values and its changed RowState, but the changes are not applied to the database until the application calls the data adapter's Update() method. When Update() is called, the data adapter sweeps the DataRows of the DataTable; for each DataRow with a RowState of Added, Modified, or Deleted, the data adapter invokes the corresponding INSERT, UPDATE, or DELETE command.
The Microsoft .NET Framework allows a .NET application to specify that the updates be gathered into a batch by the DbDataAdapter for processing by the database. You can use the UpdateBatchSize property of the .NET DbDataAdapter to control the batch size. An UpdateBatchSize value of 1 (the default) disables batch processing; each update is sent to the database server in its own call. A value greater than 1 specifies the maximum number of update commands grouped into each batch sent to the server. A value of 0 leaves the batch size to the discretion of the data provider.
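For illustration, the following sketch shows the basic fill, modify, and Update() cycle with batching enabled. It is a minimal example only: the Ingres.Client namespace, the connection string, the employee table (keyed on empno), and the use of an IngresCommandBuilder to derive the INSERT, UPDATE, and DELETE commands from the SELECT are all assumptions made here; explicitly constructed IngresCommand objects can be assigned to the adapter instead.

    using System;
    using System.Data;
    using Ingres.Client;                  // assumed namespace of the Ingres .NET Data Provider

    class BatchUpdateSample
    {
        static void Main()
        {
            // Illustrative connection string, table, and column names.
            string connString = "Host=myserver;Database=mydb;User ID=myuser;Password=mypass";

            using (var conn = new IngresConnection(connString))
            {
                // The SELECT command defines the rows and columns the adapter manages;
                // empno is assumed to be the primary key of the table.
                var adapter = new IngresDataAdapter(
                    "SELECT empno, ename, salary FROM employee", conn);

                // Assumed helper: derives INSERT/UPDATE/DELETE commands from the SELECT.
                var builder = new IngresCommandBuilder(adapter);

                // Group up to 500 update commands into each call to the server.
                // 1 (the default) disables batching; 0 lets the provider choose.
                adapter.UpdateBatchSize = 500;

                var table = new DataTable("employee");
                adapter.Fill(table);                     // rows arrive with RowState = Unchanged

                foreach (DataRow row in table.Rows)      // each edit marks the row Modified
                    row["salary"] = Convert.ToDecimal(row["salary"]) * 1.05m;

                adapter.Update(table);                   // Modified rows are sent in batches
            }
        }
    }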
The value you choose for UpdateBatchSize depends on the application and its environment. The slower the channel between the client and server machines, the greater the relative performance benefit of batching, because fewer network round trips mean less elapsed time. While a modest batch size can dramatically improve the number of rows processed per second, increasingly larger batch sizes reach a point of diminishing returns as the roughly constant database processing time overshadows the shrinking I/O time. Experimentation in the client/server system will suggest a good value for UpdateBatchSize that balances I/O, CPU, memory resources, .NET Garbage Collection (GC) timing, server resources to process the batch, and overall performance. A good starting value for UpdateBatchSize is 1000; a higher or lower value may be better for your environment.
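Because the best value is found by measurement, a simple way to experiment is to time the same Update() pass at several candidate batch sizes. The sketch below assumes two hypothetical helper delegates supplied by the caller: one that builds a fully configured IngresDataAdapter and one that builds a fresh DataTable of pending changes for each trial.

    using System;
    using System.Data;
    using System.Diagnostics;
    using Ingres.Client;                  // assumed provider namespace, as above

    static class BatchSizeTuner
    {
        // Times one Update() pass per candidate UpdateBatchSize value.
        // buildAdapter and buildPending are hypothetical caller-supplied helpers.
        public static void Compare(Func<IngresDataAdapter> buildAdapter,
                                   Func<DataTable> buildPending,
                                   params int[] candidateSizes)
        {
            foreach (int size in candidateSizes)
            {
                IngresDataAdapter adapter = buildAdapter();
                DataTable pending = buildPending();      // fresh set of changed rows per trial

                adapter.UpdateBatchSize = size;

                var timer = Stopwatch.StartNew();
                int rows = adapter.Update(pending);      // returns the number of rows applied
                timer.Stop();

                Console.WriteLine("UpdateBatchSize={0}: {1} rows in {2} ms",
                                  size, rows, timer.ElapsedMilliseconds);
            }
        }
    }

A call such as BatchSizeTuner.Compare(BuildAdapter, BuildPendingRows, 1, 100, 500, 1000, 5000) then shows where the rows-per-second curve flattens out for your environment.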
The data adapter and UpdateBatchSize property can be used to bulk load a database table: load a .NET DataTable with the new rows, associate the DataTable with the IngresDataAdapter, set IngresDataAdapter.UpdateBatchSize to an efficient value greater than 1, and call the IngresDataAdapter.Update() method to drive the rows to the server in batches instead of one INSERT at a time (see the sketch after this list). Faster batched inserts can be achieved if the following conditions are met:
• Inserts must be into a base table (not an updatable view or index).
• The table should not have any rules or integrities (while they are allowed, they slow processing).
• The table must not be a gateway table (for example, an IMA table, security audit log file, or an Enterprise Access table).
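Under the same assumptions as the earlier sketches (Ingres.Client namespace, illustrative connection string, table, and column names), a minimal bulk-load sketch might look like the following: the initial Fill returns only the column schema, the application adds new DataRows (each carrying a RowState of Added), and Update() sends the INSERTs to the server in batches of 1000.

    using System;
    using System.Data;
    using Ingres.Client;                  // assumed provider namespace, as above

    class BulkLoadSample
    {
        static void Main()
        {
            string connString = "Host=myserver;Database=mydb;User ID=myuser;Password=mypass";

            using (var conn = new IngresConnection(connString))
            {
                // WHERE 1 = 0 keeps the initial Fill empty; the SELECT only
                // defines the target column list for the generated INSERT.
                var adapter = new IngresDataAdapter(
                    "SELECT empno, ename, salary FROM employee WHERE 1 = 0", conn);
                var builder = new IngresCommandBuilder(adapter);   // assumed helper, as above

                adapter.UpdateBatchSize = 1000;          // the starting value suggested above

                var table = new DataTable("employee");
                adapter.Fill(table);                     // brings back columns but no rows

                for (int i = 0; i < 10000; i++)          // each new row gets RowState = Added
                {
                    DataRow row = table.NewRow();
                    row["empno"] = i;
                    row["ename"] = "Employee " + i;
                    row["salary"] = 50000m;
                    table.Rows.Add(row);
                }

                adapter.Update(table);                   // INSERTs are sent 1000 at a time
            }
        }
    }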