
BulkCopy Class

A convenience class to copy data from an IDataReader to an index.
Inheritance Hierarchy
System.Object
  SphinxConnector.SphinxQL.BulkCopy

Namespace:  SphinxConnector.SphinxQL
Assembly:  SphinxConnector (in SphinxConnector.dll) Version: 5.3.0
Syntax
public class BulkCopy

The BulkCopy type exposes the following members.

Constructors
Name  Description
BulkCopy
Initializes a new instance of the BulkCopy class.
Properties
Name  Description
BatchSize
Gets or sets the batch size to use when inserting data into the index. Default is 16.
CopyMethod
Gets or sets the BulkCopyMethod. Default is BulkCopyMethod.Replace.
IndexName
Gets or sets the name of the index to copy data to.
Progress
Gets or sets the IProgress&lt;int&gt; instance that receives progress notifications.
ProgressNotificationAfter
By default, a progress notification is raised for each processed batch; set this to a value greater than 0 to adjust how many documents must be processed before a notification occurs.
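The interplay between BatchSize, Progress, and ProgressNotificationAfter can be sketched as follows. This is an illustrative configuration only; the index name "products" is made up, and the assumption (taken from the Examples section) is that Progress receives the running count of documents written:

```csharp
// Sketch: report progress every 5,000 documents instead of after every batch.
// Assumes "connection" is an open SphinxQLConnection and "products" is a
// hypothetical real-time index name.
var bulkCopy = new BulkCopy(connection, "products")
{
    BatchSize = 1000,
    ProgressNotificationAfter = 5000,
    Progress = new Progress<int>(written =>
        Console.WriteLine($"{written} documents written so far"))
};
```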
Methods
Name  Description
CopyAsync
Copies the data provided by the dataReader to the index.
Equals
Determines whether the specified object is equal to the current object.
(Inherited from Object.)
GetHashCode
Serves as the default hash function.
(Inherited from Object.)
GetType
Gets the Type of the current instance.
(Inherited from Object.)
ToString
Returns a string that represents the current object.
(Inherited from Object.)
Examples
// A DataTable does not implement IDataReader itself, but its
// CreateDataReader() method returns a DataTableReader that does.
DataTable data = GetData();
IDataReader dataReader = data.CreateDataReader();

// If you have an enumerable of objects you want to bulk copy, you can use
// FastMember (http://www.nuget.org/packages/FastMember) to create an
// implementation of IDataReader on the fly:
IEnumerable<Product> products = GetProducts();
IDataReader productsDataReader = FastMember.ObjectReader.Create(products);

using (var connection = new SphinxQLConnection())
{
    await connection.OpenAsync();
    using (var tx = await connection.BeginTransactionAsync())
    {
        var bulkCopy = new BulkCopy(connection, "rt")
        {
            // The default batch size is a very conservative 16; run some tests
            // and adjust to what gives the best performance for your workload.
            // You may also need to adjust max_packet_size in your searchd config.
            BatchSize = 1000,
            Progress = new Progress<int>(c => Debug.WriteLine($"{c} documents written"))
        };

        int count = await bulkCopy.CopyAsync(dataReader);
        // or
        count = await bulkCopy.CopyAsync(productsDataReader);

        await tx.CommitAsync();
    }
}