I am indexing a database table using a custom indexer, and it all works fine. At the moment, however, I am still developing and playing with a small table; eventually I will be indexing a table of around 4 million rows, although not all of its fields. In my code I have something like:
// Dispose the reader when done so the connection is released
using (var reader = _SqlHelper.ExecuteReader(CommandType.StoredProcedure, storedProcedureName))
{
    while (reader.Read())
    {
        int fields = reader.FieldCount;
        var sds = new SimpleDataSet
        {
            NodeDefinition = new IndexedNode(),
            RowData = new Dictionary<string, string>()
        };
        for (int i = 0; i < fields; i++)
        {
            if (i == 0)
            {
                // First column is the node id
                sds.NodeDefinition.NodeId = Convert.ToInt32(reader[0]);
                sds.NodeDefinition.Type = indexType;
            }
            else
            {
                sds.RowData.Add(reader.GetName(i), reader[i].ToString());
            }
        }
        list.Add(sds);
    }
}
I am guessing there will be a massive bottleneck when filling the reader and building the dictionaries. Has anyone else done something similar, and if so, how did you manage to speed things up?
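One thing I have been toying with is caching the column names once before the loop (instead of calling reader.GetName(i) on every row) and pre-sizing the dictionary so it never rehashes. This is only an untested sketch; the SimpleDataSet and IndexedNode stand-ins below are minimal versions of my own types so the snippet compiles on its own:

```csharp
using System;
using System.Collections.Generic;
using System.Data;

// Minimal stand-ins so the sketch is self-contained; the real
// SimpleDataSet / IndexedNode come from the indexer code above.
class IndexedNode { public int NodeId; public string Type; }
class SimpleDataSet
{
    public IndexedNode NodeDefinition;
    public Dictionary<string, string> RowData;
}

static class IndexLoader
{
    // Builds the SimpleDataSet list from an open reader, looking up
    // each column name once per result set rather than once per row.
    public static List<SimpleDataSet> Load(IDataReader reader, string indexType)
    {
        int fields = reader.FieldCount;

        // One GetName call per column, not per row.
        var names = new string[fields];
        for (int i = 1; i < fields; i++)
            names[i] = reader.GetName(i);

        var list = new List<SimpleDataSet>();
        while (reader.Read())
        {
            var sds = new SimpleDataSet
            {
                NodeDefinition = new IndexedNode
                {
                    NodeId = Convert.ToInt32(reader[0]),
                    Type = indexType
                },
                // Pre-size the dictionary to avoid rehashing per row;
                // Convert.ToString also turns DBNull into "" instead of throwing.
                RowData = new Dictionary<string, string>(fields - 1)
            };
            for (int i = 1; i < fields; i++)
                sds.RowData.Add(names[i], Convert.ToString(reader[i]));
            list.Add(sds);
        }
        return list;
    }
}
```

No idea yet how much this saves on 4 million rows, but it at least moves the per-row work down to reading the values themselves.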
Examine custom indexer
Regards
Ismail