Bulk copy CSV files using Npgsql 3.0.5 BinaryImporter
I have upgraded to Npgsql 3.0.5 and realized that NpgsqlCopyIn is no longer available. In the older version I processed CSV files with NpgsqlCopyIn, which was fast and efficient for bulk copying huge amounts of data. I used:
```csharp
var copyStr = "COPY tablename (col1, col2, etc) FROM STDIN DELIMITER ',' CSV HEADER";
var dbCommand = new NpgsqlCommand(copyStr, _dataStoreConnection);
var copyIn = new NpgsqlCopyIn(dbCommand, _dataStoreConnection, stream);
copyIn.Start();
```
But in version 3.0 I couldn't find a way to bulk copy by letting the binary importer take its data from a CSV file. Instead I use the code below:
```csharp
StreamReader streamReader = null;
try
{
    streamReader = new StreamReader(fileStream);
    var copyStr = string.Format("COPY {0} ({1}) FROM STDIN (FORMAT BINARY)",
        _dataStoreName,
        string.Join(",", _dataStoreColumns.Select(a => a.ToLower())));
    if (_dataStoreConnection.State == ConnectionState.Closed)
        _dataStoreConnection.Open();
    string csvLine;
    while ((csvLine = streamReader.ReadLine()) != null)
    {
        if (lineCount > 0)
        {
            using (var importWriter = _dataStoreConnection.BeginBinaryImport(copyStr))
            {
                importWriter.WriteRow(csvLine.Split(','));
            }
        }
        else
        {
            lineCount++; // The first line of the CSV file is the header, so skip it.
        }
    }
}
finally
{
    if (streamReader != null)
        streamReader.Dispose();
}
```
Is there a way to tell the BinaryImporter that its input data is CSV, so that it takes care of the delimiter and inserts the data into the datastore, the way NpgsqlCopyIn did?
You can't use the binary importer to import a CSV file because, well, CSV isn't binary :)
The binary importer is the more efficient way to import data; see the example in the Npgsql docs. If you absolutely have to import CSV, you can still do so via BeginTextImport.
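For reference, a minimal sketch of the binary-import path the docs describe, assuming a hypothetical table `mytable` with a text column and an integer column (the table and column names are illustrative, not from the question):

```csharp
using Npgsql;
using NpgsqlTypes;

// Binary import: you write typed values per row instead of CSV text.
using (var writer = conn.BeginBinaryImport(
    "COPY mytable (col1, col2) FROM STDIN (FORMAT BINARY)"))
{
    writer.StartRow();
    writer.Write("some text", NpgsqlDbType.Text);
    writer.Write(42, NpgsqlDbType.Integer);
    // In Npgsql 3.x the import is committed when the importer is disposed;
    // note that later versions (4.0+) require calling writer.Complete() first.
}
```

Note that, unlike the question's loop, BeginBinaryImport should be called once for the whole import, with one StartRow/Write sequence per row, not once per line.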
You have to format the text yourself, i.e. put in the delimiters and so forth. See the docs for that as well.
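Since the question's input is already comma-delimited CSV, a sketch of the text-import route could look like this, assuming a file `data.csv` whose first line is a header (file name and table/column names are placeholders). Specifying `FORMAT CSV` in the COPY command lets the server parse the delimiters, so the lines can be passed through as-is:

```csharp
using System.IO;
using Npgsql;

using (var textWriter = conn.BeginTextImport(
    "COPY mytable (col1, col2) FROM STDIN (FORMAT CSV)"))
using (var reader = new StreamReader("data.csv"))
{
    reader.ReadLine(); // skip the header line
    string line;
    while ((line = reader.ReadLine()) != null)
        textWriter.WriteLine(line); // already comma-delimited, server parses it
}
```

Alternatively, keeping `HEADER` in the COPY options would let the server skip the first line instead of doing it client-side.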