DevToys.PocoCsv.Core
1.5.8
See the version list below for details.
dotnet add package DevToys.PocoCsv.Core --version 1.5.8
NuGet\Install-Package DevToys.PocoCsv.Core -Version 1.5.8
<PackageReference Include="DevToys.PocoCsv.Core" Version="1.5.8" />
paket add DevToys.PocoCsv.Core --version 1.5.8
#r "nuget: DevToys.PocoCsv.Core, 1.5.8"
// Install DevToys.PocoCsv.Core as a Cake Addin
#addin nuget:?package=DevToys.PocoCsv.Core&version=1.5.8

// Install DevToys.PocoCsv.Core as a Cake Tool
#tool nuget:?package=DevToys.PocoCsv.Core&version=1.5.8
DevToys.PocoCsv.Core
One of the fastest CSV reader/deserializer libraries available.
DevToys.PocoCsv.Core is a class library for reading and writing CSV. It contains the CsvStreamReader and CsvStreamWriter classes and the serialization classes CsvReader<T> and CsvWriter<T>.
- Read/write and serialize/deserialize data to and from CSV.
- Use LINQ to query large CSV files with CsvReader<T>.ReadAsEnumerable().
- Use CsvWriter<T>.Write() to write large data tables to CSV.
- Retrieve the schema of a CSV file with CsvUtils.GetCsvSchema(), which can be used to create a POCO object.
CsvStreamReader
string file = @"C:\Temp\data.csv";

using (CsvStreamReader _reader = new CsvStreamReader(file))
{
    _reader.Separator = ',';
    while (!_reader.EndOfStream)
    {
        List<string> _values = _reader.ReadCsvLine().ToList();
    }
}
CsvStreamWriter
string file = @"D:\Temp\test.csv";
using (CsvStreamWriter _writer = new CsvStreamWriter(file))
{
    var _line = new string[] { "Row 1", "Row A,A", "Row 3", "Row B" };
    _writer.WriteCsvLine(_line);
}
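Note that the second value above ("Row A,A") contains the separator, so the writer must quote it. A minimal sketch of RFC 4180-style field escaping, for illustration only (this is not the library's internal code; `EscapeDemo` and `Escape` are hypothetical names):

```csharp
using System;

public class EscapeDemo
{
    // Quote a field when it contains the separator, a quote, or a line break;
    // embedded quotes are doubled inside the quoted field (RFC 4180).
    public static string Escape(string field, char separator = ',')
    {
        bool needsQuotes = field.IndexOfAny(new[] { separator, '"', '\r', '\n' }) >= 0;
        return needsQuotes ? "\"" + field.Replace("\"", "\"\"") + "\"" : field;
    }

    static void Main()
    {
        Console.WriteLine(Escape("Row 1"));   // Row 1
        Console.WriteLine(Escape("Row A,A")); // "Row A,A"
    }
}
```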
CsvReader<T>
public class Data
{
    [Column(Index = 0)]
    public string Column1 { get; set; }

    [Column(Index = 1)]
    public string Column2 { get; set; }

    [Column(Index = 2)]
    public string Column3 { get; set; }

    [Column(Index = 5)]
    public string Column5 { get; set; }
}
string file = @"D:\Temp\data.csv";

using (CsvReader<Data> _reader = new(file))
{
    _reader.Open();
    _reader.Separator = ','; // or use _reader.DetectSeparator();
    var _data = _reader.ReadAsEnumerable().Where(p => p.Column1.Contains("16"));
    var _materialized = _data.ToList();
}
- Open()
  Opens the reader.
- Separator
  Sets the separator to use (default ',').
- ReadAsEnumerable()
  Reads and deserializes one CSV line per iteration over the collection, which allows querying very large files.
- DetectSeparator()
  Auto-detects the separator (looks for commonly used separators in the first 10 lines).
- Skip()
  Skips a row and advances the reader to the next row without interpreting it.
- Read()
  Reads the current row into T and advances the reader to the next row.
- MoveToStart()
  Moves the reader to the start position. Skip() and Take() advance the start position; use MoveToStart() to reset it.
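The methods above can be combined as in the following sketch (assumes a CSV file with a header row and the Data class from above; the file path is illustrative):

```csharp
using DevToys.PocoCsv.Core;

using (CsvReader<Data> _reader = new(@"D:\Temp\data.csv"))
{
    _reader.Open();
    _reader.DetectSeparator();    // Auto-detect instead of setting Separator manually.

    _reader.Skip();               // Skip the header row without deserializing it.
    Data _first = _reader.Read(); // Read the current row into Data and advance.

    _reader.MoveToStart();        // Reset the position that Skip()/Read() advanced.
}
```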
CsvWriter<T>
private IEnumerable<CsvSimple> LargeData()
{
    for (int ii = 0; ii < 10000000; ii++)
    {
        CsvSimple _line = new()
        {
            AfBij = "bij",
            Bedrag = "100",
            Code = "test",
            Datum = "20200203",
            Mededelingen = $"test {ii}",
            Rekening = "3434",
            Tegenrekening = "3423424"
        };
        yield return _line;
    }
}
string file = @"D:\largedata.csv";
using (CsvWriter<CsvSimple> _writer = new(file) { Separator = ',', Append = true })
{
    _writer.Open();
    _writer.Write(LargeData());
}
- Open()
  Opens the writer.
- Separator
  Sets the separator to use (default ',').
- WriteHeader()
  Writes a header row with the property names of T.
- Write(IEnumerable<T> rows)
  Writes the rows to CSV while consuming them.
ColumnAttribute
The Column attribute defines which properties are serialized or deserialized.
- Index
  Defines the index position within the CSV document. Index numbers can be skipped: the reader then ignores those columns, while the writer emits them as empty columns.
- Header
  Defines the header text. This property applies to CsvWriter only; if not specified, the property name is used.
- OutputFormat
  Applies a string format, depending on the property type. This property applies to CsvWriter only.
- OutputNullValue
  Defines the value to write as a default for null. This property applies to CsvWriter only.
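For example, OutputFormat and OutputNullValue can be combined on one POCO like this (a hedged sketch; the Order class is hypothetical, and the format strings assume .NET standard date and numeric format-string semantics):

```csharp
public class Order
{
    [Column(Index = 0)]
    public int Id { get; set; }

    // Written as e.g. 2023-03-24 (.NET "yyyy-MM-dd" date format string).
    [Column(Index = 1, OutputFormat = "yyyy-MM-dd")]
    public DateTime OrderDate { get; set; }

    // Written with two decimals; nulls are written as [NULL].
    [Column(Index = 2, OutputFormat = "0.00", OutputNullValue = "[NULL]")]
    public decimal? Amount { get; set; }
}
```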
Other Examples
public class Data
{
    [Column(Index = 0)]
    public string Collumn1 { get; set; }

    [Column(Index = 1)]
    public string Collumn2 { get; set; }

    [Column(Index = 2, Header = "Test")]
    public byte[] Collumn3 { get; set; }

    [Column(Index = 3)]
    public DateTime TestDateTime { get; set; }

    [Column(Index = 4)]
    public DateTime? TestDateTimeNull { get; set; }

    [Column(Index = 5)]
    public Int32 TestInt { get; set; }

    [Column(Index = 6, OutputNullValue = "[NULL]")]
    public Int32? TestIntNull { get; set; }
}
private IEnumerable<Data> GetTestData()
{
    yield return new Data
    {
        Collumn1 = "01",
        Collumn2 = "AA",
        Collumn3 = new byte[3] { 2, 4, 6 },
        TestDateTime = DateTime.Now,
        TestDateTimeNull = DateTime.Now,
        TestInt = 100,
        TestIntNull = 200
    };
    yield return new Data
    {
        Collumn1 = "01",
        Collumn2 = "AA",
        Collumn3 = new byte[3] { 2, 4, 6 },
        TestDateTime = DateTime.Now,
        TestDateTimeNull = DateTime.Now,
        TestInt = 100,
        TestIntNull = 200
    };
    yield return new Data
    {
        Collumn1 = "04",
        Collumn2 = "BB",
        Collumn3 = new byte[3] { 8, 9, 10 },
        TestDateTime = DateTime.Now,
        TestDateTimeNull = null,
        TestInt = 300,
        TestIntNull = null
    };
}
public static string StreamToString(Stream stream)
{
    using (StreamReader reader = new StreamReader(stream, Encoding.UTF8))
    {
        stream.Position = 0;
        return reader.ReadToEnd();
    }
}
List<Data> _result = new List<Data>();

using (MemoryStream _stream = new MemoryStream())
{
    using (CsvWriter<Data> _csvWriter = new CsvWriter<Data>(_stream))
    using (CsvReader<Data> _csvReader = new CsvReader<Data>(_stream))
    {
        _csvWriter.Separator = ';';
        _csvWriter.Open();
        _csvWriter.WriteHeader();
        _csvWriter.Write(GetTestData());

        _csvReader.Open();
        _csvReader.DetectSeparator(); // Auto detect separator.
        _csvReader.Skip();            // Skip header.
        _result = _csvReader.ReadAsEnumerable().Where(p => p.Collumn2 == "AA").ToList();
    }
}
string _result;

using (MemoryStream _stream = new MemoryStream())
{
    using (CsvWriter<Data> _csvWriter = new CsvWriter<Data>(_stream))
    {
        _csvWriter.Separator = ',';
        _csvWriter.Open();
        _csvWriter.WriteHeader();
        _csvWriter.Write(GetTestData());
        _result = StreamToString(_stream);
    }
}
Sampling only a few rows without reading the entire CSV:
List<CsvSimple> _result1;
List<CsvSimple> _result2;
string file = @"D:\largedata.csv";

using (CsvReader<CsvSimple> _reader = new CsvReader<CsvSimple>(file))
{
    _reader.Open();
    _reader.Skip();   // Skip the header row.
    _reader.Skip(10); // Skip another 10 rows. This Skip does not materialize T; Skip on an IEnumerable does.
    _result1 = _reader.ReadAsEnumerable().Skip(10).Take(10).ToList(); // Materializes 20 records but returns 10.
    _reader.Skip(10);
    _result1 = _reader.ReadAsEnumerable().Take(10).ToList();
    _result2 = _reader.ReadAsEnumerable().Take(10).ToList();
}
Keep in mind that Skip and Take advance the reader to the next position: executing another _reader.ReadAsEnumerable().Where(p => p...).ToList() will query from position 21.
Use MoveToStart() to move the reader back to the starting position.
Product | Compatible and additional computed target framework versions |
---|---|
.NET | net6.0 is compatible. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 was computed. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 was computed. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. |
net6.0
- No dependencies.
NuGet packages
This package is not used by any NuGet packages.
GitHub repositories
This package is not used by any popular GitHub repositories.
Version | Downloads | Last updated |
---|---|---|
4.3.0 | 0 | 11/21/2024 |
4.2.5 | 33 | 11/20/2024 |
4.2.4 | 38 | 11/19/2024 |
4.2.3 | 69 | 11/13/2024 |
4.2.2 | 160 | 2/28/2024 |
4.2.1 | 116 | 2/24/2024 |
4.2.0 | 129 | 2/23/2024 |
4.1.2 | 104 | 2/22/2024 |
4.1.1 | 132 | 2/21/2024 |
4.1.0 | 127 | 2/21/2024 |
4.0.1 | 139 | 2/12/2024 |
4.0.0 | 128 | 2/12/2024 |
3.1.13 | 103 | 2/8/2024 |
3.1.12 | 150 | 2/7/2024 |
3.1.11 | 105 | 1/31/2024 |
3.1.10 | 116 | 1/19/2024 |
3.1.9 | 121 | 1/13/2024 |
3.1.8 | 120 | 1/12/2024 |
3.1.7 | 108 | 1/11/2024 |
3.1.5 | 134 | 1/8/2024 |
3.1.3 | 175 | 12/1/2023 |
3.1.2 | 135 | 12/1/2023 |
3.1.0 | 120 | 11/28/2023 |
3.0.7 | 209 | 8/27/2023 |
3.0.6 | 148 | 8/23/2023 |
3.0.5 | 159 | 8/23/2023 |
3.0.4 | 160 | 8/17/2023 |
3.0.3 | 174 | 8/15/2023 |
3.0.2 | 176 | 8/11/2023 |
3.0.1 | 195 | 8/11/2023 |
3.0.0 | 171 | 8/11/2023 |
2.0.7 | 220 | 8/9/2023 |
2.0.5 | 180 | 8/4/2023 |
2.0.4 | 178 | 8/3/2023 |
2.0.3 | 149 | 7/31/2023 |
2.0.2 | 174 | 7/28/2023 |
2.0.0 | 178 | 7/19/2023 |
1.7.53 | 216 | 4/14/2023 |
1.7.52 | 215 | 4/12/2023 |
1.7.51 | 202 | 4/7/2023 |
1.7.43 | 231 | 4/3/2023 |
1.7.42 | 214 | 4/3/2023 |
1.7.41 | 198 | 4/3/2023 |
1.7.5 | 203 | 4/7/2023 |
1.7.3 | 243 | 4/3/2023 |
1.7.2 | 231 | 4/3/2023 |
1.7.1 | 220 | 4/3/2023 |
1.7.0 | 227 | 4/1/2023 |
1.6.3 | 226 | 3/31/2023 |
1.6.2 | 228 | 3/29/2023 |
1.6.1 | 221 | 3/29/2023 |
1.6.0 | 215 | 3/27/2023 |
1.5.8 | 240 | 3/24/2023 |
1.5.7 | 211 | 3/22/2023 |
1.5.6 | 227 | 3/22/2023 |
1.5.5 | 235 | 3/21/2023 |
1.5.4 | 244 | 3/21/2023 |
1.5.1 | 234 | 3/20/2023 |
1.5.0 | 239 | 3/19/2023 |
1.4.5 | 234 | 3/18/2023 |
1.4.4 | 274 | 3/18/2023 |
1.4.3 | 227 | 3/18/2023 |
1.4.2 | 245 | 3/18/2023 |
1.4.1 | 211 | 3/18/2023 |
1.4.0 | 230 | 3/18/2023 |
1.3.92 | 240 | 3/18/2023 |
1.3.91 | 246 | 3/17/2023 |
1.3.9 | 233 | 3/17/2023 |
1.3.8 | 209 | 3/17/2023 |
1.3.7 | 239 | 3/17/2023 |
1.3.6 | 205 | 3/17/2023 |
1.3.5 | 221 | 3/17/2023 |
1.3.4 | 243 | 3/17/2023 |
1.3.3 | 233 | 3/16/2023 |
1.3.2 | 214 | 3/16/2023 |
1.3.1 | 241 | 3/16/2023 |
1.3.0 | 196 | 3/16/2023 |
1.2.0 | 235 | 3/14/2023 |
1.1.6 | 275 | 2/24/2023 |
1.1.5 | 320 | 2/16/2023 |
1.1.4 | 479 | 5/18/2022 |
1.1.3 | 716 | 1/27/2022 |
1.1.2 | 645 | 1/27/2022 |
1.1.1 | 698 | 1/14/2022 |
1.1.0 | 5,843 | 11/23/2021 |
1.0.5 | 394 | 5/11/2021 |
1.0.4 | 338 | 4/14/2021 |
1.0.3 | 378 | 4/12/2021 |
1.0.2 | 336 | 4/12/2021 |
1.0.1 | 317 | 4/7/2021 |
1.0.0 | 389 | 4/7/2021 |
1.5.8
- Minor Improvements
- Added Skip() to CsvStreamReader
- Changed EndOfStream behaviour
1.5.7
- Small improvements
1.5.1
- Updated Readme
- Fixed bug with Skip(rows)
- Fixed small bug where ReadAsEnumerable() always started at position 0.
1.5
- Correct handling Null Types for Reader
1.4.5
- Refactoring
- Removed DynamicReader and DynamicWriter
1.4.2
- Another performance improvement for Reader
1.4
- Performance improvements for Writer.
- Added OutputFormat to ColumnAttribute
1.3.8
- Performance improvement for Reader
1.3.2
- Bug fixes
1.3
- Improved constructors to support all parameters of the underlying StreamReader and StreamWriter.
- Added Skip() to CsvReader (to be used in combination with Read())
- Added WriteHeader() to CsvWriter
- Added Header to the Column attribute, used by the CsvWriter
- GetCsvSeparator() / DetectSeparator() now detect more exotic separators.
- Added byte[] to base64 serialization to CsvReader and CsvWriter
1.2
- Added single Read() function.
- Rows() now marked as obsolete.
- Added ReadAsEnumerable() as replacement for Rows()
- Added GetCsvSeparator(int sampleRows) to CsvStreamReader()
- Added DetectSeparator() to CsvReader()
1.1.5
- Bug Fixes
1.1.4
- Added CsvUtils static class including some special Csv functions to use.
1.1.3
- Added CsvWriterDynamic
1.1.1
- Added CsvReaderDynamic
1.1.0
- Speed optimizations (using delegates instead of reflection)
1.0.5
- Read/write stream CSV lines into a POCO object.
- Query/read/write large CSV files.