I made a database test utility for C# (a DBUnit lookalike)

4 minute read

When running tests that involve a database, I sometimes want to load test data and verify the results
while writing as little test code as possible.
This is especially true when some team members have never written test code,
or when a lot of test data has to be prepared.

I want to load and verify test data by creating the test data and expected values in external files and reading them in.
In Java there is DBUnit; what about C#?
There is a product called NDbUnit,
but I wanted to create the data as CSV or Excel files instead of XML,
and creating a schema definition file seemed like a pain (?),
so I decided to implement it myself.

Specification

–The test data loaded before a test will be called prerequisite data,
and the data used for verification will be called expected value data
–Prerequisite data and expected value data are created as CSV
(because I don't have an Office license for Excel…)
–A set of prerequisite data and expected value data (let's call it a dataset)
is stored in a directory
–A dataset is uniquely identified by a test ID
–There is a naming convention for dataset directories: a directory whose name matches
the pattern is recognized as a dataset, and the name defines its test ID
–Specify the root directory where the datasets are placed and a test ID, then load prerequisite data and run verification
–The execution result of an arbitrary query can be verified
–Depends on the table definition as little as possible (the data does not necessarily have to match the table definition)

A dataset looks like the following; place it under the project root
or the solution root.

Project root/
  testdata/
    .../ # the directory structure here is free-form
      T__SAMPLE1__SampleTest1/
        R__item.csv
        R__sales.csv
        E__sales.csv
        E__assert_sales.csv
      T__SAMPLE2__SampleTest2/
        ...

Dataset directory name convention: `T__{test ID}__{comment}`
Prerequisite data file name convention: `R__{table name}`
Expected value data file name convention: `E__{table name or query name}`
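As an illustration of the directory-name convention, extracting the test ID could look like the following (a hypothetical sketch, not DumbAssert's actual code; `ExtractTestId` is a made-up helper name):

```csharp
using System;
using System.Text.RegularExpressions;

class DatasetNameDemo
{
    // Extract the test ID from a dataset directory name of the form
    // T__{test ID}__{comment}; returns null if the name doesn't match.
    public static string ExtractTestId(string dirName)
    {
        // Lazy (.+?) so the first "__" after the ID ends the ID part
        var m = Regex.Match(dirName, @"^T__(.+?)__(.+)$");
        return m.Success ? m.Groups[1].Value : null;
    }

    static void Main()
    {
        Console.WriteLine(ExtractTestId("T__SAMPLE1__SampleTest1")); // SAMPLE1
        Console.WriteLine(ExtractTestId("notadataset") ?? "(no match)");
    }
}
```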

Deliverables

Published on GitHub

https://github.com/singy15/DumbAssert

Just add DumbAssert.cs to your project to use it.

It ended up like this.

For example, suppose the following table exists in the DB (assuming SQLite):

CREATE TABLE "article" (
	"article_id" INTEGER NOT NULL,
	"name" TEXT NOT NULL,
	"content" TEXT NOT NULL,
	"published" TEXT NOT NULL,
	"tag" TEXT,
	"version" INTEGER NOT NULL,
	PRIMARY KEY("article_id" AUTOINCREMENT)
);

Prepare the dataset

project-root/
  testdata/
    T__A1__PublishArticle/
      R__article.csv
      E__article.csv
      E__assert_article.csv

Prerequisite data. NULL is expressed as `<NULL>` (this can be changed via a setting, described later).

R__article.csv


article_id,name,content,published,tag,version
1,test1,content1,0,tag1,1
2,test2,content2,0,<NULL>,1

Expected value data. Columns not listed are excluded from verification.
Rows are sorted in ascending order of the listed columns;
in this case, `ORDER BY article_id asc, name asc, content asc, published asc, tag asc`

E__article.csv


article_id,name,content,published,tag
1,test1,content1,0,tag1
2,test2,content2,1,<NULL>
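The ORDER BY clause follows mechanically from the CSV header. A minimal sketch of that logic (hypothetical, not the library's actual implementation; `BuildOrderBy` is a made-up name, and column-name quoting is omitted for readability):

```csharp
using System;
using System.Linq;

class OrderByDemo
{
    // Build an ORDER BY clause that sorts ascending by every column
    // listed in the expected-value CSV header, in order of appearance.
    public static string BuildOrderBy(string headerLine)
    {
        var cols = headerLine.Split(',').Select(c => c.Trim() + " asc");
        return "ORDER BY " + string.Join(", ", cols);
    }

    static void Main()
    {
        Console.WriteLine(BuildOrderBy("article_id,name,content,published,tag"));
        // ORDER BY article_id asc, name asc, content asc, published asc, tag asc
    }
}
```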

The execution result of an arbitrary query can also be verified against expected value data.
This can be used to validate stored functions and views as well.

E__assert_article.csv


@query{select article_id,published from article where article_id = 2 order by article_id}
article_id,published
2,1
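Pulling the SQL out of the `@query{...}` directive can be sketched like this (again hypothetical; `ExtractQuery` is a made-up helper, not part of DumbAssert's API):

```csharp
using System;
using System.Text.RegularExpressions;

class QueryDirectiveDemo
{
    // Extract the SQL from an @query{...} directive on the first line
    // of an expected-value file; returns null if it is not a directive.
    public static string ExtractQuery(string firstLine)
    {
        var m = Regex.Match(firstLine, @"^@query\{(?<sql>.+)\}$");
        return m.Success ? m.Groups["sql"].Value : null;
    }

    static void Main()
    {
        Console.WriteLine(ExtractQuery(
            "@query{select article_id,published from article where article_id = 2}"));
        Console.WriteLine(ExtractQuery("article_id,published") ?? "(not a directive)");
    }
}
```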

In the test code, prerequisite data is loaded with `Prepare`,
and the expected value data of the dataset read by `Prepare` is verified with `Assert`.

[Test]
public void TestSample1() 
{
    // Specify the root directory of the datasets (required)
    DumbAssertConfig.TestDataBaseDir = /*Dataset root directory*/;
    // Specify the encoding (default is UTF-8)
    DumbAssertConfig.Encoding = Encoding.GetEncoding("UTF-8");
    // Delete existing data before loading prerequisite data (default is true)
    DumbAssertConfig.DeleteBeforeInsert = true;
    // Specify the string that stands in for NULL (default is "<NULL>")
    DumbAssertConfig.NullString = "<NULL>";
    // Specify the string representation pattern for DateTime,
    // passed to ToString (default is "yyyy-MM-dd HH:mm:ss.fff")
    DumbAssertConfig.DateTimePattern = "yyyy-MM-dd HH:mm:ss.fff";
    // Double-quote column names in the generated SQL (default is true)
    DumbAssertConfig.QuoteColumnName = true;
    // Newline (default is Environment.NewLine: CRLF on Windows, LF on Mac/Linux)
    DumbAssertConfig.NewLine = Environment.NewLine;
    using(IDbConnection conn = new SQLiteConnection(/*Connection string*/)) {
        conn.Open();
        DumbAssert du = new DumbAssert(conn);
        du.Prepare("A1");
        // ... processing involving DB operations ...
        du.Assert();
        conn.Close();
    }
}

You can also hand in an existing transaction for loading and verifying the prerequisite data,
so you can roll back after the test to restore the original data.

[Test]
public void TestUseExistingTransaction() 
{
    DumbAssertConfig.TestDataBaseDir = /*Dataset root directory*/;
    using(IDbConnection conn = new SQLiteConnection(/*Connection string*/)) {
        conn.Open();
        var tx = conn.BeginTransaction();
        DumbAssert du = new DumbAssert(conn, tx);
        du.Prepare("A1");
        // ... processing involving DB operations ...
        du.Assert();
        tx.Rollback();
        conn.Close();
    }
}

It is also possible to create a dataset containing only prerequisite data and use it as common prerequisite data.

du.Prepare("Common1");
du.Prepare("A1");
// ... processing involving DB operations ...
du.Assert();

It is also possible to verify only the expected value of a specific dataset.

du.Prepare("A1");
du.Prepare("A2");
// ... processing involving DB operations ...
du.Assert("A1");

Implementation

Read CSV

I wanted to keep dependencies to a minimum, so I built it using only standard .NET functionality.

using Microsoft.VisualBasic.FileIO;
...

TextFieldParser parser = 
    new TextFieldParser(filePath, DumbAssertConfig.Encoding);
parser.SetDelimiters(",");
this.Data = new List<string[]>();
while(!parser.EndOfData)
{
    this.Data.Add(parser.ReadFields());
}
...

Data validation

Basically, I just fetch the data with ADO.NET and compare it against the expected values from the CSV.
That works fine for strings, but for numbers, dates, Booleans, and so on,
it uses the somewhat crude approach of converting everything to strings and comparing those.
You can swap out the serializer to change how values are stringified.
I haven't tested anything other than strings, numbers, and dates, so I'm not sure it works for the rest…
I'll test it eventually.
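The stringify-and-compare approach could be sketched roughly as follows (a simplified, hypothetical version: `Stringify` and `RowMatches` are made-up names, `Stringify` stands in for the configurable serializer shown below, and sorting and multi-row bookkeeping are omitted):

```csharp
using System;

class CompareSketch
{
    // Stand-in for the serializer: NULL becomes "<NULL>",
    // everything else becomes its default string form.
    static string Stringify(object v) =>
        v == null || v is DBNull ? "<NULL>" : v.ToString();

    // Stringify every cell of an actual result row and compare it
    // against the corresponding expected CSV row.
    public static bool RowMatches(object[] actual, string[] expected)
    {
        if(actual.Length < expected.Length) return false;
        for(int i = 0; i < expected.Length; i++)
            if(Stringify(actual[i]) != expected[i]) return false;
        return true;
    }

    static void Main()
    {
        object[] actual = { 2L, "test2", null };
        string[] expected = { "2", "test2", "<NULL>" };
        Console.WriteLine(RowMatches(actual, expected)); // True
    }
}
```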

public class DumbAssertSerializer
{
    public string Serialize(object value)
    {
        switch(value)
        {
            case null: return DumbAssertConfig.NullString;
            case Boolean val: return Serialize(val);
            case Byte val: return Serialize(val);
            case Char val: return Serialize(val);
            case DateTime val: return Serialize(val);
            ...
            default: return null;
        }
    }
    public string Serialize(Boolean value) { return value.ToString(); }
    public string Serialize(Byte value) { return value.ToString(); }
    public string Serialize(Char value) { return value.ToString(); }
    public string Serialize(DateTime value) { return value.ToString(DumbAssertConfig.DateTimePattern); }
    ...
}

public class DumbAssertConfig
{
    ...
    
    public static DumbAssertSerializer Serializer = new DumbAssertSerializer();

    public static string NullString = "<NULL>";

    ...

    public static string DateTimePattern = "yyyy-MM-dd HH:mm:ss.fff";
}
