Category: Testing

Creating and running tests – part 4

WebAPI2CLI

This is part of the series about how I made WebAPI2CLI - Execute ASP.NET Core WebAPI from Command Line.
Source code at https://github.com/ignatandrei/webAPI2CLI/

I have made a TestWebSite in .NET Core 3.1. I was thinking that, since WebApplicationFactory exists and there are tutorials like https://docs.microsoft.com/en-us/aspnet/core/test/integration-tests?view=aspnetcore-3.1 , it would be easy to test the site.

However, I found out the hard way that it is not a real server: it does not have the addresses / ports open. Read https://github.com/dotnet/aspnetcore/issues/4892 .

So I have made a hack – see public class LocalServerFactory in the source code.

Also, WithWebHostBuilder helps create a new WebApplicationFactory – used to test with or without the CLI_ENABLED command line argument.
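
A minimal sketch of how that toggle can look with a stock WebApplicationFactory (the TestWebSite.Startup type and the test itself are illustrative, assumed names, not the actual code from the repository):

using Microsoft.AspNetCore.Mvc.Testing;
using Xunit;

public class CliEnabledTests : IClassFixture<WebApplicationFactory<TestWebSite.Startup>>
{
    private readonly WebApplicationFactory<TestWebSite.Startup> factory;

    public CliEnabledTests(WebApplicationFactory<TestWebSite.Startup> factory)
    {
        this.factory = factory;
    }

    [Fact]
    public void SiteStartsWithCliEnabled()
    {
        // WithWebHostBuilder returns a new factory whose host sees CLI_ENABLED=1;
        // omit the UseSetting call to exercise the site without the CLI switch.
        var cliFactory = factory.WithWebHostBuilder(builder =>
            builder.UseSetting("CLI_ENABLED", "1"));

        using var client = cliFactory.CreateClient();
        Assert.NotNull(client);
    }
}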

 

The second point was testing methods like

configuration.GetValue<int?>("CLI_HELP", null);

Because it is an extension method, I cannot mock it. So I refactored the code to

public bool ShouldShowHelp()
{
    //deleted for ease of automated testing
    //var showHelp = configuration.GetValue<int?>("CLI_HELP", null);
    var cliHelp = configuration["CLI_HELP"];
    var showHelp = int.TryParse(cliHelp, out var val) && val == 1;
    return showHelp;
}

Also, I have resorted to XBehave to have tests written in an easy form, and to Moq to mock various properties.
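
As a minimal sketch of why the refactoring above helps (the test class below is illustrative, not taken from the repository): the IConfiguration indexer is a plain interface member that Moq can set up, while the GetValue<int?> extension method cannot be intercepted.

using FluentAssertions;
using Microsoft.Extensions.Configuration;
using Moq;
using Xunit;

public class CliHelpConfigurationTests
{
    [Fact]
    public void IndexerCanBeMocked()
    {
        // the indexer is a normal interface member, so Moq can intercept it;
        // GetValue<int?>() is an extension method and cannot be set up this way
        var configuration = new Mock<IConfiguration>();
        configuration.Setup(c => c["CLI_HELP"]).Returns("1");

        // same parsing as the refactored ShouldShowHelp() above
        var cliHelp = configuration.Object["CLI_HELP"];
        var showHelp = int.TryParse(cliHelp, out var val) && val == 1;

        showHelp.Should().BeTrue();
    }
}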

Code Coverage – part 25

Install coverlet and ReportGenerator:

dotnet tool install coverlet.console

dotnet tool install dotnet-reportgenerator-globaltool

 

Then

dotnet coverlet Infovalutar.sln --target "dotnet" --targetargs "test InfovalutarTest\InfovalutarTest.csproj --configuration release --no-build" --format opencover --exclude "[xunit*]*"

dotnet reportgenerator "-reports:coverage.opencover.xml" "-targetdir:coveragereport" "-reporttypes:HTMLInline;HTMLSummary;Badges"

 

And the result? No assemblies have been covered.

I updated all NuGet packages for the solution – including the data collector for coverlet (coverlet.collector).

Reading more, I discovered that it works with

dotnet test --collect:"XPlat Code Coverage" -r resultsFolder

The problem is that it generates a GUID-named subfolder inside resultsFolder … and ReportGenerator cannot be pointed to a random folder.

Playing with different options, such as

dotnet test /p:CollectCoverage=true /p:CoverletOutput='results' /p:CoverletOutputFormat=json -v n

No results in the results directory…

I found an obscure reference, https://github.com/tonerdo/coverlet/issues/201 , saying that I need to add the package coverlet.msbuild.

Now

dotnet test --configuration release --no-build /p:CollectCoverage=true /p:CoverletOutputFormat=cobertura -v m /p:CoverletOutput=cob.xml

dotnet reportgenerator "-reports:InfovalutarTest/cob.xml" "-targetdir:coveragereport" "-reporttypes:HTMLInline;HTMLSummary;Badges"

 

And it generates the report.

And because I want those to be added to Azure DevOps, I add --logger trx to dotnet test and follow https://docs.microsoft.com/en-us/azure/devops/pipelines/tasks/test/publish-test-results?view=azure-devops&tabs=yaml

- task: PublishTestResults@2
  inputs:
    testRunner: VSTest
    testResultsFiles: '**/*.trx'

- task: PublishCodeCoverageResults@1
  inputs:
    codeCoverageTool: cobertura
    summaryFileLocation: '**/cob.xml'

It works! Last: going to shields.io to get badges for the tests and the code coverage.

![Azure DevOps tests (branch)](https://img.shields.io/azure-devops/tests/ignatandrei0674/InfoValutar/5/master)

![Azure DevOps cc](https://img.shields.io/azure-devops/coverage/ignatandrei0674/InfoValutar/5/master)

 

Infovalutar

And one hour passes...
(This is the result of the 1-hour-per-day self-imposed challenge as a full cycle developer for an exchange rates application.)
(You can see the sources at https://github.com/ignatandrei/InfoValutar/ )

C# integration testing in Azure DevOps with Docker containers – SqlServer and Cachet example

Every piece of software that we make depends on others. For Stankins, as a general ETL tool, it is all the more important to be tested against real data providers. For example, we may want to take data from SQL Server and send it to Cachet. How can we have a SqlServer and a Cachet instance up and running easily? The obvious answer these days is Docker.

Let’s see how a test for SqlServer looks

using FluentAssertions;
using Stankins.Alive;
using Stankins.Interfaces;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Xbehave;
using Xunit;
 
namespace StankinsTestXUnit
{
    [Trait("ReceiverSqlServer", "")]
    [Trait("ExternalDependency","SqlServer")]
    public class TestReceiverSqlServer
    {
        [Scenario]
        [Example("Server=(local);Database=master;User Id=SA;Password = <YourStrong!Passw0rd>;")]
        public void TestReceiverDBServer(string connectionString)
        {
            IReceive status = null;
            IDataToSent data = null;
            $"Assume Sql Server instance {connectionString} exists , if not see docker folder".w(() => {
 
            });
            $"When I create the ReceiverDBServer ".w(() => status = new ReceiverDBSqlServer(connectionString));
            $"and receive data".w(async () =>
            {
                data = await status.TransformData(null);
            });
            $"the data should have a table".w(() =>
            {
                data.DataToBeSentFurther.Count.Should().Be(1);
            });
            $"and the result should be true".w(() =>
            {
                data.DataToBeSentFurther[0].Rows[0]["IsSuccess"].Should().Be(true);
            });
 
 
        }
    }
}

and for Cachet:

using FluentAssertions;
using Stankins.FileOps;
using Stankins.Interfaces;
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Stankins.Rest;
using Xbehave;
using Xunit;
using static System.Environment;
using Stankins.Trello;
using Stankins.Cachet;
 
namespace StankinsTestXUnit
{
    [Trait("Cachet", "")]
    [Trait("ExternalDependency", "Cachet")]
    public class TestSenderCachet
    {
        [Scenario]
        [Example("Assets/JSON/CachetV1Simple.txt", 3)]
        public void TestSimpleJSON(string fileName,int NumberRows)
        {
            IReceive receiver = null;
            
            IDataToSent data=null;
            var nl = Environment.NewLine;
            $"Given the file {fileName}".w(() =>
            {
                File.Exists(fileName).Should().BeTrue();
            });
            $"When I create the {nameof(ReceiveRest)} for the {fileName}".w(() => receiver = new ReceiveRestFromFile(fileName));
            $"And I read the data".w(async () =>data= await receiver.TransformData(null));
            $"Then should be a data".w(() => data.Should().NotBeNull());
            $"With a table".w(() =>
            {
                data.DataToBeSentFurther.Should().NotBeNull();
                data.DataToBeSentFurther.Count.Should().Be(1);
            });
            $"The number of rows should be {NumberRows}".w(() => data.DataToBeSentFurther[0].Rows.Count.Should().Be(NumberRows));
            $"and now I transform with {nameof(SenderCachet)}".w(async ()=>
                data=await new SenderCachet("http://localhost:8000","5DiHQgKbsJqck4TWhMVO").TransformData(data)
            );
 
        }
 
    }
}

(I have used XBehave for the extensions.)

Nice and easy, right? Not so!

To get SqlServer up and running I have used a docker-compose file:

version: '3'
services:
   db:
     image: mcr.microsoft.com/mssql/server
     ports:
       - "1433:1433"
     environment:
       SA_PASSWORD: "<YourStrong!Passw0rd>"
       ACCEPT_EULA: "Y"
     healthcheck:
       test: sqlcmd -S (local) -U SA -P '<YourStrong!Passw0rd>' -Q 'select 1'

and in the Azure DevOps YAML I start the containers, run the tests, collect the code coverage and stop the containers:

docker-compose -f stankinsv2/solution/StankinsV2/StankinsTestXUnit/Docker/docker-sqlserver-instance-linux.yaml up -d

echo 'start regular test'

dotnet build -c $(buildConfiguration) stankinsv2/solution/StankinsV2/StankinsV2.sln

dotnet test stankinsv2/solution/StankinsV2/StankinsTestXUnit/StankinsTestXUnit.csproj --logger trx --logger "console;verbosity=normal" --collect "Code coverage"

echo 'coverlet'
coverlet stankinsv2/solution/StankinsV2/StankinsTestXUnit/bin/$(buildConfiguration)/netcoreapp2.2/StankinsTestXUnit.dll --target "dotnet" --targetargs "test stankinsv2/solution/StankinsV2/StankinsTestXUnit/StankinsTestXUnit.csproj --configuration $(buildConfiguration) --no-build" --format opencover --exclude "[xunit*]*"

echo 'compose down'
docker-compose -f stankinsv2/solution/StankinsV2/StankinsTestXUnit/Docker/docker-sqlserver-instance-linux.yaml down

Easy, right? That's because SqlServer is well behaved and has a fully functional Docker image.

That is not so easy with Cachet. Cachet requires configuration – and more, after configuration it generates a random API token for writing data (the "5DiHQgKbsJqck4TWhMVO" passed to SenderCachet("http://localhost:8000", "5DiHQgKbsJqck4TWhMVO") in the test above).

So it becomes a task for Docker to export the container and import it again – easy stuff, right? Again, not.

So I started a small Docker container with

docker run -p 8000:8000 --name myCachetContainer -e APP_KEY=base64:ybug5it9Koxwhfi5a6CORbWdpjVqXxkz/Tyj4K45GKc= -e DEBUG=false -e DB_DRIVER=sqlite cachethq/docker

and then, browsing to http://localhost:8000, I configured Cachet and grabbed the token.

Now it is time to export:

docker export myCachetContainer -o cachet.tar

And to import it as an image:

docker import cachet.tar  mycac

And to run the imported image:

docker run -p 8000:8000 -e APP_KEY=base64:ybug5it9Koxwhfi5a6CORbWdpjVqXxkz/Tyj4K45GKc= -e DEBUG=false -e DB_DRIVER=sqlite mycac

And the container stopped! After many tries and a docker inspect on the initial image, I ended up with

docker run -it -p 8000:8000 -e APP_KEY=base64:ybug5it9Koxwhfi5a6CORbWdpjVqXxkz/Tyj4K45GKc= -e DEBUG=false -e DB_DRIVER=sqlite --workdir /var/www/html --user 1001:1001 mycac "/sbin/entrypoint.sh"

So the workdir, user, and entrypoint are not copied into the imported image and you have to set them yourself.

The final preparation for CI with Docker for Cachet? I have pushed the image to Docker Hub with docker push, and I will run it from docker-compose.

So now my docker-compose with SqlServer and Cachet looks this way:

version: '3'
services:
   db:
     image: mcr.microsoft.com/mssql/server
     ports:
       - "1433:1433"
     environment:
       SA_PASSWORD: "<YourStrong!Passw0rd>"
       ACCEPT_EULA: "Y"
     healthcheck:
       test: sqlcmd -S (local) -U SA -P '<YourStrong!Passw0rd>' -Q 'select 1'

   cachet:
     image: ignatandrei/ci_cachet
     ports:
       - "8000:8000"
     environment:
       APP_KEY: "base64:ybug5it9Koxwhfi5a6CORbWdpjVqXxkz/Tyj4K45GKc="
       DEBUG: "false"
       DB_DRIVER: "sqlite"
     user: "1001"
     working_dir: "/var/www/html"
     entrypoint: "/sbin/entrypoint.sh"

And I have nice C# integration tests with Azure DevOps, Docker, SqlServer and Cachet! You can see the code coverage report at https://codecov.io/gh/ignatandrei/stankins/src/master/stankinsv2/solution/StankinsV2/Stankins.Cachet/SenderCachet.cs
