Category: .NET

[PostEvent] Talks by Softbinator

The organizers of Talks by Softbinator were kind enough to select me for a presentation – again, my presentation about .NET Core and Angular Everywhere. Source code at https://github.com/ignatandrei/angNetCoreDemo/

The presentation was supposed to take half an hour – with all the explanations, it took one hour.

Video at https://www.facebook.com/softbinator/videos/459024768242391

Correct abstraction – .NET Core IFileProvider

Create the right abstraction and they will implement it. I was delighted to find that .NET Core has an IFileProvider: https://docs.microsoft.com/en-us/dotnet/api/microsoft.extensions.fileproviders.ifileprovider?view=aspnetcore-2.2

An obvious implementation is PhysicalFileProvider: https://docs.microsoft.com/en-us/dotnet/api/microsoft.extensions.fileproviders.physicalfileprovider?view=aspnetcore-2.2

A not so obvious implementation, but natural to a programmer's mind, is NullFileProvider: https://docs.microsoft.com/en-us/dotnet/api/microsoft.extensions.fileproviders.nullfileprovider?view=aspnetcore-2.2

And, because we already have 2 providers, a CompositeFileProvider makes sense: https://docs.microsoft.com/en-us/dotnet/api/microsoft.extensions.fileproviders.compositefileprovider?view=aspnetcore-2.2

And because we ship assemblies, it is natural to have an EmbeddedFileProvider: https://docs.microsoft.com/en-us/dotnet/api/microsoft.extensions.fileproviders.embeddedfileprovider?view=aspnetcore-2.2

And, to complicate things, a ManifestEmbeddedFileProvider: https://docs.microsoft.com/en-us/dotnet/api/microsoft.extensions.fileproviders.manifestembeddedfileprovider?view=aspnetcore-2.2

(You can read more details here: https://docs.microsoft.com/en-us/aspnet/core/fundamentals/file-providers?view=aspnetcore-2.2#physicalfileprovider)
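To see the abstraction at work, here is a minimal sketch (mine, not from the docs) that composes a PhysicalFileProvider and an EmbeddedFileProvider behind the single IFileProvider interface – the file name is hypothetical:

using System;
using System.Reflection;
using Microsoft.Extensions.FileProviders;

class FileProviderDemo
{
    static void Main()
    {
        // two very different sources, one abstraction
        var physical = new PhysicalFileProvider(AppContext.BaseDirectory);
        var embedded = new EmbeddedFileProvider(Assembly.GetEntryAssembly());
        IFileProvider files = new CompositeFileProvider(physical, embedded);

        // consumers ask the composite; they do not care where the file lives
        IFileInfo info = files.GetFileInfo("readme.txt"); // hypothetical file name
        Console.WriteLine($"readme.txt exists: {info.Exists}");
    }
}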

But these are just the implementations that Microsoft provides in .NET Core.

Others have created more, such as:

LongFileProvider: https://github.com/v-kabanov/hdrepository/blob/master/trunk/Common/bfs.Repository.IO.WinNtfs/WinLongFileFrovider.cs

S3FileProvider: https://github.com/evorine/S3FileProvider/blob/master/src/S3FileProvider.cs, https://github.com/lamondlu/AWSS3FileProvider

AzureBlobFileProvider: https://github.com/filipw/Strathweb.AspNetCore.AzureBlobFileProvider

More cloud providers: https://github.com/jiabiao/NCloudFiles

ZipFileProviders: https://github.com/tagcode/Lexical.FileProvider , https://github.com/cloudscribe/cloudscribe/blob/master/src/cloudscribe.Web.Common/StaticFiles/GzipMappingFileProvider.cs

DatabaseFileProviders: https://github.com/mikebrind/RazorEngineViewOptionsFileProviders

And InMemory: https://github.com/dazinator/Dazinator.AspNet.Extensions.FileProviders

It is interesting how many different implementations you can find of such a common thing as a file / folder. But it is also rewarding to see that, when you create the right abstraction, others will implement it!

Simple serialization of Encoding

My problem was serializing an Encoding. Let’s suppose that we have a class that has an Encoding property (maybe used to read a file):


using System.Text;

internal class MyTest
{
    public MyTest()
    {
        enc = Encoding.ASCII;
    }
    public Encoding enc { get; set; }
}


We want to serialize this class in order to let the administrator decide what the encoding will be.

When we serialize it (obviously, with Newtonsoft.Json), we obtain this kind of data:

{
  "enc": {
    "IsSingleByte": true,
    "BodyName": "us-ascii",
    "EncodingName": "US-ASCII",
    "HeaderName": "us-ascii",
    "WebName": "us-ascii",
    "WindowsCodePage": 1252,
    "IsBrowserDisplay": false,
    "IsBrowserSave": false,
    "IsMailNewsDisplay": true,
    "IsMailNewsSave": true,
    "EncoderFallback": {
      "DefaultString": "?",
      "MaxCharCount": 1
    },
    "DecoderFallback": {
      "DefaultString": "?",
      "MaxCharCount": 1
    },
    "IsReadOnly": true,
    "CodePage": 20127
  }
}

This is too much for someone to edit. We want a simple string that can be edited easily – and the WebName, which is, in fact, a string from https://www.iana.org/assignments/character-sets/character-sets.xhtml, seems the obvious choice.

So I have made a JsonConverter class just for this property. It is very simple:

public class JsonEncodingConverter : JsonConverter
{
    public override bool CanConvert(Type objectType)
    {
        return (typeof(Encoding).IsAssignableFrom(objectType));
    }

    public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
    {
        string webName = "";
        if (reader.TokenType == JsonToken.String)
        {
            webName = reader.Value?.ToString();
        }
        existingValue = Encoding.GetEncoding(webName);
        return existingValue;
    }

    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
    {
        // write just the IANA web name instead of the whole Encoding object
        string webName = (value as Encoding).WebName;
        serializer.Serialize(writer, webName);
    }
}

And it can be used very easily:

MyTest m = new MyTest();
JsonEncodingConverter[] conv = new[] { new JsonEncodingConverter() };
string original = JsonConvert.SerializeObject(m, Formatting.Indented);
string data = JsonConvert.SerializeObject(m, Formatting.Indented, conv);
//https://www.iana.org/assignments/character-sets/character-sets.xhtml
Console.WriteLine(data);
Console.WriteLine("and now the original");
Console.WriteLine(original);
MyTest s = JsonConvert.DeserializeObject<MyTest>(data, conv);
Console.WriteLine(s.enc.WebName);

The result of serializing it is now

{
  "enc": "us-ascii"
}

And because it has this code

public override bool CanConvert(Type objectType)
{
    return (typeof(Encoding).IsAssignableFrom(objectType));
}

it means it will convert just the Encoding, not other types.

And the result is much easier for someone to edit.

Moral: when serializing, aim for a simple string that can be edited. Do not stay with the defaults!

 

(You can easily achieve backwards compatibility for already serialized Encoding data by checking the token type:

if (reader.TokenType == JsonToken.StartObject)
{
    webName = reader.Value?.ToString();
    //handling old data format for encoding
    while (reader.TokenType != JsonToken.EndObject)
    {
        if (!reader.Read())
            break;
        if (reader.TokenType != JsonToken.PropertyName)
            continue;
        var val = reader.Value?.ToString();
        if (string.Compare("webname", val, StringComparison.InvariantCultureIgnoreCase) == 0)
        {
            webName = reader.ReadAsString();
            //do not break - advance reading to the end
            //break;
        }
    }

}

)
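(One more caveat, not in the original post: on .NET Core, Encoding.GetEncoding resolves only a handful of web names out of the box. For the other IANA names you must register the code-pages provider first, from the System.Text.Encoding.CodePages package:

// without this registration, Encoding.GetEncoding("windows-1250") throws on .NET Core
Encoding.RegisterProvider(CodePagesEncodingProvider.Instance);
Encoding enc = Encoding.GetEncoding("windows-1250");
Console.WriteLine(enc.WebName); // windows-1250

)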

C# integration testing in Azure DevOps with Docker containers – SqlServer and Cachet example

Every piece of software that we make depends on others. For Stankins, as a general ETL tool, it is all the more important to be tested against real data providers. For example, we may want to take data from Sql Server and send it to Cachet. How can we have a SqlServer and a Cachet up and running easily? The obvious answer these days is Docker.

Let’s see how a test for SqlServer looks:

using FluentAssertions;
using Stankins.Alive;
using Stankins.Interfaces;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Xbehave;
using Xunit;

namespace StankinsTestXUnit
{
    [Trait("ReceiverSqlServer", "")]
    [Trait("ExternalDependency","SqlServer")]
    public class TestReceiverSqlServer
    {
        [Scenario]
        [Example("Server=(local);Database=master;User Id=SA;Password = <YourStrong!Passw0rd>;")]
        public void TestReceiverDBServer(string connectionString)
        {
            IReceive status = null;
            IDataToSent data = null;
            $"Assume Sql Server instance {connectionString} exists , if not see docker folder".w(() => {

            });
            $"When I create the ReceiverDBServer ".w(() => status = new ReceiverDBSqlServer(connectionString));
            $"and receive data".w(async () =>
            {
                data = await status.TransformData(null);
            });
            $"the data should have a table".w(() =>
            {
                data.DataToBeSentFurther.Count.Should().Be(1);
            });
            $"and the result should be true".w(() =>
            {
                data.DataToBeSentFurther[0].Rows[0]["IsSuccess"].Should().Be(true);
            });


        }
    }
}

and for Cachet:



using FluentAssertions;
using Stankins.FileOps;
using Stankins.Interfaces;
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Stankins.Rest;
using Xbehave;
using Xunit;
using static System.Environment;
using Stankins.Trello;
using Stankins.Cachet;

namespace StankinsTestXUnit
{
    [Trait("Cachet", "")]
    [Trait("ExternalDependency", "Cachet")]
    public class TestSenderCachet
    {
        [Scenario]
        [Example("Assets/JSON/CachetV1Simple.txt", 3)]
        public void TestSimpleJSON(string fileName,int NumberRows)
        {
            IReceive receiver = null;
           
            IDataToSent data=null;
            var nl = Environment.NewLine;
            $"Given the file {fileName}".w(() =>
            {
                File.Exists(fileName).Should().BeTrue();
            });
            $"When I create the {nameof(ReceiveRest)} for the {fileName}".w(() => receiver = new ReceiveRestFromFile(fileName));
            $"And I read the data".w(async () =>data= await receiver.TransformData(null));
            $"Then should be a data".w(() => data.Should().NotBeNull());
            $"With a table".w(() =>
            {
                data.DataToBeSentFurther.Should().NotBeNull();
                data.DataToBeSentFurther.Count.Should().Be(1);
            });
            $"The number of rows should be {NumberRows}".w(() => data.DataToBeSentFurther[0].Rows.Count.Should().Be(NumberRows));
            $"and now I transform with {nameof(SenderCachet)}".w(async ()=>
                data=await new SenderCachet("http://localhost:8000","5DiHQgKbsJqck4TWhMVO").TransformData(data)
            );

        } 

    }
}

(I have used XBehave for the scenario extensions.)

Nice and easy, right? Not so!

To get SqlServer up and running I have used a docker-compose file:

version: '3'
services:
   db:
     image: mcr.microsoft.com/mssql/server
     ports:
       - "1433:1433"
     environment:
       SA_PASSWORD: "<YourStrong!Passw0rd>"
       ACCEPT_EULA: "Y"
     healthcheck:
       test: sqlcmd -S (local) -U SA -P '<YourStrong!Passw0rd>' -Q 'select 1'

and in the Azure DevOps YAML I start the containers, run the tests, collect the code coverage, and stop the containers:

docker-compose -f stankinsv2/solution/StankinsV2/StankinsTestXUnit/Docker/docker-sqlserver-instance-linux.yaml up -d

echo 'start regular test'
dotnet build -c $(buildConfiguration) stankinsv2/solution/StankinsV2/StankinsV2.sln
dotnet test stankinsv2/solution/StankinsV2/StankinsTestXUnit/StankinsTestXUnit.csproj --logger trx --logger "console;verbosity=normal" --collect "Code coverage"

echo 'coverlet'
coverlet stankinsv2/solution/StankinsV2/StankinsTestXUnit/bin/$(buildConfiguration)/netcoreapp2.2/StankinsTestXUnit.dll --target "dotnet" --targetargs "test stankinsv2/solution/StankinsV2/StankinsTestXUnit/StankinsTestXUnit.csproj --configuration $(buildConfiguration) --no-build" --format opencover --exclude "[xunit*]*"

echo 'compose down'
docker-compose -f stankinsv2/solution/StankinsV2/StankinsTestXUnit/Docker/docker-sqlserver-instance-linux.yaml down
        

Easy, right? That’s because SqlServer is well behaved and has a fully functional Docker image.

That is not so easy with Cachet. Cachet requires configuration – and more than that, after the configuration it generates a random token for writing data (the "5DiHQgKbsJqck4TWhMVO" passed to SenderCachet above).

So it becomes a task for docker: export the configured container and import it again – easy stuff, right? Again, no.

So I started a small docker container with

docker run -p 8000:8000 --name myCachetContainer -e APP_KEY=base64:ybug5it9Koxwhfi5a6CORbWdpjVqXxkz/Tyj4K45GKc= -e DEBUG=false -e DB_DRIVER=sqlite cachethq/docker

and then, browsing to http://localhost:8000, I configured it and grabbed the token.

Now it is time to export:

docker export myCachetContainer -o cachet.tar

And to import as an image

docker import cachet.tar  mycac

And to run the image again

docker run -p 8000:8000 -e APP_KEY=base64:ybug5it9Koxwhfi5a6CORbWdpjVqXxkz/Tyj4K45GKc= -e DEBUG=false -e DB_DRIVER=sqlite mycac

And the container stopped! After many tries and a docker inspect of the initial image, I ended up with

docker run -it -p 8000:8000 -e APP_KEY=base64:ybug5it9Koxwhfi5a6CORbWdpjVqXxkz/Tyj4K45GKc= -e DEBUG=false -e DB_DRIVER=sqlite --workdir /var/www/html --user 1001:1001 mycac "/sbin/entrypoint.sh"

So the workdir, the user and the entrypoint are not copied into the imported image – you must set them yourself.
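(A possible shortcut, assuming the same metadata: docker import can re-apply those settings at import time with --change, instead of passing them to every docker run – a sketch:

docker import --change 'ENTRYPOINT ["/sbin/entrypoint.sh"]' --change 'WORKDIR /var/www/html' --change 'USER 1001' cachet.tar mycac

)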

The final preparation for CI with Docker for Cachet? I have pushed the image to Docker Hub with docker push, and I will run it from docker-compose.

So now my docker-compose with SqlServer and Cachet looks this way:

version: '3'
services:
   db:
     image: mcr.microsoft.com/mssql/server
     ports:
       - "1433:1433"
     environment:
       SA_PASSWORD: "<YourStrong!Passw0rd>"
       ACCEPT_EULA: "Y"
     healthcheck:
       test: sqlcmd -S (local) -U SA -P '<YourStrong!Passw0rd>' -Q 'select 1'

   cachet:
     image: ignatandrei/ci_cachet
     ports:
       - "8000:8000"
     environment:
       APP_KEY: "base64:ybug5it9Koxwhfi5a6CORbWdpjVqXxkz/Tyj4K45GKc="
       DEBUG: "false"
       DB_DRIVER: "sqlite"
     user: "1001"
     working_dir: "/var/www/html"
     entrypoint: "/sbin/entrypoint.sh"

And I have nice C# integration tests with Azure DevOps, Docker, Sql Server and Cachet! You can see the code coverage report at https://codecov.io/gh/ignatandrei/stankins/src/master/stankinsv2/solution/StankinsV2/Stankins.Cachet/SenderCachet.cs

Create a new exception – add fields and/or properties

This post is not about why we need custom exceptions (https://blogs.msdn.microsoft.com/jaredpar/2008/10/20/custom-exceptions-when-should-you-create-them/). It is (more of a rant) about a specific item in the best practices for exceptions (https://docs.microsoft.com/en-us/dotnet/standard/exceptions/best-practices-for-exceptions).

It says:

In custom exceptions, provide additional properties as needed

Provide additional properties for an exception (in addition to the custom message string) only when there’s a programmatic scenario where the additional information is useful. For example, the FileNotFoundException provides the FileName property.

What I want to add: in EVERY exception that you create in code, DEFINE a custom field. It is useless without one!

Why this rant? In Stankins I have to intercept KeyNotFoundException and I want to find the key that was not found (a problem with some dictionary) in order to provide it (yes, it is a flawed design – but that is not the point here). The problem was the definition:

public class KeyNotFoundException : SystemException, ISerializable
{
    public KeyNotFoundException();

    public KeyNotFoundException(string message);

    public KeyNotFoundException(string message, Exception innerException);

    protected KeyNotFoundException(SerializationInfo info, StreamingContext context);
}

See the problem? There is no way to find WHAT key was not found. So I ended up with this code:

name = innerKeyEx.Message;
// The given key 'nameColumn' was not present in the dictionary.
var first = name.IndexOf("'");
var last = name.IndexOf("'", first + 1);
name = name.Substring(first + 1, last - first - 1);
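What I would have wanted instead is something like this – a minimal sketch, with a name and property of my own invention:

using System;

// a custom exception that carries the failing key as a property,
// so callers read ex.Key instead of parsing the message text
public class KeyNotFoundWithKeyException : Exception
{
    public string Key { get; }

    public KeyNotFoundWithKeyException(string key)
        : base($"The given key '{key}' was not present in the dictionary.")
    {
        Key = key;
    }
}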

Moral of the post? Do NOT define a custom exception without defining a field / property inside!

OpenSource library – publishing

Following the rules at https://docs.microsoft.com/en-us/dotnet/standard/library-guidance/publish-nuget-package

Nr | Recommendation | AOP Roslyn
1  | DO publish stable packages and pre-release packages you want community feedback on to NuGet.org. | Done
2  | CONSIDER publishing pre-release packages to a MyGet feed from a continuous integration build. | No
3  | CONSIDER testing packages in your development environment using a local feed or MyGet. Check the package works then publish it to NuGet.org. | Yes
4  | DO use a Microsoft account to sign in to NuGet. | Yes
5  | DO enable two-factor authentication for accessing NuGet. | Yes
6  | DO enable email notification when a package is published. | Yes

For 1: done already. However, I do not like pre-release packages.

For 2: too much hassle for a single developer. I am using NuGet.org instead.

For 3: yes, I already have a build.bat that does this.

For 4: yes (NuGet only allows that now).

For 5: yes.

For 6: yes.

OpenSource library – strong naming

Strong naming is something that I have not usually done, so it will be interesting to follow https://docs.microsoft.com/en-us/dotnet/standard/library-guidance/strong-naming.

Nr | Recommendation | AOP Roslyn
1  | CONSIDER strong naming your library's assemblies. | No – see below
2  | CONSIDER adding the strong naming key to your source control system. | Easy – no
3  | CONSIDER incrementing the assembly version on only major version changes to help users reduce binding redirects, and how often they're updated. | No
4  | DO NOT add, remove, or change the strong naming key. | Easy – no
5  | DO NOT publish strong-named and non-strong-named versions of your library. | Easy – no

I have tried to add signing – it is pretty easy in Visual Studio to generate a new pfx file to sign the code. However, when compiling, it requires all dependencies to be signed:

CSC : error CS8002: Referenced assembly 'PortableConsoleLibs, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null' does not have a strong name. [C:\projects\aop-with-roslyn\AOPRoslyn\aopCmd\aop.csproj]

So, if one of your referenced libraries is not strong-named, it is a big no.

The other requirements are pretty easy.
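For reference, enabling the signing outside Visual Studio is just two csproj properties (a sketch – the key file name is mine):

<PropertyGroup>
  <SignAssembly>true</SignAssembly>
  <AssemblyOriginatorKeyFile>MyKey.snk</AssemblyOriginatorKeyFile>
</PropertyGroup>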

My Async Await tutorials

Rule of thumb: just async / await from top to bottom – see the sketch below.
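A minimal sketch of what "top to bottom" means (the file name is hypothetical) – every caller in the chain is async and awaits, with no .Result or .Wait() anywhere:

using System;
using System.IO;
using System.Threading.Tasks;

class AsyncTopToBottom
{
    // the lowest level is async...
    static async Task<string> ReadFirstLineAsync(string path)
    {
        using (var reader = new StreamReader(path))
        {
            return await reader.ReadLineAsync();
        }
    }

    // ...and so is every caller, all the way up to Main
    static async Task Main()
    {
        string line = await ReadFirstLineAsync("data.txt"); // hypothetical file
        Console.WriteLine(line);
    }
}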

 

To deeply understand async / await in .NET Core, please go through the following resources:

 

1. https://channel9.msdn.com/Events/TechDays/Techdays-2014-the-Netherlands/Async-programming-deep-dive – to gain inner knowledge about what async / await really does

2. Read https://blog.stephencleary.com/2012/02/async-and-await.html to get started with async / await

3. Read MSDN for a better understanding: https://msdn.microsoft.com/en-us/magazine/jj991977.aspx?f=255&MSPPError=-2147217396

4. Common pitfalls in ASP.NET Framework (not in console!) with async / await: https://blog.stephencleary.com/2012/07/dont-block-on-async-code.html

5. No such problem in ASP.NET Core: https://blog.stephencleary.com/2017/03/aspnetcore-synchronization-context.html

Happy reading !

Connection strings to config

I had the opportunity to work on some pretty old code. There were many projects, all of which had some sort of connection string to the database. This kind of code was in more than 11 places:

string connectionstring = "Data Source=.\\SqlServer;Initial Catalog=myDB;Integrated Security=SSPI;";

The task was to modify this into something that could read from a config (json) file. But more than that, I had to verify:
1. that it is called everywhere – because sometimes the string was in a method, other times in a static variable and so on;
2. whether the setting exists or not (if not, return the default string – but log the fact that it is not in the settings).

So – how do you monitor that your code is actually hit? One idea is to set breakpoints. Another is to make a class. So I came up with this one:

using System;
using System.Configuration;
using System.Diagnostics;
using System.Runtime.CompilerServices;

public class ConnectionSql
{
    public ConnectionSql(
        [CallerMemberName] string memberName = "",
        [CallerFilePath] string sourceFilePath = "",
        [CallerLineNumber] int sourceLineNumber = 0)
    {
        // trace every place that constructs this class, with the exact call site
        Trace.WriteLine("member name: " + memberName);
        Trace.WriteLine("source file path: " + sourceFilePath);
        Trace.WriteLine("source line number: " + sourceLineNumber);
    }

    //TODO: if we need speed, make this static - or cache strConnection...
    public string ConnectionString()
    {
        var strConnection = ConfigurationManager.AppSettings["sql"];
        if (string.IsNullOrWhiteSpace(strConnection))
        {
            Console.WriteLine("connection sql string not found in settings, going to default");
            strConnection = "Data Source=.\\SqlServer;Initial Catalog=myDB;Integrated Security=SSPI;";
        }
        return strConnection;
    }
}

 

How do we call it?


string connectionstring = new ConnectionSql().ConnectionString();

 

Because of the caller information attributes (https://docs.microsoft.com/en-us/dotnet/csharp/programming-guide/concepts/caller-information) we can figure out, by looking at the trace, whether the code was hit and from which line.
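(For the json part, a minimal sketch with Microsoft.Extensions.Configuration – the file name and the "sql" key are my assumptions, and it needs the Microsoft.Extensions.Configuration.Json package:

using System;
using Microsoft.Extensions.Configuration;

// reads the "sql" key from an appsettings.json like: { "sql": "Data Source=..." }
var config = new ConfigurationBuilder()
    .SetBasePath(AppContext.BaseDirectory)
    .AddJsonFile("appsettings.json", optional: true)
    .Build();

string strConnection = config["sql"]; // null when missing - fall back to the default, as above

)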

Circular references in .NET, Entity Framework and WebAPI

Imagine having a class Department (ID, Name) and Employee (ID, Name, IDDepartment). You want the WebAPI to return the Departments with their Employees. It is simple to write this code:


[HttpGet]
public IEnumerable<Department> Get([FromServices] MyTestDatabaseContext context)
{
    var departments = context.Department.Include(iterator => iterator.Employee).ToArray();
    return departments;
}

But the problem is the circular references when serializing: the Department has a list of Employees, each Employee has a Department, which has a list of Employees, which…

 

 

Solution 1: delete/comment out from the EF generated code what you do not need (in this case, the Department reference that the Employee has):


public partial class Employee
{
    public int Idemployee { get; set; }
    public string NameEmployee { get; set; }
    public int Iddepartment { get; set; }

    //public Department IddepartmentNavigation { get; set; }
}

Solution 2: nullify the back reference:


[HttpGet]
public IEnumerable<Department> Get([FromServices] MyTestDatabaseContext context)
{
    var departments = context.Department.Include(iterator => iterator.Employee).ToArray();
    foreach (var dep in departments)
    {
        foreach (var emp in dep.Employee)
        {
            emp.IddepartmentNavigation = null;
        }
    }
    return departments;
}

(Maybe this should be added to the partial class of the Employee?)

 

 

Solution 3: handle circular references in the serializer settings:


services
    .AddMvc()
    .AddJsonOptions(opt =>
    {
        opt.SerializerSettings.PreserveReferencesHandling = Newtonsoft.Json.PreserveReferencesHandling.Objects;
        opt.SerializerSettings.ReferenceLoopHandling = Newtonsoft.Json.ReferenceLoopHandling.Ignore;
    });

Solution 4: make a real DDD design, with distinct models. Read https://siderite.dev/2018/08/client-models-vs-data-transfer-objects.html – a sketch follows.
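A minimal sketch of Solution 4 (the DTO shapes and the Department property names are my guesses, following the scaffolding convention above):

using System.Collections.Generic;
using System.Linq;

public class EmployeeDto
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class DepartmentDto
{
    public int Id { get; set; }
    public string Name { get; set; }
    public List<EmployeeDto> Employees { get; set; }
}

[HttpGet]
public IEnumerable<DepartmentDto> Get([FromServices] MyTestDatabaseContext context)
{
    // project the entities into flat DTOs - nothing circular is left to serialize
    return context.Department
        .Select(d => new DepartmentDto
        {
            Id = d.Iddepartment,     // guessed property name
            Name = d.NameDepartment, // guessed property name
            Employees = d.Employee.Select(e => new EmployeeDto
            {
                Id = e.Idemployee,
                Name = e.NameEmployee
            }).ToList()
        })
        .ToArray();
}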
