Category: .NET

Exchange rates–what I have done in 37 hours–part 38

What I have created so far in 37 hours:

  1. A source control – https://github.com/ignatandrei/InfoValutar
  2. A plugin-based software – you can use it to load any kind of exchange rates, from anywhere, provided that you implement the interface – see the implementation
  3. Tests for some of the code
  4. Deployment:
  5. A SqlServer database – to store the data
  6. An Azure Function – https://azurefuncloaddata20191205080713.azurewebsites.net/ – to load data at cron-based time intervals
  7. A GitHub action to compile and run the tests – https://github.com/ignatandrei/InfoValutar/actions
  8. An AzureDevOps CI + CD to do all the things in 1-6, plus code coverage and deployment – https://dev.azure.com/ignatandrei0674/InfoValutar/_build?definitionId=5&_a=summary

 

I did say it is nice work for 37 hours, right?

Infovalutar

And one hour passes...
(This is the result of the 1 hour per day self-challenge of being a full cycle developer for an exchange rates application)
( You can see the sources at https://github.com/ignatandrei/InfoValutar/ )
Nr  Post
1   Start
2   Reading NBR from internet
3   Source control and build
4   Badge and test
5   CI and action
6   Artifacts and dotnet try
7   Docker with .NET Try
8   ECB
9   Intermezzo - Various implementations for programmers
10  Intermezzo - similar code - options
11  Plugin implementation
12  GUI for console
13  WebAPI
14  Plugin in .NET Core 3
15  Build and Versioning
16  Add swagger
17  Docker - first part
18  Docker - second part
19  Docker - build Azure
20  Pipeline send to Docker Hub
21  Play with Docker - online
22  Run VSCode and Docker
23  Deploy Azure
24  VSCode see tests and powershell
25  Code Coverage
26  Database in Azure
27  Sql In Memory or Azure
28  Azure ConString, RSS
29  Middleware for backward compatibility
30  Identical Tables in EFCore
31  Multiple Data in EFCore
32  Dot net try again
33  Start Azure Function
34  Azure function - deploy
35  Solving my problems
36  IAsyncEnumerable transformed to IEnumerable and making Azure Functions works
37  Azure functions - final
38  Review of 37 hours
39  Last Commit in AzureDevOps
40  Create Angular WebSite
41  Add static Angular to WebAPI .NET Core
42  Docker for Angular
43  Angular and CORS
44  SSL , VSCode, Docker
45  Routing in Angular
46  RxJS for Routing
47  RxJs Unsubscribe

C# WebAPI and NotFound with message in MVC .NET 4.5

Usually, a WebAPI should return the correct HTTP status code (read also https://damienfremont.com/2017/11/23/rest-api-maturity-levels-from-0-to-5/). Let's say we are saving an entity – the OK result could return a meaningful message. How about querying for an id that you cannot find? Easy: NotFound. But the NotFound result does NOT carry a message. How can a client tell the difference between the entity not existing for a specific id and, say, the whole site not being deployed or a route not existing? What we need is a NotFound with a specific message. (In .NET Core we have NotFoundObjectResult.)

I have found 2 solutions: one complicated and one simple.

The complicated one:

//build an HttpResponseMessage with a 404 status and a message in the body
var err = Request.CreateErrorResponse(
    HttpStatusCode.NotFound,
    new HttpError("my message")
);
return new ResponseMessageResult(err);

The simple one

//this is for APIController only, not usual controller in .NET 4.5
return Content(HttpStatusCode.NotFound, "my message");
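
For context, here is a minimal sketch of how the simple variant could sit in an ApiController action (the ProductsController and its in-memory store are made up, just for illustration):

using System.Collections.Generic;
using System.Net;
using System.Web.Http;

public class ProductsController : ApiController
{
    //hypothetical in-memory store, just for illustration
    private static readonly Dictionary<int, string> Products =
        new Dictionary<int, string> { { 1, "first product" } };

    // GET api/products/5
    public IHttpActionResult Get(int id)
    {
        string product;
        if (!Products.TryGetValue(id, out product))
        {
            //404 plus a readable message in the response body
            return Content(HttpStatusCode.NotFound, "No product with id " + id);
        }
        return Ok(product);
    }
}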

Enjoy!

Use the right language for the job–if you know

Many years ago I made a site that lists the exchange rates from BNR and BCE (www.infovalutar.ro). It reads the exchange rates from the BNR HTML page and puts them into a database. It is intended for programmers – there are many ways to get the latest exchange rates (RSS, SOAP, URL, MVC, JSON… – see them all at http://infovalutar.ro/programatori).

I keep the values in a simple table like this:

IDMoneda    Valoare    DataValorii
EUR    3.5842    2006-07-07 00:00:00
EUR    3.5892    2006-06-29 00:00:00
EUR    3.5887    2006-07-13 00:00:00
EUR    3.5942    2006-07-19 00:00:00
EUR    3.5642    2006-07-20 00:00:00
EUR    3.5672    2006-07-24 00:00:00

As you can see, the dates are not contiguous – on weekends there is no new exchange rate.

Now, the problem: how can we detect whether there is an error in parsing the values? The usual error is getting different values from NBR (maybe they also have some CSV file to parse?). For example, take this:

IDMoneda    Valoare    DataValorii
RSD    0.0396    2019-01-17 00:00:00
RSD    0.0397    2019-01-18 00:00:00
RSD    0.0398   2019-01-21 00:00:00
RSD    3.98E-06    2019-01-22 00:00:00
RSD    4.02E-06    2019-01-23 00:00:00
RSD    4.02E-06    2019-01-25 00:00:00

How can we find where the difference starts for a currency? (We assume a relatively stable Romanian economy… not like Venezuela.)

The answer to where the problem starts is relatively easy to calculate: the percentage by which the value has varied: (CurrentValue – Last_Value) / Last_Value * 100. It should not exceed 1%, or at most 2%. For the RSD data above, (3.98E-06 – 0.0398) / 0.0398 * 100 is roughly –99.99%, which flags the 2019-01-22 row immediately.

But how to calculate this ?

For C# – or any other OOP language – the natural way is to read record by record, ordered by IDMoneda and DataValorii, and calculate the percentage. However, C# is not made for this kind of calculation.
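
For comparison, here is a minimal C# sketch of that record-by-record approach (the Rate record and the sample values are invented for illustration):

using System;
using System.Collections.Generic;
using System.Linq;

record Rate(string IdMoneda, decimal Valoare, DateTime DataValorii);

class Program
{
    static void Main()
    {
        var rates = new List<Rate>
        {
            new("RSD", 0.0398m, new DateTime(2019, 1, 21)),
            new("RSD", 0.00000398m, new DateTime(2019, 1, 22)),
        };

        //pair each value with the previous one per currency, ordered by date
        var suspicious = rates
            .GroupBy(r => r.IdMoneda)
            .SelectMany(g =>
            {
                var ordered = g.OrderBy(r => r.DataValorii).ToList();
                return ordered.Zip(ordered.Skip(1), (prev, cur) => new
                {
                    cur.IdMoneda,
                    cur.DataValorii,
                    Percent = Math.Abs(cur.Valoare - prev.Valoare) / prev.Valoare * 100
                });
            })
            .Where(x => x.Percent > 2); //more than 2% variation is suspect

        foreach (var x in suspicious)
            Console.WriteLine($"{x.IdMoneda} {x.DataValorii:yyyy-MM-dd} varied {x.Percent:F2}%");
    }
}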

Then we notice there is a database involved – and we think about SQL. It looks like complicated SQL, because the dates are not contiguous (because of the weekends). However, there is a simple SQL construct, LAG: https://docs.microsoft.com/en-us/sql/t-sql/functions/lag-transact-sql?view=sql-server-2016

select
    abs(LAG(Valoare, 1, 0) over (partition by IDMoneda order by DataValorii) - Valoare) / Valoare * 100,
    Valoare, IDMoneda, DataValorii -- currency and date included to locate the problem
from CV_Valori
order by 1 desc

This query executes in seconds and we can see the problems fast.

So – if you know the tool, please use it in your current task!

(If you ask me about functional programming (F#), my answer will be: DSL.)

Dotnet Try

I have written a blog post about DotNet CLI Tools.
I did not mention one that is super important: DotNet Try: https://github.com/dotnet/try

You can see at https://github.com/dotnet/try how to install it .

Or, to use it in Docker, check my files that install it in Docker: https://github.com/ignatandrei/Presentations/tree/master/2019/shorts/NetCoreGlobalTools/dotnetTry . To edit files in Docker, use Visual Studio Code with the Remote Development and Docker extensions.

To see it in action on the web, check out Noda Time.

(For the moment it is in preview, but it will work with GitHub and blogs…)

[PostEvent] Talks by Softbinator

The organizers of Talks by Softbinator were kind enough to select me for a presentation – again, my presentation about .NET Core and Angular Everywhere. Source code at https://github.com/ignatandrei/angNetCoreDemo/

The presentation was supposed to take half an hour – it took one hour with all the explanations.

Video at https://www.facebook.com/softbinator/videos/459024768242391

Correct abstraction–.NET Core IFileProvider

Create the right abstraction and others will implement it. I was delighted to find that .NET Core has an IFileProvider: https://docs.microsoft.com/en-us/dotnet/api/microsoft.extensions.fileproviders.ifileprovider?view=aspnetcore-2.2

An obvious implementation is PhysicalFileProvider : https://docs.microsoft.com/en-us/dotnet/api/microsoft.extensions.fileproviders.physicalfileprovider?view=aspnetcore-2.2

A not so obvious implementation, but normal from a programmer's point of view, is NullFileProvider: https://docs.microsoft.com/en-us/dotnet/api/microsoft.extensions.fileproviders.nullfileprovider?view=aspnetcore-2.2

And, because we already have 2 providers, a CompositeFileProvider makes sense: https://docs.microsoft.com/en-us/dotnet/api/microsoft.extensions.fileproviders.compositefileprovider?view=aspnetcore-2.2

And because we create assemblies, it is natural to have an EmbeddedFileProvider: https://docs.microsoft.com/en-us/dotnet/api/microsoft.extensions.fileproviders.embeddedfileprovider?view=aspnetcore-2.2

And, to complicate things, a ManifestEmbeddedFileProvider: https://docs.microsoft.com/en-us/dotnet/api/microsoft.extensions.fileproviders.manifestembeddedfileprovider?view=aspnetcore-2.2

( You can read more details here: https://docs.microsoft.com/en-us/aspnet/core/fundamentals/file-providers?view=aspnetcore-2.2#physicalfileprovider)
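
As a quick illustration, here is a minimal sketch of composing providers (assuming the Microsoft.Extensions.FileProviders.Physical, .Embedded and .Composite packages; readme.txt is a made-up file name):

using System;
using System.Reflection;
using Microsoft.Extensions.FileProviders;

class Program
{
    static void Main()
    {
        //files from the application folder on disk
        var physical = new PhysicalFileProvider(AppContext.BaseDirectory);
        //files embedded as resources in this assembly
        var embedded = new EmbeddedFileProvider(Assembly.GetExecutingAssembly());
        //composite: asks each provider in turn, first match wins
        var composite = new CompositeFileProvider(physical, embedded);

        var info = composite.GetFileInfo("readme.txt");
        Console.WriteLine(info.Exists
            ? $"found readme.txt, {info.Length} bytes"
            : "readme.txt not found");
    }
}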

But this is just what Microsoft provides in .NET Core.

Others have created more, such as:

LongFileProvider: https://github.com/v-kabanov/hdrepository/blob/master/trunk/Common/bfs.Repository.IO.WinNtfs/WinLongFileFrovider.cs

S3FileProvider: https://github.com/evorine/S3FileProvider/blob/master/src/S3FileProvider.cs, https://github.com/lamondlu/AWSS3FileProvider

AzureBlobFileProvider: https://github.com/filipw/Strathweb.AspNetCore.AzureBlobFileProvider

More Cloud Providers : https://github.com/jiabiao/NCloudFiles

ZipFileProviders: https://github.com/tagcode/Lexical.FileProvider , https://github.com/cloudscribe/cloudscribe/blob/master/src/cloudscribe.Web.Common/StaticFiles/GzipMappingFileProvider.cs

DatabaseFileProviders: https://github.com/mikebrind/RazorEngineViewOptionsFileProviders

And InMemory: https://github.com/dazinator/Dazinator.AspNet.Extensions.FileProviders

It is interesting how many different implementations you can find of such a common thing as a file / folder. But it is also rewarding to see that when you have created the right abstraction, others will implement it!

Simple serialize of encoding

My problem was serializing an Encoding. Let's suppose that we have a class that has an Encoding property (maybe used to read a file).


internal class MyTest
{
    public MyTest()
    {
        enc = ASCIIEncoding.ASCII;
    }
    public Encoding enc { get; set; }
}


We want to serialize this class in order to let the administrator / people decide what the encoding will be.

When we serialize it (with Newtonsoft.Json, obviously), we obtain this kind of data:

{
    "enc": {
        "IsSingleByte": true,
        "BodyName": "us-ascii",
        "EncodingName": "US-ASCII",
        "HeaderName": "us-ascii",
        "WebName": "us-ascii",
        "WindowsCodePage": 1252,
        "IsBrowserDisplay": false,
        "IsBrowserSave": false,
        "IsMailNewsDisplay": true,
        "IsMailNewsSave": true,
        "EncoderFallback": {
            "DefaultString": "?",
            "MaxCharCount": 1
        },
        "DecoderFallback": {
            "DefaultString": "?",
            "MaxCharCount": 1
        },
        "IsReadOnly": true,
        "CodePage": 20127
    }
}

This is too much for someone to edit. We want a simple string that can be edited easily – and the WebName, which is, in fact, a string from https://www.iana.org/assignments/character-sets/character-sets.xhtml, seems the obvious choice.

So I have written a JsonConverter class just for this property. It is very simple:

public class JsonEncodingConverter : JsonConverter
{
    public override bool CanConvert(Type objectType)
    {
        return (typeof(Encoding).IsAssignableFrom(objectType));
    }

    public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
    {
        //the encoding was stored as its IANA WebName, e.g. "us-ascii"
        string webName = "";
        if (reader.TokenType == JsonToken.String)
        {
            webName = reader.Value?.ToString();
        }
        existingValue = Encoding.GetEncoding(webName);

        return existingValue;
    }

    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
    {
        //write just the WebName instead of the whole Encoding object
        string webName = (value as Encoding).WebName;
        serializer.Serialize(writer, webName);
    }
}

And it can be used very easily:

MyTest m = new MyTest();
JsonEncodingConverter[] conv = new[] { new JsonEncodingConverter() };
string original = JsonConvert.SerializeObject(m, Formatting.Indented);
string data = JsonConvert.SerializeObject(m, Formatting.Indented, conv);
//https://www.iana.org/assignments/character-sets/character-sets.xhtml
Console.WriteLine(data);
Console.WriteLine("and now the original");
Console.WriteLine(original);
MyTest s = JsonConvert.DeserializeObject<MyTest>(data, conv);
Console.WriteLine(s.enc.WebName);

The result of serializing it is now

{
    "enc": "us-ascii"
}

And because it has this code

public override bool CanConvert(Type objectType)
{
    return (typeof(Encoding).IsAssignableFrom(objectType));
}

it means it will serialize just the Encoding, not other types.

And it is easier for someone to edit.

Moral: a simple, editable string is achievable when serializing. Do not stay with the defaults!

 

(You can easily achieve backwards compatibility for already serialized Encoding data by checking:

if (reader.TokenType == JsonToken.StartObject)
{
    webName = reader.Value?.ToString();
    //handling old data format for encoding
    while (reader.TokenType != JsonToken.EndObject)
    {
        if (!reader.Read())
            break;
        if (reader.TokenType != JsonToken.PropertyName)
            continue;
        var val = reader.Value?.ToString();
        if (string.Compare("webname", val, StringComparison.InvariantCultureIgnoreCase) == 0)
        {
            webName = reader.ReadAsString();
            //do not break - advance reading to the end
            //break;
        }
    }

}

)
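
As a side note, here is a minimal sketch (my addition, using the standard Newtonsoft.Json settings hook) of registering the converter globally, so it does not have to be passed to every call:

using Newtonsoft.Json;

//every subsequent JsonConvert.SerializeObject / DeserializeObject call
//will pick up JsonEncodingConverter automatically
JsonConvert.DefaultSettings = () => new JsonSerializerSettings
{
    Converters = { new JsonEncodingConverter() }
};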

C# integration testing in AzureDevOps with Docker containers– SqlServer and Cachet example

Every software that we make depends on others. For Stankins, as a general ETL tool, it is important that it is tested with real data providers. For example, we may want to take data from SQL Server and send it to Cachet. How can we have a SqlServer and a Cachet up and running easily? The obvious answer these days is Docker.

Let's see how a test for SqlServer looks:

using FluentAssertions;
using Stankins.Alive;
using Stankins.Interfaces;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Xbehave;
using Xunit;

namespace StankinsTestXUnit
{
    [Trait("ReceiverSqlServer", "")]
    [Trait("ExternalDependency","SqlServer")]
    public class TestReceiverSqlServer
    {
        [Scenario]
        [Example("Server=(local);Database=master;User Id=SA;Password = <YourStrong!Passw0rd>;")]
        public void TestReceiverDBServer(string connectionString)
        {
            IReceive status = null;
            IDataToSent data = null;
            $"Assume Sql Server instance {connectionString} exists , if not see docker folder".w(() => {

            });
            $"When I create the ReceiverDBServer ".w(() => status = new ReceiverDBSqlServer(connectionString));
            $"and receive data".w(async () =>
            {
                data = await status.TransformData(null);
            });
            $"the data should have a table".w(() =>
            {
                data.DataToBeSentFurther.Count.Should().Be(1);
            });
            $"and the result should be true".w(() =>
            {
                data.DataToBeSentFurther[0].Rows[0]["IsSuccess"].Should().Be(true);
            });


        }
    }
}

and for Cachet:



using FluentAssertions;
using Stankins.FileOps;
using Stankins.Interfaces;
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Stankins.Rest;
using Xbehave;
using Xunit;
using static System.Environment;
using Stankins.Trello;
using Stankins.Cachet;

namespace StankinsTestXUnit
{
    [Trait("Cachet", "")]
    [Trait("ExternalDependency", "Cachet")]
    public class TestSenderCachet
    {
        [Scenario]
        [Example("Assets/JSON/CachetV1Simple.txt", 3)]
        public void TestSimpleJSON(string fileName,int NumberRows)
        {
            IReceive receiver = null;
           
            IDataToSent data=null;
            var nl = Environment.NewLine;
            $"Given the file {fileName}".w(() =>
            {
                File.Exists(fileName).Should().BeTrue();
            });
            $"When I create the {nameof(ReceiveRest)} for the {fileName}".w(() => receiver = new ReceiveRestFromFile(fileName));
            $"And I read the data".w(async () =>data= await receiver.TransformData(null));
            $"Then should be a data".w(() => data.Should().NotBeNull());
            $"With a table".w(() =>
            {
                data.DataToBeSentFurther.Should().NotBeNull();
                data.DataToBeSentFurther.Count.Should().Be(1);
            });
            $"The number of rows should be {NumberRows}".w(() => data.DataToBeSentFurther[0].Rows.Count.Should().Be(NumberRows));
            $"and now I transform with {nameof(SenderCachet)}".w(async ()=>
                data=await new SenderCachet("http://localhost:8000","5DiHQgKbsJqck4TWhMVO").TransformData(data)
            );

        } 

    }
}

(I have used Xbehave for the extensions.)

Nice and easy, right? Not so!

To get SqlServer up and running I have used a docker-compose file:

version: '3'
services:
  db:
    image: mcr.microsoft.com/mssql/server
    ports:
      - "1433:1433"
    environment:
      SA_PASSWORD: "<YourStrong!Passw0rd>"
      ACCEPT_EULA: "Y"
    healthcheck:
      test: sqlcmd -S (local) -U SA -P '<YourStrong!Passw0rd>' -Q 'select 1'

and in the AzureDevOps YAML we start the containers, run the tests, collect the code coverage, and stop the containers:

docker-compose -f stankinsv2/solution/StankinsV2/StankinsTestXUnit/Docker/docker-sqlserver-instance-linux.yaml up -d

echo 'start regular test'
dotnet build -c $(buildConfiguration) stankinsv2/solution/StankinsV2/StankinsV2.sln
dotnet test stankinsv2/solution/StankinsV2/StankinsTestXUnit/StankinsTestXUnit.csproj --logger trx --logger "console;verbosity=normal" --collect "Code coverage"

echo 'coverlet'
coverlet stankinsv2/solution/StankinsV2/StankinsTestXUnit/bin/$(buildConfiguration)/netcoreapp2.2/StankinsTestXUnit.dll --target "dotnet" --targetargs "test stankinsv2/solution/StankinsV2/StankinsTestXUnit/StankinsTestXUnit.csproj --configuration $(buildConfiguration) --no-build" --format opencover --exclude "[xunit*]*"

echo 'compose down'
docker-compose -f stankinsv2/solution/StankinsV2/StankinsTestXUnit/Docker/docker-sqlserver-instance-linux.yaml down

Easy, right? That's because SqlServer is well behaved and has a fully functional Docker image.

That is not so easy with Cachet. Cachet requires configuration – and more, after configuration it generates a random token for writing data (the "5DiHQgKbsJqck4TWhMVO" passed to SenderCachet in the test above).

So the task will be to docker export the container and import it again – easy stuff, right? Again, not.

So I started a small Docker container with:

docker run -p 8000:8000 --name myCachetContainer -e APP_KEY=base64:ybug5it9Koxwhfi5a6CORbWdpjVqXxkz/Tyj4K45GKc= -e DEBUG=false -e DB_DRIVER=sqlite cachethq/docker

and then, browsing to http://localhost:8000, I configured it and grabbed the token.

Now it is time to export:

docker export myCachetContainer -o cachet.tar

And to import it as an image:

docker import cachet.tar  mycac

And to run the imported image again:

docker run -p 8000:8000 -e APP_KEY=base64:ybug5it9Koxwhfi5a6CORbWdpjVqXxkz/Tyj4K45GKc= -e DEBUG=false -e DB_DRIVER=sqlite mycac

And the container stopped! After many tries and a docker inspect of the initial image, I arrived at:

docker run -it -p 8000:8000 -e APP_KEY=base64:ybug5it9Koxwhfi5a6CORbWdpjVqXxkz/Tyj4K45GKc= -e DEBUG=false -e DB_DRIVER=sqlite --workdir /var/www/html --user 1001:1001 mycac "/sbin/entrypoint.sh"

So the workdir, the user, and the entrypoint are not copied into the imported image – you should set them yourself.

The final preparation for CI with Docker for Cachet? I have pushed the image to Docker Hub with docker push, and I will run it from docker-compose.

So now my docker-compose with SqlServer and Cachet looks this way:

version: '3'
services:
  db:
    image: mcr.microsoft.com/mssql/server
    ports:
      - "1433:1433"
    environment:
      SA_PASSWORD: "<YourStrong!Passw0rd>"
      ACCEPT_EULA: "Y"
    healthcheck:
      test: sqlcmd -S (local) -U SA -P '<YourStrong!Passw0rd>' -Q 'select 1'

  cachet:
    image: ignatandrei/ci_cachet
    ports:
      - "8000:8000"
    environment:
      APP_KEY: "base64:ybug5it9Koxwhfi5a6CORbWdpjVqXxkz/Tyj4K45GKc="
      DEBUG: "false"
      DB_DRIVER: "sqlite"
    user: "1001"
    working_dir: "/var/www/html"
    entrypoint: "/sbin/entrypoint.sh"

And I have nice C# integration tests with Azure DevOps, Docker, SqlServer and Cachet! You can see the code coverage report at https://codecov.io/gh/ignatandrei/stankins/src/master/stankinsv2/solution/StankinsV2/Stankins.Cachet/SenderCachet.cs

Create a new exception–add fields and/or properties

This post is not about why we need custom exceptions (https://blogs.msdn.microsoft.com/jaredpar/2008/10/20/custom-exceptions-when-should-you-create-them/). It is (more of a rant) about a specific item in the best practices for exceptions (https://docs.microsoft.com/en-us/dotnet/standard/exceptions/best-practices-for-exceptions).

It says:

In custom exceptions, provide additional properties as needed

Provide additional properties for an exception (in addition to the custom message string) only when there's a programmatic scenario where the additional information is useful. For example, the FileNotFoundException provides the FileName property.

What I want to add: in EVERY exception that you create in code, DEFINE a custom field. It is useless without one!

Why this rant? In Stankins I had to intercept KeyNotFoundException and I wanted to find the key that was not found (a problem with some dictionary) in order to provide it. (Yes, it is a flawed design – but this is not the point here.) The problem was the definition:

public class KeyNotFoundException : SystemException, ISerializable
{
    public KeyNotFoundException();
    public KeyNotFoundException(string message);
    public KeyNotFoundException(string message, Exception innerException);
    protected KeyNotFoundException(SerializationInfo info, StreamingContext context);
}

See the problem? There is no way to find WHAT key was not found. So I ended up with this code:

name = innerKeyEx.Message;
//the message looks like: The given key 'nameColumn' was not present in the dictionary.
//so extract the text between the first pair of single quotes
var first = name.IndexOf("'");
var last = name.IndexOf("'", first + 1);
name = name.Substring(first + 1, last - first - 1);

Moral of the post? Do NOT define a custom exception without defining a field / property inside!
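
For illustration, here is a minimal sketch of what such an exception could look like (the type name is invented for this example; it is not part of the BCL):

using System;

//hypothetical custom exception that carries the missing key,
//so callers never have to parse the message text
public class KeyNotFoundWithKeyException : Exception
{
    public string Key { get; }

    public KeyNotFoundWithKeyException(string key)
        : base($"The given key '{key}' was not present in the dictionary.")
    {
        Key = key;
    }
}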

OpenSource library- publishing

Following the rules at https://docs.microsoft.com/en-us/dotnet/standard/library-guidance/publish-nuget-package:

Nr | Recommendation | AOP Roslyn
1 | DO publish stable packages and pre-release packages you want community feedback on to NuGet.org. | Done
2 | CONSIDER publishing pre-release packages to a MyGet feed from a continuous integration build. | No
3 | CONSIDER testing packages in your development environment using a local feed or MyGet. Check the package works, then publish it to NuGet.org. | Yes
4 | DO use a Microsoft account to sign in to NuGet. | Yes
5 | DO enable two-factor authentication for accessing NuGet. | Yes
6 | DO enable email notification when a package is published. | Yes

For 1: Done already. However, I do not like pre-release packages.

For 2: Too much hassle for a single developer. Using NuGet instead.

For 3: Yes, I already have a build.bat to do this.

For 4: Yes. (NuGet only allows that now.)

For 5: Yes.

For 6: Yes.
