Category: azure

DotNet CLI Tools

There is a revival of CLI tools, and .NET is riding the wave.

You can find the tools installed with .NET Core here, and here


Usually you do not need them, since Visual Studio exposes all of this from the GUI.

However, you can make your own tool; the instructions are here:

But, before re-inventing the wheel, take a look at the list here

For Azure DevOps CI I have used:



Did you use some CLI tools?

C# integration testing in Azure DevOps with Docker containers: SqlServer and Cachet example

Every piece of software that we make depends on others. For Stankins, a general ETL tool, it is all the more important to be tested against real data providers. For example, we may want to take data from SQL Server and send it to Cachet. How can we have a SqlServer and a Cachet instance up and running easily? The obvious answer these days is Docker.

Let's see how a test for SqlServer looks:

using FluentAssertions;
using Stankins.Alive;
using Stankins.Interfaces;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Xbehave;
using Xunit;

namespace StankinsTestXUnit
{
    [Trait("ReceiverSqlServer", "")]
    public class TestReceiverSqlServer
    {
        [Example("Server=(local);Database=master;User Id=SA;Password = <YourStrong!Passw0rd>;")]
        public void TestReceiverDBServer(string connectionString)
        {
            IReceive status = null;
            IDataToSent data = null;
            // some step bodies below were reconstructed; they were lost in formatting
            $"Assume Sql Server instance {connectionString} exists, if not see docker folder".w(() => { });
            $"When I create the ReceiverDBServer".w(() => status = new ReceiverDBSqlServer(connectionString));
            $"and receive data".w(async () => data = await status.TransformData(null));
            $"the data should have a table".w(() => data.DataToBeSentFurther.Count.Should().BeGreaterThan(0));
            $"and the result should be true".w(() => data.Should().NotBeNull());
        }
    }
}

and for Cachet:

using FluentAssertions;
using Stankins.FileOps;
using Stankins.Interfaces;
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Stankins.Rest;
using Xbehave;
using Xunit;
using static System.Environment;
using Stankins.Trello;
using Stankins.Cachet;

namespace StankinsTestXUnit
{
    [Trait("Cachet", "")]
    [Trait("ExternalDependency", "Cachet")]
    public class TestSenderCachet
    {
        [Example("Assets/JSON/CachetV1Simple.txt", 3)]
        public void TestSimpleJSON(string fileName, int NumberRows)
        {
            IReceive receiver = null;
            IDataToSent data = null;
            var nl = Environment.NewLine;
            // some step bodies below were reconstructed; they were lost in formatting
            $"Given the file {fileName}".w(() => File.Exists(fileName).Should().BeTrue());
            $"When I create the {nameof(ReceiveRest)} for the {fileName}".w(() => receiver = new ReceiveRestFromFile(fileName));
            $"And I read the data".w(async () => data = await receiver.TransformData(null));
            $"Then should be a data".w(() => data.Should().NotBeNull());
            $"With a table".w(() => data.DataToBeSentFurther.Count.Should().BeGreaterThan(0));
            $"The number of rows should be {NumberRows}".w(() => data.DataToBeSentFurther[0].Rows.Count.Should().Be(NumberRows));
            $"and now I transform with {nameof(SenderCachet)}".w(async () =>
                data = await new SenderCachet("http://localhost:8000", "5DiHQgKbsJqck4TWhMVO").TransformData(data));
        }
    }
}


(I have used XBehave for the step extensions.)

Nice and easy, right? Not so!

To get SqlServer up and running I have used a docker-compose file:

version: '3'
services:
  sqlserver:
    image: microsoft/mssql-server-linux   # image name assumed; lost in formatting
    ports:
      - "1433:1433"
    environment:
      SA_PASSWORD: "<YourStrong!Passw0rd>"
      ACCEPT_EULA: "Y"
    healthcheck:
      test: sqlcmd -S (local) -U SA -P '<YourStrong!Passw0rd>' -Q 'select 1'

and in the Azure DevOps YAML I start the containers, run the tests, collect the code coverage, and stop the containers:

docker-compose -f stankinsv2/solution/StankinsV2/StankinsTestXUnit/Docker/docker-sqlserver-instance-linux.yaml up -d

echo 'start regular test'
dotnet build -c $(buildConfiguration) stankinsv2/solution/StankinsV2/StankinsV2.sln
dotnet test stankinsv2/solution/StankinsV2/StankinsTestXUnit/StankinsTestXUnit.csproj --logger trx --logger "console;verbosity=normal" --collect "Code coverage"
echo 'coverlet'
coverlet stankinsv2/solution/StankinsV2/StankinsTestXUnit/bin/$(buildConfiguration)/netcoreapp2.2/StankinsTestXUnit.dll --target "dotnet" --targetargs "test stankinsv2/solution/StankinsV2/StankinsTestXUnit/StankinsTestXUnit.csproj --configuration $(buildConfiguration) --no-build" --format opencover --exclude "[xunit*]*"
echo 'compose down'
docker-compose -f stankinsv2/solution/StankinsV2/StankinsTestXUnit/Docker/docker-sqlserver-instance-linux.yaml down
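For reference, in azure-pipelines.yml these commands sit inside script steps; a minimal sketch (the step names and the buildConfiguration variable are my assumptions):

```yaml
# minimal sketch of the pipeline steps; displayName values are my own
steps:
  - script: docker-compose -f stankinsv2/solution/StankinsV2/StankinsTestXUnit/Docker/docker-sqlserver-instance-linux.yaml up -d
    displayName: start containers
  - script: |
      dotnet build -c $(buildConfiguration) stankinsv2/solution/StankinsV2/StankinsV2.sln
      dotnet test stankinsv2/solution/StankinsV2/StankinsTestXUnit/StankinsTestXUnit.csproj --logger trx --collect "Code coverage"
    displayName: build and test
  - script: docker-compose -f stankinsv2/solution/StankinsV2/StankinsTestXUnit/Docker/docker-sqlserver-instance-linux.yaml down
    displayName: stop containers
    condition: always()
```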

Easy, right? That's because SqlServer is well behaved and has a fully functional Docker image.

That is not so easy with Cachet. Cachet requires configuration; moreover, after configuration it generates a random token for writing data (the "5DiHQgKbsJqck4TWhMVO" used against http://localhost:8000 in the test above).

So the task becomes: configure a container once, then have Docker export it and import it again. Easy stuff, right? Again, no.

So I started a small docker container with

docker run -p 8000:8000 --name myCachetContainer -e APP_KEY=base64:ybug5it9Koxwhfi5a6CORbWdpjVqXxkz/Tyj4K45GKc= -e DEBUG=false -e DB_DRIVER=sqlite cachethq/docker

and then, browsing to http://localhost:8000, I configured Cachet and grabbed the token.

Now it is time to export:

docker export myCachetContainer -o cachet.tar

And to import as an image

docker import cachet.tar  mycac

And to run the image again

docker run -p 8000:8000 -e APP_KEY=base64:ybug5it9Koxwhfi5a6CORbWdpjVqXxkz/Tyj4K45GKc= -e DEBUG=false -e DB_DRIVER=sqlite mycac

And the image stopped! After many tries, and after running docker inspect on the initial image, I ended up with

docker run -it -p 8000:8000 -e APP_KEY=base64:ybug5it9Koxwhfi5a6CORbWdpjVqXxkz/Tyj4K45GKc= -e DEBUG=false -e DB_DRIVER=sqlite --workdir /var/www/html --user 1001:1001 mycac "/sbin/"

So the workdir, the user, and the entrypoint are not copied into the imported image; you have to set them yourself.
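That is expected behavior: docker export saves only the container's filesystem, and docker import creates an image with an empty configuration. Had I used docker commit instead, the image configuration (ENTRYPOINT, USER, WORKDIR) would have been preserved; a sketch of the two approaches, using the container and image names from above:

```shell
# export/import: filesystem only, image config is lost
docker export myCachetContainer -o cachet.tar
docker import cachet.tar mycac        # mycac has no entrypoint/user/workdir

# commit: filesystem AND config (entrypoint, user, workdir) are kept
docker commit myCachetContainer mycac
docker run -p 8000:8000 mycac
```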

The final preparation for CI with Docker for Cachet? I pushed the image to Docker Hub (docker push), and I run it from docker-compose.

So now my docker-compose file with SqlServer and Cachet looks this way:

version: '3'
services:
  sqlserver:
    image: microsoft/mssql-server-linux   # image name assumed; lost in formatting
    ports:
      - "1433:1433"
    environment:
      SA_PASSWORD: "<YourStrong!Passw0rd>"
      ACCEPT_EULA: "Y"
    healthcheck:
      test: sqlcmd -S (local) -U SA -P '<YourStrong!Passw0rd>' -Q 'select 1'

  cachet:
    image: ignatandrei/ci_cachet
    ports:
      - "8000:8000"
    environment:
      APP_KEY: "base64:ybug5it9Koxwhfi5a6CORbWdpjVqXxkz/Tyj4K45GKc="
      DEBUG: "false"
      DB_DRIVER: "sqlite"
    user: "1001"
    working_dir: "/var/www/html"
    entrypoint: "/sbin/"

And I have nice C# integration tests with Azure DevOps, Docker, SqlServer, and Cachet! You can see the code coverage report at

.NET Core Alphabet

What I wanted is a simple application (Web, Mobile, Desktop) that can list, alphabetically, the .NET Core keywords. What is the purpose?

  1. For interviews: suppose you want to test someone's knowledge of C#. You start the application (again: Desktop, Web, or Mobile) and let the candidate choose a letter. Then you see the keywords for that letter and ask the candidate to explain some of them.
  2. For remembering features: there are so many features in the .NET language ( ) that it is good for a programmer to know, or to revisit, the features that are in the language.
  3. For contests between programmers: like the interviews, but for passionate programmers who want an easy way to decide who has the best memory.
  4. Maybe other uses that I do not know of? Please share in the comments.

Now for the realization: what I want is a simple application that contains a database with the keywords, links, and anything else. From this database, the source code for the data will be generated, and then the application(s). Also, the data should be publicly available to profit from the power of the crowd: anyone who wants to add something can do so.
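For illustration, the keyword database could be as simple as a JSON array like the sketch below (the field names and links are my assumptions, not the actual netCoreAlphabet.json schema):

```json
[
  { "letter": "A", "keyword": "async/await", "link": "https://docs.microsoft.com/dotnet/csharp/async" },
  { "letter": "A", "keyword": "attributes", "link": "https://docs.microsoft.com/dotnet/csharp/attributes" },
  { "letter": "B", "keyword": "boxing", "link": "https://docs.microsoft.com/dotnet/csharp/boxing" }
]
```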

stankins.console execute -o ReceiveRestFromFile -a primaryData/netCoreAlphabet.json -o SenderToTypeScript -a "" -o TransformerConcatenateOutputString -a a.ts -o SenderOutputToFolder -a $(Build.ArtifactStagingDirectory)/data/ -a false

stankins.console execute -o ReceiveRestFromFile -a primaryData/netCoreAlphabet.json -o SenderToRazorFromFile -a primaryData/markdown.txt -o TransformerConcatenateOutputString -a -o SenderOutputToFolder -a $(Build.ArtifactStagingDirectory)/data/ -a false

And to complete all of this, it is put in an Azure DevOps pipeline.

You can see the result on Android: , and the website:

Also, if you want, please contribute by making a PR, by editing, or by helping to enhance the application by solving

MVC browser history provider for Azure: trying an implementation for 3 hours

First, implement IBrowserUserHistoryRepository; that means implementing:

public void Save(IEnumerable<BrowserUserHistoryData> history)


Azure tables have PartitionKey/RowKey, so I have to add a new class.

Also, for connecting, I have to put


connectionString="UseDevelopmentStorage=true;" /


I tried to add a bulk history:
The result was:
Unexpected response code for operation : 0
<add key="TableStorageEndpoint" value=""/>
And one hour was gone.
I ran DSInit to have the storage emulator running:
No connection could be made because the target machine actively refused it

I modified the code to the old Azure code:

Now the answer was:

One of the request inputs is out of range. – All letters in a container name must be lowercase.
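The naming rule behind this error can be checked before calling Azure at all. Below is a small sketch in Python (a hypothetical helper, not part of any Azure SDK) of the storage container naming constraints: 3-63 characters, lowercase letters, digits, and single dashes:

```python
import re

def is_valid_container_name(name):
    """Sketch of Azure storage container naming rules:
    3-63 chars, lowercase letters/digits/dashes, no leading,
    trailing, or consecutive dashes."""
    if not 3 <= len(name) <= 63:
        return False
    if not re.fullmatch(r"[a-z0-9-]+", name):
        return False
    return not name.startswith("-") and not name.endswith("-") and "--" not in name

print(is_valid_container_name("MyContainer"))  # False: uppercase letters
print(is_valid_container_name("mycontainer"))  # True
```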

Tried that; same result:

One of the request inputs is out of range.

Maybe the timestamp is wrong? No…

Now debug with Fiddler :



Added to connection string:


 And see this in Fiddler :



: base(UserName, UserName)
So the problem is that the RowKey does not support URL values.
Now, after removing the URL from the RowKey and putting the username instead, the error was:
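Table Storage keys reject a handful of characters ('/', '\', '#', '?'), which is why a URL cannot go into the RowKey as-is. Another option, besides replacing it with the username, would have been to hash the URL into a key-safe string; a hypothetical sketch:

```python
import hashlib

FORBIDDEN_KEY_CHARS = set('/\\#?')  # characters Azure Table keys reject

def url_to_row_key(url):
    """Hash a URL into a RowKey-safe hex string (a sketch, not the post's code)."""
    return hashlib.sha1(url.encode("utf-8")).hexdigest()

key = url_to_row_key("http://example.com/page#section")
print(all(c not in FORBIDDEN_KEY_CHARS for c in key))  # True: safe for a RowKey
```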

The specified entity already exists

Another hour passes


Now that it works, thinking about the RowKey and PartitionKey: not username + url; instead put date.ToString("yyyyMMdd_HHmmss_tttt")
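A second-granularity key such as yyyyMMdd_HHmmss collides whenever two rows are written within the same second, which is exactly the "entity already exists" symptom. A quick Python sketch of the collision, and of a finer-grained key that avoids it:

```python
from datetime import datetime, timedelta

def second_key(dt):
    # second-granularity key, analogous to "yyyyMMdd_HHmmss"
    return dt.strftime("%Y%m%d_%H%M%S")

def fine_key(dt):
    # microsecond-granularity key avoids same-second collisions
    return dt.strftime("%Y%m%d_%H%M%S_%f")

a = datetime(2012, 12, 20, 6, 40, 24, 100)
b = a + timedelta(microseconds=500)    # a second write, in the same second

print(second_key(a) == second_key(b))  # True: collision
print(fine_key(a) == fine_key(b))      # False: keys differ
```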

0:The specified entity already exists.

Oh no, not again?

Looking at the tables => 20121220_064024_AM; ok, it should be


0:The specified entity already exists

Again? Debug, please.

The real problem:

I had been sending the whole item history, not just the not-yet-saved ones…
Now it works, kind of.

Server Error in ‘/’ Application.

The method or operation is not implemented.

public IEnumerable<KeyValuePair<string, int>> MostUsed(int Count, DateTime? date)
{
    throw new NotImplementedException();
}
Implementing MostUsed(int Count, DateTime? date):
Research about filtering by date -
Research about GroupBy: not supported!
So now thinking about a way to STORE the data in a convenient format so that it can be retrieved…
It must take into consideration the count for a date and the count for all dates (date can be null), AND BOTH PER USER.
Time to think, because another hour has passed!
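Since Table Storage has no GROUP BY, one way out (a sketch of the idea in Python, not the final implementation) is to maintain counts at save time, per user and per (user, date), so MostUsed becomes a lookup plus a sort:

```python
from collections import defaultdict

class HistoryCounts:
    """Sketch: update per-user URL counters on save so 'most used'
    needs no server-side GROUP BY."""
    def __init__(self):
        self.by_user = defaultdict(lambda: defaultdict(int))       # user -> url -> count
        self.by_user_date = defaultdict(lambda: defaultdict(int))  # (user, date) -> url -> count

    def save(self, user, url, date):
        self.by_user[user][url] += 1
        self.by_user_date[(user, date)][url] += 1

    def most_used(self, user, count, date=None):
        table = self.by_user[user] if date is None else self.by_user_date[(user, date)]
        return sorted(table.items(), key=lambda kv: kv[1], reverse=True)[:count]

h = HistoryCounts()
h.save("andrei", "http://a", "20121220")
h.save("andrei", "http://a", "20121220")
h.save("andrei", "http://b", "20121221")
print(h.most_used("andrei", 1))  # [('http://a', 2)]
```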

Azure tools

Azure Storage Explorer: – like in VS, but simpler and cleaner.

Windows Azure ASP.NET Providers Sample: – utilities for quickly adding membership and roles. Small problem when running locally.

More samples here:

And that will be all, after reading the documentation and understanding the concepts (for example, if you understand the session problem in Azure, then you will find a Session provider in the samples and use it).


I am working on a new application (Azure + MVC + logging) and I am having a good time, when I know the resources. These are the steps that work for me:

  1. Azure SDK (it would be good if you have Sql Server). Now at version 1.6, download from
      1. Optional: run DSInit , , to modify the default SqlServer instance
  2. Windows Azure ASP.NET Providers Sample – good for quickly using membership in Azure.
      1. for local, the .config keys are

        <add key="AccountName" value="devstoreaccount1" />
        <add key="AccountSharedKey" value=""></add>
        <add key="BlobStorageEndpoint" value="" />
        <add key="TableStorageEndpoint" value="" />

      2. for azure, the .config keys are

        <add key="AccountName" value="…………………………" />
        <add key="AccountSharedKey" value="………………"></add>
        <add key="BlobStorageEndpoint" value="" />
        <add key="TableStorageEndpoint" value="" />

  3. Add Table Storage Provider – either from the samples, or from here  (read too)
      1. for local, the .config keys are

        <add key="BlobStorageEndpoint" value="" />
        <add key="TableStorageEndpoint" value="" />

      2. for azure, the .config keys are

        <add key="BlobStorageEndpoint" value="" />
        <add key="TableStorageEndpoint" value="" />

From now on, you have:

a) A Membership Provider, to create users

b) Azure Tables that can be queried with IQueryable.

This is all you need for now. Later you will ask about caching / session / application state. I will post about them when I have programmed more. However, for caching I would start at and for session at

Andrei Ignat weekly software news(mostly .NET)
