Category: azure

C# integration testing in Azure DevOps with Docker containers – SQL Server and Cachet example

Every piece of software that we make depends on others. For Stankins, a general ETL tool, it is even more important to be tested against real data providers. For example, we may want to take data from SQL Server and send it to Cachet. How can we get a SQL Server and a Cachet instance up and running easily? The obvious answer these days is Docker.

Let's see how a test for SQL Server looks:

using FluentAssertions;
using Stankins.Alive;
using Stankins.Interfaces;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Xbehave;
using Xunit;

namespace StankinsTestXUnit
{
    [Trait("ReceiverSqlServer", "")]
    [Trait("ExternalDependency","SqlServer")]
    public class TestReceiverSqlServer
    {
        [Scenario]
        [Example("Server=(local);Database=master;User Id=SA;Password = <YourStrong!Passw0rd>;")]
        public void TestReceiverDBServer(string connectionString)
        {
            IReceive status = null;
            IDataToSent data = null;
            $"Assume Sql Server instance {connectionString} exists , if not see docker folder".w(() => {

            });
            $"When I create the ReceiverDBServer ".w(() => status = new ReceiverDBSqlServer(connectionString));
            $"and receive data".w(async () =>
            {
                data = await status.TransformData(null);
            });
            $"the data should have a table".w(() =>
            {
                data.DataToBeSentFurther.Count.Should().Be(1);
            });
            $"and the result should be true".w(() =>
            {
                data.DataToBeSentFurther[0].Rows[0]["IsSuccess"].Should().Be(true);
            });


        }
    }
}

and for Cachet:

using FluentAssertions;
using Stankins.FileOps;
using Stankins.Interfaces;
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Stankins.Rest;
using Xbehave;
using Xunit;
using static System.Environment;
using Stankins.Trello;
using Stankins.Cachet;

namespace StankinsTestXUnit
{
    [Trait("Cachet", "")]
    [Trait("ExternalDependency", "Cachet")]
    public class TestSenderCachet
    {
        [Scenario]
        [Example("Assets/JSON/CachetV1Simple.txt", 3)]
        public void TestSimpleJSON(string fileName, int NumberRows)
        {
            IReceive receiver = null;
            IDataToSent data = null;
            var nl = Environment.NewLine;
            $"Given the file {fileName}".w(() =>
            {
                File.Exists(fileName).Should().BeTrue();
            });
            $"When I create the {nameof(ReceiveRest)} for the {fileName}".w(() => receiver = new ReceiveRestFromFile(fileName));
            $"And I read the data".w(async () =>data= await receiver.TransformData(null));
            $"Then should be a data".w(() => data.Should().NotBeNull());
            $"With a table".w(() =>
            {
                data.DataToBeSentFurther.Should().NotBeNull();
                data.DataToBeSentFurther.Count.Should().Be(1);
            });
            $"The number of rows should be {NumberRows}".w(() => data.DataToBeSentFurther[0].Rows.Count.Should().Be(NumberRows));
            $"and now I transform with {nameof(SenderCachet)}".w(async ()=>
                data=await new SenderCachet("http://localhost:8000","5DiHQgKbsJqck4TWhMVO").TransformData(data)
            );

        } 

    }
}

(I have used XBehave for the extension methods.)
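The w method is not part of XBehave itself; XBehave 2.x exposes x(...) step extensions on string, so w is presumably a thin alias over them. A minimal sketch of what such a wrapper could look like (my assumption, not the actual Stankins code):

using System;
using System.Threading.Tasks;
using Xbehave;

namespace StankinsTestXUnit
{
    // Hypothetical wrapper: XBehave 2.x defines the x(...) step extensions
    // on string; w(...) simply forwards to them so the steps above read
    // as $"...".w(() => ...).
    public static class XBehaveWrapperExtensions
    {
        public static void w(this string stepText, Action body)
            => stepText.x(body);

        public static void w(this string stepText, Func<Task> body)
            => stepText.x(body);
    }
}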

Nice and easy, right? Not so!

To get SQL Server up and running, I have used a Docker Compose file:

version: '3'
services:
   db:
     image: mcr.microsoft.com/mssql/server
     ports:
       - "1433:1433"
     environment:
       SA_PASSWORD: "<YourStrong!Passw0rd>"
       ACCEPT_EULA: "Y"
     healthcheck:
       test: /opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P '<YourStrong!Passw0rd>' -Q 'SELECT 1'
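Because the container can take a few seconds before it accepts logins, the tests can still race it even with the healthcheck. One way to guard against that is to poll the connection before the first test runs – a minimal sketch (the helper name is mine, not part of Stankins):

using System;
using System.Data.SqlClient;
using System.Threading;

public static class SqlServerReadiness
{
    // Hypothetical helper: keeps trying to open a connection until the
    // containerized SQL Server accepts logins or the retries run out.
    public static void WaitUntilAvailable(string connectionString,
        int maxRetries = 30, int delayMilliseconds = 2000)
    {
        for (var attempt = 1; attempt <= maxRetries; attempt++)
        {
            try
            {
                using (var connection = new SqlConnection(connectionString))
                {
                    connection.Open();
                    return; // server is up
                }
            }
            catch (SqlException) when (attempt < maxRetries)
            {
                // not ready yet; wait and retry
                Thread.Sleep(delayMilliseconds);
            }
        }
    }
}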

Then, in the Azure DevOps YAML pipeline, I start the containers, run the tests, collect the code coverage, and stop the containers:

docker-compose -f stankinsv2/solution/StankinsV2/StankinsTestXUnit/Docker/docker-sqlserver-instance-linux.yaml up -d

echo 'start regular test'

dotnet build -c $(buildConfiguration) stankinsv2/solution/StankinsV2/StankinsV2.sln

dotnet test stankinsv2/solution/StankinsV2/StankinsTestXUnit/StankinsTestXUnit.csproj --logger trx --logger "console;verbosity=normal" --collect "Code coverage"

echo 'coverlet'
coverlet stankinsv2/solution/StankinsV2/StankinsTestXUnit/bin/$(buildConfiguration)/netcoreapp2.2/StankinsTestXUnit.dll --target "dotnet" --targetargs "test stankinsv2/solution/StankinsV2/StankinsTestXUnit/StankinsTestXUnit.csproj --configuration $(buildConfiguration) --no-build" --format opencover --exclude "[xunit*]*"

echo 'compose down'
docker-compose -f stankinsv2/solution/StankinsV2/StankinsTestXUnit/Docker/docker-sqlserver-instance-linux.yaml down
        

Easy, right? That's because SQL Server is well behaved and has a fully functional official Docker image.

That is not so easy with Cachet. Cachet requires configuration – and more, after configuration it generates a random API token for writing data (the "5DiHQgKbsJqck4TWhMVO" passed to SenderCachet in the test above).
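For context, that token is what Cachet's v1 API expects in the X-Cachet-Token header on every write. A minimal sketch of such a call (the endpoint and payload are illustrative; the real SenderCachet implementation may differ):

using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class CachetApiSketch
{
    // Illustrative write: sets component 1 to status 1 (operational) on a
    // local Cachet instance, authenticating with the generated token.
    public static async Task UpdateComponentAsync(string baseUrl, string token)
    {
        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Add("X-Cachet-Token", token);
            var payload = new StringContent(
                "{\"status\":1}", Encoding.UTF8, "application/json");
            var response = await client.PutAsync(
                $"{baseUrl}/api/v1/components/1", payload);
            response.EnsureSuccessStatusCode();
        }
    }
}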

So the task becomes: configure the container once, export it with Docker, and import it again – easy stuff, right? Again, no.

So I started a small Docker container with:

docker run -p 8000:8000 --name myCachetContainer -e APP_KEY=base64:ybug5it9Koxwhfi5a6CORbWdpjVqXxkz/Tyj4K45GKc= -e DEBUG=false -e DB_DRIVER=sqlite cachethq/docker

and then, browsing to http://localhost:8000, I configured Cachet and grabbed the token.

Now it is time to export:

docker export myCachetContainer -o cachet.tar

And to import it as an image:

docker import cachet.tar  mycac

And to run the imported image again:

docker run -p 8000:8000 -e APP_KEY=base64:ybug5it9Koxwhfi5a6CORbWdpjVqXxkz/Tyj4K45GKc= -e DEBUG=false -e DB_DRIVER=sqlite mycac

And the container stopped right away! After many tries and a docker inspect of the initial image, I ended up with:

docker run -it -p 8000:8000 -e APP_KEY=base64:ybug5it9Koxwhfi5a6CORbWdpjVqXxkz/Tyj4K45GKc= -e DEBUG=false -e DB_DRIVER=sqlite --workdir /var/www/html --user 1001:1001 mycac "/sbin/entrypoint.sh"

So the working directory, the user, and the entrypoint are not copied into the exported image – you have to set them yourself.

The final preparation for CI with Docker for Cachet? I did a docker push of my image to Docker Hub, and I run it from Docker Compose.

So now my Docker Compose file with SQL Server and Cachet looks this way:

version: '3'
services:
   db:
     image: mcr.microsoft.com/mssql/server
     ports:
       - "1433:1433"
     environment:
       SA_PASSWORD: "<YourStrong!Passw0rd>"
       ACCEPT_EULA: "Y"
     healthcheck:
       test: /opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P '<YourStrong!Passw0rd>' -Q 'SELECT 1'

   cachet:
     image: ignatandrei/ci_cachet
     ports:
       - "8000:8000"
     environment:
       APP_KEY: "base64:ybug5it9Koxwhfi5a6CORbWdpjVqXxkz/Tyj4K45GKc="
       DEBUG: "false"
       DB_DRIVER: "sqlite"
     user: "1001"
     working_dir: "/var/www/html"
     entrypoint: "/sbin/entrypoint.sh"

And now I have nice C# integration tests with Azure DevOps, Docker, SQL Server and Cachet! You can see the code coverage report at https://codecov.io/gh/ignatandrei/stankins/src/master/stankinsv2/solution/StankinsV2/Stankins.Cachet/SenderCachet.cs

.NET Core Alphabet

What I wanted is a simple application (Web, Mobile, Desktop) that can list, alphabetically, the .NET Core keywords. What is the purpose?

  1. For interviews – suppose you want to test a candidate's knowledge of C#. You start the application (again: Desktop or Web or Mobile) and let the candidate choose a letter. Then you see the keywords for this letter and ask the candidate to explain some of them.
  2. For remembering features: there are so many features in the C# language ( https://docs.microsoft.com/en-us/dotnet/csharp/whats-new/csharp-version-history ) that it is good for a programmer to know – or to revisit – the features that are in the language.
  3. For contests between programmers – like the interviews, but for passionate programmers who want an easy way to decide who has the best memory.
  4. Maybe other uses that I do not know? Please share in the comments.

Now for the realization: what I want is a simple application that carries inside it the database of keywords, links, and anything else. From this database, the source files for the data are generated, and then the application(s) are generated. Also, the data should be publicly available to profit from the power of the crowd – anyone who wants to add something can do so. So I use Stankins itself to generate the TypeScript data and the markdown cards from the JSON file:

stankins.console execute -o ReceiveRestFromFile -a primaryData/netCoreAlphabet.json -o SenderToTypeScript -a "" -o TransformerConcatenateOutputString -a a.ts -o SenderOutputToFolder -a $(Build.ArtifactStagingDirectory)/data/ -a false

stankins.console execute -o ReceiveRestFromFile -a primaryData/netCoreAlphabet.json -o SenderToRazorFromFile -a primaryData/markdown.txt -o TransformerConcatenateOutputString -a cards.md -o SenderOutputToFolder -a $(Build.ArtifactStagingDirectory)/data/ -a false

And to tie all this together, it is put into an Azure DevOps pipeline: https://github.com/ignatandrei/netCoreAlphabet/blob/master/azure-pipelines.yml

You can see the result on Android: https://play.google.com/store/apps/details?id=com.github.ignatandrei.netcorealphabet&hl=en and on the website: https://ignatandrei.github.io/netCoreAlphabet

Also, if you want, please contribute by making a PR editing https://github.com/ignatandrei/netCoreAlphabet/blob/master/primaryData/netCoreAlphabet.json or by helping to enhance the application by solving https://github.com/ignatandrei/netCoreAlphabet/issues

MVC Browser history provider for Azure – trying an implementation for 3 hours

First, implement IBrowserUserHistoryRepository – that means implementing:

public void Save(IEnumerable<BrowserUserHistoryData> history)

 

Azure Table Storage has PartitionKey/RowKey – I have to add a new class.
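With the StorageClient SDK of that era, the new class derives from TableServiceEntity, which carries the PartitionKey/RowKey pair. A minimal sketch (the entity and property names are my assumption):

using System;
using Microsoft.WindowsAzure.StorageClient;

// Hypothetical entity: TableServiceEntity (old StorageClient SDK) supplies
// PartitionKey, RowKey and Timestamp; the constructor decides what goes
// into the two keys.
public class BrowserUserHistoryEntity : TableServiceEntity
{
    public BrowserUserHistoryEntity() { }

    public BrowserUserHistoryEntity(string userName, string rowKey)
        : base(userName, rowKey)
    {
    }

    public string Url { get; set; }
    public DateTime VisitedOn { get; set; }
}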

Also, for connecting, I have to put:

 

connectionString="UseDevelopmentStorage=true;" />

 

I tried to add a bulk history :
tableHistory.ExecuteBatch(batchOperation);
The result was:
Unexpected response code for operation : 0
Magic:
<add key="TableStorageEndpoint" value="http://127.0.0.1:1002/"/>
And one hour was gone.
Ran DSInit to get the storage emulator running:
No connection could be made because the target machine actively refused it 127.0.0.1:10002

Modified the code to use the old Azure code:

Now the answer was:

One of the request inputs is out of range.

http://msdn.microsoft.com/en-us/library/dd135715.aspx – All letters in a container name must be lowercase.

Tried that – same result:

One of the request inputs is out of range.

Maybe the timestamp is wrong? No…

Now debugging with Fiddler:

http://sepialabs.com/blog/2012/02/17/profiling-azure-storage-with-fiddler/


 

Added to connection string:

 

DevelopmentStorageProxyUri=http://ipv4.fiddler

And saw this in Fiddler:

<d:PartitionKey>zungb4ovunqjd5rtal5ytc3r</d:PartitionKey>

<d:RowKey>http://localhost:2728/</d:RowKey>

: base(UserName, UserName)
 
So the problem is that RowKey does not support URL values.
Now, after removing the URL from the RowKey and putting the username instead, the error was:

The specified entity already exists

Another hour passed.
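For the record, PartitionKey and RowKey reject the characters /, \, # and ? – which is why a raw URL fails. A small sanitizer is one way around it; a sketch rather than the fix I actually used:

// Hypothetical sanitizer: replaces the characters that Azure Table Storage
// forbids in PartitionKey/RowKey values (/, \, # and ?).
public static class KeySanitizer
{
    public static string ToSafeKey(string value)
    {
        var forbidden = new[] { '/', '\\', '#', '?' };
        foreach (var c in forbidden)
        {
            value = value.Replace(c, '_');
        }
        return value;
    }
}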

——————

Now that it works, thinking about the RowKey and PartitionKey: not username + URL => put date.ToString("yyyyMMdd_HHmmss_tttt")

0:The specified entity already exists.

Oh no, not again?

Looking at the tables => 20121220_064024_AM – the key only has one-second resolution, so it should be:

date.ToString("yyyyMMdd_HHmmss_ffffzzz")

0:The specified entity already exists

Again? Debug, please…

The real problem:

I was sending the whole items history, not just the not-yet-saved ones…
Now it works – kind of.

Server Error in ‘/’ Application.


The method or operation is not implemented.

public IEnumerable<KeyValuePair<string, int>> MostUsed(int Count, DateTime? date)
Line 80:         {
Line 81:             throw new NotImplementedException();
Line 82:         }
Line 83: 
-----------
Implementing MostUsed(int Count, DateTime? date).
Research about filtering by date – http://storageextensions.codeplex.com/SourceControl/changeset/view/81826#1914483
Research about GroupBy – not supported! http://msdn.microsoft.com/en-us/library/windowsazure/dd135725.aspx
So now thinking about a way to STORE the data in a format that is convenient to retrieve…
It must take into consideration the Count for a single date and the Count for all dates (date can be null) – AND the fact that the operation will be done per user.
Time to think – because another hour has passed!
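One shape that would fit those constraints is a pre-aggregated counter entity: PartitionKey = user, RowKey = date + URL, with a Count column incremented at save time. MostUsed then degrades to a partition scan ordered in memory. A sketch of the idea (all names are hypothetical):

using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical pre-aggregated counter row: since the table service cannot
// GROUP BY, each (user, date, url) combination keeps its own running count,
// updated when the history is saved.
public class UrlCounter
{
    public string UserName { get; set; }   // PartitionKey
    public string DateAndUrl { get; set; } // RowKey: "yyyyMMdd|url"
    public int Count { get; set; }
}

public static class MostUsedSketch
{
    // Loads the user's partition (optionally narrowed to one date),
    // groups by URL in memory and returns the top Count entries.
    public static IEnumerable<KeyValuePair<string, int>> MostUsed(
        IEnumerable<UrlCounter> userPartition, int count, DateTime? date)
    {
        var rows = userPartition;
        if (date.HasValue)
        {
            var prefix = date.Value.ToString("yyyyMMdd") + "|";
            rows = rows.Where(r => r.DateAndUrl.StartsWith(prefix));
        }
        return rows
            .GroupBy(r => r.DateAndUrl.Substring(r.DateAndUrl.IndexOf('|') + 1))
            .Select(g => new KeyValuePair<string, int>(g.Key, g.Sum(r => r.Count)))
            .OrderByDescending(kv => kv.Value)
            .Take(count);
    }
}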
 

Azure tools

Azure Storage Explorer: http://azurestorageexplorer.codeplex.com/ – like in VS, but simpler and cleaner.

Windows Azure ASP.NET Providers Sample: http://code.msdn.microsoft.com/windowsazure/Windows-Azure-ASPNET-03d5dc14 – utilities for quickly setting up membership and roles. Small problem when running locally.

More samples here: http://code.msdn.microsoft.com/windowsazure/

And that will be all, after reading the documentation and understanding the concepts (for example, if you understand the session problem in Azure, then you will find a session provider in the samples and use it).

Azure

I am working on a new application (Azure + MVC + logging) and I am having a good time – when I know the resources. These are the steps that work for me:

  1. Azure SDK (it would be good if you have SQL Server). Now version 1.6; download from http://www.microsoft.com/download/en/details.aspx?id=28045
      1. Optional: run DSInit, http://msdn.microsoft.com/en-us/library/windowsazure/gg433005.aspx, to modify the default SQL Server instance
  2. Windows Azure ASP.NET Providers Sample – good for quickly using membership in Azure.
      1. for local, the .config keys are

        <add key="AccountName" value="devstoreaccount1" />
            <add key="AccountSharedKey" value=""></add>
            <add key="BlobStorageEndpoint" value="http://127.0.0.1:10000/devstoreaccount1" />
            <add key="TableStorageEndpoint" value="http://127.0.0.1:10002/devstoreaccount1" />

      2. for azure, the .config keys are

           <add key="AccountName" value="…………………………" />
          <add key="AccountSharedKey" value="………………"></add>

          <add key="BlobStorageEndpoint" value="https://logcollectorazure.blob.core.windows.net" />
          <add key="TableStorageEndpoint" value="https://logcollectorazure.table.core.windows.net" />

  3. Add Table Storage Provider – either from the samples, or from here: http://blogs.msdn.com/b/jnak/archive/2010/01/06/walkthrough-windows-azure-table-storage-nov-2009-and-later.aspx (read http://blogs.msdn.com/b/jnak/archive/2008/10/28/walkthrough-simple-table-storage.aspx too)
      1. for local, the .config keys are

        <add key="BlobStorageEndpoint" value="http://127.0.0.1:10000/devstoreaccount1" />
        <add key="TableStorageEndpoint" value="http://127.0.0.1:10002/devstoreaccount1" />

      2. for azure, the .config keys are

        <add key="BlobStorageEndpoint" value="https://logcollectorazure.blob.core.windows.net" />
        <add key="TableStorageEndpoint" value="https://logcollectorazure.table.core.windows.net" />

From now on, you have:

a) a Membership Provider – to create users

b) Azure Tables that can be searched with IQueryable.
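For example, with the 1.x StorageClient SDK the IQueryable surface looks roughly like this (the entity and table names are mine):

using System.Linq;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

// Hypothetical entity for the sketch below.
public class VisitEntity : TableServiceEntity
{
    public string Url { get; set; }
}

public static class TableQuerySketch
{
    // The old SDK exposes tables through a TableServiceContext whose
    // CreateQuery<T> returns IQueryable<T>, so Where(...) is translated
    // into a table-service query on PartitionKey.
    public static void QueryByUser(CloudStorageAccount account, string user)
    {
        CloudTableClient tableClient = account.CreateCloudTableClient();
        TableServiceContext context = tableClient.GetDataServiceContext();

        var visits = context.CreateQuery<VisitEntity>("UserHistory")
            .Where(e => e.PartitionKey == user)
            .ToList();
    }
}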

This is all you need for now. Later you will ask about caching / session / application state. I will post about those when I have programmed more. However, for caching I would start at http://code.msdn.microsoft.com/windowsazure/Using-AppFabric-Cache-595454bb and for session at http://code.msdn.microsoft.com/windowsazure/Using-AppFabric-Cache-595454bb
