Category: azure

Adding database in Azure –part 26

Now getting the database – needed for historic data.

When I go to Create New Database in the Azure Portal, there are many types of databases (Mongo, Postgres, many others) – the easy choice, for me, is SQL Server (even if the tables will be barely updated – exchange rates do not change a lot). There is also something easy to deploy for a WebApp: WebApp + SQL.

But I will create a new one. Azure SQL Managed Instance – the price for 4 vCores is > 400 EUR. I sure do not want this…

So I decided to have SQL Server, with a database with 2 GB storage – a smaller price, about 4.21 EUR per month – for a normal person.

I can access it with SSMS to create the tables – and then I generate the script:

CREATE TABLE [dbo].[NBR](
[ExchangeFrom] [nvarchar](50) NOT NULL,
[ExchangeTo] [nvarchar](50) NOT NULL,
[Date] [date] NOT NULL,
[ExchangeValue] [decimal](18, 6) NOT NULL,
CONSTRAINT [PK_NBR] PRIMARY KEY CLUSTERED
(
[ExchangeFrom] ASC,
[ExchangeTo] ASC,
[Date] ASC
)WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]
GO

 

Create an ISave interface (trying to cope with CQRS).
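Something like this – a minimal sketch (the exact names are in the GitHub sources; ExchangeRate is my placeholder for the rate data class):

using System.Threading.Tasks;

public class ExchangeRate { /* placeholder for the real rate class */ }

// the write side: save the rates, return the number of saved records
public interface ISave
{
    Task<int> Save(ExchangeRate[] data);
}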

Now I want to scaffold the C# classes from the database, like in https://docs.microsoft.com/en-us/ef/core/miscellaneous/cli/dotnet

Installing:

dotnet tool install --global dotnet-ef

gives an error about versioning.

dotnet tool install --global dotnet-ef --version 3.0.0

Success!

Now trying to scaffold

dotnet ef dbcontext scaffold "Server=(localdb)\mssqllocaldb;Database=Blogging;Trusted_Connection=True;" Microsoft.EntityFrameworkCore.SqlServer -o Models

Error – it does not find a project.

OK. Run it in the folder with the project. Now it does not like .NET Standard – it wants .NET Core. Changed in the .csproj: <TargetFramework>netcoreapp3.0</TargetFramework>

Now it says:

Your startup project 'InfovalutarDB' doesn't reference Microsoft.EntityFrameworkCore.Design.

Adding the NuGet package.
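It can also be added from the command line:

dotnet add package Microsoft.EntityFrameworkCore.Design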

Now the scaffold command:

dotnet ef dbcontext scaffold "Server=(localdb)\mssqllocaldb;Database=Blogging;Trusted_Connection=True;" Microsoft.EntityFrameworkCore.SqlServer -o Models

Works!
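For the NBR table above, the scaffolding generates a DbContext plus one entity class per table – roughly like this (a sketch; the real generated file contains more configuration code):

using System;

public partial class NBR
{
    public string ExchangeFrom { get; set; }
    public string ExchangeTo { get; set; }
    public DateTime Date { get; set; }
    public decimal ExchangeValue { get; set; }
}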

Reading about storing connection strings safely:

http://go.microsoft.com/fwlink/?LinkId=723263

Figuring out a way to use either the in-memory database (for fast testing) or SQL Server:

var config = ConfigurationManager.ConnectionStrings["DB"];
var opt = new DbContextOptionsBuilder<InfoValutarContext>();
if (config == null)
{
    // no connection string configured => in-memory database, for fast tests
    opt.UseInMemoryDatabase("write");
}
else
{
    opt.UseSqlServer(config.ConnectionString);
}

var cnt = new InfoValutarContext(opt.Options);

 

So a new test is created:

ISave s = new SaveSqlServer();
var response = await File.ReadAllTextAsync(Path.Combine("Data", "20191020bnr.txt"));
var m = new MockHttpMessageHandler();
m.When("https://www.bnr.ro/nbrfxrates.xml")
    .Respond("application/text", response);

var nbr = new GetNBRExchange(m);
var data = await nbr.GetActualRates().ToArrayAsync();
var nr = await s.Save(data);
Assert.Equal(nr, data.Length);

 

And modified the Docker file to restore this new project.

Infovalutar

And one hour passes...
(This is the result of my 1-hour-per-day self-challenge as a full cycle developer for an exchange rates application)
( You can see the sources at https://github.com/ignatandrei/InfoValutar/ )
Posts in this series:

1. Start
2. Reading NBR from internet
3. Source control and build
4. Badge and test
5. CI and action
6. Artifacts and dotnet try
7. Docker with .NET Try
8. ECB
9. Intermezzo - Various implementations for programmers
10. Intermezzo - similar code - options
11. Plugin implementation
12. GUI for console
13. WebAPI
14. Plugin in .NET Core 3
15. Build and Versioning
16. Add swagger
17. Docker - first part
18. Docker - second part
19. Docker - build Azure
20. Pipeline send to Docker Hub
21. Play with Docker - online
22. Run VSCode and Docker
23. Deploy Azure
24. VSCode see tests and powershell
25. Code Coverage
26. Database in Azure
27. Sql In Memory or Azure
28. Azure ConString, RSS
29. Middleware for backward compatibility
30. Identical Tables in EFCore
31. Multiple Data in EFCore
32. Dot net try again
33. Start Azure Function
34. Azure function - deploy
35. Solving my problems
36. IAsyncEnumerable transformed to IEnumerable and making Azure Functions works
37. Azure functions - final
38. Review of 37 hours
39. Last Commit in AzureDevOps
40. Create Angular WebSite
41. Add static Angular to WebAPI .NET Core
42. Docker for Angular
43. Angular and CORS
44. SSL, VSCode, Docker
45. Routing in Angular
46. RxJS for Routing
47. RxJs Unsubscribe

Deploy to Azure–part 23

Create in Azure an AppService infovalutar with the InfovalutarRG resource group – it can be done like here:

https://docs.microsoft.com/en-us/azure/app-service/app-service-web-get-started-dotnet

or from the Azure Portal (create an AppService – I have put Linux with .NET Core 3.0).

Searching how to connect Azure DevOps – the connection should be added from Project Settings (a link at the bottom of the page).

Creating the service connection InfoValutarServiceConnection on Azure DevOps.

Saving the yaml files (with the inherent whitespace problems):

https://docs.microsoft.com/en-us/azure/devops/pipelines/tasks/deploy/azure-rm-web-app-deployment?view=azure-devops

Now it shows this error:

2019-11-28T19:02:29.1210252Z ##[error]Error Code: ERROR_DESTINATION_NOT_REACHABLE
More Information: Could not connect to the remote computer ("infovalutar.scm.azurewebsites.net"). On the remote computer, make sure that Web Deploy is installed and that the required process ("Web Management Service") is started.  Learn more at: http://go.microsoft.com/fwlink/?LinkId=221672#ERROR_DESTINATION_NOT_REACHABLE.

Good – but my site is https://infovalutar.azurewebsites.net/, not infovalutar.scm.azurewebsites.net (the .scm host is the Kudu deployment endpoint of the same web app).

Reading documentation from https://docs.microsoft.com/en-us/azure/devops/pipelines/tasks/deploy/azure-rm-web-app-deployment?view=azure-devops

Changing from

- task: AzureRmWebAppDeployment@3
  inputs:
    azureSubscription: 'InfoValutarServiceConnection'
    WebAppName: 'infovalutar'
    Package: $(System.ArtifactsDirectory)/*InfoValLinuxX64*.zip
    ResourceGroupName: 'infovalutarRG'

to

- task: AzureRmWebAppDeployment@4

and hoping for magic. (Also, I put in the docker job

dependsOn:
- Build_With_Test
condition: and(succeeded(), false)

in order to not execute the docker job for now – the condition is always false.)

And now the deployment works! See for yourself at https://infovalutar.azurewebsites.net/


Docker–fourth part–part 20

Time to push to Docker Hub.

1. Putting the docker password into the pipeline variables

2. Modifying the yaml

3. Waiting for the build

Error in Docker: Incorrect name or password.

Now, I want to create 2 jobs to isolate Docker from the main build.

I encountered some errors in the yaml file. They show up fast in Azure DevOps, but you must figure out what they mean by reading the indicated line:

/azure-pipelines.yml: (Line: 8, Col: 7, Idx: 271) - (Line: 8, Col: 7, Idx: 271): Mapping values are not allowed in this context.

/azure-pipelines.yml: (Line: 45, Col: 3, Idx: 1225) - (Line: 45, Col: 4, Idx: 1226): While parsing a block mapping, did not find expected key.

Job Build Docker has an invalid name. Valid names may only contain alphanumeric characters and '_' and may not start with a number.

(a name attribute was not indented, steps was missing before a script, the build name had spaces)

Now trying to build again. Error – because I have put the password variable as an Azure secret? Deleted the variable, put another variable with the same name, not secret. Works! (Secret variables are not exposed to scripts as environment variables automatically – they have to be mapped explicitly.)

Now I have on Docker Hub a new image, ignatandrei/infovalutar: https://hub.docker.com/r/ignatandrei/infovalutar

You can run it with:

docker run --rm -it -p 8080:8080 ignatandrei/infovalutar:latest

And then go to http://localhost:8080/swagger/

And that was all! ( pipeline definition at https://dev.azure.com/ignatandrei0674/InfoValutar/_build?definitionId=5&_a=summary )


Bingo for meetings- azure integrations–part 10

Bingo

Bingo is a small project, written in TypeScript, and developed with Alexandru Badita in lunch breaks (one hour - more or less). You can find the sources at https://github.com/alexandru360/PresentationBingoCards/ . These are my blog posts for Bingo (scroll below for the post):
1. Create meeting
2. Create Tests
3. Finalize Create meeting
4. Sharing meeting
5. Keep Score
6. Add obsolete
7. Finalizing obsolete
8. End meeting
9. Dockerize tests
10. Azure CI tests
11. Yarn workspaces
12. CLI
13. Intermezzo - CLI improvements
14. typescript compile run with node
15. NestJS, swagger and create a meeting
16. Finalizing API
17. Intermezzo - jest vs jasmine error
18. Refactor WebAPI and test service
19. Heroku Deploy NestJs
20. Angular
21. Deploy Angular to GitHub
22. WebAPI and Web
23. Documentation
24. Documentation of the code
25. Conclusions

Now it is about Continuous Integration. We want the tests to be run each time we push something to GitHub. For this we can use Azure DevOps – it is free for GitHub public repositories. We want to configure an Azure pipeline to automatically run the tests that we have in Docker.

So the pipeline will just have to explicitly gather the test results (tests + code coverage) in order to display them in the Azure pipeline and in the project. Azure DevOps wants the test coverage in JaCoCo or Cobertura format. Jest has Istanbul as the default coverage tool, and Istanbul has a Cobertura reporter. So we modify jest.config.js to support Cobertura:

module.exports = {
  preset: 'ts-jest',
  transform: {
    '^.+\\.tsx?$': 'ts-jest',
  },
  testEnvironment: 'node',
  collectCoverage: true,
  coverageReporters: ["json", "lcov", "text", "clover", "cobertura"]
};

And, when running the dockerized tests, copy the results to the local path:

docker build ../Src -f docker_ci_test.txt -t bingo_ci_test
docker run -d --rm --name bingo_ci_test_container bingo_ci_test
docker cp bingo_ci_test_container:/app/jest-stare .
docker cp bingo_ci_test_container:/app/junit.xml .
docker cp bingo_ci_test_container:/app/coverage/cobertura-coverage.xml .
docker container kill bingo_ci_test_container

And then publish them to the Azure DevOps test system:

#https://docs.microsoft.com/en-us/azure/devops/pipelines/build/options?view=vsts&tabs=yaml
variables:
  year: $(Date:yyyy)
  month: $(Date:MM)
  day: $(Date:dd)
  uk: $(Date:yyyyMMdd)
  messagePush: $(Build.SourceVersionMessage)

name: $(TeamProject)_$(BuildDefinitionName)_$(SourceBranchName)_$(Date:yyyyMMdd)$(Rev:.r)

jobs:
- job: FullTestOnLinux
  pool:
    vmImage: 'ubuntu-16.04'
  steps:
  - checkout: self  # skip checking out the default repository resource
    clean: true
  - script: |
      cd dockerize
      ls -l
      chmod 777 ./ci_test.bat
      ./ci_test.bat
      docker image ls
      docker container ls
      cp -r -v ./jest-stare $(Build.ArtifactStagingDirectory)/jest-stare/
      cp ./junit.xml $(Build.ArtifactStagingDirectory)/junit.xml
      cp ./cobertura-coverage.xml $(Build.ArtifactStagingDirectory)/cobertura-coverage.xml
    displayName: test DDD
  - task: PublishBuildArtifacts@1
    inputs:
      artifactName: Tests
    displayName: 'Publish Artifact: drop'
  - task: PublishTestResults@2
    inputs:
      testRunner: JUnit
      testResultsFiles: '$(Build.ArtifactStagingDirectory)/junit.xml'
  - task: PublishCodeCoverageResults@1
    inputs:
      codeCoverageTool: 'cobertura'
      summaryFileLocation: '$(Build.ArtifactStagingDirectory)/cobertura-coverage.xml'

You can see the tests and the test coverage at https://dev.azure.com/ignatandrei0674/BingoAzureDevOps/_build/results?buildId=953&view=ms.vss-test-web.build-test-results-tab

Bingo for meetings- dockerize tests–part 9


We now have the full DDD and the tests that should be run for the objects. However, we need a way to run the tests automatically. The easy way is to dockerize the tests – run them in a container, grab the results, display them somewhere.

First, we should have the tests display some data in a nice form. For this, jest has the "reporters" feature – but no documentation. So I searched and found https://github.com/dkelosky/jest-stare .

 

So what are the steps?

  1. Create a docker image from node
  2. Copy the sources (add a .dockerignore to not copy node_modules)
  3. Install the dependencies
  4. Run the tests
  5. Run the image in a container and grab the test results

 

The docker file, named docker_ci_test.txt, has the following content:

FROM node:8
WORKDIR /app
COPY . ./
RUN yarn
RUN yarn test --reporters default jest-stare
# keep the container alive, so the test results can be copied out
CMD tail -f /dev/null

The bat file that runs the image and grabs the results from the container:

docker build ../src -f docker_ci_test.txt -t bingo_ci_test
docker run -d --rm --name bingo_ci_test_container bingo_ci_test
docker cp bingo_ci_test_container:/app/jest-stare .
docker container kill bingo_ci_test_container

Feel free to download the project from https://github.com/alexandru360/PresentationBingoCards/ and run the ci_test.bat file from the dockerize folder.

DotNet CLI Tools

There is a revival of CLI tools. And dotnet is going with the wave.

You can find the tools installed with .NET Core here, https://docs.microsoft.com/en-us/dotnet/core/tools/?tabs=netcore2x , and here, https://docs.microsoft.com/en-us/dotnet/core/additional-tools/ .

 

Usually you do not need them – since Visual Studio offers all of this from the GUI.

However, you can make your own tool – the instructions are here: https://docs.microsoft.com/en-us/dotnet/core/tools/global-tools-how-to-create
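For the record, a minimal sketch of such a tool – a console application packed as a tool (the project and command names below are invented):

using System;

class Program
{
    // In the .csproj you add (real MSBuild properties):
    //   <PackAsTool>true</PackAsTool>
    //   <ToolCommandName>hello-net</ToolCommandName>   <- invented command name
    // then: dotnet pack -o ./nupkg
    // and:  dotnet tool install --global --add-source ./nupkg HelloNet
    static void Main(string[] args)
        => Console.WriteLine("Hello from my .NET CLI tool!");
}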

But, before re-inventing the wheel, take a look at the list here https://github.com/natemcmaster/dotnet-tools

I have used for AzureDevOps CI:

  1.  https://github.com/tonerdo/coverlet 
  2. https://github.com/ignatandrei/AOP_With_Roslyn
  3. https://github.com/loresoft/DotNet.Property
  4. https://github.com/SonarSource/sonar-scanner-msbuild
  5. https://github.com/KrystianKolad/DotnetThx
  6. https://github.com/danielpalme/ReportGenerator/
  7. https://github.com/Hubert-Rybak/dotnet-warp
  8. https://github.com/aspnet/AspNetCore/tree/master/src/Tools/dotnet-watch

 

Did you use some CLI tools?

C# integration testing in AzureDevOps with Docker containers– SqlServer and Cachet example

Every software that we make depends on others. For Stankins, as a general ETL tool, it is even more important to be tested with real data providers. For example, we may want to take data from SQL Server and send it to Cachet. How can we have a SqlServer and a Cachet up and running easily? The obvious answer these days is Docker.

Let’s see how a test for SqlServer looks

using FluentAssertions;
using Stankins.Alive;
using Stankins.Interfaces;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Xbehave;
using Xunit;

namespace StankinsTestXUnit
{
    [Trait("ReceiverSqlServer", "")]
    [Trait("ExternalDependency","SqlServer")]
    public class TestReceiverSqlServer
    {
        [Scenario]
        [Example("Server=(local);Database=master;User Id=SA;Password = <YourStrong!Passw0rd>;")]
        public void TestReceiverDBServer(string connectionString)
        {
            IReceive status = null;
            IDataToSent data = null;
            $"Assume Sql Server instance {connectionString} exists , if not see docker folder".w(() => {

            });
            $"When I create the ReceiverDBServer ".w(() => status = new ReceiverDBSqlServer(connectionString));
            $"and receive data".w(async () =>
            {
                data = await status.TransformData(null);
            });
            $"the data should have a table".w(() =>
            {
                data.DataToBeSentFurther.Count.Should().Be(1);
            });
            $"and the result should be true".w(() =>
            {
                data.DataToBeSentFurther[0].Rows[0]["IsSuccess"].Should().Be(true);
            });


        }
    }
}

And for Cachet:



using FluentAssertions;
using Stankins.FileOps;
using Stankins.Interfaces;
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Stankins.Rest;
using Xbehave;
using Xunit;
using static System.Environment;
using Stankins.Trello;
using Stankins.Cachet;

namespace StankinsTestXUnit
{
    [Trait("Cachet", "")]
    [Trait("ExternalDependency", "Cachet")]
    public class TestSenderCachet
    {
        [Scenario]
        [Example("Assets/JSON/CachetV1Simple.txt", 3)]
        public void TestSimpleJSON(string fileName,int NumberRows)
        {
            IReceive receiver = null;
           
            IDataToSent data=null;
            var nl = Environment.NewLine;
            $"Given the file {fileName}".w(() =>
            {
                File.Exists(fileName).Should().BeTrue();
            });
            $"When I create the {nameof(ReceiveRest)} for the {fileName}".w(() => receiver = new ReceiveRestFromFile(fileName));
            $"And I read the data".w(async () =>data= await receiver.TransformData(null));
            $"Then should be a data".w(() => data.Should().NotBeNull());
            $"With a table".w(() =>
            {
                data.DataToBeSentFurther.Should().NotBeNull();
                data.DataToBeSentFurther.Count.Should().Be(1);
            });
            $"The number of rows should be {NumberRows}".w(() => data.DataToBeSentFurther[0].Rows.Count.Should().Be(NumberRows));
            $"and now I transform with {nameof(SenderCachet)}".w(async ()=>
                data=await new SenderCachet("http://localhost:8000","5DiHQgKbsJqck4TWhMVO").TransformData(data)
            );

        } 

    }
}

(I have used XBehave for the extensions.)

Nice and easy, right? Not so!

To have SQL Server up and running I have used a docker-compose file:

version: '3'
services:
   db:
     image: mcr.microsoft.com/mssql/server
     ports:
       - "1433:1433"
     environment:
       SA_PASSWORD: "<YourStrong!Passw0rd>"
       ACCEPT_EULA: "Y"
     healthcheck:
       test: sqlcmd -S (local) -U SA -P '<YourStrong!Passw0rd>' -Q 'select 1'

and in the Azure DevOps yaml: start the containers, run the tests, collect the code coverage, stop the containers:

docker-compose -f stankinsv2/solution/StankinsV2/StankinsTestXUnit/Docker/docker-sqlserver-instance-linux.yaml up -d

echo 'start regular test'
dotnet build -c $(buildConfiguration) stankinsv2/solution/StankinsV2/StankinsV2.sln
dotnet test stankinsv2/solution/StankinsV2/StankinsTestXUnit/StankinsTestXUnit.csproj --logger trx --logger "console;verbosity=normal" --collect "Code coverage"

echo 'coverlet'
coverlet stankinsv2/solution/StankinsV2/StankinsTestXUnit/bin/$(buildConfiguration)/netcoreapp2.2/StankinsTestXUnit.dll --target "dotnet" --targetargs "test stankinsv2/solution/StankinsV2/StankinsTestXUnit/StankinsTestXUnit.csproj --configuration $(buildConfiguration) --no-build" --format opencover --exclude "[xunit*]*"

echo 'compose down'
docker-compose -f stankinsv2/solution/StankinsV2/StankinsTestXUnit/Docker/docker-sqlserver-instance-linux.yaml down

Easy, right? That's because SqlServer is well behaved and has a fully functional Docker image.

That is not so easy with Cachet. Cachet requires configuration – and more: after the configuration, it generates a random token for writing data (the "5DiHQgKbsJqck4TWhMVO" passed to SenderCachet above).

So it will be a task for docker to export the container and import it again – easy stuff, right? Again, not.

So I started a small docker container with:

docker run -p 8000:8000 --name myCachetContainer -e APP_KEY=base64:ybug5it9Koxwhfi5a6CORbWdpjVqXxkz/Tyj4K45GKc= -e DEBUG=false -e DB_DRIVER=sqlite cachethq/docker

and then, browsing to http://localhost:8000, I configured it and grabbed the token.

Now it is time to export :

docker export myCachetContainer -o cachet.tar

And to import as an image

docker import cachet.tar  mycac

And to run the image again

docker run -p 8000:8000 -e APP_KEY=base64:ybug5it9Koxwhfi5a6CORbWdpjVqXxkz/Tyj4K45GKc= -e DEBUG=false -e DB_DRIVER=sqlite mycac

And the image stopped! After many tries and a docker inspect on the initial image, I arrived at:

docker run -it -p 8000:8000 -e APP_KEY=base64:ybug5it9Koxwhfi5a6CORbWdpjVqXxkz/Tyj4K45GKc= -e DEBUG=false -e DB_DRIVER=sqlite --workdir /var/www/html --user 1001:1001 mycac "/sbin/entrypoint.sh"

So the workdir, the user and the entrypoint are not copied into the image by export/import – you should set them yourself.

The final preparation for CI with Docker for Cachet? I have pushed my image to Docker Hub, and I will run it from docker-compose.

So now my docker-compose file with SQL Server and Cachet looks this way:

version: '3'
services:
   db:
     image: mcr.microsoft.com/mssql/server
     ports:
       - "1433:1433"
     environment:
       SA_PASSWORD: "<YourStrong!Passw0rd>"
       ACCEPT_EULA: "Y"
     healthcheck:
       test: sqlcmd -S (local) -U SA -P '<YourStrong!Passw0rd>' -Q 'select 1'

   cachet:
     image: ignatandrei/ci_cachet
     ports:
       - "8000:8000"
      
     environment:
       APP_KEY: "base64:ybug5it9Koxwhfi5a6CORbWdpjVqXxkz/Tyj4K45GKc="
       DEBUG: "false"
       DB_DRIVER: "sqlite"
       
     user: "1001"   
     working_dir: "/var/www/html"
     entrypoint: "/sbin/entrypoint.sh"

And I have nice C# integration tests with Azure DevOps, Docker, SQL Server and Cachet! You can see the code coverage report at https://codecov.io/gh/ignatandrei/stankins/src/master/stankinsv2/solution/StankinsV2/Stankins.Cachet/SenderCachet.cs

.NET Core Alphabet

What I wanted is a simple application (Web, Mobile, Desktop) that can list, alphabetically, the .NET Core keywords. What is the purpose?

  1. For interviews – suppose you want to test people's knowledge of C#. You start the application (again: Desktop or Web or Mobile) and let the candidate choose a letter. Then you see the keywords for this letter and ask the candidate to explain some of them.
  2. For remembering features: there are so many features in the .NET language (https://docs.microsoft.com/en-us/dotnet/csharp/whats-new/csharp-version-history) that for a programmer it is good to know – or to revisit – the features that are in the language.
  3. For contests between programmers – like the interviews, but for passionate programmers who want an easy way to decide who has the best memory.
  4. Maybe other uses that I do not know? Please share in the comments.

Now to the realization: what I want is a simple application that has inside a database with the keywords, links and anything else. From this database, the source code for the data will be generated, and the application(s) will be generated. Also, the data should be publicly available, to profit from the power of the crowd – anyone who wants to add something can add it. The data transformations are done with stankins:

stankins.console execute -o ReceiveRestFromFile -a primaryData/netCoreAlphabet.json -o SenderToTypeScript -a "" -o TransformerConcatenateOutputString -a a.ts -o SenderOutputToFolder -a $(Build.ArtifactStagingDirectory)/data/ -a false

stankins.console execute -o ReceiveRestFromFile -a primaryData/netCoreAlphabet.json -o SenderToRazorFromFile -a primaryData/markdown.txt -o TransformerConcatenateOutputString -a cards.md -o SenderOutputToFolder -a $(Build.ArtifactStagingDirectory)/data/ -a false
 

And, to complete all of this, it is put in an Azure DevOps pipeline: https://github.com/ignatandrei/netCoreAlphabet/blob/master/azure-pipelines.yml

You can see the result on Android: https://play.google.com/store/apps/details?id=com.github.ignatandrei.netcorealphabet&hl=en , and on the website: https://ignatandrei.github.io/netCoreAlphabet

Also, if you want, please contribute by making a PR editing https://github.com/ignatandrei/netCoreAlphabet/blob/master/primaryData/netCoreAlphabet.json or by helping to enhance the application by solving https://github.com/ignatandrei/netCoreAlphabet/issues

MVC Browser history provider for azure–trying an implementation for 3 hours

First, implement IBrowserUserHistoryRepository – that means implementing:

public void Save(IEnumerable<BrowserUserHistoryData> history)

 

Azure Table Storage has PartitionKey/RowKey – I have to add a new class.
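Something along these lines – a sketch with the table storage SDK (the property names are my guesses, the real class is in the project):

using Microsoft.WindowsAzure.Storage.Table;

// sketch: one history entry = one table row
public class BrowserUserHistoryEntry : TableEntity
{
    public BrowserUserHistoryEntry() { } // parameterless ctor required by the SDK

    public BrowserUserHistoryEntry(string userName, string rowKey)
        : base(userName, rowKey) // PartitionKey = user; choosing the RowKey turns out to be the hard part (see below)
    {
    }

    public string Url { get; set; }
}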

Also, for connecting, I have to put:

 

connectionString="UseDevelopmentStorage=true;" /

 

I tried to add a bulk history:

tableHistory.ExecuteBatch(batchOperation);

The result was:

Unexpected response code for operation : 0

Magic:

<add key="TableStorageEndpoint" value="http://127.0.0.1:1002/"/>

And one hour has gone.

Run dsinit to have the storage emulator:

No connection could be made because the target machine actively refused it 127.0.0.1:10002

Modified the code to the old Azure code; now the answer was:

One of the request inputs is out of range.

http://msdn.microsoft.com/en-us/library/dd135715.aspx – all letters in a container name must be lowercase.

Tried that – same result:

One of the request inputs is out of range.

Maybe the timestamp is wrong? No…

Now debugging with Fiddler:

http://sepialabs.com/blog/2012/02/17/profiling-azure-storage-with-fiddler/


Added to the connection string:

DevelopmentStorageProxyUri=http://ipv4.fiddler

And saw this in Fiddler:

<d:PartitionKey>zungb4ovunqjd5rtal5ytc3r</d:PartitionKey>

<d:RowKey>http://localhost:2728/</d:RowKey>

So the problem is that the RowKey does not support URL values (characters like '/' are not allowed in keys).

Now, after removing the URL from the RowKey and putting the username instead –

: base(UserName, UserName)

– the error was:

The specified entity already exists

Another hour passes

——————

Now that it works, thinking about the RowKey and PartitionKey: no username + url => put date.ToString("yyyyMMdd_HHmmss_tttt")

0: The specified entity already exists.

Oh no, not again?

Looking in the table => 20121220_064024_AM – ok, the key needs more precision:

date.ToString("yyyyMMdd_HHmmss_ffffzzz")

0: The specified entity already exists

Again? Debug, please…

The real problem: I was sending the whole items history every time – not just the not-yet-saved ones…

Now it works – kind of:

Server Error in '/' Application.


The method or operation is not implemented.

public IEnumerable<KeyValuePair<string, int>> MostUsed(int Count, DateTime? date)
Line 80:         {
Line 81:             throw new NotImplementedException();
Line 82:         }
Line 83: 
-----------
Implementing MostUsed(int Count, DateTime? date).
Research about filtering data: http://storageextensions.codeplex.com/SourceControl/changeset/view/81826#1914483
Research about GroupBy – not supported! http://msdn.microsoft.com/en-us/library/windowsazure/dd135725.aspx
So now thinking about a way to STORE the data in a convenient format, so it can be retrieved…
It must take into consideration the count for a date and the count for all dates (date can be null) – AND BOTH THE FACT THAT THE OPERATION WILL BE DONE PER USER.
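One idea to explore – pre-aggregated counters, so no GroupBy is needed at query time (just a sketch of a possible layout, not what is implemented):

using System;
using Microsoft.WindowsAzure.Storage.Table;

// one counter row per (user, url, day) plus one per (user, url) for all dates
public class UrlCounter : TableEntity
{
    // PartitionKey = userName (the operation is per user)
    // RowKey = "20121220|" + Uri.EscapeDataString(url)  -> count for one day
    // RowKey = "ALL|" + Uri.EscapeDataString(url)       -> count for all dates
    // (escaped, because '/' is not allowed in table storage keys)
    public int Count { get; set; }
}

// MostUsed(count, date) then becomes: query the user's partition with the date
// prefix (or "ALL" when date is null), order by Count, take the top 'count' rows.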
Time to think – because another hour has passed!
 

Azure tools

Azure Storage Explorer: http://azurestorageexplorer.codeplex.com/ – like in VS, but simpler and cleaner.

Windows Azure ASP.NET Providers Sample: http://code.msdn.microsoft.com/windowsazure/Windows-Azure-ASPNET-03d5dc14 – utilities for fast membership and roles. A small problem locally, though.

More samples here: http://code.msdn.microsoft.com/windowsazure/

And that will be all, after reading the documentation and understanding the concepts (for example, if you understand the session problem in Azure, then you will find a Session provider in the samples and use it).

We use Mailchimp as our marketing platform. By clicking below to subscribe, you acknowledge that your information will be transferred to Mailchimp for processing. Learn more about Mailchimp's privacy practices here.