Category: azure devops

Poor software developer simple changelog for CD

TL;DR: A simple changelog for NuGet packages, made with Git commands and Azure DevOps.

The NetCoreBlockly project has different versions on NuGet: https://www.nuget.org/packages/NetCore2Blockly/ . When I decide to publish a new version to NuGet, it is enough to change, in azure-pipelines.yml, the variable

deployNuget: '0'

from 0 to 1, and Azure DevOps takes care of the rest (including setting the version).

But I also need a changelog – something to show what is different between versions. If you want to do it properly, read https://keepachangelog.com/ and then find a tool that enforces it (for example, https://github.com/bzumhagen/dotnet-gitchanges or https://www.npmjs.com/search?q=keywords:changelog ).

For me, I wanted something simple that generates the changelog from the commit titles. So this is my workflow:

  1. Deploy the NuGet package
  2. Copy the version from NuGet. In this case, 1.1.2020.13325959 (the last number is the total number of seconds since the start of the year)
  3. Make the modifications to come back to normal:
    1. Tag the latest commit with the version from NuGet
    2. Modify deployNuget back from 1 to 0 in azure-pipelines.yml
    3. Run the command git log --pretty=format:"%n #### [%s] %n Author %an on %ai %n%n hash %h %H" 1.1.2020.12824981..1.1.2020.13325959 > a.txt
    4. Modify changelog.md with the results from a.txt
    5. Delete the a.txt file
    6. Commit/push to GitHub

You can see the results at https://github.com/ignatandrei/NETCoreBlockly/blob/master/changelog.md
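For reference, here is a minimal PowerShell sketch of steps 3.1 to 3.3 above (the parameter names and the hard-coded versions are mine, just for illustration):

param(
    # hypothetical parameters: the previous version and the version just published on NuGet
    [string]$prevVersion = "1.1.2020.12824981",
    [string]$newVersion = "1.1.2020.13325959"
)
# 3.1: tag the latest commit with the version from NuGet
git tag $newVersion
git push origin $newVersion
# 3.3: dump the commit titles between the two versions into a temporary file
git log --pretty=format:"%n #### [%s] %n Author %an on %ai %n%n hash %h %H" "$prevVersion..$newVersion" > a.txt
# steps 3.4 and 3.5: paste a.txt into changelog.md by hand, then delete a.txt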

.NET Core local tools

( Video at https://youtu.be/iHLRBxi4S7c )

.NET Core has the concept of "local tools" – that is, tools for a solution/project. They differ from "global tools" in that they are registered for the solution in a .config/dotnet-tools.json file. You can read about them at https://docs.microsoft.com/en-us/dotnet/core/tools/local-tools-how-to-use
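For context, this is roughly how a local tool manifest is created and used (a minimal sketch; dotnet-outdated is just one example taken from my manifest below):

# create .config/dotnet-tools.json in the repository
dotnet new tool-manifest
# install a tool into the manifest (local, not global)
dotnet tool install dotnet-outdated
# on another machine or on the build agent, restore everything listed in the manifest
dotnet tool restore
# run a tool through the manifest
dotnet tool run dotnet-outdated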

But I want to show you my local tools. First, the JSON:

{
  "version": 1,
  "isRoot": true,
  "tools": {
    "dotnet-property": {
      "version": "1.0.0.11",
      "commands": [ "dotnet-property" ]
    },
    "powershell": {
      "version": "7.0.0",
      "commands": [ "pwsh" ]
    },
    "xunit-cli": {
      "version": "0.1.3",
      "commands": [ "xunit" ]
    },
    "coverlet.console": {
      "version": "1.7.0",
      "commands": [ "coverlet" ]
    },
    "dotnet-reportgenerator-globaltool": {
      "version": "4.5.0",
      "commands": [ "reportgenerator" ]
    },
    "dotnet-aop": {
      "version": "2020.2.17.1904",
      "commands": [ "dotnet-aop" ]
    },
    "loxsmoke.mddox": {
      "version": "0.5.1",
      "commands": [ "mddox" ]
    },
    "dotnet-project-licenses": {
      "version": "1.1.1",
      "commands": [ "dotnet-project-licenses" ]
    },
    "dotnetthx": {
      "version": "0.2.0",
      "commands": [ "dotnet-thx" ]
    },
    "dotnet-depends": {
      "version": "0.4.0",
      "commands": [ "dotnet-depends" ]
    },
    "dotnet-outdated": {
      "version": "2.10.0",
      "commands": [ "dotnet-outdated" ]
    }
  }
}

Now, how I use them for code coverage:

dotnet tool restore --add-source https://myget.org/F/natemcmaster/api/v3/index.json

pwsh ./setVersion.ps1

dotnet coverlet bin\$(buildConfiguration)\netcoreapp3.1\CLITests.dll --target "dotnet" --targetargs "test --no-build --configuration $(buildConfiguration)" --exclude '[*Test*]*' --format opencover --output $(Build.ArtifactStagingDirectory)\testResults\coverlet.xml

dotnet reportgenerator "-reports:$(Build.ArtifactStagingDirectory)\testResults\coverlet.xml" "-targetdir:$(Build.ArtifactStagingDirectory)\testResults" "-reporttypes:Cobertura;HtmlSummary;Badges;HtmlInline_AzurePipelines"

In this way I can set the version, run the tests, and see the code coverage in Azure DevOps: https://dev.azure.com/ignatandrei0674/WebAPI2CLI/_build?definitionId=7&_a=summary

( also, for code coverage, see video at  https://youtu.be/JvahoA0WWms  )

You can also find a list of tools here: https://github.com/natemcmaster/dotnet-tools

BitBucket pipelines vs AzureDevOps pipelines

TL;DR: Choose Azure DevOps – it can also integrate with a BitBucket repository.

For my previous experience with Azure DevOps, please see http://msprogrammer.serviciipeweb.ro/category/azure-devops/ .

I have had the opportunity to play with BitBucket CI this weekend. Nothing fancy, just CI + tests + artifacts for a .NET solution.

Both have YAML files for CI/CD.

Both have support for Docker.

However, there were some things in BitBucket that were unpleasant:

– For the artifacts in BitBucket, I cannot find how to give them a different name (https://confluence.atlassian.com/bitbucket/using-artifacts-in-steps-935389074.html). For Azure DevOps, you can set a name: https://docs.microsoft.com/en-us/azure/devops/pipelines/artifacts/build-artifacts?view=azure-devops&tabs=yaml (see the sketch after this list).

– Artifacts in BitBucket come as a .tar and a .gz (a tarball inside a gzip). That means, for a regular Windows user, 2 operations to get the sources. For Azure DevOps, it is a zip.

– Both have test concepts. However, Azure DevOps lets you see the code coverage and the test details (see a run at https://dev.azure.com/ignatandrei0674/WebAPI2CLI/_build?definitionId=7&_a=summary) – BitBucket just lists the number of tests passed.

– For a repository, BitBucket gives 50 free build minutes per month (https://bitbucket.org/blog/everything-you-need-to-know-about-build-minutes-in-bitbucket-pipelines) – that means something like 2 builds per day. Azure DevOps gives 1800 free minutes: https://azure.microsoft.com/en-us/pricing/details/devops/azure-devops-services/
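To illustrate the artifact naming point from the list above, a rough sketch of the two configurations (the solution name, paths and artifact name are invented for the example):

# bitbucket-pipelines.yml - artifacts are just path patterns, with no name of their own
pipelines:
  default:
    - step:
        name: Build
        script:
          - dotnet build MySolution.sln
        artifacts:
          - MyProject/bin/**

# azure-pipelines.yml - the published artifact gets an explicit name
- task: PublishBuildArtifacts@1
  inputs:
    pathtoPublish: '$(Build.ArtifactStagingDirectory)'
    artifactName: MyNamedDrop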

Devops + CI/CD-part 5

WebAPI2CLI

This is a part of the series about how I made WebAPI2CLI - Execute ASP.NET Core WebAPI from Command Line.
Source code on https://github.com/ignatandrei/webAPI2CLI/

1. WebAPI2CLI - Description
2. WebAPI2CLI - Organization
3. WebAPI2CLI - implementing
4. WebAPI2CLI - tests
5. WebAPI2CLI - Devops and CI/CD
6. WebAPI2CLI - documentation
7. WebAPI2CLI - Conclusions
8. WebAPI2CLI - Zip application

What I need for the devops:

1. Building the solution

2. Running tests

3. Deploying packages to Nuget

Being part of the Microsoft stack, it is natural that I have chosen Azure DevOps for CI/CD.

You can see the Azure DevOps pipeline at https://dev.azure.com/ignatandrei0674/WebAPI2CLI/_build?definitionId=7&_a=summary and how it is done by reading https://github.com/ignatandrei/WebAPI2CLI/blob/master/azure-pipelines.yml

Seeing code coverage in AzureDevops

I need not only to know that the tests have run successfully, but also to see the code coverage.

For this I use the .NET local tools coverlet and reportgenerator:

dotnet coverlet bin\$(buildConfiguration)\netcoreapp3.1\CLITests.dll --target "dotnet" --targetargs "test --no-build --configuration $(buildConfiguration)" --exclude '[*Test*]*' --format opencover --output $(Build.ArtifactStagingDirectory)\testResults\coverlet.xml

dotnet reportgenerator "-reports:$(Build.ArtifactStagingDirectory)\testResults\coverlet.xml" "-targetdir:$(Build.ArtifactStagingDirectory)\testResults" "-reporttypes:Cobertura;HtmlSummary;Badges;HtmlInline_AzurePipelines"

- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'VSTest'
    testResultsFiles: '**/*.trx'
    searchFolder: '$(Build.ArtifactStagingDirectory)\trx'
  displayName: publish tests

- task: PublishCodeCoverageResults@1
  displayName: 'Publish code coverage'
  inputs:
    codeCoverageTool: Cobertura
    summaryFileLocation: '$(Build.ArtifactStagingDirectory)\testResults\Cobertura.xml'
    reportDirectory: '$(Build.ArtifactStagingDirectory)\testResults'

In this way, you can see the test results and the code coverage in Azure DevOps.

Modify version 

For a good CD I also need a way to modify the version automatically.

The .NET local tools provide a way to install tools for the solution and run them via dotnet.

So I install pwsh and run a setVersion.ps1 script:

 

$TimeNow = Get-Date
$d = $TimeNow.ToUniversalTime()
$year = $TimeNow.Year
$startOfYear = Get-Date -Year $year -Month 1 -Day 1 -Hour 0 -Minute 0 -Second 0 -Millisecond 0
$diff = New-TimeSpan -Start $startOfYear -End $TimeNow
#$diff.TotalSeconds -as [int]
$assemblyVersion = $d.ToString("1.yyyy.1MMdd.1HHmm")
dotnet-property "**/*.csproj" AssemblyVersion:"$assemblyVersion"
dotnet dotnet-property "**/*.csproj" AssemblyVersion:"$assemblyVersion"
$version = $d.ToString("1.0.yyyy.") + ($diff.TotalSeconds -as [int]).ToString()
dotnet-property "**/*.csproj" Version:"$version"
dotnet dotnet-property "**/*.csproj" Version:"$version"
$releaseNotes = "BuildNumber $env:BUILD_BUILDNUMBER"
$releaseNotes += ";author $env:BUILD_SOURCEVERSIONAUTHOR"
$releaseNotes += ";message $env:BUILD_SOURCEVERSIONMESSAGE"
$releaseNotes += ";source for this release github.com/ignatandrei/webAPI2CLI/commit/$env:BUILD_SOURCEVERSION"
$releaseNotes
dotnet-property "**/*.csproj" PackageReleaseNotes:"$releaseNotes"
dotnet dotnet-property "**/*.csproj" PackageReleaseNotes:"$releaseNotes"

I think it is self-explanatory.
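Even so, here is a quick worked example of the two version formats, assuming a build on 2020-02-17 19:04:30 (the date is arbitrary):

$d = Get-Date -Year 2020 -Month 2 -Day 17 -Hour 19 -Minute 4 -Second 30 -Millisecond 0
# AssemblyVersion uses "1.yyyy.1MMdd.1HHmm" (the 1s are literal digits) -> 1.2020.10217.11904
$d.ToString("1.yyyy.1MMdd.1HHmm")
# Version is "1.0.yyyy." plus the whole seconds elapsed since the start of the year
$startOfYear = Get-Date -Year 2020 -Month 1 -Day 1 -Hour 0 -Minute 0 -Second 0 -Millisecond 0
$seconds = (New-TimeSpan -Start $startOfYear -End $d).TotalSeconds -as [int]
"1.0.2020." + $seconds   # -> 1.0.2020.4129470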

Be a good NuGet citizen – do not deploy every time a modification is made

For this I use a DevOps variable: deployNuget: '0'

- task: NuGetCommand@2
  condition: and(succeeded(), eq(variables['deployNuget'], '1'))
  inputs:
    command: push
    nuGetFeedType: external
    packagesToPush: '$(Build.ArtifactStagingDirectory)/**/*.symbols.nupkg'
    publishFeedCredentials: 'nugetAndrei'
  displayName: 'dotnet nuget push'

Be a good Azure DevOps citizen – do not run automation every time

I have some documentation in .md files – there is no need to rebuild everything when just the documentation changes.

This is done in AzureDevOps by the trigger:

trigger:
  branches:
    include:
    - master
  paths:
    exclude:
    - docs/*
    - README.md

Test your application

I can also test in Azure DevOps what I have done by running

TestWebAPISite.exe --CLI_ENABLED=1 --CLI_Commands="GetMathId_Http,MathPOST"

It is just nice – maybe I can find a way to do something practical …
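As a rough sketch, such a run could even be turned into a pipeline smoke test; this assumes the exe returns a non-zero exit code when a command fails, which is my assumption, not something verified here:

# hypothetical smoke-test step (PowerShell): run the site with CLI commands and fail the build on error
.\TestWebAPISite.exe --CLI_ENABLED=1 --CLI_Commands="GetMathId_Http,MathPOST"
if ($LASTEXITCODE -ne 0) { throw "WebAPI2CLI smoke test failed with exit code $LASTEXITCODE" }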

.NET Core global tool CD/CI with AzureDevops to Nuget

I want to be able to deploy a .NET global (or local) tool automatically from GitHub to NuGet.

It is simple enough:

1. Get an API key from NuGet.

2. Go to your Azure DevOps project settings (link at the bottom of the page) and add a new service connection for pushing to NuGet (name: nugetAndrei).

3. Make a YAML file similar to this:

# ASP.NET Core
# Build and test ASP.NET Core web applications targeting .NET Core.
# Add steps that run tests, create a NuGet package, deploy, and more:
# https://docs.microsoft.com/vsts/pipelines/languages/dotnet-core
# https://dev.azure.com/ignatandrei1970/AOPRoslyn/

pool:
  vmImage: 'VS2017-Win2016'

variables:
  buildConfiguration: 'Release'
  deployNuget: '0'

steps:
- script: |
    cd AOPRoslyn
    dotnet tool restore
  displayName: 'restore tool'

- script: dotnet restore AOPRoslyn\AOPRoslyn.sln
  displayName: 'restore project'

- script: |
    cd AOPRoslyn
    dotnet tool run pwsh -f ./makenuget.ps1
  displayName: 'powershell to version and pack'

- script: dotnet build AOPRoslyn\AOPRoslyn.sln --configuration $(buildConfiguration)
  displayName: 'dotnet build $(buildConfiguration)'

- script: dotnet pack AOPRoslyn\aopCmd\aop.csproj --no-build -o $(Build.ArtifactStagingDirectory) /p:Configuration=$(buildConfiguration) # --verbosity Detailed
  displayName: 'dotnet pack'

- task: PublishBuildArtifacts@1
  inputs:
    pathtoPublish: '$(Build.ArtifactStagingDirectory)'
    artifactName: drop1

# - task: NuGetAuthenticate@0
#   inputs:
#     nuGetServiceConnections: 'nugetAndrei'

- task: NuGetCommand@2
  condition: and(succeeded(), eq(variables['deployNuget'], '1'))
  inputs:
    command: push
    nuGetFeedType: external
    packagesToPush: '$(Build.ArtifactStagingDirectory)/**/*.nupkg'
    publishFeedCredentials: 'nugetAndrei'
  displayName: 'dotnet nuget push'

# - task: DotNetCoreCLI@2
#   displayName: Push Nuget Package
#   inputs:
#     command: custom
#     custom: nuget
#     arguments: >
#       push $(Build.ArtifactStagingDirectory)\*.nupkg
#       -s https://api.nuget.org/v3/index.json
#       -k $(NuGetSourceServerApiKey)

You can find the source here:
https://github.com/ignatandrei/AOP_With_Roslyn/commit/e5b244f61e77cd0c5ec34c97e1a0122f9625de25

Adding Angular to WebAPI site-part 41

First, I want to add an index.html file – to see the result.

For this, I add to the startup:

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    //more code
    app.UseDefaultFiles();
    app.UseStaticFiles();
}

I also add an index.html into a wwwroot folder (which is also created in the root).

You can see the modifications here: https://github.com/ignatandrei/InfoValutar/commit/4deb32528aee7b667f22a38c8e96899052cbfd4c

Now I want to compile the Angular application and add the output generated by Angular to the wwwroot folder.

I create a PowerShell script (easy for me, because you can install PowerShell as a dotnet tool):

echo "starting build angular"
cd InfovalutarWebAng
npm i
ng build --prod --build-optimizer
cd ..

$source = "InfovalutarWebAng/dist/InfovalutarWebAng/"
$dest = "InfoValutarWebAPI/wwwroot/"
echo "delete files"
Get-ChildItem -Path $dest -Include *.* -File -Recurse | foreach { $_.Delete() }
echo "copy files"
Get-ChildItem -Path $source | Copy-Item -Destination $dest

and put it in the Azure DevOps pipeline:

 
- powershell: |
    cd InfoValutar
    .\copyAng.ps1
  displayName: copy angular site to web api

Now commit to GitHub (https://github.com/ignatandrei/InfoValutar/commit/5208036a4cb1da719692966880236dc33b1b2e74) and wait to see if it works.

The error is: "The term 'ng' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again."

Adding

npm i -g @angular/cli

It works!
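For completeness, a sketch of where the install fits in the script above (the exact placement in the repository may differ):

echo "starting build angular"
# make sure the Angular CLI is available on the build agent, otherwise 'ng' is not recognized
npm i -g @angular/cli
cd InfovalutarWebAng
npm i
ng build --prod --build-optimizer
cd ..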

Last Commit info–GitHub and AzureDevOps–part 39

I was thinking that I need to see the date of the last CD – who did what. For this, I need 2 things: a controller/GUI to show the info, and the CD process (via GitHub/Azure DevOps) to take care of filling it in.

For the part with code, the problem was pretty simple:

 

using Microsoft.AspNetCore.Mvc;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

namespace InfoValutarWebAPI.Controllers
{
    /// <summary>
    /// info about commit
    /// </summary>
    public class LastCommitInfo
    {
        /// <summary>
        /// comment latest commit
        /// </summary>
        public string LatestCommit { get; set; }
        /// <summary>
        /// last date of commit
        /// </summary>
        public DateTime DateCommit { get; set; }
        /// <summary>
        /// last author of commit
        /// </summary>
        public string LastAuthor { get; set; }
    }
    /// <summary>
    /// controller about info the application
    /// </summary>
    [ApiController]
    [ApiVersion("1.0")]
    [Route("api/v{version:apiVersion}/rates")]
    public class InfoController
    {
        /// <summary>
        /// info about latest commit
        /// </summary>
        /// <returns></returns>
        public LastCommitInfo GetLatestCommit()
        {
            return
                new LastCommitInfo()
                {
                    LatestCommit = "{LatestCommit}",
                    DateCommit = DateTime.ParseExact("{DateCommit}", "yyyyMMdd:HHmmss", null),
                    LastAuthor = "{LastAuthor}"
                }
                ;
        }
    }
}

 

What about the CD process ?

Well, this was cumbersome. To see ALL the environment variables, I used cmd /K set (in the command prompt) or Get-ChildItem Env: (in PowerShell).

And I came up with this:

A bash step to take the last commit message:

- bash: |
    git log --format='%s' -1
    git log --pretty=oneline | head -1
    gitMessage=$(git log --format='%s' -1)
    echo "##vso[task.setvariable variable=commitMessage;isOutput=true]$gitMessage"
  displayName: Store commit message in variable

- powershell: .\modifyinfo.ps1
  displayName: modify info

 

And a .ps1 PowerShell script:

$file = ".\InfoValutar\InfoValutarWebAPI\Controllers\InfoController.cs"
$date = Get-Date -Format "yyyyMMdd:HHmmss"
Get-ChildItem Env:
$author = $Env:BUILD_SOURCEVERSIONAUTHOR
$commitText = $env:BASH_COMMITMESSAGE
((Get-Content -path $file -Raw) -replace '{LatestCommit}',$commitText -replace '{LastAuthor}',$author -replace '{DateCommit}',$date) | Set-Content -Path $file
(Get-Content -path $file -Raw)

The result can be seen at  https://infovalutar.azurewebsites.net/api/v1.0/info

Infovalutar

And one hour passes...
(This is the result of the 1 hour per day auto-challenge as a full cycle developer for an exchange rates application)
(You can see the sources at https://github.com/ignatandrei/InfoValutar/ )

1. Start
2. Reading NBR from internet
3. Source control and build
4. Badge and test
5. CI and action
6. Artifacts and dotnet try
7. Docker with .NET Try
8. ECB
9. Intermezzo - Various implementations for programmers
10. Intermezzo - similar code - options
11. Plugin implementation
12. GUI for console
13. WebAPI
14. Plugin in .NET Core 3
15. Build and Versioning
16. Add swagger
17. Docker - first part
18. Docker - second part
19. Docker - build Azure
20. Pipeline send to Docker Hub
21. Play with Docker - online
22. Run VSCode and Docker
23. Deploy Azure
24. VSCode see tests and powershell
25. Code Coverage
26. Database in Azure
27. Sql In Memory or Azure
28. Azure ConString, RSS
29. Middleware for backward compatibility
30. Identical Tables in EFCore
31. Multiple Data in EFCore
32. Dot net try again
33. Start Azure Function
34. Azure function - deploy
35. Solving my problems
36. IAsyncEnumerable transformed to IEnumerable and making Azure Functions works
37. Azure functions - final
38. Review of 37 hours
39. Last Commit in AzureDevOps
40. Create Angular WebSite
41. Add static Angular to WebAPI .NET Core
42. Docker for Angular
43. Angular and CORS
44. SSL, VSCode, Docker
45. Routing in Angular
46. RxJS for Routing
47. RxJs Unsubscribe

Exchange rates–what I have done in 37 hours–part 38

What I have created so far in 37 hours:

  1. A source control – https://github.com/ignatandrei/InfoValutar
  2. A plugin-based software – you can use it to load any kind of exchange rates, from anywhere, provided that you implement the interface – see the implementation
  3. Tests for some of the code
  4. Deployment:
  5. A SqlServer database – to store the data
  6. An Azure Function – https://azurefuncloaddata20191205080713.azurewebsites.net/ – to load data at cron-based time intervals
  7. A GitHub action to compile and run tests – https://github.com/ignatandrei/InfoValutar/actions
  8. An Azure DevOps CI + CD to do all the 1-6 things + code coverage + deploy: https://dev.azure.com/ignatandrei0674/InfoValutar/_build?definitionId=5&_a=summary

I'd say it is nice work for 37 hours, right?


Azure function–solving my own code problems–part 35

1. My fault – the plugins do not exist in the output

This should be added to the project file so that the plugins are copied to the output directory:

<ItemGroup>
  <None Remove="plugins\" />
  <Content Include="plugins\**\*.dll" CopyToOutputDirectory="Always" />
</ItemGroup>

2. When deploying, I should see which one is suitable: context.FunctionDirectory OR context.FunctionAppDirectory

log.LogInformation($"!!! C# Timer trigger function executed at: {DateTime.Now} next {myTimer.FormatNextOccurrences(1)} ");
var folder = Path.Combine(context.FunctionDirectory, "plugins");
log.LogInformation($"!!! Folder {folder} Folder exists: {Directory.Exists(folder)}");

folder = Path.Combine(context.FunctionAppDirectory, "plugins");
log.LogInformation($"!!! Folder {folder} Folder exists: {Directory.Exists(folder)}");

3. Plugin loading – missing DLLs

Because I work with Nate McMaster's plugins, https://github.com/natemcmaster/DotNetCorePlugins, I encountered the error "Could not load file or assembly 'System.Runtime.Loader, Version=4.1.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a'".

I was trying self-contained when building for Azure – it does not publish the System.Runtime.Loader file. But publishing InfoValutarLoadingLibs/InfoValutarLoadingLibs.csproj does!

Solution: publish the loading libs self-contained, build the Azure Function, and copy the files from the loading libs to the Azure Function project (a rough sketch follows below).
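A rough PowerShell sketch of that workaround (only InfoValutarLoadingLibs is named in the post; the Azure Function project path, the runtime identifier and the output folders are my assumptions):

# publish the loading libs self-contained, so that System.Runtime.Loader ends up in the output
dotnet publish .\InfoValutarLoadingLibs\InfoValutarLoadingLibs.csproj -c Release -r win-x64 --self-contained true -o .\publishLibs
# publish the Azure Function project as usual (hypothetical project path)
dotnet publish .\AzureFuncLoadData\AzureFuncLoadData.csproj -c Release -o .\publishFunc
# copy the missing assemblies next to the function binaries
Copy-Item .\publishLibs\*.dll .\publishFunc\ -Force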

4. Now encountering this one:

https://github.com/Azure/Azure-Functions/issues/1250


Deploy to Azure–part 23

Create in Azure an App Service infovalutar with the InfovalutarRG resource group – it can be done like here:

https://docs.microsoft.com/en-us/azure/app-service/app-service-web-get-started-dotnet

or from the Azure Portal (create an App Service – I chose Linux with .NET Core 3.0).

Searching for how to connect Azure DevOps – the connection should be added from Project Settings (a link at the bottom of the page).

Creating the service connection InfoValutarServiceConnection in Azure DevOps.

Saving the YAML files (with the inherent whitespace problems):

https://docs.microsoft.com/en-us/azure/devops/pipelines/tasks/deploy/azure-rm-web-app-deployment?view=azure-devops

Now it shows the error:

2019-11-28T19:02:29.1210252Z ##[error]Error Code: ERROR_DESTINATION_NOT_REACHABLE
More Information: Could not connect to the remote computer ("infovalutar.scm.azurewebsites.net"). On the remote computer, make sure that Web Deploy is installed and that the required process ("Web Management Service") is started.  Learn more at: http://go.microsoft.com/fwlink/?LinkId=221672#ERROR_DESTINATION_NOT_REACHABLE.

Good – but my site is  https://infovalutar.azurewebsites.net/ ,  not infovalutar.scm.azurewebsites.net

Reading documentation from https://docs.microsoft.com/en-us/azure/devops/pipelines/tasks/deploy/azure-rm-web-app-deployment?view=azure-devops

Changing from

- task: AzureRmWebAppDeployment@3
  inputs:
    azureSubscription: 'InfoValutarServiceConnection'
    WebAppName: 'infovalutar'
    Package: $(System.ArtifactsDirectory)/*InfoValLinuxX64*.zip
    ResourceGroupName: 'infovalutarRG'

to

- task: AzureRmWebAppDeployment@4

and hoping for magic (also, I put in the docker job

dependsOn:
- Build_With_Test
condition: and(succeeded(), false)

in order to not execute the docker job for now…)

And now the deployment works! See for yourself at https://infovalutar.azurewebsites.net/

