Connecting to Azure DevOps with a Service Principal


I see a lot of blogs and examples on the internet that show you how to connect to environments using a username and password. This is all well and good for testing, but I believe it is bad for real-world scenarios.

I’m a contractor, and my time at a job is defined. After I leave the contract and move on to the next one, the company should disable my account. What happens next? Everything I have built using a username/password stops working. Yes, I could argue it ensures I get a call back, but most contracts I’ve been involved in have a clause that any bugs found will be fixed for free up to 3-6 months after the job is done. Also, I like to leave a place with them thinking “That guy is awesome, let’s get him back for the next project!“.

This post is going to show you how to set up a Service Principal for your Azure DevOps CI/CD. At the end, I will give a very basic deployment that creates a Resource Group in Azure.

  • Go into your project.
  • At the bottom left of your screen click Project Settings
  • Within Project settings, underneath Pipelines, click Service connections. If there is a star next to Service connections, it means that you are viewing the preview version. I’m going to show the following screens using the preview version.
  • Click Create service connection
  • Select Azure Resource Manager, click Next
  • Ensure you have Service Principal Authentication selected. Give your connection a name that identifies where it is going to connect to. If you are using an account that doesn’t have Owner access to the subscription, the subscription will not show in the Subscription dropdown. In the production environment, I am not an owner; therefore, I will need an owner on hand to help me. I will continue the next few steps as a non-owner. To do this, click on the link highlighted below: “Use the full version of the service connection dialog”.
  • Grab the Subscription ID and the Tenant ID from your Azure environment. Enter them into the form as shown below. I also pasted in the Subscription name for reference. If you click Verify connection at this point, you will get a failed connection. Now click “use the automated version of the service connection dialog”. This might seem strange, as it reverts the page back to the original form, but the subscription part is now filled in correctly.
  • I’ve selected “Allow all pipelines to use this connection” as I will want all pipelines in this project to use this connection. Then click OK.
  • As you click OK, a login dialog will appear. This is where you will require the owner of the Azure Subscription to sign in for you. This confirms that a user has allowed the service principal to be created.

The Service Principal is now connected. To see what this looks like within the tenant you have connected to, log into the Azure Portal and head to the App registrations section. The name of the Service Principal will be [DevOps Organisation Name]-[Project Name]-[SubscriptionID/Randomguid]. The Service Principal has Contributor access on the subscription.

You can also add API permissions, such as Graph, and then make direct calls to Graph API using PowerShell within the pipeline.

Proving that the Service Principal connection works.

Within your DevOps project, click on Pipelines and select Releases. We are going to create a Resource Group within our subscription.

  • Click on New Pipeline, and select empty job.
  • Close the stage window. We don’t need any artifacts for this, as I’m going to write Az CLI code inline. Click on 1 Job, 0 task.
  • Add an Azure CLI task.
    Task version: 2* (preview) – This allows us to use an inline PowerShell script.
    Azure Subscription: Pick the subscription you created in the previous section.
    Script Type: PowerShell
    Script Location: Inline Script
    Inline script:

    az group create --name "Demo-rg" --location uksouth
    

  • Click Save and create a release. After the release has run successfully, an empty resource group should have been created within the subscription.

Updating an expired Client Secret of a SharePoint Add-in using AzureAD / Az CLI


Back in May 2016, I wrote a post to show you how to update the Client Secret of a SharePoint add-in. This used the PowerShell module MSOL. https://cann0nf0dder.wordpress.com/2016/05/18/updating-an-expired-client-secret-of-sharepoint-add-in/

The MSOL module is now (or soon will be) deprecated. Therefore, I needed to find a different way of doing this, and ideally something that could be done with Azure DevOps pipelines. I originally started with Az CLI. Although the Client Secret is tied to a Service Principal in Azure AD, I was unable to change its Key Credentials or Password Credentials.

Due to the issues I was having with Az CLI, I used the AzureAD module instead. See the end of this blog post for how I got Az CLI to work with a workaround.

Updating Client Secret

Connect to Azure-AD

First you will need to connect to Azure-AD
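A minimal sketch of this step (the exact snippet may differ from my original) might look like:

```powershell
# Install the AzureAD module if it is not already available on this machine.
if (-not (Get-Module -ListAvailable -Name AzureAD)) {
    Install-Module -Name AzureAD -Scope CurrentUser -Force
}

# Prompts for credentials and connects the session to Azure AD.
Connect-AzureAD
```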

The above code ensures that you have the AzureAD module installed on your machine, and logs you in.

Getting the Add-in as a Service Principal

Once you have logged in, you will be able to get back your SharePoint Add-in. The SharePoint Add-in is actually a Service Principal within Azure AD, and we will grab it using Get-AzureADServicePrincipal. You can do this by using the AppId or the AppName. The AppId is fine to use, but when you want to use the same script across multiple environments, you will need to ensure you are passing in a different AppId for each environment. This is why I have used the name of the Add-in (assuming you have given your App the same name in each environment).
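A sketch of that lookup, using a hypothetical add-in display name of "MySharePointAddIn":

```powershell
# Hypothetical name - use whatever display name your add-in was registered with.
$appName = "MySharePointAddIn"

# The SharePoint Add-in is a Service Principal in Azure AD; fetch it by display name.
$serviceprincipal = Get-AzureADServicePrincipal -Filter "DisplayName eq '$appName'"
```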

Create a new Secret

Next you need to create a new secret. This is done by creating random bytes and converting them to a Base64 string. I ensure the password is valid for an additional 2 years.
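A sketch of generating the secret (32 random bytes, Base64-encoded, valid for two years):

```powershell
# Generate 32 cryptographically random bytes and Base64-encode them as the new secret.
$bytes = New-Object byte[] 32
$rand = [System.Security.Cryptography.RandomNumberGenerator]::Create()
$rand.GetBytes($bytes)
$newClientSecret = [System.Convert]::ToBase64String($bytes)

# The new credentials will be valid from now for an additional 2 years.
$startDate = Get-Date
$endDate = $startDate.AddYears(2)
```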

Create Key Credentials and Password Credential for the App

The SharePoint Add-in requires two Key Credentials (one Sign and one Verify) and one Password Credential. The following script creates new ones. This allows both the old password and the new password to continue working at the same time, until you are able to update the code that uses the ClientID and ClientSecret.
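With the AzureAD module, the three credentials can be created along these lines ($serviceprincipal, $newClientSecret, $startDate and $endDate are assumed to come from the previous steps):

```powershell
# Two symmetric key credentials: one for signing, one for verifying.
New-AzureADServicePrincipalKeyCredential -ObjectId $serviceprincipal.ObjectId `
    -Type Symmetric -Usage Sign -Value $newClientSecret `
    -StartDate $startDate -EndDate $endDate

New-AzureADServicePrincipalKeyCredential -ObjectId $serviceprincipal.ObjectId `
    -Type Symmetric -Usage Verify -Value $newClientSecret `
    -StartDate $startDate -EndDate $endDate

# One password credential holding the same secret value.
New-AzureADServicePrincipalPasswordCredential -ObjectId $serviceprincipal.ObjectId `
    -Value $newClientSecret -StartDate $startDate -EndDate $endDate
```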

Removing the original Key and Password Credential for the App

The following code shows how to loop through the Key and Password credentials and remove the original ones. It does this by looking for any credentials that were created before the start date of the new ones. I would only run this part of the code once I know I have updated my application to use the new password. Not in my example here, but where I’m using it in the real world, my Client Secret is stored within a Key Vault; I would update the Key Vault value (ensuring to disable the previous version).
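A sketch of that clean-up, removing anything that started before the new credentials ($serviceprincipal and $startDate assumed from the previous steps):

```powershell
# Remove key credentials that pre-date the new ones.
Get-AzureADServicePrincipalKeyCredential -ObjectId $serviceprincipal.ObjectId |
    Where-Object { $_.StartDate -lt $startDate } |
    ForEach-Object {
        Remove-AzureADServicePrincipalKeyCredential -ObjectId $serviceprincipal.ObjectId -KeyId $_.KeyId
    }

# Remove password credentials that pre-date the new ones.
Get-AzureADServicePrincipalPasswordCredential -ObjectId $serviceprincipal.ObjectId |
    Where-Object { $_.StartDate -lt $startDate } |
    ForEach-Object {
        Remove-AzureADServicePrincipalPasswordCredential -ObjectId $serviceprincipal.ObjectId -KeyId $_.KeyId
    }
```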

Connecting with AZ Cli Workaround

Using an Azure DevOps pipeline, I really wanted to use Az CLI to update the Client Secret of a SharePoint Add-in. Due to the error messages when I attempted to update the Service Principal Key Credentials and Password Credentials, I was forced to use the AzureAD module instead. So how can I use Az CLI?

My DevOps pipeline uses a service account, and I have ensured this service account has permissions to update the directory.

Then I can connect from Az CLI to Azure AD by doing the following:
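A sketch of the hand-off from the already signed-in az session to Connect-AzureAD:

```powershell
# Reuse the pipeline's az sign-in: request a token for the Azure AD Graph endpoint.
$token = az account get-access-token --resource "https://graph.windows.net" --query accessToken -o tsv
$account = az account show | ConvertFrom-Json

# Sign in to the AzureAD module with the token instead of a username/password.
Connect-AzureAD -AadAccessToken $token -AccountId $account.user.name -TenantId $account.tenantId
```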

I add the above code to the full script just after the parameters, as the pipeline will already be signed in as the Pipeline Service Principal. It grabs an access token to sign in to Azure AD, and is then able to run the rest of the script.

Full Script

Unable to see public Teams within ‘Join or create a team’ section of MS Teams


I’ve had an issue for a while now where public teams were no longer showing within our Microsoft Teams client. Whenever you went to ‘Join or create a team’ at the bottom of the teams panel, you were presented with a screen similar to the one below. Also, the search box on the right-hand side of the screen was missing.

To fix this, you need to go into the Teams admin center: https://admin.teams.microsoft.com

On the left-hand side, expand Org-wide settings and select Teams Settings

At the bottom of the Teams settings, there is a Search by Name section. Ensure this is turned OFF.

Once this has been turned off, it takes about 30 minutes for the tenant to update with the changes.

If you go back to ‘Join or create a team’ you should see the search box and the public teams.

Note: If you are using the browser, you might need to do a Ctrl + F5 to clear the cache.

Note: Within the Teams app, you might need to log out and in again to clear the cache.

AZ CLI putting message on a storage queue – not a valid Base-64 string


I’ve been using Az CLI a lot recently; it has made interacting with Azure from PowerShell so much easier.

https://docs.microsoft.com/en-us/cli/azure/install-azure-cli?view=azure-cli-latest

I had to write a simple PowerShell script that added items to a Storage queue. Once the items were added to a queue an Azure function picked up the items and processed them.

According to the Azure CLI documentation, you need to use az storage message put.

#Login
az login
#Get the connection string
$connectionString = az storage account show-connection-string --name "mystorageaccountname" --resource-group "MyResourceGroup" --query connectionString | ForEach-Object { $PSItem -join '' } | ConvertFrom-Json
#Create message
$message = "{""Name"":""Paul"",""LastName"":""Matthews"",""ID"":""d4b2ffc9-4380-46e4-a0bf-8a9ca58734d2""}"
#Add item to the queue
az storage message put --content $message --queue-name "myQueueName" --connection-string $connectionString

This fires successfully and I can see the item on the queue. However, when my Azure Function starts to run, I get the error message:

The input is not a valid Base-64 string as it contains a non-base 64 character, more than two padding characters, or an illegal character among the padding characters.

To solve this problem, you just need to Base64-encode the string first.

#Login
az login
#Get the connection string
$connectionString = az storage account show-connection-string --name "mystorageaccountname" --resource-group "MyResourceGroup" --query connectionString | ForEach-Object { $PSItem -join '' } | ConvertFrom-Json
#Create message
$message = "{""Name"":""Paul"",""LastName"":""Matthews"",""ID"":""d4b2ffc9-4380-46e4-a0bf-8a9ca58734d2""}"
#Encode to bytes
$b = [System.Text.Encoding]::UTF8.GetBytes($message)
#Convert to Base64String
$message64Base = [System.Convert]::ToBase64String($b)
#Add item to the queue
az storage message put --content $message64Base --queue-name "myQueueName" --connection-string $connectionString

Amazingly, the value you push up to the queue is something like the below:

"eyJOYW1lIjoiUGF1bCIsIkxhc3ROYW1lIjoiTWF0dGhld3MiLCJJRCI6ImQ0YjJmZmM5LTQzODAtNDZlNC1hMGJmLThhOWNhNTg3MzRkMiJ9"

But it appears on the queue correctly.

Unable to change Office 365 Group Membership


I was recently having a problem trying to change the group membership of an Office 365 Group. I was trying to add external users to the group, and SharePoint always redirects you to Outlook to do this.

  • Click on Members at the top right of the screen.
  • Click Add members
  • Click go to Outlook to add Guests.
  • This should redirect you to the group information page, where you can edit ‘about this group’, change membership, and see emails and files related to the group.

The problem I was getting was that as soon as it hit the above page, it redirected to https://outlook.office365.com/people/. I also couldn’t see the Groups part, as highlighted below.

It made no sense that I couldn’t see it: I was a global administrator, I had created the site, I was an owner of the site, and I had an E5 license.

It turned out to be a simple thing that took Microsoft Support and several engineers a while to help me solve. Somehow my account mailbox had been converted to a Shared Mailbox. How or why this happened doesn’t matter.

By going to the Exchange admin centre and clicking on Recipients and then Shared, it displays all the Shared Mailboxes.

In the example above, David Mamam (a made-up person in my demo tenant) has a Shared Mailbox. If David attempted to click on the ‘go to Outlook’ link in the SharePoint site, he would be redirected to https://outlook.office365.com/people. To fix this problem, David’s mailbox needs to be converted back to a regular mailbox.

To do this, click on the ‘convert’ link underneath ‘Convert to Regular Mailbox’ within the Exchange admin centre, as shown above. The conversion takes a few moments.

Once complete, the user will be able to click the link to modify the Office 365 Group that they were an owner of.

Visual Studio – NuGet – No connection could be made because the target machine actively refused it 127.0.0.1:8888


Have you ever had the problem where, after installing or updating Visual Studio (Enterprise / Pro / Community), you get the following error message when trying to use NuGet?

“[nuget.org] Unable to load the service index for source https://api.nuget.org/v3/index.json.

An error occurred while sending the request.

Unable to connect to the remote server

No connection could be made because the target machine actively refused it 127.0.0.1:8888″

I had this message a little while back and fixed the problem. Then yesterday I updated Visual Studio and the problem came back. It took me a little while to find out how to fix it, so I decided to blog about it in case it happens again.

Now, I’m not 100% sure what caused it, but it is a proxy-related problem. I do have Zscaler on my machine (which I have no control over), and I also have Fiddler. Maybe one of them is causing the problem. Whatever is causing it, it’s a very simple fix.

Open the file devenv.exe.config in a text editor. This can be found at the following location:

C:\Program Files (x86)\Microsoft Visual Studio\<Year>\<Version>\Common7\IDE

e.g.,

C:\Program Files (x86)\Microsoft Visual Studio\2017\Professional\Common7\IDE

Find the <system.net> section (mine was near the bottom of the file) and add the following:

  <defaultProxy enabled="true" useDefaultCredentials="true">
      <proxy usesystemdefault="true" bypassonlocal="true" />
   </defaultProxy>
 

So, it looks like:

<system.net>
    <defaultProxy enabled="true" useDefaultCredentials="true">
      <proxy usesystemdefault="true" bypassonlocal="true" />
    </defaultProxy>
    <settings>
      <ipv6 enabled="true"/>
    </settings>
  </system.net>

Restart Visual Studio and then try adding NuGet packages again.

Visual Studio Code Extension – Settings Sync


I have recently seen many posts about the top Visual Studio Code Extensions to have. For example:

https://medium.com/swlh/60-extensions-to-supercharge-visual-studio-code-2f93a51b3cf4

https://tahoeninjas.blog/2019/03/14/ultimate-developer-tool-list-for-spfx/

https://scotch.io/bar-talk/22-best-visual-studio-code-extensions-for-web-development

Which extensions are best for you will depend on the type of development you do.

I’m not going to give you a list of extensions you need to install. I’m just going to offer you one.

Settings Sync Extension

As a contractor, I’m often having to use the client machines. Day one is always the same steps:

  1. Install Visual Studio Code
  2. Try and remember all the extensions I like to use and install.
  3. Configure extensions

The Settings Sync extension, written by Shan Khan, stores your configuration in a private GIST (it can be public). All you need to remember/store is your GitHub Access Token and your GIST ID. Let me explain how to set it up. (Or you can just read the extension’s readme page.)

Getting a GitHub Access Token

  • Sign into GitHub and generate New Token in GitHub https://github.com/settings/tokens/new?description=code-setting-sync&scopes=gist
  • In the Note section, you can name your access token something different to code-setting-sync if you wish.
  • Ensure that gist is ticked.
  • Click Generate token.
  • On the next page, you will get your Personal Access token. Take note of this and store it in a safe place. Once you have closed this page you will not be able to get the value again and will need to generate a new access token.

Setting up Settings Sync for the first time.

  • In Visual Studio Code, open the command palette (Ctrl + Shift + P) and type
>Sync: Upload
  • This will prompt you to enter your GitHub Personal Access Token that you created in the previous step.

(Image taken directly from the extension webpage)

  • It will upload all your settings and provide you with a GIST ID. Take a copy of this ID and store it somewhere safe. It doesn’t matter too much if you lose it, as you can find the ID directly in GIST.


(Image taken directly from the extension webpage)

Setting up Settings Sync on other machines.

  • In Visual Studio Code, open the command palette (Ctrl + Shift + P) and type
>Sync: Download
  • Enter your GitHub access token and press enter.
  • Enter your GIST ID
  • All the settings and extensions that were previously uploaded from the first machine are now downloaded to your second machine, and you will be prompted to restart VSCode for the extensions to start working.

What you should do on all machines.

To keep the machines in sync, I have turned on both Auto Download and Auto Upload on change. Complete the following to set this up on your machines too.

  • In Visual Studio Code, open the command palette, (Ctrl + Shift + P) and type
>Sync: Advanced Options

  • Select Toggle Auto-Download On Startup
  • Repeat the process to get back to the Advanced Options and select Toggle Auto-Upload on Setting Change

What is stored in the GIST

If you navigate in a browser to your GIST

https://gist.github.com/{your_userName}/{gist_id}

Or

https://gist.github.com and look for your GIST called cloudSettings.

The cloudSettings GIST contains 5 different .json files.

The cloudSettings.json file holds the extension version and the last uploaded time.

The extensions.json file contains all the extensions that you have installed.

The keybindings.json file contains any custom keybindings you have configured.

The keybindingsMac.json file contains any custom keybindings you have configured on an Apple device.

Lastly settings.json file contains any custom settings you have configured in Visual Studio Code.

There are many extensions within the marketplace; some are useful, others not so much. Although I have no personal involvement with this extension, it’s the best one for me: I only need to remember this one extension, and all my machines keep in sync with each other.