Programmatically connecting to Azure DevOps with a Service Principal (Management Group)


Continuing from my last post on programmatically connecting to Azure DevOps with a Service Principal at the subscription level, I also want to show how you can create a DevOps service connection programmatically at the Management Group level.

Unlike the subscription level, you cannot just use an Az DevOps command with a management group parameter; that option does not appear to be available. The answer is to pass in a JSON template.

You will need a few things already configured:

* See the code in the PowerShell folder from the post at https://cann0nf0dder.wordpress.com/2020/09/14/app-only-auth-connect-to-sharepoint-online-with-msal-and-azure-keyvault/ for how you can create this programmatically.

Management Groups Enabled

  • Click on Start using management groups. This will create your “Tenant Root Group” and apply your subscriptions to the management group.

The above will set you up to walk through this demo; however, please ensure you understand what Management Groups are and how to use them: https://docs.microsoft.com/en-us/azure/governance/management-groups/overview

‘User Access Administrator’ access.

In the picture above, you can see a (details) link next to the words Tenant Root Group. The details link is probably not clickable for you at this time. This is because, although you have been able to create the initial Tenant Root Group management group, you still need to promote your account's access to it.

Note: You can only do this as a Global Administrator.

Manually

  • Go to your Azure Portal https://portal.azure.com
  • Go to Azure Active Directory
  • In the left hand navigation under Manage, click Properties
  • Under Access management for Azure resources switch the button to Yes.

Now if you go back to the Tenant Root Group – Management Group, you will be able to click the details link and have access to the Management group, see deployments made at that level, modify access for others etc.

Switching the button to No will then remove your access.

Programmatically

Using Az CLI, first log in with your account using az login. In the snippet below, the first command gives your account access; the last two lines remove it again.

#Give access
az rest --method post --uri 'https://management.azure.com/providers/Microsoft.Authorization/elevateAccess?api-version=2015-07-01'
#Remove access
$account = 'username@example.com'
az role assignment delete --assignee $account --role 'User Access Administrator' --scope '/'

The Code

Running this code will create a DevOps project if it doesn't exist, and then create a Management Group level Service Connection to the Tenant Root management group. To target a different management group, you would need to modify the code to grab the name and ID of the management group you wish to use and pass them into the JSON template.
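If you do want to target a different management group, a quick way to grab its ID and display name with Az CLI might look like the following (the group name 'MyManagementGroup' is purely an example, not part of the script):

# Example only: look up the management group you want the service connection scoped to
$mg = az account management-group show --name 'MyManagementGroup' | ConvertFrom-Json
$mg.name          # value to pass in for ##ManagementGroupId##
$mg.displayName   # value to pass in for ##ManagementGroupName##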

Your account is now set up to run the script; you will first need to be logged into Az CLI.

Note: This can be a Service Principal, as long as the account being used is able to list App Registrations and has the 'User Access Administrator' RBAC role on the Tenant Root Group management group.
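For reference, granting a service principal that role at the root scope is a single Az CLI call (illustrative only; substitute your own service principal's app ID):

# Illustration: give a service principal 'User Access Administrator' on the root scope
az role assignment create --assignee '<servicePrincipalAppId>' --role 'User Access Administrator' --scope '/'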

You will need to create a 'management-group.json' file, which is used as a template; key tokens within it are replaced by the script.

{
  "administratorsGroup": null,
  "authorization": {
    "scheme": "ServicePrincipal",
    "parameters": {
      "serviceprincipalid": "##ServicePrincipalId##",
      "authenticationType": "spnKey",
      "serviceprincipalkey": "##ServicePrincipalKey##",
      "tenantid": "##TenantId##"
    }
  },
  "createdBy": null,
  "data": {
    "environment": "AzureCloud",
    "scopeLevel": "ManagementGroup",
    "managementGroupId": "##ManagementGroupId##",
    "managementGroupName": "##ManagementGroupName##"
  },
  "description": "Management Group Service Connection",
  "groupScopeId": null,
  "name": "##Name##",
  "operationStatus": null,
  "readersGroup": null,
  "serviceEndpointProjectReferences": [
    {
      "description": "Management Group Service Connection",
      "name": "##Name##",
      "projectReference": {
        "id": "##ProjectId##",
        "name": "##ProjectName##"
      }
    }
  ],
  "type": "azurerm",
  "url": "https://management.azure.com/",
  "isShared": false,
  "owner": "library"
}

In the code below, the important parts to note are:

(Line 32) – How the authentication with DevOps works: the Personal Access Token is added to the $Env: variable "AZURE_DEVOPS_EXT_PAT".

(Lines 61 – 74) – Updating the JSON template, saving it as a temp file, and then creating the Service Connection by passing in the JSON file. This is the same template Azure DevOps uses when you set up a Management Group service connection manually; you can see this by watching the network traffic.

<#
.SYNOPSIS
Creates a service connection for a ManagementGroup
Please ensure you are already logged in to Azure using az login
#>
param(
    # Azure DevOps Personal Access Token (PAT) for the 'https://dev.azure.com/[ORG]' Azure DevOps tenancy
    [Parameter(Mandatory)]
    [string]
    $PersonalAccessToken,
    # The Azure DevOps organisation to create the service connection in, available from System.TeamFoundationCollectionUri if running from pipeline.
    [string]
    $TeamFoundationCollectionUri = $($Env:System_TeamFoundationCollectionUri -replace '%20', ' '),
    # The name of the project to which this build or release belongs, available from $(System.TeamProject) if running from pipeline
    [string]
    $TeamProject = $Env:System_TeamProject,
    [string]
    $AppRegistrationName,
    [securestring]
    $AppPassword
)
$ErrorActionPreference = 'Stop'
$InformationPreference = 'Continue'

#Clearing default.
az configure --defaults group=

$account = az account show | ConvertFrom-Json
$Env:AZURE_DEVOPS_EXT_PAT = $PersonalAccessToken

Write-Information -MessageData:"Adding Azure DevOps Extension…"
az extension add --name azure-devops

Write-Information -MessageData "Configure defaults Organization:$TeamFoundationCollectionUri…"
az devops configure --defaults organization="$TeamFoundationCollectionUri"

Write-Information -MessageData "Getting App Registration: $AppRegistrationName…"
$AppReg = az ad app list --all --query "[?displayName == '$AppRegistrationName']" | ConvertFrom-Json

Write-Information -MessageData "Give App Registration access to Management Group Root…"
az role assignment create --role "Owner" --assignee $($AppReg.appId) --scope "/"

Write-Information -MessageData "Checking if $TeamProject project exists…"
$ProjectDetails = az devops project list --query "value[?name == '$TeamProject']" | Select-Object -First 1 | ConvertFrom-Json

if (-not $ProjectDetails) {
    Write-Information -MessageData "Creating $TeamProject project…"
    $ProjectDetails = az devops project create --name $TeamProject | ConvertFrom-Json
}

Write-Information -MessageData "Checking if service endpoint already exists…"
$ServiceEndpoint = az devops service-endpoint list --project "$TeamProject" --query "[?name == '$($AppReg.DisplayName)-Mg']" | Select-Object -First 1 | ConvertFrom-Json

if (-not $ServiceEndpoint) {
    Write-Information -MessageData "Getting Json file for Management Group…"
    $managementGroupJson = Get-Content -Raw -Path "$PSScriptRoot/management-group.json"
    $configFilePath = "$PSScriptRoot/temp-managementGroup.json"

    $managementGroupJson = $managementGroupJson -replace '##TenantId##', $($Account.homeTenantId) `
        -replace '##ManagementGroupId##', $($Account.homeTenantId) `
        -replace '##ManagementGroupName##', "Tenant Root Group" `
        -replace '##ServicePrincipalId##', $($AppReg.appId) `
        -replace '##ServicePrincipalKey##', $(ConvertFrom-SecureString -SecureString:$AppPassword -AsPlainText) `
        -replace '##Name##', "$($AppReg.DisplayName)-Mg" `
        -replace '##ProjectId##', $($ProjectDetails.id) `
        -replace '##ProjectName##', $($ProjectDetails.name)

    Write-Information -MessageData "Saving management json file…"
    Set-Content -Value:$managementGroupJson -Path:$configFilePath

    Write-Information -MessageData "Creating Service Connection name:$($AppReg.DisplayName)-Mg for project $TeamProject…"
    $ServiceEndpoint = az devops service-endpoint create --project "$TeamProject" --service-endpoint-configuration "$configFilePath" | ConvertFrom-Json

    Write-Information -MessageData "Clean up temp files"
    Remove-Item -Path $configFilePath
}

Write-Information -MessageData "Updating Service Connection to be enabled for all pipelines…"
az devops service-endpoint update --project "$TeamProject" --id "$($ServiceEndpoint.id)" --enable-for-all true | Out-Null

To run the above code, you will need to supply your own parameter values. Replace the placeholders with your values, then run the snippet below, which calls the script above.

$PersonalAccessToken = "<PAT TOKEN>"
$TeamProject = '<PROJECT NAME>'
$TeamFoundationCollectionUri = 'https://dev.azure.com/<organizationName>'
$AppRegistrationName = '<Service Principal Name>'
$AppPassword = '<Service Principal Secret>'
$AppSecurePassword = ConvertTo-SecureString -String:$AppPassword -AsPlainText -Force
.\Install-ServiceConnectionManagementGroup.ps1 -PersonalAccessToken $PersonalAccessToken `
-TeamFoundationCollectionUri:$TeamFoundationCollectionUri `
-TeamProject:$TeamProject `
-AppRegistrationName:$AppRegistrationName `
-AppPassword:$AppSecurePassword

My team project is called AutomateDevOpsMG, and I used an App Registration called DevOps.

Running Script

Service Principal with Owner access on Management Group Level

Project ‘AutomateDevOpsMG’ and Service connection ‘DevOps-Mg’

Programmatically connecting to Azure DevOps with a Service Principal (Subscription)


A previous post of mine, Connecting to Azure DevOps with a Service Principal, has been popular since I wrote it. Therefore, I've decided to extend the topic and show how you can do it programmatically with Az DevOps.

You will need a few things already configured:

* See the code in the PowerShell folder from the post at https://cann0nf0dder.wordpress.com/2020/09/14/app-only-auth-connect-to-sharepoint-online-with-msal-and-azure-keyvault/ for how you can create this programmatically.

Create a DevOps PAT token

  • Go to your Azure DevOps https://dev.azure.com
  • Sign in and click on User settings -> Personal access tokens
  • Click New Token
    • Give it a meaningful name so you know what the PAT token is for in the future (e.g., DevOps Service Connection).
    • Select your Organization
    • Select the Expiration date for as long as you need. Maximum 1 Year
    • Select Scopes at Full access. (You might want to tighten the permissions in a production environment; for this demo Full access is fine.)
    • Click Create
  • Once you have clicked Create, this is your only chance to grab a copy of the token. Please take a copy of it now, as you will need it later; the snippet after this list shows how it is used.
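The PAT is consumed by the Az DevOps CLI extension through an environment variable. A quick way to check your token and organisation are working (placeholder values shown) is:

# Assumption: replace the placeholders with your PAT and organisation name
$Env:AZURE_DEVOPS_EXT_PAT = '<PAT TOKEN>'
az extension add --name azure-devops
az devops configure --defaults organization='https://dev.azure.com/<OrganizationName>'
az devops project list --query "value[].name"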

The Code

You will first need to be logged into Az CLI. You can sign in using a service principal, as you might in a pipeline, as long as the account being used is able to list App Registrations and has the 'User Access Administrator' RBAC role, so it can grant Contributor access to the DevOps service principal on the subscription (Line 43).

The important part to note in the code is how authentication with DevOps works: the Personal Access Token is added to the $Env: variable "AZURE_DEVOPS_EXT_PAT" (Line 32).

<#
.SYNOPSIS
Creates a service connection for a subscription
Please ensure you are already logged in to Azure using az login
#>
param(
    # Azure DevOps Personal Access Token (PAT) for the 'https://dev.azure.com/[ORG]' Azure DevOps tenancy
    [Parameter(Mandatory)]
    [string]
    $PersonalAccessToken,
    # The Azure DevOps organisation to create the service connection in, available from System.TeamFoundationCollectionUri if running from pipeline.
    [string]
    $TeamFoundationCollectionUri = $($Env:System_TeamFoundationCollectionUri -replace '%20', ' '),
    # The name of the project to which this build or release belongs, available from $(System.TeamProject) if running from pipeline
    [string]
    $TeamProject = $Env:System_TeamProject,
    [string]
    $AppRegistrationName,
    [securestring]
    $AppPassword
)
$ErrorActionPreference = 'Stop'
$InformationPreference = 'Continue'

$account = az account show | ConvertFrom-Json

#Clearing default.
az configure --defaults group=

$Env:AZURE_DEVOPS_EXT_PAT = $PersonalAccessToken

Write-Information -MessageData:"Adding Azure DevOps Extension…"
az extension add --name azure-devops

Write-Information -MessageData "Configure defaults Organization:$TeamFoundationCollectionUri…"
az devops configure --defaults organization="$TeamFoundationCollectionUri"

Write-Information -MessageData "Getting App Registration: $AppRegistrationName…"
$AppReg = az ad app list --all --query "[?displayName == '$AppRegistrationName']" | ConvertFrom-Json

Write-Information -MessageData "Give App Registration Contributor access to Subscription…"
az role assignment create --role 'Contributor' --assignee $($AppReg.appId)

Write-Information -MessageData "Checking if $TeamProject project exists…"
$ProjectDetails = az devops project list --query "value[?name == '$TeamProject']" | ConvertFrom-Json

if (-not $ProjectDetails) {
    Write-Information -MessageData "Creating $TeamProject project…"
    $ProjectDetails = az devops project create --name $TeamProject
}

Write-Information -MessageData "Checking if service endpoint already exists…"
$ServiceEndpoint = az devops service-endpoint list --project "$TeamProject" --query "[?name == '$($AppReg.DisplayName)-Subscription']" | Select-Object -First 1 | ConvertFrom-Json

if (-not $ServiceEndpoint) {
    Write-Information -MessageData "Creating Service Connection name:$($AppReg.DisplayName)-Subscription for project $TeamProject…"
    $Env:AZURE_DEVOPS_EXT_AZURE_RM_SERVICE_PRINCIPAL_KEY = $(ConvertFrom-SecureString -SecureString:$AppPassword -AsPlainText)
    $ServiceEndpoint = az devops service-endpoint azurerm create --project "$TeamProject" --name "$($AppReg.DisplayName)-Subscription" --azure-rm-service-principal-id "$($AppReg.appId)" --azure-rm-subscription-id "$($Account.id)" --azure-rm-subscription-name "$($Account.name)" --azure-rm-tenant-id "$($Account.tenantId)" | ConvertFrom-Json
}

Write-Information -MessageData "Updating Service Connection to be enabled for all pipelines…"
az devops service-endpoint update --project "$TeamProject" --id "$($ServiceEndpoint.id)" --enable-for-all true | Out-Null

To run the above code, you will need to supply your own parameter values. Replace the placeholders with your values, then run the snippet below, which calls the script above.

$PersonalAccessToken = '<Put your PAT Token>'
$TeamProject = '<Project Name>'
$TeamFoundationCollectionUri = 'https://dev.azure.com/<OrganizationName>'
$AppRegistrationName = '<Service Principal Name>'
$AppPassword = '<Service Principal Secret>'
$AppSecurePassword = ConvertTo-SecureString -String:$AppPassword -AsPlainText -Force
.\Install-ServiceConnectionSubscription.ps1 -PersonalAccessToken $PersonalAccessToken `
-TeamFoundationCollectionUri:$TeamFoundationCollectionUri `
-TeamProject:$TeamProject `
-AppRegistrationName:$AppRegistrationName `
-AppPassword:$AppSecurePassword

My team project is called AutomateDevOps, and I used an App Registration called DevOps.

Running Script

Service Principal with Contributor access on Subscription

Project ‘AutomateDevOps’ and Service connection ‘DevOps-Subscription’

My next blog post explains how to make a Management Group Service Connection instead of a Subscription-level one: 'Programmatically connecting to Azure DevOps with a Service Principal (Management Group)'.

App-Only Auth Connect to SharePoint Online with MSAL and Azure KeyVault


Now that SharePoint Online CSOM works with .NET Standard, I thought I would put together a demo using Visual Studio Code that connects to SharePoint.

A previous colleague and still good friend of mine, Vardhaman Deshpande, wrote a blog post in June showing how to connect to SharePoint Online using MSAL. It is so well written that writing another blog about it seems a little pointless, so I have taken his post a little further by connecting to a KeyVault in Azure and grabbing the certificate directly from there.

.NET Standard CSOM for SharePoint Online now uses OAuth for authentication. This means an Access Token needs to be obtained and passed with every call made to SharePoint Online. We will do this by grabbing the AppID and Certificate from the KeyVault and then getting the Access Token through ConfidentialClientApplicationBuilder. The Access Token is then passed into the ClientContext so that all calls to SharePoint Online are made with it.

Walk-through Demo

The demo I have put together can be found in my GitHub repository. Using an Azure AD App Registration and a client certificate, I will walk through the steps here to set up the following:

  • Create a Resource Group in Azure
  • Create a KeyVault
  • Create a Certificate and Store it in the KeyVault
  • Create an Azure AD App registration
  • Store the ClientID in the KeyVault Secrets
  • Grant Azure AD Application permission for SharePoint – Sites.FullControl.All
  • Console code that connects to SharePoint Online.

Setup

Performing all the steps above manually would take a lot of time. Also, where I can automate things, I do. Therefore, in the GitHub project under the PowerShell folder there is a PowerShell file called Install-AzureEnvironment.ps1.

This uses Az CLI; running the below script will create the above for you in your Azure environment. Replace "Contso" with the name of your tenant.

az login
$tenantName = "contso"
# Defaults to UK South
.\Install-AzureEnvironment.ps1 -Environment $tenantName -Name "SharePointMSAL"
# If you wish to change location
#.\Install-AzureEnvironment.ps1 -Environment $tenantName -Name "SharePointMSAL" -Location:'<Location>'

The above will create the following Azure resources (using the example of Contso as the tenant name); a rough sketch of the underlying AZ CLI calls follows the list.

  • Resource Group: Contso-SharePointMSAL
  • App Registration: Contso-SharePointMSAL (Granted with SharePoint > Sites.FullControl.All)
  • Key Vault: Contso-SharePointMSAL (Will be truncated to 24 characters if longer)
  • CertificateName stored in KeyVault Certificates: Contso-SharePointMSAL
  • ClientId stored in KeyVault Secret: ContsoSharePointMSAL
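The script itself is the source of truth, but as a hedged sketch, the kind of AZ CLI calls Install-AzureEnvironment.ps1 makes looks roughly like this (names follow the convention above; the certificate upload and SharePoint permission grant are only summarised in comments):

# Rough sketch only - see the repo's Install-AzureEnvironment.ps1 for the real implementation
$name = "Contso-SharePointMSAL"
az group create --name $name --location uksouth
az keyvault create --name $name --resource-group $name
az keyvault certificate create --vault-name $name --name $name --policy "$(az keyvault certificate get-default-policy)"
$app = az ad app create --display-name $name | ConvertFrom-Json
az keyvault secret set --vault-name $name --name "ContsoSharePointMSAL" --value $app.appId
# The real script also uploads the certificate's public key to the App Registration and
# grants/consents the SharePoint Sites.FullControl.All application permission.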

Console Application

Using Visual Studio Code, I've created a .NET Core 3.1 console application and added the following NuGet packages. Please see my previous blog post "Basic dotnet commands to create C# project in Visual Studio Code" on how to create a console application and add NuGet packages.

  • Microsoft.SharePointOnline.CSOM
    • Used for the SharePoint CSOM calls
  • Microsoft.Identity.Client
    • Used for OAuth authentication
  • Azure.Identity
    • Used for KeyVault authentication
  • Azure.Security.KeyVault.Secrets
    • Used for getting the Secret and Certificate from the vault
  • Microsoft.Extensions.Configuration
    • Used for collecting app.config values
  • Microsoft.Extensions.Configuration.FileExtensions
    • Used for collecting app.config values
  • Microsoft.Extensions.Configuration.Json
    • Used for collecting app.config values

Next you will need to create (or update, if you cloned the GitHub project) the appsettings.json file. Replace the environment and site values with those for your environment.

{
  "environment": "<tenantName>",
  "name": "SharePointMSAL",
  "site": "<relative URL e.g, /sites/teamsite>"
}

Then update the Program.cs file with the following code. The code has been written assuming your Azure resources were created using .\Install-AzureEnvironment.ps1.

using System;
using Microsoft.Identity.Client;
using Microsoft.SharePoint.Client;
using System.Security.Cryptography.X509Certificates;
using System.Threading.Tasks;
using Microsoft.Extensions.Configuration;
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;

namespace SharePointMSAL
{
    class Program
    {
        static async Task Main(string[] args)
        {
            IConfiguration config = new ConfigurationBuilder()
                .AddJsonFile("appsettings.json", true, true)
                .Build();

            string siteUrl = $"https://{config["environment"]}.sharepoint.com{config["site"]}";
            string identity = $"{config["environment"]}-{config["name"]}";
            string keyVaultName = GetKeyVaultName(identity);
            string certificateName = identity;
            string tenantId = $"{config["environment"]}.onmicrosoft.com";
            string clientIDSecret = identity.Replace("_", "").Replace("-", "");
            string clientId = GetSecretFromKeyVault(keyVaultName, clientIDSecret);

            //For SharePoint app only auth, the scope will be the SharePoint tenant name followed by /.default
            var scopes = new string[] { $"https://{config["environment"]}.sharepoint.com/.default" };
            var accessToken = await GetApplicationAuthenticatedClient(clientId, keyVaultName, certificateName, scopes, tenantId);
            var ctx = GetClientContextWithAccessToken(siteUrl, accessToken);

            Web web = ctx.Web;
            ctx.Load(web);
            await ctx.ExecuteQueryAsync();
            Console.WriteLine(web.Title);
        }

        private static string GetKeyVaultName(string identity)
        {
            var keyVaultName = identity;
            if (keyVaultName.Length > 24)
            {
                keyVaultName = keyVaultName.Substring(0, 24);
            }
            return keyVaultName;
        }

        private static async Task<string> GetApplicationAuthenticatedClient(string clientId, string keyVaultName, string certificateName, string[] scopes, string tenantId)
        {
            var certificate = GetAppOnlyCertificate(keyVaultName, certificateName);
            IConfidentialClientApplication clientApp = ConfidentialClientApplicationBuilder
                .Create(clientId)
                .WithCertificate(certificate)
                .WithTenantId(tenantId)
                .Build();

            AuthenticationResult authResult = await clientApp.AcquireTokenForClient(scopes).ExecuteAsync();
            string accessToken = authResult.AccessToken;
            return accessToken;
        }

        public static ClientContext GetClientContextWithAccessToken(string targetUrl, string accessToken)
        {
            ClientContext clientContext = new ClientContext(targetUrl);
            clientContext.ExecutingWebRequest += delegate (object oSender, WebRequestEventArgs webRequestEventArgs)
            {
                webRequestEventArgs.WebRequestExecutor.RequestHeaders["Authorization"] = "Bearer " + accessToken;
            };
            return clientContext;
        }

        public static X509Certificate2 GetAppOnlyCertificate(string keyVaultName, string certificateName)
        {
            var keyVaultUrl = $"https://{keyVaultName}.vault.azure.net";
            var client = new SecretClient(new Uri(keyVaultUrl), new DefaultAzureCredential());
            KeyVaultSecret keyVaultSecret = client.GetSecret(certificateName);
            X509Certificate2 certificate = new X509Certificate2(Convert.FromBase64String(keyVaultSecret.Value), string.Empty,
                X509KeyStorageFlags.MachineKeySet |
                X509KeyStorageFlags.PersistKeySet |
                X509KeyStorageFlags.Exportable);
            return certificate;
        }

        public static string GetSecretFromKeyVault(string keyVaultName, string secretName)
        {
            var keyVaultUrl = $"https://{keyVaultName}.vault.azure.net";
            var client = new SecretClient(new Uri(keyVaultUrl), new DefaultAzureCredential());
            KeyVaultSecret keyVaultSecret = client.GetSecret(secretName);
            return keyVaultSecret.Value;
        }
    }
}

After the code runs it will display the site Title. In my case ‘TestAPISite’.

The important piece of code for getting the certificate from Azure Key Vault is the GetAppOnlyCertificate function on line 78. This uses the new Azure.Identity and Azure.Security.KeyVault.Secrets libraries.

The Azure.Identity information can be found here: https://docs.microsoft.com/en-us/dotnet/api/overview/azure/identity-readme?view=azure-dotnet

The key to authenticating to the KeyVault is on line 82, which uses DefaultAzureCredential; this attempts to connect using several different authentication methods in turn. Once connected, it retrieves the certificate value and creates an X509Certificate2 certificate in memory. The only confusing part of the code is that it uses Azure.Security.KeyVault.Secrets to get the value, not Azure.Security.KeyVault.Certificates.

The list below, taken from the Microsoft documentation, shows how the DefaultAzureCredential will attempt to authenticate via the following mechanisms, in order.

https://docs.microsoft.com/en-us/dotnet/api/overview/azure/identity-readme?view=azure-dotnet#defaultazurecredential
  • Environment – The DefaultAzureCredential will read account information specified via environment variables and use it to authenticate (see the example after this list).
  • Managed Identity – If the application is deployed to an Azure host with Managed Identity enabled, the DefaultAzureCredential will authenticate with that account.
  • Visual Studio – If the developer has authenticated via Visual Studio, the DefaultAzureCredential will authenticate with that account.
  • Visual Studio Code – If the developer has authenticated via the Visual Studio Code Azure Account plugin, the DefaultAzureCredential will authenticate with that account.
  • Azure CLI – If the developer has authenticated an account via the Azure CLI az login command, the DefaultAzureCredential will authenticate with that account.
  • Interactive – If enabled the DefaultAzureCredential will interactively authenticate the developer via the current system’s default browser.
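For instance, to exercise the Environment mechanism locally, you could set the standard Azure.Identity environment variables for an identity that has access to the Key Vault before running the console app (values are placeholders):

# Assumption: these are the variable names EnvironmentCredential reads; fill in your own values
$Env:AZURE_TENANT_ID     = '<tenantId>'
$Env:AZURE_CLIENT_ID     = '<appId>'
$Env:AZURE_CLIENT_SECRET = '<clientSecret>'
dotnet run   # DefaultAzureCredential should now authenticate via the Environment mechanism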

Note: When I first ran my code in Visual Studio Code, I kept getting an authentication issue. It was because I had Visual Studio Enterprise installed on my machine, and the credential was picking up the account selected there, which was pointing to a different tenant. You can see this in Visual Studio Enterprise by going to Tools > Options.

The great thing about DefaultAzureCredential is that if this code were within an Azure Function, I could run it first on my computer, then deploy it to Azure Functions with a Managed Identity, and it would still work without any changes to the code.

I hope you find this blog post useful.

Grant Application and Delegate Permissions using an App Registration


This blog post came about because I wanted a way to create new Application Registrations and grant consent for the tenant, all programmatically. This is so I can use DevOps pipelines to create and deploy my code without any human interaction or using a personal account.

The AZ CLI can grant permissions, but it does not seem to work for admin-consented permissions. I found the following post by Sam Cogan https://samcogan.com/provide-admin-consent-fora-azure-ad-applications-programmatically/ saying that it was possible if you use REST API calls. This did work for me; however, it only covered the Delegated permissions.

Reading through Sam's post helped me understand the connection between Application Registrations, Service Principals and oauth2Permissions, and helped me on the quest of understanding how to grant the Application permissions through appRoleAssignments.

I also want to credit Sahil Malik as I found his post https://winsmarts.com/how-to-grant-admin-consent-to-an-api-programmatically-e32f4a100e9d after I worked it all out myself, and was able to confirm that what I was doing was right.

At my GitHub project https://github.com/pmatthews05/CFAppOnlyGrantPermissions the README.md will walk you through how to set up and run the code. At the end of the README.md you should have two Application Registrations, where the Azure API Registration app will have created the second app (in my case CFCodeApp) for you. This code is idempotent; you can change the permissions of an existing Application Registration by providing it a different Permission.json file.

As the README.md file gives the instructions on how to run the code, I will not replicate them here. I will use the rest of this post to explain how the code works.

Permissions required for ‘Azure API Registration’

To allow the Azure API Registration to create new Application Registrations using the AZ CLI, it needs permissions on both the legacy Azure Active Directory Graph and Microsoft Graph. It seems that some of the commands in the AZ CLI still point to https://graph.windows.net when they make calls; according to some issue notes in the AZ CLI GitHub repository, this is in the process of being changed.

With Azure Active Directory Graph we need two permissions:

  • Application.ReadWrite.All – This allows us to read and write the Application Registrations.
  • Directory.ReadWrite.All – This allows us to read the application registration permission list, and service principal information.

With Microsoft Graph we also need one permission:

  • AppRoleAssignment.ReadWrite.All – This allows us to call the REST API to grant permissions and assign Role assignment permissions.

Steps in the code

  • Set-AppRegistration
  • Set-AppCredentials
  • Set-ServicePrincipalForAppId
  • Remove-CurrentAppPermissions
  • Set-DelegatePermissions
    • Remove-CurrentOauth2PermissionGrants
  • Set-ApplicationPermissions
    • Remove-CurrentServicePrincipalGrants

Please note, the snippets of code I am showing here in the blog, are showing the command(s) that are performing the main action, not the full function.

Set-AppRegistration

We need an App Registration to be created first. If the name already exists, the command just returns the existing App Registration.

#Creates or updates an existing App Registration
az ad app create --display-name "$ApplicationName"

Set-AppCredentials

#Creates a secret with a random password
az ad app credential reset --id $AppId --credential-description 'Registration' --end-date 2299-12-31

This will create a secret for the App Registration, with a random value and the description set to Registration. There are a couple of override parameters that I am not using, where you can give it your own description and provide your own SecureString secret. The command returns the appCredentials, which supply the appId, name, password and tenantId. The script outputs these to the screen at the end; however, in a production environment you would probably want to put the secret value in a KeyVault without displaying it to the user.
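If you did want to push that generated secret straight into a Key Vault instead of displaying it, an illustrative follow-up (the vault name and secret name are placeholders, not part of the repo's script) could be:

# Illustration only: capture the reset output and store the password as a Key Vault secret
$creds = az ad app credential reset --id $AppId --credential-description 'Registration' --end-date 2299-12-31 | ConvertFrom-Json
az keyvault secret set --vault-name '<keyVaultName>' --name "$ApplicationName-Secret" --value $creds.password | Out-Null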

Set-ServicePrincipalForAppId

#Creates a Service Principal for a given Application Registration Id
az ad sp create --id $AppId

All App Registrations require a Service Principal behind them. When you manually create an App Registration and assign permissions, it automatically creates a service principal for you. When you create an App Registration programmatically, it is your responsibility to also create the Service Principal. It is the Service Principal that defines the access policy and permissions for the user/application in the Azure AD tenant. A multi-tenant App Registration has the same appId in all tenants, but each tenant has its own Service Principal, which is what allows access within that tenant. For example, in all tenants the AppId for the Microsoft Graph API is '00000003-0000-0000-c000-000000000000', and in your tenant it has an associated Service Principal, whose object Id is different in your tenant compared to mine. When I finally understood this, it made more sense how this all ties together.

Reference: https://docs.microsoft.com/en-us/azure/active-directory/develop/app-objects-and-service-principals

The above az command is not idempotent, and therefore a check to see if it already exists is required.
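A minimal sketch of such a check (not the repo's exact code) might look like this:

# Sketch: only create the Service Principal if one doesn't already exist for the App Registration
$servicePrincipal = az ad sp list --filter "appId eq '$AppId'" --query '[0]' | ConvertFrom-Json
if (-not $servicePrincipal) {
    $servicePrincipal = az ad sp create --id $AppId | ConvertFrom-Json
}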

Remove-CurrentAppPermissions

To allow idempotency of my script, I wanted to ensure that it removes all existing permissions before adding them back in. This piece of code does not remove the permission from the service principal, and if you stop the code after this command, you will see that your API Permissions in the GUI would look like this.

Seeing the permissions separated at the bottom of the screen makes a lot more sense to me now that I understand the relationship between Application Registrations and Service Principals. The service principal still has access at this point, and calls to these APIs will still work.

#Get all permissions for the Application Registration
$currentPermissionCollection = @(az ad app permission list --id $AppId | ConvertFrom-Json)
#Remove the permissions (resourceAppId) for the Application Registration
$currentPermissionCollection | ForEach-Object {
    $permission = $PSItem
    if ($permission.Count -eq 0) { return }
    $permission.resourceAppId | ForEach-Object {
        $resourceAppId = $PSItem
        az ad app permission delete --id $AppId --api $resourceAppId
    }
}

The code gets a list of all the permissions assigned to the Application Registration, then loops through each resourceAppId (the appId of the API permission's service principal, e.g. Microsoft Graph, SharePoint)*2 and deletes the permission.

Set-DelegatePermissions

To ensure the code is idempotent, the first thing I do is remove the delegate permissions from the Service Principal. See the next section for how this works.

Now we need to assign the delegate permissions to the Application Registration. We do this by providing the AppID of our Application Registration, the API Permission AppID (the appId of the API Permission service principal, e.g. Microsoft Graph)*2 and the oauth2Permissions scope Id*4.

#Get Graph APIServicePrinicpal information
$APIServicePrincipal = az ad sp list --query "[?appDisplayName=='Microsoft Graph'].{appId:appId,objectId:objectId}" --all | ConvertFrom-Json
#Get Directory.ReadWrite.All oauth2Permission
$delegatePermInfo = az ad sp show --id $($APIServicePrincipal.appId) --query "oauth2Permissions[?value=='Directory.ReadWrite.All']" | ConvertFrom-Json
#Add Permission (Scope means Delegate)
az ad app permission add --id $appId --api $($APIServicePrincipal.appId) --api-permissions $($delegatePermInfo.id)=Scope

Next, we need to grant these permissions to the tenant on the Application Registration's associated Service Principal, via oauth2PermissionGrants.

Using Graph API explorer, you can view all the delegate permissions in your tenant using the following URL:

https://graph.microsoft.com/v1.0/oauth2permissiongrants

To find all the permissions grants for your Application Registration you will need the Service Principal Object ID*1 and then use the following URL:

https://graph.microsoft.com/v1.0/oauth2permissiongrants?$filter=clientId eq '<servicePrincipalObjectId>' and consentType eq 'AllPrincipals'

{
  "@odata.context": "https://graph.microsoft.com/v1.0/$metadata#oauth2PermissionGrants",
  "value": [
    {
      "clientId": "8xxxxxx-xxxx-xxxx-xxxx-xxxxxxxx8299",
      "consentType": "AllPrincipals",
      "id": "AxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxWry4",
      "principalId": null,
      "resourceId": "a2adb674-25ec-4bba-ba12-f6fc2f96af2e",
      "scope": "User.Read Group.ReadWrite.All"
    }
  ]
}
  • clientId: This is the Service Principal Object ID that is tied to your App Registration
  • consentType: Set to AllPrincipals when granted to the entire tenant, or Principal when granted to an individual user
  • id: The ID of the oauth2permissiongrants
  • principalId: This is set to null if using AllPrincipals, otherwise it will contain the objectID of the User that has been granted the permission
  • resourceId: This is the Service Principal Object ID value of the API Permission*2
  • scope: This is a string array of granted scope values for the given ResourceId. (e.g User.Read Directory.Read.All etc)

If an oauth2PermissionGrant with the App Registration Service Principal Object ID and the API Permission Service Principal Object ID (clientId and resourceId) doesn't exist in your tenant, then you will need to POST a new oauth2PermissionGrant; otherwise you will need to PATCH the existing oauth2PermissionGrants/<id> with the new string array of scope values.

#Get an access token
$tokenResponse = az account get-access-token --resource-type ms-graph | ConvertFrom-Json
$body = @{
    clientId    = $($servicePrincipal.objectId)
    consentType = "AllPrincipals"
    principalId = $null
    resourceId  = $($APIServicePrincipal.objectId)
    scope       = "User.Read Directory.ReadWrite.All"
    startTime   = "0001-01-01T00:00:00Z"
    expiryTime  = "2299-12-31T00:00:00Z"
}
#$method is 'POST' for a new grant, or 'PATCH' (against .../oauth2Permissiongrants/<id>) for an existing one
$apiUrl = "https://graph.microsoft.com/v1.0/oauth2Permissiongrants"
Invoke-RestMethod -Uri $apiUrl -Headers @{Authorization = "Bearer $($tokenResponse.accessToken)" } -Method $method -Body $($body | ConvertTo-Json) -ContentType "application/json"

You must add a startTime and expiryTime; it does not matter what the datetimes are, as long as expiryTime is later than startTime.

Remove-CurrentOauth2PermissionGrants

To remove the permissions from the Service Principal for the Delegate Permissions, we need to remove the Oauth2PermissionGrants.

Unfortunately, with app-only permissions you cannot delete an oauth2PermissionGrant; you need to access the directory as a person to delete it. I found that setting the scope to an empty string gives the same desired effect as removing it.

#Get all permission grants for the Service Principal ObjectId.
$existingCollection = az ad app permission list-grants --filter "clientId eq '$($ServicePrincipalObjectId)' and consentType eq 'AllPrincipals'" | ConvertFrom-Json
#Get an access token
$tokenResponse = az account get-access-token --resource-type ms-graph | ConvertFrom-Json
$existingCollection | ForEach-Object {
    $existing = $PSItem
    #Get the PermissionGrant
    $apiUrlPatch = "https://graph.microsoft.com/v1.0/oauth2Permissiongrants/$($existing.objectId)"
    $body = @{
        scope = ""
    }
    #Patch with an empty scope.
    Invoke-RestMethod -Uri $apiUrlPatch -Headers @{Authorization = "Bearer $($tokenResponse.accessToken)" } -Method PATCH -Body $($body | ConvertTo-Json) -ContentType "application/json"
}

Please Note: I am using Invoke-RestMethod instead of az rest because I have not been able to get it to work without an error message.

Set-ApplicationPermissions

To ensure the code is idempotent, the first thing I do is remove the application permission grants from the Service Principal. See the next section for how this works.

Now we need to assign the application permissions to the Application Registration. We do this by providing the AppID of our Application Registration, the API Permission AppID (the appId of the API Permission service principal, e.g. Microsoft Graph)*2 and the AppRoles scope Id*3.

#Get Graph APIServicePrinicpal information
$APIServicePrincipal = az ad sp list --query "[?appDisplayName=='Microsoft Graph'].{appId:appId,objectId:objectId}" --all | ConvertFrom-Json
#Get Directory.ReadWrite.All appRolesPermission
$appRolePermInfo = az ad sp show --id $($APIServicePrincipal.appId) --query "appRoles[?value=='Directory.ReadWrite.All']" | ConvertFrom-Json
#Add Permission (Role means Application)
az ad app permission add --id $appId --api $($APIServicePrincipal.appId) --api-permissions $($appRolePermInfo.id)=Role

Next, we need to grant these permissions to the tenant by creating appRoleAssignments on the Application Registration's associated Service Principal.

Using Graph API explorer, you can view all the Application Role Grants for your Application Registration. You will need the Service Principal Object ID*1 and then use the following URL:

https://graph.microsoft.com/v1.0/servicePrincipals/<servicePrincipalObjectId>/appRoleAssignments

"@odata.context": "https://graph.microsoft.com/v1.0/$metadata#servicePrincipals('862b7a01-7bb3-4540-ae4e-a2a84c7d8299&#39;)/appRoleAssignments",
"value": [
{
"id": "AxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxCw",
"deletedDateTime": null,
"appRoleId": "57xxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx04",
"createdDateTime": "2020-06-18T10:45:50.700178Z",
"principalDisplayName": "CFCodeApp",
"principalId": "86xxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx99",
"principalType": "ServicePrincipal",
"resourceDisplayName": "Windows Azure Active Directory",
"resourceId": "5cxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx27"
},
{
"id": "AxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxA8",
"deletedDateTime": null,
"appRoleId": "33xxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx0d",
"createdDateTime": "2020-06-18T09:50:01.3746638Z",
"principalDisplayName": "CFCodeApp",
"principalId": "86xxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx99",
"principalType": "ServicePrincipal",
"resourceDisplayName": "Microsoft Graph",
"resourceId": "a2xxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx2e"
},
{
"id": "AxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxHI",
"deletedDateTime": null,
"appRoleId": "7axxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx61",
"createdDateTime": "2020-06-18T09:49:51.3226179Z",
"principalDisplayName": "CFCodeApp",
"principalId": "86xxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx99",
"principalType": "ServicePrincipal",
"resourceDisplayName": "Microsoft Graph",
"resourceId": "a2xxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx2e"
}
]
}

Unlike oauth2PermissionGrants, where there is only one entry per clientId and resourceId containing all the scopes, with appRoleAssignments there is an entry for each scope, and it uses the appRoleId scope Id*3 instead of the scope string value.

  • id: The ID of the appRoleAssignment
  • principalId: The Service Principal Object ID that is tied to your App Registration
  • resourceId: This is the Service Principal Object ID value of the API Permission*2
  • appRoleId: This is the scope Id*3

#Get an access token
$tokenResponse = az account get-access-token --resource-type ms-graph | ConvertFrom-Json
$body = @{
    principalId = $ServicePrincipal.objectId
    resourceId  = $APIServicePrincipal.objectId
    appRoleId   = $appPermInfo.id
}
$appRoleAssignmentUrl = "https://graph.microsoft.com/v1.0/servicePrincipals/$($ServicePrincipal.objectId)/appRoleAssignments"
Invoke-RestMethod -Uri $appRoleAssignmentUrl -Headers @{Authorization = "Bearer $($tokenResponse.accessToken)" } -Method POST -Body $($body | ConvertTo-Json) -ContentType "application/json"

Remove-CurrentServicePrincipalGrants

To remove the Application Role permissions from the Service Principal, we need to remove the appRoleAssignment IDs for the service principal.

#Get an access token
$tokenResponse = az account get-access-token --resource-type ms-graph | ConvertFrom-Json
#Get all App Role Assignment permissions for the Service Principal ObjectId.
$apiUrl = "https://graph.microsoft.com/v1.0/servicePrincipals/$ServicePrincipalObjectId/appRoleAssignments"
$appRoleAssignmentCollection = @(Invoke-RestMethod -Uri $apiUrl -Headers @{Authorization = "Bearer $($tokenResponse.accessToken)" } -Method GET -ContentType "application/json").value
$appRoleAssignmentCollection | ForEach-Object {
    $appRoleAssignment = $PSItem
    $deleteApiUrl = "$apiUrl/$($appRoleAssignment.id)"
    Invoke-RestMethod -Uri $deleteApiUrl -Headers @{Authorization = "Bearer $($tokenResponse.accessToken)" } -Method Delete -ContentType "application/json"
}

Please Note: I am using Invoke-RestMethod instead of az rest because I have not been able to get it to work without an error message.

Conclusion

That is it. In the Data folder of the GitHub project there is an examplePermission.json file. As you can see, the JSON format makes it easy to add or remove permissions. The name can be the appDisplayName or the AppId of the API Permission.

Run the Add-RegistrationAndGrantPermissions.ps1 script, passing in the name of the App Registration you wish to create/update and your custom permission file. The script will run fine with a user logged in, or app-only with the correct permissions.

Please feel free to use/enhance the github project.

Footnotes

*1 How to find your Service Principal Object ID of your Application Registration

Using the GUI, the quickest way to find your Service Principal Object ID is to first go to the overview of the Application Registration. Then on the right-hand side of the screen, click the name of your Application Registration where Manage application in local directory is.

This will take you to the Service Principal information in your tenant. It is here you can get the ObjectID.

*2 How to find the Service Principal Object Id for the Permission API.

You may already know that the Microsoft Graph API appId is 00000003-0000-0000-c000-000000000000; this is the same in all tenants, but the service principal object ID is different in each tenant. This is the resourceId. To find out what the objectId is in your tenant, run the following script.

az ad sp list --query "[?appId=='00000003-0000-0000-c000-000000000000'].{appDisplayName:appDisplayName,appId:appId,objectId:objectId}" --all --output table
AppDisplayName AppId ObjectId
---------------- ------------------------------------ ------------------------------------
Microsoft Graph 00000003-0000-0000-c000-000000000000 axxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxe
....

Remove "?appId=='00000003-0000-0000-c000-000000000000'" from between the [ ] and it will list all of them for your tenant.
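For example, the unfiltered version of that query, listing every service principal in the tenant, is:

az ad sp list --query "[].{appDisplayName:appDisplayName,appId:appId,objectId:objectId}" --all --output table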

*3 How to find the Application Role Id of a Permission API Scope.

For each Permission API, such as the Microsoft Graph API, there are Application Roles. No matter what tenant you are in, their ids are always the same. The example below gets the Application Role Directory.Read.All from the Microsoft Graph Permission API.

az ad sp show --id '00000003-0000-0000-c000-000000000000' --query "appRoles[?value=='Directory.Read.All']"

Output:

[
  {
    "allowedMemberTypes": [
      "Application"
    ],
    "description": "Allows the app to read data in your organization's directory, such as users, groups and apps, without a signed-in user.",
    "displayName": "Read directory data",
    "id": "7ab1d382-f21e-4acd-a863-ba3e13f7da61",
    "isEnabled": true,
    "value": "Directory.Read.All"
  }
]

*4 How to find the oAuth2Permission Id of a Permission API Scope.

For each Permission API, such as the Microsoft Graph API, there are oauth2Permissions. No matter what tenant you are in, their ids are always the same. The example below gets the Delegated scope Directory.Read.All from the Microsoft Graph Permission API.

az ad sp show --id '00000003-0000-0000-c000-000000000000' --query "oauth2Permissions[?value=='Directory.Read.All']"

Output:

[
  {
    "adminConsentDescription": "Allows the app to read data in your organization's directory, such as users, groups and apps.",
    "adminConsentDisplayName": "Read directory data",
    "id": "06da0dbc-49e2-44d2-8312-53f166ab848a",
    "isEnabled": true,
    "type": "Admin",
    "userConsentDescription": "Allows the app to read data in your organization's directory.",
    "userConsentDisplayName": "Read directory data",
    "value": "Directory.Read.All"
  }
]

Removing External Users fully from a SharePoint Tenancy using PowerShell


This blog post came about because the client I was working for was having problems sharing documents in SharePoint with some external users. It turned out that the user was already in Azure AD as a Contact, which is part of Exchange. This meant that when an internal person attempted to share/invite into SharePoint/MS Teams, it all appeared to work correctly for the external user, but sometimes it didn't. When looking at external users through the admin portal, this external user was showing, but their email address was blank. After speaking with Microsoft, it turns out that because the email address was already found within the tenancy, it creates a uniqueness violation when adding the external user to the Active Directory.

I have been working with Microsoft support regarding this, and the resolution was that this is by design!!??! Only by feeding back on the Office 365 UserVoice "might" this issue be looked at and fixed. See the resolution notes below:

Symptom:
When you invite external users who exist as contacts in your environment, their email does not get populated in their guest user ID which results in them not being able to login to your environment and access the shared data.
Cause:
The issue is coming from a conflict caused by the email address which is already populated for the mail contact.
Resolution:
This is behavior by design as all objects in Azure AD have to be unique.
You cannot have 2 objects with the same email address.

When you invite one of your contacts to your content in O365, it actually creates a completely new guest user object in your environment and since the email address which is supposed to be populated in the email attribute is already in use by the contact, the email address does not get populated.

The only way to resolve this issue at the moment is to eliminate any conflicts that are in place, by removing the conflicting email contact and re-invite the user to your content.
More information:
The best thing I can offer to you is the following:

Please go to our UserVoice portal where other people are facing the same behavior and up-vote it, comment and have the whole IT department do the same as well.

Allow a “Guest User” to be converted to a different account type
https://office365.uservoice.com/forums/273493-office-365-admin/suggestions/19966537-allow-a-guest-user-to-be-converted-to-a-differen

This led me to work on a process and script that would remove the user from everywhere.

Locations to remove the External User from:

  • Contacts
  • Azure AD Guest Users
  • Azure AD Deleted Users
  • All SharePoint Sites
  • All SharePoint Hidden User lists
  • SharePoint User Profile

Contacts

To remove the External User from the contacts you will need to use the MSOL PowerShell module.

$UserEmail = "<ExternalUserEmailAddress>"
Connect-MsolService
Get-MsolContact | ? EmailAddress -eq $UserEmail | Remove-MsolContact -Force

Or you can manually do this by going to admin.microsoft.com and under Users -> Contacts select the user and click Delete contacts.

Azure AD

To remove the External User from Azure AD you will still need the MSOL PowerShell module. In fact, this script and the above script could be merged.

$Environment = "<TenantName>"
$UserEmail = "<ExternalUserEmailAddress>"
Connect-MsolService
$externalConversionEmail = ($UserEmail -replace '@', '_') + "#EXT#@" + $Environment + ".onmicrosoft.com"
$FoundUser = Get-MsolUser | ? UserPrincipalName -eq $externalConversionEmail
if ($FoundUser) {
    Remove-MsolUser -UserPrincipalName $($FoundUser.UserPrincipalName) -Force
    #To see all deleted users: Get-MsolUser -ReturnDeletedUsers
    Remove-MsolUser -UserPrincipalName $($FoundUser.UserPrincipalName) -RemoveFromRecycleBin -Force
    #To remove all deleted users: Get-MsolUser -ReturnDeletedUsers | Remove-MsolUser -RemoveFromRecycleBin -Force
}

To do this manually, in admin.microsoft.com under Users -> Guest Users, select the user and click delete.

Then go into Users -> Deleted users and remove them from there.

Remove from SharePoint

To remove from SharePoint, if you have a large tenancy and you don’t know all the places where the external user could have been shared with, then you will have to use the following script. This script will remove the external user from the SharePoint Site, ensure that they are removed from the User Information list, and then lastly it will clear the person from the SharePoint User Profile.

I discovered that if I didn't remove them from the User Profile, then when I attempted to re-share a document with that user, the people picker would grab the internal userprincipalname (<ExternalUserEmail>#EXT#@<Tenant>.onmicrosoft.com) as the email address and then prevent me from clicking the Sharing button. This is because the people picker uses the Graph API /Me/People and grabs the value from there. Once the user is removed from everywhere, including the User Profile, this no longer happens.

The following script uses the SPO PowerShell module, and you will need to connect first using Connect-SPOService. The account that you use needs to be a SharePoint Global Administrator.

The script checks if it can find the external user and, if it can, removes the user using Remove-SPOExternalUser.

Then it loops through every site collection and looks for the user using Get-SPOUser with the internal userprincipalname. If found, it removes the user using Remove-SPOUser. Once it has looped through all SharePoint sites, it then checks the SharePoint User Profile and removes the user from the User Profile with Remove-SPOUserProfile. This command will remove a user from the User Profile whether they are in "Active Profiles" or "Profiles Missing from Import".

<#
.SYNOPSIS
Loops through the SharePoint sites of the tenant, looking for the external user and removing them.
You need to have already connected to the tenant as a SharePoint Global Administrator using Connect-SPOService -url:https://<tenant>-admin.sharepoint.com
.EXAMPLE
.\Remove-ExternalUserFromTenant.ps1 -Environment:<tenant> -UserEmail:<externalEmailAddress>
#For tenant called Dev34223 and external email address fred.bloggs@outlookdomain.com
.\Remove-ExternalUserFromTenant.ps1 -Environment:Dev34223 -UserEmail:fred.bloggs@outlookdomain.com
#>
[CmdletBinding(SupportsShouldProcess)]
param(
    [Parameter(Mandatory)]
    [string]
    $Environment,
    [Parameter(Mandatory)]
    [string]
    $UserEmail
)
Clear-Host
$sites = Get-SPOSite -Limit ALL
$externalConversionEmail = ($UserEmail -replace '@', '_') + "#EXT#@" + $Environment + ".onmicrosoft.com"
$ErrorActionPreference = 'Stop'
$InformationPreference = 'Continue'

Write-Information -MessageData "Get $UserEmail External User within SharePoint"
$ExtUser = Get-SPOExternalUser -Filter $UserEmail

if ($null -ne $ExtUser) {
    Write-Information -MessageData "Remove $UserEmail within SharePoint"
    Remove-SPOExternalUser -UniqueIDs @($ExtUser.UniqueId) -Confirm:$false
}

$found = $false
$Sites | ForEach-Object {
    $site = $PSItem
    $i = $i + 1
    try {
        Get-SPOUser -Site:$($site.Url) -LoginName:$externalConversionEmail
        Write-Information "Found user $UserEmail in site $($site.Title) Url:$($site.Url)"
        Remove-SPOUser -Site:$($site.Url) -LoginName:$externalConversionEmail
        $found = $true
    }
    catch {
        #User not found.
    }
    Write-Progress -Activity "Removing User - $UserEmail" -Status "Progress:$($site.Url)" -PercentComplete ($i / $Sites.count * 100)
}

if ($found) {
    Write-Information "User $UserEmail removed from SharePoint Sites"
}
else {
    Write-Information "User $UserEmail wasn't found within SharePoint Sites"
}

Write-Information -MessageData "Remove $externalConversionEmail from SharePoint User profile"
try {
    Remove-SPOUserProfile -LoginName $externalConversionEmail
}
catch {
    Write-Information "Unable to find $externalConversionEmail in the user profiles."
}

If the plan is to add the external person back into your tenant once the script has run, you will need to wait at least a few hours (maybe leave it for a day to be sure) to ensure all of Microsoft's back-end processes have completed.

When you share a document/folder with the external user, they will get the invite link and the enter-a-code experience; this way they do not turn up inside your Azure AD. However, if you share a site with them, or add them to an MS Team, they will appear in your Azure AD correctly.

Dive into the code for O365 Audit logs webhooks


This is part two of a 2-part blog post.

  1. Walkthrough Setting up WebHook for O365 Audit Logs
  2. Dive into the code for O365 Audit Log webhooks to see how it works – (This Post)

The previous blog post showed how to get up and running with O365 Audit logs and webhooks. In this blog post I'm going to show and explain the parts of the code that tie everything together.

The full code can be found at my Github repo https://github.com/pmatthews05/O365AuditWebHook

PowerShell to initialize the Webhook to the Audit logs

.\Set-AuditLogs.ps1 -ClientId:<ClientID>
-ClientSecret:<AppSecret>
-TenantDomain:<Tenant>.onmicrosoft.com
-TenantGUID:<Directory ID>
-WebHookUrl:https://<Environment>-auditwebhook.azurewebsites.net/API/AuditWebHook
-ContentType:Audit.SharePoint

Run on one line.

Inside the PowerShell folder (.\O365AuditWebhook\PowerShell) there is a PowerShell file called Set-AuditLogs.ps1. This PowerShell file starts a subscription to the given audit Content Type. This is done by calling:

https://manage.office.com/api/v1.0/{tenant_id}/activity/feed/subscriptions/start?contentType={ContentType}

The above call is a POST and uses the ClientID and Secret to authenticate against the tenant. The body is a JSON object:

{
  "webhook": {
    "authId": "365notificationaad_Audit.Sharepoint",
    "expiration": "",
    "address": "https://environment-auditwebhook.azurewebsites.net/API/AuditWebHook"
  }
}
  • authId – Optional string that will be included as the WebHook-AuthID header in notifications sent to the webhook, as a means of identifying and authorizing the source of the request to the webhook
  • expiration – Optional datetime that indicates the datetime after which notifications should no longer be sent to the webhook. Leaving it empty indicates the subscription will be active for the next 180 days.
  • address – Required HTTPS endpoint that can receive notifications. A test message will be sent to the webhook to validate the webhook before creating the subscription.

When the /start operation is called, the webhook URL specified in the address will be sent a validation notification to validate that an active listener can accept and process notifications.
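The Set-AuditLogs.ps1 script itself lives in the repo rather than being reproduced here, but a condensed, hedged sketch of what it does looks roughly like this (parameter names follow the invocation above; the token is obtained with the client credentials flow against the Office 365 Management API):

# Rough sketch only - see the repo's Set-AuditLogs.ps1 for the real script
$tokenBody = @{
    grant_type    = 'client_credentials'
    resource      = 'https://manage.office.com'
    client_id     = $ClientId
    client_secret = $ClientSecret
}
$token = Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/$TenantDomain/oauth2/token" -Body $tokenBody

$webhookBody = @{ webhook = @{ authId = "365notificationaad_$ContentType"; expiration = ""; address = $WebHookUrl } } | ConvertTo-Json
Invoke-RestMethod -Method Post `
    -Uri "https://manage.office.com/api/v1.0/$TenantGUID/activity/feed/subscriptions/start?contentType=$ContentType" `
    -Headers @{ Authorization = "Bearer $($token.access_token)" } `
    -Body $webhookBody -ContentType 'application/json'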

The Azure Function AuditWebhook found in the O365AuditWebhook.cs file has two parts to it.

string stringvalue = await req.Content.ReadAsStringAsync();
log.LogInformation($"Req.Content {stringvalue}");
try
{
    log.LogInformation("Getting validation code");
    dynamic data = await req.Content.ReadAsAsync<object>();
    string validationToken = data.validationCode.ToString();
    log.LogInformation($"Validation Token: {validationToken} received");

    HttpResponseMessage response = req.CreateResponse(HttpStatusCode.OK);
    response.Content = new StringContent(validationToken);
    return response;
}
catch (Exception)
{
    log.LogInformation("No ValidationCode, therefore process WebHook as content");
}

The first part, as shown above, handles the validation. It looks for a validation code within the content, and if one is found it responds with a 200 (OK) status that includes the validation code.

If an OK is not received back, then the webhook will not be added and the subscription will remain unchanged.

The second part of the AuditWebhook Azure function is explained in the next section.

Webhook handling O365 notifications

After the initial validation, notifications will be sent to the webhook as the content logs become available.

From the first part of the AuditWebHook Azure Function, notifications do not contain a validationCode; this allows us to determine that notifications have been sent, rather than a new subscription being validated.

The content of these notifications contains an array of one or more JSON objects that represent the available content blobs.

log.LogInformation($"Audit Logs triggered the webhook");
string content = await req.Content.ReadAsStringAsync();
log.LogInformation($"Received following payload: {content}");
List<AuditContentEntity> auditContents = JsonConvert.DeserializeObject<List<AuditContentEntity>>(content);
foreach (var auditcontent in auditContents)
{
if (AuditContentUriQueue == null)
{
string cloudStorageAccountConnectionString = System.Environment.GetEnvironmentVariable("AzureWebJobsStorage", EnvironmentVariableTarget.Process);
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(cloudStorageAccountConnectionString);
CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();
AuditContentUriQueue = queueClient.GetQueueReference("auditcontenturi");
await AuditContentUriQueue.CreateIfNotExistsAsync();
}
log.LogInformation($"Content Queue Message: {auditcontent.ContentUri}");
AuditContentQueue acq = new AuditContentQueue
{
ContentType = auditcontent.ContentType,
ContentUri = auditcontent.ContentUri,
TenantID = auditcontent.TenantId
};
string message = JsonConvert.SerializeObject(acq);
log.LogInformation($"Adding a message to the queue. Message content: {message}");
await AuditContentUriQueue.AddMessageAsync(new CloudQueueMessage(message));
log.LogInformation($"Message added")
}
return new HttpResponseMessage(HttpStatusCode.OK);

Line 5 of the above code shows where I deserialize the JSON content (the notifications) into a list of AuditContentEntity objects.

The notification/AuditContentEntity contains the following (a minimal class sketch follows the list):

  • tenantId – The GUID of the tenant to which the content belongs
  • clientId – The GUID of your application that created the subscription
  • contentType – Indicates the content type
  • contentId – An opaque string that uniquely identifies the content
  • contentUri – The URL to use when retrieving the content
  • contentCreated – The datetime when the content was made available
  • contentExpiration – The datetime after which the content will no longer be available for retrieval.
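
The repo defines an AuditContentEntity class for this payload. I have not copied it here, but a minimal sketch matching the fields above might look like the following (the wire format uses camelCase, hence the JsonProperty attributes; the repo's actual class may differ):

using System;
using Newtonsoft.Json;

// Minimal sketch of an entity for the notification payload - the repo's actual class may differ.
public class AuditContentEntity
{
    [JsonProperty("tenantId")]
    public string TenantId { get; set; }

    [JsonProperty("clientId")]
    public string ClientId { get; set; }

    [JsonProperty("contentType")]
    public string ContentType { get; set; }

    [JsonProperty("contentId")]
    public string ContentId { get; set; }

    [JsonProperty("contentUri")]
    public string ContentUri { get; set; }

    [JsonProperty("contentCreated")]
    public DateTime ContentCreated { get; set; }

    [JsonProperty("contentExpiration")]
    public DateTime ContentExpiration { get; set; }
}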

At this point you do not have any log information; you just have a collection of contentUri values which, when called, will provide you with the logs. To ensure that the webhook responds quickly so that it can continue to handle incoming requests, we place the contentUri, contentType, and TenantId onto an Azure Storage Queue. This allows a different Azure Function to handle getting the actual logs.

Lines 9-16 will set up the storage queue if it doesn’t exist.

Lines 19-26 prepare my queue object and serialize it to a JSON string.

Line 28 adds the message to the Azure Storage Queue.

Once all notifications/AuditContentEntity items have been processed, a 200 (OK) status is passed back. The subscription that calls our webhook is waiting for an OK response. If it encounters a failure, it has a built-in retry mechanism that will exponentially increase the time between retries. If the subscription continues to receive failure responses, it can disable the webhook and stop sending notifications. The subscription will need to be started again to re-enable the disabled webhook.

Processing the Storage Queue AuditContentUri

As items are put on the Storage Queue the Azure Function AuditContentUri found in the O365AuditWebhook.cs file fires.

string token = await AcquireTokenForApplication();

var uri = auditContentQueue.ContentUri;

do
{
    uri = uri.Contains("?") ? $"{uri}&PublisherIdentifier={auditContentQueue.TenantID}" : $"{uri}?PublisherIdentifier={auditContentQueue.TenantID}";
    log.LogInformation($"URL:{uri}");
    var results = await RestAPI.GetRestDataAsync(uri, token);
    var array = JArray.Parse(results.RestResponse);

    foreach (var logEntry in array)
    {
        log.LogInformation(logEntry.ToString());
    }

    uri = results.WebHeaderCollections.Get("NextPageUri");
} while (uri != null);

First you need an authorization token to read the audit logs; we obtain this with the AcquireTokenForApplication method. This uses the Tenant name, ClientId and Secret that are stored within your Azure Function application settings. See 'How to acquire token for application?' below.

It grabs the ContentUri and then goes into a do loop. This is because, on a very busy tenant, not all the logs will be returned in a single response; instead there will be a NextPageUri value in the header of the response to allow you to obtain the next page of logs.

Line 7 – This adds your tenant ID to the end of the URI as a PublisherIdentifier. This parameter is used for throttling the request rate, so make sure it is specified in all issued requests to get a dedicated quota; all requests received without this parameter will share the same quota. The conditional ensures it is appended to the URI correctly whether or not a query string already exists.

Line 9 – This calls the ContentUri and gets back the results and the response headers. You can see the file .\O365AuditWebHook\AuditWebHook\Utilities\RestAPI.cs
The method GetRestDataAsync is very similar to the GetRestData call you find within the PnP Core code: it creates an HttpWebRequest, passes in the authorization token, and calls the ContentUri. The only difference in my code is that I grab the response headers to find out if there are additional pages of logs, and pass them back with the results. A rough sketch of such a method is shown below.
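
As a simplified illustration (my own sketch, not the repo's exact code), a GetRestDataAsync-style helper that returns both the response body and the response headers could look something like this:

using System.IO;
using System.Net;
using System.Threading.Tasks;

// Sketch only: call the ContentUri with the bearer token and hand back both the body
// and the response headers, so the caller can look for NextPageUri.
public class RestResult
{
    public string RestResponse { get; set; }
    public WebHeaderCollection WebHeaderCollections { get; set; }
}

public static class RestAPISketch
{
    public static async Task<RestResult> GetRestDataAsync(string url, string accessToken)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Method = "GET";
        request.Headers.Add("Authorization", $"Bearer {accessToken}");

        using (var response = (HttpWebResponse)await request.GetResponseAsync())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            return new RestResult
            {
                RestResponse = await reader.ReadToEndAsync(),
                WebHeaderCollections = response.Headers
            };
        }
    }
}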

Line 10 – This parses the results into a JArray (JSON array object). Here you can manipulate what comes back; for example, instead of grabbing all results and displaying them, you can query the results for a particular log type.

In the example code below (which uses the Audit.General logs), it grabs any logs that are of RecordType 25 (indicating a Microsoft Teams event) where the operation is creating a new channel and the channel type is Private. I then convert the JArray to an object list of AuditGeneralEntity.
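
That example filter is not part of the function shown above, so here is a minimal sketch of it; it could sit in place of the simple foreach loop. The RecordType and Operation properties come from the common audit schema, the "ChannelAdded" operation name and "ChannelType" property should be checked against the Teams schema linked below, and AuditGeneralEntity stands in for a class you would define yourself with the properties you care about.

// Illustrative filter over the parsed JArray - not the repo's exact code.
// Requires System.Linq and Newtonsoft.Json.Linq.
List<AuditGeneralEntity> privateChannelCreations = array
    .Where(logEntry => (int?)logEntry["RecordType"] == 25              // 25 = Microsoft Teams event
                    && (string)logEntry["Operation"] == "ChannelAdded" // verify against the Teams schema
                    && (string)logEntry["ChannelType"] == "Private")
    .Select(logEntry => logEntry.ToObject<AuditGeneralEntity>())
    .ToList();

foreach (var entry in privateChannelCreations)
{
    log.LogInformation($"Private channel created by {entry.UserId} at {entry.CreationTime}");
}

// Minimal stand-in for AuditGeneralEntity, holding just the properties you care about
// from the audit schema.
public class AuditGeneralEntity
{
    public DateTime CreationTime { get; set; }
    public string UserId { get; set; }
    public string Operation { get; set; }
    public string ObjectId { get; set; }
}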

Further details about the properties of the audit logs can be found here: https://docs.microsoft.com/en-us/microsoft-365/compliance/detailed-properties-in-the-office-365-audit-log

Line 14 – Logs out an individual log entry in JSON format. The different schemas can be found here: https://docs.microsoft.com/en-gb/office/office-365-management-api/office-365-management-activity-api-schema

Line 17 – If there are any additional pages, the NextPageUri header will return a value, and the loop will continue until no more pages are found.

How to acquire token for application?

In the previous section, I called a method named AcquireTokenForApplication. This is a helper class and method that I use quite often when I need to obtain an access token. You can find it in the repo at .\O365AuditWebHook\AuditWebHook\Utilities\AuthenticationHelper.cs. This solution has a cut-down version of the helper class I use; it is cut down because it just gets an access token for the audit logs using the AppId and Secret.

internal static async Task<string> AcquireTokenForApplication()
{
    var tenant = System.Environment.GetEnvironmentVariable("Tenant", EnvironmentVariableTarget.Process);
    var clientId = System.Environment.GetEnvironmentVariable("ClientId", EnvironmentVariableTarget.Process);
    var secret = System.Environment.GetEnvironmentVariable("Secret", EnvironmentVariableTarget.Process);
    var authorityUri = $"https://login.microsoftonline.com/{tenant}.onmicrosoft.com";
    var resourceUri = "https://manage.office.com";
    var microsoftToken = await GetTokenRetry(resourceUri, authorityUri, clientId, secret);
    return microsoftToken;
}

private static async Task<string> GetTokenRetry(string resourceUri, string authorityUri, string clientId, string secret, int retryCount = 5, int delay = 500)
{
    ...
    AuthenticationContext authContext = new AuthenticationContext(authorityUri, false);
    ClientCredential clientCred = new ClientCredential(clientId, secret);
    var authenticationResult = await authContext.AcquireTokenAsync(resourceUri, clientCred);
    token = authenticationResult.AccessToken;
    return token;
    ...
}

Above is a snippet; as you can see, the token acquisition is wrapped in a retry method in case there is throttling.
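
The full GetTokenRetry method is in the repo; purely as an illustration of the pattern (my own sketch, not the repo's exact code), a simple retry wrapper with a growing delay could look like this:

using System;
using System.Threading.Tasks;
using Microsoft.IdentityModel.Clients.ActiveDirectory;

internal static class AuthenticationHelperSketch
{
    // Sketch of a retry wrapper around the ADAL token call. If the call fails (for example
    // due to throttling) it waits, doubles the delay, and tries again until retries run out.
    private static async Task<string> GetTokenRetry(string resourceUri, string authorityUri, string clientId, string secret, int retryCount = 5, int delay = 500)
    {
        while (true)
        {
            try
            {
                AuthenticationContext authContext = new AuthenticationContext(authorityUri, false);
                ClientCredential clientCred = new ClientCredential(clientId, secret);
                var authenticationResult = await authContext.AcquireTokenAsync(resourceUri, clientCred);
                return authenticationResult.AccessToken;
            }
            catch (Exception) when (retryCount > 0)
            {
                await Task.Delay(delay);
                retryCount--;
                delay *= 2;
            }
        }
    }
}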

PowerShell to stop the Audit logs

Within the PowerShell folder I have also included a file called Remove-AuditLogs.ps1

.\Remove-AuditLogs.ps1 -ClientId:<ClientID>
-ClientSecret:<AppSecret>
-TenantDomain:<Tenant>.onmicrosoft.com
-TenantGUID:<Directory ID>
-WebHookUrl:https://<Environment>-auditwebhook.azurewebsites.net/API/AuditWebHook
-ContentType:Audit.SharePoint

Run on one line.

This works exactly like the Set-AuditLogs.ps1 file except it calls the /stop endpoint:

https://manage.office.com/api/v1.0/{tenant_id}/activity/feed/stop?contentType={ContentType}

Once the subscription is stopped, no notifications will be sent to your webhook, and you will not be able to retrieve available content. Please note, if you decide to start the subscription again later using Set-AuditLogs.ps1, you will not receive any content that was available between the stop and start times of the subscription.

 

This is quite a heavy post; I hope it has helped you in some way. It is just a starter, as you will probably want to do something with the logs instead of just writing them out to the Azure Function logs. Maybe you want to capture a given process and then implement some logic to react to it. You might also want to put the ContentUri for different audit content types onto different Azure Storage queues, so that different Azure Functions can process them.

Setting up Webhook for O365 Audit logs


This is part one of a 2-part blog post.

  1. Walkthrough Setting up WebHook for O365 Audit Logs – (This Post)
  2. Dive into the code for O365 Audit Log webhooks to see how it works

In this blog post I'm going to show you how to get the O365 Audit logs using webhooks. The full code can be found at my Github repo https://github.com/pmatthews05/O365AuditWebHook. This post will show you how to set it up, with screenshots and the expected results. In my next blog post I will dive into the important parts of the code that get this audit webhook connected and working.

Set up – Walkthrough

Creating an App Only Token

Once you have downloaded a copy from my Repo you will need to set up your environment. First thing we are going to do is create an App Only Token that will be able to read the Audit Logs.

  • For your Office 365 Tenant go to https://portal.azure.com
  • Select Active Directory
  • Select App Registrations
  • Click Create New Registration
    • Name: Audit Logs Retrieval
    • Supported Account types:
      Accounts in this organizational directory only
    • Click Register
  • Take a copy of the Application (client) ID
  • Take a copy of the Directory (tenant) ID
  • Click View API Permissions
  • Click Add a Permission
  • Select Office 365 Management APIs -> Application Permissions -> ActivityFeed.Read
  • Click Add permissions

  • Click Grant Admin Consent for [tenant] and accept the permissions.
  • Click on Certificates & Secrets
  • Click New Client Secret
    • Description: Audit Web Hook
    • Expires: Never
  • Take a copy of the Secret value

Setting up Azure

You will need to set up your Azure Environment, this will consist of the following:

  • Resource Group
  • Azure Function V1
  • Applications Insights
  • Storage Account

I like to automate where I can; it also saves me creating loads of screenshots which would probably all be out of date after 2 months. I have written an Az CLI PowerShell script that will create the above for you in your Azure environment. In the next blog post I will explain the code.

  • Download the latest version of Az Cli.
  • Using a PowerShell window – Sign into your Azure Environment using ‘az login’
  • If you have multiple subscriptions, ensure you are pointing to the correct subscription: 'az account set --subscription [SubscriptionName]'
  • Change the directory to .\O365AuditWebhook\powershell
  • Run the following: '.\Install-AzureEnvironment.ps1 -Environment "[Environment]" -Name:"AuditWebHook"', replacing [Environment] with your tenant name. For example, I've used cfcodedev.
  • Once the script has run, you will have the basic template Azure resources you need within the Resource group named [Environment]-AuditWebHook

Deploying Azure Function from Visual Studio 2019

Firstly, you don't have to deploy this way. If you prefer to use Visual Studio Code, create an Az CLI install script, or manually deploy using Kudu, that is your choice, and all are valid. My choice here is for the simplicity of screenshots and steps.

  • Open the solution using Visual Studio 2019
  • Right click on the project AuditWebHook and select Publish
  • From the Pick a publish target dialog (click Start if you are not seeing a dialog), and under Azure Functions Consumption Plan click Select Existing, and select Create Profile.
  • Sign into your account if you need to, then pick your subscription, resource group, and then you can either search, or just pick the Azure Function. Click OK.
  • This takes you back to the Summary page. Under Actions click Edit Azure App Service settings
  • The Application Settings dialog will show you the values Local and what is found within Azure Function in the cloud. You will need to update the Remote value for the following:
    • FUNCTIONS_EXTENSION_VERSION: ~1
  • You will need to add the following Settings, by clicking on Add Setting creating the setting name, and put the value in afterwards. Repeat for each setting below.
    • Tenant: [Name of your Tenant, do not include .onmicrosoft.com]
    • ClientId: [Client ID created in step ‘Creating an App Only Token’ earlier]
    • AppSecret: [Secret Value created in step ‘Creating an App Only Token’ earlier]
  • Click OK
  • Back on the Publish screen, click the Publish button. This will push the code to your environment, with the correct Application Settings.
  • By going to your Azure Function at portal.azure.com, you will see 2 Azure Functions
  • Then clicking on Configuration will take you to the Application settings page; click Show values and you will see your values.

At this point you just have the Azure function as a Webhook in place. Next steps are to tie the O365 Audit log to the WebHook.

Connecting O365 Audit Logs to your webhook

The last step is tying the Audit logs to your webhook. The webhook can be used for the different Audit logs. There are 5 different types of logs.

  1. Audit.AzureActiveDirectory
  2. Audit.Exchange
  3. Audit.SharePoint
  4. Audit.General
  5. DLP.All -Note: DLP sensitive data is only available in the activity feed API to users that have been granted “Read DLP Sensitive Data” permission.

I have written a PowerShell script that will register the webhook for you. You will find this in the repo.

  • Open PowerShell
  • Change the directory to .\O365AuditWebhook\powershell
  • Run the following PowerShell script (Run on one line), change the parameters to match your environment. I’ve picked Audit.SharePoint, but you can use any listed above, and run the PowerShell script multiple times to connect all logs to the webhook.

.\Set-AuditLogs.ps1 -ClientId:<ClientID>
-ClientSecret:<AppSecret>
-TenantDomain:<Tenant>.onmicrosoft.com
-TenantGUID:<Directory ID>
-WebHookUrl:https://<Environment>-auditwebhook.azurewebsites.net/API/AuditWebHook
-ContentType:Audit.SharePoint

The above code logs in with the ClientID and Secret and starts a subscription to the given ContentType, using the WebHookUrl for the webhook.

If successful, you will receive a 200 Status Code message like below.

Your Azure Function (AuditWebHook) would have fired, and you would see something like the following within your logs.

Viewing the results

Taken directly from the Microsoft page on the Office 365 Management API, it states in this note:

When a subscription is created, it can take up to 12 hours for the first content blobs to become available for that subscription. The content blobs are created by collecting and aggregating actions and events across multiple servers and datacenters. As a result of this distributed process, the actions and events contained in the content blobs will not necessarily appear in the order in which they occurred. One content blob can contain actions and events that occurred prior to the actions and events contained in an earlier content blob. We are working to decrease the latency between the occurrence of actions and events and their availability within a content blob, but we can’t guarantee that they appear sequentially.

If you are using a development environment – like me – and have set up the Audit.SharePoint content type, then I suggest you go into SharePoint and start using it, just so the logs start to fill.

Please note, it can take up to 30 minutes or up to 24 hours after an event occurs for the corresponding audit log entry to be displayed in the search results, depending on the Office 365 service. See the table at the bottom of the 'Before you begin' section of Search the audit log in security and compliance.

Viewing the AuditWebHook azure function, you will see that it has fired more times since your initial setup.

If you look at your latest call (note: logs can display out of order in Azure Functions), you will see that it attempts to find the validation code, which is what it needs to set up the webhook. When it is unable to find the validation code, the code assumes that the content contains log information. It grabs the URI of the log that has been created and adds it to our Azure Storage Queue for our other Azure Function to process. Depending on how busy your environment is, this request could hold multiple URLs to logs. A webhook has to respond quickly back to the calling code with a 200 status code; therefore we add the URIs of the logs directly to a Storage Queue to allow a different process to interrogate the logs.

The second Azure Function (AuditContentUri) will fire every time an item lands on the Storage Queue. This will grab the information from within the log file by calling the URI.

If we select one of the calls and view the logs of that Azure Function call, every entry within that audit log file URI will be displayed in JSON format. Clicking on a row in the logs will display the full details of the line. This point in the code is where you would process the line and do whatever you need to do with the audit log entry. I'm just printing it out to the Azure Function logs.

Remove O365 Audit Logs from your webhook

To remove the webhook from the Audit log just run the following PowerShell script. You will find this in the repo.

  • Open PowerShell
  • Change the directory to .\O365AuditWebhook\powershell
  • Run the following PowerShell script (run on one line), changing the parameters to match your environment. I've picked Audit.SharePoint, but you can use any listed above, and run the PowerShell script multiple times to remove all content types from the webhook.

The below code logs in with the ClientID and Secret and stops the subscription for the given ContentType.

.\Remove-AuditLogs.ps1 -ClientId:<ClientID>
-ClientSecret:<AppSecret>
-TenantDomain:<Tenant>.onmicrosoft.com
-TenantGUID:<Directory ID>
-WebHookUrl:https://<Environment>-auditwebhook.azurewebsites.net/API/AuditWebHook
-ContentType:Audit.SharePoint

Hopefully, if you have followed this correctly (and I have written decent enough instructions for you), you should have a basic audit log webhook working in your environment. This isn't anywhere near production-ready code, but it gives you an idea of where to start. In my next blog post I will be going through parts of the code to explain how it all fits together.

Connecting to Azure Devops with a Service Principal



I see a lot of blogs and examples on the internet that show you how to connect to environments using a username and password. This is all well and good for testing, but I believe it is bad for real world scenarios.

I'm a contractor, and my time at a job is defined; after I leave the contract and move on to the next one, the company should disable my account. What happens next? Everything I have built using a username/password stops working. Yes, I could argue it ensures I get a call back, but most contracts I've been involved in have a clause that any bugs found will be fixed for free up to 3-6 months after the job is done. Also, I like to leave a place with them thinking "That guy is awesome, let's get him back for the next project!".

This post is going to show you how to set up a Service Principal for your Azure DevOps CI/CD. At the end, I will give a very basic deployment that creates a Resource Group in Azure. Please note, this example shows how to set things up when your Azure DevOps organisation is not part of the same directory as your Azure resource tenant; when it is part of the same tenant, Azure DevOps can create the service connection for you automatically.

First, we need to start in Azure and create a Service Principal. The Service Principal will need to be a contributor on the Subscription or the Resource group that your Devops project is going to manage.

Create Service Principal

  • Open Azure Portal
  • Navigate to Azure Active Directory
  • Click App registration
  • Click New Registration
    • Name: Devops-<Company>-<ProjectName> (E.g, Devops-CFCode-OperationsDemo)
    • Supported account types: Accounts in this organizational directory only (Single Tenant)
    • Redirect URI: (Leave blank)
    • Click Register
  • Make a note of Application (client) ID and your Directory (tenant) ID.

Create a Secret for the Service Principal

  • In the App Registration for the above app, click Certificates & secrets.
  • Under Client secrets, click New client secret
    • Description: DEVOPS
    • Expires: Never
    • Click Add
  • Make note of the secret

Assign Service Principal permission to Subscription

  • Open Azure Portal
  • Navigate to Subscriptions and select your subscription
  • Click Access control (IAM)
  • Click Add -> Add role assignment
    • Role: Contributor
    • Assign access to: Azure AD user, group, or service principal
    • Select: <Name of service Principal>
    • Click Save
  • From the Overview blade, grab the Subscription ID and Subscription Name.

You can also add API permissions, such as Graph, and then make direct calls to the Graph API from PowerShell with this service principal within the pipeline. Now this side has all been set up, we can head over to our DevOps organisation.

Create Service Connection in Devops

  • Go into your project.
  • At the bottom left of your screen click Project Settings
  • Within Project settings, underneath Pipelines click Service connections*. If you have a star next to the Service connections word, it means that you are viewing the preview version. I’m going to show the following screens using a preview version.
  • Click Create service connection
  • Select Azure Resource Manager, click Next
  • Select Service principal (manual), click Next
  • On the New Azure service connection blade, (replace values with your values you grabbed earlier)
    • Environment: Azure Cloud
    • Scope Level: Subscription
    • Subscription Id: <SubscriptionID>
    • Subscription Name: <Subscription Name>
    • Service Principal Id: <Application (client) ID>
    • Credential: Service principal key
    • Service Principal Key: <Secret>
    • Tenant ID: <Directory (tenant) ID>
    • Details (This section is your choice)
      • Service connection name: <Name of Tenant>-<SubscriptionName>
      • Description:
    • Security: Tick – Grant access permission to all pipelines.
  • Click Verify, a Verification Succeeded should show if all the details are correct, and the service account has permission.
  • Click Verify and save

The Service Principal is now connected.

Proving the Service Principal connections works

Within your DevOps project, click on Pipelines and select Releases. We are going to create a Resource Group within our subscription.

  • Add an Azure CLI task.
    • Task version: 2*
    • Azure Resource Manager connection: Pick the subscription you have created in the previous section.
    • Script Type: PowerShell
    • Script Location: Inline script
    • Inline script: az group create --name "Demo-rg" --location uksouth
  • Click Save and create a release. After the release has run, and you have received success, an empty resource group should have been created within the subscription.

AZ CLI putting message on a storage queue – not a valid Base-64 string


I have been using Az CLI a lot recently; it has made interacting with Azure from PowerShell so much easier.

https://docs.microsoft.com/en-us/cli/azure/install-azure-cli?view=azure-cli-latest

I had to write a simple PowerShell script that added items to a Storage queue. Once the items were added to a queue an Azure function picked up the items and processed them.

According to the documentation of Azure CLI you need to use az storage message put.

#Login
az login
#Get the connection string
$connectionString = az storage account show-connection-string --name "mystorageaccountname" --resource-group "MyResourceGroup" --query connectionString | ForEach-Object { $PSItem -join '' } | ConvertFrom-Json
#Create message
$message = "{""Name"":""Paul"",""LastName"":""Matthews"",""ID"":""d4b2ffc9-4380-46e4-a0bf-8a9ca58734d2""}"
#Add item to the queue
az storage message put --content $message --queue-name "myQueueName" --connection-string $connectionString

This fires successfully and I can see the item on the queue. However, when my Azure Function starts to run, I get the error message:

The input is not a valid Base-64 string as it contains a non-base 64 character, more than two padding characters, or an illegal character among the padding characters.

To solve this problem you just need to Base64 encode the message string before putting it on the queue.

#Login
az login
#Get the connection string
$connectionString = az storage account show-connection-string --name "mystorageaccountname" --resource-group "MyResourceGroup" --query connectionString | ForEach-Object { $PSItem -join '' } | ConvertFrom-Json
#Create message
$message = "{""Name"":""Paul"",""LastName"":""Matthews"",""ID"":""d4b2ffc9-4380-46e4-a0bf-8a9ca58734d2""}"
#Encode to bytes
$b = [System.Text.Encoding]::UTF8.GetBytes($message)
#Convert to Base64String
$message64Base = [System.Convert]::ToBase64String($b)
#Add item to the queue
az storage message put --content $message64Base --queue-name "myQueueName" --connection-string $connectionString

Amazingly, the value you push up to the queue is something like below:

“eyJOYW1lIjoiUGF1bCIsIkxhc3ROYW1lIjoiTWF0dGhld3MiLCJJRCI6ImQ0YjJmZmM5LTQzODAtNDZlNC1hMGJmLThhOWNhNTg3MzRkMiJ9”

But it appears on the queue correctly. This is because the Azure Functions queue trigger (via the .NET storage SDK) expects the message content on the queue to be Base64 encoded, and decodes it back to the original JSON string before handing it to your function; the unencoded message is what caused the error above.
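
As a purely hypothetical example (not part of the original script or function app), a queue-triggered function bound to the same queue would receive the already decoded JSON as a plain string:

using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ProcessQueueItem
{
    // Hypothetical queue-triggered function. The trigger Base64-decodes the message,
    // so 'message' contains the original JSON payload.
    [FunctionName("ProcessQueueItem")]
    public static void Run(
        [QueueTrigger("myqueuename", Connection = "AzureWebJobsStorage")] string message,
        ILogger log)
    {
        log.LogInformation($"Queue item received: {message}");
    }
}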

Branding your Office 365 sign-in pages


Are you fed up that, when you go to sign into your SharePoint Online sites, you see the default picture that Microsoft provides? Wouldn't it be nice if the sign-in page matched your branding and gave your users a consistent look and feel? Well, this article is going to show you how to do that.

Note: A branded sign-in page only appears when you visit a service with a tenant-specific URL such as https://outlook.com/contoso.com. When you visit a service with a non-tenant-specific URL (e.g. https://myapps.microsoft.com), a non-branded sign-in page appears until you have entered your User ID.

The following screen shot shows an example of the Office 365 sign-in page on a desktop after a customisation:

The following screen shot shows an example of the Office 365 sign-in page on a mobile device after customisation:

What can you customise?

On the screenshot below I have highlighted the areas that you can customise.

  1. Large Image / Background colour – You can change the image, or show a background colour which will be used in place of the image on low bandwidth or narrow screens.
  2. Logo – Your logo can be shown at the top right of the screen instead of the Office 365 logo.
  3. Sign-in Page Text – Although not showing on the above picture, you can supply sign-in page text. This text could be used to display a legal statement, simple instructions, or even contact information for your help desk.

SharePoint Online Tenant

To customise your sign-in page, you need to do this through Azure AD. If you have just a SharePoint Online tenant on the *.onmicrosoft.com domain and have never used Azure, you will find that you cannot get into Azure. Luckily this isn't too much of a problem if you have a credit card. (Don't worry, it doesn't cost any money.)

If you go to https://manage.windowsazure.com or https://portal.azure.com and attempt to sign in with your account, that you use for SharePoint online, you will see the screen below.

Chris O'Brien's blog explains it further here (http://www.sharepointnutsandbolts.com/2014/04/using-azure-instance-behind-your-office-365-tenant.html), but basically you just need to click on "Sign up for Windows Azure", then follow the instructions and enter your credit card details. (Again, it doesn't cost you anything.) It gives you a pay-as-you-go Azure instance.

Configuring your directory with company branding

  • Sign into your Azure classic portal (https://manage.windowsazure.com) as an administrator of the directory you want to customise, and select your directory.
  • Along the menu/toolbar list, Click Configure.
  • Under Directory properties click Customize Branding.
  • Modify the elements listed below. All fields are optional. See below for screenshots and details of all customisable elements.
    • Banner Logo
    • Sign-in Page Text
    • Sign-in Page Illustration
    • Sign-In Page Background colour.
  • Click Save.

Note: If you have applied changes to your sign-in page, it can take up to an hour for the changes to appear. Mine happened within a few minutes.

Any time you wish to change your customisation, just go back and click the Customize Branding button again. It is also here where you can add different branding settings for a specific language.

Different Branding for different languages

After you have configured your default branding settings, go back and click the Customize Branding button again. The first screen you are presented with lets you either change your existing settings (if you have just followed this blog, you will only see Default here) or add branding settings for a specific language. Select the language, then click the arrow button, and upload pictures/text as you did before. Once set, this branding will only show for the given browser language.

Customisable elements details

Below you will find the screen shot of the customising branding wizard with descriptions that you would find if you click the help tool tip icon.

  • Banner Logo (60 x 280 pixels) – The banner logo is displayed on the Azure AD sign-in page, when users sign in to cloud application that use this directory. It’s also used in the Access Panel service.
    • Max pixel size: 60px by 300px
    • Recommended to keep under 30 pixels high to avoid introducing scrollbars on mobile devices.
    • Recommended file size: 5-10kb
    • Use a PNG image with a transparent background if possible.
    • Avoid using a logo with small text on it, as the image may be resized to fit smaller screens.
  • Square Logo (240 X 240 pixels) – The square logo (previously referred to as “Title Logo”) is used to represent user accounts in your organization, on Azure AD web UI and in Windows 10.
    • Max pixel size: 240px by 240px
    • Recommended file size: 5-10kb
    • Use a PNG image with a transparent background if possible.
    • Avoid using a logo with small text on it, as the image may be resized to fit smaller screens.
  • Square Logo, Dark Theme (240 x 240 pixels) – If configured, this image will be used instead of the “Square Logo” image in combination with dark backgrounds, such as Windows 10 Azure AD Joined screens in the out-of-box experience.
    • If your logo already looks good on white and on dark blue/black backgrounds, there’s no need to configure a separate Dark Theme logo.
  • User ID Placeholder – This will replace “someone@example.com” that’s shown as a hint in the user ID input field on the Azure AD login page.
    • Important: you should only configure this if you only support internal users. If you expect external users to sign in to your app(s), we recommend you leave this blank (Azure AD will show “someone@example.com”).
  • Sign-In Page Text Heading – Add a heading above your customized sign-in page text. If not configured, this space is left blank on Azure AD web login pages, and replaced by “Need help” on Azure AD Join experience on Windows 10.
    • Plain text only.
    • Don’t exceed 30 characters.
  • Sign-In Page Text – This text appears at the bottom of the Azure AD sign in page, on the web, in apps and in the Azure AD Join experience on Windows 10. Use this space to convey instructions, terms of use and help tips to your users.
    • Plain text only.
    • Can’t be longer than 500 characters (250-300 characters recommended).
    • Remember, anyone can see your login page so you shouldn’t use this space to convey sensitive info!

  • Sign-In Page Illustration – This large image is displayed on the side of the Azure AD sign in page. By design, this image is scaled and cropped to fill in the available space in the browser window.
    • PNG, JPEG or GIF
    • 1420×1200 resolution recommended.
    • Recommended file size: 300 kb (max file size 500 kb).
    • Use an abstract illustration or picture. Since the image gets resized and cropped, avoid using rasterized text and keep the “interesting” part of the illustration in the top-left corner.
  • Sign-In Page Background Colour – On high latency connections, the sign-in page illustration may not load, in which case the login page will fill in the space with a solid colour.
    • Enter an RGB colour code in hex format (e.g. #FFFFFF).
  • Hide KMSI (Keep Me Signed In) – Choose whether your users can see the “Keep me signed in” check box on the Azure AD sign-in page. This option has no impact on session lifetime, and only allows users to remain signed in when they close and reopen their browser.
    • Important: some features of SharePoint Online and Office 2010 have a dependency on users being able to check this box. If you hide this option, users may get additional and unexpected sign in prompts.
  • Post Logout Link Label – If this is configured, Azure AD will show a link to a web site of your choice, after users sign out of Azure AD web applications.
    • Make sure to configure both the label and URL properties!
    • Link can be plain text only.
    • URL can be HTTP or HTTPS.
  • Post Logout Link URL – If this is configured, Azure AD will show a link to a web site of your choice, after users sign out of Azure AD web applications.
    • Make sure to configure both the label and URL properties!
    • Link can be plain text only.
    • URL can be HTTP or HTTPS.

References: https://azure.microsoft.com/en-gb/documentation/articles/active-directory-add-company-branding/