Programmatically connecting to Azure DevOps with a Service Principal (Management Group)


Continuing from my last post on programmatically connecting to Azure DevOps with a Service Principal at the subscription level, I also wanted to show how you can create a DevOps service connection programmatically at the Management Group level.

Unlike the subscription level, you cannot just use the Az DevOps command with a management group parameter; this does not appear to be available. The answer is to pass in a JSON template.

You will need a few things already configured:

* See the code in the powershell folder from the post https://cann0nf0dder.wordpress.com/2020/09/14/app-only-auth-connect-to-sharepoint-online-with-msal-and-azure-keyvault/ to see how you can create this programmatically.

Management Groups Enabled

  • Click on Start using management groups. This will create your “Tenant Root Group” and apply your subscriptions to the management group.

The above will set you up to walk through this demo; however, please ensure you understand what Management Groups are and how to use them: https://docs.microsoft.com/en-us/azure/governance/management-groups/overview

‘User Access Administrator’ access.

In the above picture, you can see a (details) link next to the words Tenant Root Group. The details link is probably not clickable for you at this time. This is because, although you have been able to create the initial Tenant Root Group management group, you need to elevate your account's access to it.

Note: You can only do this as a Global Administrator.

Manually

  • Go to your Azure Portal https://portal.azure.com
  • Go to Azure Active Directory
  • In the left hand navigation under Manage, click Properties
  • Under Access management for Azure resources switch the button to Yes.

Now if you go back to the Tenant Root Group management group, you will be able to click the details link and have access to the management group: see deployments made at that level, modify access for others, etc.

Switching the button to No will then remove your access.

Programmatically

Using Az CLI, first log in with your account using az login. In the snippet below, the first command shows how to give yourself access, while the last two lines remove it again.

#Give access
az rest --method post --uri 'https://management.azure.com/providers/Microsoft.Authorization/elevateAccess?api-version=2015-07-01'
#Remove access
$account = 'username@example.com'
az role assignment delete --assignee $account --role 'User Access Administrator' --scope '/'

The Code

Running this code will create a DevOps project if it doesn't exist, and then create a Management Group level Service Connection to the Tenant Root management group. To target a different management group, you would need to modify the code to grab the name and ID of the management group you wish to use and pass them into the JSON template.
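For example, you might look up the target management group with Az CLI and substitute its values into the ##ManagementGroupId## and ##ManagementGroupName## tokens instead of the Tenant Root values (a minimal sketch; 'MyManagementGroup' is a placeholder ID, and older Az CLI versions may need the 'account' extension for this command):

#Look up the management group you want to target (placeholder name)
$mg = az account management-group show --name "MyManagementGroup" | ConvertFrom-Json
#The 'name' property is the management group ID, 'displayName' is its friendly name
$managementGroupId = $mg.name
$managementGroupName = $mg.displayName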

Your account is now set up. You will first need to be logged in to Az CLI.

Note: This can be a Service Principal, as long as the account being used is able to list App Registrations and has the 'User Access Administrator' RBAC role on the Tenant Root Group management group.
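If you need to grant a service principal that role, the assignment can be made at the root scope with Az CLI (a sketch; <appId> is a placeholder, and the account running it needs elevated access to assign roles at '/'):

az role assignment create --role 'User Access Administrator' --assignee '<appId>' --scope '/'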

You will need to create a ‘management-group.json’ file which is used as a template, and key tokens will be replaced within the script.

{
    "administratorsGroup": null,
    "authorization": {
        "scheme": "ServicePrincipal",
        "parameters": {
            "serviceprincipalid": "##ServicePrincipalId##",
            "authenticationType": "spnKey",
            "serviceprincipalkey": "##ServicePrincipalKey##",
            "tenantid": "##TenantId##"
        }
    },
    "createdBy": null,
    "data": {
        "environment": "AzureCloud",
        "scopeLevel": "ManagementGroup",
        "managementGroupId": "##ManagementGroupId##",
        "managementGroupName": "##ManagementGroupName##"
    },
    "description": "Management Group Service Connection",
    "groupScopeId": null,
    "name": "##Name##",
    "operationStatus": null,
    "readersGroup": null,
    "serviceEndpointProjectReferences": [
        {
            "description": "Management Group Service Connection",
            "name": "##Name##",
            "projectReference": {
                "id": "##ProjectId##",
                "name": "##ProjectName##"
            }
        }
    ],
    "type": "azurerm",
    "url": "https://management.azure.com/",
    "isShared": false,
    "owner": "library"
}

In the code below, the important parts to note are:

How the authentication works with DevOps: the Personal Access Token is added to the environment variable AZURE_DEVOPS_EXT_PAT.

How the JSON template is updated, saved as a temp file, and then passed in when creating the Service Connection. The JSON template is the same template used by Azure DevOps when you set up the Management Group service connection manually; you can see this by watching the network traffic.

<#
.SYNOPSIS
Creates a service connection for a ManagementGroup
Please ensure you are already logged in to Azure using az login
#>
param(
# Azure DevOps Personal Access Token (PAT) for the 'https://dev.azure.com/[ORG]' Azure DevOps tenancy
[Parameter(Mandatory)]
[string]
$PersonalAccessToken,
# The Azure DevOps organisation to create the service connection in, available from System.TeamFoundationCollectionUri if running from pipeline.
[string]
$TeamFoundationCollectionUri = $($Env:System_TeamFoundationCollectionUri -replace '%20', ' '),
# The name of the project to which this build or release belongs, available from $(System.TeamProject) if running from pipeline
[string]
$TeamProject = $Env:System_TeamProject,
[string]
$AppRegistrationName,
[securestring]
$AppPassword
)
$ErrorActionPreference = 'Stop'
$InformationPreference = 'Continue'
#Clearing default.
az configure --defaults group=
$account = az account show | ConvertFrom-Json
$Env:AZURE_DEVOPS_EXT_PAT = $PersonalAccessToken
Write-Information -MessageData:"Adding Azure DevOps Extension…"
az extension add --name azure-devops
Write-Information -MessageData:"Configure defaults Organization:$TeamFoundationCollectionUri"
az devops configure --defaults organization="$TeamFoundationCollectionUri"
Write-Information -MessageData:"Getting App Registration: $AppRegistrationName"
$AppReg = az ad app list --all --query "[?displayName == '$AppRegistrationName']" | ConvertFrom-Json
Write-Information -MessageData:"Give App Registration access to Management Group Root…"
az role assignment create --role "Owner" --assignee $($AppReg.appId) --scope "/"
Write-Information -MessageData:"Checking if $TeamProject project exists…"
$ProjectDetails = az devops project list --query "value[?name == '$TeamProject']" | Select-Object -First 1 | ConvertFrom-Json
if(-not $ProjectDetails){
    Write-Information -MessageData:"Creating $TeamProject project…"
    $ProjectDetails = az devops project create --name $TeamProject | ConvertFrom-Json
}
Write-Information -MessageData:"Checking if service endpoint already exists…"
$ServiceEndpoint = az devops service-endpoint list --project "$TeamProject" --query "[?name == '$($AppReg.DisplayName)-Mg']" | Select-Object -First 1 | ConvertFrom-Json
if (-not $ServiceEndpoint) {
    Write-Information -MessageData:"Getting Json file for Management Group…"
    $managementGroupJson = Get-Content -Raw -Path:"$PSScriptRoot/management-group.json"
    $configFilePath = "$PSScriptRoot/temp-managementGroup.json"
    $managementGroupJson = $managementGroupJson -replace '##TenantId##', $($Account.homeTenantId) `
        -replace '##ManagementGroupId##', $($Account.homeTenantId) `
        -replace '##ManagementGroupName##', "Tenant Root Group" `
        -replace '##ServicePrincipalId##', $($AppReg.appId) `
        -replace '##ServicePrincipalKey##', $(ConvertFrom-SecureString -SecureString:$AppPassword -AsPlainText) `
        -replace '##Name##', "$($AppReg.DisplayName)-Mg" `
        -replace '##ProjectId##', $($ProjectDetails.id) `
        -replace '##ProjectName##', $($ProjectDetails.name)
    Write-Information -MessageData:"Saving management json file…"
    Set-Content -Value:$managementGroupJson -Path:$configFilePath
    Write-Information -MessageData:"Creating Service Connection name:$($AppReg.DisplayName)-Mg for project $TeamProject"
    $ServiceEndpoint = az devops service-endpoint create --project "$TeamProject" --service-endpoint-configuration "$configFilePath" | ConvertFrom-Json
    Write-Information -MessageData:"Clean up temp files"
    Remove-Item -Path:$configFilePath
}
Write-Information -MessageData:"Updating Service Connection to be enabled for all pipelines…"
az devops service-endpoint update --project "$TeamProject" --id "$($ServiceEndpoint.id)" --enable-for-all true | Out-Null

To run the above code, you will need to put in your parameters. Replace them with your values, then run the script below, which calls the script above.

$PersonalAccessToken = "<PAT TOKEN>"
$TeamProject = '<PROJECT NAME>'
$TeamFoundationCollectionUri = 'https://dev.azure.com/<organizationName >'
$AppRegistrationName = '<Service Principal Name>'
$AppPassword = '<Service Principal Secret>'
$AppSecurePassword = ConvertTo-SecureString String:$AppPassword AsPlainText Force
.\Install-ServiceConnectionManagementGroup.ps1 PersonalAccessToken $PersonalAccessToken `
TeamFoundationCollectionUri:$TeamFoundationCollectionUri `
TeamProject:$TeamProject `
AppRegistrationName:$AppRegistrationName `
AppPassword:$AppSecurePassword

My team project is called AutomateDevOpsMG, and I used an App Registration called DevOps.

Running Script

Service Principal with Owner access on Management Group Level

Project ‘AutomateDevOpsMG’ and Service connection ‘DevOps-Mg’

Programmatically connecting to Azure DevOps with a Service Principal (Subscription)


A previous post of mine, Connecting to Azure DevOps with a Service Principal, has been popular since I wrote it. Therefore, I've decided to extend the topic and show how you can do it programmatically with Az DevOps.

You will need a few things already configured:

* See the code in the powershell folder from the post https://cann0nf0dder.wordpress.com/2020/09/14/app-only-auth-connect-to-sharepoint-online-with-msal-and-azure-keyvault/ to see how you can create this programmatically.

Create a DevOps PAT token

  • Go to your Azure devops https://dev.azure.com
  • Sign in and click on User settings -> Personal access tokens
  • Click New Token
    • Give it a meaningful name so you know what the PAT token is for in the future (e.g. DevOps Service Connection)
    • Select your Organization
    • Select the Expiration date for as long as you need. Maximum 1 Year
    • Select Scopes at Full access (You might want to tighten your permission in a production environment, for this demo Full access is fine).
    • Click Create
  • Once you have clicked Create, this is your only chance to grab a copy of the token. Please take a copy of it, as you will require it later.

The Code

You will first need to be logged in to Az CLI. You can sign in using a service principal, as you might within a pipeline, as long as the account being used is able to list App Registrations and has the 'User Access Administrator' RBAC role, so it can grant the DevOps service principal Contributor access on the subscription.
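If you want to check that the signed-in identity already has that role on the subscription before running the script, something like the following works (a sketch; <objectId-or-upn> is a placeholder):

az role assignment list --assignee '<objectId-or-upn>' --query "[?roleDefinitionName=='User Access Administrator']"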

The important part to note in the code is how the authentication works with DevOps: the Personal Access Token is added to the environment variable AZURE_DEVOPS_EXT_PAT.

<#
.SYNOPSIS
Creates a service connection for a subscription
Please ensure you are already logged in to Azure using az login
#>
param(
# Azure DevOps Personal Access Token (PAT) for the 'https://dev.azure.com/[ORG]' Azure DevOps tenancy
[Parameter(Mandatory)]
[string]
$PersonalAccessToken,
# The Azure DevOps organisation to create the service connection in, available from System.TeamFoundationCollectionUri if running from pipeline.
[string]
$TeamFoundationCollectionUri = $($Env:System_TeamFoundationCollectionUri -replace '%20', ' '),
# The name of the project to which this build or release belongs, available from $(System.TeamProject) if running from pipeline
[string]
$TeamProject = $Env:System_TeamProject,
[string]
$AppRegistrationName,
[securestring]
$AppPassword
)
$ErrorActionPreference = 'Stop'
$InformationPreference = 'Continue'
$account = az account show | ConvertFrom-Json
#Clearing default.
az configure --defaults group=
$Env:AZURE_DEVOPS_EXT_PAT = $PersonalAccessToken
Write-Information -MessageData:"Adding Azure DevOps Extension…"
az extension add --name azure-devops
Write-Information -MessageData:"Configure defaults Organization:$TeamFoundationCollectionUri"
az devops configure --defaults organization="$TeamFoundationCollectionUri"
Write-Information -MessageData:"Getting App Registration: $AppRegistrationName"
$AppReg = az ad app list --all --query "[?displayName == '$AppRegistrationName']" | ConvertFrom-Json
Write-Information -MessageData:"Give App Registration Contributor access to Subscription…"
az role assignment create --role 'Contributor' --assignee $($AppReg.appId)
Write-Information -MessageData:"Checking if $TeamProject project exists…"
$ProjectDetails = az devops project list --query "value[?name == '$TeamProject']" | ConvertFrom-Json
if(-not $ProjectDetails){
    Write-Information -MessageData:"Creating $TeamProject project…"
    $ProjectDetails = az devops project create --name $TeamProject | ConvertFrom-Json
}
Write-Information -MessageData:"Checking if service endpoint already exists…"
$ServiceEndpoint = az devops service-endpoint list --project "$TeamProject" --query "[?name == '$($AppReg.DisplayName)-Subscription']" | Select-Object -First 1 | ConvertFrom-Json
if(-not $ServiceEndpoint){
    Write-Information -MessageData:"Creating Service Connection name:$($AppReg.DisplayName)-Subscription for project $TeamProject"
    $Env:AZURE_DEVOPS_EXT_AZURE_RM_SERVICE_PRINCIPAL_KEY = $(ConvertFrom-SecureString -SecureString:$AppPassword -AsPlainText)
    $ServiceEndpoint = az devops service-endpoint azurerm create --project "$TeamProject" --name "$($AppReg.DisplayName)-Subscription" --azure-rm-service-principal-id "$($AppReg.appId)" --azure-rm-subscription-id "$($Account.id)" --azure-rm-subscription-name "$($Account.name)" --azure-rm-tenant-id "$($Account.tenantId)" | ConvertFrom-Json
}
Write-Information -MessageData:"Updating Service Connection to be enabled for all pipelines…"
az devops service-endpoint update --project "$TeamProject" --id "$($ServiceEndpoint.id)" --enable-for-all true | Out-Null

To run the above code, you will need to put in your parameters. Replace them with your values, then run the script below, which calls the script above.

$PersonalAccessToken = '<Put your PAT Token>'
$TeamProject = '<Project Name>'
$TeamFoundationCollectionUri = 'https://dev.azure.com/<OrganizationName>'
$AppRegistrationName = '<Service Principal Name>'
$AppPassword = '<Service Principal Secret>'
$AppSecurePassword = ConvertTo-SecureString -String:$AppPassword -AsPlainText -Force
.\Install-ServiceConnectionSubscription.ps1 -PersonalAccessToken:$PersonalAccessToken `
    -TeamFoundationCollectionUri:$TeamFoundationCollectionUri `
    -TeamProject:$TeamProject `
    -AppRegistrationName:$AppRegistrationName `
    -AppPassword:$AppSecurePassword

My team project is called AutomateDevOps, and I used an App Registration called DevOps.

Running Script

Service Principal with Contribute on Subscription

Project ‘AutomateDevOps’ and Service connection ‘DevOps-Subscription’

My next blog post explains how to make a Management Group Service Connection instead of a Subscription level one: 'Programmatically connecting to Azure DevOps with a Service Principal (Management Group)'.

Viewing, Restoring and Removing Items from the SharePoint Recycle Bin – The attempted operation is prohibited because it exceeds the list view threshold enforced by the administrator.


I've had a script for a while that allows you to view all the items in the Recycle Bin for a Site Collection and print them out to a CSV file. Recently, the environment I've been running this in has been throwing an error saying:

"The attempted operation is prohibited because it exceeds the list view threshold enforced by the administrator."

Getting all items out of the recycle bin.

Originally, I used the PnP PowerShell command Get-PnPRecycleBinItem, and it was only when I did a Google search for this issue that I found other people were also having this problem. The PnP team have now solved this issue by adding a -RowLimit parameter. If you set the RowLimit high enough you can return all items, as internally it seems to implement a paging mechanism.

I now use the below script to export the result to a CSV file.

<#
.SYNOPSIS
Loops through the recycle bin and output a csv string.
Uses PNP Powershell.
.EXAMPLE
-URL:'https://<tenant>.sharepoint.com/sites/<siteCollection>' -Stage:First -Path:.\FirstRecycleBin.csv
-URL:'https://<tenant>.sharepoint.com/sites/<siteCollection>' -Stage:First -Path:.\FirstRecycleBin.csv -RowLimit:200000
#>
[CmdletBinding(SupportsShouldProcess)]
param(
# The url to the site containing the Site Requests list
[Parameter(Mandatory)][string]$URL,
[Parameter(Mandatory)][ValidateSet("First", "Second")][string]$Stage,
[Parameter(Mandatory)][string]$Path,
[int]$RowLimit=150000
)
Connect-PnPOnline -Url:$URL -UseWebLogin
Write-Host "Getting recycle bin items..."
$RecycleStage;
if ($Stage -eq "First") {
$RecycleStage = Get-PnPRecycleBinItem -FirstStage -RowLimit:$RowLimit
}
else {
$RecycleStage = Get-PnPRecycleBinItem -SecondStage -RowLimit:$RowLimit
}
$Output = @()
$RecycleStage | ForEach-Object {
$Item = $PSItem
$Obj = "" | Select-Object Title, AuthorEmail, AuthorName, DeletedBy, DeletedByEmail, DeletedDate, Directory, ID, ItemState, ItemType, LeafName, Size
$Obj.Title = $Item.Title
$Obj.AuthorEmail = $Item.AuthorEmail
$Obj.AuthorName = $Item.AuthorName
$Obj.DeletedBy = $Item.DeletedByName
$Obj.DeletedByEmail = $Item.DeletedByEmail
$Obj.DeletedDate = $Item.DeletedDate
$Obj.Directory = $Item.DirName
$Obj.ID = $Item.ID
$Obj.ItemState = $Item.ItemState
$Obj.ItemType = $Item.ItemType
$Obj.LeafName = $Item.LeafName
$Obj.Size = $Item.Size
$output += $Obj
}
$Output | Export-csv $Path -NoTypeInformation
Write-Host "Done"

Once I have the CSV file, I'm able to filter it further in Excel and save it back to CSV to use to either restore or delete the items out of the recycle bin.

Restoring Deleted Items using a csv file.

It seemed that, now I can use -RowLimit with Get-PnPRecycleBinItem, I should be able to call Restore-PnPRecycleBinItem to restore the items. However, this isn't the case. Even just passing the Identity of one item within the Recycle Bin, you get the same error message.

"The attempted operation is prohibited because it exceeds the list view threshold enforced by the administrator."

There is no RowLimit option on Restore-PnPRecycleBinItem; the code must internally make a call to get all recycle bin items first without using RowLimit. Interestingly though, a user can go to the recycle bin, see items, and restore them if they want to. By looking through the network traffic, I was able to see that the GUI uses the following API to restore items.

POST /_api/site/RecycleBin/RestoreByIds

Passing in the following JSON body.

{
"ids": [
"b1a30d73-917a-4fdc-82f0-ba6e9881710b",
"2dbce811-cbaa-4818-a458-6ef3de70530b",
"17a0d047-efbb-4524-b7f4-32289b01cc3c",
"8c90ef0d-8b2d-468d-8b8e-f9af4c10f58b"
]
}

There can be one or many Ids.

The trouble with using the REST API is that you need an access token. By using Invoke-PnPSPRestMethod, the access token is automatically provided in the call.
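As a minimal sketch of the call itself (assuming you are already connected with Connect-PnPOnline, and reusing the example IDs from the JSON above), several items can be restored in a single request:

#Restore multiple recycle bin items in one call; Invoke-PnPSPRestMethod supplies the access token
$ids = @("b1a30d73-917a-4fdc-82f0-ba6e9881710b", "2dbce811-cbaa-4818-a458-6ef3de70530b")
$body = @{ ids = $ids } | ConvertTo-Json
Invoke-PnPSPRestMethod -Method Post -Url "$((Get-PnPSite).Url)/_api/site/RecycleBin/RestoreByIds" -Content $body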

This is what I do in the code below: loop through every item in a CSV file to restore, and call "/_api/site/RecycleBin/RestoreByIds" using Invoke-PnPSPRestMethod.

[CmdletBinding(SupportsShouldProcess)]
param(
# The URL of the Sitecollection where the recycle bin is.
[Parameter(Mandatory)]
[string]
$SiteUrl,
# Full Path of CSV file of Get-AllRecycleBin.ps1
[Parameter(Mandatory)]
[string]
$Path
)
function Restore-RecycleBinItem {
param(
[Parameter(Mandatory)]
[String]
$Id
)
$siteUrl = (Get-PnPSite).Url
$apiCall = $siteUrl + "/_api/site/RecycleBin/RestoreByIds"
$body = "{""ids"":[""$Id""]}"
Write-Verbose "Performing API Call to Restore item from RecycleBin..."
try {
Invoke-PnPSPRestMethod -Method Post -Url $apiCall -Content $body | Out-Null
}
catch {
Write-Error "Unable to Restore ID {$Id}"
}
}
$ErrorActionPreference = 'Continue'
$InformationPreference = 'Continue'
Connect-PnPOnline -Url:$SiteUrl -UseWebLogin
@($(Import-Csv -Path:"$Path")).ForEach({
$csv = $PSItem
Write-Information -MessageData:"Restore item $($csv.Title)"
Restore-RecycleBinItem -Id $($csv.ID)
})

Deleting Deleted Items using a csv file.

I discovered that I also get the error message when using Clear-PnpRecycleBinItem.

Again, I was able to do this in the GUI, and looking at the network traffic there is an API to delete the items.

POST /_api/site/RecycleBin/DeleteByIds

The JSON body is the same format as RestoreByIds, where it passes in one or many Ids.

The code below is almost identical to Restore-RecycleBinItems.ps1, passing in a CSV file with the IDs of the files to delete permanently.

[CmdletBinding(SupportsShouldProcess)]
param(
# The URL of the Sitecollection where the recycle bin is.
[Parameter(Mandatory)]
[string]
$SiteUrl,
# Full Path of CSV file of Get-AllRecycleBin.ps1
[Parameter(Mandatory)]
[string]
$Path
)
function Clear-RecycleBinItem {
param(
[Parameter(Mandatory)]
[String]
$Id
)
$siteUrl = (Get-PnPSite).Url
$apiCall = $siteUrl + "/_api/site/RecycleBin/DeleteByIds"
$body = "{""ids"":[""$Id""]}"
Write-Verbose "Performing API Call to delete item from RecycleBin..."
try {
Invoke-PnPSPRestMethod -Method Post -Url $apiCall -Content $body | Out-Null
}
catch {
Write-Error "Unable to Delete ID {$Id}"
}
}
$ErrorActionPreference = 'Continue'
$InformationPreference = 'Continue'
Connect-PnPOnline -Url:$SiteUrl -UseWebLogin
@($(Import-Csv -Path:"$Path")).ForEach({
$csv = $PSItem
Write-Information -MessageData:"Delete item $($csv.Title)"
Clear-RecycleBinItem -Id $($csv.ID)
})

AZ CLI putting message on a storage queue – not a valid Base-64 string


I've been using Az CLI a lot recently; it has made interacting with Azure from PowerShell so much easier.

https://docs.microsoft.com/en-us/cli/azure/install-azure-cli?view=azure-cli-latest

I had to write a simple PowerShell script that added items to a Storage queue. Once the items were added to a queue an Azure function picked up the items and processed them.

According to the Azure CLI documentation, you need to use az storage message put.

#Login
az login
#Get the connection string
$connectionString = az storage account show-connection-string --name "mystorageaccountname" --resource-group "MyResourceGroup" --query connectionString | ForEach-Object { $PSItem -join '' } | ConvertFrom-Json
#Create message
$message = "{""Name"":""Paul"",""LastName"":""Matthews"",""ID"":""d4b2ffc9-4380-46e4-a0bf-8a9ca58734d2""}"
#Add item to the queue
az storage message put --content $message --queue-name "myQueueName" --connection-string $connectionString

This fires successfully and I can see the item on the queue. However, when my Azure Function starts to run, I get the error message:

The input is not a valid Base-64 string as it contains a non-base 64 character, more than two padding characters, or an illegal character among the padding characters.

To solve this problem, you just need to convert the string to Base64 first.

#Login
az login
#Get the connection string
$connectionString = az storage account show-connection-string --name "mystorageaccountname" --resource-group "MyResourceGroup" --query connectionString | ForEach-Object { $PSItem -join '' } | ConvertFrom-Json
#Create message
$message = "{""Name"":""Paul"",""LastName"":""Matthews"",""ID"":""d4b2ffc9-4380-46e4-a0bf-8a9ca58734d2""}"
#Encode to bytes
$b = [System.Text.Encoding]::UTF8.GetBytes($message)
#Convert to Base64String
$message64Base = [System.Convert]::ToBase64String($b)
#Add item to the queue
az storage message put --content $message64Base --queue-name "myQueueName" --connection-string $connectionString

Amazingly, the value you push up to the queue is something like the below:

“eyJOYW1lIjoiUGF1bCIsIkxhc3ROYW1lIjoiTWF0dGhld3MiLCJJRCI6ImQ0YjJmZmM5LTQzODAtNDZlNC1hMGJmLThhOWNhNTg3MzRkMiJ9”

But it appears on the queue correctly.
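If you ever need to check what is actually in the message, you can decode the Base64 value back to the original JSON:

#Decode the Base64 queue message back into the original string
[System.Text.Encoding]::UTF8.GetString([System.Convert]::FromBase64String($message64Base))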

Unable to download the Exchange Online PowerShell Module – “Deployment and application do not have matching security zones”


If you want to use multi-factor authentication (MFA) to connect to Exchange Online PowerShell, Microsoft states you are required to install the Exchange Online Remote PowerShell Module, and use the Connect-EXOPSSession cmdlet to connect.

To download this, you need to go to the Exchange Admin Center (EAC) for your Exchange Online organization.

  • Head to Microsoft 365 Admin centre https://admin.microsoft.com for your tenant and sign in.
  • In the left hand navigation, expand out Admin Centers and select Exchange
  • A new tab opens and loads up the Exchange admin center.
  • In the navigation select Hybrid, and then click the second configuration button.

  • This downloads and runs Microsoft.Online.CSE.PSModule.Client.application.

It was at this point I got an error message stating, "Application cannot be started. Contact the application vendor."

Clicking on the Details link under the summary, it stated: "Deployment and application do not have matching security zones."

The problem here is that it is trying to run a ClickOnce application, which only works in IE/Edge. I was accessing the EAC through Chrome. Therefore, log in to your tenant again using Edge or IE.

  • Once loaded, you will find Microsoft Exchange Online PowerShell available in your Windows 10 menu. Also, a PowerShell window will open ready for you to connect to Exchange online using PowerShell.

Connect-EXOPSSession -UserPrincipalName <your UPN>

Summary: When downloading the Microsoft Exchange Online PowerShell from the Microsoft Exchange Admin centre, ensure you have logged in using IE or Edge.

Programmatically change the New Menu in SharePoint Online using PowerShell


Back in October 2018 Microsoft changed the way the New menu on a document library worked. You can now edit what is shown in the New menu directly from the library.

I needed to know how to do this in code, and it looked like PnP automatically does this for you. After doing a Get-PnPProvisioningTemplate, when you look at the list you will see an XML element called NewDocumentTemplates, which contains a JSON formatted string. The NewDocumentTemplates property exists on the view of the list. This made me think you could have different menu items for different views, but that doesn't seem to be the case.

<NewDocumentTemplates>[
{"templateId":"NewFolder","title":"Folder","visible":true},
{"contentTypeId":"0x0101007ADED896313A4943AFE7F07133B1339E","isContentType":true,"templateId":"0x0101007ADED896313A4943AFE7F07133B1339E","title":"Document","visible":false},
{"contentTypeId":"0x0101009348E1CE5767914598D327EAE7EB97D500245281855DF00844BF9BD64ADD983B7D","isContentType":true,"templateId":"0x0101009348E1CE5767914598D327EAE7EB97D500245281855DF00844BF9BD64ADD983B7D","title":"CF New Request","visible":true},
{"templateId":"NewDOC","title":"Word document","visible":true},
{"templateId":"NewXSL","title":"Excel workbook","visible":true},
{"templateId":"NewPPT","title":"PowerPoint presentation","visible":true},
{"templateId":"NewONE","title":"OneNote notebook","visible":true},
{"templateId":"NewXSLForm","title":"Forms for Excel","visible":true},
{"templateId":"Link","title":"Link","visible":true}]
</NewDocumentTemplates>

Now, this seems to work when applying it back to the same list. However, I found that if I set my own content types with visible set to false, and then tried it on a different list that did have that content type, it didn't hide it.

So why not?

The reason is that the contentTypeId it is expecting is the ListContentTypeId, not the actual ContentTypeId. For each list, the ListContentTypeId will be different.

When you add a site content type to a list:

“SharePoint list makes a local copy of the site content type and adds the copy to the content type collection on the list. The new list content type is a child of the site content type. The value of the Id property for the list content type is different from the value of the Id property for its parent site content type, but otherwise the two content types are initially the same.”

https://docs.microsoft.com/en-us/previous-versions/office/developer/sharepoint-2010/ms463016(v%3Doffice.14)#effects-from-adding-a-site-content-type-to-a-list
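A quick way to see the list-specific IDs for yourself (assuming you are connected with Connect-PnPOnline and your library is called 'Documents') is:

#Each list content type's StringId starts with its parent site content type ID
Get-PnPContentType -List:'Documents' | Select-Object Name, StringId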

I’m sure PNP will pick up on this and update the code at some point.

PowerShell script

I decided to write my own PowerShell script to solve this issue. First, I needed to understand the JSON format. For the built-in Microsoft menu items, the JSON looks as follows:

{
  "title": "Word Document",
  "templateId": "NewDOC",
  "visible": true
}

There are quite a few built-in ones, and from using PnP to export them, I was able to find what they are called. (If you find any more that I don't have listed here, please let me know.)

  • TemplateId – Name
  • NewFolder – Folder
  • NewDOC – Word Document
  • NewXSL – Excel workbook
  • NewPPT – PowerPoint presentation
  • NewONE – OneNote notebook
  • NewVSDX – Visio drawing
  • NewXSLForm – Forms for Excel
  • Link – Link

Note: If you don’t have a license for Visio, or do not have Forms enabled for your E3 license, then the menu item will not show up anyway.

The JSON format for content types is the following:

{
  "title": "<Content Type Name>",
  "templateId": "<ListContentTypeID>",
  "visible": <true/false>,
  "contentTypeId": "<ListContentTypeID>",
  "isContentType": <true/false>
}

Below is my function script, Set-ListNewMenuItems.ps1. With this script you can hide any content types/default menu items by using their titles.

-Url:'https://tenant.sharepoint.com/sites/demo' -ListTitle:'Documents' -ContentTypesToHide:'OneNote notebook','PowerPoint','Custom CT Name'

You can also hide all the default menu items and let your content types show. It will still show Link and Folder.

-Url:'https://tenant.sharepoint.com/sites/demo' -ListTitle:'Documents' -HideDefault:$true

If you want to hide link and folder too, you would need to include their names in the ContentTypesToHide.

-Url:'https://tenant.sharepoint.com/sites/demo' -ListTitle:'Documents' -ContentTypesToHide:'Folder','Link' -HideDefault:$true


<#
.SYNOPSIS
Set the New Menu on a document library.
.DESCRIPTION
Sets the New Menu on a document library, allowing you to hide content types you don't want to show. This will grab all list content types currently assigned to the library.
Default Content Types used by Microsoft: 'Folder', 'Word document', 'Excel workbook', 'PowerPoint presentation', 'OneNote notebook', 'Visio drawing', 'Link'
.EXAMPLE
Updates the SharePoint library New Menu, hiding the given Content Types.
-Url:'https://tenant.sharepoint.com/sites/demo' -ListTitle:'Documents' -ContentTypesToHide:'OneNote notebook','PowerPoint','Custom CT Name'
.EXAMPLE
Updates the SharePoint library New Menu, hiding the Default Content Types ('Word document', 'Excel workbook', 'PowerPoint presentation', 'OneNote notebook', 'Visio drawing')
-Url:'https://tenant.sharepoint.com/sites/demo' -ListTitle:'Documents' -HideDefault:$true
.EXAMPLE
Updates the SharePoint library New Menu, hiding the Default Content Types ('Word document', 'Excel workbook', 'PowerPoint presentation', 'OneNote notebook', 'Visio drawing') and the given Content Types
-Url:'https://tenant.sharepoint.com/sites/demo' -ListTitle:'Documents' -ContentTypesToHide:'OneNote notebook','Link' -HideDefault:$true
#>
[CmdletBinding(SupportsShouldProcess)]
param(
# The url to the site containing the Site Requests list
[Parameter(Mandatory)]
[string]
$URL,
[Parameter(Mandatory)]
[string]
$ListTitle,
[Parameter()]
[array]
$ContentTypesToHide,
[Parameter()]
[bool]
$HideDefault = $false
)
function Add-MenuItem() {
param(
[Parameter(Mandatory)]
$title,
[Parameter(Mandatory)]
$visible,
[Parameter(Mandatory)]
$templateId,
[Parameter()]
$contentTypeId
)
$newChildNode = New-Object System.Object
$newChildNode | Add-Member -Type NoteProperty -Name title -Value:$title
$newChildNode | Add-Member -Type NoteProperty -Name visible -Value:$visible
$newChildNode | Add-Member -Type NoteProperty -Name templateId -Value:$templateId
if ($null -ne $contentTypeId) {
$newChildNode | Add-Member -Type NoteProperty -Name contentTypeId -Value:$contentTypeId
$newChildNode | Add-Member -Type NoteProperty -Name isContentType -Value:$true
}
return $newChildNode
}
function Get-DefaultMenuItems() {
$DefaultMenuItems = @()
$DefaultMenuItems += Add-MenuItem -title:"Folder" -templateId:"NewFolder" -visible:$true
$DefaultMenuItems += Add-MenuItem -title:"Word document" -templateId:"NewDOC" -visible:$true
$DefaultMenuItems += Add-MenuItem -title:"Excel workbook" -templateId:"NewXSL" -visible:$true
$DefaultMenuItems += Add-MenuItem -title:"PowerPoint presentation" -templateId:"NewPPT" -visible:$true
$DefaultMenuItems += Add-MenuItem -title:"OneNote notebook" -templateId:"NewONE" -visible:$true
$DefaultMenuItems += Add-MenuItem -title:"Visio drawing" -templateId:"NewVSDX" -visible:$true
$DefaultMenuItems += Add-MenuItem -title:"Forms for Excel" -templateId:"NewXSLForm" -visible:$true
$DefaultMenuItems += Add-MenuItem -title:"Link" -templateId:"Link" -visible:$true
return $DefaultMenuItems
}
function Set-NewMenuOnList() {
param(
[Parameter(Mandatory)][string]$URL,
[Parameter(Mandatory)][string]$ListTitle,
[Parameter()][array]$ContentTypesToHide,
[Parameter()][bool]$HideDefault
)
Connect-PnPOnline -Url:$URL -UseWebLogin
Write-Host "Connected to URL:$Url" -ForegroundColor Green
$list = Get-PnPList -Identity:$ListTitle
Write-Host "Connected to List:$($list.Title)"
$listContentTypes = Get-PnPContentType -List:$List
$defaultView = Get-PnPView -List:$List | Where-Object {$_.DefaultView -eq $true}
$MenuItems = Get-DefaultMenuItems
$listContentTypes | ForEach-Object {
$ct = $PSItem
if($ct.Name -eq "Folder"){
return
}
$MenuItems += Add-MenuItem -title:$ct.Name -visible:$true -templateId:$ct.StringId -contentTypeId:$ct.StringId
}
$hideContentType = $ContentTypesToHide;
if($HideDefault -eq $true){
write-host "Hiding default content types"
$hideContentType += 'Word document', 'Excel workbook', 'PowerPoint presentation', 'OneNote notebook','Visio drawing'
}else {
write-host "Including default content types"
}
$MenuItems | ForEach-Object {
if ($hideContentType -contains ($_.title)) {
$_.visible = $false
write-host "Hiding content type $($_.title)" ForegroundColor Yellow
}
else {
write-host "Showing $($_.title)" ForegroundColor Green
}
}
$defaultView.NewDocumentTemplates = $menuItems | ConvertTo-Json
$defaultView.Update()
Invoke-PnPQuery
Write-Host "Updated $($list.Title)"
}
Set-NewMenuOnList -URL:$URL -ListTitle:$ListTitle -ContentTypesToHide:$ContentTypesToHide -HideDefault:$HideDefault

After running my script to remove Document, OneNote notebook, Word Document and Forms For Excel

-Url:'https://tenant.sharepoint.com/sites/demo' -ListTitle:'Documents' -ContentTypesToHide:'Document','OneNote notebook','Word Document','Forms For Excel' -HideDefault:$false

As you can see from above, it states that it is Showing Visio drawing. However, on my tenant I do not have a license for it. Below shows you how my New menu has been updated.

The code isn’t perfect, and could do with improvements such as ordering. Please feel free to use it.

View Trace Logs with PnP Provisioning Templates using PowerShell


When running Get-PnPProvisioningTemplate and Apply-PnPProvisioningTemplate, it is sometimes necessary to see what is going on while the call is processing.

Just run the following command before calling your Get/Apply-PnPProvisioningTemplate command.

Set-PnPTraceLog -On -Level:Debug

If you need to turn it off again

Set-PnPTraceLog -Off
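If you want to keep the output for later, the same cmdlet can also write the trace to a file; my understanding is that a -LogFile parameter is available in the PnP module, but check Get-Help Set-PnPTraceLog for your version:

#Write the debug trace to a log file
Set-PnPTraceLog -On -Level:Debug -LogFile:'.\pnp-trace.log'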

PnP Provisioning Templates Adding – Everyone except external users


On a recent project of mine, I've been working with the PnP Provisioning engine using PowerShell. It's the first time I've really worked with it, and I must say I'm impressed at how easy it is to use.

If you have never used it before, I recommend checking out the following articles on MSDN.

https://msdn.microsoft.com/en-us/pnp_powershell/pnp-powershell-overview

https://msdn.microsoft.com/en-us/pnp_articles/pnp-provisioning-framework

https://msdn.microsoft.com/en-us/pnp_articles/pnp-remote-provisioning

Basic installation setup

If you have a Windows 10 device or have installed PowerShellGet, you can check whether you have the latest version installed in your PowerShell environment by running the following:

Get-Module SharePointPnPPowerShell* -ListAvailable | Select-Object Name, Version | Sort-Object Version -Descending

I already have the latest version at the time of writing, 2.17.1708.1.

You can install or upgrade SharePointPnPPowerShell with the following commands.

To Install

#SharePoint Online
Install-Module SharePointPnpPowerShellOnline

#SharePoint 2016
Install-Module SharePointPnPPowerShell2016

#SharePoint 2013
Install-Module SharePointPnPPowerShell2013

To Upgrade

Update-Module SharePointPnPPowerShell*

Everyone and Everyone Except External Users

The quickest way to create a PnP provisioning template ready to use again is to create your site within SharePoint using point and click.

I’ve created my site with Members set up to “Everyone except external users” and Visitors set up to “Everyone”.

After you have created your site, you use the following commands to connect and export the template.

Connect-PnpOnline -Url:https://mytenant.sharepoint.com/sites/pnpexamples -Credentials: (Get-Credential)
Get-PnPProvisioningTemplate -Out 'pnpExample.xml'

If you open the XML file and look in the <pnp:Security> section, you will see that Additional Members has c:0-.f|rolemanager|spo-grid-all-users/{GUID} and Additional Visitors has c:0(.s|true. These represent Everyone except external users and Everyone respectively.

If you are only using this template to create more sites in the same tenant that you exported it from, then you are good. The GUID after ‘spo-grid-all-users’ will always be that GUID in your tenant. However, when you want to use this template in other tenants – for example you have a staging and production environment – then this GUID will not work and the importing of the template will not add Everyone except external users to your members group.

What is the GUID after spo-grid-all-users?

It turns out that the GUID relates to the Authentication Realm ID of your tenant, and luckily PnP have a PowerShell command

Get-PnPAuthenticationRealm

How does that help us?

You can pass parameters into Apply-PnPProvisioningTemplate. First we need to change the XML inside <pnp:AdditionalMembers>

From:

<pnp:User Name="c:0-.f|rolemanager|spo-grid-all-users/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxx" />

To:

<pnp:User Name="c:0-.f|rolemanager|spo-grid-all-users/{parameter:AuthenticationRealm}" />

Save the XML template, and now you can run the following commands to apply the template to a new site in a different tenant. (Please note the team site I'm applying this to has already been created.)

Connect-PnpOnline -Url:https://myOtherTenant.sharepoint.com/sites/pnpexamples -Credentials: (Get-Credential)
$AuthenticationRealm = Get-PnPAuthenticationRealm
Apply-PnPProvisioningTemplate -Path:'pnpExample.xml' -Parameters:@{"AuthenticationRealm"=$AuthenticationRealm}

After applying the template to another tenant, you will see your “Everyone except external users” inserted correctly.

Using the Windows Credentials Manager with PnP PowerShell


Do you get fed up with typing in your username and password when you are connecting to SharePoint via PowerShell? Or do you have to keep switching between multiple tenants and get fed up typing in the username and password each time? Did you know you can use the built-in Windows Credential Manager to help ease your pain?

  • Open your Credential Manager

  • Under the Generic Credentials, click 'Add a generic credential'

  • For each tenant/user account you need, create a Generic Credential (you can also create these from PowerShell, as shown after these steps).
    • Put a label name to indicate what the permissions are for (e.g. DevAdmin, TestAdmin, TestUser etc.)
    • Put the username of the account
    • Put the password of the account

  • After you have created your Generic Credential(s), when you connect to SharePoint using PnP, you can pass your label to the -Credentials parameter.
    Connect-PnPOnline -Url "https://mydevtenant.sharepoint.com/sites/pnpexamples" -Credentials:devAdmin
    

    In the screenshot below, I’m connecting to my tenant using a label I created called CFAdmin1
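If you would rather create the stored credential from PowerShell instead of the Credential Manager UI, recent versions of the PnP module include Add-PnPStoredCredential (a sketch; the label, account and password below are placeholders):

#Create a Windows Credential Manager entry that Connect-PnPOnline -Credentials:devAdmin can use
Add-PnPStoredCredential -Name "devAdmin" -Username "admin@mydevtenant.onmicrosoft.com" -Password (ConvertTo-SecureString -String '<password>' -AsPlainText -Force)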

Using the above technique of Credential Manager labels, you can make your PowerShell scripts easier by creating a string parameter called Label and passing it in. This makes running the same script against multiple environments easier.

[CmdletBinding(SupportsShouldProcess)]
param(
    # The environment label to use for connection
    [Parameter(Mandatory)]
    [string]
    $Label,

    # The URL of the tenancy to create the site collections in, do not include the -admin
    [Parameter(Mandatory)]
    [string]
    $URL
)
if ($VerbosePreference) {
    Set-PnPTraceLog -On -Level:Debug
}
else {
    Set-PnPTraceLog -Off
}
Connect-PnPOnline -Url:$URL -Credentials:$Label
…
… #Additional code to update Web
…

If the above file was called UpdateWebSite.ps1 I would type in the following:


.\UpdateWebSite.ps1 -Url:'https://mydevtenant.sharepoint.com/sites/pnpexamples' -Label:devAdmin

Building SharePoint 2016 development environment – Part 15 – Configuring Workflow


A few years ago I wrote the "Build your SharePoint 2013 development machine on Windows Server 2012" series. I mainly work in the cloud now, but as those blogs were so popular, I thought I would create a new series for the newer version of SharePoint.

You can access other parts of this post below.

The configuration of Workflow Manager for SharePoint 2016 is the same as it was for SharePoint 2013. You need to install the separate Workflow Manager components; once installed, SharePoint Designer will show SharePoint 2013 Workflow in the dropdown when deciding which platform to build the workflow on.

We will be installing Workflow Manager 1.0 CU3. Although my instructions add all of this to the SharePoint machine, the reason Microsoft made Workflow Manager separate is for scaling; there is no need for it to be installed on the SharePoint box. You could create another Windows Server 2012 R2 machine, add it to the domain and run Workflow Manager on that, although there are probably a few more configuration steps required. Here is a full walkthrough provided by Microsoft TechNet if you wish to delve deeper: https://gallery.technet.microsoft.com/SharePoint-2016-Workflow-acd5ba2a

Installing SharePoint Designer 2013

Wait! SharePoint Designer 2013? Yes.

There is no SharePoint Designer 2016, and there is no plan to release one either. Microsoft have stated that they will support SPD 2013 with SharePoint 2016. We are installing SharePoint Designer here because I can use it to prove whether you have correctly configured Workflow Manager with SharePoint 2016.

SharePoint Designer 2013 is a free tool from Microsoft.

  1. Download SharePoint Designer 2013 32bit from the Microsoft Site.
    https://www.microsoft.com/en-GB/download/details.aspx?id=35491
  2. Once downloaded run the file sharepointdesigner_32bit.exe
  3. Accept the License terms and click Continue.
  4. Click Install Now (unless you wish to customise and change the file location, user information, etc.)
  5. Once installed, I’d recommend performing a Windows Update. From the Start Menu, type Windows Update, open the application and run any updates required. Reboot if necessary.

Check to see Workflow settings in SharePoint Designer

  1. From the start menu, type SPD and open SharePoint Designer 2013.
  2. Once it has loaded up, click Open Site
  3. Type the URL https://dev.cfcode2016.com click Open
  4. If prompted, enter your credentials
    User: cfcode2016\SP_Setup
    Password: Pa55w0rd
  5. From the Navigation menu, select Workflows

  6. On the ribbon menu, select List Workflows > Documents

  7. In the Create List Workflow dialog, at the bottom you will see a dropdown box for Choose the platform to build your workflow on. Only SharePoint 2010 will be listed.

  8. When we come back to this later, we will see SharePoint 2013 Workflow. Close SharePoint Designer for now.

Configuring Workflow Manager accounts

The Workflow Manager will run under a new account that we haven't created yet. The steps below create it in Active Directory (a PowerShell alternative is shown after the list).

  1. On the Domain Controller machine, in the start menu, type Active Directory Users and Computers and open it.
  2. Expand the tree in the left hand pane to see the Managed Service Accounts OU. Select the Managed Service Accounts OU.
  3. Right click in the right hand pane, and select New > User.
  4. Create a user called SP_Workflow. Set the full name and log on name to SP_Workflow. Click Next.
  5. In the password dialog screen, enter the following and click Next
    1. Password and Confirm Password as: Pa55w0rd
    2. Untick User must change password at next logon.
    3. Leave User cannot change password as unticked
    4. Tick Password never expires
    5. Leave Account is disabled as unticked
    6. Click Next. Then click Finished.
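If you prefer to script this rather than clicking through Active Directory Users and Computers, a rough equivalent using the ActiveDirectory PowerShell module would be the following (a sketch; the -Path value assumes the cfcode2016.com domain used throughout this series):

#Create the SP_Workflow account with a non-expiring password (run on the Domain Controller)
Import-Module ActiveDirectory
New-ADUser -Name "SP_Workflow" -SamAccountName "SP_Workflow" `
    -AccountPassword (ConvertTo-SecureString "Pa55w0rd" -AsPlainText -Force) `
    -Path "CN=Managed Service Accounts,DC=cfcode2016,DC=com" `
    -PasswordNeverExpires $true -Enabled $true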

Setting up SQL with the correct Security Accounts

  1. On the SharePoint Machine, from the start menu, type SQL Server Management Studio and open up the application
  2. In SQL Server click Connect. (This should be to SQL2016 database instance).
  3. In the left hand menu expand Security. Right click Logins. And select New Login…
  4. In the Login – New dialog box, click the Search button.
  5. Click the Locations button and select Entire Directory.
  6. Type SP_Workflow in the Enter the object name to select, and click Check Names. This will resolve the name. Click OK.
  7. In the left hand panel select Server Roles.
  8. Tick both securityadmin and dbcreator then click OK.
  9. Close down SQL Server Management Studio

Giving SP_Workflow administrative rights on the SharePoint machine.

  1. From the start menu, type Edit local users and groups and open up the application.
  2. In the left hand panel, select Groups
  3. In the right hand pane, double click Administrators
  4. On the Administrators Properties dialog box, click Add
  5. Type SP_Workflow in the Enter the object name to select, and click Check Names. This will resolve the name. Click OK.
  6. Close Edit local users and groups.

Install the Microsoft Web Platform Installer 5.0

  1. Go to the URL https://www.microsoft.com/web/downloads/platform.aspx and download the latest Microsoft Web Platform Installer
  2. Once downloaded run the file wpilauncher.exe
  3. If, like my machine, it is already on there, it will just open the Web Platform Installer 5.0; otherwise it will install it for you. Accept the License Agreement and click Install. Then click Finish when complete.

Install Workflow manager

  1. From the Start menu, type Web Platform Installer and open the application

  2. In the search box in the top right of the screen, type Workflow Manager and press Enter.
  3. Click Add on Workflow Manager 1.0 Refresh (CU2) and click Install at the bottom.

  4. Click I Accept

  5. When complete, click Continue.

  6. Click Finish.

  7. Close the Workflow Manager Configuration Wizard that has popped up.

Apply Cumulative Update 3.0 for Workflow Manager 1.0

  1. Close and re-open the Web Platform Installer 5.0; we are going to install CU3. (You need to close and re-open it, otherwise the installer thinks Workflow Manager 1.0 hasn't been installed.)
  2. Type Workflow Manager and press Enter in the top right search box.
  3. Click Add for Workflow Manager 1.0 Cumulative Update 3, then click Install at the bottom.
  4. Click I Accept. Once installed click Finish. Click Exit on the Web Platform Installer.

Configure the Workflow manager

  1. From the start menu, type Workflow Manager Configuration
  2. Click on Configure Workflow Manager with Custom Settings
  3. In the Configure Farm Management Database,
    1. Enter your SQL Server Instance: sql2016.cfcode2016.com
    2. Tick Use the above SQL Server Instance and Settings for all Databases
    3. Enter the Database Name: WF_ManagementDB
    4. Click Test Connection button to ensure all working OK.
  4. In the Configure Instance Management Database
    1. Enter the Database Name: WF_InstanceManagementDB
    2. Click Test Connection button to ensure all working OK.
  5. In the Configure Resource Management Database
    1. Enter the Database Name: WF_ResourceManagementDB
    2. Click Test Connection button to ensure all working OK.
  6. In the Configure Service Account
    1. Enter the User ID: CFCODE2016\SP_Workflow
    2. Enter the Password: Pa55w0rd
  7. In Configure Certificates
    1. Leave Auto-generate ticked
    2. Certificate Generation Key: Pa55w0rd
    3. Confirm Certificate Generation Key: Pa55w0rd
  8. In Configure Ports leave default port numbers
    1. https: 12290
    2. http: 12291
    3. Leave Allow Workflow management over HTTP on this Computer unticked
    4. Leave Enable firewall rules on this computer unticked (as we have disabled our firewall)
  9. In Configure Admin Group
    1. Leave BUILTIN\Administrators
  10. Click Next button
  11. On the Service Bus Configuration page, please provide the following
  12. In Configure Farm Management Database
    1. Enter the Database Name: Sb_ManagementDB
    2. Click Test Connection button to ensure all working OK
  13. In Configure Gateway Database
    1. Enter the Database Name: Sb_GatewayDB
    2. Click Test Connection button to ensure all working OK
  14. In Configure Message Container Database
    1. Enter the Database Name: Sb_MessageContainerDB
    2. Click Test Connection button to ensure all working OK
  15. In Configure Service Account
    1. Tick Use the same service account credentials as provided for Workflow Manager
  16. In Configure Certificate
    1. Tick Auto-generate
    2. Tick Use the same certificate generation key as provided for Workflow Manager
  17. In Configure Ports
    1. https: 9355
    2. tcp: 9354
    3. Message Broker Port: 9356
    4. Internal communication Port Range: 9000
    5. Untick Enable firewall rules on this computer (as we have disabled our firewall)
  18. In Configure Admin Group
    1. Leave BUILTIN\Administrators
  19. Click Next button
  20. On the Summary page, click the Tick button at the bottom right of the screen to start installation.
  21. The configuration process can take up to 10 minutes to complete. Once complete, you will see a success page.

Add Workflow Manager Certificate into SharePoint

  1. In Start Menu, type IIS and open Internet Information Services (IIS) Manager
  2. Expand your server name, and Sites. You will now see a site called Workflow Management Site

  3. Click on Workflow Management Site, then on the right hand pane, click Bindings
  4. Select https and click edit.

  5. On the Edit Site Binding, under SSL certificate you will see a Certificate that matches your Server Name. Click the View button.

  6. On the Certificate dialog, click on the Details tab.
  7. Then click Copy to File button.
  8. On the Certificate Export Wizard click Next.
  9. On the Export Private Key page, select No, do not export the private key, click Next

  10. On Export File Format page, select DER encoded binary X.509 (.CER) Click Next
  11. On File to Export page, select a path and filename on your machine. Click Next.

  12. Click Finish. You will receive a successful export message.

Import Certificate into SharePoint Trust

  1. Open SharePoint 2016 central administration
  2. Under Security > General Security click Manage Trust
  3. Click the New button in the ribbon.
  4. On the Establish Trust Relationship page, enter following information:
    1. Name: Workflow Manager
    2. Root Authority Certificate: <Select your file from previous steps>
  5. Click OK.
  6. You will see your certificate in the store.

Register Workflow Service Proxy

  1. In Start Menu, type SharePoint 2016 management Shell (run as administrator) and open the application
  2. In the console type:
    Register-SPWorkflowService -SPSite "https://intranet.cfcode2016.com" -WorkflowHostUri "https://cfsp2016.cfcode2016.com:12290"

Verify the Configuration of Workflow Manager.

  1. Open SharePoint 2016 central administration
  2. Click Application Management
    > Manage services applications
  3. At the bottom of the Manage Services Applications page, there will be Workflow Service Application Proxy

  4. If you click on Workflow Service Application Proxy, it will take you to a status page that shows the workflow is now connected.

Check to see Workflow Settings are working in SharePoint Designer

  1. From the start menu, type SharePoint Designer and open the application
  2. Once SharePoint designer has opened, click Open Site.
  3. Type the URL https://dev.cfcode2016.com click Open.
  4. If prompted, enter your credentials
    User: CFCode2016\SP_Setup
    Password: Pa55w0rd
  5. From the Navigation menu, select Workflows
  6. On the ribbon menu, select List Workflow > Documents
  7. In the Create List Workflow dialog, at the bottom you will see a dropdown box for Choose the platform to build your workflow on. Both SharePoint 2010 and SharePoint 2013 should be listed if the Workflow is set up correctly.

We are almost at the end. Your SharePoint farm is configured to give you a good start as a development machine. The only thing left now is the actual development tools, which will be covered in my final post of the series. Shut down your machines and take a checkpoint. (We will remove checkpoints in the last post.)