Category Archives: Azure

Deploying BGInfo using Azure Blob Storage and Microsoft Endpoint Manager

Delivering your corporate applications can be a nightmare if you don't have an enterprise delivery solution like System Center or a third-party mechanism.

So let’s see how Azure Blob Storage and Microsoft Intune can address this issue by using a storage location and PowerShell script.

Azure Storage Account

One of the requirements for this solution is an Azure Storage Account within your Azure subscription, this account will be used for storing the applications which you would like to roll out to your Windows 10 desktops that are managed using Microsoft Intune.

Storage Account

Specify the required settings within the Basic tab for creating a Storage Account.

Basic Properties

Using the default settings as shown below

Advanced Properties

Click Review and Create
Click Create

Configuring Storage Account with required Applications

Click Container
Specify the Name
Select Container (anonymous read access for containers and blobs) under Public Access Level

Blob – Container

Select your container
Select Upload
Select the files you want to upload
Modify the block size if it’s less than the size of the files you are uploading
Select Upload

Once the files are uploaded, they each have a unique URL which is used to identify the file, as shown below. This will be required later for the PowerShell script.

The PowerShell Script!!!

This script has been made available on GitHub; you will need to modify the following:

$bginfo64 and $layout to reference your Azure Blob Storage for each file

Download Script

https://github.com/TheWatcherNode/blogaboutcloud/blob/master/Get-BGInfo.ps1
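Inside the script, the two variables simply point at the blob URLs, and the files can then be pulled down with a plain web request because the container allows anonymous read access. A minimal sketch (the storage account, container, file names and target folder below are assumptions - substitute the URLs you noted from your own container):

```powershell
# Hypothetical blob URLs - replace with the URLs from your own storage account
$bginfo64 = "https://mystorageaccount.blob.core.windows.net/bginfo/Bginfo64.exe"
$layout   = "https://mystorageaccount.blob.core.windows.net/bginfo/custom.bgi"

# Anonymous read access means no storage key is needed to download the files
New-Item -Path "$env:ProgramData\BGInfo" -ItemType Directory -Force | Out-Null
Invoke-WebRequest -Uri $bginfo64 -OutFile "$env:ProgramData\BGInfo\Bginfo64.exe" -UseBasicParsing
Invoke-WebRequest -Uri $layout -OutFile "$env:ProgramData\BGInfo\custom.bgi" -UseBasicParsing
```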

Publish script via Microsoft Endpoint Manager

Launch Microsoft Endpoint Manager https://endpoint.microsoft.com

Browse to Devices –> Scripts –> Click Add –> Select Windows 10

Provide a name and description (optional), then press Next

Provide your script and select Run script in 64-bit PowerShell host, then press Next

Press Next on Scope Tags, unless you utilize them within your environment

Select the group(s) you wish to target, then press Next

Press Add to complete

Once the script has been applied to the required workstations, BGInfo will be presented on the desktop wallpaper at the next reboot.

Regards
The Author – Blogabout.Cloud

Invoke-WebRequest: The response content cannot be parsed when adapting a local PowerShell script for Azure Automation.

I have recently been adapting a PowerShell script that collects Microsoft 365 Service Health using the Office 365 Service Communications API and publishes the information into ServiceNow, logging a support ticket to make the helpdesk aware of a potential issue in the service. The script was working with no issues when run manually from PowerShell ISE but failed when run by Azure Automation.

The focus of this post is to show two specific steps for adapting a locally executed PowerShell script for an Azure Automation runbook. In addition to standard work in Automation to add credentials/connections and create parameters where needed, the following changes were required to adapt a local script for my runbook.

1) Add -UseBasicParsing to Web Requests

The Management API script is filled with various requests using Invoke-WebRequest. Initially, I received numerous errors stating that the Internet Explorer engine is not available.

Invoke-WebRequest : The response content cannot be parsed because the Internet Explorer engine is not available, or 
Internet Explorer's first-launch configuration is not complete. Specify the UseBasicParsing parameter and try again. 
At line:46 char:9
+ $subs = Invoke-WebRequest -Headers $headerParams -Uri "https://manage ...
+         ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotImplemented: (:) [Invoke-WebRequest], NotSupportedException
    + FullyQualifiedErrorId : WebCmdletIEDomNotSupportedException,Microsoft.PowerShell.Commands.InvokeWebRequestCommand

To overcome this, I needed to add the UseBasicParsing parameter.

Original: $oauth = Invoke-RestMethod -Method Post -Uri $loginURL/$tenantdomain/oauth2/token?api-version=1.0 -Body $body
Adapted: $oauth = Invoke-RestMethod -Method Post -Uri $loginURL/$tenantdomain/oauth2/token?api-version=1.0 -Body $body -UseBasicParsing

2) Alter $ProgressPreference

PowerShell shows progress using $ProgressPreference, which defaults to "Continue". For Automation, I needed to change the option to "SilentlyContinue" to suppress progress bars. Locally, the progress can be convenient to see, but it was causing issues with Automation.

Adapted: $ProgressPreference = "SilentlyContinue"

Regards
The Author – Blogabout.Cloud

Publishing your Microsoft 365 Service Messages to a SharePoint List.

I have recently been working on a solution for a customer who required Microsoft 365 Service Messages to be logged as an incident in a Service Desk tool and published to a Microsoft Teams channel.

Microsoft Teams – this was the simple bit, as a nice person in the community built a PowerShell script available on GitHub.

https://github.com/einast/PS_M365_scripts/blob/master/AzureAutomation/AzO365ServiceHealth.ps1

Now for publishing the information to SharePoint Online: this caused quite a few headaches (and some banging of my head on my desk), but it now works. The reason for publishing to a SharePoint List is so that it can be used as a data source for pulling the information into ServiceNow.

You can do this with Power Automate, however the information required and provided is lacking, so the route I am going to demonstrate provides a lot more detail.

Create the SharePoint Site

The first thing to do is create a SharePoint Site that can be used as the location for the data. This can be simply done by connecting to SharePoint via PowerShell and running the following command.

You will need the following PowerShell Module
https://www.powershellgallery.com/packages/PnP.PowerShell

$URL = "https://blogabout.sharepoint.com/sites/Health-Status"
$Admin = "thewatcher@blogabout.cloud"

# New-PnPTenantSite requires a connection to the tenant admin site,
# not the site being created
Connect-PnPOnline -UseWebLogin -Url "https://blogabout-admin.sharepoint.com"
New-PnPTenantSite -Url $URL -Title "Health Status" -Owner $Admin -TimeZone 2

We now need to gather some information which will be used later in this blog. Using the same PowerShell console run the following cmdlet

Get-PnPList

Make a note of the Id for Health Status as we will need this later on.
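If you would rather capture the Id straight into a variable than copy it from the console output, a small sketch works (this assumes the PnP connection from the previous step is still open and the list title is "Health Status"):

```powershell
# Grab the list by title and keep its Id for later
$list = Get-PnPList | Where-Object { $_.Title -eq "Health Status" }
$list.Id
```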

We now need to configure the SharePoint List with required columns for receiving the data.

The columns used are:

– Title
– Status
– Severity
– Service
– Classification
– Uri
– Message
– Reference
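The columns can be created through the SharePoint UI or scripted. A sketch using the PnP cmdlets (this assumes the list is called "Health Status" and that plain text columns are sufficient; Title already exists by default on every list):

```powershell
# Add the remaining columns as simple text fields
$columns = 'Status','Severity','Service','Classification','Uri','Message','Reference'
foreach ($column in $columns) {
    Add-PnPField -List "Health Status" -DisplayName $column -InternalName $column -Type Text -AddToDefaultView
}
```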

Modifying your Azure Automation Account for SharePoint Lists

In this post, I will assume that you have already implemented pushing messages to Microsoft Teams. If you haven't, please visit this article.

First of all, we need to add the PnP.PowerShell module to this Azure Automation account. Go to Modules under Shared Resources and select Browse Gallery.

Search for PnP.PowerShell and click on the module

Click Import

The import may take a few minutes to complete and once complete we can go on to the next step.

Select Credentials from Shared Resources

Important Notice

Some of the next steps I have already completed in my environment.

Click “Add a credential” and provide;

– Name – this is important, as the PowerShell script uses it to call the credentials
– Username
– Password

You can also provide a description if required.

It is now time to modify the Runbook with the required PowerShell cmdlets for publishing to SharePoint Lists.

Select your runbook to edit the script

Under User-Defined Variables I have added the following;

Please note: I am not using Automation Variables in this script, but they could be used if you add the variable and call Get-AutomationVariable -Name

$SpListName = "Health-Status"
$ListName = "Add the Health Status ID you gathered earlier"
$SpSiteUrl = "https://blogabout.sharepoint.com/sites/$SpListName"

https://pnp.github.io/powershell/articles/authentication.html

$myCred = Get-AutomationPSCredential -Name 'MyCredential'
$userName = $myCred.UserName
$securePassword = $myCred.Password
$password = $myCred.GetNetworkCredential().Password

$SpCreds = New-Object System.Management.Automation.PSCredential ($userName,$securePassword)

As you can see from above I am using Get-AutomationPSCredential -Name MyCredential to call stored credentials.

Under the following PowerShell cmdlet in the script

# If any new posts, add to Teams
Invoke-RestMethod -uri $uri -Method Post -body $Payload -ContentType 'application/json; charset=utf-8'

I have added;

# Add news post to Sharepoint Site
Connect-PnPOnline $SpSiteUrl -Credentials $SpCreds
$list =  Get-PnPList -identity $ListName

Add-PnPListItem -List $list -Values @{"Title"=$inc.Title;"Reference"=$inc.Id;"Status"=$inc.Status;"Severity"=$inc.Severity;
                "Service"=$inc.WorkloadDisplayName;"Classification"=$inc.Classification;"Uri"=$inc.PostIncidentDocumentUrl;"Message"=$Message}

This will now allow you to publish the data to SharePoint as shown below.

I am using Power Automate to trigger the Runbook; for more detail, visit the below post.

Hopefully this post will help you publish the data to SharePoint.

Regards
The Author – Blogabout.Cloud

Pushing Microsoft Health Status to Microsoft Teams using Azure Automation.

In this post I am going to demonstrate how to push Microsoft Health Status to Microsoft Teams using Azure Automation.

This post assumes you have already configured the Office 365 Service Communications API. If not, visit this blog post: https://www.blogabout.cloud/2020/10/1884/

Configuring Microsoft Teams for Incoming Webhook

Launch Microsoft Teams, right-click a channel within a Team and select Connectors. Search for Incoming Webhook and click Add

Click Add

Specify the name of the Webhook and click Create

Make a note of the URL below as you will need this for the PowerShell script later and click Done
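Before wiring the webhook into a runbook, it can be worth sending a quick test message to confirm the URL works. A minimal sketch (the webhook URL below is a placeholder - use the one you just copied):

```powershell
# Placeholder - replace with the URL copied from the Teams connector
$uri = "https://outlook.office.com/webhook/your-webhook-id"

# Teams renders the 'text' field of the payload as the message body
$payload = @{ text = "Webhook test from PowerShell" } | ConvertTo-Json

Invoke-RestMethod -Uri $uri -Method Post -Body $payload -ContentType 'application/json; charset=utf-8'
```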

Create Automation Account

Launch your Azure Portal and search for Automation Account to create a new account.

Specify;
– Name
– Subscription
– Resource Group
– Location

Select No for "Create Azure Run As Account"; it is not required here, as the runbook will authenticate against Office 365 using the app registration details stored as variables.
Click Create

Once the resource has been successfully created, go into the resource.

First of all, we need to create a number of variables; these will be used within the PowerShell script we are going to add to the Runbook.

You will need to add the following variables

– AzO365ServiceHealthApplicationID = Client ID
– AzO365ServiceHealthApplicationKey = Client Secret
– AzO365TeamsURI = the webhook URL we created earlier in the post
– AzO365TenantDomain = Tenant ID

If you are unsure how to obtain these values, please visit this blog post for more details
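The variables can be created in the portal under Shared Resources –> Variables, or scripted with the Az.Automation module. A sketch (the resource group and automation account names are assumptions; the client secret should be stored encrypted):

```powershell
# Assumed names - substitute your own resource group and automation account
$rg      = "rg-automation"
$account = "aa-m365-health"

New-AzAutomationVariable -ResourceGroupName $rg -AutomationAccountName $account -Name 'AzO365ServiceHealthApplicationID' -Value '<client id>' -Encrypted $false
New-AzAutomationVariable -ResourceGroupName $rg -AutomationAccountName $account -Name 'AzO365ServiceHealthApplicationKey' -Value '<client secret>' -Encrypted $true
New-AzAutomationVariable -ResourceGroupName $rg -AutomationAccountName $account -Name 'AzO365TeamsURI' -Value '<webhook url>' -Encrypted $false
New-AzAutomationVariable -ResourceGroupName $rg -AutomationAccountName $account -Name 'AzO365TenantDomain' -Value '<tenant id>' -Encrypted $false
```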

We now need to launch Runbooks

Select Create a runbook
– Provide a name
– Runbook Type should equal PowerShell
– Description (optional)

Press Create

I am now going to use the following PowerShell script available from GitHub within my Runbook. https://github.com/einast/PS_M365_scripts. This script has been created for the sole purpose of extracting the Microsoft 365 Service Health using the Office Service Communication API, which I demonstrated how to configure in another post.

Under the "User Defined Variables" you will need to edit:

Minutes to reflect the same time as your schedule.

Pushover notifications and the services to monitor, as shown below.

# ------------------------------------- USER DEFINED VARIABLES -------------------------------------

$ApplicationID = Get-AutomationVariable -Name 'AzO365ServiceHealthApplicationID'
$ApplicationKey = Get-AutomationVariable -Name 'AzO365ServiceHealthApplicationKey'
$TenantDomain = Get-AutomationVariable -Name 'AzO365TenantDomain'
$URI = Get-AutomationVariable -Name 'AzO365TeamsURI'
$Minutes = '1440'

# Pushover notifications in case Teams is down.
# Due to limitations and readability, the script will only send the title of the incident/advisory to Pushover. 
# COMMENT OUT THIS SECTION IF YOU DON'T WANT TO USE PUSHOVER!

#$Pushover = 'yes' # Comment out if you don't want to use Pushover. I use 'yes' for readability.
#$PushoverToken = Get-AutomationVariable -Name 'AzO365PushoverToken' # Your API token. Comment out if you don't want to use Pushover
#$PushoverUser = Get-AutomationVariable -Name 'AzO365PushoverUser' # User/Group token. Comment out if you don't want to use Pushover
#$PushoverURI = 'https://api.pushover.net/1/messages.json' # DO NOT CHANGE! Default Pushover URI. Comment out if you don't want to use Pushover

# Service(s) to monitor
# Leave the one(s) you DON'T want to check empty (with '' ), add a value in the ones you WANT to check (I added 'yes' for readability)

$ExchangeOnline = 'yes'
$MicrosoftForms = 'yes'
$MicrosoftIntune = 'yes'
$MicrosoftKaizala = 'yes'
$SkypeforBusiness = 'yes'
$MicrosoftDefenderATP = 'yes'
$MicrosoftFlow = 'yes'
$FlowinMicrosoft365 = 'yes'
$MicrosoftTeams = 'yes'
$MobileDeviceManagementforOffice365 = 'yes'
$OfficeClientApplications = 'yes'
$Officefortheweb = 'yes'
$OneDriveforBusiness = 'yes'
$IdentityService = 'yes'
$Office365Portal = 'yes'
$OfficeSubscription = 'yes'
$Planner = 'yes'
$PowerApps = 'yes'
$PowerAppsinMicrosoft365 = 'yes'
$PowerBI = 'yes'
$AzureInformationProtection = 'yes'
$SharePointOnline = 'yes'
$MicrosoftStaffHub = 'yes'
$YammerEnterprise = 'yes'

# Classification(s) to monitor
# Leave the one(s) you DON'T want to check empty (with '' ), add a value in the ones you WANT to check (I added 'yes' for readability)

$Incident = 'yes'
$Advisory = 'yes'

# ------------------------------------- END OF USER DEFINED VARIABLES -------------------------------------

You will now need to define a schedule within the Runbook for it to execute.

Once you have done all of the above, you will receive a nice notification in Microsoft Teams about the Service Health Status

In my next article I will show how simple it is to trigger this Runbook using Power Automate.

Regards
The Author – Blogabout.Cloud

Sign up for the new Microsoft Security Public Webinars

Microsoft have just released the following dates for all their new Security webinars. Go and check them out.

Below please find the full list. Details and registration are at https://aka.ms/SecurityWebinars.

JAN 27 – Azure Purview | Introduction to Azure Purview

FEB 3 – Azure Confidential Computing | Confidential computing nodes on Azure Kubernetes Service

FEB 4 – Azure Sentinel | Accelerate Your Azure Sentinel Deployment with the All-in-One Accelerator

FEB 16 – Microsoft Security Public Community | The Billion-Dollar Central Bank Heist

FEB 18 – Azure Sentinel | Best practices for converting detection rules

Interested in some bite-sized security tips? Check out our short videos at https://aka.ms/SecurityCommunityVideos.

To stay informed about future webinars and other events, join our Security Community at https://aka.ms/SecurityCommunity.

Regards
The Author – Blogabout.Cloud

Here's your chance to take the AZ-900 Fundamentals exam for free

This year Microsoft has made AZ-900 free for all those that attend both days of the below events, probably for the last time this year. Here's another opportunity for you to take the exam for FREE!

Having completed this exam myself, I strongly suggest you take this opportunity and get this baseline certification on your transcript.

Date: 16th & 17th December 2020, 10:00-13:45
Link: https://mktoevents.com/Microsoft+Event/209890/157-GQE-382?wt.mc_id=AID3023748_QSG_488072

Date: 9th & 10th December 2020, 10:00-13:05
Link: https://mktoevents.com/Microsoft+Event/211664/157-GQE-382?wt.mc_id=AID3023845_QSG_488067

Regards
The Author – Blogabout.Cloud

PowerShell Tip: Obtaining the ImmutableID from your Active Directory Objects on-prem and in the cloud

When working with Azure Active Directory Connect you may experience issues with accounts duplicating due to the ImmutableID not matching. If it does happen, it is a pain to resolve, as you have to:

Desynchronize the affected accounts
Delete from the Deleted Users OU in Azure Active Directory
Obtain the on-premises ImmutableID
Obtain the cloud ImmutableID
Compare the IDs
Set the cloud ID with the on-premises ID

Now wouldn't it be easier if someone had a bunch of PowerShell commands to help you get the ImmutableID? This is where I come in.

Obtaining ImmutableID from on-premises Active Directory Object

The following PowerShell script extracts the ImmutableID from every Active Directory user object and stores the results in a CSV file on your desktop.

$reportoutput = @()
$users = Get-ADUser -Filter * -Properties *
$users | ForEach-Object {

    $user = $_
    # The ImmutableID is the Base64-encoded ObjectGUID
    $immutableid = [System.Convert]::ToBase64String($user.ObjectGUID.ToByteArray())

    $report = New-Object -TypeName PSObject
    $report | Add-Member -MemberType NoteProperty -Name 'UserPrincipalName' -Value $user.UserPrincipalName
    $report | Add-Member -MemberType NoteProperty -Name 'SamAccountName' -Value $user.SamAccountName
    $report | Add-Member -MemberType NoteProperty -Name 'ImmutableID' -Value $immutableid
    $reportoutput += $report
}
# Report
$reportoutput | Export-Csv -Path $env:USERPROFILE\desktop\ImmutableID4AD.csv -NoTypeInformation -Encoding UTF8

Obtaining ImmutableID from Azure Active Directory Object

The following PowerShell script extracts the ImmutableID from every Azure Active Directory user object and stores the results in a CSV file on your desktop.

$reportoutput = @()
$users = Get-AzureADUser -All $true
$users | ForEach-Object {

    $user = $_

    $report = New-Object -TypeName PSObject
    $report | Add-Member -MemberType NoteProperty -Name 'UserPrincipalName' -Value $user.UserPrincipalName
    # Azure AD user objects have no SamAccountName; MailNickName is the closest equivalent
    $report | Add-Member -MemberType NoteProperty -Name 'MailNickName' -Value $user.MailNickName
    $report | Add-Member -MemberType NoteProperty -Name 'ImmutableID' -Value $user.ImmutableId
    $report | Add-Member -MemberType NoteProperty -Name 'DisplayName' -Value $user.DisplayName
    $reportoutput += $report
}
# Report
$reportoutput | Export-Csv -Path $env:USERPROFILE\onedrive\desktop\ImmutableID4AAD.csv -NoTypeInformation -Encoding UTF8

Recommendation

When I have had to compare the two exports at scale for an entire environment, it can be a complete nightmare, but the ImportExcel module was brilliant for merging the data into a single workbook.

https://www.powershellgallery.com/packages/ImportExcel/7.1.1
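As a sketch of that approach, the two CSV exports can be pulled into one workbook with a worksheet per source, which makes side-by-side comparison much easier (the file paths match the scripts above, and the ImportExcel module's Export-Excel cmdlet is assumed to be installed):

```powershell
# One workbook, one worksheet per directory, ready for comparison
$workbook = "$env:USERPROFILE\desktop\ImmutableID-Compare.xlsx"
Import-Csv "$env:USERPROFILE\desktop\ImmutableID4AD.csv" | Export-Excel -Path $workbook -WorksheetName 'OnPrem'
Import-Csv "$env:USERPROFILE\onedrive\desktop\ImmutableID4AAD.csv" | Export-Excel -Path $workbook -WorksheetName 'Cloud'
```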

Regards
The Author – Blogabout.Cloud

Quick Tips: How do I restore a deleted Microsoft Teams Team using PowerShell.

Sometimes accidents happen and things get deleted by mistake, but what do you do if you delete a Microsoft Teams team by mistake?

When a team is deleted, it is held in the "recycle bin" for 30 days before being permanently deleted. The following is the process for restoring a deleted team in Microsoft Teams.

  • Once Team is deleted, option to recover it exists for up to 30 days
  • All of it including (Channels, files, tabs, etc.) will reappear as it was before
  • Restore can take up to 4 hours
  • To restore, from exchange admin center, select recipients, then groups
  • Locate the group (only if soft deleted)
  • Select the group and choose restore

This is where my favourite Microsoft tool comes into action: PowerShell.

Prerequisites

AzureAD Module is installed
Global Admin to your Azure AD Tenant

Process

Launch PowerShell as an administrator


Connect-AzureAD

When a team is created in Microsoft Teams, it creates an Office 365 group. In order to restore the group, you will need to obtain its Id using the following cmdlet:

Get-AzureADMSDeletedGroup

Then restore it:

Restore-AzureADMSDeletedDirectoryObject -Id <objectID>

As mentioned above the restore could take up to 4 hours to complete.
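Putting those steps together, an end-to-end restore might be sketched like this (the team name "Project Falcon" is hypothetical):

```powershell
Connect-AzureAD

# Find the soft-deleted Office 365 group behind the team by display name
$deleted = Get-AzureADMSDeletedGroup | Where-Object { $_.DisplayName -eq "Project Falcon" }

# Kick off the restore; channels, files and tabs return with the group
Restore-AzureADMSDeletedDirectoryObject -Id $deleted.Id
```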

Regards
The Author – Blogabout.Cloud
           

Configuring Office 365 Service Communications API within your own Azure Active Directory.

The Office 365 Service Communications API enables an organization to gather data about its Microsoft 365 tenancy, and in this post we will be looking at Service Health.

Prerequisites

  • Relevant Azure Active Directory Permissions to create an app
    • Global Administrator
    • Application Administrator
    • Cloud Application Administrator
  • Licensed for Power Automate either;
    • Per-user plan
    • Per-user plan with attended RPA
    • Per Flow plan

Configuring Azure Active Directory

Log into the Azure Portal via https://portal.azure.com and browse to Azure Active Directory, then select App Registrations –> New Registration

Now enter a Name for the application, e.g. Office 365 Service Communications API, and select Accounts in this organizational directory only.

The Redirect URI can be ignored as it is not necessary here; then click Register.

The registered app you just created will now be displayed – click on API permissions on the left hand menu. Click on the Add a permission button in the Configured permissions section. Select Office 365 Management API in the Request API permissions section.

Select Application permissions as the type of permissions your application requires. Then Select ServiceHealth.Read as the permissions required and then select the Add permissions button.

Granting Tenant Admin Consent

The application is now configured with the permissions it needs to use the Office 365 Management APIs but first it needs an admin to grant these permissions. A Global Administrator, Application Administrator or Cloud Application administrator must explicitly grant your application these permissions. This is granting the app permissions to use the APIs to access your tenant’s data. 



If you do not have the necessary role please advise the admin to follow this link and provide them with the name of your App Registration to review and approve.

If you have the necessary Global Administrator, Application Administrator or Cloud Application administrator role click on the Grant admin consent to <tenant name> button.

Generate a new key / client secret for your application

Navigate to the main page of the App Registration you just created and make a note of the Application (client) ID and Directory (tenant) ID, as you will need these later to access the Office 365 Management API. A Client secret now needs to be generated for authentication to the APIs – click on Certificates & Secrets on the left-hand menu.

IMPORTANT: Make a note of the Client secret created, i.e. BlahBlah-BlahBlah. It is important that this is done now, as once this window is closed the Client secret will no longer be visible.

This completes the process for configuring Office 365 Service Communications API. The next step will be using either PowerShell or Power Automate to present the data.
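As a quick smoke test of the app registration from PowerShell, you can request an app-only token for the Management API using the values noted above. A sketch (the three placeholder values are the IDs and secret from your own registration):

```powershell
$ClientID     = "<application (client) id>"
$ClientSecret = "<client secret>"
$TenantDomain = "<directory (tenant) id>"
$loginURL     = "https://login.microsoftonline.com"

# Client-credentials grant against the Office 365 Management API resource
$body = @{
    grant_type    = "client_credentials"
    resource      = "https://manage.office.com"
    client_id     = $ClientID
    client_secret = $ClientSecret
}
$oauth = Invoke-RestMethod -Method Post -Uri "$loginURL/$TenantDomain/oauth2/token?api-version=1.0" -Body $body

# The bearer token is then attached to subsequent Service Health requests
$headerParams = @{ Authorization = "$($oauth.token_type) $($oauth.access_token)" }
```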

If you would like to utilize Power Automate, check out this blog post I created.

Regards
The Author – Blogabout.Cloud

Understanding Azure Active Directory Connector soft matching

It has become very common for organizations to set up an Office 365 or Azure tenancy without configuring integration with their on-premises Active Directory; another scenario is an organization that has been bought by another company. The end users are then given cloud-only accounts until such time as they can be fully integrated. Going down this road can potentially cause a number of issues that need to be resolved by either soft matching or hard matching the on-premises AD user with the cloud account.

How do I soft match?

Soft matching is driven by the SMTP address of the user account, and usually the UPN matches the SMTP address. In the diagram below, which I have created, you can see the two scenarios in which organizations move their on-premises identities to Azure Active Directory. I have also put a deliberate mistake into the images; can you spot what it is?

User D and Cloud D are the same user, but the UPN is different. Why have I done this? It is to explain the behavior that occurs when an account cannot be correctly matched with its cloud account. Users A to C will all synchronize successfully and correctly; however, User D will not synchronize successfully, as its UPN doesn't match Cloud D. While this isn't a bad issue in this scenario, if you were doing this at scale, I hope you are ready for a host of complaints from users.

It is important to ensure that the SMTP addresses match on-premises vs. in the cloud, but please be aware there are limitations, as in any Microsoft product.

SMTP matching limitations

The SMTP matching process has the following technical limitations:

  • SMTP matching can be run on user accounts that have a Microsoft Exchange Online email address. For mail-enabled groups and contacts, SMTP matching (Soft match) is supported based on proxy addresses. For detailed information, refer to the “Hard-match vs Soft-match” section of the following Microsoft Azure article: 

    Azure AD Connect: When you have an existent tenant

    Note This doesn’t mean the user must be licensed for Exchange Online. This means that a mailbox that has a primary email address must exist in Exchange Online for SMTP matching to work correctly.
  • SMTP matching can be used only one time for user accounts that were originally authored by using Office 365 management tools. After that, the Office 365 user account is bound to the on-premises user by an immutable identity value instead of a primary SMTP address.
  • The cloud user’s primary SMTP address can’t be updated during the SMTP matching process because the primary SMTP address is the value that is used to link the on-premises user to the cloud user.
  • SMTP addresses are considered unique values. Make sure that no two users have the same SMTP address. Otherwise, the sync will fail and you may receive an error message that resembles the following: Unable to update this object because the following attributes associated with this object have values that may already be associated with another object in your local directory services: [ProxyAddresses SMTP:john@contoso.com;]. Correct or remove the duplicate values in your local directory.

Hard matching works in a similar way but uses the ImmutableID of the user accounts. This is a unique value that each account has, so hard matching the on-premises ImmutableID to the cloud account means modifying every single cloud account with the correct on-premises value. I know this from experience, as I had to do just that for one of my customers and created a PowerShell script to enable the change.
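As an illustration, a hard match for a single user might look like the sketch below (the user names are hypothetical, and note that Azure AD will typically reject the change while the account is still within synchronization scope):

```powershell
# Compute the on-premises ImmutableID from the user's ObjectGUID
$adUser      = Get-ADUser -Identity "jsmith"
$immutableId = [System.Convert]::ToBase64String($adUser.ObjectGUID.ToByteArray())

# Stamp the value onto the matching cloud account
Connect-AzureAD
Set-AzureADUser -ObjectId "jsmith@contoso.com" -ImmutableId $immutableId
```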

Regards
The Author – Blogabout.Cloud