How to Deploy Office 2016 ProPlus Click-to-Run with ODT and SCCM

In this blog post, we will deploy Office 2016 ProPlus (retail) with the Office 2016 Deployment Tool (ODT) and System Center Configuration Manager (SCCM).

Office 2016 Deployment Tool (ODT)

To begin, we need to get the Office 2016 Deployment Tool (ODT), which can be downloaded from the Microsoft Download Center. Create a folder in your SCCM application source location; I called mine "Office 2016". Run the deployment tool on your SCCM server and save the extracted files to the folder you just created. Once the extraction is complete, the following two files will be found:

Next, we need to create an XML file within the folder. I copied the original "configuration.xml", renamed it "Office 2016 Config.xml", and updated its contents as shown below. In my deployment, I am deploying Office 2016 32-bit. However, if you are deploying 64-bit, just change OfficeClientEdition="32" to OfficeClientEdition="64".

<Configuration>
  <Add SourcePath="your path to source files" OfficeClientEdition="32">
    <Product ID="O365ProPlusRetail">
      <Language ID="en-us" />
    </Product>
  </Add>
</Configuration>

Next, we need to open Command Prompt (run as Administrator) and run "setup.exe" with the /download switch and the XML file we just created. After this completes (give it a few minutes), you should now have the following four files within your folder.

setup.exe /download "Office 2016 Config.xml"

Next, we need to update the "configuration.xml" file. This file is used to deploy Office 2016. As before, the version is set to 32-bit; change this to 64 if you are deploying 64-bit Office. In this deployment, I am using a per-user licensing model; if you are using a product key per machine, you will need to add a "PIDKEY" attribute to the Product element in the configuration file.

<Configuration>
  <Add OfficeClientEdition="32">
    <Product ID="O365ProPlusRetail">
      <Language ID="en-us" />
    </Product>
  </Add>
  <Display Level="None" AcceptEULA="TRUE" />
</Configuration>

Now we are ready to create and deploy our application package!

Create Application Deployment

First, we need to create the application package. We will choose the "Manually specify the application information" approach here.

Next, we need to provide some application information: the Office 2016 deployment name, owner, etc.

Now we need to add and create the deployment type.

Next, we will choose “Manually specify the deployment type information“.

Again, give this deployment a name, and some descriptive comment(s).

Now, we need to specify the location of the source/installation file(s) and the "configuration.xml" file.
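If you are filling in the installation program field manually, it would typically point setup.exe at the deployment XML, something along these lines (assuming the file is still named "configuration.xml" as above):

setup.exe /configure "configuration.xml"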

Next, we want to add a detection clause. Essentially, once the application is deployed, SCCM will validate against this rule to confirm the installation was successful, i.e. that the detection code and product code match.

Note, if the deployment "fails" yet the Office suite installed, confirm that the product code and detection code match.

For the detection method, we will choose Windows Installer and the following product code: {90160000-008C-0000-1000-0000000FF1CE}.
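If you want to spot-check a machine where Office 2016 Click-to-Run is already installed, one way to confirm that this MSI product code is actually present is a quick WMI query (Win32_Product queries are slow, so treat this as a manual sanity check, not something to put in the detection rule itself):

# Returns the matching MSI product if the Office 2016 licensing component is installed.
Get-WmiObject -Class Win32_Product -Filter "IdentifyingNumber='{90160000-008C-0000-1000-0000000FF1CE}'"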

Next, we will select Install, and leave the logon requirement set to either.

We have no requirements and/or dependencies for this deployment, but for completeness, here are those screenshots.

Great! The deployment type is complete. Now we need to finish the application creation wizard.

Great, the application is now created. Now we need to deploy the package itself… Let's do that.

Deploy the Package to Collection(s)

Right-click the application and select your collection; in this case, my collection is a test group named "Test1".

Specify the distribution point.

We are going to set the action to Install and the purpose to Available in the deployment settings here.

Provide a set time for the deployment to kick off, and remember to set it to the correct time of day (I struggled with a few deployments before realizing I had forgotten to set it to AM…).

We will give the user the option to install, as the application will appear in their Software Center.

Now we can go to our client machine(s). I am testing on both Windows 7 and Windows 10 machines.

Validate Deployment

If we go into Software Center and check under "Available Software", we now see Office 2016 ready for deployment! Go ahead and hit Install Selected, and let the magic happen!

Windows 7

We can validate the deployment, as we can see the Office 2016 applications within the Start menu.
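If you prefer to confirm from PowerShell rather than the Start menu, the Click-to-Run configuration in the client registry shows which build and bitness actually landed. The path and value names below apply to Office 2016 Click-to-Run installs; treat this as a quick sanity check rather than an official detection method:

# Shows the installed Click-to-Run build, platform (x86/x64), and language.
Get-ItemProperty "HKLM:\SOFTWARE\Microsoft\Office\ClickToRun\Configuration" |
    Select-Object VersionToReport, Platform, ClientCulture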

Likewise for Windows 10:


For complete information on this deployment, please feel free to visit Microsoft’s article.


Configuring RSA Authentication Agent for ADFS 3.0 + Office 365

Security and multi-factor authentication (MFA) are some of the big buzzwords this year (2017), and when deploying Office 365, MFA is almost a no-brainer. In the following post, I will demonstrate how to configure the RSA Authentication Agent for ADFS 3.0. Some configuration was done prior to the agent deployment, e.g. TCP/UDP ports, RSA Auto-Registration, and the sdconf.rec export. For the full documentation, please see the footnotes from RSA and Microsoft for ADFS 3.0 implementation requirements and guidelines.

Let's get started. Please note, the following is for Windows Server 2012 R2 (ADFS 3.0) and RSA Authentication Agent 1.0.2.

You will need the "sdconf.rec" file from your RSA administrator(s).


Next, within the ~\RSA\RSA Authentication Agent\AD FS Adapter\ folder, copy the “ADFSRegistrationSample.ps1” script to the “SampleRegistrationScripts” folder. This is a known bug in RSA Authentication Agent 1.0.2, as the file should be within the folder by default, but it is not.

Execute the PowerShell script as Local Administrator…
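As a rough sketch, the copy and the registration run look something like the following from an elevated PowerShell session. The paths assume a default agent install location under C:\Program Files; adjust them to wherever your agent is installed:

# Work around the RSA Authentication Agent 1.0.2 packaging bug by copying the
# sample registration script into the folder it is expected to run from.
$adapterPath = "C:\Program Files\RSA\RSA Authentication Agent\AD FS Adapter"
Copy-Item "$adapterPath\ADFSRegistrationSample.ps1" "$adapterPath\SampleRegistrationScripts\"

# Run the registration script from that folder (elevated prompt).
Set-Location "$adapterPath\SampleRegistrationScripts"
.\ADFSRegistrationSample.ps1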

Now you should be able to see the RSA configurations within the AD FS management console.

If we go into Authentication Policies > Per Relying Party Trust, we can now edit the MFA settings for Office 365.

For this demo, we will enable both Extranet and Intranet.

Enable RSA SecurID Authentication. Now, if everything was configured correctly, users signing in to the Office 365 portal will be prompted for an RSA token once they supply valid Office 365/AD credentials!


System Center Virtual Machine Manager (SCVMM) 2016 – Error 2912 – Unknown error (0x80041008)

Problem: Cannot deploy a logical switch (vSwitch) to a Windows Server 2016 node.

Environment: 2 x 10 Gb network cards – IBM Flex chassis (not that it matters…)

Error:

An internal error has occurred trying to contact the ‘hypervserver01.domain.com’ server: : .

WinRM: URL: [http://hypervserver01.domain.com:5985], Verb: [INVOKE], Method: [GetFinalResult], Resource: [http://schemas.microsoft.com/wbem/wsman/1/wmi/root/scvmm/AsyncTask?ID=1001]

Unknown error (0x80041008)

Recommended Action
Check that WS-Management service is installed and running on server ‘hypervserver01.domain.com’. For more information use the command “winrm helpmsg hresult”. If ‘hypervserver01.domain.com’ is a host/library/update server or a PXE server role then ensure that VMM agent is installed and running. Refer to http://support.microsoft.com/kb/2742275 for more details.

Solution: In my case, I tried the following. Ultimately, it came down to the last item (enabling the secondary physical network card).

  • Disable the Windows Firewall on both the SCVMM server and the Hyper-V 2016 host
  • Reset the WinRM HTTP listener to the default port, 5985
winrm set winrm/config/Listener?Address=*+Transport=HTTP '@{Port="5985"}'

  • Enable the secondary physical port
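For the last two items, a couple of quick checks from PowerShell can save some digging. This is a sketch, assuming the built-in NetAdapter and WSMan tooling on Server 2016; the host name is the example from the error above:

# From the SCVMM server: confirm WinRM on the Hyper-V host answers on the default port.
Test-WSMan -ComputerName "hypervserver01.domain.com"

# On the Hyper-V host: list any disabled physical adapters, then bring them up.
Get-NetAdapter | Where-Object { $_.Status -eq "Disabled" }
Get-NetAdapter | Where-Object { $_.Status -eq "Disabled" } | Enable-NetAdapter -Confirm:$false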

Connect Batch of Azure VMs to Log Analytics (OMS) via PowerShell

So, you have a bunch of Virtual Machines (VMs) in Azure, you didn't use an ARM template, and now you need to connect the VMs to Log Analytics (OMS). Earlier this month, I demonstrated how this can be done with the ARM portal; here's that blog post. Of course, that has to be done individually and can be very tedious if you have tens or hundreds of machines to work through… All I can think of is PowerShell!

Here is a script Microsoft has already provided for a single VM. I have tweaked it to traverse your entire resource group and add ALL VMs within the RG to Log Analytics.

Here is the link to Microsoft TechNet for that script. Please test it out and let me know. And if it helped you out, please give it a 5-star rating.

Microsoft TechNet PowerShell Gallery
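The core of the approach is roughly the sketch below (not the published script itself): loop over every VM in the resource group and push the Microsoft Monitoring Agent extension with your workspace details. The resource group, workspace ID, and key values are placeholders you would supply, and this applies to Windows VMs (Linux uses a different extension):

# Placeholders: substitute your own resource group and Log Analytics workspace details.
$resourceGroup = "MyResourceGroup"
$workspaceId   = "<OMS workspace ID>"
$workspaceKey  = "<OMS workspace primary key>"

# Push the Microsoft Monitoring Agent extension to every VM in the resource group.
foreach ($vm in Get-AzureRmVM -ResourceGroupName $resourceGroup) {
    Set-AzureRmVMExtension -ResourceGroupName $resourceGroup `
        -VMName $vm.Name `
        -Name "MicrosoftMonitoringAgent" `
        -Publisher "Microsoft.EnterpriseCloud.Monitoring" `
        -ExtensionType "MicrosoftMonitoringAgent" `
        -TypeHandlerVersion "1.0" `
        -Location $vm.Location `
        -Settings @{ "workspaceId" = $workspaceId } `
        -ProtectedSettings @{ "workspaceKey" = $workspaceKey }
}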

If all went well, your before and after should look similar to this. I had two test VMs in my Resource Group.

Before:

After:


What is Azure File Sync (AFS) and how to set it up?

Earlier this month, Microsoft introduced Azure File Sync (AFS). So, what is Azure File Sync (AFS)?

Azure File Sync is a cloud-based backup solution for backing up and providing disaster recovery options for one or more file shares on a single server or multiple servers. Some of the benefits are:

  • Eliminates network and storage complexity and capacity planning, as it is done for you in Azure.
  • Changes to on-premises data are synchronized in real time to Azure, and file/folder backup is completely seamless to the end-user(s).
  • At the current time, AFS offers 120 days of data retention.
    • I suspect this will increase over time, and will allow administrators to have options with higher or lesser days of retention.

Setting up and configuring Azure File Sync is pretty quick. Below is how I set up Azure File Sync to sync a folder and its files from my local server to Azure. AFS is pretty cool stuff, and I have been wanting to chat about it for some time (NDA). At any rate, getting AFS set up is pretty easy. Microsoft provides pretty good documentation on how to do this as well, but in my opinion, they have elected to omit some steps. Here is my take:

First, you will need to create a new Storage Sync Service. Within the Azure marketplace, search for "Azure File Sync". Note, Azure File Sync is currently only available in a limited set of regions:

  • Southeast Asia
  • Australia East
  • West Europe
  • West US

Once created, under Sync, and getting started, download the Storage Sync Agent.

Note, Azure File Sync currently only works with Windows Server 2016 and Windows Server 2012 R2 (servers must be installed with a GUI — no core).

Download and install the agent on your local server, and configure it to the Storage Sync Service you just created in Azure.

Whoops, since this is a brand new server install, there are no AzureRM PowerShell modules installed. Go ahead and launch PowerShell as an Administrator, and execute the cmdlet "Install-Module AzureRM -Force".
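If you want to avoid a pointless reinstall on servers that already have the module, a small guard like this works (just a convenience sketch; the agent simply needs AzureRM present):

# Install AzureRM from the PowerShell Gallery only if it is not already available.
if (-not (Get-Module -ListAvailable -Name AzureRM)) {
    Install-Module -Name AzureRM -Force
}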

Okay, back to the install. Remember to select the Storage Sync Service you just created in Azure.

Once the install is complete, go back to Azure, and under Sync, Registered Servers, your local server should now be present.

Great, now we need to create a Storage account. We can either choose an existing storage account or create a new one – I chose the latter.

Regardless of which route you take with the Storage account, go into the Storage account properties, scroll down to File Service, and select Files.

Create a File Share, give it a name and a quota. I gave it 1 GB, as this is simply for testing and PoC. The file path is the same file path you want to back up to AFS, and it should already exist on your local server(s).
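If you prefer to script the share creation instead of clicking through the portal, a quick sketch with the AzureRM/Azure.Storage cmdlets looks like this (the storage account, resource group, and share names are placeholders):

# Placeholders: your storage account, its resource group, and the share name.
$resourceGroup  = "MyResourceGroup"
$storageAccount = "mystorageaccount"
$shareName      = "afs-share"

# Build a storage context from the account key, create the share, and set a 1 GB quota.
$key = (Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroup -Name $storageAccount)[0].Value
$ctx = New-AzureStorageContext -StorageAccountName $storageAccount -StorageAccountKey $key
New-AzureStorageShare -Name $shareName -Context $ctx
Set-AzureStorageShareQuota -ShareName $shareName -Quota 1 -Context $ctx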

Now go back to your Storage Sync Service, and under Sync, Sync Groups, create a new Sync Group. For the Azure File Share, select the File Share we just created within our Storage account.

Finally, we can now create a server endpoint. Go back to your Sync Groups and create a new server endpoint. Here you will need to specify the file/folder you want to share/copy/back up to Azure File Sync (AFS).

And that is it! Next I will show you how you can actually restore from your Azure File Sync.

Connect Azure VMs to Log Analytics (OMS) via ARM Portal

Let's say you have a bunch of machines in Azure, and you want them communicating with Azure Log Analytics (aka OMS). Well, I am pretty sure the last thing you want to do is deploy the Microsoft Monitoring Agent to each machine manually…

Well, now you can connect a VM to Log Analytics (OMS) with just a few clicks.

Go into the ARM (Azure Resource Manager) portal, navigate to your "Log Analytics" blade, select your OMS workspace name, and within the Workspace Data Sources, select Virtual Machines.

Here you should have your machines that currently live within Azure. As you can see, there is one machine that is not connected to the OMS workspace. Let’s connect it now.

Select the VM in question, and you will now be presented with the following:

Make sure the VM is online/running, and select Connect. The VM must be online in order for the extensions to be passed through.

Give it a few moments, and there we go! No manual agent deployment.


We can also now verify in OMS that our new machine is chatting with Log Analytics. (Go into the Agent Health solution/tile.)

ADFS Monitoring with Azure, OMS, SCOM 2016

ADFS (Active Directory Federation Services) has really taken flight since the inception of Office 365 and Azure Active Directory. Getting your on-premises environment configured with online identity services such as Azure, and having SSO (Single Sign-On) capabilities, makes ADFS fundamental. Implementing ADFS is one thing, but what about monitoring your ADFS environment?

The following post is intended to illustrate the differences between ADFS monitoring by comparing the following monitoring tools: Azure AD Connect Health, OMS (Operations Management Suite) and SCOM 2016 (System Center Operations Manager).

SCOM (Operations Manager) 2016

The first step is to deploy SCOM agents to your ADFS environment/servers, along with installing the ADFS Management Pack. Once that is complete and discovery has run, we should start seeing data within the ADFS view(s).

Within the ADFS view, we can see some useful information, such as token requests. This data is represented on an hourly basis, and we can see the number of tokens being requested per hour over the given date range.

Another good view is the Failed Password Attempts. We can see how many bad password attempts were made over the given date range, but additional information, such as which user and when, would be useful.

This information is all good; however, without doing some custom management pack work, it is impossible to get any additional data, i.e. which users are requesting tokens, which users are entering bad passwords, and which users are connecting to which site/service offered by ADFS.

OMS (Operations Management Suite)

OMS does a nice job with dashboards, but unlike SCOM, we not only need to know which Event IDs to capture, we also need to build our dashboards out. This is not ideal, as it requires some custom work and some investigation into ADFS-related Event IDs.

The query below, "EventID=4648 OR EventID=4624 | measure count() by TargetAccount", shows us which target account/Active Directory user has requested the most ADFS tokens over the last hour. Please note, this query is based on the OMS Log Analytics query language version 1.

Since OMS does require a lot of ADFS knowledge, i.e. Event IDs, I decided not to proceed any further with building additional queries and dashboards.

Azure AD Connect Health

Lastly, Azure AD Connect Health is probably the simplest and least technical to configure.

As a prerequisite, I enabled all event types on the ADFS logs.
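For reference, enabling that auditing on an ADFS 3.0 server boils down to turning on success/failure auditing and widening the ADFS log level. A sketch of the usual commands is below (run elevated on each federation server; confirm the exact prerequisites against the Azure AD Connect Health documentation):

# Enable success and failure auditing for the "Application Generated" subcategory.
auditpol.exe /set /subcategory:"Application Generated" /failure:enable /success:enable

# Widen the ADFS log level to include success and failure audits.
Set-AdfsProperties -LogLevel @("Errors","Warnings","Information","SuccessAudits","FailureAudits")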

After running the AD Connect Health agent on the ADFS server(s) and launching the Azure Resource Manager portal, we get some dashboards. Right off the bat, we can see some excellent information. Let's take a deeper look.

If we click on the Total Requests widget, it shows us similar data to what we see in SCOM 2016, with some exceptions. Not only can we see the number of tokens being requested, we can also see which ADFS server within the farm is distributing the tokens. Since this is a highly available and load-balanced configuration, it is comforting to know ADFS is distributing tokens as designed.

Secondly, we can also see which services within ADFS are generating the most hits. This is great for seeing which sites are the busiest. This is something that is lacking in SCOM and OMS, and something I was unable to produce even after some custom MP work.


If we go into the Bad Password Attempts widget, we can see not only the number of bad password attempts, but also which user made the attempt, at what time, and from which source IP. Very cool!

Overall, AD Connect Health does an excellent job and provides rich data and expands on what SCOM already does.

Verdict

After comparing SCOM 2016, OMS, and Azure AD Connect Health, the clear winner is Azure AD Connect Health. Not only is the configuration straightforward, it also provides more than enough information to monitor the ADFS environment. Azure AD Connect Health provides rich and very clear dashboards with almost no effort other than some log configuration on the ADFS server(s). The data is comparable to what SCOM presents, however much richer and more detailed. OMS and SCOM are still good tools; however, they require more technical knowledge, and building the dashboards can be laborious.