Richard works as a Cloud Consultant for Fordway Solution where his primary focus is to help customers understand, adopt and develop with Microsoft Azure, Office 365 and System Center.

Richard Green is an IT Pro with over 15 years of experience in all things Microsoft, including System Center and Office 365. He has previously worked as a System Center consultant and as an internal solutions architect across many verticals.

Outside of work, he loves motorbikes and is part of the orange army, marshaling for NGRRC, British Superbikes and MotoGP. He is also an Assistant Cub Scout Leader.

Preparing Certificates and GPOs for System Center Update Publisher

If you are using Configuration Manager to manage and patch your client estate, you already know that it’s great to have your Software Updates in the same console as your Application Delivery, and the way in which Configuration Manager 2012 R2 manages Software Updates is a big leap in usability over Configuration Manager 2007. The missing piece of the puzzle for many, however, is managing non-Microsoft updates, and for that we need to enlist the help of a free product from Microsoft called System Center Update Publisher (SCUP).

Before we start anything with Configuration Manager, WSUS or SCUP, however, we have the small matter of prerequisites to cover off, and in this case that means a certificate and a Group Policy setting or two. The certificate we are interested in is a Code Signing certificate which, unless you are familiar with signing PowerShell scripts that you author, you may not have come across previously and which your internal CA may not be set up to issue. You can buy a Code Signing certificate from an external third-party CA if you wish, but it is easiest and best done internally; after all, the code you are going to be signing is for updates to your internal clients.

Creating the Code Signing Certificate Template

The first step is to configure your Certificate Authority to issue a Code Signing certificate. You can either use the native Code Signing template or create a custom template just for SCUP so that you can limit the scope of the certificate template to a selected user or group of users. If you want to create a new template, duplicate the existing Code Signing template for the purpose.

Once you have decided on the template to use, configure the CA to issue the certificate. In my lab, my template is called SCUP Code Signing and the security on the template limits Enroll permissions to members of an Active Directory group called SCUP Code Signing Users, which prevents other users, malicious or otherwise, from requesting the certificate.

SCUP Code Signing CA Template
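
If you prefer to script this part, the ADCSAdministration module on Windows Server 2012 and later can add the template to the CA's list of templates to issue. A minimal sketch, assuming the template's internal name (the name without spaces shown on the template's properties) is SCUPCodeSigning; adjust it to match your own template.

# Run on the issuing CA. The template name here is an assumption based on my lab naming.
Import-Module ADCSAdministration
Add-CATemplate -Name "SCUPCodeSigning" -Force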

Request the Code Signing Certificate

Once you have configured everything on the CA, you need to request a certificate based on this template. Using the Certificates MMC snap-in for your user account, you can enrol for the certificate from your Active Directory Enrollment Policy.

SCUP Code Signing Certificate Request
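
If you would rather script the enrolment than click through the Certificates MMC, the Get-Certificate cmdlet from the PKI module (Windows 8 / Windows Server 2012 and later) can request against the Active Directory enrollment policy. A rough sketch, assuming the same SCUPCodeSigning template name as above.

# Request a certificate from the AD enrollment policy into the current user's personal store.
# The template name is an assumption; use the internal (no spaces) name of your own template.
Get-Certificate -Template "SCUPCodeSigning" -CertStoreLocation Cert:\CurrentUser\My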

If you based your new Certificate Template on the Code Signing template, or used the Code Signing template itself, you don’t need to enter any additional information and the request will be built from Active Directory user attributes. Once the certificate has been issued, you need to export it twice. For the first export, export the certificate only in .cer format and do not export the private key; this portion of the certificate will be used in the Group Policy Object shortly. The second export needs to be in .pfx format and include the private key, and is used to configure SCUP once it is installed.
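
The two exports can also be scripted with the PKI module's Export-Certificate and Export-PfxCertificate cmdlets. This is a sketch only; the subject filter and output paths are assumptions you will need to adjust to match your own certificate.

# Find the code signing certificate in the current user's personal store.
$cert = Get-ChildItem Cert:\CurrentUser\My |
    Where-Object { $_.Subject -match "SCUP" } | Select-Object -First 1

# Export 1: public portion only (.cer) for the Trusted Publishers GPO.
Export-Certificate -Cert $cert -FilePath C:\Certs\SCUPCodeSigning.cer

# Export 2: certificate with private key (.pfx) for use within SCUP.
$pfxPassword = Read-Host -AsSecureString -Prompt "PFX password"
Export-PfxCertificate -Cert $cert -FilePath C:\Certs\SCUPCodeSigning.pfx -Password $pfxPassword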

Configure the Trusted Publishers Group Policy Setting

Once you have issued the certificate and exported it twice, once as a .cer file and once as a .pfx file, we need to configure the Group Policy for Trusted Publishers. Put simply, in order for your client PCs to install updates that are not signed by Microsoft, the clients need to trust the updates, and for the updates to be trusted, they need to be signed with a certificate that the clients trust. Having a certificate from your internal CA isn’t enough for this on its own: a client will trust the certificate because it chains to a Trusted Root Certification Authority, but it will not be trusted for code signing unless it is added to the appropriate certificate store.

Using either a new Group Policy Object or an existing object which contains your other Certificate Services related settings, we need to add the .cer certificate exported earlier to the policy.

Trusted Publishers GPO Setting

Within the Group Policy Object, expand the Computer Configuration folder and then drill into Security Settings followed by Public Key Policies. Within the Public Key Policies folder, open the Trusted Publishers folder. In here, you need to import the Code Signing certificate .cer file that was previously exported. Doing this allows your clients to trust updates signed with this certificate for the publishing of software and applications.

Make sure you use the .cer export and not the .pfx export here as we only want the clients to have and trust the public key portion of the certificate. Distributing the .pfx would give these clients the private key too, which is not something you want pushed throughout the entire environment to every machine linked with the GPO.

Next, we need to change one setting relating to the Windows Update Agent on the clients. In the same GPO, or in another GPO if you have one dedicated to Windows Update related settings, navigate to Computer Configuration, Administrative Templates, Windows Components, Windows Update. Here, change the status of the Allow signed updates from an intranet Microsoft update service location setting from Not Configured to Enabled. This second setting allows the Windows Update Agent to detect and download updates from your WSUS and Configuration Manager environment even when they are not signed by Microsoft; paired with the Trusted Publishers certificate above, it makes non-Microsoft updates trusted on the client.
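
Once the GPO has applied, you can sanity check a client quickly from PowerShell. A small sketch, assuming the usual policy locations; the Allow signed updates setting surfaces as the AcceptTrustedPublisherCerts registry value.

# Confirm the SCUP signing certificate has landed in the Trusted Publishers store.
Get-ChildItem Cert:\LocalMachine\TrustedPublisher | Select-Object Subject, Thumbprint

# Confirm the Windows Update Agent policy; a value of 1 means signed intranet updates are allowed.
Get-ItemProperty "HKLM:\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate" |
    Select-Object AcceptTrustedPublisherCerts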

With all of the above completed, you are now set and ready to deploy System Center Update Publisher. A follow-up post I will be publishing soon will cover the SCUP installation and setup.

Delta CRLs are Not Accessible via HTTP When Hosted on IIS

If you are running a Microsoft PKI in your environment then chances are you will have (or at least should have) configured at least one HTTP-based distribution point (CDP) for your Certificate Revocation Lists. If you are only publishing full CRLs then you will have no problems; however, if you are publishing Delta CRLs, the smaller, faster-to-process kind which list only certificates revoked since the last full publish, then you may encounter an issue if you are using an IIS website to publish them.

The problem lies in the filename used for the CRLs. In my lab, for example, my Certificate Authority issues a CRL file named rjglab-CA.crl and the delta files are named the same as the full CRL but appended with the plus character, making the file name rjglab-CA+.crl. In its native configuration, IIS does not permit the use of the plus character because that character falls into the realms of IIS request filtering and the request handler.

HTTP Error Downloading Delta CRL

We can see in the screenshot above the error code and message given by IIS when we try to download the Delta CRL in the default configuration.

For an IIS web server hosting your CRL and Delta CRL, we need to change the behaviour of IIS to permit this plus character, which luckily is easily done. First off, open IIS Manager on the server which hosts your Delta CRL file and makes it available to clients. From the server home in IIS, open the Request Filtering page and, from this page, select the Edit Feature Settings button in the Actions bar.

Request Filtering Settings in IIS

On the Edit Request Filtering Settings page under the General section, by default, Allow Double Escaping is disabled. Enable this option and then press OK.
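
If you would rather make the change from PowerShell than from IIS Manager, the WebAdministration module can flip the same setting at the server level. A sketch, with a made-up lab URL for the download check; adjust the configuration path and URL to match your own CDP.

# Allow the "+" in delta CRL file names by enabling double escaping in request filtering.
Import-Module WebAdministration
Set-WebConfigurationProperty -PSPath "MACHINE/WEBROOT/APPHOST" `
    -Filter "system.webServer/security/requestFiltering" `
    -Name "allowDoubleEscaping" -Value $true

# Quick check that the delta CRL now downloads; the URL here is a hypothetical lab example.
Invoke-WebRequest -Uri "http://pki.rjglab.local/CertEnroll/rjglab-CA+.crl" -OutFile "$env:TEMP\rjglab-CA+.crl"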

Once you have made the change, try to download the Delta CRL file and you should find that the file is available and you can successfully download it.

Delta CRL Downloaded OK

Extended Validation (EV) with an Internal Certificate Authority

As IT Pros, we know that Extended Validation (EV) on web server certificates doesn’t actually add a security layer or harden our web servers in any way, but it does give users the warm fuzzy feeling that the website they are using is definitely trustworthy. Given that we want our users to believe everything we do internally in IT is trustworthy, it would be great to have our internal, user-facing web services use Extended Validation certificates.

If you are using a Windows Active Directory Certificate Services (ADCS) certificate authority to issue your certificates then the great news is that we can do this, and it can be made to work in an existing environment, so you don’t need to build a new Root CA or set up new servers for it to work; we just need to create a new Certificate Template and a Group Policy Object in the domain.

Configure the Certificate Authority

The first step is to create the Certificate Template. On your ADCS server where you issue your Web Server certificates, open the Certificate Authority MMC console. From the console, right-click on the Certificate Templates folder and select Manage.

Manage Certificate Templates

Once you have clicked this, another window will open with the list of Certificate Templates configured in the environment. Find the Web Server certificate, right-click it and select the Duplicate Template option.

New Template Properties

At the Properties for New Template dialog, enter a display name that is appropriate such as “Web Server with EV” or “Web Server Extended Validation”. From here, click the Extensions tab.

New Template Properties Extensions

On the Extensions tab, highlight the Issuance Policies list item and select Edit. At the window which appears, select the New button to add a new Issuance Policy.

EV Issuance Policy

Give your new issuance policy a name such as “EV Issuance Policy” and, if you have one (which you should do for production), enter your Certification Practice Statement URI. If you don’t know what a Certification Practice Statement (CPS) is, I would suggest the TechNet article Certificate Policies and Certificate Policy Statements as a first primer; in a nutshell, it’s a webpage which gives people information about how the certificates can be used.

Before you hit OK on the New Issuance Policy dialog, note the final field, OID. Copy this OID to your clipboard and keep it there for the time being or, better yet, save it to a text document in a safe place as we need it for later steps.

Once you have this, hit OK on the dialog and change any other settings on the template you may need to such as the validity period, the key length or whether you want to allow the private key to be exported. Once you have created the new template, we need to configure the CA to be able to issue it.

CA Certificate Template to Issue

As shown above, back in the Certificate Authority console, right-click on the Certificate Templates folder and this time select New followed by the Certificate Template to Issue option. From the list of templates, select the new template you just created for Web Server with Extended Validation.

After this, the Certificate Authority has a new template that can be used for Extended Validation and is configured to issue certificates based on it; however, it’s no good having the certificates if the clients do not trust them to the extent required to display the green address bar.

Configure Group Policy in Active Directory

With the CA configured, we need to configure clients to trust this certificate for Extended Validation, and the best method for this is Group Policy. If you have an existing Group Policy Object that applies certificate related settings then use that policy; otherwise create a new one and link it either at the root of your domain to apply it to all computers in the domain, or to a particular OU if you only want it to apply to a subset of clients. Just for clarity, I would not recommend putting certificate related settings, or indeed any settings, into the Default Domain Policy. The Default Domain Policy and the Default Domain Controllers Policy should be left untouched and new policy objects should be created for any settings you want to apply.

In your Group Policy Object, expand the view in Computer Configuration followed by Security Settings, Public Key Policies and finally Trusted Root Certification Authorities. If you are using an existing policy, you should have here a valid copy of the public key portion of the certificate for your Root CA. If you are creating this as a new policy, you will need to import the public key portion of your Root CA certificate.

GPO Trusted Root Certification Authorities

GPO Trusted Root CA Extended Validation Properties

Once your certificate is added, right-click it and select Properties. From the properties, select the Extended Validation tab. On this tab, add the OID that you copied or saved to a text document earlier. Any OIDs in this list are considered trusted for Extended Validation when a certificate contains an Issuance Policy matching that OID and the certificate is issued by a CA that is part of the issuing or subordinate chain below the specified Root CA.

Once you have applied the GPO to your clients, you can issue a new certificate for a website with the Web Server Extended Validation template and, when browsing to that site from a client computer which both trusts your Root CA and understands the OID applied to the Issuance Policy, you will get the green address bar.

Website with EV Certificate
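
If you want to double check that an issued certificate actually carries the new issuance policy, you can read its Certificate Policies extension from PowerShell. A sketch with an assumed subject filter; the OID it prints should match the one you recorded from the template.

# Read the Certificate Policies extension (OID 2.5.29.32) of the issued web server certificate.
# The subject filter is an assumption for illustration; match on your own site name or thumbprint.
$cert = Get-ChildItem Cert:\LocalMachine\My |
    Where-Object { $_.Subject -match "intranet" } | Select-Object -First 1
$policyExtension = $cert.Extensions | Where-Object { $_.Oid.Value -eq "2.5.29.32" }
$policyExtension.Format($true)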

Automatically Select a Configuration Manager Task Sequence

With Configuration Manager, we have a great amount of power and control over how we deploy computers using Task Sequences. Many people go to great lengths to make their operating system deployment experience a great one be it for end-users with User Driven Installation or for their support technicians driving the builds instead. One way which we can streamline the process is to use a consolidated task sequence which handles all of our various operating systems, languages, drivers and applications in a single task sequence with intelligence.

If you have gone to these great lengths in your environment, you may be left with a sinking feeling that every time you PXE boot or USB media boot a client to deploy, you still have to select a task sequence to run even though you only have one, as shown in the sample below.

Select TS Task Sequence Wizard

Luckily, we have a solution to this: a way to skip the Task Sequence Selection wizard and automatically enter our one task sequence to rule them all, and it is done using Prestart Commands on our Boot Images in Configuration Manager. You don’t need MDT or any other fancy software integration with Configuration Manager to do this as it works with the default boot images.

To start, we need to know the Deployment ID for the Task Sequence. This is not to be confused with the Package ID. The Deployment ID is the unique ID assigned to a single instance of the task sequence deployed to a collection.

We can get the Deployment ID for the Task Sequence by navigating to the Software Library portion of the Configuration Manager Administration Console and then expanding Operating Systems followed by Task Sequences and then locate your sequence from any folder structure you may have created. With the Task Sequence selected, the lower half of the screen will show the summary properties for it. Click the Deployments tab at the bottom here to see where your Task Sequence is deployed, in my example, to the All Systems collection.

In this area, right-click on the column titles and add the Deployment ID column to the view. This is the value we need.

Task Sequence Deployment ID
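
If you prefer not to add console columns, the SMS Provider also exposes deployments through the SMS_Advertisement WMI class, and task sequence deployments appear there with their Deployment ID as the AdvertisementID. A sketch, assuming a site code of RJG as in my lab and run against the Primary Site Server; swap in your own site code.

# List deployments (advertisements) with their Deployment IDs. Run where the SMS Provider is installed.
Get-WmiObject -Namespace "root\SMS\site_RJG" -Class SMS_Advertisement |
    Select-Object AdvertisementName, AdvertisementID, PackageID, CollectionID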

With this value in hand, we now need to create a very simple VBScript file. I store this file in a directory called OSD Prestart Files on my Primary Site Server but where you store it is up to you. Create a VBScript with the following contents.

Set DefaultOSDTS = CreateObject("Microsoft.SMS.TSEnvironment")
DefaultOSDTS("SMSPreferredAdvertID") = "RJG20008"

In your case, you will need to change the value assigned through DefaultOSDTS to the Deployment ID shown in your console; mine was RJG20008, as set in the script above.

Once you have created and saved this file, head over to the Boot Images section of the Operating System portion in the Software Library and view the Properties for your boot image.

Boot Image Prestart Command Setting

As you can see from the image above, we need to check the box for the Enable Prestart Command option on the Customisation tab of the properties. Once this is checked, we can add the command to call our VBScript file, which in my case is cscript AutoStartOSD.vbs. Once you have entered this, check the box for Include files for the prestart command and specify the path to the script file you created so that it is integrated into the Boot Image file.

When you save the changes, you will be prompted to update your Boot Image files and the image will be rebuilt and updated on your Distribution Points. Once complete, if you are using PXE to boot your clients you need to do nothing more and you can start enjoying your automatically starting task sequence. If you are using USB or ISO Boot Media to start your task sequence process, you will need to recreate your media as the Boot Image that the media is based on has been updated.

Access Office 365 with Azure ExpressRoute

When it launched back in 2014, Azure ExpressRoute was for me one of the most exciting propositions in Azure: the ability to rapidly provision, scale and consume PaaS and IaaS resources in the Microsoft Cloud. It lacked one thing, however, and that was Office 365. Whilst many, many customers are adopting Office 365, having that traffic routed out over your internet connection is seen by some as a security concern and for others it’s a bandwidth problem they just don’t want to deal with.

Earlier this week, the Office team posted a blog at http://blogs.office.com/2015/03/17/announcing-azure-expressroute-connectivity-to-office-365/ announcing that Office 365 over Azure ExpressRoute is on the way, although sadly not until Q3 2015.

The wait aside, this is great news for customers seeking the maximum performance for their Office 365 deployments and their on-premises users, and great news because it is another string in the public cloud productivity suite’s bow. I look forward to seeing it make it to the mainstream and seeing it in action.

Configuration Manager OSD Fails with Error 80070002

When working in my lab environment to build a deadly Operating System Deployment Task Sequence, I did the usual thing of creating reference image task sequences and building reference images. After doing this I fired off the real deployment task sequence to test some PC deployments and I was running CMTrace within Windows PE to monitor the logs and the progress.

I noticed in the testing that the Windows 7 images I had created worked just fine but my Windows 8.1 images kept failing at the Apply Operating System step. The package would be downloaded okay from the Distribution Point but would hang for quite some time after downloading it and fail to start applying the image to the disk. The error code in the smsts.log was 80070002 and the message for the code was The system cannot find the file specified. (Error: 80070002; Source: Windows).

Given that the build and capture task sequences had completed without problems, I knew that it must have been something added or changed between the original source media and the reference image, and the most likely culprit was the Install Software Updates steps in the build and capture task sequence.

I started searching online to see if anyone had produced a list of known bad updates, as manually identifying which of the 92 updates applied during the reference image creation process was at fault would be a pain. Luckily, Microsoft have a support article for just this at https://support.microsoft.com/en-us/kb/2894518. After reviewing the articles referenced on this page, it turned out that one of these updates was approved in one of my Software Update Groups and downloaded to one of my Deployment Packages. I unapproved and deleted the update and refreshed the package on the Distribution Points to flush it out.

My Windows 8.1 build and capture task sequences are running again as I type; fingers crossed, this time they will work in the main deployment after capture.

UPDATE: Confirmed that removing the KBs referenced in the support article resolved the issue and here’s a screenshot just for proof. I will have loads of posts coming soon on how this task sequence is built, including sample files.

Windows 8.1 OSD Deployment

Updating Configuration Manager 2012 R2 Client Package to UR4

When you install UR4 for Configuration Manager 2012 R2, one of the things it doesn’t do is update your base client install package. As a result, newly installed agents will still install the out-of-the-box version 5.00.7958.1000 of the agent, not the UR4 version 5.00.7958.1501. It goes without saying that we don’t want newly deployed agents to have to install and then immediately update to UR4 afterwards, so it makes sense to incorporate this update at install time.

One of the most common ways to apply an update rollup is using the PATCH MSI parameter, both in your Client Push Installation Settings and in the Setup Windows and ConfigMgr step of any Operating System Deployment Task Sequences. Not only does this mean updating it in at least two places, but if you have a number of task sequences it could be even more.

In this post, I’m going to explain a great method of getting UR4 installed with a new agent that was posted by a blogger named Matt at http://www.m4ttmcg.com/2013/05/sccm-2012-client-push-including.html. This process is deemed not to be officially supported; however, using the PATCH parameter isn’t exactly filled with support and joy either, and with this method being easier, it is all the more promising.

The Configuration Manager client installation, by default, looks for a sub-folder of the Client directory called ClientPatch and installs any .msp files it finds there as part of the installation, installing multiple patches in alphabetical order.

To do this, on your Primary Site Server, navigate to the local file system path where the update patch .msp file is stored. On my server this is located at D:\Program Files\Microsoft Configuration Manager\hotfix. In the hotfix directory there will be a folder for each update that you have installed, which in my case is UR4, or KB3026739, and then subsequent subfolders for AdminConsole, Client, SCUP and Server.

Open the Client folder and then you will see more folders for x86 and x64 for the two client architectures.

SCCM Client Hotfix Folder

In another Windows Explorer window, open the Client Agent folder for the site, which is used by both the Client Package and the Client share on the site server. On my server, this is located at D:\Program Files\Microsoft Configuration Manager\Client.

In the Client folder are two subfolders for x86 and x64 for the two architectures of the client agent. In each architecture folder (the screenshots herein are all for the x64 architecture but simply repeat for x86) create a folder called ClientPatch.

SCCM Client ClientPatch Folder

In the ClientPatch folder you just created, copy in the .msp file for the KB (UR4 in my case) and then repeat this for the other architecture so that both the x86 and x64 client folders have a ClientPatch subfolder with the appropriate .msp file to match the architecture.
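
The copy itself is easily scripted if you have several architectures or site servers to deal with. A rough sketch, assuming the default D:\ install path and the UR4 hotfix KB3026739 used in this post; check the architecture folder names on your own site (the 32-bit folder may appear as i386 rather than x86) and adjust accordingly.

# Copy the UR4 client .msp into a ClientPatch subfolder for each architecture.
# Paths, KB number and folder names are assumptions; adjust them to match your site.
$cmRoot = "D:\Program Files\Microsoft Configuration Manager"
foreach ($arch in "x64", "x86") {
    $patchDir = Join-Path $cmRoot "Client\$arch\ClientPatch"
    New-Item -Path $patchDir -ItemType Directory -Force | Out-Null
    Copy-Item -Path (Join-Path $cmRoot "hotfix\KB3026739\Client\$arch\*.msp") -Destination $patchDir
}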

Once you have updated both of the client folders, head over to your Configuration Manager Admin Console and the Software Library and then navigate your library to locate the Configuration Manager Client Package. Right-click on the software package and select the Update Distribution Points option.

SCCM Client Update Distribution Points

Once your package has been updated on all of your distribution points, you’re set. Your client package now includes the UR4 update .msp file and any new client installations such as Client Push or via an Operating System Deployment Task Sequence will be installed with the UR4 update automatically with no need to update your Installation Parameters with the PATCH option.

Upgrading Configuration Manager 2012 R2 Agents to UR4

In this post I’m going to assume that you’ve already installed UR4 in your Configuration Manager environment as that’s covered by many a post and article online already. I’m also going to assume that your UR4 Software Packages have been distributed to your Distribution Points.

After you’ve installed Update Rollup 4 for Configuration Manager on your Primary Site Server and updated your site database, it’s time to update the rest of the servers in your Configuration Manager hierarchy and then move on to your agents and administration consoles. The first thing we want to do is create some device collections to help us. I have created the following collections using the included WQL statements for their query based membership rules.

Creating the Query Based Collections

SCCM UR4 Collections

Configuration Manager 2012 R2 Agent (x64)

SELECT
SMS_R_SYSTEM.ResourceID,SMS_R_SYSTEM.ResourceType,SMS_R_SYSTEM.Name,SMS_R_SYSTEM.SMSUniqueIdentifier,SMS_R_SYSTEM.ResourceDomainORWorkgroup,SMS_R_SYSTEM.Client from SMS_R_System INNER JOIN SMS_G_SYSTEM_COMPUTER_SYSTEM on SMS_G_System_COMPUTER_SYSTEM.ResourceId = SMS_R_System.ResourceId WHERE SMS_R_System.ClientVersion = "5.00.7958.1000" AND SMS_G_System_COMPUTER_SYSTEM.SystemType = "x64-based PC"

Configuration Manager 2012 R2 Agent (x86)

SELECT
SMS_R_SYSTEM.ResourceID,SMS_R_SYSTEM.ResourceType,SMS_R_SYSTEM.Name,SMS_R_SYSTEM.SMSUniqueIdentifier,SMS_R_SYSTEM.ResourceDomainORWorkgroup,SMS_R_SYSTEM.Client from SMS_R_System INNER JOIN SMS_G_SYSTEM_COMPUTER_SYSTEM on SMS_G_System_COMPUTER_SYSTEM.ResourceId = SMS_R_System.ResourceId WHERE SMS_R_System.ClientVersion = "5.00.7958.1000" AND SMS_G_System_COMPUTER_SYSTEM.SystemType = "x86-based PC"

Configuration Manager 2012 R2 Console

SELECT
SMS_R_SYSTEM.ResourceID,SMS_R_SYSTEM.ResourceType,SMS_R_SYSTEM.Name,SMS_R_SYSTEM.SMSUniqueIdentifier,SMS_R_SYSTEM.ResourceDomainORWorkgroup,SMS_R_SYSTEM.Client FROM SMS_R_System INNER JOIN SMS_G_System_ADD_REMOVE_PROGRAMS on SMS_G_System_ADD_REMOVE_PROGRAMS.ResourceID = SMS_R_System.ResourceId WHERE SMS_G_System_ADD_REMOVE_PROGRAMS.DisplayName = "System Center 2012 R2 Configuration Manager Console" AND SMS_G_System_ADD_REMOVE_PROGRAMS.Version = "5.00.7958.1000"

Configuration Manager 2012 R2 UR4 Agent (x64)

SELECT
SMS_R_SYSTEM.ResourceID,SMS_R_SYSTEM.ResourceType,SMS_R_SYSTEM.Name,SMS_R_SYSTEM.SMSUniqueIdentifier,SMS_R_SYSTEM.ResourceDomainORWorkgroup,SMS_R_SYSTEM.Client from SMS_R_System INNER JOIN SMS_G_SYSTEM_COMPUTER_SYSTEM on SMS_G_System_COMPUTER_SYSTEM.ResourceId = SMS_R_System.ResourceId WHERE SMS_R_System.ClientVersion = "5.00.7958.1501" AND SMS_G_System_COMPUTER_SYSTEM.SystemType = "x64-based PC"

Configuration Manager 2012 R2 UR4 Agent (x86)

SELECT
SMS_R_SYSTEM.ResourceID,SMS_R_SYSTEM.ResourceType,SMS_R_SYSTEM.Name,SMS_R_SYSTEM.SMSUniqueIdentifier,SMS_R_SYSTEM.ResourceDomainORWorkgroup,SMS_R_SYSTEM.Client from SMS_R_System INNER JOIN SMS_G_SYSTEM_COMPUTER_SYSTEM on SMS_G_System_COMPUTER_SYSTEM.ResourceId = SMS_R_System.ResourceId WHERE SMS_R_System.ClientVersion = "5.00.7958.1501" AND SMS_G_System_COMPUTER_SYSTEM.SystemType = "x86-based PC"
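
If you prefer to create these collections from PowerShell rather than the console, the ConfigurationManager module can do it with a query membership rule. A sketch for the first collection only, assuming the module is loaded and you have changed to your site drive (for example Set-Location RJG:); the other collections follow the same pattern with their respective WQL above.

# Create the pre-UR4 x64 agent collection and attach the WQL query shown above.
$wql = 'SELECT SMS_R_SYSTEM.ResourceID,SMS_R_SYSTEM.ResourceType,SMS_R_SYSTEM.Name,SMS_R_SYSTEM.SMSUniqueIdentifier,SMS_R_SYSTEM.ResourceDomainORWorkgroup,SMS_R_SYSTEM.Client from SMS_R_System INNER JOIN SMS_G_SYSTEM_COMPUTER_SYSTEM on SMS_G_System_COMPUTER_SYSTEM.ResourceId = SMS_R_System.ResourceId WHERE SMS_R_System.ClientVersion = "5.00.7958.1000" AND SMS_G_System_COMPUTER_SYSTEM.SystemType = "x64-based PC"'
New-CMDeviceCollection -Name "Configuration Manager 2012 R2 Agent (x64)" -LimitingCollectionName "All Systems"
Add-CMDeviceCollectionQueryMembershipRule -CollectionName "Configuration Manager 2012 R2 Agent (x64)" `
    -RuleName "Pre-UR4 x64 agents" -QueryExpression $wql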

Deploying the Software Packages for Agents

With the collections now created, head over to the Software Library node of the console. As part of installing UR4, the installer will have automatically created four software packages for you, two for the client architectures, one for the console and one for servers.

Clicking one of the Software Packages in the upper half of the view reveals the details of the package in the lower half. Select the Programs tab in the lower half and you will see the Program associated with the Software Package. Right-click the Program and select the Deploy option from the context menu as shown below.

SCCM UR4 Deploy Program

After clicking this, the Deploy Software Wizard will be shown. The Software field will be automatically populated by the wizard; we just need to enter the collection to target the deployment. For example, we want to deploy the UR4 update to all clients which have the out-of-the-box version installed so we need to target the Configuration Manager Client RU4 Update (x64) software program to the Configuration Manager 2012 R2 Agent (x64) collection.

SCCM UR4 Agent Deploy Software Wizard 1

On the next tab, we need to change the default Purpose from Available to Required. This purpose will force the installation to occur according to a schedule which we set on the following panel.

SCCM UR4 Agent Deploy Software Wizard 2

On the scheduling panel next up, we want to configure the schedule in accordance with any change control we may have to comply with. In my lab environment, I simply want to push all of the agent updates out to the servers and clients as soon as possible so the As Soon as Possible option is ideal for me.

SCCM UR4 Agent Deploy Software Wizard 3

If you aren’t using Maintenance Windows on any of your collections in Configuration Manager you can click next through to finish on the wizard now and have your agent deployment begin. If you are using Maintenance Windows then you will want to pay attention to a setting on the following panel.

SCCM UR4 Agent Deploy Software Wizard 4

If you are using Maintenance Windows then you may want to select the Software Installation checkbox to allow the agent installation to occur outside of any windows. The agent installation does not require a restart and in most cases should complete in a couple of minutes with no fuss, so there really isn’t a reason not to allow this. Ticking the Software Installation checkbox allows the agent upgrade to take place outside of any maintenance windows so it will happen as soon as possible.

SCCM UR4 Agent Deploy Software Wizard 4

Once you have completed the wizard, you can confirm that your Deployment was successfully stored by viewing the Deployments tab in the lower half of the Software Library Packages view. As shown below, you can see that the deployment I just created to deploy the 64-bit agent has been stored. With this completed, you will want to repeat the process for the x86 (32-bit) agent so that both types of client get the UR4 agent update.

Deploying the Software Package for Admin Consoles

I’m not going to walk through the process for this as it is the same as above for the agent update; however, there is one important difference I will cover.

When you get to the Deployment Settings panel, you need to select the Available deployment option and not the Required option. The reason for this is that when you deploy the UR4 agent updates, the agent version gets updated from 5.00.7958.1000 to 5.00.7958.1501; however, installing UR4 for the Admin Console does not update the version number reported for the console in Programs and Features in Control Panel.

With the Admin Console, because the version number does not change, if the deployment was set to Required, clients would install the update package and then continue to re-evaluate it because it would still be detected as the old version, not the UR4 version. What we need to do instead is advertise it to computers with the Admin Console installed and make it available for those users to initiate themselves.

If anyone knows of a different way to target the Admin Console UR4 update, perhaps using installed software detection, I would really like to hear how you have been able to have it installed automatically as Required, as that would save a tonne of headache and effort. In the meantime, enjoy your UR4 client rollout. In another post, I will cover how to update your base client package with UR4 so that newly deployed clients get the updated version hassle free.

Power On Hosts Out of Band with IPMI and PowerShell

As part of my home lab project, I built my servers using Supermicro X8DTH-6F motherboards for many reasons, one of which was that the motherboard hosts an IPMI baseboard management controller (BMC) for managing the power state of the host out-of-band, along with IP KVM and virtual media support to allow me to access the server console remotely and even attach .iso media over the network.

In my quest to make my lab more accessible I wanted to be able to script the procedure for starting it up and shutting it down, and whilst I haven’t given the whole thing a great deal of thought just yet, I have made an interesting discovery along the way: the PcsvDevice PowerShell module, with Get-, Start- and Stop-PcsvDevice cmdlets along with Set-PcsvDeviceBootConfiguration.

These cmdlets first shipped with PowerShell 4.0 in Windows Server 2012 R2 and Windows 8.1, which is worth noting if you are managing hosts from an older operating system. Using the Get-PcsvDevice cmdlet, we can query IPMI, WS-Man and SMASH based baseboard management controllers and get some information back from them. We need to feed this command some parameters such as a credential and an IP address for the BMC.

$Cred = Get-Credential ADMIN
$Ip = "172.16.150.21"
Get-PcsvDevice -TargetAddress $Ip -ManagementProtocol IPMI -Credential $Cred

As you can see, I set the variables $Cred and $Ip for the username and the IP address to connect with, and the ManagementProtocol parameter allows us to specify whether the controller is IPMI or WS-Man based. The result of this command in my environment is as follows.

PowerShell Get-PcsvDevice Output

So from the screenshot, we can see that the Get-PcsvDevice cmdlet can detect my Supermicro IPMI interface, and the EnabledState column in the output shows that my host is currently Disabled or, in other words, powered off. So now that I know PowerShell can find my BMC, I want to be able to power on the host. By piping the Get-PcsvDevice output into the Start-PcsvDevice cmdlet, we can do just that.

$Cred = Get-Credential ADMIN
$Ip = "172.16.150.21"
Get-PcsvDevice -TargetAddress $Ip -ManagementProtocol IPMI -Credential $Cred | Start-PcsvDevice

After entering the command, I heard my server spin up in the rack downstairs. I opened a command prompt in a separate window and started pinging the management IP address of the host, which would respond once Windows Server 2012 R2 had started after the boot POST.

PowerShell Start-PcsvDevice Output

As you can see from this second screenshot, the host started to respond to pings on its management address and, I assure you, there was no magic here like me running downstairs and hitting the power button; this all happened remotely via PowerShell. Running the Get-PcsvDevice cmdlet again now returns a similar output except the EnabledState column reports Enabled as the host is powered on.

The opposing cmdlet is Stop-PcsvDevice; however, I won’t be demonstrating this one and I’ll provide a warning to go with it. This cmdlet does not instruct the IPMI or WS-Man interface to perform a graceful soft shutdown using operating system commands. It performs a hard power off, as if someone had just held down the power button. If your server is running and accessible via normal in-band management means, power it down that way, for example with the PowerShell cmdlet Stop-Computer.

The other command available to us is Set-PcsvDeviceBootConfiguration. I won’t demonstrate this one either as I like my server how it’s configured right now, but this cmdlet can be used to change the boot order of a server. If, for example, you wanted to set a server to boot from LAN over PXE so that you could apply some out-of-band update or image to it, you could do that. You can get the syntax for Set-PcsvDeviceBootConfiguration or any of the other cmdlets from TechNet at https://technet.microsoft.com/en-us/library/dn283384.aspx.
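
For completeness, here is roughly what a one-time PXE boot override looks like based on the TechNet examples. I haven't run this against my own hosts, so treat it as a sketch and check the parameter names and boot source strings with Get-Help on your system first.

# Sketch only: ask the BMC to boot the host from the network (PXE) on the next power cycle.
$Cred = Get-Credential ADMIN
$Ip = "172.16.150.21"
Get-PcsvDevice -TargetAddress $Ip -ManagementProtocol IPMI -Credential $Cred |
    Set-PcsvDeviceBootConfiguration -OneTimeBootSource "CIM:Network:1"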

 

Intel HD Graphics Update for Windows 10 Technical Preview

Today is a good news day for Windows 10 Technical Preview users. I’ve been using the Technical Preview on my Dell Latitude E7440 laptop since its release and, since upgrading to build 9926, I’ve been having a lot of problems with blue screens of death on startup. So much so that from a cold boot it normally takes me four BSODs to get logged in and working, so my laptop normally only ever goes to sleep to avoid the cold boots.

The problem is caused by the Intel HD Graphics driver which I’ve confirmed for myself using WinDbg to analyze the crash dumps for many of these issues. Today, it looks like my luck is in.

Windows 10 Technical Preview Intel HD Graphics Update

Delivered via Windows Update, I’ve got two new drivers waiting for me, one for the Realtek audio device and another for Intel HD Graphics. I’m installing them as you read this post and fingers crossed they resolve these issues with the Windows 8.1 driver running under Windows 10.