Richard works as a Cloud Consultant for Fordway Solutions, where his primary focus is to help customers understand, adopt and develop with Microsoft Azure, Office 365 and System Center.

Richard Green is an IT Pro with over 15 years of experience in all things Microsoft, including System Center and Office 365. He has previously worked as a System Center consultant and as an internal solutions architect across many verticals.

Outside of work, he loves motorbikes and is part of the orange army, marshalling for NGRRC, British Superbikes and MotoGP. He is also an Assistant Cub Scout Leader.

Project Home Lab Hyper-V Server

This is a really quick post but something exciting I wanted to share. Last night, I did a bit of work on the home lab and, after finishing some bits and pieces, I’ve now got the Hyper-V server up and running with its Windows Server 2012 R2 installation. Here’s a screenshot of Task Manager showing the memory, CPU sockets and cores available on the machine.

Lab Hyper-V Server Task Manager

As you can see, there are two CPU sockets installed with four cores per socket, giving me 8 physical cores and 16 logical cores with Hyper-Threading. There is currently 24GB of RAM installed per CPU socket, giving me 48GB of memory from 6 of the 12 available slots, so when the time comes that I need more memory, I can double the current figure to 96GB by filling the remaining slots, or go further still by swapping out my current 8GB DIMMs for 16GB DIMMs.

I should have some more posts coming up soon as I’m actually (after far too long) reaching the point of putting all of this together and building out some System Center and Azure Pack goodness at home, including finishing off the series introduction where I explain all the hardware pieces I’m using.

Making SCSM Custom Lists Visible in the Console

This week, I have been working on a custom management pack for System Center Service Manager to add new classes for Mobile Phone and SIM Card Configuration Items. One of the requirements for this was to include some lists which can be updated via the console and used to store values about these two CIs.

Creating the properties in each of the new CIs was no problem, and neither was setting their Enumeration Type to a list, but getting the lists to actually display in the Service Manager Console I found rather challenging. I was able to do it using the Service Manager Authoring Tool okay, but the Authoring Tool seems to make a horrible mess of generating IDs for objects and it uses Aliases everywhere for no reason. I made the switch from the Authoring Tool to the Visual Studio Authoring Extensions for System Center 2012, but Visual Studio doesn’t automatically create the code required to make the lists visible.

To add to the frustration, I was only able to find a helpful solution after many failed online searches, clearly using the wrong keywords. In the end, I found the answer by creating a list in Service Manager in an unsealed management pack, exporting the management pack and viewing the XML to reverse engineer it. From the XML I was able to find the proper name for the code value, which then turned up a helpful article on TechNet Blogs.

Using Service Manager Authoring Console

If you are attempting to complete this using the Service Manager Authoring Console then you’re on easy street and you don’t need to do anything in the following sections. Simply create your Enumeration List in the custom Configuration Item Class and the list will automagically be made visible for you. If you saw what I saw, which is that the Authoring Console makes a right old mess of your management pack, and you decide to use Visual Studio with the Authoring Extensions to create your management packs instead, then read on.

Adding the References

In Visual Studio with the Authoring Extensions (VSAE), add two new references to your solution. The references we need to add are Microsoft.EnterpriseManagement.ServiceManager.UI.Authoring and Microsoft.EnterpriseManagement.ServiceManager.UI.Console. You can find the SCSM 2012 R2 RTM versions of these system management packs in the Service Manager Authoring Console installation directory at C:\Program Files (x86)\Microsoft System Center 2012\Service Manager Authoring\Library. By default these references have Aliases of MUSEA and MUSEC respectively, but in my project I have changed these to Authoring and Console to make them more intuitive for anyone reading the code.
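For reference, here is roughly what the References section of the compiled management pack XML looks like with the renamed aliases; a minimal sketch, assuming the SCSM 2012 R2 RTM library versions (check the Version and PublicKeyToken values against the files in your own Library directory):

<References>
   <Reference Alias="Authoring">
      <ID>Microsoft.EnterpriseManagement.ServiceManager.UI.Authoring</ID>
      <Version>7.5.3079.0</Version>
      <PublicKeyToken>31bf3856ad364e35</PublicKeyToken>
   </Reference>
   <Reference Alias="Console">
      <ID>Microsoft.EnterpriseManagement.ServiceManager.UI.Console</ID>
      <Version>7.5.3079.0</Version>
      <PublicKeyToken>31bf3856ad364e35</PublicKeyToken>
   </Reference>
</References>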

Making the Lists Visible

With our references added, we need to add the code to make the lists visible in the console. You can either add these lines to the Management Pack Fragment which contains your list definitions (which I have done) or you may wish to have a separate Management Pack Fragment for elements which you are publishing into the UI. Either way, they will be included in the compiled project; it’s just your choice as to how you structure your project and the code for development.

<Categories>
   <Category ID="Class.List" Target="Class.EnumerationTarget" Value="Authoring!Microsoft.EnterpriseManagement.ServiceManager.UI.Authoring.EnumerationViewTasks" />
   <Category ID="Class.List.Visible" Target="Class.EnumerationTarget" Value="System!VisibleToUser" />
</Categories>

As you can see from the code sample above, we add the Categories section to the fragment and, inside that section, we add two Category elements, each with a unique ID. The first of the two lines makes the Enumeration List that was declared in the custom Configuration Item class accessible, and the second, as you can probably guess from the code, makes it visible in the console to end-users.

Unlike most things in Service Manager management pack development, these two Category IDs appear not to require Language Pack Display Strings to be declared, so we’re done here. Save your changes, build the project and import the management pack.

Adding List Items to Sealed Management Packs

If you are developing this management pack for a production system then you should be sealing your management pack for import. If you are providing end-users with an empty list to which they can add their own custom list items, then when the first list item is added, you will need to define an unsealed management pack for the list entries to be stored in. Alternatively, if you want to provide a set of default options, you can include these in the sealed management pack by declaring an EnumerationValue for each one as part of your EnumerationTypes. These default options will then ship in the sealed management pack, and any new entries added later will be stored in the unsealed management pack.
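To illustrate, here is a minimal sketch of what those default options look like in the fragment, assuming the Class.EnumerationTarget list from the earlier sample; the child IDs and ordinals are hypothetical placeholders for your own values:

<EnumerationTypes>
   <EnumerationValue ID="Class.EnumerationTarget" Accessibility="Public" />
   <EnumerationValue ID="Class.EnumerationTarget.OptionOne" Parent="Class.EnumerationTarget" Accessibility="Public" Ordinal="1.0" />
   <EnumerationValue ID="Class.EnumerationTarget.OptionTwo" Parent="Class.EnumerationTarget" Accessibility="Public" Ordinal="2.0" />
</EnumerationTypes>

The first EnumerationValue is the list itself, which you will already have declared for the CI property; the children with a Parent attribute are the default entries that get sealed into the pack.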

Azure Backup Maximum Retention

This is a very short and quick post but something I wanted to share nonetheless.

I got a call from somebody today looking at the potential for using Azure as a long-term solution to store infrequently accessed data. A StorSimple appliance is one obvious answer to the problem, but that was out of consideration in this instance, so we talked about using Azure Backup as a solution instead, because this data doesn’t actually need to be accessible online and an offline recovery to access it would be viable.

When I started to use Azure Backup with the Windows Server 2012 R2 Essentials integration a number of years ago, Azure Backup was limited to 30 days’ retention, but I knew that this had been increased of late. Using the Microsoft Azure Backup client on my server, I looked to see the maximum value that I could set the backup job retention to, and the number that came out was 3360 days, which on a more sensible scale is roughly 9 years and 3 months.
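If you want to poke at this yourself, the Azure Backup agent installs the MSOnlineBackup PowerShell module; here’s a minimal sketch for setting retention on an existing backup policy, assuming the agent is registered and a policy is already configured (3360 was simply the ceiling I observed, so your agent version may allow a different maximum):

# Load the module that ships with the Azure Backup agent
Import-Module MSOnlineBackup

# Build a retention policy at the maximum I was able to set: 3360 days
$retention = New-OBRetentionPolicy -RetentionDays 3360

# Fetch the existing backup policy on this server and apply the retention to it
$policy = Get-OBPolicy
Set-OBRetentionPolicy -Policy $policy -RetentionPolicy $retention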

That’s quite a lot of retention, but sadly it still wasn’t enough for this requirement, so back to the drawing board. My problem aside, it’s good to see that Azure Backup now supports long-term data retention for backup, and 9 years and 3 months is long enough to meet most organisations’ retention requirements, including those in the financial sector.

Office 365 Management Pack for SCOM

Yesterday I got a chance to play with the Office 365 Management Pack for SCOM. The usual rules apply: read the release notes, import the Management Pack and then configure it, the same as for all Management Packs you import into SCOM.

The installation was simple, downloading the .msi file from the Microsoft Download page at http://www.microsoft.com/en-us/download/details.aspx?id=43708. However, given that this is a Microsoft Management Pack for a Microsoft product, I would have expected it to be published to the Management Pack Catalog in SCOM rather than as a separate .msi download, as that would certainly have streamlined the installation process a little.

Once installed, the configuration of the Management Pack is really simple as an Office 365 configuration link is added to the Administration view. It gets added to the very bottom of the list, so if you think you don’t have it visible, make sure you’ve scrolled all the way down. From the configuration wizard, you simply feed it a friendly name for your tenant and give it the email address of a user in Office 365 or one configured through your Azure Active Directory.

The reason for this post, other than to explain how simple the Management Pack is to deploy, is to have a little gripe. The user which you create in Office 365 needs to be configured as a Global Administrator on your tenant. To compare things to on-premises, that’s like using an account which is a member of Enterprise Admins to monitor Exchange on-premises: a bit of a sledgehammer to crack a nut. I personally like things to be least privilege, so the idea of having a Global Administrator account for this purpose is an annoyance. Given that the Management Pack is testing the health of services within your tenant, I personally don’t see any reason that this account couldn’t be a Service Administrator, to still give it some administrative powers but lessen them, or failing that, a standard user. I suspect the need for an administrator comes from the need to query a service API which is only available to accounts authenticated with administrative rights.

The upside to my gripe about the account being a Global Administrator, however, is that you do not need to assign any Office 365 service licenses to the account, so you don’t need to shell out £20 a month for an E3 license per user just to be able to monitor Office 365 from SCOM.
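If you want to script the creation of that unlicensed monitoring account, a minimal sketch with the MSOnline PowerShell module of the day might look like the following; the UPN and display name are hypothetical, and note that Company Administrator is the MSOnline role name that maps to Global Administrator:

# Connect to the tenant using an existing administrator credential
Import-Module MSOnline
Connect-MsolService

# Create the monitoring account; deliberately no licence is assigned
New-MsolUser -UserPrincipalName "scom-monitor@contoso.onmicrosoft.com" -DisplayName "SCOM Office 365 Monitor" -UsageLocation "GB"

# Grant the Global Administrator role, which MSOnline calls Company Administrator
Add-MsolRoleMember -RoleName "Company Administrator" -RoleMemberEmailAddress "scom-monitor@contoso.onmicrosoft.com"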

Inaccessible Boot Device after Windows Server 2012 R2 KB2919355

Earlier on this week, I finally got around to spending a bit of time towards building my home lab. I know it’s late because I started this project back in February, but you know how it is.

On the servers, I am installing Windows Server 2012 R2 with Update, which for the uninitiated is KB2919355 for Windows Server 2012 R2 and Windows 8.1. This is essentially a service pack sized update for Windows and includes a whole host of updates. I am using the installation media with the update integrated to save me some time with the updates, but also because it’s cleaner to start with the update pre-installed.

The Inaccessible Boot Device Problem

After installing Windows Server 2012 R2, the machine starts to boot and, at the point where I would expect to see a message along the lines of Configuring Devices, the machine hits a Blue Screen of Death with the message Stop 0x7B INACCESSIBLE_BOOT_DEVICE and restarts. This happens a few times before it hangs on a black screen warning that the computer has failed to start after multiple attempts. I assumed it was a BIOS problem, so I went hunting in the BIOS in case I had enabled a setting not supported by my CPU, or maybe I’d set the wrong AHCI or IDE mode options, but everything looked good. I decided to try the Optimized Defaults and Failsafe Defaults options in the BIOS, both of which required an OS re-install due to the AHCI changes, but neither worked.

After this, I was worried there was either something wrong with my hardware or a compatibility issue with the hardware make-up and that I was going to be snookered; however, after a while of searching online, I found the solution.

KB2919355 included a new version of the storage controller driver Storport. It transpires that this new version of Storport had an issue with certain SCSI and SAS controllers whereby, if the controller device driver was initialized in a memory space beyond 4GB, it would cause the physical boot devices to become inaccessible. This problem hit people who installed the KB2919355 update on previously built servers at the time of release, as well as people like me, building new servers with the update slipstreamed. My assumption is that it’s caused by the SCSI or SAS controller not being able to address 64-bit memory addresses, hence the 4GB limitation.

The problem mainly hits LSI SCSI and SAS controllers based on the 2000 series chipset, including but by no means limited to the LSI SAS 2004, LSI SAS 2008, LSI MegaRAID 9211, Supermicro SMC 2008, Dell PERC H200 and IBM X240 controllers. In my case, my Supermicro X8DTH-6F motherboards have the Supermicro SMC 2008 8-port SAS controller onboard, which is a Supermicro branded edition of the LSI SAS 2008 IR controller.

The workaround at the time was to disable various BIOS features such as Intel VT, Hyper-Threading and more to reduce the number of system base drivers that needed to load, allowing the driver to fit under the 4GB memory space, but eventually the issue was confirmed and a hotfix released. However, installing the hotfix is quite problematic when the system refuses to boot. Luckily, we can use the Windows installation media to fix the issue.

Microsoft later released guidance on the workaround to use BCDEdit from the Windows Recovery Environment (WinRE) to change the maximum memory.
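For completeness, that BCDEdit workaround caps the memory available at boot so that the controller driver initializes below the 4GB boundary; a sketch of the commands as I understand them, run from a WinRE Command Prompt (0x100000000 being 4GB expressed in bytes):

REM Cap usable memory at 4GB so the SAS controller driver loads below the boundary
bcdedit /set {default} truncatememory 0x100000000

REM After the hotfix is installed, remove the cap to get your full memory back
bcdedit /deletevalue {default} truncatememory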

Resolving the Issue with KB2966870

Workarounds aside, we want to fix the issue, not gloss over or around it. First off, download the hotfix KB2966870, which is a hotfix by request, so you need to enter your email address and have the download link emailed to you. You can get the update from https://support.microsoft.com/kb/2966870. Once you have the update, you need to make it available to your server.

If your Windows Server 2012 R2 installation media is a bootable USB stick or drive, then copy the file there. If your installation medium is CD or DVD, then burn the file to a disc.

Boot the server using the Windows Server 2012 R2 media but don’t press the Install button. From the welcome screen, press Shift + F10, which will open a Command Prompt in Administrator mode. Because the Windows installation files are decompressed to a RAM disk, your hard disk will likely have been mounted as D: instead of C:, but verify this first by doing a dir to check for the normal file structure like Program Files, Users and Windows. Also, locate the drive letter of your installation media, which will be the drive with your .msu update file on it.

Once you have found your hard disk drive letter and your boot media drive letter, use the following DISM command to install the update using Offline Servicing:

Dism /Image:[Hard Disk]:\ /Add-Package /PackagePath:[Install Media]:\Windows8.1-KB2966870-x64.msu
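As a worked example with hypothetical drive letters, if the Windows installation is mounted on D: and the USB stick holding the .msu is E:, the command would read:

Dism /Image:D:\ /Add-Package /PackagePath:E:\Windows8.1-KB2966870-x64.msu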

Once the command completes, exit the Command Prompt and exit the Windows installation interface to restart the computer. In my case, I had to restart the computer twice for the update to actually apply and take effect, but once the update had been taken on board, the machine boots without issue first time, every time. You can verify that the update has been installed with the View Installed Updates view in the Windows Update Control Panel applet.

KB2992611 Winshock Update and the Broken Cipher Suites

Last week, Microsoft released an update under KB2992611 in response to security bulletin MS14-066, to address a flaw in SChannel reported to Microsoft. As part of KB2992611, Microsoft not only patched the flaw in SChannel but also added four new encryption cipher suites. The suites added were as follows:

TLS_DHE_RSA_WITH_AES_256_GCM_SHA384
TLS_DHE_RSA_WITH_AES_128_GCM_SHA256
TLS_RSA_WITH_AES_256_GCM_SHA384
TLS_RSA_WITH_AES_128_GCM_SHA256

Although it was a nice gesture to add some new cipher suites to Windows, there was a knock-on effect to installing KB2992611 and adding these new cipher suites: it appears that Google Chrome for one, and possibly more browsers depending on the version you have, do not accept these ciphers, and the addition would cause browsers to fail to connect to websites and TLS sessions to be dropped. There are also other, less widely reported issues with the installation of KB2992611 causing SQL and ODBC based data connections within applications to drop dramatically in performance.

To address the problem, Microsoft have re-released KB2992611 with KB3018238, a secondary update which changes the default state of these new ciphers to disabled. It’s important to note that disabling the new ciphers does not remove the fix for the vulnerability in SChannel which is addressed by the original hotfix. Some people are suggesting uninstalling KB2992611 to work around the issue, but doing this will open up the SChannel vulnerability again. After hearing conversations about these updates today, there is clearly much confusion about the situation. Microsoft have not pulled KB2992611 and replaced it with KB3018238; instead, they have added KB3018238 as a secondary update. This is in contrast to replacing the update with a version 2 release, which is commonplace when there are issues with updates.

If you have already installed KB2992611, you will be offered KB3018238 via Windows Update. Installing KB3018238 will disable the four new cipher suites by default to restore compatibility; however, you will have the option to re-enable them if you wish via the normal means for editing and selecting cipher suites. The fix for SChannel will remain in place. If you have not yet installed KB2992611, then via Windows Update you will see KB2992611 advertised as an update for installation, but upon installation, both KB2992611 and KB3018238 will be installed and both will be listed in the View Installed Updates pane in Control Panel. In this case, you will have the cipher suites disabled and the SChannel vulnerability patched.
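Those normal means are the SSL Cipher Suite Order Group Policy setting or its backing registry value; here’s a minimal PowerShell sketch for checking which suites are configured, assuming the policy value has been set on the machine (if it hasn’t, Windows uses its built-in default order and this value won’t exist):

# The SSL Cipher Suite Order policy stores its suite list in this key
$key = 'HKLM:\SOFTWARE\Policies\Microsoft\Cryptography\Configuration\SSL\00010002'

# The Functions value is one comma-separated string of cipher suite names
$suites = (Get-ItemProperty -Path $key -Name Functions).Functions -split ','

# Flag the four suites that KB2992611 added
$new = 'TLS_DHE_RSA_WITH_AES_256_GCM_SHA384',
       'TLS_DHE_RSA_WITH_AES_128_GCM_SHA256',
       'TLS_RSA_WITH_AES_256_GCM_SHA384',
       'TLS_RSA_WITH_AES_128_GCM_SHA256'
$suites | ForEach-Object { if ($new -contains $_) { "$_ <- added by KB2992611" } else { $_ } }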

If you are having issues with SQL Server or ODBC connection based applications, there is currently no fix for this problem, and community opinion is that the solution is to remove the previously installed KB2992611, which appears to restore order to the force. Hopefully Microsoft will address whatever the underlying issue is between SQL Server, ODBC and this fix to SChannel in a future update.

In addition to KB3018238 to fix the issues with SChannel, Microsoft yesterday released two other updates. KB3011780 has been released to address a flaw in Kerberos which affects the Key Distribution Center (KDC). This is a service which runs on Domain Controllers, so this update is considered critical. Another update, KB3000850, has been released as the November 2014 Rollup Update for Windows 8.1 and Windows Server 2012 R2. This rollup includes all previously released updates for these operating systems, including KB2992611, but it is not clear whether it includes the original release of KB2992611 alone or KB2992611 with the secondary KB3018238 update.

To download KB2992611 with the secondary update KB3018238, visit http://support.microsoft.com/kb/2992611. For the Kerberos update KB3011780, visit http://support.microsoft.com/kb/3011780 and lastly, for the November 2014 Rollup Update, visit http://support.microsoft.com/kb/3000850.

Friends in the 21st Century

I’m known for liking to have a good old moan about things and I’m also known for being a bit old-fashioned in my ways and values despite my age. I don’t normally get involved in talking about that part of me on my blog as I like to keep it technical here, but when something overlaps into technology, it’s hard not to get it out there.

When I was growing up, we had friends, and friends were people who you went out with and socialised with, people who you’d call on the phone to see how they were doing or how their life was going. Now, in the year 2014, what on earth has happened to the concept of friends? Did the old definition get completely rewritten and nobody told me? I checked the Oxford dictionary and the definition for the noun friend reads as follows:

A person with whom one has a bond of mutual affection, typically one exclusive of sexual or family relations

The synonyms show you that a friend is someone close to you, with words like intimate, confidante, soul mate and brother or sister given, and Oxford tells us that the origins of the word friend are Germanic, with the meaning ‘to love’.

Just this weekend, I met someone at a party and I spent no more than fifteen minutes of total time talking to said individual. I didn’t dislike him at all, so there’s no problem there, but does meeting a stranger at a party and spending a net fifteen minutes with them really constitute a friendship these days, and how does that affect the things that we should be holding closest and dearest to us?

On Facebook right now, I have 60 friends. All of these people are either family or friends who are actually people that I am some-which-way interested in hearing from or whose posts I actually care to read (although I do wish sometimes that I could unfriend some people for the amount of share-this and look-at-this rubbish they post).

I did a straw poll on Twitter earlier today and, granted, my follower base isn’t particularly large and those who do follow me are going to be biased towards me in a like-minded sense, but both of the people who responded said the same thing: they only friend people on Facebook who they actually know. So why are a lot of people out there so willing to throw friend invitations on Facebook around like sweets and confetti? Surely a friendship on Facebook should be something reserved for the people who you actually hold in that esteem? Not only does having a mammoth collection of friends clutter your News Feed with information and status updates that you are largely going to ignore and not care about, but you are also exposing yourself to people who you don’t really know. Not that I am trying to victimise her in this post, but my wife currently has 320 friends on Facebook, and whilst she definitely has a wider circle of friends and interacts with more people than me, is it really five times greater than mine, or is she collecting friends for the sake of it (bearing in mind that she accepted the friend request from the same person I received an invitation from at the weekend)?

Facebook Contact Privacy Settings

I took a couple of screenshots of my Contact Info page from my Facebook profile earlier today and overlaid them on top of each other so that I can show the whole scene in one picture. As you can see from the picture, my contact information is shared with friends, and this includes my mobile and home phone numbers, my home address and, although not shown (as it’s further down the page beyond the fold), my email address.

I know that some amongst you will say that you can customise this and change who can see your information, but that then brings its own questions. Firstly, who actually thinks about what that person might be able to see before accepting the friend request, given that the decision to accept or decline has for most probably become a reflex action? Secondly, what are the privacy options if you wanted to limit that person’s access to your information? I took a look at the privacy options for my phone number and the choices are Friends of Friends, Friends, Only Me or Specific People.

Friends of Friends is just utter lunacy. Why would I want to share my phone number with the friends of my friends when I have no control over who they friend in turn? Friends Only is a logical option, and Only Me defeats the purpose of adding the information to your profile in the first place. Specific People is the ideal option if you are a bit of a friend collector or very privacy conscious, but who is really going to remember, after accepting that friend request, to go and edit the list of people who are allowed or denied to see your information? What’s more, I highly suspect that this isn’t a setting which you can edit from the mobile applications, which makes it hard to administer too.

Contact information and information about where you live, your email address and other personal data nuggets are important pieces of personal information, Personally Identifiable Information (PII) as the world has come to know it, and this information should be protected at all costs, not made available to somebody at the acceptance of a friend request. If the Facebook account of somebody in your friends list was hacked, then your information could become part of the next wave of phishing scams or telephone nuisance calls.

Aside from the PII though, there is the day-to-day aspect: do you actually want to see what said individual is posting status messages about, or do you want to know what they liked and shared? The answer is most likely not, especially if you are already dealing with a high volume of News Feed clutter. The other side of this issue is more personal, and the response will vary from person to person according to how much of their lives they want to publicise, but do I want people who I only know in the loosest of senses to know what I am doing, and do I want my status updates appearing in their News Feed? If I post a message that I’m having a great day out with my kids because I want to share the fact that I’m having a great time enjoying a day with my family, how do I know that the person I only met for fifteen minutes isn’t a professional crook who, now armed with the knowledge that I am out for the day with my kids along with my home address, isn’t going to come and burgle my house for all my prized, hard-earned possessions? The blunt answer is that you don’t know these things, because you probably don’t know enough about the people you friend on Facebook to make that judgement call.

For all my rambling in this post, the crux of the issue for me is that the definition of friends seems to have negatively evolved as social media has made people far more accessible to one another. I think that this is a good thing in many respects, as it allows us to connect with the people that we care most about in ways that we couldn’t have done previously, and people in this category are truly the real friends in life. On the other side though, I also think that there is a high degree of over-sharing going on: people are making their lives too publicly accessible for the consumption of those that they barely know at all, and they aren’t considering the implications of clicking that little blue accept button before they do it. Not only does this mean that each time you look at Facebook you have to wade through an endless scrolling page of tripe to reach the good stuff, consequently wasting your own time, but you are also exposing yourself and your information to strangers. If I wouldn’t give somebody I met at a party my phone number, why would I connect with them as a friend on Facebook, when the two are tantamount to the same thing?

Two Weeks of Dell, VMware and TechEd

It’s been a while since I’ve worked with VMware in any serious way, but for the last two weeks I’ve been working with a customer to deploy vSphere 5.5 on a new Dell VRTX chassis. I’ve seen the Dell VRTX on display at VMUG conferences gone by and it sure is an interesting proposition, but this is the first time I’ve had a chance to work with it in the real world.

All in all, the Dell VRTX is a really nice system; everything seems to be well planned and thought out. The web interface for managing the chassis works, but it is slow at times to open pages and refresh information; bearable. The remote KVM console to the blades themselves is Java based, so results may vary as to whether it works or not; I really dislike Java based systems and wish more vendors would start to use HTML5 for their interfaces. There is an apparent lack of information on the Dell website about the VRTX system: there is a wealth of configuration guides and best practice documents for the VRTX, but all of these seem to be pitched so high that they lack actual technical detail. Another issue is that the Dell parts catalogue doesn’t really acknowledge the existence of the VRTX system; I was talking to someone about extending the system with some Fibre Channel HBAs for FC storage connectivity, but of all the FC HBAs for sale on the Dell website, only a single port 4Gbps HBA is listed as supported, which I can’t believe for one minute given the PCIe slots in the VRTX are, well, PCIe slots.

Disk performance on the Shared PERC controller is pretty impressive, but networking needs to be treated with caution. If you are using the PowerEdge M620 half-height blade, it only exposes two 1GbE Ethernet interfaces to the internal switch plane on the chassis, whereas the full height PowerEdge M520 blade exposes four 1GbE Ethernet interfaces. I would have really liked to have seen all four interfaces on the half-height blade, especially when building virtualization solutions with VMware vSphere or Microsoft Windows Server Hyper-V.

I haven’t really worked with VMware much since vSphere 5.0 and, working with vSphere 5.5, not an awful lot has changed. After talking with the customer in question, we opted to deploy the vCenter Server Appliance (vCSA). The vCSA in previous releases of vSphere was a bit lacklustre in its configuration maximums, but in 5.5 this has been addressed and it can now be used as a serious alternative to a Windows Server running vCenter. The OVA virtual appliance is 1.8GB on disk, deploys really quickly, and the setup is fast and simple. vSphere Update Manager (VUM) isn’t supported under Linux or on the vCSA, so you do still need to run a Windows Server for VUM, but as not everyone opts to deploy VUM, that’s not a big deal really. What I would say about the vCSA, though, is that if you plan to use local authentication and not the VMware SSO service with Active Directory integration, then I would still consider the Windows Server. The reason for this is that with the vCSA, you cannot provision and manage new users and passwords via the vSphere Web Client; instead, you have to SSH onto the appliance and manage the users from the CLI. With the Windows Server, we can obviously do this with the Users and Groups MMC console, which is much easier if you are of the Microsoft persuasion. If you are using the VMware SSO service and Active Directory integration then this will not be a problem for you.
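As an aside on that point, the 5.5 appliance is SUSE-based and, as I understand it, those local users are ordinary OS accounts, so managing them over SSH comes down to the standard Linux commands; a sketch with a hypothetical account name:

# Create a local account on the appliance (hypothetical name) and set its password
useradd -m vsphereadmin
passwd vsphereadmin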

Keeping it on the VMware train, I’m looking forward to a day out to the VMware UK User Group Conference VMUG in Coventry in two weeks. I’ve been for the last three years and had a really good and informative day every time I’ve been.

Being so busy on the customer project with my head buried in VMware, I’ve been really slow on the uptake of TechEd Europe news, which bothers me, but fear not: thanks to Channel 9, I’ve got a nice list of sessions to watch and enjoy from the comfort of my sofa. With there being so many sessions that I’m interested in, it’s going to take me a fair old chunk of time to plough through them.

Thoughts on Windows Server 2003 End of Life

A post by me has just been published over on the Fordway blog at http://www.fordway.com/blog-fordway/windows-server-2003-end-of-life/.

This was written in parallel to my earlier post, Windows Server 2003 End of Life Spreadsheet, reproducing the spreadsheet for documenting your Windows Server 2003 environment originally posted by Microsoft. In this new post on the Fordway blog, I talk about some of the areas where we need to focus our attention and offer up some food for thought. If you have any questions then please feel free to get in touch, either with myself or with someone at Fordway, who will be happy to help you.

Monitoring SQL Server Agent Jobs with SCOM Guide

Late last night, I published a TechNet Guide that I have been working on recently entitled “Monitoring SQL Server Agent Jobs with SCOM”. Here’s the introduction from the document.

All good database administrators (DBAs) create jobs, plans and tasks to keep their SQL servers in tip-top shape, but a lot of the time, insight into the status of these jobs is either left unturned like an age-old stone, or is gained by configuring SQL Database Mail on your SQL servers so that email alerts are generated, which means additional configuration being done on every server and yet another thing to manage.

In this guide, I am going to walk you through configuring a System Center Operations Manager 2012 R2 environment to extend the monitoring of your SQL Servers to include the health state of your SQL Server Agent Jobs, allowing you to keep an eye on not just the SQL Server platform but also the jobs that run to keep the platform healthy.

You can download the guide from the TechNet Gallery at https://gallery.technet.microsoft.com/SQL-Server-Agent-Jobs-with-f2b7d5ce. Please rate the guide to let me know whether you liked it or not using the star system on TechNet. I welcome your feedback in the Q&A.