Posts from 2013

Deduplication in Windows Server 2012 Essentials

Yesterday, I posted a quasi-rant about Windows Server 2012 Essentials Storage Pools and the inability to remove a disk in a sensible, non-destructive manner. At the end of that post, I alluded to the lack of the Primary Data Deduplication feature in Windows Server 2012 Essentials, which got me thinking about it more, so I went off on an internet duck hunt to find a solution.

Firstly, I found this thread (http://social.technet.microsoft.com/Forums/en-US/winserveressentials/thread/4288f259-cf87-4bd6-bf9f-babfe26b5a69) on the TechNet forums in which an MVP highlights a bug which was filed on Microsoft Connect during the beta stages over the lack of deduplication. The bug was closed by Microsoft with a status of ‘Postponed’ and a message that it was a business decision to remove the feature.

Sad but true, especially when the people being targeted by Essentials are the ones potentially wanting and needing it most, but I guess the reason probably lies in the realms of supportability and a degree of knowledge gap in the home and small business sectors when it comes to understanding the feature.

Luckily for me, in another search, I found this article (http://forums.mydigitallife.info/archive/index.php/t-34417.html) at My Digital Life, where some nefarious user has managed to extract the .cab files from a Windows Server 2012 Standard installation required to allow DISM to install the feature. While the post is aimed at Windows 8 64-bit users wanting dedup on their desktop machines, the process works equally well for Windows Server 2012 Essentials, if not better, as you can also use the GUI to drive the configuration.

I don’t want to be the one in breach of copyright or Microsoft’s terms of service, so I’m not going to link directly to the .7z file provided on My Digital Life; you’ll need to download it from them, sorry.

Download the file and extract it to a location on the server. Once extracted, open an elevated command prompt, change the directory context of the prompt to your extracted .7z folder and enter the following command:
dism /Online /Add-Package ^
 /PackagePath:Microsoft-Windows-VdsInterop-Package~31bf3856ad364e35~amd64~~6.2.9200.16384.cab ^
 /PackagePath:Microsoft-Windows-VdsInterop-Package~31bf3856ad364e35~amd64~en-US~6.2.9200.16384.cab ^
 /PackagePath:Microsoft-Windows-FileServer-Package~31bf3856ad364e35~amd64~~6.2.9200.16384.cab ^
 /PackagePath:Microsoft-Windows-FileServer-Package~31bf3856ad364e35~amd64~en-US~6.2.9200.16384.cab ^
 /PackagePath:Microsoft-Windows-Dedup-Package~31bf3856ad364e35~amd64~~6.2.9200.16384.cab ^
 /PackagePath:Microsoft-Windows-Dedup-Package~31bf3856ad364e35~amd64~en-US~6.2.9200.16384.cab

If DISM fails or gives you any errors, then the most likely cause is that you didn’t use an elevated command prompt. The next likely cause is that you aren’t in the correct working directory so check that too.
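If you’d rather not hand-type that monster of a command, the switch list can be assembled programmatically. A sketch in Python (purely illustrative; it only builds the command string from the .cab names above, it doesn’t run DISM):

```python
# The six .cab files extracted from the My Digital Life archive, as listed above.
cabs = [
    "Microsoft-Windows-VdsInterop-Package~31bf3856ad364e35~amd64~~6.2.9200.16384.cab",
    "Microsoft-Windows-VdsInterop-Package~31bf3856ad364e35~amd64~en-US~6.2.9200.16384.cab",
    "Microsoft-Windows-FileServer-Package~31bf3856ad364e35~amd64~~6.2.9200.16384.cab",
    "Microsoft-Windows-FileServer-Package~31bf3856ad364e35~amd64~en-US~6.2.9200.16384.cab",
    "Microsoft-Windows-Dedup-Package~31bf3856ad364e35~amd64~~6.2.9200.16384.cab",
    "Microsoft-Windows-Dedup-Package~31bf3856ad364e35~amd64~en-US~6.2.9200.16384.cab",
]

def build_dism_command(cab_files):
    """Return the dism argument list that adds every package in one call."""
    args = ["dism", "/Online", "/Add-Package"]
    args += ["/PackagePath:" + cab for cab in cab_files]
    return args

# Join the arguments into the single command line you would paste into the prompt.
print(" ".join(build_dism_command(cabs)))
```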

Once all of the packages are imported okay, enter the second command:
dism /Online /Enable-Feature /FeatureName:Dedup-Core /All

No restart is required for the import of the packages or the enabling of the feature, so everything can be done online.

Once the feature is enabled, head over to Server Manager to get things started. Server Manager isn’t pinned to the server Start Screen by default, so from the Start Screen type Server Manager and it will appear in the in-line search results.

From Server Manager, select File and Storage Services from the left pane, and then select Volumes from the sub-options.

As you will see in the screenshot, I’ve already enabled dedup on the volume of this test Windows Server 2012 Essentials VM of mine, and I’ve saved space by creating two data folders with identical data in each folder.
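Those savings come from block-level deduplication: files are split into chunks, and each unique chunk is stored on disk only once. A toy illustration of the idea in Python (real Windows dedup uses variable-size chunking and compression; the fixed 64KB chunks here are my simplification):

```python
import hashlib

def dedup_savings(files, chunk_size=65536):
    """Bytes saved if every duplicate chunk were stored only once."""
    seen = set()
    total = unique = 0
    for data in files:
        for i in range(0, len(data), chunk_size):
            chunk = data[i:i + chunk_size]
            total += len(chunk)
            digest = hashlib.sha256(chunk).digest()
            if digest not in seen:     # first time we've seen this chunk
                seen.add(digest)
                unique += len(chunk)
    return total - unique

# Two folders holding identical data: the second copy is pure savings.
payload = b"media file contents" * 10000
print(dedup_savings([payload, payload]))  # -> 190000, the whole second copy
```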

To configure your volumes, right-click the volume you want to set up and select the Configure Data Deduplication option. On the options screen, first tick the box to enable the feature. Once selected, you have options for the age of files to include in deduplication and the types of file to exclude. For my usage at home, I am setting the age to 0 days, which includes all files regardless of age, and I am choosing not to exclude any file types as I want maximum savings.

The final step is at the bottom of the dialog, Set Deduplication Schedule. This allows you to configure when optimization tasks occur and whether to use background optimization during idle periods of disk access. I chose to enable both of these and I have left the default time of 0145hrs in place.

Once you click OK, and then OK again on the initial dialog, you have enabled dedup on that volume. Repeat the process for any volumes you are interested in and the job is done. After this, the server has the hard task of calculating the savings: creating the metadata links to physical blocks on the disk and marking the space occupied by duplicate blocks as free. This process is very CPU and memory heavy and, depending on the size of your dataset, can and will take a long time to run.

I am just about to kick off a manual task on my live Essentials server at home, so once the results are in, I will be posting here to report my savings and also the time taken, but I’m not expecting this to come in anytime within the next day or so.

 

The Problem with Storage Spaces

As you may well have gathered from a number of my previous posts about Windows Server 2012 and Storage Pools, I was intending to use them for my home server rebuild, and I am indeed using them; however, I have neglected to post anything showing the new server (although I will change that shortly).

I ran into a problem with Storage Pools today which I think quite frankly blows. I got myself a new Western Digital 3TB Red drive to try out. The plan is to replace all of my existing six 2TB Western Digital Green drives with these for a number of reasons including greater bang for buck on power consumption, increased IOPS, cooler running temperature and improved reliability.

Not wanting to keep a mixture of Green and Red drives for very long, I proceeded to remove one of the drives from the pool to replace with a Red drive. The Storage Pool refused to remove it as a Simple non-redundant Storage Space was being hosted on this drive.

Problem 1: Storage Spaces cannot be converted between Simple, Mirror or Parity layouts. Once they are created, they are created. My only option was to create a new temporary Space marked as Mirror and copy the data off the Simple Space so that I could delete it. Once deleted, I made a second attempt to remove the drive and got an error that I needed to add another drive to the pool as there was insufficient capacity.

I’m sorry, what?

Problem 2: I have six 2TB drives in an uber-pool and I am currently using less than half of it, so removing one drive should be no problem. I tried this a few more times and each time got the same error that I would need to add more capacity to the pool before I would be able to remove the drive, which I know to be cobblers.

In the end, I just pulled the disk from the server and let the Storage Pool have a cry about the missing disk. From there, I marked the disk for removal to let Windows think that the disk had failed and was never coming back. This worked, although it is time-consuming as it forces all Mirror and Parity virtual disks to enter a repairing state, copying blocks to the remaining disks in the pool to keep up the protection level.

This brings me softly onto another point which is more of a beef.

Beef 1: One of the headline features of Windows Server 2012 was deduplication, and anyone familiar with the product will know that Storage Pools and deduplication do work together. In Essentials, though, deduplication is absent, missing, not there. The feature is completely missing from the Server Manager interfaces, and in PowerShell the command Get-Command -Module Dedup* returns nothing.

Why is it missing from Essentials? Essentials is the release of Windows Server 2012 targeted at SMB/SME and pro-home customers, the customers most likely to be storing a lot of data on a tight budget, so why strip out the feature they would probably be most interested in?

I really hope that Microsoft get enough complaints from customers of Essentials to release a Feature Update to re-add the support for deduplication.


Azure Online Backup Service Outage

So I came home today to check up on my trusty Essentials 2012 server and was confused to see that the Online Backup tab for my free six-month trial of Azure Online Backup reported absolutely nothing: no data, no stats, nothing. I closed the Dashboard and headed over to the Azure Online Backup MMC console to get some more ‘direct’ information. Again, nothing.

I logged into the Azure Online Backup Portal to check my account, making sure that my trial hadn’t accidentally been suspended or cancelled for some reason, and spotted this:

Uh oh. Looks like the whole worldwide Azure Online Backup service is down, so this will be affecting Server 2012 Essentials and System Center DPM 2012 SP1 customers as well as conventional Azure Online Backup customers. Hopefully the service gets restored without anyone having to re-register their servers.

 

Active Directory and the Case of the Failed BitLocker Recovery Key Archive

This is an issue I came across this evening at home (yes, just to reiterate, home), however the issue applies equally to my workplace as we encounter the same issue there.

One of the laptops in my house incorporates a TPM module, which I take advantage of to BitLocker-encrypt the hard disk using the TPM and a PIN. This gives me peace of mind as it’s the laptop used by my wife, who, although she doesn’t currently, will likely start to take her device out on the road when studying at university.

Historically, I have used the Save to File method of storing the recovery key, keeping a copy both on our home server and in my SkyDrive account for protection, but with our new Windows Server 2012 Essentials environment, I wanted to take advantage of Active Directory and configure the clients to automatically archive the keys there.

The key to beginning this process is to download an .exe file from Microsoft (http://www.microsoft.com/en-us/download/details.aspx?id=13432). I’m not going to explain here how to extend the AD Schema or modify the domain ACL for this all to work as that is all explained in the Microsoft document.

Following the instructions, I created a GPO which applied both the Trusted Platform Module Services Computer Configuration setting Turn on TPM Backup to Active Directory Domain Services and the BitLocker Drive Encryption setting Store BitLocker Recovery Information in Active Directory Domain Services.

After allowing the machine to pick up the GPO, and a restart to be sure, I enabled BitLocker, and after checking in AD I realised that nothing was being backed up. Strange, I thought, as this matches a problem in the office at work; however, we had attributed that to a potential issue with our AD security ACEs, whereas at home this is a brand-new Windows Server 2012 installation with previously untouched, out-of-the-box ACEs.

After scratching my head a little and a bit more poking around in Group Policy, I clocked it. The settings defined in the documentation are for Windows Vista. Windows 7 and Windows 8 clients rely on a different set of Group Policy Computer Configuration settings.

These new settings give you far more granular control of BitLocker than the Windows Vista settings did, so much so, that Microsoft elected that the Windows Vista settings would simply not apply to Windows 7 or 8 and that the new settings needed to be used.

You can find the new settings in Computer Configuration > Administrative Templates > Windows Components > BitLocker Drive Encryption. The settings in the root of this GPO hive are the existing Vista settings. The new Windows 7 and Windows 8 settings live in the three child sections: Fixed Data Drives, Operating System Drives and Removable Data Drives.

Each area gives you specific, granular control over how BitLocker affects those volumes: whether to store the key in AD DS, whether to allow a user to configure a PIN or just use the TPM and, probably the best option second to enabling AD DS archival in my opinion, whether to let the user choose, or to mandate, that the entire drive or only the used space is encrypted. The Operating System Drives section gives you the most options and will likely be the one people want to configure most, as this ultimately determines the behaviour when booting your computer.

I’m sure you’ll agree that there are a lot of new settings here over Vista, giving you much greater flexibility and control, but with great power comes great responsibility. Make sure you read the effects and impact of each setting carefully and that you test your configuration. If possible, back up any data on machines you are testing BitLocker GPOs against, in case the key isn’t archived to AD DS and you end up needing a recovery key you don’t have.

Media Center Auto-Start on Windows 8

With my backend server updated to Windows Server 2012, I was keen to get my media front-end up to Windows 8 also, to take advantage of SMB 3.0 for improved performance when opening and accessing the media stored on the server. I rebuilt the front-end about two weeks ago, taking advantage of the free Media Pack upgrade prior to January 31st. I had already tested the components I use to make my media center tick, including the Shark007 Codec Pack, MyMovies and MediaControl, so I knew all was good.

After installing Windows and the software needed, and configuring auto-login for the media center service account, I proceeded to copy the shortcut I used in Windows 7 to launch the Media Center application into the Startup start menu group for the account. In Windows 8, the Startup group can be found in %AppData%\Microsoft\Windows\Start Menu\Programs\Startup. With the shortcut added, I restarted the machine to test the result.

In Windows 8, to help attract people to the new Start Screen, the Start Screen automatically opens at login of any account. What I found was that this screen would pop over the Windows Media Center application which is hardly seamless for a keyboard and mouse free front-end. Using the remote, I clicked the Desktop tile on the Start Screen and Media Center appeared as expected, but I couldn’t control it. The reason was that although the application was now visible, it didn’t have focus so any inputs were ignored. Attaching a mouse to the machine and clicking anywhere in the Media Center interface restored focus but short of writing an AutoIt macro to do that for me (which is a nasty hack) this isn’t what I wanted or needed.

Luckily, a colleague pointed me in the direction of a Group Policy setting sometimes used in Remote Desktop Services or kiosk computer scenarios, where the Explorer interface is hidden and a default application is launched in its place. The setting still exists in Windows 8, so I gave it a shot and guess what? It works perfectly. I’m in the fortunate position that I am using Windows Server 2012 Essentials in a domain scenario, so I was able to apply the Group Policy from the server; however, this fix will work equally well in a non-domain scenario.

The policy setting can be found under User Configuration > Administrative Templates > System. The setting is named Custom User Interface.

Enable the setting and specify the name of the application you want to launch. In my case, it is %WinDir%\eHome\eShell.exe /nostartupanimation /mediamode.

It’s highly recommended to use environment variables here rather than absolute paths if you can, as I have done above. This will also work on Windows XP, Vista and 7, and for XBMC, Plex and other media clients you may use besides Windows Media Center. A byproduct is that startup performance is actually improved, as you are no longer waiting for the Explorer shell to launch, and it prevents a few processes from running on the machine, giving you a little more CPU and memory.

As you will see, I use a couple of switches with my Windows Media Center startup to control its behaviour, which I would also recommend. The first stops the animation of the Media Center logo on startup, which I find saves about a second in load times, and the second enters Media Mode. In this mode, Media Center’s close and minimise buttons are disabled, so it always runs full screen and cannot be closed unless you use the manual Exit Media Mode option in the menu.

In the next couple of days, I’ll try and get a YouTube video up demonstrating the process for configuring this setting both via Windows Server 2012 Essentials domain and locally using the Local Group Policy Editor.

Breaking the Duck

It’s been over 18 months since I last sat an IT Pro exam of some description and frankly that was far, far too long. I should really have taken my TOGAF 9 exams last year as a minimum as the Architecting the Enterprise course I attended in London in May included the vouchers for the combined TOGAF exam, but it just never happened.

Today though, I finally broke the duck on my exam sitting and took my VMware Certified Professional 5 Datacenter Virtualization (VCP5-DV) exam and passed it. Maximum score for the exam is 500 and the minimum passing score is 300. I scored 380 which works out to be just shy of 80%. I wasn’t thrilled with the result, but I was happy to pass it first time round.

I got a lot of questions on VMware FT, which is probably my weakest area of the product after I spent most of my research time on iSCSI and NFS, squaring up my existing Fibre Channel knowledge to cover all of the storage topics. Although I’ve now passed the exam, I’m going to continue my research and brush up on Fault Tolerance.

Next up? Well, my Cisco CCENT qualification expires in April this year, so I’ve got three months to pass my ICND2 exam to gain my CCNA or I lose the earlier CCENT and have to sit both exams again. Luckily, my networking knowledge has grown a lot since the first time I sat ICND2 and failed it about two and a half years ago, so I’m confident with some new research and studying into serial connections, IPv6 and a few other bits, I will be able to pass that exam.

Onwards and upwards…..

Controlling Configuration Manager Clients Across WAN Links

At work this week, we encountered an issue when a package I created for Adobe Reader 10 went mandatory in Configuration Manager. We service retail stores connected via slow WAN links back to our head offices. When I first joined the company, on a monthly basis when new Windows Updates were released into the wild, our network team would come down upon our team, fire-wielding, whilst we pillaged the lines to the stores.

Configuration Manager gives you the power to create BITS (Background Intelligent Transfer Service) policies to throttle the bandwidth consumed by SCCM client transfers for packages and patches. The problem with Configuration Manager, however, is that its policy is not granular but singular: to apply a 32KB/s limit, which we needed to do for the stores, you would also limit the speed of head office clients connected to 100 Megabit or 1 Gigabit high-speed LAN connections.

Group Policy also gives you the ability to configure BITS throttling policies, and in actual fact gives you more granular control; the fact that you can link Group Policies to OUs, and not just entire domains or sites, lets us control the speeds in a more appropriate way.

In a Group Policy Editor window from Group Policy Management Console (GPMC), navigate to Computer Configuration, Administrative Templates, Network, Background Intelligent Transfer Services (BITS). From here, enable the Limit the Maximum Network Bandwidth for BITS Background Transfers setting and configure the speeds and times as you need. You can also configure an out of hours policy which we make use of, limiting the store clients to 32KB/s between 8am and 6pm daily, but allowing them to expand to 256KB/s overnight when the store is closed, not making VoIP calls or trying to transact credit cards.
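The schedule logic amounts to a simple time-window check. A sketch in Python of the policy described above (the function name and defaults are mine; the real work is done by the BITS Group Policy setting, not code you write):

```python
def bits_cap_kb_per_s(hour, day_cap=32, night_cap=256, day_start=8, day_end=18):
    """Return the BITS bandwidth cap (KB/s) that applies at a given hour (0-23)."""
    if day_start <= hour < day_end:
        return day_cap    # 8am-6pm: stores are trading, keep the link quiet
    return night_cap      # overnight: stores closed, let transfers stretch out

print(bits_cap_kb_per_s(12))  # midday -> 32
print(bits_cap_kb_per_s(2))   # 2am -> 256
```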

This worked great, and the next time we deployed patches we had no complaints from the stores; instead, we had a problem at the head office. The problem hadn’t manifested previously because we had to stagger patch deployments before the packages reached everyone, but now, due to the length of time a Configuration Manager client stayed connected to the Distribution Point downloading packages, we were seeing prolonged connections to the IIS site on the Distribution Point, and lots of 64KB/s connections add up to a lot of bandwidth. We were now consuming all of the bandwidth at the head office site, causing inter-site applications between our two head offices to crawl.

We found a solution to this problem in IIS. The solution probably isn’t recommended by Microsoft or any System Center consultants out in the wild, but it works for us and causes no side-effects that we’ve witnessed in the year or so that it has been in place, so it has withstood the test of time.

Using IIS Manager on the Distribution Point server, expand the Default Web Site. From the Actions pane on the right-hand side of the interface, select Advanced Settings. From the Advanced Settings dialog, expand the Connection Limits group under the Behaviour heading. By default, IIS accepts a number of connections so large it may as well be infinite. We calculated that, based on the free capacity on our link at the head office and taking into account the traffic required for LoB applications, we could handle about 20Mbps of client traffic. We divided the 20Mbps by the 64KB/s BITS setting, which gave us 320. Setting the IIS connection limit to 320 and restarting the site in IIS, we saw an instant reduction in Distribution Point server network activity and also the drop we needed on the site link.
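For what it’s worth, the arithmetic behind that figure is a one-liner. A sketch (the function name is mine; note the published 320 only falls out if both rates are taken in kilobits per second, i.e. a 20 Mb/s budget divided by a 64 Kb/s per-client cap):

```python
def iis_connection_limit(spare_capacity_mbps, per_client_kbps):
    """Number of throttled BITS clients that fit in the spare WAN capacity,
    assuming each connection consumes a fixed rate."""
    return (spare_capacity_mbps * 1024) // per_client_kbps

print(iis_connection_limit(20, 64))  # -> 320
```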

As I have mentioned above, this was done over a year ago. In that time, we’ve not had a single complaint from stores, head office users or our network team about network contention due to SCCM traffic, nor have we seen any apparent SCCM client issues such as clients failing to report because connection retry limits were reached. This isn’t to say that this fix is for everyone: you would need to factor in things like how many total clients you have in an SCCM site, whether a client at the back of the IIS queue would wait so long for a connection that it would time out, and whether you need to be able to rapidly deploy high-threat updates regardless of the impact to other LoB applications.

Since implementing these changes, we’ve had two Microsoft SCCM RAP assessments and neither has produced red or amber health status problems due to the changes to BITS and IIS, so I think we found ourselves a winner.

Restoring Client Computer Backup Database in Windows Home Server 2011

Quite some time ago (probably about two months now), the primary drive in my Windows Home Server 2011 machine started giving me issues due to a new device driver I installed. Nothing got me going with ease: Last Known Good Configuration, Safe Mode, nothing. The problem lay in the fact that the server wouldn’t acknowledge that the OS disk was a boot partition, and after leaving it to attempt to repair the boot files by itself, which for the record I don’t think I’ve ever seen work, I took to it manually.

Launching the Recovery Console command prompt from the installation media, I tried the good old commands that have served me well in the past on Windows Vista and Windows 7 machines I’ve had to repair, bootrec and bootsect, but nothing worked, so I was left with only one option: re-install the OS. I wasn’t concerned about losing personal data, which is stored on a separate RAID volume, but I was concerned about my client backups, which were stored on the same volume.

Using a USB-attached hard disk, I manually copied out the Client Computer Backups folder, then rebuilt the operating system. I don’t keep active backups of the Home Server operating system because the Windows Server Backup utility in Windows Server 2008 R2 isn’t that hot: it doesn’t support GPT partitions over 2TB, which obviously is an issue.

Once installed, Windows Home Server sets up the default shares and folders including the Client Computer Backups. The critical thing here is that no clients can start a backup to the server before you complete these steps. Once a client starts a backup to the server, it creates the new databases and files for the server, ruining the chances of importing the existing structure.

From the new OS installation, open the directory where the Client Computer Backups live. The default location is C:\ServerFolders\Client Computer Backups, but I had moved mine to D:\ServerFolders\Client Computer Backups. Once you’ve found the directory, copy across all of the files previously copied out of the burnt install of Windows and overwrite any files when prompted.

Once completed, restart the server. This will restart all of the Windows Home Server services responsible for running the Dashboard and the Client Computer Backups. Once the restart has completed, open the Dashboard and select the Computers tab where you normally view the computer health states and backups. On first inspection, it looks as though you have no clients and no backups, but look more closely and you will see a collapsed group called Archived Computers. Expand this group and you will see all of your clients listed, and all of their associated backups will be listed if you select the Restore Files option for a computer.

The thing to point out here is that these backups remain disassociated from the clients. Once you re-add a client to the server and commence a backup, it will be listed as a normal computer, and the Archived Computer object for it will also remain listed. This is because the server generates GUIDs for the backup files based on a combination of the client identity and the server identity, and because the reinstallation of the operating system causes a new server identity to be generated, the GUIDs differ. This isn’t a problem for me, but I’ve read a number of posts on the TechNet forums at Microsoft where people have had trouble locating the Archived Computers group in the Dashboard interface and think that they’ve lost everything, which clearly isn’t the case.
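To illustrate the identity point (this is purely conceptual; it is not Microsoft’s actual GUID scheme), the behaviour matches a deterministic identifier derived from both the client and the server identities, so changing either side orphans the old backups:

```python
import uuid

def backup_id(client_identity, server_identity):
    """Stable identifier for a client/server backup pairing (illustrative only)."""
    return uuid.uuid5(uuid.NAMESPACE_DNS, client_identity + "/" + server_identity)

before = backup_id("laptop-01", "server-install-A")
after_reinstall = backup_id("laptop-01", "server-install-B")  # new server identity
print(before != after_reinstall)  # True: same client, but the pairing has changed
```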

The Things You Don’t Normally Hear

In a somewhat random post from me, I’m going to make a comment on my Sennheiser HD215 headphones.

I bought these recently to replace the failed Creative I-Trigue 2.1 speakers I used at home on my desktop PC. More and more of late I have been turning to headphones over speakers, largely due to wanting to be able to listen to movies, YouTube or good old-fashioned music at a sensible volume; with my study being fairly close to the kids’ bedroom, the speakers weren’t the best option for volume in the evenings while the kids are asleep.

I’ve been using a pair of Sennheiser HD201 headphones at work for around the last year. I like them for the £30 price tag and they are more than good enough for the office. Being in a shared office with moderate background noise, and given that I can’t rationally expect to pump out 100dB of music without disrupting others and possibly my own productivity, I’ve never really had the greatest chance to explore them fully. They are also uncomfortable after more than about an hour of listening, although I rarely get the chance to listen for that long in a solid block, so it’s a non-issue: the pad and can size means they sit on the ear rather than around it, and the padding isn’t that thick, so the plastic construction of the cans slowly presses into your outer ear, giving you that warm-ear discomfort sensation.

Giving the new HD215 cans a try at home this evening, I instantly felt the difference: due to the size of the cans, they sit around the ear, resting instead on your head and leaving the ear free. Listening to a range of tracks from Dance and Dubstep to Vocal and Acoustic, it’s amazing some of the tones and notes you detect with decent headphones at decent volumes that you otherwise just don’t. My case example is Radio 1 Established 1967, a 2-CD album of tracks taken from the Radio 1 Live Lounge to celebrate one of Radio 1’s anniversaries, and the songs on it sound completely different. I’ve listened to the album at work before on the HD201 headphones, and I remember it sounding a decent amount better than how I had previously heard it, but that’s because I had only ever used my Sennheiser CX-300 II in-ear headphones before; these HD215s seem to take it up another level.

Let’s be straight: I’m no music expert, nor am I an audiophile with an exceptional ear for quality in headphones or music, or the ability to tell a 20,000Hz tone from a 22,000Hz tone; I just like music. I’m sure that someone with deeper pockets than me could easily say that their £500 super-duper headphones, with their all-singing, all-dancing digitally optimised listening environment and equipment, will sound factors greater than these, and perhaps they’re correct, but for £55, these sound incredible.

My only criticism is that they are supplied with a coiled lead rather than a straight one. I’m not a fan of coiled leads as you just end up pulling against them, trying to reach the length you want and not the length the cable inherently wants, which adds a level of unnecessary discomfort. Luckily, QED have the answer in the form of a 3.5mm jack-to-jack lead, available in 1, 2 or 3 metre lengths, that I can replace the lead with; at least Sennheiser are nice enough to make this model of headphones with a totally detachable lead using standard 3.5mm jacks at either end.

Windows Server 2012 Essentials and the Failed Migration

Last week, I took a day out of the office as annual leave to migrate my home setup from Windows Home Server 2011 to Windows Server 2012 Essentials, taking in all of the blog posts I have written over the previous months about how I intend to use some of its new features.

Suffice to say, it wasn’t a success, but I have completed the lessons-learnt exercise and I am now preparing for a second attempt.

The main protagonist in the failure was the recently acquired 3ware 9590SE-12ML multilane SAS/SATA RAID controller. After installing the card about a month ago to verify its functionality, I saw the message “3ware BIOS not initialized”, and the 3ware site comforted me that this was simply because I had no drives connected to it. When I connected my two new Intel 520 Series SSDs to create a RAID1 mirror for my new OS drive, I still saw the same message even though the drives were detected okay. I installed the 3DM2 software in Windows Home Server 2011 and was able to manage the card via the web interface (which is really nice, by the way); however, after creating the volume unit, the controller began to initialize the disks and the system froze instantly. I left it a minute or two just in case, but no joy. A hard power-off and restart then left the controller completely missing from POST and startup, with even the BIOS not showing it as connected. After trying a few different things, I was able to intermittently get the card detected, but not without causing major stability issues, and it still wouldn’t properly initialize its BIOS during POST. A colleague lent me an Adaptec card for a day to test; this card was detected okay, allowed me to create a volume, and the volume was detected within Windows, so I had it down to a compatibility issue between the motherboard and the 3ware card.

I decided that the motherboard compatibility issue could be related to the fact that it is a Micro ATX board with the AMD Brazos chipset and the ultra-low power AMD E-350 processor, and that the card could perhaps not draw sufficient power from the PCI Express x16 (x4 mode) slot, so I began looking at some other options. The processor has actually been one of the things I wish I had done differently of late. When the server was first built and put online it was great, but as I began to use the Home Server for more backend-centric tasks, I noticed the 1.4GHz dual core processor struggling, and some tasks would time out if their timing happened to collide with other simultaneous tasks.

With the Ivy Bridge 3rd Generation Intel Core family of CPUs, Intel released a line of CPUs appended with the letter T. This family of CPUs is low power compared to the letter-less or K processors, with the Core i5-3470T being the most efficient, pipping even the Core i3 T variant for the best balance of peak TDP and performance. Compared to the 18W peak TDP of my AMD E-350 chip, the Intel Core i5-3470T has a peak TDP of 35W; in exchange, it gives 2.9GHz dual core processing with Hyper-Threading, allowing Windows to see two additional virtual cores, and because it is an i5 chip and not the lower specification i3, it features Turbo Boost, which allows the CPU to boost up to 3.6GHz under high load. Using data from cpubenchmark.net, the AMD E-350 produces a score of 774, whilst the Intel Core i5-3470T produces a score of 4,640.
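To put those two sets of figures together, here’s a quick back-of-envelope comparison using only the numbers quoted above (the cpubenchmark.net scores and the peak TDPs); it’s a rough sketch, as TDP is a peak rating rather than a measure of actual power draw:

```python
# Rough performance and performance-per-watt comparison using the
# cpubenchmark.net scores and peak TDP figures quoted in the post.
chips = {
    "AMD E-350":           {"score": 774,  "tdp_w": 18},
    "Intel Core i5-3470T": {"score": 4640, "tdp_w": 35},
}

for name, c in chips.items():
    # Higher points-per-watt means more work done per unit of peak power
    print(f"{name}: {c['score'] / c['tdp_w']:.0f} benchmark points per watt")

uplift = chips["Intel Core i5-3470T"]["score"] / chips["AMD E-350"]["score"]
print(f"Raw performance uplift: {uplift:.1f}x")
```

Even though the i5-3470T nearly doubles the peak TDP, it roughly triples the benchmark points per watt, which is why the swap made sense to me on efficiency grounds as well as raw speed.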

Investing in Ivy Bridge is more expensive than investing in the 2nd Generation Sandy Bridge, which also offers some T-branded chips for energy efficiency, but the CPU benchmark for Sandy Bridge vs. Ivy Bridge speaks for itself, not to mention that Ivy Bridge reduces the TDP by 7W, so the extra few pounds between the chips is worth the money.

To support the Ivy Bridge Socket 1155 Core i5 processor, I was going to need a new motherboard. I like ASUS as they are the market leader in motherboards in my view, and I decided upon the ASUS P8Z77-V LX board for several reasons. It’s a step up from the Micro ATX board I have previously been using to a standard ATX board.

The benefits of this are that it gives me four memory slots in a dual channel configuration, whereas I previously had only two slots with a single channel. The slot count isn’t an issue as I upgraded about six months ago from my originally purchased Corsair Value Select 2x2GB DIMMs to 2x4GB Corsair XMS3 DIMMs. The new DIMMs allowed me to make use of the higher DDR3 PC3-12800 1600MHz speeds and doubled my memory ceiling, as running SQL Express on the backend for the MyMovies database meant I was getting very close to 4GB daily. They also give me a theoretically more stable system: the XMS3 memory is designed for overclocking and high performance cooling with its heat spreaders, so running it at a standard clock should make it super stable. The other benefit is the increased PCI Express slot count. The new board gives me 3x PCI, 2x PCIe x1 and 2x PCIe x16 slots, one of which is a true x16 PCIe 3.0 slot and the other a PCIe 2.0 slot with x4 bandwidth.

The other reason for selecting it was the Z77 chipset. The Z77 affords me the widest range of slots and interfaces and is also the best bang for buck, having the lowest power consumption of all the full-feature chipsets (ignoring the Q77 chipset, as although that adds Intel vPro, you lose a lot of slots through it).

All told, with the pair of new SSD drives for the OS mirror, the new Core i5 processor and the new ASUS motherboard, my overall power consumption will increase by what equates to £10-15 a year. When you consider the performance uplift I am going to see from this (the hint is: worlds apart), it’s £10-15 a year very well spent.
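For anyone wondering where a figure like that comes from, here is the kind of back-of-envelope sum involved. Note that the 10W average extra draw and the roughly 13p/kWh tariff are my illustrative assumptions, not measured figures; TDP is a peak rating, so the real average delta for a mostly idle server will be well below the 17W difference in peak TDP:

```python
# Back-of-envelope running-cost estimate for the extra power draw of
# the upgraded build. Both inputs below are assumptions for illustration.
extra_watts = 10           # assumed average extra draw (not measured)
tariff_gbp_per_kwh = 0.13  # assumed UK electricity tariff, circa 2013
hours_per_year = 24 * 365

extra_kwh = extra_watts * hours_per_year / 1000   # watt-hours -> kWh
annual_cost = extra_kwh * tariff_gbp_per_kwh
print(f"~{extra_kwh:.0f} kWh/year extra, roughly £{annual_cost:.2f}/year")
```

With those assumptions the extra draw works out to around £11 a year, which sits comfortably inside the £10-15 estimate.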

The T variant of the Ivy Bridge supports passive cooling, which aligns with my previous mantra of keeping it quiet, but I have come to the conclusion over the last year that this is unnecessary when I have a Cisco 2950T switch and a Cisco PIX firewall making far more noise than a server would. Given that it is all racked in my garage, out of earshot of the rest of the house for the one to two hours a month I may spend in there, it’s just not worth the thermal thought process of trying to engineer it quiet and cool. I have also been getting concerned lately about the temperatures of the Western Digital Green drives stacked up inside the 4U case, so I’m switching to active cooling. I selected the Akasa AK-CCE-7101CP. It supports all manner of Intel sockets including Socket 1155 for Ivy Bridge and has variable fan speed and decibel output. It’s rated up to 95W TDP for the quad core i5 and i7 family chips, so running it on the 35W T variant of the i5, I’m hoping it will run at the quiet end of its spectrum, putting it at 11.7dB, which is silent to the passing ear anyway.

To help with my drive cooling problem, and also an ongoing concern about how I would deal with a drive failure or upgrade in a hurry (currently it’s: shut down the server, drag a keyboard, mouse and monitor from my study to the rack to access the console session, open the case, connect the new drive cables and so on), I decided to invest in X-Case 3-to-5 hot swap caddies. These caddies replace the internal cold swap drive bays, which require manual cabling and drive screwing, with an externally accessible hot swap caddy system. All the drives in a block of five are powered via two Molex connectors, reducing the number of power connectors I need from my modular PSU, and the five SATA data ports on the rear of the cage are pre-connected inside the case, allowing me to hot add and remove disks without powering down the server or even having to open the case. Each caddy also features a drive status and a drive access indicator so that if a drive fails I can readily tell which drive is the one in question, making fault resolution much easier. This is all the more important and useful with Windows Server 2012 Essentials. The cage also incorporates an 80mm fan which draws air out of the drive cage to keep the disk temperatures down.

To summarize then, I’m doing the following:

  1. Upgrading the ASUS AMD Brazos Motherboard to an ASUS P8Z77-V LX Motherboard
  2. Upgrading the AMD E-350 Dual Core 1.4GHz CPU (774 Score) to an Intel Core i5-3470T 2.9GHz Dual Core CPU (4,640 Score)
  3. Gaining an Extra Memory Channel for my Corsair XMS3 2x4GB DIMMs
  4. Adding X-Case Hot Swap Drive Caddies
  5. Gaining a Bit of Active Cooling

I’m still waiting for a few of the parts to arrive, but once they do, it’s going to feel like the Home Server is getting its 18 month birthday present in the form of several serious performance, ease of use and management upgrades. I’m really looking forward to it and, in a sad kind of way, I’m glad that the upgrade didn’t work out the first time, otherwise I wouldn’t have invested in these parts, which I know I’m not going to regret buying.

Once I’ve got everything installed, I’ll write another post with pictures of it, hotlinking to my old pictures for a little before and after comparison, and then it’ll be a hot trot into Windows Server 2012 Essentials, I hope.