Project Home Lab: Servers Built

So in the last post (which I wrote in April but only posted a few minutes ago), I talked about some of the elements of the build I had done thus far. Well, this past weekend I finished the builds bar a few small items, and I'm glad to see the back of it, to be honest. Here are the pictures to show the pretty stuff first, then I'll talk effort and problems.

Server Build in Pictures

WP_20141222_001

The image above is a top-down view of the 3U Storage Server and you can see it in all its finished glory. It looks quite barren inside the case and that's totally the look I was aiming for, keeping the space free to give me oodles of options to expand it in the future should I need to. The braided cables, which after much, much effort I'm still not quite 100% happy with (95% there, really), clean it all up.

WP_20141222_002

This is a close-up of the front edge of the motherboard where the on-board LSI SAS2008 ports live, which I spoke about being problematic in my previous post. After the first chassis was built I knew what was needed, and it all went in fairly painlessly; luckily these SAS SFF-8087 multilane cables are quite flexible. The black braid on the LSI cables matches the braiding I used on the ATX cabling, which makes the consistency monster in me happy.

At the top of the image, you can see a bundle of cables zip-tied together running from left to right. These are the chassis connectors for the power button and LED, NIC activity lights and so forth. They run off to the left of the shot and connect to the pin headers on the motherboard. This is one part of the build I'm really happy with, because the cables fitted nicely in the gap between the chassis and the motherboard, keeping them tidy and out of the way.

WP_20141222_005

Nothing super exciting here, but this is the Intel PRO/1000PT Low Profile Quad Port network adapter that features in both the 2U Hyper-V Server and the 3U Storage Server with the difference being that the 2U server uses a half-height back plate and the 3U server uses a full-height back plate. No butchery required as I managed to get used versions of both with the correct plate from eBay.

You can also see here the white cables going from left to right. These are the front-access USB port connectors, which plug into a pin header just behind the network adapter. I've installed the network adapter in the left-most-but-one PCIe slot. This keeps it as far away from the CPU as possible to avoid heat transfer between the two, whilst giving the adapter a bit of room to breathe as its passive heatsink is on the left side.

WP_20141222_004

This last shot shows where all the effort has gone in the build for me personally, and what has taken me so long to get it to completion. The original ATX looms that came with the case were over 70cm long, and finding somewhere to hide that much excess cable in a tight chassis wasn't going to be easy or efficient. There are three looms all told: one for the 24-pin ATX connector, one for the dual 8-pin EPS connectors and the chassis fans, and a third and final one for the drive enclosures.

The reason I am only 95% happy with these is that, in hindsight, I would have considered putting half the drives on the EPS channel and the other half on the same channel as the chassis fans. What I have got does mean that the drives get an entire 12V rail to themselves, which is good in one respect. Wiring the 24-pin ATX connector was by far the hardest part: crimping 24 pins onto cables and then squeezing it all inside the paracord before heat-shrinking the ends was a challenge for sure. In hindsight, I should have found a local electrical company capable of such wiring work and paid them to do it. Even at £20 or £30 per chassis, it would have been worth it for the time and effort saved on my part.

Outstanding Items

So the only items outstanding are some disks. I didn't talk about disks in the shopping list as I was kind of undecided about that part, but the decisions are made now and I just need to finalize some bits.

I was considering the option of using the on-board USB 3.0 port to install Windows Server 2012 R2 on the servers to give me the maximum number of disk slots for data, but I didn't like the fact that I only had a single USB 3.0 port on-board, so there was no option to RAID the USB. A dual-port SD card controller would have been excellent here, but they are only really seen on super high-end motherboards shipping today. Secondly, whilst USB boot for Hyper-V Server is supported, it appears it's not supported for Windows Server, and as I wanted to keep the design and configuration as production-capable as possible, that was out of the window too.

The final decision has led me to using a pair of Intel 520 Series 240GB SSD drives in a RAID1 mirror for the OS in both the Storage Server and the Hyper-V Server, with all the drives connected to the on-board LSI SAS2008 controller running in IR mode (Integrated RAID), but more on this in the configuration post.

For the Hyper-V Server, these two disks are the only disks installed, as no VM data will reside on the server itself. For the Storage Server, I have another four Intel 520 Series 240GB SSDs and two 3TB Western Digital Red drives, which will make up the six-disk Tiered Storage Space. I have two of the SSDs installed now and the other two are going back to Intel tomorrow.

The two SSDs going back to Intel appear to be fried and won't even get detected by the system BIOS or the LSI SAS BIOS. The two 3TB Western Digital Red drives are currently in my Home Server; I have two 5TB Red drives waiting to go into it, at which point the 3TB drives will move out of the Home Server and into the Storage Server.
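For the curious, the Tiered Storage Space side of things boils down to a handful of PowerShell cmdlets once the disks are visible to Windows. This is only a rough sketch ahead of the configuration post: the pool and tier names are my placeholders and the tier sizes are illustrative rather than what I'll actually end up using.

```powershell
# Rough sketch of a two-tier (SSD + HDD) mirrored Storage Space on
# Windows Server 2012 R2. Names and sizes below are placeholders.

# Grab every disk that's eligible for pooling (i.e. not the OS mirror)
$disks = Get-PhysicalDisk -CanPool $true

# Create the pool on the Storage Spaces subsystem
New-StoragePool -FriendlyName "LabPool" `
    -StorageSubSystemFriendlyName "Storage Spaces*" `
    -PhysicalDisks $disks

# Define the SSD and HDD tiers within the pool
$ssdTier = New-StorageTier -StoragePoolFriendlyName "LabPool" `
    -FriendlyName "SSDTier" -MediaType SSD
$hddTier = New-StorageTier -StoragePoolFriendlyName "LabPool" `
    -FriendlyName "HDDTier" -MediaType HDD

# Carve out a mirrored virtual disk spanning both tiers
New-VirtualDisk -StoragePoolFriendlyName "LabPool" `
    -FriendlyName "TieredSpace" `
    -StorageTiers $ssdTier, $hddTier `
    -StorageTierSizes 400GB, 2.5TB `
    -ResiliencySettingName Mirror
```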

The log jam right now is the Home Server. It currently lives in an older-generation X-Case 4U chassis and, as part of Project Home Lab, it is moving house into one of the 3U chassis to match the Storage Server. I've got a lot of data on the Home Server, so taking backups of everything and finding the right moment to take it all offline and move it is tough with a demanding wife and kids trying to access the content on it.
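The backups themselves are nothing clever: a robocopy mirror pass to spare external storage does the job. The paths below are hypothetical, purely to show the shape of it.

```powershell
# Mirror the Home Server shares to a backup location before the move.
# Paths are hypothetical; note /MIR deletes files at the destination
# that no longer exist at the source, so point it at a dedicated root.
robocopy "D:\Shares" "\\backup-box\HomeServer\Shares" /MIR /COPYALL /R:1 /W:1 /LOG:"C:\Logs\homeserver-backup.log"
```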

Up Next

In the next post, I will talk about some of the things I’ve found and done in the initial configuration of the hardware such as the BIOS and the IPMI.
