General

A catch-all category for posts that either aren’t specific to a Microsoft technology with its own dedicated category, or aren’t based on a Microsoft technology at all, as rare as those are.

TP-Link TL-SG3210 Switch Review

Following on from my post Good Enough for a Network Engineer, I thought I would take the time to review the TP-Link TL-SG3210 eight-port gigabit switch that I purchased about three weeks ago.

The switch is actively in use in my home network, replacing my Cisco 2950T access layer switch, and I have to say it’s fantastic, with a few caveats.

The switch lives in my study as my access switch, serving my desktop PC and a pair of ports into the bedroom for the Sky box, Xbox and anything else I may want networked in there. It also serves as the access for our Vonage VoIP phone gateway, as the internal phone wiring master socket is in the study too, making it easier to connect downstream phones from here.

The first thing you notice about the TL-SG3210 is its size. For an eight-port switch, it’s pretty big, measuring just shy of 12 inches wide. For this reason, TP-Link actually supply it with 19″ rack adapters for people who may wish to use it in a rack mount scenario. For your £80, you get an IEC C13 kettle-plug type power input on the rear, one RJ-45 console port on the front, along with eight 1000Mbps gigabit RJ-45 ports and two SFP slots which should accept all industry-standard SFP (mini-GBIC) modules. TP-Link sell their own range of modules; however, one omission in their range is 1000Mbps RJ-45 SFPs, so you would have to try Cisco, HP or another brand if you wanted to use the two SFPs as your trunk ports to upstream switches.

The second thing you notice is the volume: none, nada. The switch is totally silent, being passively cooled, which is fantastic for my study-cum-home office. My previous Cisco 2950T switch is quoted at 47 dB on the Cisco product specification; add a decibel or two for dust and the age of the fans.

Start-up and restart of the switch takes about two to three seconds, which is really fast if ever you need to reboot it. Configuration is simple thanks to the web admin interface, although TP-Link provide console, Telnet and SSH access too via a Cisco-like CLI. The CLI syntax is fairly close to Cisco’s, with differences subtle enough to keep TP-Link out of infringement territory but close enough that, with the Tab key, most users who know Cisco IOS could tab their way through completing the commands.
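To give a flavour of the CLI, here’s a rough sketch of creating a VLAN and assigning a port to it. I’m writing this from memory of the IOS-like syntax, so treat it as illustrative: the exact keywords (particularly the switchport commands) may differ slightly between firmware revisions.

```
TL-SG3210>enable
TL-SG3210#configure
TL-SG3210(config)#vlan 10
TL-SG3210(config-vlan)#name Voice
TL-SG3210(config-vlan)#exit
TL-SG3210(config)#interface gigabitEthernet 1/0/2
TL-SG3210(config-if)#switchport access vlan 10
TL-SG3210(config-if)#end
TL-SG3210#copy running-config startup-config
```

If you know IOS, every one of those lines should feel familiar, right down to saving the running config at the end.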

The web interface is good and easy to navigate. My only problem with it was that configuring VLANs and assigning them to ports wasn’t as obvious as I would have liked. Creating port channel groups (LAGs) is easily achieved, although one item to note: I like to hard-set my LAG ports to the required interface speed, and changing a port from a standard port to a LAG port resets its speed and duplex back to Auto, leaving you to force it back again.

My only real complaint with the switch relates to firmware updating. After configuring a few bits and pieces on the switch, I noticed the option for firmware update and checked the TP-Link website to find an update available. I downloaded and installed the update, only to lose access to the switch afterwards. It appears that updating the firmware resets the switch to factory defaults, so I had to re-configure my machine with a static IP in the 192.168.0.0/24 range to access it and configure it again.

Performance-wise, I connected two machines, a desktop and a laptop, to the switch. One of the machines has an SSD, the other conventional SATA HDD disks. I performed a file copy from the SSD machine to the HDD machine and the transfer speed was sustained at 74MB/s (megabytes per second), which to me looks to be the limitation of the disk and disk subsystem and not the switch. With two SSD machines, it wouldn’t surprise me if I could max out the gigabit link at its practical ceiling of somewhere around 110MB/s or more.
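To put that 74MB/s figure in context, here’s a quick back-of-the-envelope calculation of what a gigabit link can actually carry. The ~94% efficiency figure is my own assumption for typical Ethernet framing plus TCP/IP header overhead on a sustained large-file copy:

```python
# Theoretical and practical ceiling of a 1Gbps link in MB/s (megabytes).
line_rate_bps = 1_000_000_000                 # gigabit signalling rate
raw_mb_per_s = line_rate_bps / 8 / 1_000_000  # bits -> bytes -> megabytes

# Ethernet frames and TCP/IP headers consume roughly 5-6% of the line
# rate, so assume ~94% efficiency for a sustained large-file transfer.
practical_mb_per_s = raw_mb_per_s * 0.94

print(round(raw_mb_per_s))        # raw ceiling: 125
print(round(practical_mb_per_s))  # practical ceiling: ~118
```

A sustained 74MB/s is therefore well below what the link can do, which supports the conclusion that the HDD, not the switch, is the bottleneck.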

I haven’t fully explored all the features as they are beyond my needs, but some of them include DSCP and QoS configuration, port security, 802.1x authentication, Layer 2 to 4 firewall, switch clustering and more.

Conclusion?

For general home use, this switch is totally over the top and I would instead suggest the TL-SG1008D, which is an unmanaged eight-port gigabit switch without the SFP slots. For IT pros at home and power users, this switch is fantastic. For £80 you can’t beat the fact that you are getting (including the SFPs) ten ports of gigabit Ethernet without wasting any of its watts on noise and cooling. It supports so many features that it quite frankly makes Cisco and other high-end brands look woefully overpriced and under-specified; the Cisco 2960 Express, which is an analogous form factor and targets the same sort of market, is over £500 and only allows you to configure firewall policies up to Layer 2. Based on just these comments, I couldn’t recommend this switch highly enough.

For small businesses, on the other hand, I would not recommend this switch, on the basis that updating the firmware totally factory resets its configuration, which could leave the uneducated types stuck wondering what is wrong and why they no longer have access to any network resources. With that said, that only applies if you are using VLANs and your native VLAN isn’t the switch’s default VLAN of 1. If you aren’t using VLANs, or you are but your native VLAN for access devices is VLAN 1, then by all means, purchase away.

 

Good Enough for a Network Engineer

In my home currently, I have three main areas of tech: the garage, which hosts my home-built rack with my firewall, switch and home server; the study, where my desktop and our Vonage phone gateway live; and lastly the living room, where the HTPC media center lives.

All of this is interconnected with two Cisco 2950T L2 switches, which are 10/100 switches with a pair of gigabit ports for good measure, and a Cisco Aironet 1100 access point for wireless. Downstairs, I use the gigabit ports on the core switch for the home server, which is connected via a dual-port Intel server adapter in a static 2Gbps team to ensure that there is sufficient bandwidth available for multiple clients accessing media content, leaving everything else to run at 100Mbps.

I’ve long been toying with the idea of a gigabit upgrade for the home, including a new 802.11n access point to lift the wireless speeds from their current 802.11g 54Mbps. Being an enterprise-grade gear geek, I love having Cisco in my home. The performance meets and mostly exceeds home gear on a 100Mbps port-by-port basis and the reliability is amazing (prior to a planned power-down this week to install a module in my UPS, my core switch had over 300 days’ uptime), but this all comes at a cost: a financial one and a feature one.

To get me the gigabit ports I so crave at the core, I’m looking at either a Catalyst 2960 switch or a Catalyst 3560G switch. The 3560G is preferred, in part because it gives me Layer 3 routing on the LAN side as opposed to doing router-on-a-stick with the firewall to traverse my VLANs, but also because it’s an older model now replaced by the 3750 and 3750v2 switches, making it marginally cheaper (although the 3560 series, including the 3560G, still holds an incredible price purely because they are among the most commonly deployed enterprise switches). For upstairs on the access switch, I’m looking at a Catalyst 2960 Express to let me downsize my access layer port count, as a 24-port switch for my study is crazy, but at the time it served the requirement for LACP port channelling and price. For the wireless, I’m looking at an Aironet 1140 Series.

When you price up the best of the used prices online for this gear, it’s frightening: £400-500 for the 3560G, £400 for the 2960 Express and £150-250 for the Aironet 1140 Series, totalling up to around £1,150, something I simply cannot afford or justify for a four or five user home network, even if feature-rich reliability and stability are critical to me.

After hearing my tales, a network engineer in our office introduced me to a company called TP-Link, whose kit he uses in his home and says is good. For an engineer who normally deals in the realms of Cisco, RSA and the other networking and security big boys, granting TP-Link the accolade of being good must surely mean they are worth a look?

TP-Link have a nice range of products, and they actually match, if not slightly exceed, Cisco on feature set when comparing like-for-like models, but best of all is their price. For a cool £300, I can get a brand new, Amazon retail priced TL-SG5428 24-port gigabit switch, a TL-WA801ND 300Mbps 802.11n wireless access point and a TL-SG3210 8-port gigabit desktop switch. For the most part, Amazon prices are actually cheaper than eBay prices for TP-Link kit.

So how do they actually stack up? I’ll start by comparing the switches. TP-Link switches are all fanless, which means that the noise from the stack in my study will drop to nil, and the garage’s will probably be cut by two-thirds, as the switch is currently the loudest item there at 41dB for the 2950T. Features I use and rely on, such as MAC port security, QoS mapping for voice and ACLs, all exist in TP-Link land; in fact, TP-Link offer Layer 2 through 4 ACLs on their Layer 2 switches, compared to Cisco, who only give you Layer 2 MAC-based ACLs on their Layer 2 switches. Management options include an IOS-like CLI, web, SNMP and RADIUS, allowing me to manage the switches in the same way I do currently. Network features like LACP, port trunking, port mirroring and more are all present on the TP-Link side too.

For the desktop switch, there is actually no feature loss compared to the rack-mount 24-port model. All of the features listed across the two models compare equally, which means I won’t suffer for taking a step down to a desktop switch from the current rack mount.

On the wireless front, my current Aironet 1100 access point supports PoE, which I’m using in the form of an inline injector; the TP-Link ships with one, whereas I had to buy my current Cisco one separately. All the usual wireless access point features exist on the TP-Link access point too, such as multiple SSIDs, VLANs, a detachable, replaceable antenna, 802.11d and 802.11i, and all the management options such as the IOS-like CLI, web, SNMP and RADIUS again.

The feedback from our network engineer has been that the throughput of the switches and their reliability are both top notch, and he’s had no complaints since buying the switch many months ago, nullifying the concern I had there.

The conclusion, then, is that the age-old adage that nobody got fired for buying Cisco may stand true, but it looks as though you might not get fired for buying TP-Link either. Frankly, I was concerned over how you can even design and manufacture a 300Mbps N access point for £35 and a 24-port rack mount gigabit switch for £200, let alone sell them and turn a profit, but the fact that TP-Link can and do, and do it so well, means I’m clearly paying for a badge that my home network doesn’t demand. It also means that my home network can stop suffering the two-generations-old-only mantra that currently prevails: instead of only being able to justify buying two or three generation old Cisco equipment on feature and price, I can buy something bang up to date, giving me the gigabit I have for so long wanted and needed.

Time will tell, as I’m not going to be replacing everything overnight but will instead stagger my upgrades throughout the 2013 calendar year; still, I’ve got strong optimism for the switch. The best part is that it will be largely free, as the resale values of my old Cisco kit on eBay will cover 99% of the cost of the new kit. Who said there is no such thing as a free lunch?

Learning Styles from Scott Lowe

I had the privilege of attending VMUG UK at the National Motorcycle Museum this week for the second year running. I felt like I got more out of it this year, in part due to the fact that I wasn’t spending half the time overwhelmed by attending my first proper IT conference (albeit a local free one).

One of the interesting things to come out of it for me was the closing keynote from Scott Lowe, entitled “Stay Sharp and Relevant in IT”. During this keynote, Scott talked about topics he feels IT Pros should be learning over the next twelve months in order to stay relevant. What was interesting for me wasn’t actually the subject matter itself, but the chatter before it surrounding learning.

All my IT life, I’ve studied and obtained my certifications predominantly using books and self-study, a hard thing to do for someone who’s never really liked reading much. I took some time last night to take some online tests and assessments to try and work out my learning style, and wow: it’s amazing how you can see yourself described in print when doing these tests.

Based on the results of the four or five different tests I took, I come out every time as a kinesthetic learner. This means that I learn best from the act of doing: hands-on interaction, touching things. When I think about how I go about my daily life and compare it to the information online relating to kinesthetic learning, it all just fits into place.

Take this report at Dirjournal for example (http://www.dirjournal.com/guides/study-tips-for-kinesthetic-learners/). They say that kinesthetic learners are likely to fidget and can’t sit still for long periods, and when forced to do so, often resort to tapping or bouncing their feet and legs, and like to fiddle and play with inanimate objects like pens or pencils when in classes. These two attributes alone describe exactly my behaviour when at work at my desk or when in classes learning new topics. Kinesthetic learners also find it hard to read materials, and are more attracted to reading or looking at content when it is visually appealing, which explains my passion for good design and striking documentation and materials.

So now that I’ve identified how I learn best and what I should be doing to aid my learning, I think it’s time to cut back on the books and get my hands dirty with more home labs and play time, and hopefully, that will give me the motivation and inspiration to learn more, faster and to be a better IT Pro.

I want to thank Scott for his keynote at VMUG for making me stop and assess my learning style, identify it and help me to help myself. Thanks Scott 🙂

Configuring VMware Data Protection Appliance (VDP)

Yesterday, I had the chance to stand up a VMware Data Protection (VDP) appliance virtual machine. This is the replacement for VDR in vSphere 5.0 and upwards and is based on the EMC Avamar product.

When configuring the VDP appliance for the first time using the https://hostname:9543/vdp-configure URL, you are asked for credentials to connect to vCenter. The interface tells you that this is used to register with vCenter, but what it doesn’t tell you is that this account is then used for ongoing access to vCenter and for creating and deleting the snapshots on the VMs it is backing up. Therefore, make sure you use a service account and not your own account.

There is also a requirement for this account to be directly permissioned in vCenter and not via a nested group membership, so make sure you do this before attempting the registration.

SkyDrive Limits Update

Today, Microsoft released its latest revision to the Windows Live SkyDrive file hosting and sharing platform. As a result of the change, the user storage limits have been reduced from 25GB to 7GB. If you have an existing Windows Live ID, then as a loyalty reward, you can get a free upgrade to 25GB, allowing you to keep your existing storage limit.

To do this, simply go to SkyDrive at http://skydrive.live.com, log in as normal and select the Upgrade banner near the top of the page. Hurry though, as this is being reported as a limited time offer.

HTML Webpart to Hide Quick Launch in SharePoint

I’m currently running a test lab for the purposes of development of a Version 4 Master Page and CSS Stylesheet for SharePoint 2010 to replace the legacy Version 3 UI we are using at work. Whilst developing my flash new fixed width master page and layout I wanted to be able to hide the quick launch on the homepage so that I had the full width of the layout to give the site an eye-catching look.

Using the HTML Webpart we are able to inject some inline CSS Styles which do this for us.

Add the HTML Webpart anywhere on the page that you want to hide the Quick Launch on and add the following source code to it:

<style type="text/css">
    /* Hide Quick Launch on Homepage */
    #s4-leftpanel {
        display: none;
    }
    .s4-ca {
        margin-left: 0;
    }
</style>

Make sure you configure the Webpart as Modeless so that the users don’t see the title of what is essentially a blank Webpart shown in the interface, and that’s it.

Migrating Saved Games to Xbox 360 Cloud Saved Games

So you’ve been playing on your Xbox 360 for some time and you’ve built up a collection of saved games, all stored locally on your console’s hard drive. You’ve heard about the new Cloud Saved Games feature in the new Xbox dashboard update and want to be able to transfer (move, migrate, whatever you want to call it) your existing saves there for anywhere access?

The process is fairly painless and easy to complete, however it would have been nice if it was automated as part of enabling the Cloud Saved Games service. There is a gotcha to be careful of, but once you take it into account it’s plain sailing.

First off, you need to enable the Cloud Saved Games feature. You can do this by following my previous post Enabling Xbox 360 Cloud Saved Games at http://richardjgreen.net/2011/12/08/enabling-xbox-360-cloud-saved-games/.

Once you have enabled the Cloud Saved Games feature, do the following:

  1. Navigate to the Settings tab and select System.
  2. From System, select Storage, and from Storage, select Hard Drive to see the locally saved content.
  3. Within Hard Drive, select Games and Apps.
  4. Highlight a game that you want to migrate to the Cloud Saved Games service, and Press Y (Game Options).
  5. From Game Options, select the Move option, whereby you will be presented with a list of available storage devices. Select Cloud Saved Games and your game saves will be migrated across.

The migration process will detect which files are game saves and which are updates, DLC and other non-save content. Using Forza Motorsport 4 as an example, with the installed files it uses 3.3GB of hard disk space; however, with a 500MB limit on your Cloud Saved Games service, you will obviously not be able to store all of this online. Fortunately, because the save files are detected for you, only the 20-30MB save file is actually moved.
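For a rough feel of how far the quota stretches, here’s a quick sketch using the numbers above (the 20-30MB range is just what I observed for Forza; other games will vary):

```python
# How many game saves fit in the 500MB Cloud Saved Games quota?
quota_mb = 500
smallest_save_mb, largest_save_mb = 20, 30  # observed Forza save sizes

print(quota_mb // largest_save_mb)   # worst case: 16 saves
print(quota_mb // smallest_save_mb)  # best case: 25 saves
```

So even at the larger end of the save sizes, there is comfortable room for well over a dozen games’ worth of saves.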

This does mean that if you roam onto a friend’s or another person’s console without that game already installed (and you have taken your game disc with you to play on), you will have to install the content first, but being able to have your save follow you is what is important and useful here.

The gotcha I mentioned earlier relates to multi-player consoles. In my household, the wife and the kids use the console too. In my case, Dance Central 2 has saves for four people within it. Selecting the Move option against the entire Dance Central 2 container would migrate everyone’s saves to my cloud and would also grant me ownership of those saves, preventing the others from accessing their own.

In these instances, you will need to do the following:

  1. Drill into the game itself by selecting it with the A button.
  2. Highlight your own personal save file (the save file will show the Gamertag of the player on the right beneath the file size) and select it with the A button.

You will now have an option to move the save to the Cloud Saved Games service, and this will only move your own save without affecting those of other players. I’m hoping that a future update might resolve this gotcha by detecting the ownership of saves and, as such, only moving your own personal files, but time will tell on this one.

Enabling Xbox 360 Cloud Saved Games

One of the new features included with the Xbox 360 Dashboard update this week is the ability to store your saved games in the new Cloud Saved Games service. The service is free to Xbox LIVE Gold subscribers and allows you up to 500MB of storage for your game saves.

Enabling the feature is simple and is done as follows:

  1. Login to your console using your Windows Live ID (Xbox LIVE Gamertag).
  2. Scroll to the Settings tab on the new dashboard, and select System Settings.
  3. Within System Settings, select Storage.
  4. From Storage, highlight Cloud Saved Games and Press Y on the controller (Device Options).
  5. Select Enable Cloud Saved Games.

You’re done.

Unfortunately, this feature isn’t enabled by default for Xbox LIVE Gold subscribers, which I think it should be, and I also think that as part of the dashboard update, you should be prompted upon first login to migrate your saves to the cloud; however, it’s possible this may come in a later update.

With the shoe on the other foot, however, I can see Microsoft’s dilemma. Storage isn’t free in the cloud (contrary to the belief of many). Disabling the feature by default and not automatically prompting people to use the service allows them to under-provision storage, reducing cost, because your local hard disk doesn’t cost Microsoft anything compared to cloud storage.

Although the Cloud Saved Games feature has been advertised by people like Major Distortion and others online, I think it’s been pretty under-played compared to the dashboard update itself, or the Xbox Companion App for Windows Phone 7 and iOS. It’s a shame, because the feature is really powerful and adds a new dimension to the console: being able to ‘carry your saves around with you’.

Failure with Concessions

Today wasn’t the greatest day for me in one respect. Unfortunately, I flunked my 70-236 Configuring Exchange Server 2007 exam for the second time, and strangely, with months more Exchange experience under my belt (and I mean that, because we’ve faced our share of issues and undertaken our share of mini-projects in infrastructure engineering since my last attempt), and with loads of preparation, I actually scored roughly 75 points lower than on my first attempt.

I purchased a three-exam pack through Prometric earlier in the year which expires December 31st 2011, so I’ve still got to try and get two exams passed before the end of the year, with MDOP being my next exam and the third still undecided. Exchange had better look out though, as once I’ve done the two remaining in the pack, I’ll be going back for my MCITP for Exchange Server 2007 and 2010.

The concession in all of this is a feeling of self-enlightenment. Tomorrow, my trusty laptop will be going back to the office from home so that I can re-deploy it with SCCM to install my yummy new SSD (I would clone the disk, but I have a feeling BitLocker might not take too kindly to that). To make sure I didn’t lose any data, I hooked up my VPN this evening and made sure that all of the data on my laptop was safe and sound on the file servers at work, and then I turned my attention to OneNote.

I’m an avid OneNote user, and will use it over written notes whenever I possibly can. Being a Windows Phone 7 user, I also enjoy the OneNote integration in the phone, giving me super access to my personal notes. I quickly realised that, through the course of migrating between various working practices at work, I had one notebook in my SharePoint 2010 MySite, another locally on the laptop, and a third in the Windows Live SkyDrive cloud. I’ve just combined them all into my SharePoint 2010 MySite notebook and I feel great for it.

Unification for the win 🙂

Error Opening Excel 2007 and 2010 Documents in SharePoint 2010

Last night we completed a SharePoint 2010 upgrade at work and, after all the testing, we deemed the upgrade a success; however, coming into the office this morning, we received reports from some users that they were unable to open some of their Excel spreadsheets stored in various Document Libraries.

After some diagnosis, it turned out that the problem only affected Office 2007 and Office 2010 XML format documents, and that original-format Excel documents from Office 2003 and documents saved in the 2003 format were unaffected.

After initially suspecting the problem to be linked to the new Excel Services Application in SharePoint 2010, I completed the configuration of the Excel Services Application, which we had previously left unconfigured as it was not currently required; however, the problem persisted.

Whilst searching TechNet for the error code we were receiving, I encountered a page entitled “Configure the Default Behaviour for Browser-Enabled Documents” (http://technet.microsoft.com/en-us/library/ee837425.aspx), which details how to manage the behaviour of SharePoint when launching web-compatible documents.

SharePoint 2010 features various web-enabled services and can be configured to use Office Web Apps, a hosted version of the applications available via Office Live. The default behaviour for SharePoint 2010 is to attempt to launch web-compatible formats using the web-based application; however, as this was not configured in our environment, the error appeared.

The resolution to the problem was simply enabling the Site Collection Feature Open Documents in Client Applications by Default. Once it was enabled on the Site Collection, applying the setting to all child sites, SharePoint began prompting users to open the file with their client-side installation of Excel, as per the SharePoint 2007 behaviour.