Azure Backup Maximum Retention

This is a very short and quick post, but something I wanted to share nonetheless.

I got a call from somebody today looking at the potential of using Azure as a long-term store for infrequently accessed data. A StorSimple appliance is one obvious answer to the problem, but it was out of consideration in this instance, so we talked about using Azure Backup instead: this data doesn’t actually need to be accessible online, and an offline recovery to access it would be viable.

When I started using Azure Backup with the Windows Server 2012 R2 Essentials integration a number of years ago, Azure Backup was limited to 30 days of retention, but I knew this had been increased of late. Using the Microsoft Azure Backup client on my server, I looked to see the maximum value I could set for backup job retention, and the number that came out was 3360 days, which on a more sensible scale is about 9 years and 3 months.
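As a quick sanity check on that figure: depending on whether you count calendar years and 30-day months or round a little more generously, 3360 days comes out at nine years and roughly two to three months.

```python
# Sanity-check the 3360-day maximum retention figure, using
# 365-day years and 30-day months as a rough approximation.
MAX_RETENTION_DAYS = 3360

years, remainder = divmod(MAX_RETENTION_DAYS, 365)
months = remainder // 30

print(f"{MAX_RETENTION_DAYS} days ≈ {years} years and {months} months")
# prints: 3360 days ≈ 9 years and 2 months
```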

That’s quite a lot of retention, but sadly it still wasn’t enough for this requirement, so it was back to the drawing board. My problem aside, it’s good to see that Azure Backup now supports long-term data retention, and nine years and three months is long enough to meet most organisations’ retention requirements, including those in the financial sector.

Failed Windows Server 2012 R2 Essentials Azure Backup Integration

Just before Christmas, I upgraded my Windows Server 2012 Essentials server at home to Windows Server 2012 R2 Essentials. After re-deploying the server as R2, I re-configured my Windows Azure Backup and Office 365 integrations. Since re-configuring Windows Azure Backup, I’ve been having a problem with its integration with the Windows Server 2012 R2 Essentials Dashboard.

The Windows Azure Backup Integration is dependent on two things: the Windows Azure Backup Agent (cbengine) and the Windows Azure Backup Integration Service (WSS_OnlineBackupProviderSvc). The Windows Azure Backup Integration Service is in turn dependent on the Windows Azure Backup Agent.

With both services started, launching the Dashboard and accessing the Online Backup tab shows an empty view reporting No Data.

Windows Server 2012 Essentials R2 Dashboard Online Backup No Data

When this occurred, I observed that the Windows Azure Backup Integration Service would stop after launching the Dashboard. Restarting the service and the Dashboard did nothing except cause the service to crash again. This crash could be observed in the Application Event Log as follows:

Error .NET Runtime Event ID 1026
Application: OnlineBackupProvider.exe
Framework Version: v4.0.30319
Description: The process was terminated due to an unhandled exception.
Exception Info: System.NullReferenceException
Stack:
at Microsoft.WindowsServerSolutions.DataProtection.OnlineBackup.OnlineBackupJob.Equals(Microsoft.WindowsServerSolutions.DataProtection.OnlineBackup.OnlineBackupJob)
at Microsoft.WindowsServerSolutions.DataProtection.OnlineBackup.OnlineBackupProviderCore+<>c__DisplayClass46`1[[System.__Canon, mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089]].<GetOnlineBackupObjectUpdateList>b__44(System.__Canon)
at System.Linq.Enumerable.FirstOrDefault[[System.__Canon, mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089]](System.Collections.Generic.IEnumerable`1<System.__Canon>, System.Func`2<System.__Canon,Boolean>)
at Microsoft.WindowsServerSolutions.DataProtection.OnlineBackup.OnlineBackupProviderCore.GetOnlineBackupObjectUpdateList[[System.__Canon, mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089]](System.Collections.Generic.List`1<System.__Canon>, System.Collections.Generic.List`1<System.__Canon>)
at Microsoft.WindowsServerSolutions.DataProtection.OnlineBackup.OnlineBackupProviderCore.UpdateOnlineBackupData()
at System.Threading.ExecutionContext.RunInternal(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object, Boolean)
at System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object, Boolean)
at System.Threading.QueueUserWorkItemCallback.System.Threading.IThreadPoolWorkItem.ExecuteWorkItem()
at System.Threading.ThreadPoolWorkQueue.Dispatch()

Searching online for the issue turned up nothing, so I decided to report it on the TechNet community forum (http://social.technet.microsoft.com/Forums/en-US/eb718279-3da9-4544-9e0f-50b0ba440ef5/windows-azure-backup-integration-service-fails?forum=winserveressentials), where Pan Chen from Microsoft arrived with an unexpected answer.

The Windows Azure Backup Agent logs the status of backups and their success or failure to a separate event log in Applications and Services Logs\CloudBackup\Operational. Pan believed that an unexpected or corrupt event log entry was preventing the integration service from reading this event log properly.

I cleared the log file, restarted the Windows Azure Backup Integration Service and re-launched the Dashboard, and after some delay, presumably while the Dashboard pulled new data from the Azure Backup Agent, I am now able to see the status data in the Dashboard.

My personal feeling is that a bad event log entry shouldn’t cause this integration to fail, but suffice it to say, it looks like it does.
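To illustrate the failure pattern, here is a minimal Python sketch (not Microsoft’s code) of how a single unreadable entry can crash a first-match search like the FirstOrDefault call in the stack trace above, and how a defensive filter avoids it. The class and function names are my own stand-ins.

```python
# Minimal sketch of the failure pattern: a corrupt event log entry
# parsed as None blows up an equality check inside a first-match
# search, the Python analogue of the NullReferenceException in
# OnlineBackupJob.Equals during FirstOrDefault.

class Job:
    def __init__(self, name):
        self.name = name

def find_job_unsafe(jobs, target):
    # Crashes with AttributeError if a corrupt (None) entry is present.
    return next((j for j in jobs if j.name == target), None)

def find_job_safe(jobs, target):
    # Skips unreadable entries instead of letting one bad record
    # take down the whole update pass.
    return next((j for j in jobs if j is not None and j.name == target), None)

jobs = [Job("nightly"), None, Job("weekly")]  # None = corrupt log entry
print(find_job_safe(jobs, "weekly").name)  # prints: weekly
```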

Restoring Client Computer Backup Database in Windows Home Server 2011

Quite some time ago (probably about two months now), the primary drive in my Windows Home Server 2011 machine was giving me issues due to a new device driver I installed. Nothing got me going with ease: Last Known Good Configuration, Safe Mode, nothing. The problem lay in the fact that the server wouldn’t acknowledge that the OS disk was a boot partition, and after leaving it to attempt to repair the boot files by itself, which for the record I don’t think I’ve ever seen work, I took to it manually.

Launching the Recovery Console command prompt from the installation media, I tried the good old commands that have served me well in the past on Windows Vista and Windows 7 machines I had to repair, bootrec and bootsect, but nothing worked, so I was left with only one option: re-install the OS. I wasn’t concerned about losing personal data, which is stored on a separate RAID volume, but I was concerned about my client backups, which were stored on the same volume.

Using a USB attached hard disk, I manually copied out the Client Computer Backups folder, then rebuilt the operating system. I don’t keep active backups of the Home Server operating system because the Windows Server Backup utility in Windows Server 2008 R2 isn’t that hot: it doesn’t support GPT partitions over 2TB, which obviously is an issue.

Once installed, Windows Home Server sets up the default shares and folders, including Client Computer Backups. The critical thing here is that no client can start a backup to the server before you complete these steps: once a client starts a backup, the server creates new databases and files, ruining your chances of importing the existing structure.

From the new OS installation, open the directory where the Client Computer Backups live. The default location is C:\ServerFolders\Client Computer Backups, but I had moved mine to D:\ServerFolders\Client Computer Backups. Once you’ve found the directory, copy across all of the files previously copied out from the burnt install of Windows and overwrite any files when prompted.
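If you prefer to script the copy-and-overwrite step, a minimal Python sketch might look like the following. The source path is a hypothetical stand-in for wherever your USB copy lives; the destination is the default from this post.

```python
# Scripted version of the manual copy step: restore the saved Client
# Computer Backups files over the freshly created share, overwriting
# anything the new install put there. Adjust paths to your server.
import shutil
from pathlib import Path

SOURCE = Path(r"E:\Client Computer Backups")              # USB disk copy (hypothetical)
DEST = Path(r"D:\ServerFolders\Client Computer Backups")  # new share

def restore_backup_files(source: Path, dest: Path) -> int:
    """Copy every file from source into dest, overwriting duplicates."""
    copied = 0
    for item in source.rglob("*"):
        target = dest / item.relative_to(source)
        if item.is_dir():
            target.mkdir(parents=True, exist_ok=True)
        else:
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(item, target)  # copy2 preserves timestamps
            copied += 1
    return copied
```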

Once completed, restart the server. This will restart all of the Windows Home Server services responsible for running the Dashboard and the Client Computer Backups. Once the restart has completed, open the Dashboard and select the Computers tab where you normally view the computer health states and backups. On first inspection, it looks as though you have no clients and no backups, but look more closely and you will see a collapsed group called Archived Computers. Expand this group and you will see all of your clients listed, and all of their associated backups will be listed if you select the Restore Files option for a computer.

The thing to point out here is that these backups will remain disassociated from the clients. Once you re-add a client to the server and commence a backup, it will be listed as a normal computer, and the Archived Computer object for it will also remain listed. This is because the server generates GUIDs for the backup files based on a combination of the client identity and the server identity, and because the reinstallation of the operating system causes a new server identity to be generated, the GUIDs are different. This isn’t a problem for me, but I’ve read a number of posts on the TechNet forums at Microsoft where people have had trouble locating the Archived Computers group in the Dashboard interface and think they’ve lost everything, which clearly isn’t the case.
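The exact GUID scheme isn’t documented anywhere I can find, but a deterministic ID derived from both identities illustrates why a reinstall breaks the association. The use of uuid5 and the identity strings here are stand-ins of my own, not Windows Home Server internals.

```python
# Sketch of why a server reinstall orphans the old backups: if the
# backup GUID is derived from (client identity, server identity),
# a new server identity yields a new GUID for the same client.
import uuid

NAMESPACE = uuid.NAMESPACE_DNS  # arbitrary fixed namespace for the sketch

def backup_id(client_id: str, server_id: str) -> uuid.UUID:
    return uuid.uuid5(NAMESPACE, f"{client_id}/{server_id}")

old = backup_id("laptop-01", "server-install-A")
new = backup_id("laptop-01", "server-install-B")  # OS reinstalled

assert old == backup_id("laptop-01", "server-install-A")  # deterministic
assert old != new  # same client, new server identity -> new GUID
```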

Azure Backup for Windows Server 2012 Essentials

Last night, I posted saying that I think Microsoft had missed a trick in not taking advantage of the Windows Azure Cloud Backup features in Windows Server 2012 Essentials, and today it looks like I must eat a slice of humble pie.

After some reading on the subject this evening, it appears that Microsoft are actually incorporating it, but not natively. To access the feature, you need to install a plugin. A blog post on the Small Business Server TechNet Blog details the installation steps to get the plugin installed and working (http://blogs.technet.com/b/sbs/archive/2012/09/18/windows-azure-online-backup-and-windows-server-2012-essentials.aspx).

Users of Windows Server 2012 Essentials can get a free six-month trial of the service; however, information on pricing is hard to find and understand. There is nothing on the trial signup page offering an insight into what you will pay beyond the trial. Using the extremely complicated (and for good reason, given its capability and scale) Azure Pricing Calculator gives you a hint as to what you will pay, but I think Microsoft need to provide some clarity around the storage options.

Storage is offered in two different flavours, Geo Redundant and Local Redundant: the former sees your data replicated throughout the Azure global infrastructure, while the latter sees your data replicated only within your geographic region. I can’t seem to find anything that states whether either option is valid for the backup service, or if you must use a particular option. For context, Geo Redundant storage is £7.58 per month for 100GB, while Local Redundant is £5.64 per month for 100GB.

The two storage types have implications if you hold views on the United States and laws such as the Patriot Act. If you are precious about your data (you should be) and don’t want those authorities to be able to view it under law without your consent, which is essentially what the Patriot Act boils down to, then you may want to decide against the Geo Redundant option; after all, Local Redundant still gives you far more availability than your single on-site server. The region your data is stored in is determined by the country you select during registration, so make sure you set it correctly.

Compare the above prices to those of one of the most popular Windows Home Server cloud backup solutions, Cloudberry, and Azure directly looks good. For the same 100GB of storage, you will pay $9.30 a month for Amazon S3 or $12 a month for Google Cloud Storage, plus a $29.99 licence cost for the Cloudberry product.

The thing to be conscious of is this small catch: retrieving the data. Azure provides free unlimited inbound (upload) traffic, so you pay nothing to upload your backups, but downloads are priced per gigabyte per billing cycle. If your server were to fail and you needed to pull your 100GB of data back down once it was recovered, then in a single billing period you would pay £6.55 for the 95 billable gigabytes (the first 5GB is free), but the key thing to remember is that this is a one-time cost, if and when the server fails. The price also varies by geography: the figure I’ve shown is for US and European egress, and if you live in another location the price is £10.37 instead, so bear this in mind.
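To put rough numbers on a restore, here is a back-of-the-envelope calculation. The per-gigabyte rates are back-calculated from the £6.55 and £10.37 figures above, so check the current Azure pricing page before relying on them.

```python
# Back-of-the-envelope egress cost for a full restore. Rates are
# derived from the figures quoted in the post (95 billable GB after
# the 5GB free allowance), not from a current Azure price list.
FREE_EGRESS_GB = 5

RATES_GBP_PER_GB = {
    "us_eu": 6.55 / 95,   # ≈ £0.069/GB, US and European egress
    "other": 10.37 / 95,  # ≈ £0.109/GB, other regions
}

def restore_cost(total_gb: float, zone: str = "us_eu") -> float:
    billable = max(0, total_gb - FREE_EGRESS_GB)
    return round(billable * RATES_GBP_PER_GB[zone], 2)

print(restore_cost(100))           # prints: 6.55
print(restore_cost(100, "other"))  # prints: 10.37
```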

Looking at this as a home user and not an SMB, I think £5.64 a month is a very small price to pay for peace of mind that all of my family pictures and important documents can be protected to a much higher degree than I can manage at home with a Mirror Storage Space and an external USB or eSATA disk backing up the server on-site. From the perspective of an SMB, your data is your business, so only you can value what it is worth, but I would guess a lot. If you are an SMB without the luxury of a full-time IT professional or a well-managed agreement with a Microsoft Partner for supporting your environment, then I would guess that this service could one day prove invaluable.

Windows Home Server Backup: Wife Approval and the Potential

Last night I spent about two hours working on Nicky’s laptop, which she had somehow managed to get infected with a virus (or multiple viruses, should I say).

I tried loads of things to correct the wake of problems it caused, but I was having a hard time, so I contemplated using my investment in Windows Home Server and flexing its Recovery CD for fighting crime (or viruses).

I didn’t have to run the restore in the end as I managed to fix the problem, but it’s worth addressing just how wife-friendly Windows Home Server actually is. Let’s face it: if you’re a geek / tech-head with any interest in things like Home Servers, Media Centres and the like, you know it has to be wife-friendly or you will never get budgetary approval 🙂

Read the Full Post

Sky+ HD Multi-Room Shared Planner

I got some anonymous information last night from a friend about a service Sky are considering introducing here in the UK.

The service looks as if it’s going to be called Sky+ HD Multi-Room Shared Planner – What a mouthful.

The premise of it is that you have multiple Sky+ HD boxes in your house, and you can share recorded TV amongst those boxes throughout your house. The service also touts that you will be able to connect to your Sky+ HD boxes via your PC and access that content also.

Read the Full Post

Using NTBackup in Vista

So it’s official: for all of the nice new features and pretty things in Vista, the new backup utility sucks, especially if you’re interested in scheduling things from the command line.

The reason? The new Vista backup utility only allows you to back up entire volumes or specified mount points using their GUIDs, so what happens if you want to back up only a few selected folders?

The answer is to use NTBackup from Windows XP or Server 2003; however, this isn’t as simple as copying the ntbackup.exe file from the XP System32 directory.

Read the Full Post