
Azure Backup Frequently Asked Questions (FAQ)

Published: September 7, 2012

Updated: September 30, 2014

Applies To: Windows Server 2008 R2 with SP1, Windows Server 2012

The following is a list of commonly asked questions about Azure Backup. If you have additional questions about Azure Backup, go to the discussion forum and post them; someone from our community will help you get answers. If a question is commonly asked, we will add it to this article so that it can be found quickly and easily.

Azure Backup has been tested on the following server platforms:

  • Windows Server 2012 R2 Standard

  • Windows Server 2012 R2 Datacenter

  • Windows Server 2012 R2 Foundation

  • Windows Server 2012 Datacenter

  • Windows Server 2012 Foundation

  • Windows Server 2012 Standard

  • Windows Storage Server 2012 R2 Standard

  • Windows Storage Server 2012 R2 Workgroup

  • Windows Storage Server 2012 Standard

  • Windows Storage Server 2012 Workgroup

  • Windows Server 2012 R2 Essentials

  • Windows Server 2008 R2 Standard SP1

  • Windows Server 2008 R2 Enterprise SP1

  • Windows Server 2008 R2 Datacenter SP1

  • Windows Server 2008 R2 Foundation SP1

  • 64-bit version of Windows Server 2008 Standard with Service Pack 2 (SP2)

  • 64-bit version of Windows Server 2008 Enterprise with Service Pack 2 (SP2)

  • 64-bit version of Windows Server 2008 Datacenter with Service Pack 2 (SP2)

  • 64-bit version of Windows Server 2008 Foundation with Service Pack 2 (SP2)

Caution
Azure Backup cannot be used with servers running the Server Core installation option of either Windows Server 2012 or Windows Server 2008 R2.

Any currently configured backups will be stopped. You will need to re-register the server with the backup vault, and it will be considered a new server by Recovery Services, so the first backup operation that occurs after registration will be a full backup of all of the data included in the backup, instead of just the changes since the last backup. However, if you need to perform a recovery operation, you can recover the previously backed up data by using the Recover from another server recovery option. For more information, see Rename a server.

The following list identifies drive types and whether they can be used with the Azure Backup service:

  • BitLocker-protected volume: Yes, but the volume must be unlocked before the backup can occur.

  • File system: Yes. NTFS is the only file system supported for this version of the online backup service.

  • Removable media: No. The drive must report as fixed to be used as a backup item source.

  • Read-only volume: No. The volume must be writable for the Volume Shadow Copy Service (VSS) to function.

  • Offline volume: No. The volume must be online for VSS to function.

  • Network share: No. The volume must be local to the server to be backed up using online backup.

The following list shows file and folder attribute types, whether they are supported, and the expected behavior of Azure Backup when it encounters them:

  • Encrypted: Supported. Changes in the file cause a full file transfer.

  • Compressed: Supported. Changes in the file cause a delta transfer.

  • Sparse: Supported. Changes in the file cause a delta transfer.

  • Hard links: Not supported. Skipped.

  • Reparse points: Not supported. Skipped.

  • Encrypted + compressed: Not supported. Skipped.

  • Encrypted + sparse: Not supported. Skipped.

  • Compressed + sparse: Supported. Backed up as a sparse file.

  • Compressed stream: Not supported. The stream is saved as an uncompressed stream.

  • Sparse stream: Not supported. The stream is discarded.
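The attribute combinations described above correspond to Windows file attribute flags. As an illustration only (this is not part of the Azure Backup agent, and the function name is hypothetical), the following Python sketch reads those flags on Windows (Python 3.5 or later) and reports how the list above says the file would be treated. Hard links are not reflected in these flags and are not checked here.

    import os
    import stat

    def describe_backup_handling(path):
        # Read the raw Windows file attribute flags for the file.
        attrs = os.stat(path, follow_symlinks=False).st_file_attributes
        encrypted  = bool(attrs & stat.FILE_ATTRIBUTE_ENCRYPTED)
        compressed = bool(attrs & stat.FILE_ATTRIBUTE_COMPRESSED)
        sparse     = bool(attrs & stat.FILE_ATTRIBUTE_SPARSE_FILE)
        reparse    = bool(attrs & stat.FILE_ATTRIBUTE_REPARSE_POINT)

        # Unsupported combinations are skipped by the backup.
        if reparse or (encrypted and (compressed or sparse)):
            return "Skipped"
        if encrypted:
            return "Backed up; changes cause a full file transfer"
        # Compressed, sparse, or plain files transfer deltas on change.
        return "Backed up; changes cause a delta transfer"

    print(describe_backup_handling(r"C:\Data\example.txt"))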

Deduplicated data is supported. In the current implementation, the agent service converts the deduplicated data to normal data when it prepares the backup operation. It then optimizes the data for backup, encrypts it, and sends the encrypted data to the online backup service.

Canceling a backup job does not remove data that has already been transferred. The backup vault stores the backed-up data that was transferred up to the point of cancellation. Azure Backup uses a checkpoint mechanism, so the backup data is checkpointed periodically during the backup and the next backup process can validate the integrity of the files. The next backup triggered will be incremental over the data that was previously backed up. This provides better utilization of bandwidth, so that you do not need to transfer the same data repeatedly.
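A rough analogy of how incremental selection works, purely as an illustration and not the actual Azure Backup protocol or file format: a local manifest acts as the checkpoint, so a subsequent job only considers files that changed since the last recorded state.

    import json
    import os

    MANIFEST = "backup_manifest.json"    # hypothetical checkpoint file

    def files_to_transfer(root):
        # Load the last checkpoint; if there is none, everything transfers.
        try:
            with open(MANIFEST) as f:
                manifest = json.load(f)  # {path: last known modification time}
        except FileNotFoundError:
            manifest = {}

        changed = []
        for dirpath, _, names in os.walk(root):
            for name in names:
                path = os.path.join(dirpath, name)
                if manifest.get(path) != os.path.getmtime(path):
                    changed.append(path)
        return changed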

The size of the cache folder is determined by the amount of data that you are backing up. In general, expect to allocate 10-15% of the space required for the backed-up data to the cache folder.
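As a quick worked example of that guidance, a minimal sketch (the 10-15% figures come from the paragraph above; the data size passed in is only an example):

    def cache_folder_estimate_gb(backup_data_gb):
        # Allocate roughly 10-15% of the protected data size for the cache folder.
        return 0.10 * backup_data_gb, 0.15 * backup_data_gb

    low, high = cache_folder_estimate_gb(500)    # e.g. 500 GB of protected data
    print(f"Plan for roughly {low:.0f}-{high:.0f} GB of cache space")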

This can occur when the backup schedule settings stored on the local server are not the same as the settings stored in the backup vault. The following are some possible reasons for this occurrence:

  • The local server has been recovered to a known good state.

  • Azure Recovery Services has recovered the settings to a known good state.

When either the server or the settings have been recovered, the backup schedules can lose synchronization. If this has happened, you should reconfigure the backup policy and then run Back Up Now to resynchronize the local server with Azure.

This error is sometimes caused when the Azure service has been recovered from a serious error. As a result of this recovery, your server settings might have been modified. If you see this error, you should re-register your server with the backup vault. After you have re-registered the server, all operations will resume according to your previously configured schedule.

Support for the current release of Azure Backup requires that you install Update Rollup 2 for System Center Data Protection Manager SP1 before installing the Azure Backup Agent. For full details on this update, see Update Rollup 2 for System Center Data Protection Manager SP1.

All the data that is backed up is compressed and encrypted before being transferred. If you are backing up a large amount of data, you can expect about 25% less data to be transferred because of the compression, though this might vary depending on the type of data being backed up. If you are backing up a smaller amount of data, you might see more data being transferred than the amount of data being backed up. This is because a fixed amount of data (around 35 MB) is transferred during the first backup, irrespective of the amount of data being backed up. Subsequent backups transfer only the changed or added data, so the difference in data size for those transfers is purely due to the effect of encryption and compression.
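As a rough worked example of those figures, a minimal sketch assuming the 25% compression savings and the roughly 35 MB first-backup overhead described above (actual results vary with the data):

    def first_backup_transfer_mb(data_mb, compression_savings=0.25, overhead_mb=35):
        # Compressed payload plus the fixed first-backup overhead.
        return data_mb * (1 - compression_savings) + overhead_mb

    print(first_backup_transfer_mb(10))       # small source: the overhead dominates (~42.5 MB)
    print(first_backup_transfer_mb(10240))    # 10 GB source: roughly 7,715 MB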

Any servers that are registered using the same certificate will be able to recover the data backed up by other servers that use that certificate. If you want to ensure that data can be recovered only by specific servers in your organization, use separate certificates designated for those servers. For example, human resources servers could use one certificate, accounting servers another, and storage servers a third. That gives you a way to control recovery by installing the appropriate certificates on the recovery servers.

In general, the backup data is sent to the datacenter of the Backup Service with which the server is registered. The easiest way to change the datacenter is to uninstall the agent, reinstall it, and register the server with a new Backup Service. Then, to avoid restarting the server, open the Services control panel and stop and restart the OBEngine service to reset the datacenter to which the data is being backed up.

The only limit that applies is to the data source you are backing up. Due to the way Azure is architected, each data source must be less than 1.65 terabytes (TB). If you have volumes on your server that exceed 1.65 TB and you want to back them up to the cloud, you can do either of the following (a sketch for checking the size of a selection follows this list):

  • Divide the large volume into smaller volumes prior to backup.

  • Pick and choose files and folders from the volume to back up so that the total amount of data being backed up from the volume is less than 1.65 TB.
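If it helps to check a selection before configuring the backup, the following sketch (not an Azure Backup tool; the path is only an example, and the limit is treated here as binary terabytes) totals the size of a folder tree and compares it against the 1.65 TB data source limit:

    import os

    LIMIT_BYTES = int(1.65 * 1024**4)    # 1.65 TB, treated as binary terabytes

    def folder_size_bytes(root):
        total = 0
        for dirpath, _, names in os.walk(root):
            for name in names:
                try:
                    total += os.path.getsize(os.path.join(dirpath, name))
                except OSError:
                    pass                 # skip files that cannot be read
        return total

    size = folder_size_bytes(r"D:\Shares\Projects")
    print(f"{size / 1024**4:.2f} TB", "OK" if size < LIMIT_BYTES else "over the limit")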

In release 2.0.8683.0, the storage limit was increased to 1.65 terabytes (TB). If you are not using version 2.0.8683.0 or later, your storage limit remains 850 GB. To increase the storage limit to 1.65 TB, see KB2989574 and install the hotfix.

You can have 25 vaults per subscription, and up to 50 servers per vault. In total, you can have 1,250 servers per subscription.
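The per-subscription total follows directly from those two limits:

    VAULTS_PER_SUBSCRIPTION = 25
    SERVERS_PER_VAULT = 50
    # 25 vaults x 50 servers per vault = 1,250 servers per subscription.
    print(VAULTS_PER_SUBSCRIPTION * SERVERS_PER_VAULT)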

At this time you cannot back up entire Azure Virtual Machines or perform a system state backup of Azure Virtual Machines using Azure Backup. However, you can back up all of the files and folders on the virtual machines using Azure Backup.

We are also developing a wiki on TechNet for errors and events encountered with the online backup service. All of our event and error message codes are listed in the Azure Backup Errors and Events List article, and each code will get an individual wiki entry. Please contribute your experiences as you encounter specific errors and events.
