

As with any storage system, a volume with deduplication enabled will become full when enough data is written to it. The amount of data a deduplicated volume can store depends on the underlying physical device, the level of deduplication in the dataset, and the ability of the system (I/O storage subsystem, available memory, CPU speed) to complete the deduplication optimization processing for the amount of daily data churn. A deduplicated volume can therefore become full when the deduplication savings percentage is not high enough, or when the dedup optimization job cannot keep up with optimizing the data churn. Note: there is no event log entry indicating that deduplication is not keeping up with the data churn and that the volume is running out of free space; the only event logged is "disk full", which is too late.
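
Because no event is logged until the disk is already full, the practical approach is to watch the savings rate and the deduplication job history yourself. The PowerShell lines below are a minimal sketch, not part of the original article: they use the standard Windows Server Data Deduplication cmdlets, the drive letter E: stands in for the deduplicated repository volume, and the exact properties shown can vary slightly between Windows Server versions.

# Capacity, free space and current savings for the deduplicated volume (E: is a placeholder).
Get-DedupVolume -Volume "E:" | Format-List Volume, Capacity, FreeSpace, SavedSpace, SavingsRate

# Last optimization and garbage collection runs, plus how many files are optimized versus in policy.
Get-DedupStatus -Volume "E:" | Format-List Volume, FreeSpace, SavedSpace, OptimizedFilesCount, InPolicyFilesCount, LastOptimizationTime, LastGarbageCollectionTime

# Deduplication jobs that are currently queued or running, with their progress.
Get-DedupJob

If the last optimization run is old, or the savings rate is low while free space keeps shrinking, that is the sign that the optimization job is not keeping up with the daily churn.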

To recover from this condition, it is necessary to free up space on the volume. When deduplication is involved, simply deleting files may not be sufficient; often, additional steps will be necessary to free up the space.

For dedup volumes using the HyperV or Backup UsageType, one method is to […]. Note that the exact amount of space freed is difficult to predict, because a garbage collection job must run to clean up references to deduplicated container data that is no longer referenced by existing files.
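
The garbage collection job mentioned above can be started from PowerShell once unneeded data has been deleted. This is a minimal sketch under the same assumption that E: is the deduplicated volume; a full garbage collection pass is more thorough than the default pass but takes longer to run.

# Reclaim chunk-store space that is no longer referenced by any file; -Full requests the deeper pass.
Start-DedupJob -Volume "E:" -Type GarbageCollection -Full -Priority High

# Watch the job until it finishes, then re-check free space on the volume.
Get-DedupJob -Volume "E:"
Get-DedupStatus -Volume "E:" | Format-List Volume, FreeSpace, SavedSpace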
