Dealing with large users and backup timings

Backup creation is slow due to large sizes

If your backups end up being very large and take a long time to create, you have a few options to reduce the load on your system:

  1. Do backups less frequently. Doing backups only once per week, on the slowest day of the week, will reduce the load on the other days.

  2. Only back up some Users one day, other Users another day, etc. Spreading out the backup load for all accounts over the span of a week (or longer) will reduce your system load.

  3. You can disable some areas of the backups if you don't need them, or if you want to use other backup methods (like rsync) for those areas: Skip IMAP Data, Skip Databases, Skip the /home/user/domains directory (rsync would be a good replacement for this), and Skip the home.tar.gz (other files in the User's home).

  4. Relating to #3 above, newer versions of DA let you choose which areas you'd like to back up directly in the backup options.

Admin Level -> Admin Backup/Transfer -> Create Backup

"Step 4: What" of the backup creation has options so you can pick what areas are to be included in the backup.

If Users have an over-sized /home/username/domains directory, you can use this technique to skip the "Domains Directory" option in the backup, and use rsync to transfer the Domains Directory. Related guide: [/directadmin/backup-restore-migration/migrations#pulling-data-from-a-remote-directory-with-rsync]
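As a rough sketch (the hostname and username below are placeholders, and this assumes root SSH access between the two boxes), pulling just the Domains Directory with rsync could look like this:

rsync -av --numeric-ids -e ssh \
    root@old.server.example.com:/home/username/domains/ \
    /home/username/domains/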

  5. A more complex option is to move data under the User's home directory into a folder that won't be backed up. A list of these directories can be found here. For example, say the User has a path with a lot of static images that don't need regular backups:

/home/user/domains/domain.com/public_html/images

The idea is to move the images into a skipped folder, e.g.:

cd /home/user
mkdir var
chown user:user var
cd var
# copy (rather than move) for now, preserving permissions, so the site keeps working
cp -Rp ../domains/domain.com/public_html/images .

which will now give you /home/user/var/images, which won't be included in the backup.

Next, move the current folder out of the way, and then link to it:

cd /home/user/domains/domain.com/public_html
mv images images.old
# the symlink resolves to /home/user/var/images, which the backup skips
ln -s ../../../var/images .

and test to ensure everything works. Once you're satisfied, you can delete the images.old directory.
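A quick way to check the result (the paths match the example above):

cd /home/user/domains/domain.com/public_html
ls -ld images        # should show: images -> ../../../var/images
readlink -f images   # should resolve to /home/user/var/images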

  6. Some OSes support a program called ionice, which lets you lower the disk I/O priority of a given process. See this thread for information on using ionice to slow down the dataskq and its child (tar) processes.

To cap just the tar binary with ionice, see this feature: http://www.directadmin.com/features.php?id=1423
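For example, assuming a backup is currently running and your kernel's I/O scheduler honours ionice classes, something like this could be used to de-prioritise it (the process names are the stock DA ones):

# put the running dataskq into the "idle" I/O class; tar processes it spawns
# afterwards will inherit that priority
ionice -c3 -p $(pgrep -x dataskq)

# or give an already-running tar the lowest "best-effort" priority instead
ionice -c2 -n7 -p $(pgrep -x tar)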

Backing up and transferring large accounts

In some cases, you might have a User account which you need to transfer, but the size of the account is too large for the backup system to handle easily. Some causes might be:

  • The file system can't create large files.
  • There is not enough free space on the drive to create the tar.gz file.
  • The FTP or transfer method is not able to handle large files.

  1. A solution would then be to break up the backup into smaller pieces. This can be done via:

Admin Level -> Backup/Transfer -> Step 4: What

where you can create a backup with or without certain areas.

With this ability, you can create the backup in smaller chunks, and restore each one on the remote server.

  2. In other cases, the data under /home/username may be a significant portion of the size of the backup file. In that case, in "Step 4: What", you can de-select "Domains Directory" and/or "E-Mail Data" and create/transfer/restore the backup without them. Then, to transfer those areas, use rsync to pull the /home/username data from the old box to the new remote box. Note that this guide mentions the transfer of /home/user/domains, but if you need to include email data, or any other data not in /home/username/domains, then rsync the entire /home/username directory.

Don't forget to reset file permissions if you're doing manual file copies (rsync may handle this automatically, but if unsure, you can do it anyway).
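For instance (a sketch; the username and hostname are placeholders), pulling the whole home directory and then resetting ownership could look like:

# on the new server: pull everything under the User's home from the old box
rsync -av -e ssh root@old.server.example.com:/home/username/ /home/username/

# reset ownership in case UIDs differ between the two servers
chown -R username:username /home/username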

  3. If the large usage of the account is due to MySQL databases, de-select "Database Data", and backup/transfer/restore the tar.gz backup file. This should create the empty databases under the account on the new box. Then use mysqldump and mysql to manually transfer the databases themselves.
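A minimal sketch of that manual transfer (the database name and hostname are placeholders; on a typical DA box the da_admin password can be found in /usr/local/directadmin/conf/mysql.conf):

# on the old server: dump the database
mysqldump -u da_admin -p --single-transaction username_dbname > username_dbname.sql

# copy the dump to the new server (destination path is arbitrary)
scp username_dbname.sql root@new.server.example.com:/root/

# on the new server: load it into the empty database created by the restore
mysql -u da_admin -p username_dbname < /root/username_dbname.sql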

How to spread out the backup process over a longer period

If you wish to keep your load average below a certain point and prevent backups from starting while the load is above that point, the following script will help. It checks your load average before each backup file is created. If the load is too high, it waits 5 seconds and checks again, repeating until the load is low enough, at which point the backup is created. If the load is still not below the threshold after 20 attempts, the script returns a non-zero value, that User's backup is skipped, and an error is reported in DA.

  1. Create the file /usr/local/directadmin/scripts/custom/user_backup_pre.sh and place the code inside:
#!/bin/sh
MAXTRIES=20
MAXLOAD=8.00

# prints 1 if the 1-minute load average is above MAXLOAD, else 0
highload()
{
    LOAD=`cut -d' ' -f1 /proc/loadavg`
    echo "$LOAD > $MAXLOAD" | bc
}

TRIES=0
while [ `highload` -eq 1 ];
do
    sleep 5;
    if [ "$TRIES" -ge "$MAXTRIES" ]; then
        echo "system load above $MAXLOAD for $MAXTRIES attempts. Aborting.";
        exit 1;
    fi
    TRIES=$((TRIES + 1))
done;
exit 0;
  2. And make it executable:
chmod 755 /usr/local/directadmin/scripts/custom/user_backup_pre.sh

Related: http://www.directadmin.com/features.php?id=978
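To sanity-check the hook, you can run it by hand and look at the exit code (0 means the backup would proceed, 1 means it would be skipped):

/usr/local/directadmin/scripts/custom/user_backup_pre.sh
echo $?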

Backup system runs at the same time as quotas are counted in the tally

Starting with DirectAdmin version 1.596, the panel uses realtime_quotas=2 by default, so unless a different value has been explicitly set in your installation, this issue should no longer appear.

DirectAdmin relies on the quota system to count how much disk space a User is using.

If realtime_quotas=0 is set and a backup is running at the same moment DA counts that quota, there is a chance DA will count almost double the actual disk usage, since the backup and its assembly area are created as the User. For example, a User with 5 GB of data can momentarily appear to be using close to 10 GB while the assembled copy of their data exists.

There are a few ways to avoid this scenario, or to change things around so that you never need to worry about it.

  1. Start using realtime_quotas=2.

  2. For FTP backups, tell DA not to use /home/tmp for assembly. This only works if /home is on its own partition and you move the assembly area to a different partition. For example, set:

backup_tmpdir=/tmp

assuming that /tmp is large enough to store this temporary data.
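One way to add the setting (a sketch; if backup_tmpdir already exists in directadmin.conf, edit the existing line instead of appending a duplicate):

echo "backup_tmpdir=/tmp" >> /usr/local/directadmin/conf/directadmin.conf
service directadmin restart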

  3. Use the Direct Imap Backup option, so the /home/username/imap folder is loaded directly into the backup without assembling a copy of it first. This will greatly speed up the backup process and also reduce the space used, lowering the chances of hitting this issue.

Limiting the Reseller and User backups to specific times

If you don't want to increase the load on your system at peak hours and wish to only allow backups to be created by Resellers/Users within a certain range of time, you can use the all_pre.sh script to do so.

In this example, the all_pre.sh hook checks CMD_USER_BACKUP and CMD_SITE_BACKUP requests so that backups can only be created between the hours of 1am and 8am.

Create /usr/local/directadmin/scripts/custom/all_pre.sh, and fill it with this code:

#!/bin/sh

# current hour of the day (0-23)
HOUR=`date +%k`
MAKINGBACKUP=0
if [ "$command" = "/CMD_USER_BACKUP" ]; then
    if [ "$action" = "create" ]; then
        #when=now or when=cron
        MAKINGBACKUP=1
        if [ "$when" = "cron" ]; then
            HOUR=$hour
        fi
    fi        
fi

if [ "$command" = "/CMD_SITE_BACKUP" ]; then
    if [ "$action" = "backup" ]; then
        MAKINGBACKUP=1
    fi        
fi

if [ "$MAKINGBACKUP" -eq 1 ]; then
    if [ "$HOUR" -ge 1 ] && [ "$HOUR" -lt 8 ]; then
        #this is a valid time, exit.
        exit 0;
    else
        echo "Backups must be created between 1am and 8am";
        exit 1;
    fi
fi
exit 0;

And make it executable:

chmod 755 /usr/local/directadmin/scripts/custom/all_pre.sh
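To test the time window without waiting for a real backup request, you can run the hook with the environment variables DA would pass (the values below simulate a User backup requested "now" through the panel):

command=/CMD_USER_BACKUP action=create when=now /usr/local/directadmin/scripts/custom/all_pre.sh
echo $?   # 0 = allowed, 1 = rejected with the "Backups must be created between 1am and 8am" message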