A new version of this script has been released at: https://www.sysblog.nl/directadmin-backup-cloudvps-objectstore-curl/
Please use the new version for better compatibility with DirectAdmin backups.
We use the CloudVPS Objectstore to store data in a redundant (3 copies) and cheap way, which makes it ideal for backups. One of its main limitations is that files on the Objectstore may not be larger than 5 GB. To work around this we created a customized ftp_upload.php script (the shell script that handles the upload of backup files within DirectAdmin) that splits files larger than 5 GB into parts of 4 GB each. The disadvantage is that such split backups can no longer be restored through the DirectAdmin GUI; since 99% of our DirectAdmin accounts never grow beyond 5 GB, this is not a big issue. Restoring a split backup by hand is straightforward, as shown in the sketch below.
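Should you ever need to restore such a split backup, the parts can simply be concatenated back into the original archive outside DirectAdmin. A minimal sketch, assuming the parts have already been downloaded from the Objectstore to the server (the backup file name is hypothetical):

# Reassemble the 4 GB parts produced by split(1) back into the original backup file
# (user.admin.example.tar.gz is a hypothetical backup name)
cat user.admin.example.tar.gz.part-* > user.admin.example.tar.gz

# Optionally verify that the reassembled archive is readable
tar -tzf user.admin.example.tar.gz > /dev/null && echo "Archive OK"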
DirectAdmin only supports FTP or SFTP as a backup destination. CloudVPS offers an FTP-to-Objectstore proxy, so you can upload your backups to the Objectstore over FTP, which is great. However, FTP traffic is not encrypted (SFTP is) and the FTP-to-Objectstore proxy has issues from time to time, so we developed a new customized ftp_upload.php that uses cURL to upload the backup files to the Objectstore over HTTPS. Tests in production have been successful, although I have to admit we do not use this script on all production systems yet. A quick way to verify the credentials with cURL by hand is shown after the settings below.
DirectAdmin backup settings:
IP: ftp.objectstore.eu (only used when using stock ftp_upload.php)
Username: <projectid>:<username>
Password: <your password>
Remote Path: <name of the folder in your destination>
Port: 21 (only used when using stock ftp_upload.php)
Secure FTP: <choose> (only used when using stock ftp_upload.php)
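Before pointing DirectAdmin at the custom script, you may want to verify the credentials and the destination container by uploading a small test file with cURL directly. A minimal sketch using the same URL pattern as the script below; the project id, username, password and container name are placeholders, and the internal.objectstore.eu endpoint may only be reachable from within the CloudVPS network:

# Placeholder values, replace them with your own settings
PROJECTID=myprojectid
USERNAME=myuser
PASSWORD='mypassword'
CONTAINER=backups

# Upload a small test file; the trailing slash makes curl append the file name to the URL
echo "objectstore test" > /tmp/objectstore-test.txt
curl --show-error -X PUT -T /tmp/objectstore-test.txt \
    --user "$USERNAME:$PASSWORD" \
    "https://$PROJECTID.internal.objectstore.eu/$CONTAINER/"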
Save the script as:
/usr/local/directadmin/scripts/custom/ftp_upload.php
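A minimal installation sketch; the diradmin ownership and 700 permissions are a common choice for custom DirectAdmin scripts, adjust them to your own setup if needed:

# Create the custom scripts directory if it does not exist yet
mkdir -p /usr/local/directadmin/scripts/custom

# Install the script and make sure DirectAdmin can execute it
cp ftp_upload.php /usr/local/directadmin/scripts/custom/ftp_upload.php
chown diradmin:diradmin /usr/local/directadmin/scripts/custom/ftp_upload.php
chmod 700 /usr/local/directadmin/scripts/custom/ftp_upload.php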
The script:
#!/bin/bash
# DirectAdmin calls this script with the ftp_* settings exported as environment
# variables. Bash is required because of the here-string (<<<) and (( )) syntax.

CURL=/usr/local/bin/curl

if [ ! -e "${ftp_local_file}" ]; then
    echo "Cannot find backup file ${ftp_local_file} to upload";
    /bin/ls -la "${ftp_local_path}"
    /bin/df -h
    exit 11;
fi

#######################################################
# cURL

upload_file()
{
    if [ ! -e "${CURL}" ]; then
        CURL=/usr/bin/curl
    fi

    if [ ! -e "${CURL}" ]; then
        echo "";
        echo "*** Backup not uploaded ***";
        echo "Please install curl by running:";
        echo "";
        echo "cd /usr/local/directadmin/custombuild";
        echo "./build curl";
        echo "";
        exit 10;
    fi

    # Ensure ftp_path ends with / so curl appends the file name to the URL
    ENDS_WITH_SLASH=`echo "$ftp_path" | grep -c '/$'`
    if [ "${ENDS_WITH_SLASH}" -eq 0 ]; then
        ftp_path=${ftp_path}/
    fi

    # Split the DirectAdmin username (<projectid>:<username>) into its two parts
    IFS=':' read -r projectid username <<< "$ftp_username"

    # Check the size of the backup. If it is larger than ~5 GB, split it into 4 GB parts
    RET=0
    FILESIZE=$(stat -c%s "$ftp_local_file")
    if (( FILESIZE > 5138022400 )); then
        split -b 4G "$ftp_local_file" "$ftp_local_file.part-"

        # Upload all parts; remember if any of them fails
        for file in "$ftp_local_file".part-*
        do
            result=`${CURL} --show-error -X PUT -T "$file" --user "$username":"$ftp_password" https://$projectid.internal.objectstore.eu"$ftp_path" 2>&1`
            if [ $? -ne 0 ]; then
                RET=1
            fi
        done
    else
        # Backup is smaller than 5 GB, so upload it in one go
        result=`${CURL} --show-error -X PUT -T "$ftp_local_file" --user "$username":"$ftp_password" https://$projectid.internal.objectstore.eu"$ftp_path" 2>&1`
        RET=$?
    fi

    if [ "${RET}" -eq 0 ]; then
        exit 0
    else
        echo "$result"
        exit 1
    fi
}

#######################################################
# Start

upload_file
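DirectAdmin passes the backup details to ftp_upload.php as environment variables (the ftp_* variables used above), so the script can also be tested by hand. A hedged sketch with hypothetical values:

# Hypothetical test values; DirectAdmin normally sets these variables itself
export ftp_local_file=/home/admin/admin_backups/user.admin.example.tar.gz
export ftp_local_path=/home/admin/admin_backups
export ftp_path=/backups/
export ftp_username='myprojectid:myuser'
export ftp_password='mypassword'

# Run the custom upload script and check its exit code (0 means success)
/usr/local/directadmin/scripts/custom/ftp_upload.php
echo "Exit code: $?"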
*warning* We are not responsible for any kind of damage or incompatibility you may experience when using this article. Test this on a test server before using it in production, and always check the code of the script yourself before implementing it. */warning*