Goodbye Acronis cloud — Hello Encrypted S3 backup!

Over time the backup strategy for my personal laptop keeps changing in the search for the most cost-effective, robust and secure solution. And it must be able to back up both my Windows host and my Linux virtual machine.

  • I tried a backup to an AWS EC2 instance for a while, but this was expensive.
  • I then switched to Acronis Cloud backup because I’m very satisfied with their local hard disk backups. Their online cloud backup, however, was an unpleasant experience. The cloud backup failed without any indication in the taskbar; when I clicked for more info, the cryptic “error(0x49052524) in lib; please contact support” was displayed. I contacted support to no avail; they just wanted me to reinstall. The problem then fixed itself after a dozen days, and this happened twice in a few months. Last but not least, when I wanted to browse my online backup, the web interface was really slow. Sorry Acronis, but you really disappointed me.

Now I’ve come to an open-source solution for my backup needs: the Encrypted S3 Backup, written in Bash and based on the official Amazon Command-Line Interface (CLI). This simple backup system leaves control and visibility in your hands, and the backup scripts are so small that you can easily audit them. The README provides all the information about the design, security, usage, disaster recovery, etc. More or less, it’s a solution for technical Linux people and not really suited for end-users, who should try Duplicati instead. Note that it doesn’t back up an “image” of your system; it is file-based, and only the file data is archived, so you can’t restore the file owners, permissions and other meta info.
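To give an idea of what such a file-based sync looks like, here is a minimal sketch using the official AWS CLI. This is an illustration and not the actual project script; the local path and bucket name are made up, and the real scripts do more (see the README for the authoritative details):

#!/bin/bash
set -e
set -u

# Hypothetical source directory and S3 destination
SRC_DIR='/home/user/data'
S3_URL='s3://my-backup-bucket/laptop'

# "--sse AES256" requests server-side encryption for every uploaded object;
# "--delete" propagates local deletions to the S3 copy
aws s3 sync "$SRC_DIR" "$S3_URL" --sse AES256 --delete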

Let’s review the pricing. In my case I’m doing a daily backup of 125 GB of data in 320,000 files.

  • The incremental daily backup costs me $2.73 per month. 89% is the cost for S3 (mainly the GB-storage cost) and the rest is for bandwidth.
  • The initial one-time upload of 70 GB cost me $3.43. Expect about double that for 125 GB.
  • The projected cost for a full restore is $11.59, of which 96% is the price of the bandwidth from S3 to the Internet (a quick sanity check follows this list).
  • All prices are without taxes.
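A quick sanity check on the restore estimate: assuming AWS charged the standard egress rate of about $0.09 per GB at the time, 125 GB × $0.09/GB ≈ $11.25, which is in line with the ~96% bandwidth share of the $11.59 total.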

As far as performance is concerned, S3 is great!

  • Browsing my backup versions in the online S3 explorer is lightning fast.
  • The daily sync of 125 GB of data in 320,000 files takes 23 minutes. I don’t change a lot of files on my laptop during my daily activities.
  • My initial upload ran at a speed of 10 MBytes/s, and it could have been faster if I had more than 80 Mbit/s of Internet bandwidth at my disposal.
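For reference, 80 Mbit/s is exactly 10 MBytes/s, so the upload was saturating my line; at that rate the initial 70 GB took about 70,000 MB ÷ 10 MB/s = 7,000 seconds, i.e. roughly two hours.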

Note that in the end you need to trust AWS S3 to encrypt your data server-side, and then to completely discard your original unencrypted data.

Backup icon by PRchecker


Backup Google Sites automatically

I just found out how to make my Google Sites backup script almost non-interactive, so I decided to share. My usage pattern is that I run the script every month in the Linux console, and the weekly backup of my hard disk then takes care of additionally backing up the exported information.

Why bother backing up Google Sites?
While Google is very reliable and will probably never fail me here, I want to have an offline backup of my Google Sites pages in case someone steals my Google Account. So I back up. Online and offline, every week.

The backup script uses the wonderful free Java application “Google Sites Liberation”. My script is actually more like a sample Bash usage of this Java tool. You need to download the .jar file and store it in the same directory as the backup script. The source code follows:

#!/bin/bash
# Strict mode: abort on errors, on unset variables, and on failures
# anywhere in a pipeline
set -e
set -u
set -o pipefail

# Announce abnormal termination, so a failed run is easy to spot
trap 'echo "ERROR: Abnormal exit." >&2' ERR

# config BEGIN

GUSER='username@gmail.com'
WIKI_LIST='wiki1 wiki2 wiki3'
JAR_BIN='google-sites-liberation-1.0.4.jar'
ROOT_BACKUP_DIR='./sites.google.com'

# config END

echo "We are using '$JAR_BIN'. Check for a newer version:"
echo '	http://code.google.com/p/google-sites-liberation/downloads/list'
read

echo "The directory '$ROOT_BACKUP_DIR' will be deleted!!!"
echo 'Press Enter to confirm.'
read

# Start each run from a clean, empty backup directory
rm -rf "$ROOT_BACKUP_DIR"
mkdir "$ROOT_BACKUP_DIR"

echo -n "Enter the password for '$GUSER': "
read -s -r -e PASS
echo ; echo

# Export each site into its own subdirectory via Google Sites Liberation
for wiki in $WIKI_LIST ; do
	BACKUP_DIR="$ROOT_BACKUP_DIR/$wiki"
	echo "*** Exporting '$wiki' in '$BACKUP_DIR'..."
	echo "Press Enter to continue."
	read

	mkdir "$BACKUP_DIR"
	java -cp "$JAR_BIN" com.google.sites.liberation.export.Main \
		-w "$wiki" \
		-u "$GUSER" \
		-p "$PASS" \
		-f "$BACKUP_DIR"
	echo
done
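
To run the backup, make sure that a Java runtime is installed and that the .jar file sits in the same directory as the script; then execute the script and follow its prompts. For example, assuming you saved it under the made-up name “backup-google-sites.sh”:

chmod +x backup-google-sites.sh
./backup-google-sites.sh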
