#8705 Restoring from Amazon S3

Posted in ‘Akeeba Backup for Joomla! 4 & 5’

Latest post by nicholas on Sunday, 21 November 2010 03:49 CST

biopgoo
Hello,

We have successfully set up Akeeba Pro to back up to Amazon S3. Following the instructions, we configured it to use fixed-size part files to avoid timeouts. For one of our websites the backup process generated at least 10 part files; the smallest number of part files generated for any of our sites is three.

We are quite new to Amazon S3. We have not found a way to download all the part files in one step, nor a way to download them directly to a server. Right now the only restoration procedure we can see for this large site is to perform at least 10 download operations to our computer and then upload each part to the server where we want the site restored, which is a time-consuming and inconvenient way of restoring from a backup.

Is there a simpler way of getting our backups from Amazon S3?

Love

steph.s
Hi biopgoo,
I am not very experienced with Amazon S3, but you should be able to use a client such as CuteFTP or FileZilla to download all of your part files to your computer in one go, and then upload them all to your hosting server in one go. This is much faster than moving the files one at a time through an upload/download tool running on the hosting server.

nicholas
Akeeba Staff
Manager
At the time of this writing, I am not aware of a PHP script which can reliably fetch files from S3 to your live server without an intermediate download to your local PC. I am exploring some ways to make that happen, but it will take me quite some time to accomplish (in the range of 2-3 months).

Nicholas K. Dionysopoulos

Lead Developer and Director

🇬🇷Greek: native 🇬🇧English: excellent 🇫🇷French: basic • 🕐 My time zone is Europe / Athens
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

biopgoo
There must be a way of downloading several files matching a pattern from S3 to a server. I have opened https://forums.aws.amazon.com/thread.jspa?threadID=54972 to see if the Amazon community has any advice.

Doing 12 individual downloads to my desktop and subsequently uploading 150MB over a slow DSL connection seems like a poor way of managing backups.

Hopefully the Amazon community will have advice they can provide us for simple restoration of our backups.

Love,
Andres

nicholas
Akeeba Staff
Manager
Figuring out which files to download is the least of my worries. My major woe is download speed and PHP timeouts. Let's say you have a 300MB backup file and you can download approximately 1MB/sec from S3 to your server. This means the file takes 300 seconds (5 minutes!) to download. Since PHP's maximum execution time is usually in the area of 10-30 seconds, the process would time out. What I have to do is start a chunked download, a couple of megabytes at a time, appending to the backup archive as each chunk arrives. This requires far more code than a simple download, which is why it isn't implemented yet. It is exactly what I am working on: I know how to do it, but I really need time - and I can barely find any these days :p
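
To make the idea concrete, here is a minimal sketch of a chunked, range-based download in Python. The URL, file names and chunk size are hypothetical, and a real PHP implementation would persist the offset between page loads instead of holding it in a loop:

import urllib.request
from urllib.error import HTTPError

URL = "https://example-bucket.s3.amazonaws.com/backups/site.j01"  # hypothetical signed URL
CHUNK = 2 * 1024 * 1024   # fetch 2 MB per request
OUT = "site.j01"

offset = 0
with open(OUT, "wb") as out:
    while True:
        req = urllib.request.Request(URL)
        # Ask the server for only the next CHUNK bytes of the remote file.
        req.add_header("Range", "bytes=%d-%d" % (offset, offset + CHUNK - 1))
        try:
            with urllib.request.urlopen(req) as resp:
                data = resp.read()
        except HTTPError as err:
            if err.code == 416:  # requested range starts past end of file: done
                break
            raise
        out.write(data)
        offset += len(data)
        if len(data) < CHUNK:    # a short read means this was the last piece
            break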

user21004
This was a real concern for me, as what good is a backup tool without a good recovery mechanism?

Fortunately there are existing solutions out there.

If you have shell access, s3cmd (a console S3 utility) works great! I was able to list, upload, and download from my S3 account without any trouble at all. It is very straightforward to use and does not require root access on the server; you just need Python on your server (which is pretty standard now).

http://s3tools.org/s3tools
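
For reference, basic s3cmd usage looks something like this (the bucket and file names here are made up):

s3cmd --configure                                   # one-time setup: enter your access and secret keys
s3cmd ls s3://example-bucket/backups/               # list the backup parts
s3cmd get s3://example-bucket/backups/site.jpa      # fetch one part straight to the server
s3cmd get --recursive s3://example-bucket/backups/  # fetch everything under the prefix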

nicholas
Akeeba Staff
Manager
AFAIK, if you have shell access there is also a much simpler way to do that without installing anything extra on your server :) You can just create signed, time-limited download URLs for each backup part and use wget or curl to fetch them to your server. Generating signed download URLs is easy with any S3 desktop tool, such as CloudBerry Explorer (Windows) or S3Fox (cross-platform, runs in Firefox).
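
For the curious, this is roughly what those tools do under the hood. The sketch below builds a time-limited URL using the older signature version 2 scheme (newer AWS regions require version 4); the credentials, bucket and key are hypothetical:

import base64, hashlib, hmac, time, urllib.parse

ACCESS_KEY = "AKIAEXAMPLE"       # hypothetical access key ID
SECRET_KEY = "examplesecret"     # hypothetical secret key
BUCKET, KEY = "example-bucket", "backups/site.jpa"

expires = int(time.time()) + 3600  # URL stays valid for one hour
string_to_sign = "GET\n\n\n%d\n/%s/%s" % (expires, BUCKET, KEY)
signature = base64.b64encode(
    hmac.new(SECRET_KEY.encode(), string_to_sign.encode(), hashlib.sha1).digest()
).decode()
url = "https://%s.s3.amazonaws.com/%s?AWSAccessKeyId=%s&Expires=%d&Signature=%s" % (
    BUCKET, KEY, ACCESS_KEY, expires, urllib.parse.quote_plus(signature)
)
print(url)  # then, on the server: wget -O site.jpa '<the printed URL>'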

user21004
Cool. I'm just getting started with S3 and learning what's out there. If you run your own server, you can also mount an entire bucket as a file system.
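
For example, with the s3fs FUSE tool something along these lines should work (the bucket name, credentials and mount point are hypothetical):

echo 'AKIAEXAMPLE:examplesecret' > ~/.passwd-s3fs
chmod 600 ~/.passwd-s3fs
s3fs example-bucket /mnt/s3 -o passwd_file=~/.passwd-s3fs
ls /mnt/s3    # the bucket contents now appear as ordinary files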

The signed download thing is very convenient - thanks for pointing it out.

nicholas
Akeeba Staff
Manager
You're welcome! These are the small (and very important) tips of the trade ;)

user21004
Continuing the S3 discussion: my S3 profile runs great from the back end without using the file-chunking algorithm. It seems to me that ZIP might be a better alternative than JPA in this case. For a restore, moving a single file back to the site and unzipping it seems like the simplest path to follow.

nicholas
Akeeba Staff
Manager
As long as you don't use split archives, ZIP is a good choice for that scenario. Just note that the Linux unzip utility and split ZIP files do not get along very well (or at all!).
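
If you do end up with a split set (site.zip plus site.z01, site.z02, ...), Info-ZIP's zip 3.0 can usually rejoin the parts into a single archive that unzip accepts. The file names below are hypothetical, all parts must sit in the same directory, and this is untested against our split output, so treat it as a pointer rather than a guarantee:

zip -s 0 site.zip --out site-joined.zip   # merge all parts into one archive
unzip site-joined.zip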

user21004
I tried unzip -t site--xxx-etc.. just to verify that I could unzip a file created by Akeeba Backup Pro, but the test fails with:

Archive: db-www.njatob.org-20101119-212655.zip
error: expected central file header signature not found (file #2).
(please check that you have transferred or created the zipfile in the
appropriate BINARY mode and that you have compiled UnZip properly)
file #1: bad zipfile offset (lseek): 1836015616
At least one error was detected in db-www.njatob.org-20101119-212655.zip.


Running Ubuntu Linux (kernel 2.6.32-24-server) on a VPS. The unzip command works just fine on other ZIP files.

So I downloaded the file to my local PC and tried a number of things (including the Akeeba eXtract utility); they all failed.

Then I tried running Kickstart on my local LAMP server.

Success (sort of). I ended up with a bunch of .Snn files which were clearly SQL files, two SQL files (the site and the external DB) and an INI file. I'm not sure what I'm supposed to do with them; I expected to see just two SQL files (one for the site and one for the external DB).

Hmm....

nicholas
Akeeba Staff
Manager
Confirmed. There is a bug in the ZIP archiver. If the site is still available, please take a fresh backup using the JPA format. If it is not, the .S* files are SQL dump files which you can restore with phpMyAdmin (first the .sql file, then .s01, then .s02, and so on).
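
If you have command-line access, the same restore can be done with the mysql client; the database, user and file names below are hypothetical, and the parts must be imported in exactly that order:

mysql -u dbuser -p mydatabase < site.sql
mysql -u dbuser -p mydatabase < site.s01
mysql -u dbuser -p mydatabase < site.s02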
