Support

Akeeba Backup for Joomla!

#24508 Remote Transport Failing

Posted in ‘Akeeba Backup for Joomla! 4 & 5’
This is a public ticket

Everybody will be able to see its contents. Do not include usernames, passwords or any other sensitive information.

Environment Information

Joomla! version
n/a
PHP version
n/a
Akeeba Backup version
n/a

Latest post by nicholas on Monday, 22 February 2016 10:48 CST

user89568
EXTREMELY IMPORTANT: Please attach a ZIP file containing your Akeeba Backup log file in order for us to help you with any backup or restoration issue. If the file is over 2Mb, please upload it on your server and post a link to it.

Description of my issue: I put a ticket in with you on Friday, but when I came back to look it's no longer here and can't be found, so I am submitting it again. If this is a duplicate, please forgive me, but I wanted to make sure someone saw my ticket.

I have set up Akeeba and, while the backup works without issue both manually and through a daily cron job, transporting the backup to a remote server fails no matter which options are selected. I have tested this with WebDAV, FTP/FTPS, and SFTP.

I can make the test connection to the FTP or FTPS server successfully within the configuration, but after the backup it fails to move any data. When manually attempting to move the data, the following error is displayed in the backend:

"Upload of your archive failed.

Uploading /home/XXXXXX/public_html/YYYYYYYY/administrator/components/com_akeeba/backup/site-www.speedwaydigest.com-20160221-181502.jpa has failed."

When switching over to WebDAV and attempting to move the data over manually, I get no error, just a blank screen, and nothing ever happens. I am able to connect to my web disk from a standard desktop without issue and place data in it. I have also added the remote server's IP address to my server's firewall so that it won't block any inbound/outbound traffic.

Additionally, I set up a test server with FTP and WebDAV to see whether I would get the same results as with the backup server I use, and I was able to duplicate the issue on the test server.

nicholas
Akeeba Staff
Manager
Regarding FTP/SFTP, it's very likely that you are giving it the wrong initial directory, or that the directory is not writable. Neither can be tested when you use the "Test FTP Connection" button, as you're told in the confirmation message after clicking that button.

Regarding WebDAV, the problem is the file size. I can see that the total size of the backup is 13,481,938,433 bytes, split into 2Gb files. Unfortunately, this means that you're asking PHP to load a 2Gb file into memory before sending it to your WebDAV server. The memory limit of PHP is just 256Mb, so this fails.

What you need to do is set the Part size for split archives to a more reasonable value for file transfers, 100 Mb or less.

This will create 20 times as many files, but it's the only way to have the backup transferred out of your server via WebDAV. Only very few remote storage services, such as Amazon S3, allow you to transfer big files in smaller pieces, circumventing this issue. WebDAV (just like FTP and SFTP) does NOT offer such an option. Finally, please note that the 100 Mb setting will only work with CRON jobs. If you want to use the same backup profile from the back-end of your site you need to use a MUCH smaller part size, e.g. 10-20 Mb.
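To put numbers on that trade-off, here is a rough back-of-the-envelope sketch (hypothetical illustration, not Akeeba code; the 13,481,938,433 byte total comes from this ticket):

```python
# Hypothetical check: how the "Part size for split archives" setting trades
# the number of part files against PHP's peak memory per upload request.
backup_bytes = 13_481_938_433          # total backup size from this ticket
MB = 1024 * 1024

for part_mb in (2048, 100, 20):
    part_files = -(-backup_bytes // (part_mb * MB))   # ceiling division
    print(f"{part_mb:>5} Mb parts -> {part_files:>4} files, "
          f"PHP holds ~{part_mb} Mb per upload request")
```

With the current 2Gb part size that is 7 files, each of which blows past the 256Mb memory limit; at 100 Mb it is 129 files, each comfortably within the limit.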

Nicholas K. Dionysopoulos

Lead Developer and Director

🇬🇷Greek: native 🇬🇧English: excellent 🇫🇷French: basic • 🕐 My time zone is Europe / Athens
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

user89568
I have the Initial directory set to /, which is the default in Filezilla, and the sub-directory for placement set to backup, which is writable; I am using a root server account and have verified it works externally. I've also tried leaving the sub-directory blank, and even leaving the initial directory blank.

But in all honesty, your solution really isn't suited to a site of my size. If I have to break the backups down into 100MB files, that would be somewhere around 134-135 individual files for a 13.5GB backup, and it's growing every day.

The only real reason I bought it was for the remote transfer options, and I wish I had known this beforehand. Nice product, but I guess it's a lesson learned on my part.

nicholas
Akeeba Staff
Manager
I do NOT appreciate your passive aggressive tone at all. You never asked me a pre-sales request and you did indicate that you are fully informed about the product. I don't accept responsibility for your actions. This is VERY insulting, it's a borderline ToS violation and does get your account flagged and your ticket closed. Next time ask a pre-sales request if you are not sure. That's why pre-sales requests are free of charge (I felt that it was self-explanatory but, alas).

Regarding my previous reply, it's not "my" solution! It's fundamentally how your web server is set up and how web servers, in general, work. On top of that, it's how the file transfer protocols you chose to use (FTP and WebDAV) work. Both require you to be able to send the entire file in a single request. Due to the way servers and PHP itself work, you MUST load the entire file into memory for this to happen. You do understand that we NEITHER write PHP itself, NOR do we set up your server, correct?

Therefore the real issue you have is a limitation on your server, namely the memory_limit in php.ini, i.e. the maximum amount of memory a PHP script can use at one time. If you set the PHP memory limit to 2.5Gb instead of the current 256Mb, and you have enough available RAM on your server, you will not have a problem using WebDAV, ASSUMING that the remote end accepts file uploads that big (extremely unlikely; WebDAV was NOT designed for that kind of volume). I consider this solution far more ridiculous than lowering the part size for split archives, so I didn't even recommend it.
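For reference, that change would be a single php.ini directive (shown here as a hypothetical example; only viable with enough free RAM, and only if the remote end accepts uploads that large):

```ini
; Hypothetical php.ini change: raise the PHP memory limit from 256M so a
; whole 2Gb backup part can be held in memory during a single upload request.
memory_limit = 2560M
```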

Just a parenthesis here: if your problem is file organization, you do realize that you can use variables in the directory name, right? It's in the documentation. You could use a directory name like /backups/[DATE] to keep all files from the same date under the same folder, a different folder for each calendar day. This makes the point about the number of files completely moot.
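As an illustration of how such a variable works (a hypothetical sketch; the YYYYMMDD expansion is an assumption here, and the authoritative list of variables and their formats is in Akeeba's documentation):

```python
# Hypothetical expansion of a [DATE] variable in the remote directory name:
# one folder per calendar day keeps each day's part files together.
from datetime import date

def expand_directory(template, day):
    # Assumes [DATE] expands to YYYYMMDD; check the Akeeba documentation
    # for the actual format used by the backup engine.
    return template.replace("[DATE]", day.strftime("%Y%m%d"))

print(expand_directory("/backups/[DATE]", date(2016, 2, 21)))  # /backups/20160221
```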

However, as I explicitly stated above, you can always use Amazon S3 WITHOUT using a small part size for split archives. Amazon S3's API allows us to split the large 2Gb backup parts into smaller chunks (100Mb works great in CLI, as I already said). Yes, I told you this explicitly in my previous reply. And yes, THAT is the BEST solution for a rapidly growing site like yours. I can tell you that after having had many long chats with the administrator of a site with a 64Gb backup, growing 1Gb every month. When it comes to storing obscene amounts of data efficiently, there's nothing better than Amazon S3.

The full list of remote storage providers which support chunked big file transfers is:
  • Upload to Amazon S3
  • Upload to Dropbox (v1 API)
  • Upload to Dropbox (v2 API)
  • Upload to Microsoft OneDrive


However due to pricing considerations I would only recommend Amazon S3 for a site the size of yours.
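The mechanism those providers expose can be sketched generically (hypothetical code, not Akeeba's implementation): the client reads and sends one fixed-size chunk at a time, so peak memory is one chunk rather than the whole 2Gb part file.

```python
# Hypothetical sketch of chunked ("multipart") uploading, the mechanism that
# Amazon S3, Dropbox and OneDrive expose through their APIs.
import io

CHUNK = 5 * 1024 * 1024   # 5 Mb; Amazon S3's minimum multipart chunk size

def upload_in_chunks(stream, send_chunk):
    """Read `stream` CHUNK bytes at a time and pass each piece to `send_chunk`.

    `send_chunk` stands in for one 'upload part' API call (e.g. S3 UploadPart);
    only one chunk is ever held in memory at a time.
    """
    total = 0
    while True:
        piece = stream.read(CHUNK)
        if not piece:
            break
        send_chunk(piece)
        total += len(piece)
    return total

# Usage with an in-memory stand-in for a 12 Mb part file:
sent_chunks = []
total = upload_in_chunks(io.BytesIO(b"x" * (12 * 1024 * 1024)), sent_chunks.append)
print(total, len(sent_chunks))   # 12 Mb sent as 3 chunks (5 + 5 + 2 Mb)
```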

Finally, do bear in mind that even if we made FTP/SFTP work, they would still not be a good choice for your site because of how PHP handles FTP/SFTP uploads. That's why I didn't even offer to debug this issue for you; it's an exercise in futility. You'd still need to use a small part size for split archives to let PHP itself manage the file, assuming that the remote end wouldn't throw a fit at the amount of data being shoved at it in a very limited amount of time (most FTP servers come with some basic protection features to prevent abuse).

I am closing this ticket since I've already given you the optimal solution, twice. Use Amazon S3. You won't be disappointed.

Nicholas K. Dionysopoulos

Lead Developer and Director


Support Information

Working hours: We are open Monday to Friday, 9am to 7pm Cyprus timezone (EET / EEST). Support is provided by the same developers writing the software, all of whom live in Europe. You can still file tickets outside of our working hours, but we cannot respond to them until we're back at the office.

Support policy: We would like to kindly inform you that when using our support you have already agreed to the Support Policy which is part of our Terms of Service. Thank you for your understanding and for helping us help you!