Support

Akeeba Backup for Joomla!

#32421 Fetching backup from remote server gives time-out error if the website runs with Cloudflare

Posted in ‘Akeeba Backup for Joomla! 4 & 5’
This is a public ticket

Everybody will be able to see its contents. Do not include usernames, passwords or any other sensitive information.

Environment Information

Joomla! version
n/a
PHP version
n/a
Akeeba Backup version
n/a

Latest post by on Friday, 06 March 2020 17:17 CST

rikoooo
Hi,

As described in the topic subject, fetching a backup from the remote server gives a time-out error in the back-end if the website runs with Cloudflare.

Cloudflare has a time-out of 100 seconds, which is a problem when fetching a big backup like mine (5 GB). It times out at around 21%; I checked the backup folder and noticed the download stopped at the same time as the time-out.

It would be nice to fix this issue with some automatic refresh script, or by making it fully Ajax.

Thank you for your great product.

Regards,

Erik

nicholas
Akeeba Staff
Manager
FYI, the archive fetch does use multiple page loads, just like backing up, extracting the backup archive, restoring the database, or... you get the idea. I use this pattern everywhere in my software where I expect a non-trivial amount of time to elapse doing something. I'm the only one using this much harder but far better working method to work around PHP script timeouts. Everyone else uses the "infinite PHP time limit and a prayer" method, which works less than half of the time but is cheaper to develop and maintain by four orders of magnitude. I guess the difference is that I write software to maximize the number of people I can help; other developers write software to maximize the number of people they can easily take money from. Different goals, different means.

But I digress. The problem you have is that not all remote storage services allow chunked downloads. For example, Amazon S3 allows us to download 1MB at a time. That takes between 0.5 and 10 seconds on most servers, allowing us to split the work into roughly 5 to 10 second work units. FTP, however, does not allow chunked downloads. We have to download an entire archive part. If your archive part is 2GB we'll have to try and download 2GB all at once, which will, of course, cause a timeout.
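To illustrate the chunked pattern (a hypothetical sketch in Python, not Akeeba's actual PHP code; the function name is made up): each work unit fetches one byte range, so no single request has to outlive a proxy or PHP time limit.

```python
def chunk_ranges(total_size, chunk_size):
    """Split a download of total_size bytes into inclusive (start, end)
    byte ranges, one per work unit / HTTP Range request."""
    ranges = []
    start = 0
    while start < total_size:
        end = min(start + chunk_size, total_size) - 1
        ranges.append((start, end))
        start = end + 1
    return ranges

# A 5 KB file fetched 2 KB at a time needs three work units:
# [(0, 2047), (2048, 4095), (4096, 5119)]
```

With S3-style range support, each unit completes in seconds; with FTP there is effectively a single unit the size of the whole part, which is exactly where the timeout comes from.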

Another, related problem might be that your site's backend is, as you said, running through CloudFlare. This is a bad idea for several reasons; you should have the CDN not cache it. If you've set up CloudFlare to basically pass /administrator URLs straight through to your server, you can ignore this paragraph; I read your ticket as saying that CloudFlare caches the backend pages.

Most likely your solution lies on the backup side of things. Tell me which part size for split archives you're using and which remote storage service / method you're using for your backups. I will tell you what the hard technical limitations are that you cannot overcome, and how to reconfigure your backups so you can avoid these limitations.

Meanwhile, I can tell you what to do with your existing archives. You can download them manually and put them in the backup output directory where Akeeba Backup is looking for them. If you're not sure, check the info icon of the backup entry in the Manage Backups page; it will tell you where the backup archive is expected to be. After the files are there, reload the Manage Backups page and you can then use Restore on those backups.

Nicholas K. Dionysopoulos

Lead Developer and Director

🇬🇷Greek: native 🇬🇧English: excellent 🇫🇷French: basic • 🕐 My time zone is Europe / Athens
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

rikoooo
Hi Nicholas,

Thank you very much for your detailed reply

To answer your questions :

- Cloudflare CDN caching is disabled for /administrator URLs, but the 100-second timeout cannot be disabled, so this is a hard technical limitation that cannot be overcome.

- Archiver engine = JPA format
- Part size for split archives = Custom 2047.88 MB
- Chunk size for large files processing = 1MB
- Big file threshold = 1MB

- Post-processing engine = OneDrive (LEGACY)

Thank you in advance for your kind support.

Erik

nicholas
Akeeba Staff
Manager
OK, this makes sense, and it looks like I was right in my guesstimate about the reason for your issue.

OneDrive allows you to upload each file in small chunks, but it only allows downloading the entire file at once. Therefore you're trying to download a file of up to 2GB in one go, which will, of course, time out.

The only workaround is to set the "Part size for split archives" to something much lower, e.g. 30MB. This means that each file will take around 15 to 60 seconds to download, which is well within the constraints of CloudFlare and your server.
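The 15-to-60-second estimate is just part size divided by transfer rate; a back-of-the-envelope sketch (the transfer rates are assumptions for illustration, not measurements):

```python
def part_download_time(part_mb, rate_mb_per_s):
    """Seconds to download one archive part at a sustained transfer rate."""
    return part_mb / rate_mb_per_s

# A 30 MB part at a sustained 0.5 to 2 MB/s:
print(part_download_time(30, 2.0))   # 15.0 seconds (fast server)
print(part_download_time(30, 0.5))   # 60.0 seconds (slow server)
```

Either way the whole-part download finishes well under Cloudflare's 100-second proxy limit, which is the point of shrinking the part size.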

Of course this workaround will only work for future backups. Existing backups need to be transferred manually as I said in my last reply.


rikoooo
Thank you Nicholas for your reply,

I noticed the timeout happened at around 2GB, so I guess I can configure the part size to something higher than 30MB, maybe 500 MB?

Thank you

nicholas
Akeeba Staff
Manager
Yes, that's information I didn't have, so I went very conservative. You can try progressively smaller part sizes until you find the sweet spot for your server.


rikoooo
Hi,

OK, thank you. I will let you know if it works as soon as my next backup is created.

Erik.

System Task
system
This ticket has been automatically closed. All tickets which have been inactive for a long time are automatically closed. If you believe that this ticket was closed in error, please contact us.

Support Information

Working hours: We are open Monday to Friday, 9am to 7pm Cyprus timezone (EET / EEST). Support is provided by the same developers writing the software, all of whom live in Europe. You can still file tickets outside of our working hours, but we cannot respond to them until we're back at the office.

Support policy: We would like to kindly inform you that when using our support you have already agreed to the Support Policy which is part of our Terms of Service. Thank you for your understanding and for helping us help you!