You are still changing the PHP time limit, not the Apache time limit. This is unnecessary. When you check the "Set an infinite PHP time limit" box in the Configuration page you have already made sure that PHP doesn't time out on your server. The problem is that Apache won't wait the 10 minutes it takes for your backup to upload; it returns a Gateway Timeout error instead.
While you could set the Apache time limit to 2000 seconds, it would be a terrible idea as a single runaway PHP script (bugs do happen!) would bring down your server with 100% CPU usage. I have been there. It's not fun. It's a royal pain in the rear to fix that on a live host.
Frankly, using a PHP application through a web browser to transfer a very large amount of data to another server in a single step is unwise and not recommended. You can, however, transfer a very large backup archive with a backup running through a CRON job, as long as you use the native CLI backup script (cli/akeeba-backup.php). This is exactly why I wrote it. The only limit is PHP's file handling: you can only use part sizes up to 2GB (beyond that, PHP may not be able to write to or read from the backup archive – a limitation of the PHP language, which uses the platform-specific signed integer for its internal integer type, if you care to know the gory details). In any case, please use the akeeba-backup.php CLI script instead of the web interface.
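For reference, a scheduled CLI backup usually boils down to a single crontab line along these lines. The PHP binary path and the site root below are placeholders; you'd need to adjust them for your own host:

0 3 * * * /usr/bin/php /home/myuser/public_html/cli/akeeba-backup.php

This runs the backup every day at 03:00 server time, entirely outside the web server, so neither Apache's timeout nor the web PHP time limit applies.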
There is a reason behind my suggestion:
- If you want a single backup archive no matter what, I assume it's because a single file is more convenient to transfer from Dropbox to some other local storage. Most likely that's best implemented as a scheduled backup. The native CLI backup script and a CRON job is THE way to do that.
- If you want to transfer your site to another server by means of a backup archive, you shouldn't be going through Dropbox. Going through Dropbox means that you have to wait for Dropbox to sync, then FTP the backup to the new site. That's a waste of time. Instead, you can use a small part size (5 to 10MB) and the Upload to Remote FTP post-processing engine, with the "Upload kickstart.php" option ticked. Taking a backup will then transfer the backup archive parts and kickstart.php to the new site, meaning that you only need to run Kickstart from your browser once the backup is done. No download/upload required, and you can even run this process from a tablet or a smartphone!
So, for all practical purposes, what you are trying to do makes little to no sense and, of course, won't work due to well-thought-out server restrictions. If you do lift those restrictions you jeopardise the stability of your server, shooting yourself in the foot in the long run.
If you have questions about the best way to move forward, please post back with your use case. I will be able to describe the best solution when I know exactly what you are trying to achieve.
Nicholas K. Dionysopoulos
Lead Developer and Director
🇬🇷Greek: native 🇬🇧English: excellent 🇫🇷French: basic • 🕐 My time zone is Europe / Athens
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!