FYI, the archive fetch does use multiple page loads, just like backing up, extracting the backup archive, restoring the database, or... you get the idea. I use this pattern everywhere in my software where I expect a non-trivial amount of time to elapse doing something. I'm the only one using this much harder, but far better working, method to work around PHP script timeouts. Everyone else uses the "infinite PHP time limit and a prayer" method, which works less than half of the time but is cheaper to develop and maintain by four orders of magnitude. I guess the difference is that I write software to maximize the number of people I can help; other developers write software to maximize the number of people they can easily take money from. Different goals, different means.
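To illustrate the multiple-page-loads pattern in the abstract (this is a hedged sketch, not Akeeba's actual implementation; the state dictionary and the "work unit" are stand-ins for the real persisted engine state):

```python
import time

def run_step(state, time_budget=5.0):
    """Process as many small work units as fit in the time budget, then
    return; the client immediately requests the next step (page load),
    so no single request ever approaches the PHP script timeout."""
    start = time.monotonic()
    while state["todo"] and time.monotonic() - start < time_budget:
        item = state["todo"].pop(0)
        state["done"].append(item)  # stand-in for one small unit of real work
    state["finished"] = not state["todo"]
    return state  # in real life this state is persisted between requests

# Driving loop (in reality this is the browser reloading the page):
state = {"todo": list(range(100)), "done": [], "finished": False}
while not state["finished"]:
    state = run_step(state)
```

The point is that each request does a bounded amount of work and saves its state, so progress survives any per-request limit.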
But I digress. The problem you have is that not all remote storage services allow chunked downloads. For example, Amazon S3 lets us download 1MB at a time. That takes between 0.5 and 10 seconds on most servers, allowing us to split the work into roughly 5 to 10 second work units. FTP, however, does not allow chunked downloads; we have to download an entire archive part. If your archive part is 2GB, we'll have to try to download 2GB all at once, which will, of course, cause a timeout.
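A chunked download of the kind S3 supports boils down to HTTP Range requests. A minimal sketch (the URL, helper names, and 1MB chunk size are illustrative assumptions, not Akeeba's code):

```python
import urllib.request

CHUNK = 1024 * 1024  # 1 MB per request, as in the S3 example above

def chunk_ranges(total_size, chunk=CHUNK):
    """Split a remote file of total_size bytes into (offset, length)
    work units, each small enough to fetch well within a timeout."""
    return [(off, min(chunk, total_size - off))
            for off in range(0, total_size, chunk)]

def fetch_chunk(url, offset, length=CHUNK):
    """Fetch bytes [offset, offset + length) with an HTTP Range header.
    S3 and plain HTTP servers honour Range; FTP has no equivalent,
    which is why FTP forces whole-part downloads."""
    req = urllib.request.Request(
        url, headers={"Range": f"bytes={offset}-{offset + length - 1}"})
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```

Each `(offset, length)` pair becomes one work unit in the step loop, which is exactly why a service without Range support breaks the pattern.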
Another related problem might be that your site's backend is, as you said, running through CloudFlare. This is a bad idea for several reasons; the backend should not be cached by the CDN. If you've set up CloudFlare to basically do a pass-through to your server for /administrator URLs you can ignore this paragraph; I've read your ticket as saying that CloudFlare caches the backend pages.
Most likely, your solution lies on the backup side of things. Tell me which part size for split archives you're using and which remote storage service / method you're using for your backups. I will tell you which hard technical limitations you cannot overcome and how to reconfigure your backups so you can avoid them.
Meanwhile, here is what you can do with your existing archives. Download them manually and put them in the backup output directory where Akeeba Backup is looking for them. If you're not sure where that is, check the info icon of the backup entry on the Manage Backups page; it will tell you where the backup archive is expected to be. Once the files are there, reload the Manage Backups page and you can use Restore on those backups.
Nicholas K. Dionysopoulos
Lead Developer and Director
🇬🇷Greek: native 🇬🇧English: excellent 🇫🇷French: basic • 🕐 My time zone is Europe / Athens
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!