Hi
What are the best settings to back up a site with a 40GB database and 60GB of files? :)
Server:
64GB RAM
16 Cores
Thanks for the help.
Best regards
Latest post by nicholas on Friday, 17 June 2022 00:00 CDT
You can definitely back up that site but it will be slow. The last person who reported a 100GiB backup told me it took just over 24 hours to complete. The slowest part of the process was, naturally, dumping the database.
Keep in mind that the backup engine only needs a single core and well under 128MiB of RAM, so your server is more than enough. We designed the backup to be frugal; your server is best used serving requests :)
Before going on, let's be clear: you should never run a backup of this size from the web (the site's backend, the legacy frontend backup URL, or the backup API). If you value your sanity, you will only ever run it from the CLI.
This means that the timing and fine-tuning settings are essentially ignored, so I won't comment on them.
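Since a multi-hour CLI backup must survive an SSH disconnect, it's worth running it detached. Here is a minimal sketch assuming the Joomla version of Akeeba Backup; the site root, PHP binary, and profile number are placeholders you'd adjust for your own installation:

```shell
# Placeholder paths -- adjust the site root and PHP binary for your server.
# nohup plus & keeps the multi-hour backup running after you log out;
# redirecting output to a file lets you check progress with tail -f.
nohup php /var/www/mysite/cli/akeeba-backup.php --profile=1 \
    > /var/www/mysite/backup-run.log 2>&1 &
```

Alternatively, run it inside screen or tmux so you can reattach to the live output later.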
The archiver options are your choice; you have plenty of memory on that server. Heck, you have more RAM and cores than my two dev machines combined. Just remember that raising the "large file" size threshold means each such file is read entirely into memory, compressed, and written to disk, which slows the backup down. Also remember that large files (over 1MiB by default) on a web server are typically already compressed in some form or another: images, videos, audio, PDFs. So I'd leave these settings as-is.
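If you're curious how much of your 60GB of files falls in that "large file" category, a quick check like the following works on any Linux server (the site path is a placeholder):

```shell
# Placeholder path -- point this at your actual site root.
# Lists files over 1MiB, largest first; on a typical site most of
# these are already-compressed media (JPEG, MP4, PDF, ZIP).
find /var/www/mysite -type f -size +1M -printf '%s\t%p\n' \
    | sort -rn | head -n 20
```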
Logs will take a LOT of space. Set the Log Level to None unless you run into a problem; this creates zero-sized log files.
The only pertinent options are the remote storage engine ones. You have 100GiB of data to back up. Do you have another 100GiB of working space to spare? If so, no problem: have the upload to remote storage happen at the end of the backup. If not, use a remote storage engine which allows immediate transfer of each backup archive part, and set the part size for split archives to 2000MiB (just under 2GiB, to stay on the safe side). The working space needed in the latter case is twice the part size plus 1MiB, i.e. just under 4GiB.
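The working-space figure is simple arithmetic, as the reply above states: twice the part size plus 1MiB. A quick sanity check:

```shell
# Working space for split archives with immediate part upload,
# per the formula above: 2 x part size + 1MiB.
PART_SIZE_MIB=2000
WORKING_MIB=$(( 2 * PART_SIZE_MIB + 1 ))
echo "Working space needed: ${WORKING_MIB} MiB"   # 4001 MiB, just under 4GiB (4096 MiB)
```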
That's all there is to it. It just takes a lot of patience because you are looking at a backup time between 17 and 25 hours, depending on the row structure of that 40GiB database!
Nicholas K. Dionysopoulos
Lead Developer and Director
🇬🇷Greek: native 🇬🇧English: excellent 🇫🇷French: basic • 🕐 My time zone is Europe / Athens
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!
Easter vacation: We will be closed from 17 April 2025 16:00 UTC to 21 April 2025 06:00 UTC due to observing the Christian Easter holiday. Support will be closed during that time for both new tickets, and replies to existing tickets.
Working hours: We are open Monday to Friday, 9am to 7pm Cyprus timezone (EET / EEST). Support is provided by the same developers writing the software, all of whom live in Europe. You can still file tickets outside of our working hours, but we cannot respond to them until we're back at the office.
Support policy: We would like to kindly inform you that when using our support you have already agreed to the Support Policy which is part of our Terms of Service. Thank you for your understanding and for helping us help you!