The trace actually confirms what I said.
Note that the code we use to connect to the S3 API has not changed. It's https://github.com/akeeba/s3, which we also use in Akeeba Backup itself. However, Akeeba Backup never lists files; it only writes and deletes them. This is why you never noticed a problem there: that part of the code simply does not run.
However, you're right in saying that you didn't observe an error in the past. That was a bug which has since been fixed :) Previous versions of UNiTE ignored the S3 API returning an empty listing and tried to download just the one file (.jpa, .jps, or .zip) you had specified in the job XML file. Since you had single-part archives, I suspect you never noticed this was a problem.
In this version we use the progress bar from the Symfony Console package, passing it the total size of all part files, as reported by the S3 API, as its max length argument. When the S3 API fails to return a file listing, this total is zero, which causes the progress bar to emit an error. The root cause is neither the S3 API code (which hasn't changed and is also used in Akeeba Backup) nor the Symfony progress bar; it's that the S3 API returns no file listing.
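To illustrate the failure mode, here is a minimal sketch (hypothetical code, not the actual UNiTE source): the progress bar's max value is derived by summing the sizes of the part files returned by the S3 listing, so an empty listing yields a max of zero.

```python
# Hypothetical sketch of the logic described above (not the actual UNiTE code).
def total_archive_size(listing):
    """Sum the sizes of all backup part files returned by the S3 listing."""
    return sum(item["size"] for item in listing)

# Normal case: the listing contains all part files of the backup archive.
parts = [
    {"name": "site.jpa", "size": 1_048_576},
    {"name": "site.j01", "size": 524_288},
]
print(total_archive_size(parts))  # 1572864 -- used as the progress bar's max

# Failure case: the S3 API returns no listing at all, so the computed
# total is 0, and passing 0 as the progress bar's max triggers the error.
print(total_archive_size([]))  # 0
```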
For what it's worth, it's not a problem in UNiTE's logic either. I have tested this with a multi-part (10 parts) archive using Amazon S3 proper and it works.
It would appear that something's up with Wasabi. Can you try using your credentials with s3cmd, using its ls command, to list the contents of your bucket? s3cmd is a mature, independent Python implementation of an S3 client, so it gives us a good second opinion. This will tell us whether the problem lies with the Wasabi credentials, or at least with their implementation of the S3 API.
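For example, something along these lines (the bucket name and keys are placeholders, and the endpoint shown assumes Wasabi's default us-east-1 region; adjust for your region):

```shell
# Placeholder credentials and bucket name -- replace with your own.
s3cmd --access_key='YOUR_ACCESS_KEY' \
      --secret_key='YOUR_SECRET_KEY' \
      --host='s3.wasabisys.com' \
      --host-bucket='%(bucket)s.s3.wasabisys.com' \
      ls s3://your-bucket-name
```

If this command cannot list your backup part files either, the problem is on the Wasabi side rather than in UNiTE.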
Nicholas K. Dionysopoulos
Lead Developer and Director
🇬🇷Greek: native 🇬🇧English: excellent 🇫🇷French: basic • 🕐 My time zone is Europe / Athens
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!