Support

Akeeba Backup for Joomla!

#22256 Backup to RackSpace Cloud Files re-opened

Posted in ‘Akeeba Backup for Joomla! 4 & 5’
This is a public ticket

Everybody will be able to see its contents. Do not include usernames, passwords or any other sensitive information.

Environment Information

Joomla! version
n/a
PHP version
n/a
Akeeba Backup version
n/a

Latest post on Friday, 17 April 2015 17:20 CDT

bubu678
I have read the tickets:
#21685 – Can't send backup to Backspace
#21062 – Rackspace

And some others.

So far, all of my sites that use Akeeba v3.11.3 upload to Rackspace Cloud Files with no problem. Versions 4.x have issues and rarely work. Yes, Akeeba has issues with Rackspace Cloud Files.

I also develop using WordPress. I use the free version of BackWPup (from www.inpsyde.com), which has the option to upload to Rackspace Cloud Files. I use it on all my WP sites and have never had an issue with it. Are they using the old API, which looks like it is still active?

I know we have beaten this to death, but I find it odd that BackWPup works. You also have a solution for WP, but I guess you have the same problem there.

Maybe look at the source code they use and compare it against yours. I will wait to see your findings before planning to use Amazon S3.

Boris

nicholas
Akeeba Staff
Manager
I'll have to repeat myself like a broken record. RackSpace said the old API would be retired in mid-November 2014. We had no problem with the old API. After RackSpace's stern warning that the old API was being retired we switched to the new API. The new API is unreliable. Most of the time it DOES work. Sometimes it doesn't. When it doesn't, it usually takes hours to days to get back in working order.

The thing is that you can't tell which API version a piece of software uses just by looking at what works and what doesn't. Whether the new API works or not depends on many factors, such as where your account was created, which geographical area the client is currently in, the time of day and presumably the phase of the moon or something. With one server in Dallas, USA running Akeeba Backup and one Mac in Athens, Greece running CyberDuck (also using the new API), both trying to upload files to the same RackSpace CloudFiles account, we observed on different days and times of day that:
  • Both were working (70% of the time)
  • CyberDuck in Athens was working a bit slowly, Akeeba Backup in Dallas was not working
  • CyberDuck in Athens was not working, Akeeba Backup in Dallas was working a bit slowly
  • Both were not working


Which means that using RackSpace CloudFiles with the new API is a crapshoot. Most of the time there's no problem. But when there IS a problem you're completely hosed.

All I can suggest is try to create a new bucket in the London geographic area (which is what I used for my tests) and if the transfer fails keep on trying every day for the next 20 or so days. At some point you'll see it works and from then on it will be working until the service becomes congested again and starts failing all the same.
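The "keep trying until it works" advice above could be automated along these lines. This is a minimal sketch: `flaky_upload` is a hypothetical stand-in for whatever upload call your backup tool actually makes, and the retry counts and delays are illustrative, not recommendations.

```python
import time

def upload_with_retries(upload, max_attempts=5, base_delay=1.0):
    """Call `upload()` until it succeeds or attempts run out.

    Waits longer after each failure (simple exponential backoff),
    mirroring the "try again later" advice for a flaky service.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return upload()
        except Exception:
            if attempt == max_attempts:
                raise  # give up after the last attempt
            time.sleep(base_delay * 2 ** (attempt - 1))

# Demonstration with a stand-in upload that fails twice, then succeeds:
calls = {"n": 0}

def flaky_upload():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("authentication server unavailable")
    return "uploaded"

print(upload_with_retries(flaky_upload, base_delay=0.01))
```

Of course, this only helps with transient failures; when the service is down for days, as described above, no retry policy will save you.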

And I have to repeat myself yet again. We DO use the RackSpace API published by RackSpace. We DID confirm that we are not doing anything stupid by checking our implementation against that of a known good piece of software (CyberDuck) which does let us manage RackSpace CloudFiles. The problem is not on our side. The problem sits squarely with RackSpace's pathetic implementation. Namely, their authentication server has severe reliability issues.

Nicholas K. Dionysopoulos

Lead Developer and Director

🇬🇷Greek: native 🇬🇧English: excellent 🇫🇷French: basic • 🕐 My time zone is Europe / Athens
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

bubu678
Thanks for the reply Nick.

I see your point. I guess "fanatical support" does have its limitations. I will bring it up with my contacts there and get their response.

On March 17 I got an email telling of a new product, RS CLOUD CDN. I thought CLOUD FILES was supposed to be that. Maybe there are some issues with CLOUD FILES. Interesting.

I will try out Amazon S3. What I don't know is the cost. They have some convoluted form-based method for working it out.

Will keep you posted.

Boris

nicholas
Akeeba Staff
Manager
CloudFiles is a file storage service. It's not a CDN. A CDN has edge nodes around the world, aiming for fast delivery of already uploaded content. It's like Amazon S3 and CloudFront: you upload the files to S3, and they get delivered everywhere around the world lightning fast (and a tad more expensively) via CloudFront.

Regarding storage costs, S3 and CloudFiles cost about the same. In fact, last time I checked (in November) CloudFiles was a wee bit more expensive, but we're talking about one cent of a dollar or so per GB for the first few GB of storage used. At the end of the day it translated to less than a dollar per year, so who cares, really? But yeah, their cost calculation form IS complicated. It's designed for large companies with Terabytes of data to store, Petabytes of data to deliver and hundreds of servers to run. For a small company like ours that form is serious overkill. Better to look up their price list; it's more straightforward!
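As a back-of-the-envelope sketch of why the price difference is negligible for a small backup: the per-GB prices below are illustrative assumptions for the sake of the arithmetic, not current rates from either provider.

```python
# Hypothetical per-GB-per-month storage prices (illustrative only).
S3_PRICE_PER_GB = 0.030          # assumed USD/GB/month
CLOUDFILES_PRICE_PER_GB = 0.032  # assumed: "a wee bit more expensive"

backup_size_gb = 2  # a typical small-site backup archive

s3_yearly = backup_size_gb * S3_PRICE_PER_GB * 12
cf_yearly = backup_size_gb * CLOUDFILES_PRICE_PER_GB * 12

print(f"S3:         ${s3_yearly:.2f}/year")   # $0.72/year
print(f"CloudFiles: ${cf_yearly:.2f}/year")   # $0.77/year
print(f"Difference: ${cf_yearly - s3_yearly:.2f}/year")
```

Under these assumed prices both come out to well under a dollar a year, which is the point being made above.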

Nicholas K. Dionysopoulos

Lead Developer and Director

🇬🇷Greek: native 🇬🇧English: excellent 🇫🇷French: basic • 🕐 My time zone is Europe / Athens
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

System Task
system
This ticket has been automatically closed. All tickets which have been inactive for a long time are automatically closed. If you believe that this ticket was closed in error, please contact us.

Support Information

Working hours: We are open Monday to Friday, 9am to 7pm Cyprus timezone (EET / EEST). Support is provided by the same developers writing the software, all of which live in Europe. You can still file tickets outside of our working hours, but we cannot respond to them until we're back at the office.

Support policy: We would like to kindly inform you that when using our support you have already agreed to the Support Policy which is part of our Terms of Service. Thank you for your understanding and for helping us help you!