More than half a million UniSuper fund members went a week with no access to their superannuation accounts after a “one-of-a-kind” Google Cloud “misconfiguration” led to the financial services provider’s private cloud account being deleted, Google and UniSuper have revealed.
And the crazy part is that it sounds like Google didn't have backups of this data after the account was deleted. The only reason they were able to restore the data was that UniSuper had a backup with another provider.
This should make anyone really think hard about the situation before using Google's cloud. Sure, it is good practice, and frankly refreshing, to hear that a company actually backed up away from its primary cloud infrastructure, but I'm surprised Google themselves do not keep backups for a while after an account is deleted.
The IT guy who set up that backup deserves a hell of a bonus.
A lot of people would have been happy with their multi-region resiliency and stopped there.
No, they had backups. They deleted those, too.
A replica is not a backup.
Google Cloud definitely backs up data. Specifically, I said the surprise here is that those backups are gone (or unrecoverable) immediately after the account is deleted.
Actually, it highlights the importance of a proper distributed backup strategy and disaster recovery plan.
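To make that concrete, here is a minimal sketch of the kind of cross-provider copy being talked about here. It assumes the google-cloud-storage and boto3 libraries with credentials configured for both clouds, and the bucket names are made up:

```python
# Minimal sketch of a cross-provider backup: copy every object in a
# Google Cloud Storage bucket out to an S3 bucket on AWS, so a deleted
# GCP account is not the only place the data lives.
# Bucket names are hypothetical; assumes google-cloud-storage and boto3
# are installed and credentials for both providers are configured.
import os
import tempfile

import boto3
from google.cloud import storage

GCS_BUCKET = "primary-fund-backups"   # hypothetical source bucket on GCP
S3_BUCKET = "offsite-fund-backups"    # hypothetical destination bucket on AWS


def replicate_bucket() -> None:
    gcs = storage.Client()
    s3 = boto3.client("s3")

    for blob in gcs.list_blobs(GCS_BUCKET):
        # Stage each object locally, then push it to the second provider.
        tmp = tempfile.NamedTemporaryFile(delete=False)
        tmp.close()
        blob.download_to_filename(tmp.name)
        s3.upload_file(tmp.name, S3_BUCKET, blob.name)
        os.remove(tmp.name)
        print(f"copied {blob.name}")


if __name__ == "__main__":
    replicate_bucket()
```

In practice you would run something like this on a schedule and keep versioned copies, but the point is the second copy lives under a different provider and a different account.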
The same can probably happen on AWS, Azure, or any data center, really.
Uh, yeah, that’s why I said
Sure, if you colocate in another datacenter and it isn't your own, they aren't backing your data up without some sort of other agreement and configuration. I'm not sure about AWS, but Azure actually has offline, geographically separate backup options.
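I'm reading that as geo-redundant storage plus Azure Backup's Recovery Services vaults. As a rough sketch of the geo-redundant piece only, assuming the azure-identity and azure-mgmt-storage packages, with placeholder subscription, resource group, account name and region:

```python
# Rough sketch: provision an Azure storage account with geo-redundant
# replication (Standard_GRS), so data is replicated to a paired region.
# Subscription ID, resource group, account name and region are placeholders;
# assumes the azure-identity and azure-mgmt-storage packages.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import Sku, StorageAccountCreateParameters

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

client = StorageManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

poller = client.storage_accounts.begin_create(
    resource_group_name="dr-backups-rg",   # hypothetical resource group
    account_name="drbackupsgrs001",        # hypothetical, must be globally unique
    parameters=StorageAccountCreateParameters(
        sku=Sku(name="Standard_GRS"),      # geo-redundant replication
        kind="StorageV2",
        location="australiaeast",
    ),
)
account = poller.result()  # block until provisioning finishes
print(account.primary_endpoints.blob)
```

Of course, geo-redundancy alone still lives under one account, which is exactly the failure mode in this story; the vault-based options add retention and soft-delete on top.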
I use AWS to host a fair number of servers and some microservices, and with them, if you don't build backups into your architecture design and the live data gets corrupted, you are screwed.
They give you the tools to build it all, but it is up to you as the sysadmin/engineer/dev to actually use those tools.
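As one concrete (and simplified) example of using those tools: snapshot an EBS volume and copy the snapshot into a second region, so corrupted live data is never the only copy. The volume ID and regions below are placeholders; this assumes boto3 with credentials allowed to call EC2 in both regions:

```python
# Minimal sketch: snapshot an EBS volume, then copy the snapshot into a
# second AWS region so corrupted or deleted live data is never the only copy.
# Volume ID and regions are placeholders; assumes boto3 with credentials
# allowed to call EC2 in both regions.
import boto3

SOURCE_REGION = "ap-southeast-2"
DEST_REGION = "us-west-2"
VOLUME_ID = "vol-0123456789abcdef0"  # hypothetical volume


def backup_volume_cross_region() -> str:
    src = boto3.client("ec2", region_name=SOURCE_REGION)
    dst = boto3.client("ec2", region_name=DEST_REGION)

    # 1. Snapshot the live volume in the primary region.
    snap = src.create_snapshot(VolumeId=VOLUME_ID, Description="nightly backup")
    src.get_waiter("snapshot_completed").wait(SnapshotIds=[snap["SnapshotId"]])

    # 2. Copy the completed snapshot into the secondary region.
    copy = dst.copy_snapshot(
        SourceRegion=SOURCE_REGION,
        SourceSnapshotId=snap["SnapshotId"],
        Description="cross-region copy of nightly backup",
    )
    return copy["SnapshotId"]


if __name__ == "__main__":
    print("copied snapshot:", backup_volume_cross_region())
```

None of that runs unless someone schedules it and tests the restore path, which is the "up to you" part.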