- cross-posted to:
- technology@lemmy.zip
- programming@programming.dev
- bofh@group.lt
A week of downtime, and the servers were recovered only because the customer had a proper disaster recovery protocol and held backups somewhere else; Google deleted the backups it was storing along with everything else.
Google Cloud’s CEO says it “won’t happen anymore”. It’s insane that “instantly delete everything” is possible at all.
Bloody Harry ( @harry315@feddit.de ) 129•11 months ago
Remember people: The cloud is just someone else’s computer.
dan1101 ( @dan1101@lemm.ee ) 23•11 months ago
Yeah there’s that, and the fact that you have no control over how much the bill will be each renewal period. Those two things kept me off the cloud for anything important.
cmnybo ( @cmnybo@discuss.tchncs.de ) English 4•11 months ago
Most cloud providers have a way to set limits. Make sure you learn how to set appropriate limits to avoid unexpected bills.
IronKrill ( @IronKrill@lemmy.ca ) 4•11 months ago
The limits don’t matter if the provider raises their price next month.
Natanael ( @Natanael@slrpnk.net ) 3•11 months ago
And some services don’t support hard limits; you’d have to set up a script that monitors spend and literally takes down your service if you get near the max.
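On GCP, the closest thing to a hard cap is wiring that kill switch yourself: a budget publishes alerts to a Pub/Sub topic, and a small Cloud Function detaches the project’s billing account, which stops all paid services. A rough sketch of that pattern follows; the project ID and the Pub/Sub wiring are placeholder assumptions, not working config.

```python
# Sketch of a GCP billing "kill switch": a budget alert delivered via
# Pub/Sub triggers this Cloud Function, which detaches the project's
# billing account once spend exceeds the budget.
import base64
import json

from googleapiclient import discovery

PROJECT_ID = "my-project"  # hypothetical project ID


def stop_billing(event, context):
    """Entry point for a Pub/Sub-triggered function receiving budget alerts."""
    notification = json.loads(base64.b64decode(event["data"]).decode("utf-8"))

    # Budget notifications carry the current spend and the configured budget.
    if notification["costAmount"] <= notification["budgetAmount"]:
        return  # still under budget; do nothing

    billing = discovery.build("cloudbilling", "v1", cache_discovery=False)

    # An empty billingAccountName detaches billing from the project,
    # which halts all paid services (and whatever is running on them).
    billing.projects().updateBillingInfo(
        name=f"projects/{PROJECT_ID}",
        body={"billingAccountName": ""},
    ).execute()
```

It’s a blunt instrument: detaching billing takes everything down hard, so it only makes sense for spend you absolutely cannot exceed.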
я не из калининграда ( @imnotfromkaliningrad@lemmy.ml ) 11•11 months ago
that’s what i’ve been trying to explain to my family forever. their answer always amounts to something like “it would be illegal for them to look at my data!”, as if those companies would care.
umbrella ( @umbrella@lemmy.ml ) 3•11 months ago
in many cases “looking at my data!” is in their TOS
delirious_owl ( @delirious_owl@discuss.online ) 9•11 months ago
Unless it’s a self-hosted cloud. Then it’s your own computers.
Mossy Feathers (She/They) ( @MossyFeathers@pawb.social ) 96•11 months ago
“They said the outage was caused by a misconfiguration that resulted in UniSuper’s cloud account being deleted, something that had never happened to Google Cloud before.”
Bullshit. I’ve heard of people having their Google accounts randomly banned or even deleted before. Remember when the Terraria devs cancelled the Stadia port of Terraria because Google randomly banned their account and then took weeks to acknowledge it? The only reason Google responded so quickly this time is that the super fund manages over $100b and could sue the absolute fuck out of Google.
Pechente ( @Pechente@feddit.de ) English 30•11 months ago
This happened to me years ago. I suddenly got a random community-guidelines violation on YouTube for a 3-second VFX shot that was not pornographic or violent and that I owned all the rights to. After that, my whole Google account was locked down. I never found out what triggered this response, and I could never resolve the issue with them since I only ever got automated responses. Fuck Google.
umbrella ( @umbrella@lemmy.ml ) 2•11 months ago
one of my accounts was locked for no reason once. apparently i did well not to trust them with important data after that.
heluecht ( @heluecht@pirati.ca ) 23•11 months ago
@Moonrise2473 Regardless of what one thinks about “cloud” solutions, this is a good example of why you should always have an offsite backup.
Hirom ( @Hirom@beehaw.org ) 5•11 months ago
They had backups at multiple locations, and lost data at multiple (Google Cloud) locations because of the account deletion.
They restored from backups stored with another provider. It might have been far more devastating if they had relied exclusively on Google for backups. So having an “offsite backup” isn’t enough in some cases; that offsite location needs to be with a different provider.
heluecht ( @heluecht@pirati.ca ) 6•11 months ago
@Hirom By “offsite” I mean either a different cloud provider or your own hardware (if you keep your regular data at some cloud provider, as in this case).
Hirom ( @Hirom@beehaw.org ) 1•11 months ago
That would indeed be a good backup strategy, but it’s better to be specific. “Offsite” can be interpreted in different ways.
Tangentism ( @Tangentism@lemmy.ml ) 2•11 months ago
“It might have been far more devastating if they had relied exclusively on Google for backups.”
Which is why data held with a single cloud provider, no matter how many backups they keep, shouldn’t be seen as off-site.
Only when it is truly outside their ecosystem and cannot be touched by them should it be viewed as such.
If that company hadn’t built such resilience into its backup plan, it would be toast, with a derisory amount of compensation from Google.
Hirom ( @Hirom@beehaw.org ) 2•11 months ago
Having a backup at a cloud provider is fine, as long as there is at least one other backup that isn’t with this provider.
Cloud providers generally do a good job of protecting against hardware failure, but they handle arbitrary account bans poorly and sometimes have mishaps due to configuration problems.
A DIY backup solution, by contrast, is more exposed to hardware problems (disk failure, fire, flooding, theft, …), but there’s no risk of account problems.
A mix protects against the different kinds of issues.
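To make that concrete, here is a minimal sketch of keeping a copy of each backup with a second, unrelated provider, assuming two S3-compatible buckets; every name, endpoint, and credential profile below is a placeholder, not real config.

```python
# Sketch: push the same backup archive to two unrelated providers, so a
# single account deletion can't reach every copy.
import boto3

BACKUP_FILE = "backup-2024-05-12.tar.gz"  # placeholder archive name

# Two independent S3-compatible destinations at different companies
# (e.g. AWS S3 plus a second provider such as Backblaze B2).
DESTINATIONS = [
    # endpoint=None uses the provider's default endpoint (here, AWS S3).
    {"endpoint": None, "bucket": "backups-primary", "profile": "aws-backup"},
    {"endpoint": "https://s3.us-west-000.backblazeb2.com",
     "bucket": "backups-secondary", "profile": "b2-backup"},
]

for dest in DESTINATIONS:
    # Each provider has its own credentials, kept in separate named profiles.
    session = boto3.Session(profile_name=dest["profile"])
    s3 = session.client("s3", endpoint_url=dest["endpoint"])
    s3.upload_file(BACKUP_FILE, dest["bucket"], BACKUP_FILE)
    print(f"uploaded {BACKUP_FILE} to {dest['bucket']}")
```

The point isn’t the tooling; it’s that no single account deletion, at either company, can reach both copies.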
Tangentism ( @Tangentism@lemmy.ml ) 2•11 months ago
“as long as there is at least one other backup that isn’t with this provider.”
Which is exactly what I was saying.
Any services used with a cloud provider should be treated as one entity, no matter how many geo-locations they claim your data is backed up to, because the account is a single point from which all of those copies can be deleted.
When I was last involved in a company’s backups, we had a fire safe in the basement, an off-site location with another fire safe, and third copies that went to another company providing a backup storage service. For all backups to be deleted, someone would have had to go right out of their way to do so; it couldn’t happen through a simple deletion of one account.
That company had the foresight to do something similar, and it’s saved them.
Hirom ( @Hirom@beehaw.org ) 2•11 months ago
Okay, I misinterpreted your comment.
Tangentism ( @Tangentism@lemmy.ml ) 1•11 months ago
No, it’s all good. We’re on the same page about disaster recovery!
Simon ( @Simon@lemmy.dbzer0.com ) 10•11 months ago
Just an FYI in case you don’t follow cloud news: Google has deleted customers’ accounts on multiple occasions, and has been doing so for years. This time they just did it to someone large enough to make the news. I work in SRE and no longer recommend GCP to anyone.
This is the best summary I could come up with:
More than half a million UniSuper fund members went a week with no access to their superannuation accounts after a “one-of-a-kind” Google Cloud “misconfiguration” led to the financial services provider’s private cloud account being deleted, Google and UniSuper have revealed.
Services began being restored for UniSuper customers on Thursday, more than a week after the system went offline.
Investment account balances would reflect last week’s figures, and UniSuper said those would be updated as quickly as possible.
In an extraordinary joint statement from UniSuper CEO Peter Chun and the global CEO for Google Cloud, Thomas Kurian, the pair apologised to members for the outage, and said it had been “extremely frustrating and disappointing”.
“These backups have minimised data loss, and significantly improved the ability of UniSuper and Google Cloud to complete the restoration,” the pair said.
“Restoring UniSuper’s Private Cloud instance has called for an incredible amount of focus, effort, and partnership between our teams to enable an extensive recovery of all the core systems.”
The original article contains 412 words, the summary contains 162 words. Saved 61%. I’m a bot and I’m open source!