|
I've got a heap of archived data that the company won't let me delete for various reasons, but I'd like to get it off live storage so I don't have to keep expanding "expensive" storage and don't have to keep backing it up locally. I'm thinking of dumping it all onto S3, and I'm wondering what people do for backups of this type of data. It's not critical enough that I have to worry about redundancy beyond the default three AZs; the only thing I'd be protecting against is someone (me) accidentally pressing the wrong thing and deleting/overwriting something. Would S3 versioning be enough for that use case, or should I do something more?
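For concreteness, here's roughly what I mean by "turn on versioning," sketched with boto3. The bucket name is made up, and the live call is commented out since it needs real credentials; this just shows the shape of the config:

```python
# Sketch: enable S3 versioning so an accidental delete/overwrite is undoable.
# With versioning on, a DELETE only adds a delete marker and an overwrite
# creates a new version -- the old object stays retrievable by version ID.
import json

# Payload for put_bucket_versioning.
versioning_config = {"Status": "Enabled"}

# The actual call (needs AWS credentials, so commented out here):
# import boto3
# s3 = boto3.client("s3")
# s3.put_bucket_versioning(
#     Bucket="my-archive-bucket",  # hypothetical bucket name
#     VersioningConfiguration=versioning_config,
# )

print(json.dumps(versioning_config))
```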
|
# ¿ Oct 31, 2018 17:03 |
|
|
Does Glacier protect against someone fat-fingering something and wiping a directory? The Glacier Vault Lock write-once thing is overkill; I'm hoping that at some point I'll actually be allowed to delete the stuff.
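For what it's worth, the direction I'm leaning: versioning for the undo, plus a lifecycle rule to push the data down into Glacier for cost. A rough boto3 sketch (bucket name, rule ID, and day counts are all illustrative; the live call is commented out since it needs credentials):

```python
# Sketch: a lifecycle rule pairing versioning with Glacier. Current objects
# transition to the GLACIER storage class, and noncurrent (overwritten or
# deleted) versions stick around for a window so a fat-fingered mistake
# can be undone before they expire. Numbers here are examples, not advice.
lifecycle_config = {
    "Rules": [
        {
            "ID": "archive-to-glacier",          # illustrative rule name
            "Status": "Enabled",
            "Filter": {"Prefix": ""},            # apply to the whole bucket
            "Transitions": [
                {"Days": 0, "StorageClass": "GLACIER"}  # move immediately
            ],
            # Keep old versions 90 days before they're permanently removed.
            "NoncurrentVersionExpiration": {"NoncurrentDays": 90},
        }
    ]
}

# The actual call (needs AWS credentials, so commented out here):
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="my-archive-bucket",  # hypothetical bucket name
#     LifecycleConfiguration=lifecycle_config,
# )
```

Glacier by itself doesn't stop a bad delete; it's the versioning that gives you the safety net, and the noncurrent-version expiration is what keeps the undo window from costing storage forever.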
|
# ¿ Oct 31, 2018 17:55 |