Ages ago, there was a time when my dad would mail backup tapes for offsite storage, because their databases were large enough that it was faster to send them through snail mail.
It should also be noted his databases were huge (they'd be bundled into 70-pound packages and shipped certified).
Just a couple of years ago I was sent a dataset by mail, around 1TB on a hard drive.
Later I worked on visualization of large datasets; we didn't have the space to store them locally because they were up to a PB.
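The stories above are the classic "never underestimate the bandwidth of a station wagon full of tapes" trade-off. A quick back-of-envelope sketch shows why mail wins as the dataset grows (the transit time and link speed here are illustrative assumptions, not figures from the thread):

```python
# Effective bandwidth of mailing a drive vs. downloading over a network link.
# Assumed numbers: 2 days in transit, 100 Mbit/s sustained link.

TB = 10**12  # bytes
PB = 10**15  # bytes

transit_seconds = 2 * 24 * 3600  # assumed 2-day shipping

def mail_bandwidth_bps(size_bytes: int) -> float:
    """Effective bits/second of shipping `size_bytes` physically."""
    return size_bytes * 8 / transit_seconds

net_bps = 100 * 10**6  # assumed 100 Mbit/s link

# 1 TB drive: the network is roughly competitive.
print(f"1 TB by mail: {mail_bandwidth_bps(TB) / 1e6:.0f} Mbit/s effective")
print(f"network:      {net_bps / 1e6:.0f} Mbit/s")

# 1 PB of tapes: mail is effectively a ~46 Gbit/s pipe.
print(f"1 PB by mail: {mail_bandwidth_bps(PB) / 1e9:.1f} Gbit/s effective")
```

With these assumptions, a mailed 1 TB drive delivers about 46 Mbit/s effective, so a fast link beats it, but at a petabyte the parcel is the far fatter pipe.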
We’re storing data in peanut butter? Please tell me there’s jam involved.
/j it’s amazing we’re talking about petabytes. My first computer had like 600 meg. (Pentium 486 cobbled out of spare old parts from my dad’s junk “parts” rack.)
😁 Ya, my first “computer” was a ZX-81 with 1 kB of RAM; type too much and it was full! A card with a whopping 16 kB later came to the rescue.
It’s been a wild time in history.
Mail the dataset in a standards-compliant way, like RFC 1149. Don’t forget that the carrier should be an avian carrier.
“Local” is a very vague word. It can be argued that anything that doesn’t fit into the L1 cache is not local.