There are certain classes of exciting problems that surface only in massively distributed systems. This post is about one of them. It's rare, it's real, and when it happens, it will take your system down. The root cause, however, is easy to overlook.
It's surprising how quickly the volume of data on the Internet is growing around the world. Who would have thought 10 years ago that a physics experiment would one day generate 25 petabytes (26,214,400 GB) of data yearly? Yes, I'm looking at you, LHC. Times are changing, and companies are changing with them. Everyone designs for scale a tad differently. And that's good, because it's important to design for the right scale.