Let’s be clear, this isn’t the fault of a single programmer. Everybody will eventually make a mistake. The real error we can learn from here is that it wasn’t caught by mitigating measures such as reviews, tests, and audits.
A Proton-M booster carrying GLONASS satellites crashed shortly after liftoff at Baikonur in 2013. The failure was caused by a gyroscope package that had been installed upside down. The receptacle had a metal indexing pin that should’ve prevented the incorrect installation; the worker simply pushed so hard that it bent out of the way.
When you make a foolproof design, God makes a better fool.
I think it was a different era, to borrow an awful phrase. In 1962 they were still figuring out best practices for reviews, tests, and audits. Even today, lone-hero output can get pretty far when processes aren’t followed.
But did leadership recognize that, or did the programmer catch the blame?