Look, in any area where we let humans do things, every once in a while there will be a big screwup; that is the sort of creatures humans are. And if you won’t decrease regulation without a screwup, but will increase it with a screwup, then you have a regulation ratchet: it only moves one way. So if you don’t think a long period without a big disaster calls for weaker regulation, but you do think a particular big disaster calls for stronger regulation, then apparently you want the maximum possible regulation, which is probably to just outlaw that activity entirely. And if that doesn’t seem like the right level of regulation to you, well then maybe you should reconsider your ratchety regulation intuitions.
Bryan Caplan has a good quote on this topic:
George Stigler famously observed, "If you never miss a plane, you're spending too much time at the airport." I heard that he wasn't amused by his secretary's corollary, "If you never make a typo, you're typing too slow."
I'll have to send that one to my boss.