An analogy I've heard bandied around over the years, not just by the Ruby folks, is the programmers and guns analogy. Give people freedom and they won't destroy the world, they'll be happy. Give a bunch of programmers the proverbial AK47 and they'll not start shooting each other.
For all the ways this is a bad analogy, and a bad model, none is worse than the fact that its basis is completely flawed.
Bad code is not something you have to pull the trigger on; it's the default. It's like bad art, or bad bricklaying. When you don't know any better, you make bad art, lay crooked bricks, and write crappy code. People aren't born into a state of understanding and skill. To extend this analogy, the gun is already firing, and you have to work to dodge it. Giving programmers the proverbial gun isn't giving them a tool. It's a gun that, once you've pulled the trigger, won't stop firing; you'd better hope you figure out how to aim it so it doesn't hit anyone else. Friendly fire in a war zone is a leading cause of death. Only extreme training reduces it, and it's rarely eliminated.
Of course, it's just coding, so there aren't lives at risk, right? Well, unless you're writing the software that flies a 747 through the sky, or a system that measures medication for the pilot who's flying the 747, or perhaps the code that set his smartphone alarm off two hours early, so now he's short on sleep when the emergency happens. It might feel like an edge case, until you consider all the ways the human parts of that system, or its dependent systems, like the flight crew, the ground staff, and the air traffic controllers, might be affected by a simple code bug that ends up being just as serious as a malfunction in the 747's guidance system.
Hyperbole? A little, but also, not entirely.