Many affs making security claims on the last two topics have responded to Ks with an argument along the lines of: we can’t ignore the worst case scenarios, to do so would cause disaster. Cards I would lump into this broad category include that Macy SFP-is-backwards card, Fitzsimmons, most of the fear good cards like Sandman, a bunch of Krauthammer cards about terror/attacks on the US, etc.
Most negatives respond to this by reading the monkeys throwing darts card, and maybe a Bleiker card about how prediction hurts agency with no impact. I don’t really think these cards are adequate.
Predictions fail is pretty good defense, but it doesn’t really address the offensive claim that if we don’t consider predictions bad things will happen (either because the crazies will take over the political or because when we ignore our fears they remanifest themselves). To a certain extent you could say the idea that fears remanifest etc. is itself a prediction, and therefore will fail! But I think that it is probably better to just read a more specific card.
Predictions “hurt” agency can be a decent argument, but it usually requires a lot more time investment/explanation than the neg invests in it to flesh out the impact and explain it. Like the above, however, it is not totally responsive. You could obviously say that the claim we must predict or face doom is the link, but again that is sort of lame.
The reason I think these two arguments are sometimes inadequate is that the judges who like these predictions good arguments tend to lean more towards policy than K on the spectrum. For these judges, engaging substantively with the merits of predictions is a better option. There isn’t really a stock card to go to on that front though, so it may take some research. Here is a card I found recently that I think is moving in the right direction:
At a security conference recently, the moderator asked the panel of distinguished cybersecurity leaders what their nightmare scenario was. The answers were the predictable array of large-scale attacks: against our communications infrastructure, against the power grid, against the financial system, in combination with a physical attack.
I didn’t get to give my answer until the afternoon, which was: “My nightmare scenario is that people keep talking about their nightmare scenarios.”
There’s a certain blindness that comes from worst-case thinking. An extension of the precautionary principle, it involves imagining the worst possible outcome and then acting as if it were a certainty. It substitutes imagination for thinking, speculation for risk analysis, and fear for reason. It fosters powerlessness and vulnerability and magnifies social paralysis. And it makes us more vulnerable to the effects of terrorism.
Worst-case thinking means generally bad decision making for several reasons. First, it’s only half of the cost-benefit equation. Every decision has costs and benefits, risks and rewards. By speculating about what can possibly go wrong, and then acting as if that is likely to happen, worst-case thinking focuses only on the extreme but improbable risks and does a poor job at assessing outcomes.
Second, it’s based on flawed logic. It begs the question by assuming that a proponent of an action must prove that the nightmare scenario is impossible.
Third, it can be used to support any position or its opposite. If we build a nuclear power plant, it could melt down. If we don’t build it, we will run short of power and society will collapse into anarchy. If we allow flights near Iceland’s volcanic ash, planes will crash and people will die. If we don’t, organs won’t arrive in time for transplant operations and people will die. If we don’t invade Iraq, Saddam Hussein might use the nuclear weapons he might have. If we do, we might destabilize the Middle East, leading to widespread violence and death.
Of course, not all fears are equal. Those that we tend to exaggerate are more easily justified by worst-case thinking. So terrorism fears trump privacy fears, and almost everything else; technology is hard to understand and therefore scary; nuclear weapons are worse than conventional weapons; our children need to be protected at all costs; and annihilating the planet is bad. Basically, any fear that would make a good movie plot is amenable to worst-case thinking.
Fourth and finally, worst-case thinking validates ignorance. Instead of focusing on what we know, it focuses on what we don’t know — and what we can imagine.