As seen in: Mutually Assured Destruction
- Freeze, punk! I’ve got you in my pistol’s sights. You can either come over and let me handcuff you, or I’ll blow your brains out.
- Well, if you're going to shoot me, then I’ll shoot you.
- Okay, then I guess nobody’s getting shot and you’ll be coming with me.
- No no no. That’s not right.
- Look, I’m putting down my - wait, what?
- That’s not right. Your behavior is suboptimal.
- What do you mean?
- Once you put down your gun, how do you know I won’t just shoot you anyway? From my perspective, I could either shoot you and get away, or come with you and go to jail. And presumably, you don’t want to be shot. So, putting away your gun is a dominated strategy. Do you understand?
- So you…want me to shoot you?
- Well, technically we both want you to shoot me. That’s the Nash equilibrium.
- Wait a minute, I took a game theory class once…isn’t this a case of mutually assured destruction?
- Not quite - the destruction isn’t mutually assured if there’s no possibility of retaliation in the case of non-cooperation. If you put your gun away and then get shot, for example, you can’t then shoot me back because you’ll be dead. See?
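The criminal’s argument can be checked with a toy normal-form game. This is a sketch, not anything from the scene itself: the action names and payoff numbers below are invented for illustration, chosen so that shooting strictly dominates for the criminal (escape beats jail, jail beats nothing) and holstering is therefore a losing move for the cop.

```python
from itertools import product

# Hypothetical payoffs (cop, criminal) for the standoff. Numbers are
# illustrative only: higher is better for that player.
payoffs = {
    ("shoot",   "shoot"):     (-5, -5),   # shootout: bad for both
    ("shoot",   "surrender"): ( 0, -10),  # criminal gets shot
    ("holster", "shoot"):     (-10, 5),   # cop gets shot, criminal escapes
    ("holster", "surrender"): ( 5, -5),   # arrest: the cop's hoped-for outcome
}
cop_actions = ["shoot", "holster"]
crim_actions = ["shoot", "surrender"]

def is_nash(c, r):
    """A profile is a pure Nash equilibrium if neither side gains by deviating."""
    cop_ok  = all(payoffs[(c, r)][0] >= payoffs[(alt, r)][0] for alt in cop_actions)
    crim_ok = all(payoffs[(c, r)][1] >= payoffs[(c, alt)][1] for alt in crim_actions)
    return cop_ok and crim_ok

equilibria = [p for p in product(cop_actions, crim_actions) if is_nash(*p)]
print(equilibria)  # [('shoot', 'shoot')]
```

Under these assumed payoffs, "shoot" is the criminal’s best response no matter what the cop does, so the cop holstering is exactly the dominated strategy the criminal complains about, and the only pure equilibrium is the one nobody in the scene actually wants to act out.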
- Well then… (intensely grips trigger) I have just one more question for you, punk.
- (loads bullet into chamber)
- What if we’re in a situation where we are each receiving incomplete information, so we are each only able to observe a signal of the other’s behavior that has some probability of being wrong?
- That’s a great question - the canonical approach is to update our beliefs in accordance with a Bayesian framework. You know what? Let me grab a chalkboard quickly and we can really hash this one out.
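The chalkboard version of that Bayesian update is short. The prior and the signal accuracy below are made-up numbers for illustration: each player starts 50/50 on whether the other will cooperate, and observes a signal that reports the other’s true action correctly 80% of the time.

```python
# Hypothetical numbers: prior belief that the other player cooperates,
# and the probability the observed signal matches their true action.
prior_cooperate = 0.5
signal_accuracy = 0.8

# Suppose we observe a "defect" signal. Bayes' rule:
# P(coop | defect signal) = P(defect signal | coop) * P(coop) / P(defect signal)
p_signal_given_coop   = 1 - signal_accuracy   # the noisy signal flipped
p_signal_given_defect = signal_accuracy
p_signal = (p_signal_given_coop * prior_cooperate
            + p_signal_given_defect * (1 - prior_cooperate))
posterior_cooperate = p_signal_given_coop * prior_cooperate / p_signal
print(round(posterior_cooperate, 3))  # 0.2
```

So one noisy "defect" signal drops the belief in cooperation from 0.5 to 0.2 rather than to 0, which is why repeated-game strategies that face noisy observation (as in the final question) punish more forgivingly than a hair-trigger would.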