There is a post at Less Wrong today called “Fixing akrasia: damnation to acausal hell”.
Akrasia means weakness of will. It’s a recurring theme on Less Wrong – it has its own entry at the LW wiki – for two reasons. First, self-improvement is part of the LW culture. Second, some activities are overwhelmingly important but get hardly any social support (e.g. Friendly AI, cryonics), so unusual measures are required for people to keep at them.
Precommitment is another concept with a special aura in LW-land, owing to its role in decision theory. If an agent can genuinely guarantee in advance that it will act in a certain way, that affects how other agents will model it, which can in turn have desirable consequences.
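To make the decision-theoretic point concrete, here is a minimal sketch of the classic entry-deterrence game, with hypothetical payoffs I’ve chosen for illustration. Without commitment, the incumbent’s ex-post best reply to entry is to accommodate, so the challenger enters; if the incumbent could genuinely precommit to fighting, the challenger’s best response flips to staying out, and the incumbent ends up better off:

```python
# Toy entry-deterrence game (hypothetical payoffs, for illustration only).
# Payoffs are (challenger, incumbent).
PAYOFFS = {
    ("enter", "fight"): (-1, -1),
    ("enter", "accommodate"): (1, 1),
    ("stay_out", None): (0, 2),
}

def incumbent_response(committed_to_fight: bool) -> str:
    """The incumbent's move if the challenger enters."""
    if committed_to_fight:
        return "fight"
    # Without commitment, pick the ex-post best reply to entry.
    return max(["fight", "accommodate"],
               key=lambda move: PAYOFFS[("enter", move)][1])

def challenger_choice(committed_to_fight: bool) -> str:
    """The challenger anticipates the incumbent's (possibly committed) response."""
    response = incumbent_response(committed_to_fight)
    enter_payoff = PAYOFFS[("enter", response)][0]
    stay_out_payoff = PAYOFFS[("stay_out", None)][0]
    return "enter" if enter_payoff > stay_out_payoff else "stay_out"

print(challenger_choice(committed_to_fight=False))  # enter
print(challenger_choice(committed_to_fight=True))   # stay_out
```

The commitment never has to be carried out: its value lies entirely in how it changes the other agent’s model of you – which is exactly why the game-theoretic analyses assume it is ironclad.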
The author of today’s post observes that Roko’s Basilisk involves “precommitting” to work at making the future AI that will punish you if you don’t measure up. But he also observes that human beings cannot presently precommit in the comprehensive, ironclad, absolute way supposed in game-theoretic analyses. His conclusion is that “fixing akrasia” is actually risky, because it would enable the kind of psychologically absolute commitment required for the demonic deal to go through.
At one level, I’m charmed by the drama and the mental acrobatics here. It turns out that one of the great goals of “extreme-rationalist” self-enhancement actually opens the way to one of its great bogeymen! If this were science fiction, it would be a great plot twist. This is quality conceptual entertainment, just like Roko’s original posts. It’s a worthy extra twist to the whole affair.
At another level, I think it’s crazy to actually be worrying about this. I’m not too worried that someone is worried about it, because reality contains too many bizarre possibilities, and there’s always someone somewhere who freaks out over them. People worry about the multiverse, about determinism, about solipsism, you name it.
But mostly, I was interested to see the basilisk tiptoeing back into the discourse. It is named nowhere in the post, and the author endorses its banishment from public discussion, but the post is nonetheless all about such “risks” of “acausal trade”. It will be interesting to see whether the post is allowed to stand.