Comrade Angles uses witches.town. You can follow and interact with them if you have an account somewhere in the "fediverse".

Hmm. Random thought: Do you think it's morally wrong to try and control the actions of future versions of yourself? Like, to try and commit yourself to something you can't get out of, specifically because you think future you might want to get out of it? :/

Comrade Angles @Angle

What about for very long term decisions? Is there a maximum time span after which attempting to control yourself is unethical? :/

@Angle Nope. So long as you've got sanity clauses built in, or have worked through the ethics of a solid precommitment. I'd say this is a function of the action rather than the precommitment: if the action is moral and ethical to take now in response to whatever imagined scenario, then it's moral and ethical to take then, so long as there is an onto relationship between the set of decision criteria established and the action.

Of course, if the signal proxies are poor proxies, that...

@Angle can be quite problematic, but that's again... a general case.

There is nothing "problematic" about this sort of contract space qua itself.

@Angle It *is* worth considering why you might want to be able to defuse the contract in the future, though, and to have viable methods for aborting if conditions are such that abandoning the precommitment matches its own precommitment.

@Angle Mutual suicide via your past self's poor planning is... not wise.

@Angle I dunno about "unethical", but there's fiction about oaths gone wrong for a reason (e.g. the Oath of Fëanor in _The Silmarillion_, in fantasy; in serious drama, the Sean Penn film _The Pledge_ comes to mind).