Tuesday, 11 May 2010

Sentences to Ponder

Here's an interesting thought:

This is what makes me slightly cross; we have reasons to believe some of our mental faculties should have evolved to be truthful, such as aspects of vision, but there is no reason our moral feelings should have evolved to make us benefit others consistently, so it is dishonest of us to pretend that following them somehow will.

Katja Grace

Although there's a healthy dose of self-awareness:
As a friend pointed out, I am hypocritically motivated by disgust at others’ hypocritical motivation by disgust.


  1. AKA morality doesn't scale.

    There is a quote from Friedrich von Hayek about this from an economic perspective which I have been trying to track down for a long time. Something about the futility of applying the mores of a small hunter-gatherer group (altruistic helpfulness etc.) to society at large.

  2. Does this mean that the larger society has no option but to learn a better set of codes by trial and error? And even if it does "learn", there's no reason to suppose that the new code would "feel right", at least not for a very long time. Then again, research suggests that the gut feeling is much smarter than rational deliberation in a lot of situations, so the rational guy appears naive and limited. Damn confusing.

  3. Sigh9: Agreed. Although even at the individual level, I think that it predicts that we're not particularly moral. If my understanding is correct, it predicts that we'd overtly have admirable instincts, then covertly betray them. Scaling to the level of society would presumably amplify those problems that already exist, as well as introducing new, complex problems.

    Aida: I don't see that it has to be trial and error. We're certainly capable of conscious design.

    The new code almost certainly wouldn't feel right, but I think we could adapt as a culture. There are plenty of things (racism, for instance) that probably have roots in our evolved past (suspicion of other groups) but that we've learnt, at least overtly, to control. And there are other things, like reading, that we didn't evolve to do but train ourselves to do anyway.

    I don't think that gut feeling is so great in evolutionarily novel situations, and this would seem to count as one, since we'd be deliberately trying to fix the bugs in our evolved code.