i was flipping through beyond good and evil as i was trying to figure out that story, and again i came upon one of the passages that really makes me uncomfortable.
he's so good at piercing my illusions. but i can't get a grip on a positive response to his arguments. so i just feel a little nauseous.
in this case it was the beginning of section 194 in "on the natural history of morals." i'm actually not comfortable posting it here; it feels too insightful and personal and painful.
--
polanyi--the epistemologist/chemist i'm reading--raises some information-theory-based objections to evolution. i haven't actually gotten to the chapter where he really engages with them, so i don't know whether he believes them or not, but they remind me a lot of one of the big mathematical arguments made against evolution recently:
dembski and others argue that the no-free-lunch theorem ("[...] all algorithms that search for an extremum of a cost function perform exactly the same, when averaged over all possible cost functions."--from Wikipedia, which got it from somewhere else) applies to evolution. that is to say, fitness is clearly some sort of function (or set of functions), so any claim that a search strategy like evolution consistently reaches better fitness faster than random search is inherently spurious.
the most interesting explanation i've read of why this isn't true was something Dhruv linked to from Lawrence's wiki: imagine a topographical map whose surface is higher for mountains and lower for valleys. If I understand correctly, the no-free-lunch theorems say that you can't define a function f(x,y) that, iterated on a set of coordinates, gets you to the highest point faster than random search across all possible such maps. Why? Because every time you try to build a function that's better than random, you build information into it as assumptions, and those assumptions are wrong for some maps.
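To make that concrete for myself I wrote a toy sketch (entirely my own, not from the wiki page or from Dembski): a hill-climber that assumes "higher neighbors lead toward the peak" does great on one smooth mountain, and that built-in assumption is exactly the information the no-free-lunch theorems say has to fail it on other maps.

    import random

    def smooth_map(x, y):
        # one big mountain centered at (50, 50)
        return -((x - 50) ** 2 + (y - 50) ** 2)

    def hill_climb(height, steps=500):
        # assumes the map is smooth: always step to the highest neighbor
        x, y = random.randint(0, 100), random.randint(0, 100)
        for _ in range(steps):
            neighbors = [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
            x, y = max(neighbors + [(x, y)], key=lambda p: height(*p))
        return height(x, y)

    def random_search(height, steps=500):
        # no assumptions at all: just sample points and keep the best
        return max(height(random.randint(0, 100), random.randint(0, 100)) for _ in range(steps))

    print("hill climbing:", hill_climb(smooth_map))
    print("random search:", random_search(smooth_map))

On this map the climber wins easily; on a map that's flat everywhere except one spike, its assumption buys it nothing, which is (as I understand it) what the theorem's averaging over all maps captures.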
The argument whose author I can't remember says that this ain't a great model of evolution. Because most evolution is co-evolution--flowers becoming scented and bees learning to smell those scents--your function actually deforms the map. Fitness isn't really a pre-determined extremum; each organism changes the environment for all the other organisms trying to maximize their fitness, and vice versa, with accurate assumptions becoming steadily more accurate and inaccurate ones being dropped.
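And a loose sketch of that co-evolution point (again my own toy, not the original author's model): each "species" is just a number, and each one's fitness is measured against the other's current value, so the landscape each is climbing shifts under it as the other climbs.

    import random

    def flower_fitness(scent, smell):
        # flowers do best when their scent matches what bees can currently detect
        return -(scent - smell) ** 2 + 0.1 * scent

    def bee_fitness(smell, scent):
        # bees do best when their sensitivity matches the scent on offer
        return -(smell - scent) ** 2 + 0.1 * smell

    flower_scent, bee_smell = 1.0, 1.0
    for generation in range(200):
        # each side keeps a small random mutation only if it helps right now,
        # i.e. against the other side's current trait, not a fixed landscape
        new_scent = flower_scent + random.uniform(-0.2, 0.2)
        if flower_fitness(new_scent, bee_smell) > flower_fitness(flower_scent, bee_smell):
            flower_scent = new_scent
        new_smell = bee_smell + random.uniform(-0.2, 0.2)
        if bee_fitness(new_smell, flower_scent) > bee_fitness(bee_smell, flower_scent):
            bee_smell = new_smell

    print(flower_scent, bee_smell)  # both drift upward together

There was never a pre-set summit here: the "best" scent is whatever the bees can smell, and vice versa, and the two ratchet each other along.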
I keep imagining a similar graph of social behavior. The only ways to become a "better person": change behavior to better match the environment or change the environment to better match the behavior. Reminds me of the role of ritual I kept writing about when I was reading Wieseltier and Aristotle: I was caught up in the image of wearing ruts into the soil that one could follow until one understood again which directions to go. It's the same image (albeit topographically reversed) as with this map, in which by climbing I can make the mountain higher and see out over a more distant view.