October 2, 2013
Note to self: Short attention spans don’t go well with long series of related posts.
I’ve been quite busy with a certain little person weighing 3,100 grams, with a thing for crying (at 3-hour intervals) in the middle of the night. Pure bliss nonetheless.
Until I get some sleep again, a little post I had buried in my drafts folder:
Years ago I read a review of Blade Runner, where the reviewer argued that the most important question in the movie is: What does the Replicant think? How does she feel about being a Replicant? About being a robot? How can a sentient being come to terms with the realization that it is a robot whose thinking is mechanically determined?
That question used to fascinate me. It doesn’t anymore. I think I know how Rachael must feel. Once you achieve a little bit of autognosis, who doesn’t feel like a robot? Consciousness is very cool, but in the end all our behavior is deterministic, and once you get a hold of someone’s character, individual behavior is predictable to the point of ennui.
Look at these people at the NYT (I know I said I wouldn’t read it again, but Sailer linked to it). How are they not robots? They say exactly the same things; they come up with the same excuses that any of us could have foreseen years ago. They’re programs, and not very complex ones at that.
Schopenhauer wrote extensively about character and fate: how once you know a person’s character (or your own) thoroughly enough, you can accurately predict its behavior in most cases. We only keep the fiction of autonomy going for lack of brainpower, or of attention toward other people.