Discussion about this post

Bob CLuness

Damn. Gonna have to squeeze this last-minute style into my PhD footnotes (after I've listened to it first!)

Robert Shepherd

I agree with almost all of this! I suppose what I don’t understand is… after a certain point, why use the term “intelligence” at all?

When I disagree with people like Bostrom, it’s usually because I think their arguments are anthropomorphic in ways they might not have noticed. And “intelligence” as an idea feels like it often bends towards a human view of the world, where what’s being talked about here often bends away.

I would tend to see the world in terms of attractor spaces, I think. There are solutions to things which work well; there are many paths to approaching those solutions; and iterative processes which accumulate their results are enough to get you there.

To me, “people make mistakes and record the things that work” is not so different from “some organisms die, and the ones which survive inherit locally useful genes.” Both are means of approaching the sort of transcendental solution space I think you’re talking about, and neither necessarily involves intelligence.

In fact, I think a lot of what we think of as being intelligently created might be blindly done: the world we see as created by human intelligence is really created by iterative process.

But in that case, I think “intelligence” is maybe just a dangerous concept generally; a hubristic one. If it were the case that *almost every* example of a complex entity in the world came about through iteration towards an evolutionary attractor state, and we’d consistently gone “it must be a human-like mind, creating things with human-like genius!” then… I think one might be skeptical that we’ll see something like that with AI?

I would guess we’d see something mad and unpredictable instead, because this kind of transcendental space keeps producing things we’re not smart enough to predict. It very specifically doesn’t stand to reason, at least not all the time. But I definitely agree there’s no reason for it to centre the human; perhaps I just think that even Land is centring us a bit too much, and that even these terms are still too anthropomorphic.

