
Great conversation. I'm curious whether it changed your perspective on the overall question to any significant degree.

For my part, I find Robin's re-framing pretty convincing, though there is one argument I didn't hear anyone raise against it. In addition to the changes that occur through ordinary adaptation and evolution, there are, occasionally, "black holes" in value space: places where a civilization can get catastrophically stuck. Regardless of where our descendants eventually end up, one always has reason to watch out for these and steer around them. That seems to me to better capture the concern most AI doomers have. Likewise, Robin's fear of a totalitarian world government that permanently halts growth and expansion could be described as one of these black holes, so in a sense you are both worried about the same sort of thing.
