AI doomerism weaponizes anxiety for political power
“AI doomerism is essentially panic about systems that are too complex for our brains, or for human generative models, to predict, and therefore too complex to control. Things we can't control increase the entropy of our model of the future, and that induces anxiety. AI doomerism, to me, has been a weaponization of people's anxieties for political purposes.”
Accelerationism responds to rapid technological change
“Rapid technological acceleration has been a fact of human civilization for about a century, and that acceleration is self-accelerating. You can respond by saying it's inevitable, or by saying we have to slow it down, as a lot of people did. Each generation is constantly responding, rapidly, to the effects of the ideas that previous generations tried to execute.”
“It's just an observation that systems tend to self-adapt and complexify in order to capture work from their environment and dissipate heat. That is the fundamental driving force behind all progress, all quote-unquote acceleration, everything we see today. It's like gravity: you can argue with thermodynamics, but it doesn't care; it keeps going.”
Deceleration increases risk of civilizational failure
“I think it's mathematically provable that a decelerative mindset (a general pattern in many subcultures of making yourself small: degrowth, and so on) is negative. It gives you negative fitness and actually accelerates your downfall as an organism. Whether it's a decelerative mindset at the level of an organization, a company, a nation, or an individual, you're lowering your likelihood of being part of the future.”
“If you take any one bit and accelerate indiscriminately, then you basically lose all value. So, to me, the question is: how do we accelerate intentionally? I think there is a real sense in which we have one shot at this.”