It's a variant of the concept that doesn't aim at a specific outcome the way most variants do.
You can read about them here if you scroll down https://en.wikipedia.org/wiki/Accelerationism
This isn't coming from a standard left or right set of desired outcomes, it's more of a nihilistic "okay, destroy everything then, idc"
>>0x000002
> My thumb got stung by a bee. Rather than avoid or remove the bee's nest, I'll just hit my hand with a fucking hammer
That would be an important component of the philosophy though, the people are without agency.
>>0x000003
> That would be an important component of the philosophy though, the people are without agency.
What removes their agency?
I'm not really a believer in accelerationism but it's interesting.
From that context, we might have already hit the "technological singularity" that it talks about.
For me what will either prove or disprove that is, when AI starts taking all the jobs, are we going to move towards Star Trek world or Blade Runner world?
If we can't shift gears at that point, then that's kind of the singularity.
>>0x000006
> I'm not really a believer in accelerationism but it's interesting.
I'm not smart enough to understand it. It's just intellectual babble from people who think they can predict the future. Maybe it is interesting, but it's probably a good idea not to take it 100% seriously.