🧵 Thread (27 tweets)

in retrospect maybe a memetic advantage i don't talk or think much about having is that i was AGI-pilled for long enough that i know what having a totalizing narrative about the most important thing feels like from the inside and i know what it feels like to step out of it

@QiaochuYuan ❤️ https://t.co/AOCICNEQj4

this was a tweet about “people who are my type” but actually I realize it’s ex-everything. Ex-mormons, ex-smokers... it takes a certain courage to walk away from something that at one point meant a lot to you, and these people have both a lightness and a darkness about them https://t.co/QXGHLOCNtY

@visakanv oh i wrote a thread about this nice https://t.co/YpA3OOhvKY

a lot of the people i respect the most know what it's like to feel betrayed by an ideology, and what it's like to deconvert. it's humbling to know you can be full of conviction for years and then realize you were confused the whole time. you take your own beliefs less seriously

oh i have written about this but obliquely https://t.co/jjbBMIxp5C

being taken over by an ideology for the first time is a lot like being in love for the first time. you don't know what it's like for it to end. you can't imagine being obsessed by anything else. and you don't have a frame of reference for what abusive behavior looks like

so, less obliquely: i was involved with lesswrong / CFAR / the rationalists from ~2012 to ~2018, briefly worked for both MIRI and CFAR, got to talk to a lot of the higher-ups in the ecosystem, learned a lot from the experience, and have a lot of dirt on unhealthy dynamics

it turns out sincerely believing the world is going to end really fucks people up and it really fucks group dynamics up. there is a reason end-of-the-world narratives are an attractor for unhealthy cults (a phrasing i want to explicitly push over just "cults")

a lot of stuff you wouldn't do if you were calmer becomes justifiable. it becomes easy to justify exerting force on other people in various ways, and there were (and are) people in the ecosystem much better at exerting that force than others in the ecosystem could (and can) handle

the rationalist and EA ecosystems select massively for recruiting people (~nerds with poor social skills) who are confused in specific highly correlated ways about e.g. feelings, the body, etc., and these people are, to put it bluntly, very vulnerable to abuse

in retrospect when i was first exposed to these ideas (~2011) i was a tiny infant and i was not prepared to handle them in any real way. i was thinking about saving humanity when - please forgive the dramatic phrasing - i didn't even know what being human meant

one reason i've only talked about this indirectly until now is because in 2019 i wrote a 20-page google doc ranting about some version of this point of view and sent it to a bunch of rats and some of them were like "oh my god THANK YOU" and some of them got reeeeally angry

i didn't really have the social or emotional resources at the time to sustain any kind of fight, so i gave up and stopped talking about it. but i do still remember all the people who were like "oh my god THANK YOU", and while it's outdated in many ways i stand by a lot of that doc

the other reason i've only talked about this indirectly is that some of the dirt i have is confidential. but like. people talk to me about their feelings and some of those people talk to me about their feelings about other people in the ecosystem and so. now i know things

none of this, by the way, has any particular relevance to the importance of dealing with AGI as a problem. i've been actually spooked about this since alphago vs. lee sedol (before that i was kinda LARPing it), and i'm still spooked, just not devoting a lot of attention to it specifically

i was not involved with this particular group but various dynamics related to what i obliquely described here are becoming gradually more public https://t.co/WPuOJ2azSp

on being a neurodivergent mark https://t.co/8Ufd5r3rjG