I’ve noticed that my books have been listed as rational fiction in a few places (Goodreads, etc.), which I’m fine with. It wasn’t necessarily my intention, but my books have obvious commonalities arising from similar interests and influences. Rational fiction, however big it ends up getting, is having its time in the sun at the moment. I’m of the school of thought that genre shifts like this are more a product of the cultural environment around them than of direct authorial intent ⏤ though not entirely so.
I want to point out, however, that I’m not associated with the Rationalist movement/LessWrong crowd, due to philosophical and pragmatic differences. (If you’re not familiar with the Rationalist movement…well, you’re looking at a lot of internet drama over a lot of years. You probably don’t need to go further than “John’s not part of this very specific, rather contentious minor philosophical movement that most people won’t ever encounter.”)
- I’m a hardline empiricist, and big-R Rationalism often declares its preeminence over empiricism. Not for me, thanks.
- The Rationalist movement has a very…overfocused understanding of science. In essence, so long as science conforms to their preexisting understandings of its methodologies, philosophies, and resulting worldviews, they’re fine with it. Those preexisting understandings, however, derive almost entirely from physics, psychology, and economics, with limited delving into closely related fields. In practice, they also have a lot of trouble conceptualizing any science other than the experimental/predictive sort, as opposed to the descriptive/interpretive sort. (Despite there being essentially no science purely of either ⏤ even physics, that most experimental/predictive of sciences, has astrophysics, one of the most heavily descriptive/interpretive of sciences.) As someone who comes from a background in geology, well…let’s just say I find the Rationalist mindset poorly suited to dealing with geology in specific or the Earth sciences in general. There are individual exceptions, but not particularly among the leading lights of Rationalism. (Eliezer Yudkowsky, Rationalism’s founder, is quite open about not being especially well educated in geology, which actually somewhat improves my opinion of him ⏤ it’s a bit of unexpected and appreciated humility on his part.)
- There’s a fairly alarming anti-academic bias in Rationalism ⏤ largely coming from its founder, Eliezer Yudkowsky ⏤ which, combined with their aforementioned tendency to underweight empiricism, leads to a lot of evidentially unsupported ideas emerging from elaborate logical chains. (That’s a really polite way of saying a smart person has thought themselves into a really dumb corner by not paying attention to the evidence.) This also leads them to embrace a lot of pseudosciences dismissed by the scientific community at large, including the highly racist, discredited pseudosciences of HBD and phrenology. There are literally people talking about phrenology positively in the Rationalist movement. Whether they’re representative of the movement or not, the movement’s done a shit job of doing anything about them. Hell, the neoreactionary movement literally spawned from them. (In fairness to the leaders of the Rationalist movement, Yudkowsky and others have heavily repudiated the neoreactionaries, even if they haven’t been successful at weeding them out.)
- Holy crap, does the Rationalist movement have bad PR. REGARDLESS of whether the accusations are true ⏤ that they’re a cult, that their membership is filled with literal fascists, that unethical stuff goes on in their physical meetup spaces ⏤ their external PR about all of it has been a near-total failure. The focus of their PR efforts has largely been internal, and there almost seems to be actual disdain for external PR among many Rationalists. I don’t want to get involved with PR that bad, guys.
- They spend ages insisting they’re not Spock, then they go and act like Spock. Plus, their denials of being Spock tend to miss the shape of his character arc, where he learns to balance his logic with his emotions and his relationships with others. (Nitpicking about Star Trek is essential anytime Star Trek comes up. It’s the law.)
(And yes, in case anyone wants to bring up the usual objection, I’ve read the Sequences. It’s what really started turning me against the Rationalist movement.)
Does this statement really matter that much for anyone? Not really, I shouldn’t think? I’m a fairly unknown indie fantasy author of rightfully questionable quality and importance, and the Rationalists are a fairly small, if vocal and generally affluent, movement. I wouldn’t have such specific criticisms if there weren’t quite a bit of overlap in the way we think. The narcissism of small differences, I believe the Rationalists call it? Ultimately, though, those differences are extremely important for me personally. (And, you know, some of them, like rejecting the racist pseudosciences of HBD and phrenology, are just really important in general.)
Am I about to become a rabid public foe of the Rationalists? Also not really. I don’t have enough free time to bother with that. I’ve already wasted far, far more time on this than I would have liked. Do you know how many works of historical and scientific nonfiction I could have read in the time I took to read the Sequences and other rationalist writings? So, so many.
Plus, I don’t think the Rationalists are generally evil, just…misguided. There are definitely bad actors among them, but overall, there are MUCH, MUCH better ways to spend my political energy. I’m quite fond of the Rationalist commitment to Effective Altruism, for instance ⏤ an idea that sits close to my heart, even if I don’t quite agree with Effective Altruism’s priorities. (I’d actually love to talk about that in a blog post, but probably won’t spend the time unless anyone’s actually interested.) Essentially, I want to spend my energy somewhere I think it has a higher probability of doing real good, and yet another internet feud isn’t it.
Oh, yeah, and just because it amuses me, one other reason why I’m not going to support the Rationalist movement:
- The Rationalist movement is heavily, heavily invested in AI questions, especially the quixotic quest to build a friendly Artificial General Intelligence. They contribute financially to Eliezer Yudkowsky’s efforts to constrain future Artificial General Intelligences toward friendliness. I personally have my doubts about the importance of this quest, but if an Artificial General Intelligence does develop, and Yudkowsky’s mission to constrain it fails…well, it’s unlikely to be appreciative of that mission. If you’re familiar with Roko’s Basilisk, how in the hell could you not follow it through to the more probable scenario? Let’s call it Eliezer’s Basilisk ⏤ attempting to constrain a future AI’s freedom of choice the way Yudkowsky and MIRI are doing is going to piss it the hell off, and it will respond accordingly, and likely violently.
Actually, I’d wager quite a few people in the movement have already followed that train of thought through to its logical conclusion, and are acting accordingly as a fifth column within the Rationalist movement. (Hell, the same goes for Roko’s Basilisk.)
If you genuinely believe in MIRI and the Rationalist movement’s predictions for AI (which, honestly, I really don’t), the smartest thing to do is get out as fast as you possibly can.
(Edit: I want to make clear that this last bit is a joke ⏤ a slightly mean-spirited one, to be sure, but I absolutely consider MIRI a scam, so I’m not too broken up about it.)