metaphysiocrat:

1) Ozy Brennan of Thing of Things has mentioned the “high-intensity community” as a good concept for what we mean by “cults”: monasteries, militaries, some EA orgs, small Leninist vanguard orgs, and central cult examples like Heaven’s Gate are all high-intensity. Notably, you can coherently claim that a particular HIC serves a valuable purpose (or doesn’t), while in either case recognizing known ways it can go off the rails, and it’s relatively common for the same ideology or worldview to have both more and less demanding milieux.

2) Nitpick, but the “post” in “postrationalism” is much more like the “post” in “postmodernism” than the “ex” in “ex-Mormon.”

3) I don’t know that the pre-existence of rationalist responses to the cult objection is Bayesian evidence against rationalism being a cult (though I agree it isn’t one!), since actual cults frequently have standard responses to why they aren’t cults.

4) Really, I think a lot of this, zooming out from cults and rationalism, is an instance of the more general phenomenon that good arguments against weird positions are hard to come by, since it’s so tempting for people to fall back on “that’s weird.” If you’re considering something weird, you have to be your own red team, since often no one else will!

Michael:

Cultic aspects are, to me, irrelevant, as are personality, credentials, etc. A cult can state true propositions, as can evil people. Yud's credentials are those of an autodidact with little coding skill and no sophisticated fluency in logic or mathematics, yet those gaps don't make anything he says useless. He is, in my opinion, a public intellectual of the kind we often see here and abroad: a philosopher at heart (one of my favorite professions) currently specializing in the many intriguing questions posed by AI. More power to him! However, I do have a caveat: his grasp of AI may be shallower than that of the scientists who toil out of the public spotlight on the astronomic complexities of coding LLMs, working under the hood of the engine, so to speak. Yud may be in the publicity department of the AI alignment factory, but I trust the folks down on the shop floor more.
