No mechanism to robustly distinguish cargo cultists from actually competent people, or to provide the feedback needed to become the latter

Insufficiently tolerant of laziness, creepy people, and neoreaction

Rampant Status-seeking Behavior

Not enough risk-taking among LWers when it comes to tentatively investing in different worldviews (i.e., too afraid of appearing 'wrong' to contribute possible insights later on), perhaps creating conditions ripe for groupthink.

Focus on EY

Another philosophical issue: the Sequences are a verbose text that typically relies on metaphor to convince the reader of its correctness, which exacerbates the problem of correctly bridging such long inferential gaps. Their habit of referencing themselves makes it even worse.

Voting cliques; voting based on community reputation instead of content

Nerd culture is terrible

Some parts of the Sequences put off people who would make excellent community members because those posts show Eliezer being a jerk (as defined by most currently living humans). Links to the originals are still all over the place, and people who do try to read the Sequences in order start running into them rather fast ("Bayesian Judo" immediately comes to mind as an example). Stop filtering for jerks!

Poor at managing issues. When LW was at its peak, tolerance of neoreaction was not bad in itself, but LW was not capable of managing its image in the eyes of a hostile and non-rationalist society

Overly reverential attitude toward Yudkowsky, resulting in alienation of people interested in Rationality who didn't like his work, attitude, or writing style, or who were just put off by the 'fanclub' feel of it.

If you aren't one of the top voices, it's easy to fall through the cracks and not be noticed by anyone or bond with others on an equal level.

Inaccessibility of many of the community's social sides (like poly)

Too intolerant of people not possessing the same ideas or intelligence.

We need something to collectively fight for; without this we're bound to break off into a continual diaspora.

Generally a culture of seeing who could use the most LessWrong-y jargon to talk about obvious things. N.b. a comment on the Facebook page where someone once wrote something like "you seem to believe this thing while I believe something else, and since we're both rational agents, by Aumann's agreement theorem there must be some information you have that I don't, so what would that be?" They could have just asked "Why?"

Too much hostility/arrogance.

The AI charity and rationality workshops seem like expensive scams

Perceived as a cult

Too insular

Hard to find most interesting posts.

A bit too collectively self-satisfied, but that might just be the cost of enough solidarity to support friendly conversation.

Too intolerant of people who are in between LessWrong and the mainstream

Can't get anything done.

Perhaps not cult-like enough, in retrospect.

Unapproachable for most human beings

Jargon-ridden

Too self-important. Tendency to make up ridiculous proper nouns where other terms already exist & pretend new concepts have been created.

Not enough empirical testing of ideas.

Too intolerant of everything else.