Eliezer typical-minding the world (see the April Fools post) | EY not enough instrumental ideas | TMT None of the above are actually bad. | N I haven't noticed a serious problem with any of these. | N Not using terms already defined by other fields of research (reinventing the wheel) | R Hard to explain briefly, but I think the understanding that is lacking is pretty well summed up in the quote "There are more things in heaven and earth, Horatio, Than are dreamt of in your philosophy." | NI Seemed unwelcoming to outsiders. | U Lack of a big blinking sign saying "start here, stupid" | DKWS Sequences are too long for someone starting out | DKWS Too much handwaving, not enough rigour | LR I never noticed any big problems | N Too much eschewing of work done by experts not in LessWrong | NS too cargo-cult | CC complete ignorance and rejection of established philosophy of ethics and epistemology (but not decision theory) | NS None of the above. | N Intense political association with US-style libertarianism | BP Antifeminism, etc. | NPE Focus on interpretation of quantum mechanics | NS Not enough emphasis on lifestyle and application. | TMT Much apparent philosophical convergence was due to accepting positions which seem natural to those with the particular cognitive style common to Less Wrongers, not because the arguments for those positions were good. | TM too much invention of terms that already have established equivalents in some other discipline, so that people less familiar with LW have to find the LW-version of terms they already know | R Over-eagerness to draw practical conclusions from theoretical arguments. | TMT Bayesianism as a dogma and unifying paradigm, when it isn't even computable over interesting, i.e. unbounded, domains. | LR EY lacking understanding of normies | EY Inappropriate attempts to apply "meta" and math | LR Too much of a personality cult | PC Most people in LessWrong weren't/aren't very good at presenting their thoughts in an effective manner socially speaking - being able to get others to listen. | SI Too excited to jump down the throat of anyone who said something that could be seen as disagreeing with the sequences | C elitism | NPE UTILITARIANISM? REALLY? MOSQUITO NETS? | SP The core information about key claims was (and remains) too spread out over the sequences. We desperately need an 80/20'd version of AI-to-Zombies. | DKWS Overreliance on nerd culture | BA Too self-centered - for example, making up in-group jargon for something that already has a technical term | R LessWrong failed to acknowledge it reinvented the wheel, in what I can only consider a willful ignorance of the history of philosophy. And then it went off the deep end by writing Harry Potter fanfiction. | R People aren't comfortable with more casual interaction. | TS Too much focus on a specific sort of overly abstract analysis of AI, which ignores practical issues that make AI safety both more difficult and less important. | LR Eliezer's overconfidence about Many Worlds, and overconfidence in general | A Personality cult of Yudkowsky | PC Neoreactionary infiltration | BP Lack of norms around what was useful discussion | BN Vague thinking on many-worlds (way better than Copenhagen of course, but information-theoretical still wins out imho) | LR Too dismissive of correction and criticism beyond the community's orthodoxy | DAC overly reverential attitude to Yudkowsky, resulting in difficulty in rationally assessing the quality of his work and arguments. | PC Too many threads veering into fringe politics. 
| BP Way too big for its britches; rampant overconfidence and self-absorption despite not being especially clever or insightful. Very poor engagement with existing literature. What was good wasn't new, what was new wasn't good. | A Narrow-minded worldview, acceptance of one viewpoint without knowledge of alternatives; blanket disregard of most of philosophy | NS It's considered to have "peaked"? | N The "point" of LW seems to be "bullshitting about rationality-adjacent topics", and it accomplishes that pretty well. | N Projections of confidence/arrogance that attracted more people who projected confidence/arrogance. The original confidence/arrogance may or may not have been warranted, but subsequent levels of it often weren't. | A Too much not-invented-here syndrome | R Some members tend not to be very rational in their commenting, actively outgrouping members. | SI Failure to actually address foundational issues when they were brought up | DAC Thinking that if you've read the sequences then you're more rational than other people, and so trusting your instinct more and listening to others less, ironically making you less rational. | A No opinion | S Thought that we were smarter than everybody else, thus: neglecting the Virtue of Scholarship, dismissing dissenting thought too quickly | A Usual drift into a social club | BN Too much arrogance on Eliezer's part, not enough thinking outside his box / meta-thinking about what the community might be missing by largely being formed of INTJ, quantifier, etc. types. | NI Obsession with cognitive biases and "rationality is about System 2 overriding System 1". | NI It wasn't the pure crystallized epiphany-juice that the Sequences were. | MB Failure to encourage/generate high-quality content outside of the core authors | MB Taking EY seriously | EY Lack of focus on AI | M Failure to understand the motivation of non-LW people; the straw-man often applied to cultural/philosophical/religious communities outside LW. | SI Reinventing the wheel -- a lot of Less Wrong concepts were already concepts in academia outside Less Wrong | R too harsh for newcomers who were legitimately trying to learn; those asking 'valid' questions got burned and downvoted | U I think there was a little too much focus on EY the person/author rather than the arguments; many of his arguments were good, but even his poor arguments were privileged a little too much | PC Human minds aren't capable of the analysis necessary to make the sort of broad claims that LW (or any ideological group) does, and so the group ends up doing a lot of ideological shortcuts that look like the shortcuts of any political group. | IM Too self-reinforcing | C Intimidating, hard to fit in | U Antipathy towards domain experts | RE Split in focus of the community between practical life improvement and philosophizing about Future AI Gods. | M Hubris | A failure to understand that Not Taking Ideas Seriously has societal value in general/for many people | SI Rationalism doesn't work. You won't get better results in life by applying this method. | IM Not enough skepticism for the standard Progressive cultural beliefs. | TP Conflation of epistemic rationality with instrumental rationality. | LR Sequences were too tightly intertwined | U "Worship" of effective gods in the community (i.e. 
Eliezer), which could have been circumvented by full anonymity | PC Focus on AI without any domain expertise | RE too much emphasis on what essentially amounted to "self-help" | M Bad philosophy and history of science | NS Cesspool comment sections - they were a weird mix of insightful, logically-sound discussion and useless, didn't-read-the-article ranting. | MB Epiphany Addiction Shots | M Extreme Self-Congratulatory Tone. EXTREME. UGH. Sorry. Wait, is this a "Philosophical Issue"? Seems to fit the category about as much as "Too Much Jargon." | A Too often overconfidently incorrect. | A Too much reliance on Yudkowsky's sequences. There was a void left when he stopped writing. | SCA Many Worlds as a litmus test for rationality. | SP Subtle stuff I don't feel like writing an issue about right now. | S False dichotomy between rational discussion and personally liking or being friends with someone | SI Too focused on one person | PC Logical chains getting too long. | LR Weird treatment of feminism/women's stuff kept me from taking it too seriously | NPE Not enough people contributing relative to consuming - needed to be more of a place of purpose, and without that most of the people who actually make things went elsewhere | M Too focused on expanding community | M Made it hard for good content to be generated and iterated on because people were reluctant to post | U Excessively shallow, S-type thought processes. | NI If someone isn't practicing, then they're an armchair scholar, and they don't matter. Most LessWrongers are armchair scholars (I think) | TMT Not enough focus on actual interpersonal social lifestyle issues | M Focus on Yudkowsky and his pet interests instead of rationality in general. | EY Had no problems at its peak | N Failure to incorporate more topics for conversation, such as Literature, the History of Philosophy, and metaphysics. | NS No problems at peak. | N crappy web technology | BTP Came across as too insular | U It's a sort of strange place. Like the Louvre, it is a palace of wonders, but accessible only to those with a degree in Fine Arts, and where all the staff are carrying machine-guns. | E Seems to worship intelligence. | A General dismissal of expertise as a valid thing by certain community leaders | RE Too closely associated with/uncritical of Eliezer Yudkowsky | PC I am unaware of any problems | N People trying to drive out minority viewpoints | DAC cultish | C Lack of results, best summarized by Yvain's "Extreme Rationality: It's Not That Great" post. | TMT Ignorant of / divorced from mainstream philosophy | NS Contrarian cluster | TC Too confident that LessWrongers understood things better than outsiders | A repugnant | S Insistence on utilitarianism as a simple mathematical calculation. | SP Hubris | A Mostly seemed to be a collection of smug jerks engaging in mental masturbation about how smart they were compared to others, rather than providing useful or interesting insights. | A I feel that the decline started when "all the easy stuff had been done". The simple fact of the matter is that the sequences cover pretty much all of the interesting and important problems, and the vast bulk of recent posts have just been philosophical masturbation. | SCA Not enough focus on intentionally building skills. | TMT Too weird for most people. | BA Too many Epiphany Addiction baits | M Not practical enough | TMT It is turning into a cult | LC Too many of the arguments were cargo-cultish and not very good. 
| CC Focus on cryonics/AI is ok, but wasn't organized into subsections enough to sort them. | II Too scattershot, not enough cumulative progress on developing useful thinking skills. | M Not enough iteration on jargon. | NEJ Insufficient effort put into establishing a canon beyond the sequences/a condensed form of a philosophical/decision-theoretic theory of everything; bad formatting/layout: insightful comments take too much effort to find, insightful posts take too much effort to find | II Too distracted by self-help and identity. Abstract junk was always LessWrong's strength. | NET No complaints really. LW at its peak was pretty great. | N Overconfidence in specific viewpoints of scientific understanding, specifically QM. | A nitpick culture | TC I don't think anyone rightly criticises the idea of mind uploading. | LR Insufficient discounting of long inferential chains, and even considering this to be a virtue | LR Not so effective EA | SP A lack of focus on cultural questions | M Too much nitty in-fighting and navel-gazing as opposed to action. | TMT Calling Less Wrong a cult of personality around Eliezer Yudkowsky is a defensible position. | PC Focus on effective altruism. While I agree with the premises and am going to participate, I'm not really interested in reading posts on it. | M I believe LW focused too much on theory and long-term applications. There was a great deal of time spent on fantasies of beisutsukai, but very little advice for what we can do to improve right now. We need concrete steps now, otherwise we will not attain the hypothesized far-future awesomeness. | TMT Too accepting of bullshit feelings and random people not feeling 'welcome' in the community. | TP Added to my neuroticism about self-value | DMM "I am a member of this community, therefore my thinking is lightyears above everyone else, and whatever comes out of my mouth is superior to actual research." | A Although I think these are the "biggest" philosophical problems, I don't consider them particularly large. They're the least small of our small problems. | N I don't really think anything was wrong with it | N People on Less Wrong argue about a lot of stuff. Sometimes I think a "LessWronger" is actually someone who criticizes LessWrong all the time. LessWrong is fine! It doesn't really have problems other than the tendency to attach problems to itself. | N Focus on Eliezer's fundraising | EY Disorganized giant web of links and dependencies made it hard for newcomers | II Uncritical hero worship by too many new members | PC Needs more instrumental-minded actions and posts | TMT It's scary and abrasive and has culty vibes, and I can't link other people to it | C People underestimated the difficulty of writing good blog posts, even for a community of like-minded people. Just because you know a lot about AI/psych/stats/CS doesn't mean you can write a good blog post, esp. in a reasonable amount of time. | SCA Not sure what its "peak" was. Obviously, it peaked at some point, but I'm not sure when it was. | S Cult-like behaviour | C Not enough focus on practical, non-extreme things we can do to help the world | TMT Epistemic closure | E Mistrust in academia | RE Eliezer does not know shit about quantum mechanics while pretending he does | EY Not enough of the nameless virtue | BN LW's core concerns around decision theory and AI were ones that only a few people could contribute to meaningfully. The rest of us could only lurk and nod along in vigorous agreement at our computer screens. 
The bits that interested everybody - the biases, rationality, and self-improvement material - were great, but not as focused. We learned a lot, but those issues are not things that will keep a community together forever. People will move on to more specific projects, which is exactly what happened. | SCA The metaethics sequence was too unfocused and inconclusive. | O Too confident that they know better than outsiders | A Mild bias against Blue Tribe people with preference for Grey Tribe. | BP Elitism (dismissing/disvaluing time spent with, or wisdom learned from, people who weren't 120+ IQ protogeniuses). | NPE Poor scholarship | NS Obsession with the Bayesian view of statistics to the exclusion of other alternatives. | NS There didn't seem to be an in-between place between those most invested in the community and those who were still in progress with the sequences. The community on the site felt inaccessible. | U (Slightly) reinventing the wheel | R pretentiousness | BA People not being able to SELF-apply the discourse ideals and then attempting to talk about topics above our sanity waterline, like politics and sex and sexual politics. | BP You need to link Facebook/Reddit more. Alternatively, I would create one board like Hacker News which has all the good stuff every day. Your main issue is discoverability. | II Sequences really need to be redone with A/B testing and maybe YouTube videos. Exercises and questions would be good. | SFI Tackling hard problems in a public forum? That sounds pretty doomed to fail. | IM This is like the second time I've ever been to the site, so I don't know. | S Dismissal of related work done in relevant mainstream fields, including philosophy and computer science | NS Coalesced around autodidacts with little formal training in math, which leads to a lot of verbiage and not a lot of rigor, e.g. Eliezer's comments about whether randomness is ever necessary for an algorithm. | LR A tendency to hyperfocus on the ridiculousness of Roko's Basilisk stopped most people (inside and outside of LessWrong) from thinking about more-plausible acausal-trade-based ideas. | TMRB To be fair, I wasn't around for the peak; however, I'd cite the aforementioned Basilisk. It suddenly presents a sort of 'Rational Devil' to a superintelligent AI's 'Rational Messiah'. It's silly and mildly off-putting. Thankfully the information on the site, especially the Sequences, is far too useful to just toss aside. | NERB Normalizing behaviors that cement the community's low status from the perspective of the rest of society. This is a self-reinforcing participation filter. | SI Insufficient recruitment of academics | RE Widespread usage of some bullshit elements of the right-wing political culture like "Maslow's hierarchy of needs", "Overton windows", Chesterton quotes, and the worship of awful people like Eric S. Raymond and Robin Hanson | NPE Too little awareness of existing work/broader context; "NIH syndrome" and LW exceptionalism | NS Don't know when it was best. | S I got the sense that certain pet explanatory frameworks (e.g. Bayesian probability) motivated the substance of, and positions taken by, the core LW teachings, and not the other way around; meanwhile invocations of other frameworks (e.g. computational complexity theory) were absent or lacking. This is probably mostly a matter of knowledge gaps, though. 
| NS Very broad but shallow knowledge - LessWrong writers (and commenters) enjoy trotting out complex features of philosophy and science and citing poorly explained summaries so as to prove their points by hand-waving | LR The linking/interconnectedness was overwhelming going in for the first time, almost to the level of TVTropes. | II I cannot say, I was not aware of this site back then. | S Too much ingroup signaling | SI Loss of purpose/drive | M Too much frequentism bashing from people who didn't know shit about what it is | CC Sometimes people who weren't Eliezer or Scott would make posts | SCA Too much talk. Not enough action. / Too much focus on epistemic rationality and navel-gazing, not enough practical advice and communities. | TMT Besides creating some (probably very fun) communities in nerd-dense areas in the US, LessWrong doesn't seem to have done enough to actually help people be more productive; as a matter of fact, it acts(ed) as a productivity sink for people who feel like it helps to read it, but who are just spending time with little benefit. | TMT Lack of intellectual diversity. | NS talks a good game but never delivers | TMT Not willing to treat certain subjects (e.g. religion) as a source of potentially-useful tools. | NS Too little regard for established philosophy and too many poor attempts to re-invent existing philosophical ideas. | NS The longer the inferential gap being bridged, the higher the probability that one of the steps will be wrong, and the conclusion can't be trusted after that. | LR endorsement of eugenics | NPE No one has figured out the right way to do internet discourse yet. Less Wrong was better than most in that respect, but still had all the usual problems of organization and coherent presentation for anyone not present through the initial development. | II Perceived valorization of System 2 thinking over System 1 | NI The medium doesn't support or encourage deep exploration of a topic. | II Racism | NPE The sequences were not, and still haven't been, condensed into something less daunting for newcomers. | DKWS overreliance on work done inside the community; reinventions of the wheel are very common | R If I stop a sequence, it's hard to pick back up again; and I'd really like a group of people to go through them with to discuss, but they're all already pretty discussed. | ISS Not good enough at making sure all its interesting decision theory ideas were on solid ground | LR A lot of defensiveness induced by people being cruel to them on the internet. | SI 'What is lesswrong.com?' was difficult to answer. | DKWS Focus on Many Worlds Interpretation. Regardless of its merits, the reasons for its inclusion were insufficiently explained (I only learned recently why Yudkowsky considered it to fit in the sequences, and am skeptical of the reasoning.) | QMS When others went to partially fill the void, all the low-hanging fruit was exhausted. | SCA Too intimidating, and legitimately hard to compose ideas into a good post | U Neglect of prior work | NS Too focused on math | NI LW-style rationality seems to behave badly in minds not already somewhat inclined that way. (Individuals with IQ below the median immediately misuse elements of Rationality when introduced to them. Given that half the human race is below median IQ, this is an important flaw.) | IM Heavy focus on Bayes' theorem; I know people who have said in person it was overbearing. 
| M In relation to 'Criticism of the Scientific Method': not criticizing it specifically, but not understanding its proper place in the toolbox and throwing the baby out with the bathwater, as it were | O Engagement with cognitive science seems to be nonexistent, and amounts to brains/subjective experience "is just computers bro". | NS Strong libertarian and computational bias. The problem is not in having a libertarian ideology or a CS background. The problem is that many participants have a hard time even understanding a problem or a solution that sounds divergent from libertarian dogma or the assumptions of current computational problem solving | NS The community struck me as being massively overconfident in subject matters not well understood by most members. | A Too much focus on quantum mechanics | QMS contains pissing contests and mean people (is not unique in this, but still) | SI Too much jargon that consisted of weird Yudkowsky misinterpretations of real concepts | NS Tendency to reinvent the wheel, but call it the "toroid of static friction" | R seriously though, > 2048-32 >implying unitary evolution as he means it is *real* BECAUSE I SAID SO | E Not enough scholarship-virtue and not enough organization of the scholarship that was done | NS Worship of Eliezer Yudkowsky | PC People not being able to OTHER-apply the discourse ideals and insufficiently upvoting playfully/hastily written content that was on the right track. | TS Unresolved tension between epistemic and instrumental rationality | LR The tendency to take itself too seriously (e.g. Roko's Basilisk). LW's cross to bear is the purported saving of all of humanity. Relatedly, the tendency to invent proper nouns where perfectly good terms already exist & pretend that new and all-important concepts have been created. | R If the site is past its peak, people probably expected too much, or a wide array of different things, from it. This is natural. | O