What philosophy? Looking at threads now is depressing as shit. It's a ghost town.
not enough instrumental ideas
No focus at all
no major philosophical issues atm
Ethics nonsense
Assumption that Effective Altruism is correct / treating EA as a core LW topic like AI or cryonics.
Too focused on concrete junk.
too little new work on logic and decision theory
like before, need to understand "There are more things in heaven and earth, Horatio, than are dreamt of in your philosophy."
Feels like all the big names have moved on to other things
no focus - I don't even have an idea what value I am supposed to get from the site
Lack of new material
Not focused enough on abstract junk
Many previously prolific contributors now publish and discuss on other platforms instead.
Hostility toward non-100% converts.
I am not involved enough to know
Too political (not literally political parties, but political movements like 3rd wave feminism, BLM)
Overreaction to trolls or critics
Insufficiently Focused
Eschewing work done by experts outside of LessWrong and LessWrong-friendly groups
Core subjects are either explored and 'done' (Sequences etc) and reduced to news-only posts, or are too technical for most readers to participate in current discussions (math, AI). Other subjects don't give a strong appearance of being on topic for LW. If the site is subject-oriented, it may make sense for it to have few new posts; this conflicts with the desire to also make it community-oriented.
it's ded
A lack of diversity of thought.
There is still philosophy on Less Wrong??
same as earlier -- have to find the LW version of terms I'm already used to using
No acceptance that the peak has passed; inability to create clear paths to new, suitable places.
No real change that I can tell, but I don't frequent LW much now.
Too dismissive of mainstream philosophy
MIRI-style worry about AI is really dumb and crazy even though computers causing harm is entirely worth worrying about.
Criticism of specific aspects of science from people who don't know what they're talking about, and really should realize they don't know what they're talking about.
Quality content presented in a less compelling manner
Being mostly dead
elitism
Still too much nerd culture
This survey is being manipulated by reddit.com/r/sneerclub
LessWrong doesn't work on advancing the state of the art in artificial intelligence. The Singularity Institute would have been a better thing to continue.
It's dead.
I don't think there are any philosophical issues.
My points have not changed much since LW's heyday
Focus on effective altruism.
Too much anti-philosophy bias.
Lack of original articles that are both interesting and comprehensible to non-mathematicians.
Willful(?) ignorance of other parts of cognitive science and the philosophy surrounding it, dismissiveness of embodiment
Too little criticism of the Scientific Method (and too little "arguments of the other side" in most articles)
There's really only one narrative style, and if that gets old or you don't like it...
Ignorance of life outside elite professions like programming, academia, etc.
Insufficient resources/focus on translating knowledge of rationality to practice of rationality
Excessive contrarian rejection of the sequences, often on spurious grounds. E.g., no longer encouraging newbies to read the Sequences
Not focused enough on Doing Good
Naive contrarian beliefs, despite only superficial understanding of the mainstream.
no opinion
Talks about all this epistemology and model-building stuff, and "the map is not the territory", but instead of encouraging people to be careful when building models and to hold all their beliefs lightly, and instead of presenting a number of alternative perspectives to show that many models can seem sensible, it doggedly promotes a single ideology. A website about epistemology or rationality should not do that.
I just read slatestarcodex instead.
Abandoned
too willing to completely disregard expert opinion because you looked into a topic for 5 minutes and have rationality superpowers
too serious/pretentious
Reinventing the wheel -- Many LW concepts treated as novel already have robust bodies of work outside LW
I'm not around Less Wrong right now.
Too many posts that look/feel crank-ish, like the regular flowcharts of multiple possible causes of WORLD DOOM.
All the interesting people left.
In theory, it's fine. Like 99% of your practitioners just engage in the same lazy thinking that any other political group does. But if everyone rigorously followed the Sequences, most people would never say anything.
Epistemic arrogance
Intimidating; it was hard to fit in
lack of respect for authorities in other fields
Not much ongoing interaction or novel ideas
Lack of original work
Rationalism still doesn't work.
All content is hidden away
Epistemology corrupted by Progressivist groupthink.
Swallowed whole by Effective Altruism movement
Big names/contributors have stopped posting.
Focus on Effective Altruism
natural aging of community
Little activity
Ouroboros-esque decay as the low rate of new content causes the community to turn inwards on itself and detach from the rest of the world.
Too much tolerance of people who are deluded about their gender
He's dead, Jim.
misuse/overuse of jargon - everything you oppose is a fallacy
Unhelpfully high barriers to entry; false dichotomy between rational discussion and personal relationships.
Too much of the site is focused on the controversial figure that is Eliezer Yudkowsky. It's a bit of a cult of personality.
Focus on AI X-risk
same as above
What philosophy?
LessWrong has an interesting reputation on the internet, partly because of Yudkowsky, partly because of the Basilisk, and partly because of the slander of others. While I agree with most if not all of the stuff posted on LessWrong and affiliated websites, many people still seem to be put off by its appearance, the small, likeminded community, and its reliance on the works of Yudkowsky. Add the Basilisk fiasco into that, and people get a picture of LessWrong that is not what it should be.
Lack of a good intro - "From AI to Zombies" is a step in the right direction, but not enough
Lack of leadership and vision. The community's brainpower and ethic should be turned to new, though related, horizons.
Focus on AI without actually knowing much about how to build one; focus on speculation about black-box models with unrealistic assumptions
crappy web technology
Too much brain power required
Too smug
Lack of new content
failure to move on and develop as a community with real impact
It's dead
Deadness
Ditto answer to previous question, except now it is empty of people.
Dunning-Kruger overestimation of expertise by respected community leaders
Poor quality of submissions
I don't know because I got bored and didn't read anything interesting for ages and stopped reading. There didn't seem to be a lot of sense of community amongst people who didn't already know each other from elsewhere, and most of the interesting posts were elsewhere.
I do not participate sufficiently to be aware of any problems.
Too many people would rather have it die than evolve
Not enough exploration of new ideas
dismissal of other philosophical positions without argument or evidence. For example, the reality of abstract objects or the objective existence of beauty. Maybe this is covered in the Sequences, but I haven't read them.
Nobody there
Contrarian cluster
Seems to be a bunch of people sitting around smug about their superiority without backing up said superiority meaningfully
People want the fast and furious pace to continue, but there simply isn't enough left undescribed in the topic space.
No direction for content (e.g. which topics to explore?)
Little or no remarkable new content
A refusal to take certain things for granted, while pretty much doing that for other things. It's widely accepted in the Rationalist sphere that transhumanism or atheism is roughly correct, but accepting that, say, fascism isn't correct is apparently a bridge too far.
Not enough focus on acquiring useful thinking skills
Ignorance of other discourses (anthropology, critical theory, Continental philosophy, etc.) -> reinventing the wheel, falling unwittingly into known problems
Too distracted by self-help and identity
Assumption that effective altruism is a/the most valuable enterprise.
all the good people left
abuse of "rationality" to justify personal perspectives
Hard to join in as a new member. Too much effort
Engagement with cognitive science seems to be nonexistent, and amounts to brains/subjective experience "is just computers bro".
difficulty distinguishing opinions expressed in ignorance of what has already been explored on LW from opinions that have earnestly considered previous discussion and nevertheless concluded that the consensuses LW has converged on are misguided.
Too much woo in rational wrapping, and a very high "Dunning-Kruger rating", aka "I thought very hard about it and in my head it makes sense, so therefore it must be rational"
I didn't realize LessWrong was still a thing? I thought it was dead.
Focus on lifehack-ish applause-light "rationality"
Easy pickings are done (i.e., the Sequences and top posts), so it's hard for people to come up with new interesting stuff to post.
Reading mostly-unhelpful stuff about productivity was the opposite of productive
Although I think these are the "biggest" community problems, I don't consider them particularly large. They're the least small of our small problems.
Not enough focus on the fundamentals of rationality.
It's become a touch stagnant.
Lack of posts
People on Less Wrong argue about a lot of stuff. Sometimes I think a "LessWronger" is actually someone who criticizes LessWrong all the time. LessWrong is fine! It doesn't really have problems other than the tendency to attach problems to itself.
Focus on Eliezer's fundraising
Stagnation
Long since vanished up its own ass - all the interesting stuff is happening on Tumblr these days.
Seems there is a core body of ideas that has been posted. The rest is commentary, and is suffering from being at the same place as the core.
Too focused on specific details and meta discussions
Same; needs more posts on instrumental rationality
Lack of distinct subsections for distinct types of topics
Too much focus on personal experiences, lifehacks, and progressive/inclusive language. The focus has become the community and the users instead of the ideas and philosophies.
Not enough interesting content - too much "effective altruism" related.
Inadvisable disabling of evolutionarily adaptive cognitive safety features
Lack of purpose / No clear projects
The members of LessWrong confuse that which comes after with that which comes before. A man came after, so they place him before and call him "Eliezer Yudkowsky." Words come after, so they place them before and call them "the Sequences."
Don't currently follow lesswrong
Not enough focus on practical ways to help the world
No idea what it's like now
Dying (I have no solution)
The community has run its course, as most communities do. :)
I don't actually visit LW; I just came in when I first discovered it, read all the interesting posts by Eliezer, Luke, and Scott Alexander, and left
Too unfocused
"
No coherence, no approachability
I only found this site today.
Not enough instrumental rationality/ the actual work of overcoming bias
It's dead.
Not enough new content
Focus on moral philosophy
Elitism, still.
Not enough good content, too hard to find good old stuff
Content is preachy and speculative
pretentiousness
Too much self-help woo. Calling it instrumental rationality doesn't improve it.
The world in 2016 does not have a philosophic NEED for LessWrong in the way the world of 2008 did.
Insufficiently focused on abstract junk
Too homogeneous (not enough viewpoints from people outside of software engineering/philosophy/math)
For the above: I'm not sure if the reliance on jargon has truly declined, or if I'm personally more familiar with the jargon.
Sequences are densely written and often obliquely targeted. An LW2.0 should seek to update them for clarity and accessibility.
Too much focus on nonprofit giving / Optimum employment / etc.
I get the impression that some people use computer science jargon purely for the sake of it, which may alienate some potential members.
Too focused on effective altruism
So new here that I'm not sure.
Too isolated from the main body of cognitive science
There is too high an implied social barrier to creating/pointing to content on LW.
From SSC, I get the impression there's too much social justice stuff there now
A lack of cohesive principles on the website itself.
Don't know since I don't read LW proper anymore
References to other pages that refer to others that refer....
People too worried about what the proper use of the site is.
Abstract Junk gets too much effort
I hear you're "embalmed" now, that seems like a pretty significant barrier to participation.
Self-important. Makes up proper nouns.
Been here too short a time to write anything useful.
cult of personality - "EY wrote" is a synonym for Truth.
Focus on cryonics and encouragement of polyamory don't bother me, personally, but if the goal is to appeal to and attract more people to the community, maybe that needs to be toned down a bit. Those are the more cultish-seeming aspects of LessWrong that would scare away the uninitiated. After they've earned their possibly-self-aware Rationalist talking hat, sure, bring on the brainwashing. It's not like they'll be able to turn off that new voice in their head anyway. But to get them to that point, it might help to emphasize (like in top-level posts) that there is no orthodoxy on those issues and on things like Torture vs. Dust Specks. I love learning from Eliezer, but when his positions aren't accepted by the community as uncontroversial, it might help (or look better, at least) to outline both sides of the debate, in a separate post and not down in comments. LessWrong makes an impression of being like this snowball of ideas: they were separate issues initially, but have been packed together and rolled around into some sort of philosophy. Newcomers might feel they have to either accept or reject it as a whole. What if they are worried about UFAI, but all this talk of cryonics and brain uploading really creeps them out? TL;DR: controversial issues --> explicit debate. (This is also related to the next question)
While focus on cryonics, encouragement of polyamory, and a preference for Torture over Dust Specks don't bother me personally, they might scare away newcomers.
Lack of purpose/direction (nobody knows what it means to "go create the art")
Needs to be shot in the head, instead is still shambling, like animated corpse of former self