This is in response to a post in the Effective Altruism Facebook group.
>I’ve noticed a recurring problem with discussions in EA Facebook groups, and really this is a problem with discussions on all Facebook groups focused on discussing and debating serious, smarty pants topics.
>People say really mean things.
>Sometimes it is blunt. Sometimes it is passive-aggressively wrapped up in diplomatic rational speak. Either way, it makes me want to light a house on fire.
>Here’s the bottom line. It is impossible to engage in a productive or worthwhile conversation with someone who, subtly or overtly, challenges your intelligence, dismisses your ideas wholesale, or otherwise treats you without respect. Meanness salts the earth of fruitful discourse.
>When people talk down to each other, many people (most people?) disengage. It becomes too stressful. Even the second-hand degradation is uncomfortable to watch, and being directly degraded is unbearable. Inevitably, some people will continue to engage, but it will be the people with the most aggressive, adversarial, and abrasive conversational style. This is what I often see happening in EA discussion groups (and other discussion groups too). The most engaged participants are the ones who are rude and condescending to the people they disagree with.
In-person interaction, and human relationships in general, don’t have all the same features as online discourse. EA is a community that takes discussing and debating serious, smarty-pants topics to the level of actually doing something about them, something some people put their all into, and a way of life. Having been part of the EA community since 2011, knowing hundreds of effective altruists, and hearing stories that match my own observations, I notice the same thing happens in person. Effective altruists can be blunt, passive-aggressive, mean, disrespectful, and dismissive, not only in how they communicate but in their behaviour over extended periods of time. Meanness salts the earth of fruitful cooperation.
Effective altruists are drawn to hubs where they live, work, and date each other while already being part of the online bubble that is EA, so the relationship an individual has with the whole community can distort their life. As online conversations and personal experiences compound, they just become our lives. Effective altruists can thus end up living lives where this kind of treatment feels like all the community they’ve joined amounts to.
For years EAs have talked about improving online discourse, but much of what happens never makes it onto the internet, and even when it does, it doesn’t go public before the whole movement. And it shouldn’t. I think local EA communities resolve their interpersonal conflicts one way or another, and whatever means local groups can generate to resolve their own problems may be the best we can hope for. Beyond that, the best I can think of is for some effective altruists to start a project suggesting or recommending more compassionate ways of communicating and being around one another. We can’t force that on people. However, I don’t want effective altruists to pretend that poor online discourse in the community is utterly unrelated to what happens in those small local groups, or on private mailing lists, or at meetups, or in our daily lives as EA becomes our daily lives.
Ultimately, solving communication or coordination problems isn’t some uniquely intellectual task. It’s a human problem. Nobody has this all figured out. To the extent effective altruists, in person and online, are finding solutions for how to get along better, we’re finding solutions for how people get along better, period. This is something the rationality community has focused on for its own purposes more than the EA community has. It’s not my impression the rationality community is particularly better or worse than anyone else at having its members get along. They’ve certainly found or generated tools for doing so, but it takes a lot of concentration and focus to use them well. As for the tools they’ve borrowed from, e.g., different therapeutic paradigms, I don’t see the rationality community having a better knack for them than those already working in the caring professions.
Where I think the rationality community is good at helping people get along is in showing each other how to get along within a big group under novel social arrangements. To some extent, EA and many other social movements like it are unprecedented in modern history, and we’re thrust together in ways our intuitions and traditions don’t prepare us for. The rationality community seems focused on drawing on many working practices for improved communication and applying them to the unusual situations and lifestyles its members find themselves in, because rationalists try unusual things. Doing the most good is about doing good in unusual ways too, or else it wouldn’t be necessary.
Improving online discourse is hard. Discourse in EA seems good enough, though; for the most part, people are sticking around. It’s working decently, but I think EAs need to take stock of the fact that this is a community in which many of us will know each other for the rest of our lives. Solving all global problems seems like it may take a while. Solving online discourse problems is a surface layer that will become less important than the way of life and culture EA is becoming. It’s not too late to shift those norms for the better. I think in time how we learn to get along everywhere will dominate considerations of how we learn to get along online.
Again, I want to emphasize that all of this needs to be acknowledged, but a lot of it is part of the human condition. I think EAs get so excited by the ideas and the projects and the people that we hope we can find a superhuman way of solving interpersonal problems too. EA is a movement that generates families, but it won’t be a movement that can prevent divorces. EA is a movement that gives people families of choice, but it won’t be a movement that can prevent some of those families from splitting up, like any other family might. All we can do is try.