CEA on community building, representativeness, and the EA Summit

Posted on Tuesday, August 14th 2018
(last updated Wednesday, October 6th 2021)

(This post was written by Kerry Vaughan and Larissa Hesketh-Rowe with contributions from other members of the CEA staff)

There has been discussion recently about how to approach building the EA community, in light of last weekend’s EA Summit and this post on problems with EA representativeness and how to solve it. We at CEA thought it would be helpful to share some of our thinking on community building and representativeness in EA.

This post comprises four sections:

  1. Why work to build the EA community? - why we prioritize building the EA community and think this is a promising area for people to work in.
  2. The challenge of prioritization - how prioritizing some activities can present challenges for community building and representativeness.
  3. CEA supports other community builders - how we can do better by working with other organizations and individuals.
  4. Our views on representativeness in EA - why we believe EA should be cause-impartial, but CEA’s work should be mostly cause-general, and involve more description of community priorities as they are.

Why work to build the EA community?

Ultimately, CEA wants to improve the world as much as possible. This means we want to do things that evidence and reason suggest are particularly high impact.

To make progress in understanding the world, or in solving any of its most pressing problems, we will need dedicated, altruistic people who think carefully about how to act. Those people can have a much higher impact if they are guided by and can add to cutting-edge ideas, have access to the necessary resources (e.g. money), and can coordinate with one another.

Due to this need, we think one way we can have significant impact is by building a global community of people who have made helping others a core part of their lives, and who use evidence and reason to figure out how to do so as effectively as possible.

This is why we consider working on building the EA community a priority path for those looking to have an impact with their career. Work done to bring people or resources into the community, or to strengthen our ideas and capacity for coordination, can multiply our impact severalfold, even if we later change our minds about which problems are most pressing.

(You can see some considerations against working on EA community building here.)

The challenge of prioritization

CEA’s central challenge is prioritization. Given our finite money, staff, and management capacity, we have to choose where to focus our efforts; CEA cannot single-handedly do everything the EA community needs.

This year, we’ve been primarily focusing on people who have already engaged a lot with the ideas and community associated with effective altruism, so that we can better understand what those people need and help them put their knowledge and dedication to good use. We think of this as analogous to focusing at the bottom of a marketing funnel and getting to know our “core users”.

In practice, this has meant focusing on projects like running smaller retreats for people who are already highly engaged with EA and putting more attention on a smaller number of local groups, rather than trying to provide broad support to many.

Our plan has been to get these projects up and running and reliably doing valuable work before expanding our support further up the funnel. At this point, however, we are starting preparations to do more higher in the funnel, such as running a broader range of events, funding more projects, and supporting more local groups. To meet these new goals, we’ve recently been looking to hire community specialists, events specialists, and an EA Grants Evaluator.

Focusing on one area inevitably means deprioritizing other things that would also add a lot of value to the EA community. We try to mitigate some of the costs of prioritization by helping other groups provide that support instead.

CEA supports other community builders

We generally encourage members of the EA community to get involved in building it, especially in areas that are valuable but not currently prioritized by CEA. Because CEA is currently management- and staff-constrained, the easiest ways for us to support others are with funding, branding, and expertise.

Some actions we’ve taken (or plan to take) to support the work of others include:

  • Providing more than $650,000 to groups and individuals doing local community building (in progress).
  • Re-launching EA Grant applications to the public with a £2,000,000 budget and a rolling application process (to be launched by the end of October 2018).
  • Helping groups run EAGx conferences in their local areas by providing the brand, funding (both for the event and a stipend to organizers), and advice (this year we supported events in Australia, Boston, and the Netherlands).
  • Supporting Rethink Charity’s work on LEAN with a $50,000 grant (grant provided).
  • Supporting Charity Entrepreneurship’s work to build new EA charities with a $100,000 grant (grant currently being finalized).
  • Supporting the LessWrong 2.0 team with a $75,000 grant.
  • Supporting the EA Summit with a $10,000 grant.

There’s certainly more we can do to support the work others are doing, and we’ll be on the lookout for more opportunities in the future.

The EA Summit

A recent example of one of the ways we’re trying to support non-CEA community building efforts is by supporting the EA Summit, which took place last weekend. The EA Summit was a small conference for EA community builders, incubated by Paradigm Academy with participation from CEA, Charity Science, and the Local Effective Altruism Network (LEAN), a project of Rethink Charity.

In late June, Peter Buckley and Mindy McTeigue approached Kerry and Larissa to discuss their concerns around a growing bias towards inaction in the EA community and a slowdown in efforts to build a robust, thriving EA community. We decided that these were important problems and that the EA Summit was a good mechanism for addressing them, so we were happy to support the project.

The main consideration against supporting the Summit was that it was incubated by Paradigm Academy, which is closely connected to Leverage Research. We concluded that this was not a compelling reason to withhold support: the EA Summit was a transparent project of clear value to the EA community.

Three CEA staff members attended the conference, with Kerry delivering the closing keynote. Our impression was that the conference was a success. Despite being organized on short notice, the event had over 100 attendees, was well run, and ended with an excellent party. Attendees seemed to come away with the message that there are useful projects they can work on that CEA would support, and overall had overwhelmingly positive things to say about the conference.

However, the fact that Paradigm incubated the Summit and Paradigm is connected to Leverage led some members of the community to express concern or confusion about the relationship between Leverage and the EA community. We will address this in a separate post in the near future. (Edit: We decided not to work on this post at this time.)

EA and Representativeness

One area the EA Summit aimed to address was concern about representativeness in EA, most recently raised by Joey Savoie. The question of how CEA should represent the EA community is one we’ve thought about and discussed internally for some time. We plan to write a separate post on this, but here is an outline of our thinking so far. We believe the EA Forum should be a place for everyone to share and build upon ideas and models, so we’d love to see discussion of this here.

On representativeness, our current view is that:

  1. The EA community should be cause-impartial, but not cause-agnostic.
  2. CEA’s work should be broadly cause-general.
  3. Some of CEA’s work should be descriptive of what is happening in the community, but some of our work should also be prescriptive, meaning that it is based on our best guess as to what will have the largest impact.
  4. We’re unsure who our work should be representative of.
  5. While we took some steps to address representativeness prior to Joey’s post, we welcome suggestions on how we can improve.

The EA community should be cause-impartial: EA is about figuring out how to do the most good and then doing it. This means we don’t favor any particular beneficiaries, approaches, or cause areas from the start, but instead select causes based on an impartial calculation of impact (cause-impartiality). This in turn means we should be both seeking to reduce our uncertainty about the relative impact of different causes and seeking to find new areas that could potentially be even more important (see Three Heuristics for Finding Cause X for some ideas on how this might be done).

Success for the EA community should include a strong possibility that we learn more, change our minds, and therefore no longer work on causes that we once thought were important.

CEA’s work should be broadly cause-general: The reason we have an EA community instead of individual communities focused on specific causes is:

  1. We don’t know for certain what causes are most important and we may discover a new Cause X in the future.
  2. We don’t know for certain which approaches to existing causes are most important and we may discover new approaches in the future.
  3. Despite our uncertainty, we can take actions that are useful across many causes.

CEA’s work should be broadly beneficial regardless of one’s views on the relative importance of different causes. This is why our mission is to build the EA community. We believe our comparative advantage lies in finding and coordinating with people who can work on important problems.

CEA’s work as both descriptive and prescriptive: While most of our work is cause-general, there will be cases where we have opportunities to support work in particular cause areas that we currently believe are likely to have the highest impact.

We think it is therefore helpful to make a distinction between aspects of CEA’s work that are descriptive and those that are more prescriptive.

Descriptive work aims to reflect what is actually happening in the EA community; the kinds of projects people are working on and issues people are thinking about. The EA Newsletter is a clear example of this because it includes updates from around the community and from a variety of EA and EA-adjacent organizations.

Other aspects of CEA’s work should be prescriptive, meaning that they involve taking a view on where the community should be headed or on what causes are likely to be most important. For example, CEA’s Individual Outreach team does things like help connect members of the community with jobs we consider high-impact.

In forums where CEA is providing a resource to the entire EA community (for example, the EA Forum, Effective Altruism Funds, or events like EA Global), our work should tend towards being more descriptive.

We’re unsure who our work should be representative of: One challenge in making our work more representative is that it’s unclear what reference class we should use.

At one extreme, we could use all self-identifying EAs as the reference class. This has the downside of potentially requiring that our work address issues that expert consensus indicates are not particularly important.

At the other extreme, we could use the consensus of community leaders as the relevant reference class. This has the downside of potentially requiring that our work not address the issues that the overwhelming majority of community members actually care about.

The best solution is likely some hybrid approach, but it’s unclear precisely how such an approach might work.

Soliciting a wider range of viewpoints: Although we think we should do more on representativeness, we had already taken some steps to address this concern prior to Joey’s post.

These included:

  • Consulting ~25 advisors from different fields about EA Global content (already in place).
  • Changing the EA handbook to be more representative (in progress).
  • Selecting new managers for the Long-Term Future and EA Community Funds (in progress).

We do, however, recognize that when consulting others it’s easy to end up selecting for people with similar views, and that this can leave us with blind spots in particular areas. We are thinking about how to expand the range of people we get advice from. While we cannot promise to enact all suggestions, we would like to hear suggestions from forum users about what else they might like to see from CEA in this area.