CEA's Strategy

Last updated: March 2021

CEA's overall aim is to do the most we can to solve pressing global problems — like global poverty, factory farming, and existential risk — and prepare to face the challenges of tomorrow.

There are many opportunities to help others, but some are far more effective than the rest, and it's hard for people to find the best ones. People also often need social support, funding, and mentorship in order to pursue a career focused on helping others. As a result, talented people often have little impact (compared to what they could achieve) or become disillusioned with altruistic projects.

The EA community is a place where people can learn about and discuss which ways of doing good are most effective, based on impartially altruistic, truth-seeking principles. It also provides people with social support and connections that help them to help others, whether they become researchers, run their own charities, focus on personal philanthropy, or follow another path.

CEA is building this community by creating and sustaining high-quality discussion spaces. We curate events, local groups, and an online forum where people with a broad range of views discuss the most effective ways to address some of the world's most pressing problems.

If we succeed, the next generation of leaders, thinkers, and philanthropists will be focused on addressing these problems as effectively as possible.

We work across three broad areas: onboarding, continued engagement, and community health.


Onboarding

We focus our recruitment efforts on students and young professionals.

We want to work with EA-involved professionals and students of all ages, but we're especially focused on recruiting younger people, because this is where we have the best track record and experience. We also think that younger people typically have more free time and flexibility in their plans. Survey data indicates that people who heard about EA in their early 20s are the most likely to become deeply involved in the community.

The main downside of recruiting young people is that it will often be a long time before they reach their peak impact, and it's hard to predict how their values or priorities might change before then. This is part of why we also care about continued engagement (see below).

Local groups are one of the top ways that young people get more deeply involved, as demonstrated by the EA Survey and interviews with community members. We think that targeted events and resources can bolster student groups, and we aim to integrate events and the EA Forum more closely with groups' work (as we did with our introductory events and Student Summit in 2020).

Our goal is to solve pressing global problems, but we can't address everything at once. So as we decide how to use staff time and funding, we believe it's important to focus on the decision-makers who will be most influential in the future. In our society today, these future decision-makers are more likely to be found at highly-ranked universities. Therefore, we are especially focused on supporting groups at these universities. However, we also provide free resources and support to all EA groups around the world, and host open discussions on the EA Forum.

As we assess our own work, we track whether we're helping people take significant actions based on a good understanding of EA principles. This could include making a career plan, choosing a field of study, donating (or pledging to donate) a significant percentage of their income, or similar. We plan to carry out interviews and surveys to learn what actions people have taken (and why). However, we don't feel well-placed to rank the impact of particular actions, so we will remain relatively agnostic as long as interviewees seem to have a good understanding of the ideas of effective altruism. We do not require someone to have a particular type of job or a particular way of prioritizing causes.

We hope that our focus on especially promising populations and high-quality discussion will allow us to recruit people who go on to have a lot of positive impact. Throughout this work, we aim to be welcoming to people of different backgrounds.

Continued engagement

We also aim to ensure that existing members of the EA community, whether newer or more experienced, get value from the community and continue to take significant action based on EA principles.

This is valuable for several reasons:

  • People can learn about things that cause major changes in their plans. Due to new research and changing circumstances, the most promising opportunities for impact change over time. For instance, the relative value of earning to give fell around 2014 as Open Philanthropy became a major donor to EA causes. If people continue to engage with the community, they can continue to update their plans based on information like this. We think that this could substantially increase their expected long-term impact.

  • New ideas can help people in their current roles. If people are keeping up with new ideas from the community, they may learn things that allow them to be more effective in their current roles. For example, a researcher might learn about relevant research in an adjacent field, or a donor might learn something that changes where they donate.

  • Connections can lead to opportunities. If people are well-networked with others in the space, they'll have an easier time getting a project funded or switching to a role that's a better fit for them.

  • Existing members can help to onboard new members. They can do this by introducing their friends to EA or by providing advice and mentorship to newer members. Personal contacts are the most common way for people to find out about EA.

  • Engagement supports motivation. We want EA community members to feel supported and sustained over time in taking actions such as working in high-impact careers or donating effectively.

Community health

We also aim to preserve the EA community's ability to grow and produce value in the future.

Bringing people into the community is important, but EA's discussion norms, culture, and reputation are also big determinants of our long-term success. How we discuss ideas is vitally important because it shapes our ability to learn more, uncover mistakes we're making, and resolve uncertainties and disagreements. Additionally, EA's internal culture, reputation, and demographics affect who feels comfortable engaging with our community.

We think it's important that we build a healthy intellectual culture, a positive reputation, and an inclusive community. If we fail to do so, a lot of the movement's potential could be squandered. This could happen because we fail to focus on the most important issues, because we can't work with certain important groups, or because we miss key opportunities to grow the movement and help more people increase their impact.

We do many things to protect and develop these resources, including:

  • Moderating discussion and selecting speakers in a way that promotes a culture of collaborative truth-seeking, and curating introductory content that reinforces that culture.
  • Advising people who are speaking to the media about EA topics.
  • Helping to navigate conflicts between community members.
  • Improving demographic diversity and building an inclusive and equitable community.
  • Supporting geographic and professional areas where EA is just beginning to get established (and where its norms and reputation might be especially fragile).

Where do we work?

We aim to support recruitment, retention, and value preservation work around the world, in collaboration with local community builders. We currently manage the EA Forum, and provide resources and support calls for all EA groups. We've also held or supported global and regional conferences in countries including the US, the UK, Singapore, Germany, and Australia. We plan to maintain or improve these resources. 

While many of our resources are available for all organizers and individuals, our ability to fund full-time organizers at local groups is limited by our funding and by the time it takes to evaluate groups, so that program does have a narrower focus, which you can read about here.

Where we are not focusing

We want to be clear about what we're not doing, so that people who want to work on community building have a better sense of which areas are neglected. We think some of these things could be impactful if well-executed, even though we don't have the resources to take them on. This blog post gives more information on areas that we are not planning to focus on (as of March 2021).

In short, we’re not focusing on:

  • Reaching new mid- or late-career professionals
  • Reaching or advising high-net-worth donors
  • Fundraising in general
  • Cause-specific work (such as community building specifically for effective animal advocacy, AI safety, biosecurity, etc.)
  • Career advising
  • Research, except about the EA community
  • Content creation
  • Donor coordination
  • Supporting other organizations
  • Supporting promising individuals

Our plans for the next year

The above is focused on our high-level and long-term strategy. For more on what this means for our programs in the short term, see our work in 2020 and our plans for 2021.

Other resources

Other pages that discuss our strategy:

For news and updates on effective altruism more generally, we recommend signing up for the monthly EA Newsletter or reading the EA Forum.

Highly Engaged EA Metric

This section was added March 2022

This year we’ve been piloting a metric (called “Highly Engaged EAs”) within a subset of university groups.

We define highly engaged EAs as people who use high-quality reasoning to apply impartially altruistic principles to significant actions.

We don’t think that the Highly Engaged EAs metric is the only thing groups should aim for, but we think it helps group leaders make trade-offs in a valuable direction.

Breaking down this definition of Highly Engaged EAs a bit:

  • EA principles: there is no official list of EA principles, but we think that 80,000 Hours’ Key Ideas page or the Introductory EA Program cover most of the key topics/considerations.
  • Significant actions include things like career decisions, or credible steps toward a career plan for younger individuals.
  • High-quality reasoning means a person can do things like list their cruxes for their cause prioritisation (i.e. what would change their mind), and doesn’t seem to be retroactively justifying their current career path.

Our best guess is that most Highly Engaged EAs have spent over 100 hours engaging with high-quality EA content. But note that the “100 hours” isn’t really the thing we care about. It’s the actual understanding and application of impartially altruistic principles.

What the highly engaged EA definition does and doesn’t mean:

  • While we want group leaders to help people think through the key topics and considerations in 80,000 Hours’ Key Ideas page or the Introductory EA Program, the highly engaged EA definition does not mandate that a person hold a certain cause prioritisation or set of empirical beliefs about the world.
  • We do think it’s important that individuals share norms outlined in our guiding principles.
  • We do think it’s important that people can list reasonable cruxes for their current cause prioritisation.
  • The highly engaged EA definition does not mandate that a person is “socially involved in EA”.
    • For example: Person A might come to 50 EA events, or live in a group house with other EAs, but this does not make them a highly engaged EA. Meanwhile, Person B may have never interacted with other effective altruists in person, but may have read deeply through things like the 80,000 Hours website and the EA Forum and shaped career decisions around ideas that emerged from these spaces. It’s possible Person B is a highly engaged EA, despite having little social interaction with other EAs. The point we are emphasising here is that we want group leaders to focus on helping people engage with key ideas and apply them to their lives, not just on maximising the number of people showing up to social events.
  • Just because someone is working for an EA-aligned organization, or in an EA-aligned role, doesn’t mean they’re necessarily a highly engaged EA: e.g. they might not have thought through that career decision carefully and open-mindedly.

Over the past year we have worked with some group leaders to collect a (rough) baseline (2019–2020) of Highly Engaged EAs (HEAs) and a set of peer benchmarking data (fall 2021) from Oxford, Harvard, and Stanford. We also asked people engaged in these groups about the largest positive and negative influences of other EA programs they may have engaged in (EAG/x, the 80,000 Hours program, reading the EA Forum, virtual programs, etc.). This allows us to start tracking the effectiveness of different campus models and activities over time. After talking with group leaders, we think we should also have included a qualitative question in our user interviews about which aspects of their group interactions have had the most significant impact on them to date, as it would have provided helpful feedback for group leaders.

We think it’s good for group leaders to have group members complete surveys about their experience with their group, and to see how this changes over time. In addition to tracking the number of Highly Engaged EAs in their group, some group leaders have reported that the “self-reported engagement levels” used in the EA Survey have helped them track changes in their group over time, especially if they want to collect group members’ self-reported data.