The Centre for Effective Altruism (CEA) is helping to build a community of people acting on altruistic, impartial, truth-seeking principles, by nurturing spaces to discuss EA ideas. This post sets out our plans for the year.
In 2019, we carried out an Executive Director search, which ended when I (Max Dalton) was confirmed as Executive Director and subsequently brought on Joan Gass as Managing Director. We also focused on improving the execution of our programs (e.g. by more reliably following through on public commitments).
In 2020, we’ll be figuring out how to build on that foundation, i.e. improving our organizational focus and program objectives. We’ll also begin to narrow our focus by investing more in groups and community support, and by considering how to spin off our fundraising projects (Giving What We Can and EA Funds).
Below are our goals for 2020. We’ll review these goals periodically and adjust as circumstances change. We might not achieve all of these goals.
- Developing our strategy
  a. Gather feedback and data to test our strategy.
  b. Share more information about our strategy on the EA Forum and our website.
  c. For all projects, draft goals that link to our overall strategy, and use these goals as the basis for our annual review.
- Narrowing our scope by considering spinning off EA Funds and Giving What We Can
  a. Make a plan for this area that has been confirmed by the ED, CEA’s trustees, and project staff.
  b. Hire at least one person in this area.
- Hiring to support groups, diversity, and public communications
  a. Hire at least one person to increase support for university group organizers.
  b. Hire a staff member, or bring on a consultant, for the community health team.
  c. Gather feedback from members of underrepresented groups on EA messaging.
- Improving online discussion
  a. Run at least two online events between April and December 2020.
  b. Achieve a mean score of >8/10 on monthly support calls with organisers of EAGx conferences.
  c. Share a sequence of introductory material on the Forum, and add features that make it easier to compose such sequences.
  d. Make it easier to find Forum content via improved tagging and search.
  e. Improve the Forum post editor and reach out to potential authors.
- Streamlining internal collaboration and processes
  a. Maintain or increase average team morale scores during the COVID-induced remote work period.
  b. Aggregate data from at least three programs into a central CRM.
  c. Invest our reserves in low-risk, low-rate, diversified portfolios.
  d. Move into a new office in Oxford.
  e. Revamp our hiring and onboarding systems.
  f. Implement a new grant management system.
  g. Align account codes, automate reporting, and share quarterly financial reports with project leads.
I’d welcome feedback on these plans via this form or in the comments, especially if you think there’s something that we’re missing or could be doing better.
We are developing our answers to some of the following questions:
- What is CEA’s scope? What things should we do? What should we not do?
  - We hope that clarifying our plans will make it easier for others to coordinate with us — in particular, to take on areas that we’re not focused on.
- What would indicate that we’re doing a good job overall? What metric(s) should we focus on?
  - We hope that this will help us to focus internally and make tradeoffs.
  - It will also make it easier to communicate with community members and donors about the progress we’re making.
- What are the top priorities for CEA over the next 1-3 years?
- Which target audience should we focus on?
I have been working with staff, trustees, and an experienced executive advisor to develop draft answers to the questions above.
Some preliminary answers are:
- What is CEA’s scope?
  - CEA should focus on nurturing spaces for people to learn about and act on EA principles. These discussion spaces might include events, local groups, or online discussion spaces like the Forum. We want to promote the principles of EA with illustrative applications (e.g. to global health or existential risk), rather than promoting any one particular application.
  - We don’t think it’s our comparative advantage to do cause-specific community building (e.g. AI-safety discussion groups), promote particular career paths, promote effective giving, or do research.
  - Obviously, we want to continue to support and collaborate with partners who have a different focus (e.g. cause-specific or talent-focused).
- What would indicate that we’re doing a good job overall?
  - We think it would be harmful to evaluate CEA based on any simple metric.
  - We are developing a metric that accounts for how many people are regularly engaging with and acting on EA principles, and how satisfied those people are with EA.
  - We think that this metric should be supplemented by qualitative feedback from community members.
- What are the top priorities for CEA over the next 1-3 years?
  - For the 1-year timescale, our best guess is discussed below.
  - We’re still developing longer-term plans, but if we had to guess, we expect Groups to be the CEA program most likely to grow.
- Which target audience should we focus on?
  - We think that we are best placed to support existing community members (of all ages) and to recruit students and young professionals. We’re keen to welcome new high-net-worth individuals and mid-career professionals into the EA community, but we think that other projects (e.g. Effective Giving and 80,000 Hours) are better suited to recruit them.
We will improve our answers to these questions by consulting community members, gathering more data, and testing some hypotheses.
We also aim to share more about our strategy publicly. This may be a long process, so we're not sure whether we’ll have capacity to share more detailed plans in 2020.
We’d like to make our program objectives more explicit, test our assumptions about how to reach those objectives, and check for things we might be missing. In particular, we’d like to improve our answers to the following questions:
- How do each program’s objectives contribute to achieving our org-wide strategy?
- How do we track/measure whether we're achieving these objectives?
- Do we have a reasonable plan for how we will achieve these objectives?
- Are we executing well (e.g. meeting commitments, allocating sufficient capacity to run each program)?
In 2019, Giving What We Can members logged over $20m in donations to the charities that they believe to be most effective, and 528 people took a 10% lifetime pledge, bringing the year-end total to 4,454 members. EA Funds facilitated grantmaking of $8.5m through the four main funds, as well as $3.4m to other effective charities.
I think that both of these programs are important for EA because:
- They direct a significant amount of money to effective charities.
- They provide an opportunity for individuals to take important, concrete actions based on EA principles.
However, these projects have a fairly different focus from CEA’s other projects (which focus on community engagement rather than charitable donations), and we think that with more focus and staff time they could achieve more.
We'd like to move towards a state where these projects have the latitude and resources to accomplish more, and where CEA can focus on a narrower range of projects. Over the last few months I’ve been working with trustees and staff to plan for the future of these projects, using surveys of users and members to inform our thinking.
We’ll initially search for someone who can lead an independent Giving What We Can. If you know someone who you think might be a good fit for this role (including yourself), you can fill out this form. If we find a leader for Giving What We Can, we’ll help to onboard and advise them, and we will continue to provide operational support to both EA Funds and Giving What We Can for the foreseeable future. Once we’ve completed our hiring round for the Giving What We Can director, we will consider plans to hire an executive for EA Funds and/or spin it out.
EA Survey respondents report that local groups and personal connections are some of the most important ways that they hear about EA and get more involved. This suggests that support for groups is important.
CEA’s Groups team currently has two full-time staff (Katie Glass, Harri Besceli) as well as a part-time contractor (Catherine Low). With this capacity, we expect to maintain the team’s current activities, such as funding organisers with the Community Building Grants program, maintaining discussion spaces for organisers on Facebook and Slack, curating resources for groups, and responding to organisers’ requests for advice or funding.
We’ve found that university groups are especially well-positioned to engage with new people interested in EA. We therefore intend to hire a new specialist to develop customised advice and resources for university group organisers. Given the large number of university groups, we’ll start by piloting this with a subset of groups.
Our community health team works to reduce risks to the community, and to improve people’s experiences in the community. This may increase effective altruism’s robustness and potential over the long run.
The team currently has three staff (Julia Wise, Sky Mayhew, Nicole Ross) who cover the following areas:
- Researching what the biggest risks to EA are, and creating proposals for addressing them
- Responding to community members’ concerns, such as mental health struggles, safety concerns, or interpersonal conflicts
- Responding to media inquiries and developing clear, accurate public messages about complex EA topics
- Advising on organizational strategy or responding to community members’ requests for advice on best practices on a variety of topics — for example, how to host safe events, moderate online groups, and build diverse and welcoming groups or programs
We want to have clear, inspiring messages about complex EA topics, and we want the spaces we foster to be welcoming to a diverse range of people. We think that these goals are linked. We’re still figuring out how to achieve these goals, and whether we need new hires/consultants to help us achieve them.
Due to COVID-19, we could not hold EA Global: San Francisco in person. Local groups around the world have stopped holding events and retreats, and we’re still not sure how many EAGx conferences will be able to take place “on location” this year.
We pivoted EA Global: San Francisco to an online format, which resulted in over 3,000 total live views of talks (and several thousand more since), 438 one-on-one meetings, and lots of discussion on Slack and YouTube. Whilst attendees didn’t make as many connections or give as much positive feedback as they do at in-person EA Global conferences, they still reported a positive experience and increased motivation. We’re pleased with the number of meetings and new connections given the lower cost of the event. We are also providing coordination and support for local groups and EAGx organisers as they shift to online events and discussion spaces.
The connections that people have formed in these events help to maintain people’s engagement with the community. We are interested in holding more virtual events in the future, and hope to learn things that we can apply once the current crisis is over.
The EA Forum is a key resource for people who read about and discuss EA online. This year, we’d like to provide better introductory resources, and increase the number of people reading and writing excellent content. Our plans include: adding introductory material, supporting authors, and helping readers find content that interests them.
Introductory material: We plan to launch a feature which allows users to create collections of posts (e.g. on a particular theme). We will release at least one collection ourselves, curated by our team and aimed at introducing people to effective altruism. We will ask experts and a broad range of community members to give feedback on the collection. In the future, this might replace our EA Handbook or the introduction on effectivealtruism.org.
Supporting authors: We will reach out to a number of EA researchers who rarely or never post on the Forum to offer assistance with editing and formatting, which we hope will lead them to publish more of their work on the platform. For the benefit of all our authors, we will be updating the text editor so that it is much easier to use (with features including native table support and the ability to copy and paste images into a post). We’re also considering ways to give authors better feedback on how people have engaged with their posts.
Helping readers: We will add a tagging system to make it easier to browse posts on a given topic. We’ll also make back-end technical changes to make it easier to find specific posts via Google search.
(Several features mentioned in this update will be ported over from LessWrong; much of the Forum’s code is based on that site, and we work closely with their team on development.)
We will also continue to focus on increasing our core metric and will choose additional projects throughout the year to further that goal.
Currently, our records of community members are split across different systems for different programs. This year, we aim to pull together this data into one CRM.
We hope that this will enable staff to more easily access and aggregate information, and make more informed decisions (e.g. about EAG admissions). It will also help us to track community growth and monitor whether people are drifting away or continuing to engage with EA.
CEA runs operations for 80,000 Hours and the Forethought Foundation, provides operational support to several other EA projects, and routes millions of dollars of donations to effective projects and organisations. Given this, there are significant benefits to reducing risks, saving time, and investing financial reserves appropriately.
To improve our operations, we intend to:
- Move into a larger, refurbished office in Oxford (shared with the Future of Humanity Institute and the Global Priorities Institute).
- Upgrade our hiring and onboarding systems.
- Set up software systems to automate grantmaking operations and to track donations to CEA, 80,000 Hours, and the Forethought Foundation.
- Improve our financial systems (e.g. automate more of our reporting).
- Invest our reserve capital in low-risk, low-rate, diversified portfolios.
For the past few years, we’ve had several remote staff, so we’ve always had lots of video calls, team retreats, and an active Slack workspace.
We plan to permanently close our office in Berkeley, with our Berkeley staff moving to remote work. This will save time and money, and establish Oxford as our headquarters.
Due to COVID-19, all our staff are currently working remotely; we’ve invested further in making sure that staff have ergonomic home working spaces, regular group social calls, time off when necessary, and tailored support from their managers.
I’m proud of how we have developed as a team over the last year. Staff at our last team retreat reported enjoying collaborating with colleagues, learning from them, and seeing them grow.
We're still developing a more explicit account of our cultural values, but some things I'd like us to keep doing are:
- Raising our standards for execution and making sure we have capacity to execute on any projects we take on
- Acknowledging and learning from our mistakes (e.g. see our updated mistakes page)
- Encouraging internal transparency and upward feedback
- Building, through discussion, a more detailed and specific sense of our goals, how well our projects are furthering those goals, and how we hope the EA community will develop
- Seeking advice and input from stakeholders and feedback from a broad range of community members as we develop project goals
By the end of 2020, I hope we'll have made further progress on our org-wide strategy and have more specific program objectives that help us achieve our org-wide goals. I also hope that we’ve strengthened our groups and community support, and begun to set EA Funds and Giving What We Can up to flourish independent of CEA.
I’d welcome feedback on these plans via this form or in the comments. I particularly look forward to hearing reactions to our initial thoughts on strategy, shared above.
We plan to check in on these goals in a future annual review.