Clearer Thinking with Spencer Greenberg
Read the full transcript here.
What is longtermism? Is the long-term future of humanity (or life more generally) the most important thing, or just one among many important things? How should we estimate the chance that some particular thing will happen given that our brains are so computationally limited? What is "the optimizer's curse"? How top-down should EA be? How should an individual reason about expected values in cases where success would be immensely valuable but the likelihood of that particular individual succeeding is incredibly low? (For example, if I have a one in a million chance...
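To give a rough sense of the optimizer's curse mentioned above (this illustration is not from the episode): when you pick whichever option has the highest noisy estimate of value, the winning estimate tends to overstate that option's true value. A minimal Python sketch with made-up numbers, assuming twenty options that are all genuinely equal in value:

```python
import random

# Hypothetical setup (not from the episode): 20 interventions with the
# same true value, each judged via a noisy estimate.
random.seed(0)

N_OPTIONS = 20
TRUE_VALUE = 1.0      # every option is actually worth the same
NOISE_SD = 0.5        # standard deviation of estimation error
TRIALS = 10_000

gap = 0.0
for _ in range(TRIALS):
    # Draw a noisy estimate for each option.
    estimates = [TRUE_VALUE + random.gauss(0, NOISE_SD) for _ in range(N_OPTIONS)]
    best_estimate = max(estimates)
    # The selected option's estimate exceeds its true value on average.
    gap += best_estimate - TRUE_VALUE

print(f"Average overestimate of the chosen option: {gap / TRIALS:.2f}")
# Typically prints roughly 0.9: the "best-looking" option appears far
# better than it really is, even though all options are identical.
```

The point of the toy simulation is only that selecting on noisy estimates builds in an upward bias, which is why naive expected-value comparisons among many uncertain options can mislead.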