Group discussion of a recent podcast from 80,000 Hours in which they interview Tyler Cowen - link here.
Tyler makes the case that, despite what you may have heard, we can make rational judgements about what is best for society as a whole. He argues:
Our top moral priority should be preserving and improving humanity’s long-term future.
The way to do that is to maximise the rate of sustainable economic growth.
We should respect human rights and follow general principles while doing so.
They also look at:
Why couldn’t future technology make human life a hundred or a thousand times better than it is for people today?
Why focus on increasing the rate of economic growth rather than making sure that it doesn’t go to zero?
Why shouldn’t we dedicate substantial time to the successful introduction of genetic engineering?
Why should we completely abstain from alcohol and make it a social norm?
Why is Tyler so pessimistic about space? Is it likely that humans will go extinct before we manage to escape the galaxy?
Is improving coordination and international cooperation a major priority?
Why does Tyler think institutions are keeping up with technology?
Given that our actions seem to have very large and morally significant effects in the long run, are our moral obligations very onerous?
Can art be intrinsically valuable?
What does Tyler think Derek Parfit was most wrong about, and what was he most right about that’s unappreciated today?
How should we think about animal suffering?
Do self-aware entities have to be biological in some sense?
What’s the most likely way that the worldview presented in Stubborn Attachments could be fundamentally wrong?
Location: Wellcome Collection, 183 Euston Road, NW1 2BE
If lost you can call 07740836835