You’re welcome to join us for our next Data Ethics & Society Reading Group on Tuesday 24th August 2021, from 12:30 to 13:30 GMT.
Following our successful event on the first three chapters (Earth, Labor, and Data), this time we’re going to discuss the final chapters (Classification, Affect, State, and Power) of Atlas of AI by Kate Crawford.
Atlas of AI presents AI as a technology of extraction: from the minerals drawn from the earth, to the labour pulled from low-wage information workers, to the data taken from every action and expression.
This book can be purchased in the UK from Blackwell’s, AbeBooks, Amazon (Kindle or hardback), or an independent retailer.
Below, we have put together some material related to the concepts covered in the book.
Kate Crawford on “Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence” (video, 48 mins). Crawford discusses the book in this sub-one-hour talk, drawing on content from the book itself. If you don’t have time to read the book, or to read/watch/listen to the material below, please watch this video.
Excavating AI, Kate Crawford & Trevor Paglen. Covers much of the same material as the ‘Classification’ chapter.
Artificial Intelligence is Misreading Human Emotion, Kate Crawford, The Atlantic. This article is adapted from the book’s ‘Affect’ chapter.
Google’s artificial intelligence ethics won’t curb war by algorithm, Phoebe Braithwaite, Wired. This article explores how Google was involved with the US Department of Defense’s Project Maven, which used AI to target drone strikes.
Stop talking about AI ethics. It’s time to talk about power, Karen Hao, MIT Technology Review. This is a review of the book that neatly summarises its main arguments.
Thank you to Harriet for suggesting this week’s content, and to all those who suggested content that we look forward to sharing at future events.
There will be time to talk about whatever we like relating to the book, but here are some specific questions to think about while you’re reading.