![rw-book-cover](https://substackcdn.com/image/fetch/$s_!VTm_!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa07322e5-1e94-4bf0-830f-cac23a76c79d_780x438.png)

## Metadata
- Author:: [[Benn Stancil]]
- Full Title:: Compacting...
- Category:: #🗞️Articles
- URL:: https://benn.substack.com/p/compacting
- Read date:: [[2026-03-22]]

## Highlights
> If you had a day to save your [business], where would you go looking for answers? In your comprehensive database of financial indicators, operational KPIs, customer behaviors, and market trends, where every metric and insight is a query away? Or in 25 straight hours of unedited video interviews [with 750 customers]? ([View Highlight](https://read.readwise.io/read/01km87wbvx68847kmrdkc6b9xt))

> There are two sides—a light and a shade, if you will—to the capabilities that powered Anthropic’s study. On one hand, they demonstrate how much more we can learn with AI. When a machine can have conversations and another machine can summarize them, we can hear far more voices. ([View Highlight](https://read.readwise.io/read/01km8830xq9h272x45w3npkj61))

> On the other hand, *is this listening?* We have always known that facts and figures are sterilizing. ([View Highlight](https://read.readwise.io/read/01km8835ehk18gd5ddz17p1er9))

> [From Ezra Klein](https://www.youtube.com/watch?v=smb7hy6KufQ&t=2761s), in a conversation with David Perell: ([View Highlight](https://read.readwise.io/read/01km8848nh3jsw1hphf1pbysyj))

> Part of what is happening when you spend seven hours reading a book is you spend seven hours with your mind on this topic. The idea that [large language models] can summarize it for you…you didn’t have the engagement. ([View Highlight](https://read.readwise.io/read/01km8841q1cpztfcvv40evww6r))

> Will we still understand what they’re really saying? ([View Highlight](https://read.readwise.io/read/01km885jb94ntb6mb5x9bkcffk))