
The Hidden Costs of Technology and Our Search for Selfhood with Vauhini Vara

September 2, 2025


In this episode, Vauhini Vara discusses the hidden costs of technology and our search for selfhood. She explains how we live in a world where technology functions as both a lifeline and a trap—offering connection, convenience, and possibility while also shaping our choices, exploiting our attention, and redefining how we see ourselves. Together, Eric and Vauhini explore the tension of relying on tools we can’t seem to live without, the subtle ways algorithms alter our communication, and what it means to hold onto authentic selfhood in the digital age.

Feeling overwhelmed, even by the good things in your life?
Check out Overwhelm is Optional — a 4-week email course that helps you feel calmer and more grounded without needing to do less. In under 10 minutes a day, you’ll learn simple mindset shifts (called “Still Points”) you can use right inside the life you already have. Sign up here for only $29!

Key Takeaways:

  • Exploration of the dual nature of technology as both beneficial and exploitative.
  • Discussion on the impact of major tech companies like Amazon, Google, and OpenAI on personal identity and society.
  • Examination of the ethical implications of consumer choices in a global capitalist system.
  • Reflection on how technology alters human communication and relationships.
  • Analysis of the concept of “algorithmic gaze” and its effects on self-perception and identity.
  • Personal narratives intertwining technology with experiences of grief and loss.
  • Consideration of AI’s role in creative processes and its limitations compared to human expression.
  • Discussion on the commodification of identity in the age of social media and audience capture.
  • Insights into the ongoing negotiation between convenience and ethical considerations in technology use.
  • Emphasis on the importance of individual agency and conscious decision-making in navigating the digital age.

Vauhini Vara is the author of Searches, named a best book of the year by Esquire and a Belletrist Book Club pick; Publishers Weekly called it a “remarkable meditation.” Her previous books are This is Salvaged, which was longlisted for the Story Prize and won the High Plains Book Award, and The Immortal King Rao, a Pulitzer Prize finalist and winner of the Colorado Book Award. She is also a journalist and a 2025 Omidyar Network Reporter in Residence, currently working as a contributing writer for Businessweek.

Connect with Vauhini Vara: Website | LinkedIn

If you enjoyed this conversation with Vauhini Vara, check out these other episodes:

Distracted or Empowered? Rethinking Our Relationship with Technology with Pete Etchells

Can Radical Hope Save Us from Despair in a Fractured World? with Jamie Wheal

Human Nature and Hope with Rutger Bregman

By purchasing products and/or services from our sponsors, you are helping to support The One You Feed and we greatly appreciate it. Thank you!

This episode is sponsored by AG1. Your daily health drink just got more flavorful! Our listeners will get a FREE Welcome Kit worth $76 when you subscribe, including 5 AG1 Travel Packs, a shaker, canister, and scoop! Get started today!

BAU, Artist at War opens September 26. Visit BAUmovie.com to watch the trailer and learn more, or sign up your organization for a group screening.

LinkedIn: Post your job for free at linkedin.com/1youfeed. Terms and conditions apply.


If you enjoy our podcast and find value in our content, please consider supporting the show. By joining our Patreon Community, you’ll receive exclusive content only available on Patreon! Click here to learn more!

Episode Transcript:

Vauhini Vara 00:00:00  When I communicate, and I think this is not just because I’m a writer, I think it’s because I’m a human being, when I communicate, the gratification I get from that communication is from having made the effort of communicating myself, and it sort of does nothing for me if a machine does it for me. I mean, it doesn’t feel that different from using a magic eight ball or something to produce words.

Chris Forbes 00:00:31  Welcome to The One You Feed. Throughout time, great thinkers have recognized the importance of the thoughts we have. Quotes like garbage in, garbage out, or you are what you think, ring true. And yet, for many of us, our thoughts don’t strengthen or empower us. We tend toward negativity, self-pity, jealousy, or fear. We see what we don’t have instead of what we do. We think things that hold us back and dampen our spirit. But it’s not just about thinking. Our actions matter. It takes conscious, consistent, and creative effort to make a life worth living. This podcast is about how other people keep themselves moving in the right direction, how they feed their good wolf.

Eric Zimmer 00:01:15  We live in a world where technology is both a lifeline and a trap. Take Amazon: I once swore them off after a broken blender and the most absurd customer service call imaginable. I made a big declaration: that’s it, no more Amazon. And my grand boycott lasted five days. And the worst part? I disliked myself a little when I went back, because it wasn’t just the blender. This was already a company that had killed my beloved bookstores, and now it feels like they’re coming for everything else. That’s the trap: we keep returning to what we wish we didn’t need. Vauhini Vara explores this exact tension in her book Searches: Selfhood in the Digital Age, showing how the very tools that connect us also exploit us. In our conversation, we wrestle with this ambiguity: how do we live with technology we can’t seem to live without? I’m Eric Zimmer, and this is The One You Feed. Hi, Vauhini. Welcome to the show.

Vauhini Vara 00:02:15  Thanks for having me.

Eric Zimmer 00:02:16  I’m excited to talk with you about your latest book, which is called Searches: Selfhood in the Digital Age. It’s really a book that explores, I think, our relationship with technology, broadly speaking, and it’s a topic that I think is a really important one, because we are in deep relation to technology, most of us, all the time.

Eric Zimmer 00:02:38  And so I think it’s always worth exploring that. But before we get into your book, we’re going to start with a parable like we always do. And in the parable there’s a grandparent who’s talking with their grandchild, and they say, in life there are two wolves inside of us that are always at battle. One is a good wolf, which represents things like kindness and bravery and love, and the other is a bad wolf, which represents things like greed and hatred and fear. And the grandchild stops and thinks about it for a second, looks up at their grandparent and says, well, which one wins? And the grandparent says, the one you feed. So I’d like to start off by asking you what that parable means to you in your life and in the work that you do.

Vauhini Vara 00:03:17  Yeah. I mean, it makes me think of a couple of different things. One thing it makes me think of is the way in which sometimes those two wolves are very intertwined. Like, it’s actually the same wolf, right, with two sides.

Vauhini Vara 00:03:31  And I think about that when it comes to our relationship with big technology companies’ products, which is the subject of my book, because I think we sometimes like to talk about that in binary terms. You know, we say these technology companies are exploiting us and they’re evil. And then the technology companies will say, but you’re using these products, so you must find them pretty useful, and even fun and enjoyable, and maybe they even bring you joy. And the truth, of course, is not only that both of those things are true, but that they are deeply intertwined. Like, the way in which these products are useful to us requires the exploitation. So, for example, when we use Amazon and we’re delighted that things don’t cost much and they come to us quickly, that’s because there are labor practices at Amazon and its suppliers, practices that can be described as shady and exploitative, that are responsible for making these products cheap and getting them to us quickly.

Eric Zimmer 00:04:39  Yeah, that’s one of the things I really loved about the book: the deep ambiguity in it, the ambiguity that you have and write about really honestly with technology, and the nuance of recognizing that these are both good and bad things. And I like what you say.

Eric Zimmer 00:04:59  That can’t necessarily just be taken apart, right? Like, one of the things that makes a search engine more valuable to us over time is that it knows what we want. But that’s the exact thing that is being exploited: knowing what I want. Right. And so, you know, it’s very hard to envision a world in which you got one without the other. So maybe you could just first describe for us what the book is, because there are a lot of different ways of talking about it, and I want to make sure that you get to present it in the way that makes sense to you.

Vauhini Vara 00:05:34  Yeah. I mean, I think of the book as a document of what it’s like to live in a world in which our consciousness has been so colonized by big technology companies and their products. And then, in addition or relatedly, what it feels like to be complicit in the rising power and wealth of big technology companies, and the exploitation of that power and wealth: you know, the way in which all those things provide us with usefulness, the way in which that makes us feel guilty and ashamed, and also glad that these products exist in our lives.

Vauhini Vara 00:06:23  And I do that in the book in a way that I think is sort of unusual, in that I write about it. So the book has chapters where I’m talking about this in my life and in all of our lives. And then there are chapters, and, you know, bits that are interspersed between the chapters, where I’m kind of showing this in action, using my own interactions with these companies’ products. So there’s a chapter, for example, made up of my own Google searches over ten years.

Eric Zimmer 00:06:50  Which I found absolutely fascinating, and it made me kind of want to go find all my Google searches over ten years, because you’re right: what an interesting way to look back on your life and your interests. I loved how, when you were talking about these searches, you quoted somebody, John Battelle, who called it the Database of Intentions: a comprehensive record of human desires, fears and needs that becomes raw material for corporate profit.

Vauhini Vara 00:07:20  Yeah, I love that characterization. That comes to me from the scholar and writer Shoshana Zuboff.

Vauhini Vara 00:07:28  That idea that all this material that we put into these products is raw material that they then render into a product, and the product is actually the information about us, which they’re then able to use to get marketers and advertisers to present ads to us. And so, yeah, I mean, there’s that. But then also, I’m a writer, I’m an artist, and so that term raw material has this additional artistic meaning to me. And it’s interesting to me that my searches on Google, anyone’s searches on Google, function both as raw material for this corporate technological machine that companies like Google are operating, and as raw material for my own work. Right? Like, I know more about myself from having had Google maintain this database of my intentions over ten years or longer than I otherwise would have. And I can use that. I can go back to that database and remember what I was doing 15 years ago on this day, in a way that might serve my writing. You know.

Eric Zimmer 00:08:39  Is that something everyone can do? Is this public? Like, are my search results available out there somewhere for me?

Vauhini Vara 00:08:45  So even I, as a tech reporter, with all these years of experience covering Google and other companies knew that Google was collecting that information, and I could have sworn that I would have turned off its ability to collect that information about me. And it seems like I did at various points in time, but then either turned it back on or I don’t know what happened. But most of the time that I’ve used Google, it’s been collecting this information about me.

Eric Zimmer 00:09:12  As I said earlier, the thing that I really felt throughout this book is this sort of wrestling with how we use this technology, what we allow this technology to know about us, and really weighing the costs and benefits. I mean, you’ve got a whole chapter about Amazon where you meet with a friend who sort of says, I don’t use Amazon, for a whole host of reasons, and I don’t want to turn this into a why-Amazon-is-bad podcast.

Eric Zimmer 00:09:39  But there are lots of things about Amazon that we could question as being good for us, good for the planet, good for other people. And I know that I even had an incident recently where I was like, okay, that’s it. Not only do I not want to use Amazon for what to me are moral reasons, but now they’ve really pissed me off. And it was something stupid I bought, like a Neutra blender or something, from them, and it was the second one that broke. The first time, they said, just throw it away and we’ll send you a new one. The second time, I thought that’s what they wanted me to do, so I just threw it away. And then we got into this whole thing, and I was like, I have been a customer of yours for 25 years plus; the amount of money I have spent with you, and I’m on the phone with someone, is mind-boggling. Just give me the 39.99, whatever it is, refund. Any sane human would look at this relationship as a customer and go, that’s worth keeping.

Eric Zimmer 00:10:39  And they didn’t. They wouldn’t. And I was so frustrated, just by the principle of it, that I was like, all right, that’s it, no more Amazon. Which lasted all of about five days, right? And then, of course, I’m like, if I can buy it somewhere else, I’ll buy it elsewhere, but things that I only seem to be able to get there, I’ll get there. And of course, then that sort of erodes, and before you know it, it’s kind of back to the same old, same old with it.

Vauhini Vara 00:11:07  Exactly. Yeah. And I think, I mean, one thing I like about The One You Feed parable is the way in which it emphasizes personal choice, because I think a couple of things are happening here. These companies have become so entrenched that there is an extent to which there are strong incentives for us to use their products. You know, the shelves at Walgreens are a lot emptier now than they were previously, because fewer people are shopping at Walgreens, right? And so if I need to get the special contact solution I use for my rigid gas permeable contact lenses,

Vauhini Vara 00:11:49  I can’t find it at Walgreens anymore. Right. And so the most natural thing is to turn to Amazon. That said, I continue to feel, and insist to myself and to others, that we do have choice in the matter, right? Like, we can try to make the effort to decide to approach this a different way. And, you know, it’s not a spoiler to say that I get to the end of my book and I’m still using all of these products I’m critiquing. At the same time, what’s not on the page is the fact that I have been engaged in this process of trying, as much as I can, to divest myself. It’s an ongoing process, and it involves some failures and some successes, but I think that effort is worthwhile.

Eric Zimmer 00:12:40  Yeah. What I was struck by as you were talking, and have been struck by in my relationship with Amazon, is the word convenience, and how convenience has become an unstated value for so many of us, such that it ends up being prioritized over other values of mine in an unstated way. It has somehow become this thing that is expected, needed. Maybe it’s the pace of life. I mean, I think there’s a whole bunch of factors, but with a lot of the things that you’re talking about, it’s the convenience or the time saving that ropes me back in.

Vauhini Vara 00:13:20  Yeah. And I think this is tied up with the broader economic history of the United States, too, where it used to be a country where our identity was tied up with a lot of different things. And now, as fewer things are made in the United States, we end up having this role, where our primary economic role is the role of consumers. Right? And so everything gets oriented around that, including antitrust law, which in the past was about all kinds of different things and now is very much about, well, as long as the consumer is doing better in this arrangement than they were before this company came along, we have to admit this is a positive outcome. And that disregards all the other negative outcomes that can come from companies becoming bigger and more powerful.

Eric Zimmer 00:14:39  Eight years ago, I was completely overwhelmed. My life was full of good things: a challenging career, two teenage boys, a growing podcast, and a mother who needed care. But I had a persistent feeling of I can’t keep doing this. Yet I valued everything I was doing, I wasn’t willing to let any of it go, and the advice to do less only made me more overwhelmed. That’s when I stumbled into something I now call the Still Point Method: a way of using small moments throughout my day to change not how much I had to do, but how I felt while I was doing it. So I wanted to build something I wish I’d had eight years ago, so you don’t have to stumble toward an answer. That something is now here, and it’s called Overwhelm Is Optional: Tools for When You Can’t Do Less. It’s an email course that fits into moments you already have, taking less than ten minutes total a day. It isn’t about doing less; it’s about relating differently to what you do. I think it’s the most useful tool we’ve ever built.

Eric Zimmer 00:15:43  The launch price is $29. If life is too full, but you still need relief from overwhelm, check out Overwhelm Is Optional. Go to oneyoufeed.net/overwhelm. That’s oneyoufeed.net/overwhelm. Let’s explore some other areas where your relationship with technology feels ambiguous, where you’re sort of like, well, I actually really benefit from this, but there are also reasons I don’t want to do it. Amazon is, for me, a big one; I feel like I’m perpetually wrestling with that one. What else is there for you that falls into this kind of category?

Vauhini Vara 00:16:22  Google’s obviously an example, right? Because my Google searches, my own catalog of my life, are improved by the fact that Google is collecting all this information about me. One can argue that’s debatable, but, you know, that’s the argument I’m making. But this is true of all kinds of products. I mean, I use social media, and when I use social media, I am speaking to algorithms rather than human beings, in some ways.

Vauhini Vara 00:16:55  I’m speaking in a way that is different from authentic human-to-human communication, because of the role of those algorithms in mediating what gets shown to people and what doesn’t. And yet the fact that social media exists, and that I’m able to use it, plays a big role in sustaining my career as a freelance journalist, where I need to develop an audience of my own who knows about my work; I need to not rely on outside publications to publish my work, because I’m working in a very fragile industry. And so, you know, that’s another place where it comes up. Another one that we haven’t talked about yet is AI, obviously. And what I find interesting about AI is that we read a lot about all the ways in which AI can make things faster and easier, and yet the jury still seems to be out on whether it’s actually making things faster and easier or not. There was this recent study where they asked a bunch of experienced computer programmers to use AI models to help them in their programming, and these individuals thought that they were saving a bunch of time; their self-reported time savings were significant. But the objective time they were taking to do their work was actually measured.

Vauhini Vara 00:18:21  And it turned out they were losing a bunch of time. So it was actually taking them longer to do their work when they were using AI than when they weren’t. So that’s a really interesting one, because I think with certain past technologies, like Google or Amazon, the benefit is a pretty clear and legitimate benefit, and then it has to be weighed against those costs. But here, I think even the benefit is a little iffy.

Eric Zimmer 00:18:47  Yeah. I mean, AI is obviously another big one to go into, and one that produces a whole variety of feelings and emotions. Let’s come back to AI in a second, but I want to pivot for a minute, because as we talk about AI, you can’t talk about AI without talking about OpenAI. And you did a profile of Sam Altman, the founder of OpenAI, a number of years ago. But more broadly, you really talk about and show in this book how most of these big technology companies start out with a certain idealism.

Vauhini Vara 00:19:23  Yeah.

Eric Zimmer 00:19:25  You know, Google’s Don’t Be Evil is the most prominent example, but OpenAI, too, started with a real premise of AI safety and all of that. And you show how almost all of these companies, over time, ended up getting co-opted into moving away from that value.

Vauhini Vara 00:19:46  That’s right. Yeah. I mean, I think as human beings, we’re often driven more by our own intellectual curiosity and passion than by a desire to make a bunch of money. Right? The problem with technology companies is that they tend to be really, really expensive to build. And so if you’re the Google founders in the mid-90s, or if you’re Sam Altman of OpenAI in the mid-2010s, the first thing you need to do, if you want to pursue this intellectual passion project that you’re so excited about, that you think can change the world, is to go out and find some people who are going to give you the money to build it.

Vauhini Vara 00:20:29  And those people are investors. And those investors surely have their passions, but it’s their job to put money into a venture so that it grows into more money. And so you end up in a dynamic in which whatever the intellectual or philosophical or moral or ethical goals were behind this project of yours, it very quickly becomes bound up in the goals of these investors, who become the part owners of the company. Right? And so this isn’t just some kind of abstract situation. It’s literally that the people who own the company get to determine the goals of the company. And if that goal is to make more money, there you go.

Eric Zimmer 00:21:15  Precisely. And even if you are, you know, the founder, the CEO, if the people who own the majority of the company think that you’re acting in some way that doesn’t further the company making as much money as possible, you can absolutely just be removed. And so it’s this really tricky and weird situation. I mean, I think we all wrestle with these things to different degrees in our lives.

Eric Zimmer 00:21:40  You know, what trade-offs are we willing to make for money? But you’re right, I think it’s the nature of technology companies being so expensive. Right? Like, I was able to build this little tiny business that I have by myself, without getting investor money, because it’s a little tiny thing, right? But, you know, had I gone out and gotten somebody to say, oh, I’m going to put $1 million in The One You Feed, it changes things, right? You know, I worked in technology companies, software startup companies, early in my career, and I saw very clearly what VC money did to a company. And, yeah, it was mostly not good.

Vauhini Vara 00:22:22  Right.

Eric Zimmer 00:22:23  It turned out mostly not to be good.

Vauhini Vara 00:22:25  Yeah. And I mean, I think it’s a little bit of a stretch to say they’re the same thing, but in some ways, I think the situation that these corporate founders find themselves in is not so dissimilar to the situation we as users of these products find ourselves in, right? Where it’s not as simple as saying, listen, I have ethical principles that I need to stick to; there end up being these competing interests.

Eric Zimmer 00:22:52  Yeah. Well, I think that sort of motivational complexity is such a key element in life that we don’t often talk about, in lots of different areas. Right? Like, you know, a lot of my work is in how people change, and learning to recognize what the motivational pulls actually are, and being able to be honest about what they are, is so important, right? In order to actually find a way out. But we tend not to do that. We tend to say, all right, I think I should do this, so that’s what I’m going to think about, that’s what I’m going to talk about, without recognizing there is a lot pulling us in the opposite direction.

Vauhini Vara 00:23:34  Yeah. And I mean, I’d be curious about your thoughts on this, given your expertise and all the many interviews you’ve conducted on this question, right, of how people change. But, you know, it seems to me that one of the challenges when we use these products is that the benefit we get feels really immediate and really tangible, and all of the costs and consequences end up feeling quite abstract and intangible. And so even though the benefit is relatively small, and one could argue that the cost is enormous, the benefit ends up feeling to us like it’s bigger or more meaningful or more actionable.

Eric Zimmer 00:24:14  Yeah. I mean, as humans, we are not good at this thing. I think the technical term in psychology, well, I guess it’s also a term you would use in finance, is delay discounting. Maybe they call it something different. But it basically means I value things that have any delay on them less and less and less. Yeah, right. Because I just...

Speaker 4 00:24:37  Very relevant.

Eric Zimmer 00:24:38  Yeah. And I think the other one is that a lot of these things that are bad are happening to other people, or might happen at some juncture, and they’re far removed. I’ve been thinking about this because we’re re-releasing an episode with Peter Singer, which probably will already have come out by the time this airs. When I read Peter Singer’s book The Life You Can Save, it shook me up, and it has kept me shook up for years, because he poses a question. I may not get this exactly right, but this is close, or at least it’s what’s stuck with me. He basically says: imagine you’re walking down a road, and there’s a pond right there, and there’s a child drowning in it. If you did not go and save that child, you would rightly be considered a monster. You would think yourself a monster: what is wrong with me? How on earth did I not go save that child that was 12 feet away? But there are children dying all around the world, all the time, that we have the means to help, right? I have the means today to save children’s lives that I am not saving. And I don’t even mean that I have to donate every last dollar I have.

Eric Zimmer 00:25:46  But I could do more, and the logic of that has haunted me. And I think that’s a little of what we’re talking about here. Like, if the people whose lives these work practices affect were nearby, if I got to see them come home every day from work at Amazon and feel demoralized by what they do, still on public assistance because they don’t make enough money, it would be different. All these things I, quote unquote, know, but they’re still remote. And that remoteness, I think, is another thing; I don’t know what the term for it is, Peter Singer probably has a word for it, but that remoteness is another thing that makes it very difficult to act in accordance with our values, because the thing is not present.

Vauhini Vara 00:26:38  Absolutely. And I think that’s built in to global capitalism as it exists now. Right. And we were talking earlier about this role that we have, as people living in the US, of consumers, how that’s sort of our primary role. And the thing that makes it easy for us to focus on things like convenience and low price and usefulness is that a significant amount of the cost is borne by people on other continents who we will never see, whose lives we don’t know about.

Vauhini Vara 00:27:13  And that’s true when it comes to the labor cost. And it’s also true when it comes to the environmental cost.

Eric Zimmer 00:27:18  Right. There’s an argument, though. I mean, the capitalist argument is that we are creating jobs that wouldn’t exist at all otherwise.

Vauhini Vara 00:27:30  Yeah, that’s the argument. And, you know, I recently read Bill Gates’s memoir, and in a lot of ways, I think I view the world differently from somebody like Bill Gates. But what I find pretty compelling is the frustration of people like Bill Gates that when we criticize capitalism and technological capitalism, this globalized system that we’ve created for ourselves, we don’t spend a lot of time talking about the significant decrease in poverty and in mortality rates as a result of global capitalism, and the role of technology in global capitalism. And I think it’s a really fair point.

Eric Zimmer 00:28:35  It is interesting, because if you look at so many of these measures we would consider progress, the trends are clear. You know, more women being educated.

Eric Zimmer 00:28:46  The number of deaths in childbirth. The literacy rate, the poverty rate. I mean, you look at all of this stuff and you’re like, okay, from one perspective, more people are better off than they’ve ever been. And I think that’s the key word. It’s not always that simple, because what comes along with, perhaps, a higher literacy rate is also, perhaps, the undoing of a culture that supports people in a certain way. Right? We’re taking our measures of what success and goodness are, and we’re then putting them on the world as a whole. But I do think there is good news in a lot of this; I just don’t think it’s the unquestioned good news that certain people would posit. So many people think the world is getting worse, and I think it’s nice to have some counterpoint to that, because I actually don’t think the world is getting worse. I think, by most measures, the world for most people is getting better. It may be getting worse for a certain group of people in a certain place, mainly us Americans, who are like, what the hell is going on, when the rest of the world has been living with that sort of chaos for all of time.

Speaker 4 00:30:01  Right, right.

Vauhini Vara 00:30:02  Yeah, I know I agree with you on that.

Eric Zimmer 00:30:03  Let’s talk a little bit about the negatives to this. We haven’t really put that fine a point on them. With Amazon we did: we talked about labor laws, the factories where things are produced, the way drivers here are treated. And there’s obviously the income inequality, where fewer and fewer people have more and more of the wealth. But what are some of the other, would you say, problems? Let’s just take a couple. Let’s not go to AI yet. But, you know, Google, Facebook, Instagram. What are the other costs for us as people that we’re not seeing?

Vauhini Vara 00:30:42  Yeah. I mean, I think there are a number, and they can be difficult to talk about. But something I think about, in part because I’m a writer, is how communication changes depending on who we’re talking to. Human communication has always been used for good, for communion, for liberation. And it has also been used by those in power to further accrue their power and their wealth.

Vauhini Vara 00:31:16  And so communication has always been a complex thing. What we have now, because companies like we interact with companies like Google and meta and Amazon, in part through our use of text, is this world in which we have this new audience when we’re speaking right? So when I search for something in Google, I’m formulating it in a certain way for Google. When I’m posting on social media, I’m posting in a certain way based on what I know algorithms favor, right? I may be using emojis more than I otherwise would. I might be including a picture of a cute pet in my post when I otherwise wouldn’t. Right. And so the way in which we communicate ends up being deeply influenced, and maybe one could argue, corrupted by the fact that any time we’re communicating, if we’re using one of these platforms, we’re communicating to other human beings, and at the same time, with the same sort of communication act, we’re communicating to a big technology company’s algorithms, which is changing the way we talk.

Eric Zimmer 00:32:25  Right. And it’s really very, very subtle.

Speaker 4 00:32:28  Yeah.

Eric Zimmer 00:32:29  You know, it’s often very subtle, but it is absolutely true. I started this business a long time ago; I’ve been doing this podcast 11 years. And how I used to be able to use social media as a way of promoting this podcast doesn’t work anymore. And I have a really hard time. Maybe this is a benefit of being a little bit older: I don’t want to do it the way that it needs to be done today, which is to the detriment of the business. There is unquestionably detriment to the business, right? I could go out on Instagram and post a hundred profound thoughts that will get far less interest than, like you said, a picture of my puppy, or a post like, “Lola died three months ago, but I’m getting through it.” It’s just not me. And even with podcasting, I think more and more, I call it becoming YouTube-ified, right? You’ve got to be sensational enough in what you say to drive the algorithms.

Eric Zimmer 00:33:33  And I agree with you 100% that it does change the way we interact. And you look at something like Instagram. I’ve been fortunate enough to be able to travel outside the country a little bit the last few years, which I hadn’t really done much in my life before this. And what I’m struck by as much as anything is the Instagram-ification of everywhere I go. That common coffee house look that we all love, you know? But it’s everywhere. I mean, you can find places, but for a lot of places, I’m in Lisbon, Portugal, I’m in Amsterdam, I’m in Paris, and I couldn’t tell you which city I’m in based on those places.

Speaker 4 00:34:18  Right.

Vauhini Vara 00:34:18  I was reading the other day that in Barcelona, there are so many tourists, as we’ve all heard, that they’re creating some kind of plaza with these, like, Instagram backdrops where people can take their pictures with the Sagrada Familia, the famous church, in the background, so that they’re not all crowding in front of the church. And so essentially the actual public landscape and infrastructure gets changed for the sake of how people communicate on social media.

Eric Zimmer 00:34:51  Yeah. I mean, I was in Barcelona, and I’m not Christian, and I went to the Sagrada Familia because, I don’t know, you just hear you kind of have to go, it’s the thing. And I was just extraordinarily moved by it as a building, by what it does and how it does it. And yet at the same time, like you said, all around is just selfie-taking, which, of course, I’m taking a picture in there too. I’m not trying to cast aspersions. But it does change the very nature of the places that we’re in. I mean, so there is an effect to all of this.

Eric Zimmer 00:35:27  On this topic you wrote, “Our subtle self-modification according to technological capitalism’s norms is so pervasive that certain types of performances have their own names: Instagram face, TikTok voice.” And then you go on to say it recalls W. E. B. Du Bois’s description of a double consciousness, a Black person’s sense of always looking at one’s self through the eyes of others, of measuring one’s soul by the tape of a world that looks on in amused contempt and pity. And I love that idea of a double consciousness.

Speaker 4 00:36:00  Yeah.

Eric Zimmer 00:36:00  I was talking with somebody last week, and they also mentioned something. I don’t remember the name of it; it’s the something effect. And it’s because they feel like we spend so much time looking at our own face, and what that does. Like, right now you and I are talking, and 90%, 95% of my attention is on you. But it’s not lost on me that I’m right there.

Speaker 4 00:36:23  Right?

Eric Zimmer 00:36:24  And I do so much of my work in this sort of virtual thing where I look at my face all day long, not overly intentionally, but it’s there. And even that is starting to have an effect on the way that we are.

Vauhini Vara 00:36:38  Yeah. Just the awareness of yourself as a kind of object, and not even just an object of other human beings’ gaze, like what Du Bois was describing, but an object of the algorithmic gaze, right? Like the object of these corporate algorithms whose determinations have real consequences. I mean, you point out with your podcast.

Vauhini Vara 00:37:06  That the extent to which you can get people to share posts on social media about the podcast is going to have some role in determining how many people are listening to the podcast. Yeah. And that work is important to you. And similarly, you know, I don’t know that I care in the abstract how many people follow me on social media. But it is true that when I have a new book out, or if I’ve written an article that I want people to read, it helps me if I have a large social media following that, I can broadcast that too.

Eric Zimmer 00:37:36  Absolutely. I mean, it ties into your livelihood. The book deal you get next will be somewhat based on how well this book sells, and also on what your platform looks like. You know, I’ve got a book coming out, and I think it’s a good book, and I’m really proud of it and excited about it.

Eric Zimmer 00:37:53  But I got the book deal I got because of the platform of the one you feed. I don’t think they were like, oh my God, this idea. I’ve never heard of a book idea so good, or this guy is the next, you know, Shakespeare. I don’t think that was, you know, the idea was good enough. The writing was good enough. Yeah, but what moved the needle more than anything was a platform. And I’m sure there are people out there who can write far better than I do, who are not getting book deals like I got.

Speaker 4 00:38:21  Sure.

Vauhini Vara 00:38:22  Well, I bet it’s going to be a really good book. But also, you’re right.

Eric Zimmer 00:38:25  You know, it’s gonna be a great book, and I have enough, maybe, sense of myself in the world to know. Like, I’m not, like you, a great writer. I’m not a natively great writer. I think I’ve written a really good book, and I’ve had people help me who are really good.

Eric Zimmer 00:38:38  And, you know, I feel far prouder of it than I thought I would. And there are people who study deeply to become writers. You just don’t become good at something because you’re like, I’m going to pick it up. You become good at something because you do it a lot and you practice; it’s a craft.

Speaker 4 00:38:55  Yeah, yeah.

Vauhini Vara 00:38:56  I know that’s true. One of my jobs is advising and mentoring people who are working on books. And when I first started doing it 5 or 6 years ago, I thought the quality of the writing was sort of the only thing that you needed, right?

Vauhini Vara 00:39:11  And now there’s this baseline, you know, in order for me to work with someone, there has to be this baseline. But I recognize far more that when it comes to getting an agent and selling the book. These external factors, like how many people follow you on social media, do play a significant role.

Eric Zimmer 00:39:27  Yeah, there’s a term, I don’t know if you’ve heard it, called audience capture. And it speaks to this in a way. It doesn’t mean you’re capturing an audience; it means your audience captures you. If you’re a creator of any sort, you do something and you get some response; it gets some people to pay attention to you. And slowly, what that audience wants, if you’re not careful, is what you become.

Speaker 4 00:39:54  Yeah. Right. Yeah.

Eric Zimmer 00:39:56  And oftentimes it narrows and narrows and narrows. You know, you’re a multifaceted person who happened to share, this is a silly example, about the plants you love. And now all of a sudden, you’re the plant person.

Speaker 4 00:40:09  Yeah, right.

Eric Zimmer 00:40:10  And, because we’re playing the algorithmic game.

Speaker 4 00:40:15  You know. Exactly.

Eric Zimmer 00:40:16  You could spend a lot of money hiring people whose whole role in life, the only thing they do, is knowing how to manipulate that algorithm to your benefit. And they’ll come in and say, here’s all the things you need to do to try and please the algorithm. Which is a really dispiriting way to go about things.

Vauhini Vara 00:40:36  And we’re essentially in that process. If we’re not careful, we essentially turn into products ourselves.

Eric Zimmer 00:40:42  Yes, 100%. I’m going to take us in a completely different direction, and then maybe we’ll come back around to AI. The place I would like to go is this: this is a book about our relationship with technology, but it’s also a book about you, your life. It’s a memoir. And a significant portion of it is about your sister. So I’m wondering if you could share, if you’re open to it, a little bit about that story, your sister. And then I’d like to explore all of that through the lens of technology also.

Speaker 4 00:41:12  Yeah.

Vauhini Vara 00:41:13  So when I was in high school and my sister was in high school, she was two years older than me, she was diagnosed with this type of cancer called Ewing sarcoma. And it was a really serious form of cancer, and she started treatment right away. When she was in her junior year of high school and I was in my freshman year, she went into treatment, where she would be in and out of the hospital for weeks.

Vauhini Vara 00:41:39  And then she went into remission and got better. And then it came back again. And then she went into remission again and got better and went away to college at Duke. I went away to college at Stanford. And then she got sick again, and she passed away when when she was in, in, in college and when I was in my freshman year of college also. She was my only sister. She was my older sister. We were really, really close. She was the person who taught me a lot about how to be a person in the world. and so it was it was a really significant.  Loss for me. Yeah.

Eric Zimmer 00:42:13  So walk us through a little bit. I’m sorry about your sister. The way you write about her and the relationship is really beautiful, and how even that led to further downstream effects in your family, like it precipitated an unraveling of many things.

Speaker 4 00:42:32  Yeah.

Eric Zimmer 00:42:34  Talk to me about how that intersects with technology in this book.

Vauhini Vara 00:42:44  So, you know, I think, I started using the internet when I was in middle school in the mid-90s. I’m what they call an elder millennial; I was born in 1982, so that’s where I am demographically. And I think, when you’re coming of age, when you’re a teenager and you’re going through difficult things, it can be hard to talk to other people about it. For me, I was really worried about my sister, but I didn’t want to worry her by talking to her about it, and I didn’t want to worry my parents by asking them all my darkest questions about it. And so I went to Yahoo, which was the most well-known search engine at the time, and started asking it questions about what was going to happen to my sister, what her prognosis was. And then, many, many years after my sister passed away, just a couple of years ago, I wrote that profile of Sam Altman of OpenAI, started to learn more about the technology they’re building, and ended up getting early access to an AI model that was a predecessor to ChatGPT.

Vauhini Vara 00:43:53  And the way the model works is you type in some words and press a button, and then it kind of finishes the thought for you. And when I started playing around with that, I was like, You know, I really have a hard time talking about my sister and her death and my grief. This thing says that it can write for me when I’m not able to write. Maybe it can communicate on my behalf. So I hadn’t noticed this. It hadn’t occurred to me until very recently, but I think that’s sort of part of the same phenomenon, the way in which, like these products by big technology companies seem to be safer than other actual human beings. which is really insidious. But I ended up like going to this language model and asking it to to write about my sister’s death and my grief.

Eric Zimmer 00:44:41  You go through it in the book: you give it a sentence and it spits something out. Then you give it a little bit more, and it spits out more. And those of us who use AI to some degree may be getting used to its strangeness, but seeing it in that way, in that book, the way you did it, I had another of those moments.

Eric Zimmer 00:45:01  Like, what on earth.

Vauhini Vara 00:45:04  Yeah.

Eric Zimmer 00:45:05  Is this, you know, it’s such a strange thing.

Vauhini Vara 00:45:10  It is, yeah. I mean, what was weird to me is that the first sentence I wrote was: when I was in my freshman year of high school and my sister was in her junior year, she was diagnosed with Ewing sarcoma. That’s the name of the cancer she had. And then I pressed the button, and the first thing it spit out was quite generic. And it ends with this line that was like, she’s doing great now, which is the opposite of what actually happened to my sister. There could be nothing further from the reality. So then I deleted all that, kept my initial line, and wrote more myself. And I kind of did that process over and over. And the thing that was interesting to me is that the further along I got in this process, the more material of my own that I gave to this model, the closer it seemed to get to generating text that did seem to have some relationship with actual grief, with actual loss, with actual sisterhood.

Vauhini Vara 00:46:09  Right. And there were some Lines like the lines that it generated. Like many of them were. Were ridiculous and, you know, nonsensical. But then some of them were very poetic sounding. For me as a reader, I was able to read meaning into them. And the reason I phrase it that way is that the language model itself wasn’t trying to do anything in particular. It didn’t have doesn’t have consciousness. Right? It wasn’t like trying to write about grief. It just was generating language. And then I was making meaning from that language. But I was able to make significant meaning from it.

Eric Zimmer 00:46:47  Yeah, I think that’s one of the strange things about it: how it can write in ways that make it seem very intelligent, very sensitive, very poetic. And it’s gotten that by basically stealing, that’s one word for it, or gathering all of the world’s knowledge. And so in many ways it is a reflection of us. What I was looking for is, I had a guest, and I cannot remember the name of their book now, but it was a book where one of their parents had died, and they had a seven-year-old child and weren’t quite sure how to talk about death with their child.

Eric Zimmer 00:47:25  So they started asking AI about how to do it, and it turned into a long spiritual conversation where it was, you know, you could tell what AI was pulling from. It was pulling from the Bible and the Dow de Ching and the and, you know, the baga de vida. And I mean, it’s, you know, and it’s, it’s synthesizing all that and in ways it was, it was remarkably good at what it said. So it’s just such a such an odd thing. Yeah. You’re a writer. And so I think, you know, those who create content, maybe the ones who are most directly spooked by AI, although I think everybody is, you know, might might do well to be spooked to some degree. But you describe in the book both your revulsion at the idea of it and your curiosity and that your curiosity had won out. What’s your relationship with it like now?

Vauhini Vara 00:48:16  You know, one thing I will say about that experience of writing about grief and trying to ask this technology to produce language on my behalf is that ultimately what became obvious, and maybe it should have been obvious from the start, was that this technology was clearly incapable of expressing something about my reality, because it wasn’t me, and it wasn’t even a human being.

Vauhini Vara 00:48:47  It was a machine, right? and so ultimately, even though there was all this language it generated that I could read meaning into when I communicate. And I think this is not just because I’m a writer, I think it’s because I’m a human being. When I communicate, the gratification I get from that communication is from having made the effort of communicating myself. And it sort of does nothing for me if a machine does it for me. I mean, it doesn’t feel that different from like using a magic eight ball or something to produce words.

Eric Zimmer 00:49:19  Yeah, I think that’s true to an extent. But as you mentioned, the more information you gave it, the more it began to create something that was like your experience. And my experience has been a very similar phenomenon: the more I give it of me and my thoughts and what I felt, the more it can create something that in some ways approximates me. Now, I find it much more useful to have it ask me questions.

Eric Zimmer 00:49:56  Yeah, right. You know, again, as somebody who’s not natively a great writer, it was a really useful tool to be like, ask me questions about X, Y, and Z. And then I would start answering those, and that turned out to be really helpful for me. It was like having a collaborative partner. So here’s convenience again; we talked about convenience earlier. For you as a writer, you recognize that the value in writing often is the wrestling with the words themselves.

Vauhini Vara 00:50:26  Exactly. And to be clear, you know, I thought this AI model over time started to generate text that seemed to say something about grief, about loss, about sisterhood. But none of it was about my experience specifically; it didn’t feel like it was saying something about my experience. And yes, exactly, being a writer, maybe I’m especially attuned, or I especially find value in being the one to express it myself. But then that brings up this other question, which is, for me as a writer or for you as a podcaster, our livelihood depends on the people who are ultimately choosing to read my books or to listen to the podcast.

Vauhini Vara 00:51:14  And so that raises this question of like, if an AI model could hypothetically generate text in the style of my writing that, you know, ended up creating a plausible text that could compete with one of my books, or if an AI model could use your voice, right, you can generate a podcast that, competed with your podcast. Would people find it compelling? Would people pay for that? Would people listen to it? Would people read the book? And I think that’s where the jury’s out.

Eric Zimmer 00:51:47  I think we’re at a real interesting inflection point there, because you could train an AI model on my voice, and it would sound more or less exactly like me. And I’ve done 800 episodes, so you could train it on how to interview like me. And my guess is that with today’s AI technology, it would be 75% as good as what I do, which is deeply disconcerting. And I asked myself that question about who would care, and who wouldn’t, that there’s a human here. There are studies around AI as a therapist, and the ones that I’ve seen seem to point towards this: people will generally rank an AI therapist as more empathetic, as listening to them better.

Eric Zimmer 00:52:35  They they like it better up until the moment that they know that it’s an AI therapist, at that moment, the whole thing crumbles.

Vauhini Vara 00:52:44  That’s really interesting.

Eric Zimmer 00:52:45  But I don’t know that five years from now, ten years from now, that holds true in the same way, like with kids who grow up with AI. I think there’s going to be a certain percentage of people who are going to say, I want authentic humans; that matters to me. But will most people? I don’t know. And that is deeply disconcerting over time: if a machine can do the thing that a human does as well as the human does it, do you need the human, by a certain logic? Now, there’s a humanist logic inside that is bristling at every bit of that, of course, but I think it’s an interesting thing to wonder what this all looks like in five years. I mean, I feel so completely uncertain what five years is going to look like, in a way that I never have before in my life.

Vauhini Vara 00:53:38  Yeah. One thing I find interesting about that question is that it brings us back, in some ways, to the conversation about choice and agency, the whole subject of the parable of your show: to acknowledge that we do have agency, even when at times it feels as if we don’t. And I think we’re at this really interesting inflection point with AI. I was looking at, I think it was a study from Pew recently, that said something like 36% of adults in the US have used ChatGPT, and this was from 2025, so it was relatively current. That figure surprised me, because it seems in the zeitgeist as if this technology, this product, is so much more deeply entrenched than it actually is, but it’s not. And I think it sometimes feels as if the way our culture moves happens in a way that’s divorced from our intentions or our will, but in reality we make choices, as individuals, as communities, as societies, to create the world we want to live in.

Vauhini Vara 00:54:58  And so I’m interested in sort of pausing in 2025 and saying, okay, well, like what is inevitable? What’s not inevitable? I think most of the times, like so much more, is not inevitable than than is inevitable because we don’t know what the future holds. And a lot of it depends on the choices we make. And so I think we could decide now to define a podcast as something that a human podcaster produces with human guess. And we could decide to define a book, a novel, as something that a human novelist writes for human readers. and that way, you know, regardless of how the technology changes, regardless of the extent to which the technology can convincingly sound like me or sound like you, we as humans have drawn a line in the sand saying, here’s what we consider acceptable, here’s what we’re interested in, and here’s what we aren’t.

Eric Zimmer 00:56:00  Yeah, well, I think that is a beautiful place to wrap up. I think you brought us kind of all the way around to where we started, and left us with a message of hope: that we have a say in the direction this all unfolds.

Eric Zimmer 00:56:12  Thank you so much for coming on. I really enjoyed the book. I’ve enjoyed your various writings, and I appreciate the subtlety and the nuance with which you’re writing about these things.

Vauhini Vara 00:56:21  Well, and I appreciate that about your line of questioning, too. So thank you for having me.

Eric Zimmer 00:56:25  Thank you so much for listening to the show. If you found this conversation helpful, inspiring, or thought-provoking, I’d love for you to share it with a friend. Sharing it from one person to another is the lifeblood of what we do. We don’t have a big budget, and I’m certainly not a celebrity, but we have something even better, and that’s you. Just hit the share button on your podcast app, or send a quick text with the episode link to someone who might enjoy it. Your support means the world, and together we can spread wisdom one episode at a time. Thank you for being part of The One You Feed community.
