By Dustin Rowles | TV | November 17, 2025
Last week, we unpacked the plot of Pluribus: An alien code is implanted in every human on Earth, except for 13 people, including Carol (played by Rhea Seehorn). This code links the minds of nearly everyone into a single collective consciousness, creating hyper-intelligent pod people who share all knowledge. Carol wants no part of it and is determined to reverse the transformation. The pod people, meanwhile, are intent on integrating her, not maliciously, but because they believe she’s missing out on a higher state of being.
“Carol, if you saw someone drowning, would you throw them a life preserver?” Zosia asks in this week’s episode. “Of course you would. You wouldn’t think. You wouldn’t wait. You wouldn’t try to get consensus. You’d just throw it.”
“So, now I’m drowning?” Carol responds.
“You just don’t know it.”
That exchange reinforces my initial take: Pluribus feels like the dramatic sibling of The Good Place, an exploration of free will and the tension between individuality and the collective. But theories abound, including one that even speculates the show is actually about influencers.
One of the more compelling theories is that Pluribus is a commentary on artificial intelligence. Vince Gilligan has openly criticized AI as a “detriment” to creativity, and that discomfort is woven deeply into the series. AI, like the Internet itself, gives us all access to infinite knowledge. But does it make us more creative? Does it truly make us smarter? If everyone knows the same things, what distinguishes us? What makes us unique? In Pluribus, individuality has dissolved. What’s left is a hive mind. They’re benevolent, fortunately, because they could easily eliminate the 13 holdouts. Indeed, as the second episode suggests, those with free will are the ones capable of malevolence.
After this week’s episode, I’m also convinced that Pluribus is about the nature of happiness. What makes us happy? Do we even want to be happy? The logline itself is revealing: “Carol, the most miserable person on Earth, is tasked with saving the world from happiness itself.” From the literal cold open, it’s clear that Carol’s baseline is misery. In a way, she seems to crave it. Helen brings her to a stunning ice palace, one of the most beautiful locations imaginable, and the only happiness Carol can find there is her own misery.
“This is completely your bag,” Helen tells her. “You love feeling bad.”
So, isn’t this new paradigm, one that gives Carol endless things to despair over, her ideal state? Misery is her comfort zone, even as she admits the hive mind offers a certain appeal. “It’s all beautiful scenery, and you feel nothing but contentment. Wave after wave of bliss. Everything is perfect,” she tells Zosia, describing what she imagines it feels like. “Like living inside a postcard every second of every day.”
“Or you’re in Norway above the Arctic Circle… in a hotel made of ice.”
It sounds idyllic. But that’s never what Carol has wanted, even in the best of times. In a perfect world, there’s nothing to complain about, and complaining is her identity. Her inability to feel content is part of who she is. It’s not enough to make the bestseller list; she has to be number one, and even then, she loathes it because it means the people reading her romantasy novels are dumb.
Carol resists contentment. She resists happiness. She doesn’t want to join the hive mind because it means losing her identity. And if she must be happy, she wants to earn it. Would she be fulfilled if she wrote the novel she truly wanted and it hit number one? Probably not. She’d find a new grievance. That’s her bag. She “loves feeling bad.”
Happiness, intelligence, wealth, power: None of it means anything if it’s handed to you without effort. That’s why Carol feels hollow when dozens of pod people reconstruct an entire grocery store for her within hours. We derive meaning from work, from solving problems, from making choices. Take that away, whether through a hive mind or artificial intelligence, and what remains of us?