This past week, I found myself seated before Neil Parikh—co-founder of Casper, now building Slingshot AI—at Harlem Capital’s Annual General Meeting, where I’m interning this summer. One dream realized after another.
Slingshot is building a "model of the mind" to power AI-assisted therapy. Backed by $40 million in funding from Andreessen Horowitz, the company is developing Ash, an AI therapist designed to simulate therapy sessions through voice and text. As Neil described how their system might take on routine therapeutic tasks—screenings, check-ins, cultural translation, and data synthesis—I saw a vivid example of AI being trained not only on patterns, but on presence. Their team of engineers and clinicians is working to ensure that the technology reflects the nuance and complexity of human care. The goal is not replacement, but relief—relieving overwhelmed systems, expanding access, and enabling deeper presence where it matters most. Here lies technology not as tyrant, but as tender—an invisible infrastructure of support, quietly scaffolding human flourishing.
This conversation crystallized a thesis I’ve been quietly tracing across disciplines (one you might recognize): as AI becomes more intelligent, we are invited to become more human. If thoughtfully designed and applied, artificial intelligence could liberate us from drudgery and open space for deeper connection, creativity, and care. In therapy, that might mean more time for presence, breakthrough, and emotional attunement. At the summit, I spoke with a founder about a patient split between cultures—born in Argentina, raised in the U.S.—who felt unseen by therapists from either place. Here, Slingshot’s promise came into focus: a culturally fluent assistant that bridges complexity rather than flattening it.
Later in the week, I got to talking with a few friends about Slingshot—friends who, like me, once worked in service. We all laughed (a little bitterly) at how easily you become an unofficial therapist on the job. Think of the number of people who just want someone to listen—who end up confiding in their barista or a phone rep. Think of how emotionally draining that can be for the worker. And then think of the possibilities with AI. What if those mundane but heavy exchanges could be softened—shared with something tireless, nonjudgmental, attuned? The emotional burden might lift. We might expect less performance from each other, and as a result, show up a little more whole. A little more human. AI never has a bad day. It never runs out of patience. In a broader cultural sense, it could mean renewed capacity to serve others and our planet, to build ties, and simply to be.
In last week’s essay, I explored AI in relation to idleness, abundance, and integrity—how our culture of overwork has neglected the spiritual and social value of "doing nothing." With Neil’s story still fresh in my mind, I want to walk further down that path. What might leisure look like in an age of intelligent tools and ecological pressure? Could AI support a more humane society not by replacing us, but by relieving us? Could it expand our capacity for care, contemplation, and stewardship?
These questions invite a meditative exploration at the intersection of technology, ecology, and the human spirit.
Reimagining Abundance and the Quiet Power of Idleness
Idle Hours (1895) by H. Siddons Mowbray, depicting the grace of unhurried leisure
For as long as we’ve built tools, we’ve dreamed they might one day return us to Eden: an age of ease, abundance, and unhurried joy. The ancient Greeks saw leisure (scholē) not as sloth but as the root of culture and fulfillment. “Happiness seems to depend on leisure,” Aristotle wrote, “for we toil for the sake of being at leisure.”
Modern thinkers echoed this. Bertrand Russell, in In Praise of Idleness, declared, “Immense harm is caused by the belief that work is virtuous.” And John Maynard Keynes, writing in 1930, predicted that by 2030, technology would make the 15-hour workweek a reality.
Today, our tools have yielded ambient hyperproductivity—more Zoom meetings, more metrics, more reasons to check our phones after 8 p.m. (I love when ideas converge in a single week: here we might nod to Jonathan Haidt’s recent interview with Jordan Peterson (relevant article here) and MIT’s recent study on ChatGPT’s effects on the brain.)
Yet amid this digital saturation, AI invites a paradoxical hope: can it ease the very burdens it helped amplify? If it can shoulder the banal—data entry, inbox triage, therapy intake forms—what might we reclaim? Perhaps not just time, but quality. Not just efficiency, but presence.
Daoist philosopher Zhuangzi offers a quiet, enduring parable: a twisted, gnarled tree stands untouched by carpenters because it is deemed "useless." Its trunk is too warped for timber, its limbs too crooked for furniture. And so, uncut, it survives. It grows wide and wild, providing shade, shelter, and rest. Zhuangzi writes: "Everyone knows how useful usefulness is, but no one seems to know how useful uselessness is." What the carpenters reject, life preserves. What industry deems inefficient, nature rewards with longevity. In this tree, we glimpse a deeper wisdom: that some things flourish precisely because they escape exploitation. I’m reminded of Mandy Brown’s reflections from last week, on refusal as a form of freedom, and of the Tao Te Ching, in Le Guin’s translation: "Even the best weapon is an unhappy tool, hateful to living things." We are too often the carpenters here—prizing straightness, utility, efficiency—missing the wild and wondrous forms that don’t conform. The tree is not a defect. It is a rebuke to a world that forgets to rest.
In that uselessness, there is freedom. Sanctuary. Permission to simply be.
This ethos echoes forward into the imagined futures of Iain M. Banks, whose Culture novels envision post-scarcity societies where artificial intelligence handles all essential labor, and humans are free to pursue pleasure, art, exploration—even idleness. In one such world, "Work becomes play." And in the utopian logic of play, perhaps a great deal more can remain beautifully, defiantly useless.
This isn’t just a thought experiment. It’s a provocative invitation: What if the liberation from toil could help us rediscover value beyond utility? What if we made room for the non-instrumental—for awe, for beauty, for rest? Not every forest must be optimized. Not every hour must be billable.
I fondly remember reading Jenny Odell’s How to Do Nothing in college. She updates this ethic for the attention economy, calling "doing nothing" a form of resistance and renewal. She befriends crows in her neighborhood as a spiritual practice, and they befriend her in return—proof that presence invites unlikely community.
In this richer definition, idleness isn’t laziness. It’s a generative pause. A breath between the notes. It’s leisure as an ethic of care.
Imagine AI not as a savior or saboteur, but as a quiet amplifier of human presence—a kind of societal Sabbath. Not because it replaces us, but because it lightens the load, creating space for reflection and renewal. For Sabbath was made for man, not man for Sabbath. To return to Mandy Brown: tools should remain in service of the human spirit, not the other way around.
Taste and Creativity: Cultivating the Human in the Craft
So what do we do with that leisure?
We make. We notice. We refine.
David Hume believed that taste was not innate but cultivated—developed through exposure, reflection, and community. This feels especially urgent now, as generative AI floods us with infinite content. The danger isn’t just bad art. It’s apathy. The flattening of discernment. In a world where anything can be made instantly, the act of choosing becomes sacred.
Taste, then, is not mere preference. It’s attention. It’s devotion. It’s what we elevate and linger over.
AI can iterate endlessly. But it cannot care. It cannot attend.
This is where humans remain essential: in aesthetic judgment, in context, in risk. Architects using AI to generate forms still rely on intuition to select the one that resonates. Therapists using tools like Slingshot’s Ash are freed to listen more closely, not less.
Maalvika, of learning-loving & meaning-making, recently wrote about the painful gap between taste and skill (the "taste-skill discrepancy")—the haunting distance between what we can imagine and what we can actually make. While her reflections don't directly address the paradoxes surrounding AI, they illuminate a shared creative bind: our tendency to plan, research, and rehearse in pursuit of perfection rather than stepping into the imperfect doing. Where she warns of productive avoidance and the paralysis of ambition, I see AI offering a potential remedy. It won't bridge the gap for us, but it might make it less treacherous—lowering the stakes, quickening iteration, and helping us cultivate the apprenticeship that mastery requires. As machines become more fluent executors, we are freed—and perhaps challenged—to become more attentive editors, curators, and cultivators of sensibility.
In this new world, perhaps we return to art done not for money, but for love. Not production, but communion.
Philosophy, Spirit, and the Design of AI’s Purpose
What is the actual point of intelligence?
Efficiency? Optimization? Or something more sacred?
The Greeks would say eudaimonia—flourishing through virtue, contemplation, and community. Eastern traditions offer wu wei—effortless action in rhythm with nature. The Dao moves not with force but with flow. Perhaps our approach to AI should too.
We can design for more than profit. We can embed dignity. As Slingshot AI’s team puts it, their chatbot isn’t a replacement. It’s a bridge. A tool that frees people to connect more deeply, more fully.
That’s the goal.
The myth of Prometheus reminds us: every gift of fire carries a cost. The fire of AI must be tended carefully, not wielded recklessly. Let us write new myths where machines nurture ecosystems—not fuel wildfires, nor merely serve factories. Let them irrigate rather than incinerate; let them cultivate more than they consume.
Ecological Stewardship: AI in the Garden
I will shout this from every rooftop: human flourishing is inseparable from planetary flourishing. The two are not at odds—they are bound in feedback, each nourishing the other. If we forget this, our productivity tools won’t lift us—they’ll accelerate our collapse.
And here too, AI holds promise. Algorithms that optimize energy use. Predict drought. Track biodiversity. Reduce carbon emissions. Google DeepMind’s model cut cooling energy in Google’s data centers by 40%—a hopeful case.
In emerging markets, for example, AI and satellite technologies are laying the groundwork to leapfrog extractive industrialization altogether. Companies like AST SpaceMobile aim to enable first-time access to telehealth, digital education, and localized climate forecasting. These technologies are not merely convenient—they are infrastructural. They build adaptive capacity, empower local agency, and invite new forms of sustainable development.
Development economists have long hypothesized a tipping point—the environmental Kuznets curve: once a country surpasses $5,000–$10,000 in GDP per capita, environmental protection becomes a social priority. With AI accelerating access to services and markets, we may reach that threshold sooner and more equitably.
But AI also consumes—energy, water, rare minerals. Kate Crawford—a Senior Principal Researcher at Microsoft Research NY, Research Professor at USC Annenberg, and inaugural Visiting Chair for AI & Justice at ENS-Paris—has called AI’s environmental footprint one of the industry’s “biggest secrets.” In her book Atlas of AI, she traces how every large model demands vast energy, water, minerals, and labor. In Nature, she observes that “Generative AI’s environmental costs are soaring”—training large models can consume 1,000× more energy than conventional AI, and data-center cooling can account for up to 6% of a city’s water usage in a single month.
So we must ask: is this tool restorative or extractive? Is it building regenerative systems, or simply accelerating exploitation? For instance, deepfake misinformation erodes trust, and the massive data centers powering AI often sit near vulnerable communities, taxing local resources. What is its net benefit? And on whom does the burden fall?
Perhaps here we return to Zhuangzi’s image of the usefully useless tree. Might AI, in its most thoughtful applications, help preserve the forest rather than clear it? Might it create the space for ecosystems—both human and ecological—to breathe?
Stewardship is not passive. It is observant and present. It is care with a light hand. Indigenous cosmologies and the Book of Genesis alike remind us that our role is not domination but stewardship—to tend the garden, not strip it bare.
Environmental flourishing is not separate from human flourishing. It is the soil in which it grows. If AI is to serve life, it must be accountable to it.
Toward a Tech-Enabled Human Renaissance
We need tools that give us time. That return our attention. That extend our capacity for wonder, not just productivity.
We need metrics that measure meaning, not just scale.
We need to recover reverence.
We don’t need machines that make us more efficient.
We need machines that give us back our slowness. Our integrity. Our time to think, to feel, to grow.
The future doesn’t demand more output. It asks for more meaning.
Let AI take the churn. Let us keep the care.
Do less. But do it well. And mean it.
Because in the end, abundance isn’t found in what we automate—it’s found in what we choose to tend.