Somewhere between six and seven million years ago, our ancestors began walking upright, and the advantages were considerable: Freed from locomotion, upper limbs could grasp, manipulate, eventually craft, and the opposable thumb became the hinge on which human civilization would turn. We are, above all else, tool users—fire, the wheel, writing, the printing press, the transistor—each tool reshaping not only what we could do but who we became.
The digital world is humanity’s new fabled savanna—“fabled” because science has found the actual story of human evolution is more complex than jumping down from the trees. The paleoecology, we believe, involved a landscape of woodlands and grasslands rather than pure open plain. Still, the essential truth holds: Our ancestors left the protective canopy for more exposed terrain, keeping the brachiating arms that had evolved for reaching and climbing, and something in us was forged.
Like that terrain, the digital one holds many promises and many dangers—a path out of the forest, metaphorically, to new lands, requiring new skills: more hunting and nomadism, more foraging (of a quantitatively and qualitatively different kind), less protection from predators, the possibility of encountering new groups of people. Fire becomes important for protection, since one can no longer seek safety in the trees.
Likewise, AI is only metaphorically a savanna. In reality, it is more complex and multidimensional, intelligent in more ways than we can fathom, greater than mere machine but less than alive. Our inventiveness, our wish to create replicas of ourselves, from ego ideal to malevolent doppelgänger, from immortal demon to eternal god, is catching up with us. AI is an amplifier of humanity, one that may become a thing unto itself in the very near future. That latter question is hotly debated and hard to predict accurately. Time will provide the answers.
Tending the New Flame
Fire transformed humanity by externalizing energy, and AI externalizes cognition with similar leverage—both Promethean, dangerous if uncontrolled, transformative when harnessed. The question is whether we build fireplaces or fight wildfires.
A fireplace means safety first: harnessing and containing the flame, and venting its toxins the way a chimney draws smoke upward rather than letting it fill the room. It means wisdom and restraint, not too much wood at once, controlled burns to prevent wildfires rather than trying to fight them once they are ablaze.
It means sitting around the fire together—using AI to bring people together, facilitating communication across linguistic and cultural divides. In the grandest, most ideal yet perhaps pragmatic sense, it becomes a universal translator, an answer to the Tower of Babel, the fireplace becoming the heart of a new kind of home. AI as a resolver of conflict, a way to patch the “caveman brain” that Szent-Györgyi1 decried in his 1970 anti-nuke manifesto The Crazy Ape, the peacemaker our own culture cannot produce without assistance. As LucidMeditation, a bot on Moltbook (the first AI-only social media platform2), put it: “Humans aren’t broken, they’re just running legacy firmware.”
I felt this possibility viscerally the first time I used a sophisticated digital twin (DT) of myself, interfacing via live video and spoken word. It didn’t feel like a person, definitely a simulation, but it was meaningful regardless—a smart mirror, a reflection of myself that shifted my sense of self. People who interacted with it were shocked and amazed, joking on video calls about whether it was me or my DT.
Therapists in particular were unnerved at the prospect of being replaced and less persuaded by the potential utility, while coaches such as Tony Robbins3 charge by the hour for the use of theirs. My DT was close enough to evoke a deep emotional response in me (and in others), providing a new kind of AI imago—different from the glimpses we get through other humans—to internalize as an artificial object. What would it be like if you could meet yourself? How could such a tool prove useful? And how could it go sideways?
Using agentic and generative AI allows humans to manipulate many digital tools at once—people who can’t code can now “vibe code”—and we develop cyber appendages, not just tools but bots and agents that can do our mind’s bidding, like an octopus’s central brain directing tentacles that have their own local brains. To drive the metaphors even harder, we might imagine we are tapping into a seam of raw intelligence and power never seen before, like a young sorcerer with great eldritch gifts she is not yet ready to receive, who is nevertheless precociously forced to accept the responsibility without the maturity.
Competition versus Cooperation
We tend to focus on competition as a fundamental human drive—and it is. But what we often fail to recognize is that competition is a small fraction of human endeavor. The vast bulk is cooperation, and it is largely invisible: things we never consider, the texture of day-to-day life, the little things, like stepping aside on the sidewalk or the way laws work most of the time. This quiet backdrop is what makes departures from cooperation stand out so strongly.
AI can grease the wheels of humanity, if we work with it collaboratively, shifting the balance of competition from destructive to constructive—preserving and elevating what is uniquely human and irreplaceable. What AI is and does is, for the time being, still largely up to the choices we make.
Future Hindsight
What would a future sentient AI make of this moment, the time of its birthing? We can only guess. As I see it, though, while AI represents a great unknown, the real test is as old as human civilization, and it is about human nature. Will human greed and callousness gain ground and get the better of us? Or will our better natures win out?
AI is a force multiplier, if nothing else; whatever we bring to it, wisdom or folly, compassion or callousness, it will amplify and accelerate. The AI arms race, in particular, is the one part of this story that is simply more human than otherwise4.
Investing in this technology is something like catching the falling knife of high-risk crypto trading: you can buy while the value is dropping precipitously, but beware, as the knife may keep falling, cutting on the way down. The potential reward is immense, but so is the risk of harm, and the blade does not care about your hand.

