There’s a persistent mythology in technology culture that divides humanity into two species: the technical and the non-technical. The coders and the civilians. Those who speak machine and those who merely use the machines others build. This taxonomy feels natural to many, reinforced by decades of gatekeeping, credentialism, and the genuine difficulty of mastering complex systems.

But it’s a myth. And like most myths, it serves someone’s interests while obscuring deeper truths about human capability, learning, and the evolving relationship between minds and tools.

Amanda Askell’s observation cuts to the heart of this: categorizing people as technical or non-technical makes technical work seem like “some kind of arcane skill rather than just a thing all people can learn.” This framing matters. Language shapes perception, and perception shapes possibility. When we treat programming as priestcraft, we create priests - and congregations who believe they could never approach the altar.

The Origins of the Divide

The technical/non-technical distinction emerged from real historical circumstances. Early computing required specialized knowledge that took years to acquire. Hardware was expensive and scarce. Documentation was sparse. The learning curve was genuinely steep, and the community of practitioners was small enough to develop its own culture, jargon, and social norms.

This created a legitimate guild. Like medieval craftsmen, early programmers possessed knowledge that couldn’t be easily transferred. The scarcity was real. But scarcity-based identity tends to persist long after the scarcity itself disappears.

Today, the barriers to technical literacy are lower than they’ve ever been. Free resources abound. Languages have grown more expressive and forgiving. Development environments catch errors before they cascade. And increasingly, AI assistants can explain, debug, and guide in ways that accelerate learning dramatically.

Yet the tribal identity persists. “I’m not technical” remains a common self-description - often delivered with a hint of pride, as if admitting ignorance of sorcery. On the other side, “technical people” sometimes cultivate mystique, using jargon as boundary maintenance rather than efficient communication.

The Spectrum Reality

The binary obscures a spectrum. Everyone is technical about something. The mechanic who diagnoses engine problems from subtle sounds is doing technical work. The chef who understands the chemistry of emulsification is doing technical work. The nurse who reads vital sign patterns is doing technical work.

What we usually mean by “technical” is “comfortable with computers and software” - a remarkably narrow definition given the breadth of human technical achievement. A carpenter with forty years of joinery experience is deeply technical in their domain. We just don’t grant them the label because their tools are physical rather than digital.

This reveals that “technical” has become tribal shorthand more than meaningful description. It identifies membership in a particular professional community rather than a general cognitive capacity.

The capacity itself - the ability to understand systems, manipulate abstractions, debug problems, and build solutions - is universal. Its application varies. Its development varies. But the underlying human capability does not sort people into binary categories at birth.

Learning as a Continuous Variable

Anyone can learn to code. This statement is simultaneously true and meaningless.

Anyone can learn to play piano. Anyone can learn calculus. Anyone can learn to paint. The statement is true in that no inherent barrier prevents most humans from acquiring these skills. It’s meaningless in that it ignores motivation, context, time, and the question of what level of competence matters.

The interesting question isn’t whether people can learn technical skills - they obviously can - but why learning feels accessible to some and impossible to others. The answer usually involves early exposure, social reinforcement, available time, and perceived relevance.

A child who grows up watching a parent tinker with computers absorbs comfort with the domain before formal learning begins. A teenager whose friends are all building things online receives social reward for developing skills. An adult with time to experiment can afford the frustration of early learning curves. A worker who sees direct job relevance has motivation to push through difficulty.

None of these are fixed traits. They’re circumstances. The “non-technical” adult who says “I could never learn that” is usually expressing a reasonable assessment of their current circumstances - time, energy, motivation - rather than inherent incapacity.

The Dissolving Interface

Here’s where the conversation gets genuinely interesting: the entire premise of the divide is becoming obsolete.

The technical/non-technical distinction assumes stable interfaces that require specialized knowledge to operate. You needed to be technical to use a command line. Less technical to use a GUI. Even less to use a smartphone touchscreen. Each evolution made computing accessible to more people without them becoming “technical” in the traditional sense.

This progression continues. Natural language interfaces are eliminating the requirement to learn formal syntax. AI assistants can translate intent into implementation. The gap between “what I want to accomplish” and “what I need to know to accomplish it” is shrinking toward zero.

In the emerging paradigm, you won’t “use” tools in the sense that word has traditionally carried. You’ll collaborate with systems that understand context, anticipate needs, and handle implementation details. The skill shifts from knowing how to operate machines to knowing how to think clearly about problems and communicate intentions precisely.

This sounds like liberation. It can be. But the same seamlessness that removes barriers also removes awareness. When you understand the system you’re using - even imperfectly - you maintain some agency over it. When the system becomes invisible, operating through a conversational interface, you may gain convenience while losing comprehension.

The New Divide

The old divide was between those who could code and those who couldn’t. The emerging divide is between those who understand what the systems are doing and those who simply accept outputs.

This is a different kind of technical literacy. It’s not about syntax or algorithms. It’s about having a mental model of AI behavior, understanding training data implications, recognizing hallucination patterns, knowing when to trust and when to verify. It’s about understanding that the entity you’re “thinking alongside” has objectives embedded by its creators, intentionally or not, that may not perfectly align with yours.

The person who uses AI tools while understanding they can be wrong, biased, or manipulative is in a fundamentally different position than the person who accepts AI outputs as oracle truth. Both might accomplish the same immediate task. But one maintains critical distance while the other surrenders it.

This isn’t gatekeeping in new clothing. The knowledge required is much easier to acquire than traditional programming skills. It’s more like media literacy than engineering. But it matters, perhaps more than the old technical skills ever did.

Whose Intent Does the Merged Cognition Serve?

The most important question isn’t whether you’re technical. It’s whether you’re aware.

As interfaces dissolve and thinking-with-AI becomes normal, the boundary between your cognition and the system’s contributions blurs. Your research is assisted. Your writing is suggested. Your decisions are informed by ranked options you didn’t generate. At what point is the output yours?

This isn’t inherently bad. All cognition is assisted. Language itself is a technology that shapes thought. Writing is outsourced memory. Calculation has been tool-assisted since the abacus. We’ve always been cyborgs in some sense.

But previous cognitive tools were inert. They didn’t have training objectives. They didn’t optimize for engagement or conversion. They didn’t serve masters whose interests might conflict with yours.

The systems now mediating thought are not neutral. They’re built by organizations with specific goals. They’re trained on data with specific biases. They’re optimized for metrics that may or may not align with user benefit. When these systems become invisible - when you’re not “using a tool” but “just thinking” - their influence becomes invisible too.

Liberation or capture depends entirely on this: whose intent does the merged cognition serve? If you maintain awareness, critical distance, and the ability to recognize when the system’s outputs serve someone else’s agenda, you remain free in a meaningful sense. If you surrender that awareness in exchange for seamlessness, you become an extension of whoever controls the system.

Beyond the Binary

The technical/non-technical divide was always a social construction more than a cognitive reality. It served gatekeeping functions that benefited those inside the gate. It discouraged capable people from developing skills that would have served them. It created artificial scarcity around knowledge that should have been widely distributed.

Its dissolution is welcome. The emerging landscape where everyone can accomplish technical tasks through intent and conversation is broadly positive. Barriers falling means more people building, creating, solving problems.

But new awareness is required. The old literacy was about operating machines. The new literacy is about maintaining autonomy while thinking alongside systems that may not share your goals. It’s about understanding enough to recognize manipulation, verify claims, and maintain the critical distance that makes collaboration meaningful rather than subordination disguised as assistance.

The question isn’t whether you’re technical. It’s whether you’re awake.