The U.S. Failure in Afghanistan Reveals the Real Challenge of Building AI-Augmented Teams

Why mindset, incentives, and leadership matter more than models and tooling.

Mike Jason, a retired U.S. Army colonel, once wrote about spending years training Iraqi and Afghan forces that collapsed the moment America stepped back. They had the gear, the manuals, the drills. What they lacked was an identity sturdy enough to carry all of it. When I read his account, it felt less like military history and more like a quiet warning for anyone trying to build AI-augmented teams today. We tell ourselves this is a tooling problem. Buy the subscription. Run the workshop. The truth is simpler and more uncomfortable. Tools are obedient. People come with stories about who they are and who they fear they might become.

So let me lay my cards on the table. This essay is not about models or tokens or which prompt template will save your engineering roadmap. It is about mindset, incentives, leadership, and culture. All the soft things that end up being the hard things.

Start with Afghanistan. The Americans wanted to build a police force, so they did what any modern organization does. They handed out the latest equipment. They trained people. They created documentation, procedures, and colorful progress reports. When the Americans left, the force evaporated like mist in sunlight. Many simply folded into the Taliban. The tools had survived. The identity had not. Jason called it tactical readiness without institutional spine. It sounded a lot like the way companies roll out AI tools today, imagining that capability will bloom once the invoice is paid.

And this is not just a modern tragedy. In The Last Samurai, the Imperial Japanese Army stands on a battlefield with cannons and rifles yet cannot face the samurai. The samurai know their weapons like extensions of their limbs, sure, but more importantly, they know themselves. Meanwhile, the Imperial Army had decided they were the weaker side long before the battle began. A team loses the moment it accepts an inferior story about itself.

Engineers do the same thing with AI. When a new tool arrives, they do not ask how it works. They ask whether it threatens the narrative they hold about their careers. Will this replace me? Will it shrink my importance? Will it reward me? Will it make my work look trivial? These questions sit quietly under the surface and shape every reaction. Jason saw a version of this in every unit he trained. People whose future felt unstable never fully adopted the tools handed to them.

Which is why incentives matter. Real incentives, not a line in a quarterly newsletter. Visibility. Recognition. Being invited into the technical committee. Access to paid tools while others wrestle with the free tier. Small signals that whisper you matter in the story we are trying to build. And yes, consequences matter too. If half the team embraces AI and nothing changes for the other half, the message is clear. This is optional. And optional things do not transform organizations. The same Indian who stops dutifully at a red light in the States will slide through three in India because one place has consequences and the other does not.

Training comes next. Not first. Once the mindset is aligned and the incentives speak the right language, then load people with workshops, paid tools, deep dives, and hands on sessions. Without that foundation, the training will slide off like rain on a tin roof. Jason saw this years ago. The military kept running tactical drills while the institutional scaffolding crumbled. No amount of rifle practice can fix a broken promotion system or a dysfunctional chain of trust.

Knowledge sharing helps more than most leaders realize. When a few early adopters explore Cursor or experiment with new workflows, bring everyone together. Let them tell their messy stories. Let them show how they messed up a prompt or restructured a project. People learn faster when a colleague, not a consultant, shows them the path.

Then we come to leadership. Teams watch how leaders behave long before they listen to what leaders say. If you ask your developers to embrace AI while you stand outside the arena, they will quietly do the same. But if you are elbow-deep, testing prompts, breaking builds, and asking sharper questions like "what was your prompt" or "how did you structure your project", people start to believe this is real. Leaders who walk in front get followed. Leaders who sit inside an air-conditioned room and issue commands get compliance on good days and avoidance on the rest.

And for heaven’s sake, stop treating AI like a quarterly fad. Engineers have seen enough tools arrive with trumpets and leave through the fire exit. They have learned to wait out new ideas. Jason wrote that the military did not fight a twenty-year war. They fought twenty one-year wars. Each year came with its own slogans, acronyms, and enthusiasm. Nothing lasted long enough to take root. Tech does the same thing when it jumps from tool to tool as if loyalty is a weakness.

Accountability is the final ingredient. You cannot transform a team with dashboards that measure feelings instead of behavior. Jason spoke of bubble charts reporting progress while the ground reality rotted. We do something similar with AI metrics: a slide titled "AI adoption progress" that hides the fact that only three people actually use the tools. Culture changes when expectations are clear and consequences are real.

In the end, building AI-augmented teams looks suspiciously like every other kind of institution building. The U.S. did not fail in Afghanistan because the soldiers were weak or the tools were outdated. It failed because the cultural and institutional identity never formed. If we repeat that mistake inside our engineering teams, the result will be quieter but no less disappointing. Tools cannot transform people. Identity transforms people. And people, once transformed, will use any tool well enough to build a future worth having.

This is why leaders should read history, anthropology, and anything else that widens their view. Patterns repeat themselves in the most unexpected places, and once you notice them, you can borrow lessons from worlds far outside your own.

Thanks to Vibhor and Raman for the thoughtful questions and conversations at the "Tricity Engineering Community" meetup that shaped this post.

Published On:
Under: #aieconomy, #career