When AI Does Everything: What Is the Leader’s Role?
- Nigamanth Srivatsan
AI has moved rapidly from experimentation to operational reality. Today, it assists teams in drafting strategies, analyzing customer data, generating financial models, refining communication, and improving execution across functions. Tasks that once required years of experience can now be supported in minutes.
This creates a subtle but important leadership question.
If AI tools can generate insights across domains, optimize messaging, and enhance decision-making speed, what distinguishes leadership? Does accumulated experience still carry the same authority? If “20+ years of experience” once signalled expertise, what happens when that experience can be simulated, summarized, and stress-tested through intelligent systems in seconds?
The answer does not lie in competing with AI on speed or analytical depth. It lies in defining direction.
AI does not independently set goals. People use AI to pursue goals within the incentives and cultures organizations establish. The real question is not what AI can do, but what we ask it to optimize.
AI amplifies intent. Leadership defines intent.
If an organization is oriented toward short-term growth at any cost, AI will enhance that. If it is oriented toward long-term value creation, AI will enhance that as well.
The deeper question, therefore, is: what should the intent be anchored in?
Svadharma as the North Star
Classical Indian thought frames human pursuit through the lens of Purushartha — Dharma (responsibility and order), Artha (material prosperity), Kama (aspiration and desire), and Moksha (liberation). Artha and Kama are not rejected; they are regulated by Dharma. Prosperity and ambition are legitimate — but only when aligned with responsibility.
Modern organizations, especially in competitive markets, tend to prioritize Artha — measurable economic outcomes. Dharma is often reduced to compliance rather than treated as a strategic direction.
Yet Dharma in the classical framework is not merely an abstract moral idea. It operates through roles. Each actor — individual or institution — expresses Dharma through its specific function within a larger order.
It is at this level that the concept of Svadharma becomes relevant. Svadharma, loosely translated as “one’s own duty,” refers to role-aligned responsibility. In organizational terms, it can be understood as intrinsic purpose — the responsibility that arises from the role the organization chooses to play in society.
For an organization, Svadharma answers:
Why do we exist beyond profit?
What responsibility do we hold toward our stakeholders?
What outcomes define genuine success?
What lines will we not cross, even if they are profitable?
Without clarity of Svadharma, a high-paced organization risks accelerating without orientation. In the AI age, this risk intensifies — because AI increases the speed and scale of execution, but it does not determine what is worth executing.
Hence, Svadharma becomes the organization’s North Star — the reference point that gives direction to scale.
The EduTech Example: When the North Star Drifts
Consider an EduTech company operating in a highly competitive environment.
We have seen instances where such companies struggled because:
Sales teams, under aggressive targets, misrepresented long-term benefits to parents.
Marketing selectively highlighted exceptional student outcomes.
Financial teams treated one-time subscriptions as recurring revenue to signal predictable growth.
Now imagine the same environment in an AI-enabled world.
AI can analyze behavioral data to identify parents most emotionally susceptible to conversion pressure.
It can generate highly persuasive, personalized sales scripts calibrated to maximize agreement rather than comprehension.
It can model revenue recognition scenarios to present growth curves that appear stable while masking underlying volatility.
If the underlying orientation is revenue maximization at any cost, AI will strengthen that orientation. Scale increases — but in the wrong direction.
What Could a Svadharma-Aligned North Star Look Like?
For an EduTech organization, it might include:
Improvement in measurable learning outcomes: demonstrable skill gains, and academic or career progression
Access to quality education for underserved segments: reduction in cost per learner, and increased reach among low-income or rural students
Long-term student success: retention and course completion integrity, and application of learning beyond certification
This shapes how AI is used.
Marketing would target students who genuinely benefit — not merely those most likely to convert. Product decisions would focus on learning depth rather than superficial engagement. Reporting would reflect durable outcomes rather than short-term optics.
Many organizations anchor themselves to metrics such as MAU, DAU, time spent on apps, or short-term revenue growth. These are useful indicators, but they are behavioral signals — not measures of impact.
In the AI age, this distinction becomes more critical.
AI systems are particularly effective at increasing engagement, refining nudges, personalizing persuasion, and improving conversion flows. If activity metrics become the primary compass, organizations may appear successful while drifting from their deeper responsibility.
A meaningful North Star must reflect value created — not merely activity generated.
The more powerful the tools, the greater the need for clarity in what they serve.
Synthesis: The Leader as Custodian of Svadharma
In Sanskrit, the word for leader is Neta (नेता), derived from the root √nī, meaning “to guide” or “to lead.” From the same root come Neeti (नीति), the right set of guiding principles, and Netritva (नेतृत्व), leadership itself.
The linguistic connection is simple but powerful. Leadership is not merely about occupying a position of authority. It is about establishing the principles that guide action.

One such leadership principle is clarity of Svadharma — the organization’s role-aligned purpose.
When Svadharma is treated as the North Star, AI becomes an instrument of responsible scale. When it is absent, AI simply amplifies whatever incentives dominate the system.
AI can assist execution. It cannot decide what is worth executing.
In the AI-enabled organization, leadership is therefore not about knowing more than the system. It is about defining and upholding the principles that ensure intelligent tools serve long-term responsibility.
That responsibility remains firmly in the hands of the leader.