“The AI Cauldron”

Everything’s On the Boil – People, Process, Productivity and ROI

The speed at which AI is proliferating across industries, domains, departments, and functions makes it a moving target that, at least for now, cannot be caught. AI models, tools, and methodologies continue to evolve as software development becomes increasingly democratized. A small group of individuals can now build reasonably production-ready applications and platforms that earlier typically required specialists such as architects, leads, business analysts, developers, testers, and deployment engineers (e.g., Emergent, Vidmox, Xano).

Today, every organization wants to use AI internally as well as provide AI-powered services to customers. There is an indefinable urgency that something must happen around AI, often followed by surprise and exasperation when not everyone is on board or able to fully grasp the significance of this AI wave!

AI Vocabulary

AI seems to be at the center of everything in software organizations. There isn’t a day which goes by without AI terms like Large Language Models (LLM), Claude, ChatGPT, Copilot, Grok, Context engineering, Model Context Protocol (MCP), Retrieval-Augmented Generation (RAG), Chunking, Vector Database, and the like being thrown around.

Take RAG, for example. From its humble beginnings as “Simple RAG”, it kept evolving (RAG = RAG + 1, and so on); at last count, there were 25 variants! Just imagine how many such combinations are possible across all these terminologies and methodologies in the pursuit of accuracy and speed.
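
To make the jargon concrete, here is a minimal sketch of what “Simple RAG” means: chunked documents are ranked against a query and the best match is prepended to the prompt. Everything below is illustrative; real systems use learned embedding models and a vector database rather than the toy bag-of-words vectors shown here.

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding"; real RAG uses a learned embedding model.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, chunks, k=1):
    # Retrieval: rank stored chunks by similarity to the query.
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]

def build_prompt(query, chunks):
    # Augmentation: prepend the retrieved context to the user's question
    # before it is sent to a language model for generation.
    context = "\n".join(retrieve(query, chunks))
    return f"Context:\n{context}\n\nQuestion: {query}"

chunks = [
    "MCP is a protocol for connecting models to tools.",
    "RAG retrieves relevant chunks and adds them to the prompt.",
    "Vector databases store embeddings for similarity search.",
]
prompt = build_prompt("How does RAG use chunks?", chunks)
print(prompt)
```

Every variant after “Simple RAG” elaborates on one of these three steps: smarter chunking, smarter retrieval, or smarter prompt construction.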

Traditional KPIs like Definition of Done (DoD) and defects are increasingly being replaced by AI-centric metrics such as “intent” and “tokens.” Engineers who are not using or living by this vocabulary risk being seen as legacy and not evolving with the times. There is both peer pressure and FOMO. Meanwhile, those who are immersed in the AI journey showcase their knowledge and expertise, advising on roadmaps and leading from the front. But RAG does not stop at RAG, does it?

Here’s a sobering thought: the AI journey has already sped past much of this vocabulary.

Productivity

There are claims across software engineering that productivity has improved by 20% to 70% or more, with code being generated at 2x or even 3x speed from inception to production.

Traditional Software Development Life Cycle (SDLC)-oriented estimation models using function points, story points, epics, or work breakdown structures are gradually being replaced by assumed productivity in Agentic Development Life Cycle (ADLC) models for new proposals. However, the agentic world itself is evolving, and there is an ongoing debate on how estimation should be done using tokens, agents, and human-in-the-loop approaches.

Questions remain about productivity:

  • For instance, what is the productivity for a totally greenfield platform development using ADLC?
  • Or how different is the productivity for a brownfield platform utilizing assisted AI development embedded into traditional agile methodologies?
  • Or even what it takes to transform from an assisted AI model to a fully Agentic AI development model for large existing codebases?

There are frequent questions about how much productivity these approaches will truly deliver, whether they can offset the millions spent on token consumption, and whether they can still ensure margins on investments.
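
The “can token spend be offset” question is, at its core, simple break-even arithmetic. The sketch below makes that reasoning explicit; every figure in it is a hypothetical assumption for illustration, not a benchmark from this article.

```python
# Illustrative break-even arithmetic for AI-assisted development.
# All inputs are hypothetical assumptions, not measured data.
def breakeven(dev_cost_per_year, productivity_gain, token_spend_per_year):
    # Value of the gain: extra output that would otherwise require
    # hiring additional developers at the same fully loaded cost.
    labour_saved = dev_cost_per_year * productivity_gain
    return labour_saved - token_spend_per_year

# Hypothetical case: a $150k engineer made 2x productive (+100% output)
# against $40k/year of token consumption.
net = breakeven(150_000, 1.0, 40_000)
print(net)  # → 110000.0 (positive: the token spend is offset)
```

The debate in practice is over the inputs, not the formula: measured productivity gains vary widely, and token consumption grows with agentic workflows.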

Benchmarks often come from external snippets, analyst reports, or comparisons between tools like Claude, Copilot, and Gemini for the same use case. However, keeping up with these ever-changing benchmarks requires significant and rapid internal organizational transformation.

ROI

There is growing recognition that a significant portion of open positions will be for roles such as AI full-stack developers, AI orchestrators, and the like. Experienced AI engineers have multiple choices and can pick and choose where they want to work, so the salary costs of full-stack AI developers are going through the roof, along with the rising cost of consuming millions of tokens during development.

POCs and research data are quoted to show that a full-stack engineer can drive 2x or 3x outcomes with ADLC ways of working, justifying high salary increases. However, demonstrating the ROI for such increases is a challenge, as no new budgets are being approved to meet this demand for AI engineers.

Organizations are moving towards an AI-first approach for every function and lowering headcount across functions, in the hope that this will somehow make them relevant and future-ready. This is evident from the news we see or hear every other day about organizations laying off employees!

But the companies that make it their mission to invest in people at this time, building cross-functional collaboration and transforming existing employees into AI-competent professionals, will navigate the AI challenge far more successfully than those trying to hire heavily from outside to augment their teams!

Data for AI

Over the past few years, data has been called the new oil. The combination of data and AI, however, has triggered a market frenzy over how to monetize data.

For example, in the publishing space, content is expected to be metered and monetized at the level of a search query. Gone are the days when readers had to subscribe to a whole book to get information; a user may need only 10% of a book’s content for their research.

Simply chunking the contents of books and vectorizing them has become a core service, with AI used to organize and piece information together. There are agentic solutions in which an entire research paper is written, evaluated, and published entirely by AI, in a process very close to the way a human would have researched, reviewed, and published!
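
“Chunking” a book is less exotic than it sounds. A minimal sketch, assuming a fixed-size sliding window (real publishing pipelines often split on paragraph or heading boundaries and attach metadata for metering):

```python
def chunk(text, size=200, overlap=50):
    # Fixed-size sliding window with overlap, so a sentence cut at one
    # chunk boundary still appears whole in the neighbouring chunk.
    # Each chunk would then be vectorized and stored for retrieval.
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

# A stand-in for a book's text; any long string behaves the same way.
book = "".join(str(i % 10) for i in range(500))
pieces = chunk(book)
print(len(pieces), len(pieces[0]))  # → 3 200
```

Metering at the search-query level then amounts to charging for the chunks actually retrieved, rather than for the whole book.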

The People

While the AI model, methodology, productivity, and the like have made headlines, it is ultimately the employees who will make the organization successful or not, even in the AI world.

There is increasing pressure and expectation for traditional backend and frontend developers to become AI full-stack developers, capable of producing code at 2x or 3x speed while having a superlative understanding of context engineering and ADLC models. A tech lead is expected to deliver high-quality, production-ready code with a combination of agentic virtual employees and a diminishing proportion of real employees. An architect is expected to become an orchestrator of agents, making optimal use of tokens and achieving desired outcomes as validated by systems engineers. All of this transformation is expected while employees continue working in billable roles, often for customers who may not yet have begun their AI journey.

To enable this shift, organizations need structured AI curricula, competency assessments, labs, token access, AI-integrated development environments, formal ADLC frameworks, governance models, guardrails, standards, and real-world use cases. These are essential for employees to thrive and make meaningful contributions to themselves, their organizations, and their customers.

HR Play

Job descriptions are changing! AI is now being used to construct JDs, search for candidates, rank them, schedule interviews, evaluate performance, and shortlist candidates, leaving talent acquisition teams to act more as orchestrators who finalize onboarding. This shift can free up time for HR to engage more deeply with business and industry trends, becoming strategic partners in talent development.

But it has also posed some tricky questions.

  • Does HR realize there will be virtual employees among them, possibly counseling and holding one-on-ones with their teams on real- and virtual-employee challenges?
  • How will the performance of a real employee delivering outcomes be evaluated against that of a virtual employee consuming tokens to deliver outcomes?
  • What will the employee journey, training, and evaluation look like as people move from their current roles to evolving roles within the AI landscape?
  • How does HR support the employee to cope with this massive transformation?
  • Considering the expected productivity gains with AI, how does HR engage with talent who may be stressed by juggling billable work while becoming AI-ready?
  • How does the experience level pyramid change?
  • What about the talent who may not be able to cope with the AI journey?

This article reflects thoughts and perspectives on how the AI cauldron is shaping organizations, employees, and customers, written entirely by a human, at least for now.

Authored by Vinod S
