With more specialisation in artificial intelligence (AI) roles, the talent market is no longer hiring on the basis of general experience alone. Candidates are being assessed for their stack readiness — the combination of tools, applied skills, and the ability to deliver against production-level AI requirements, according to a Quess report.
Between March 2024 and March 2025, job descriptions across GenAI, data engineering, cloud and analytics began spelling out exactly which platforms and practices they expect new hires to know. In other words, the pairing of tools and skills is in demand. Why? Because roles have matured, teams are more sophisticated than ever before, and most companies are at some stage of AI deployment: experimenting with it, transitioning into it, or already operating at scale.
Certain role clusters are in high demand, each with technical skills that are mandatory or sought after, and with specific tools and platforms named in the job postings.
Let us take a look at some:
For the GenAI engineering role cluster, the key skills and practices are prompt engineering, model evaluation, RAG workflows and RLHF tuning. The common tools and platforms cited are the OpenAI API, LangChain, Hugging Face, Gradio, RAG frameworks and Transformers.
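To make the RAG workflow concrete, here is a minimal pure-Python sketch of the retrieve-then-generate pattern: rank documents against a query, then ground the prompt in the retrieved context. The scoring function and document texts are illustrative assumptions; in production the final prompt would be sent to a model via the OpenAI API or LangChain.

```python
# Sketch of the retrieval-augmented generation (RAG) pattern.
# All names and documents below are hypothetical examples.

def score(query: str, doc: str) -> int:
    """Crude relevance score: count of shared lowercase words."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k documents most similar to the query."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Ground the model by prepending retrieved context to the question."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Airflow schedules batch data pipelines.",
    "Pinecone stores embeddings for similarity search.",
    "Terraform provisions cloud infrastructure as code.",
]
prompt = build_prompt("Which tool stores embeddings?", docs)
print(prompt)
```

Real deployments replace the word-overlap score with embedding similarity against a vector database, but the retrieve-then-prompt structure stays the same.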
For data engineering, the key skills and practices include pipeline orchestration, vector ingestion and token optimisation. The tools include Apache Airflow, Dagster, dbt, Snowflake, Pinecone and Weaviate.
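Pipeline orchestration boils down to declaring task dependencies and running them in order. A toy sketch using only the Python standard library is below; the task names are hypothetical, and orchestrators such as Airflow or Dagster add scheduling, retries and monitoring on top of this core idea.

```python
# Toy sketch of pipeline orchestration: tasks declare upstream
# dependencies and execute in topological order.
from graphlib import TopologicalSorter

# task -> set of tasks it depends on (names are illustrative)
pipeline = {
    "extract": set(),
    "embed": {"extract"},        # e.g. chunk text and compute vectors
    "load_vectors": {"embed"},   # e.g. upsert into Pinecone/Weaviate
    "report": {"load_vectors"},
}

order = list(TopologicalSorter(pipeline).static_order())
print(order)  # → ['extract', 'embed', 'load_vectors', 'report']
```

Because the graph here is a simple chain, the topological order is unique; with branching pipelines the sorter also detects cycles before anything runs.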
For the AI/ML engineering cluster, the key skills and practices sought after include feature engineering, model experimentation and lifecycle tracking. The tools and platforms commonly used are Scikit-learn, TensorFlow, PyTorch, MLflow, XGBoost and DVC.
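Lifecycle tracking means recording each training run's parameters and metrics so models can be compared and reproduced. The sketch below shows the pattern in plain Python with invented names; MLflow provides the same idea as a full tracking service with a UI and artifact store.

```python
# Minimal sketch of experiment/lifecycle tracking. RunTracker and
# its methods are hypothetical stand-ins for an MLflow-style API.
import time

class RunTracker:
    def __init__(self):
        self.runs = []

    def log_run(self, params: dict, metrics: dict) -> dict:
        """Record one training run's configuration and results."""
        run = {"timestamp": time.time(), "params": params, "metrics": metrics}
        self.runs.append(run)
        return run

    def best_run(self, metric: str) -> dict:
        """Return the run with the highest value for the given metric."""
        return max(self.runs, key=lambda r: r["metrics"][metric])

tracker = RunTracker()
tracker.log_run({"max_depth": 3}, {"accuracy": 0.87})
tracker.log_run({"max_depth": 6}, {"accuracy": 0.91})
best = tracker.best_run("accuracy")
print(best["params"])  # → {'max_depth': 6}
```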
For Cloud AI engineering, the key skills and practices include GPU provisioning, cost optimisation and scalable LLM hosting, while the common tools cited are Vertex AI, SageMaker, Azure OpenAI, GCP/AWS, Terraform.
About 18–22 per cent of data engineering job postings reference orchestration or vector DB tools (Airflow, Dagster, Pinecone). This indicates the infrastructure is mature enough to deploy GenAI. About 16 per cent of cloud engineering roles now include AI-specific infrastructure expectations such as GPU orchestration and cost-efficient scaling of LLMs.
Skills such as prompt engineering and retrieval-augmented generation (RAG) are now embedded in both GenAI and BI roles, indicative of how user-facing and infrastructure-facing capabilities have converged. SHAP, LIME, Fairlearn and other governance tools are becoming more visible in BFSI and healthcare job descriptions (JDs) pertaining to risk, compliance, or platform governance roles.
Hiring for AI roles, then, is now seeking out stack specificity. Talent is assessed not just on the soundness of concepts but on the ability to deliver value with hands-on fluency across modern AI toolchains. Organisations focused on building GenAI capabilities need to be clear about what the role demands, what their hiring criteria are, and how much they need to invest in reskilling.