Google Cloud’s Agentic AI Stack is Changing How Indian GCCs Build for the Future

Google Cloud is doubling down on its commitment to empower India’s global capability centres (GCCs) with AI infrastructure and tools that go far beyond basic automation. From LLMs to purpose-built data agents, the company is positioning its unified agentic data-to-AI platform as the foundation for next-generation enterprise transformation. 

At the heart of this strategy is the belief that GCCs aren’t just back offices; they’re becoming innovation hubs capable of driving autonomous decision-making, end-to-end product development, and real-time intelligence at scale.

At the MachineCon GCC Summit, Arun Ramamurthy, who leads the GCC charter at Google Cloud, offered a comprehensive vision of how the company is making advanced AI tools, including Gemini, Agentspace and Vertex AI, directly accessible to developers, data teams, and operations leaders within global capability centres.

“When we started engaging with GCCs five or six years ago, it was all about developer tools and infrastructure optimisation,” Ramamurthy said. “But today, it’s [about driving] business transformation through critical technologies like data and AI.”

Google’s new unified approach—what Ramamurthy called a “vertically integrated, top-down platform”—brings together decades of AI research into production-ready products. At the core is its agentic data-to-AI platform, designed not only to support teams but to act on their behalf.

What Lies Beneath

To illustrate what the future looks like, Ramamurthy played a demo of Project Astra from DeepMind—a single-shot video featuring a multimodal AI assistant that can hear, see, reason, and respond in real-time. 

The underlying technology is now being integrated directly into Google Cloud’s stack, Ramamurthy explained.

“Generative AI is like an assistant—it waits for instructions. But agentic AI is more like a CEO you never knew you had,” Ramamurthy said. “It can take autonomous decisions, collaborate beyond human boundaries, and still operate within constraints defined by you.”

Google’s suite of tools extends from its most performant LLM, Gemini 2.5, which supports a context window of up to 1 million tokens, to its proprietary AI infrastructure. 

Ironwood, the company’s seventh-generation Tensor Processing Unit (TPU), delivers a staggering 42.5 exaflops per pod, roughly 25 times the compute of the world’s fastest supercomputer. 

For businesses exploring hybrid AI strategies, Google’s AI Hypercomputer allows seamless orchestration of TPUs alongside NVIDIA GPUs, enabling performance at scale without sacrificing cost efficiency.

There’s also Willow, Google’s quantum chip built to tackle complex computational problems, alongside video and music generation models like Veo 2 and Lyria. “With Lyria, we recently partnered with Grammy Award winner Shankar Mahadevan to explore Indian classical music with AI,” Ramamurthy shared.

Creating AI for GCCs

These capabilities are not theoretical. Enterprises like Vodafone are already using Gemini and Imagen to optimise their telecom infrastructure, while L’Oreal is tapping into generative AI to reshape marketing content. 

“These are examples of AI at scale,” Ramamurthy said. “But scale needs more than just models—it needs the right platform.”

That platform is Vertex AI, which serves as the orchestration layer for Google’s AI ecosystem. Developers can integrate Gemini, open-source models like Gemma, or third-party LLMs such as Claude or Llama, all while maintaining interoperability across their tech stack. 

“We’re giving GCCs the choice to pick the right model for their use case and integrate it seamlessly,” Ramamurthy added.
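For a sense of what this model choice looks like in practice, here is a minimal sketch of calling a Gemini model through the Vertex AI Python SDK (google-cloud-aiplatform). The project ID, region, model name and prompt are placeholders for illustration, not details from the session; partner models such as Claude or open models like Gemma and Llama are typically accessed through Vertex AI Model Garden with their own endpoints.

```python
# A minimal sketch of model access on Vertex AI, assuming the
# google-cloud-aiplatform Python SDK. Project ID, region, model name
# and prompt are placeholders, not values from the talk.
import vertexai
from vertexai.generative_models import GenerativeModel

# Initialise the SDK against a hypothetical project and region.
vertexai.init(project="my-gcc-project", location="us-central1")

# Select a Gemini model; the same GenerativeModel interface is used
# regardless of which supported model is chosen.
model = GenerativeModel("gemini-2.5-pro")

response = model.generate_content(
    "Summarise last quarter's incident tickets by root cause."
)
print(response.text)
```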

The highlight of the session was Google’s new Agentspace—its platform for building agentic AI solutions. 

Already shown in live demos, Agentspace enables enterprises to create internal assistants that combine enterprise search, conversational interfaces, and connectors to third-party tools like SAP and SharePoint. “We’re enabling you to give your employees the experience of using Google-quality search within your enterprise,” Ramamurthy said.

But the story doesn’t stop at assistants.

Google is Moving Beyond Assistants

GCCs are data-heavy operations, and Ramamurthy argued that an AI strategy that doesn’t solve for the data ecosystem misses the larger picture. To address this, Google Cloud is building data agents: agentic systems embedded deeply into data platforms like BigQuery. 

These agents aren’t standalone bots; they’re integrated intelligence layers built for specific personas, like data engineers, analysts and scientists, across the entire workflow, from data exploration to pipeline creation and ML deployment.

“Normal AI agents perform standalone tasks,” Ramamurthy explained. “But data agents are embedded and real time. They provide contextual recommendations and insights without needing to be prompted. It’s a shift from query-based interaction to proactive intelligence.”

BigQuery itself is evolving into more than just a warehouse. With intelligent agents layered in, the platform becomes an active participant in data workflows, accelerating productivity and enabling faster decision-making grounded in enterprise data.
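These agent capabilities are described as layers over the standard BigQuery workflow rather than a separate product. For context, a minimal sketch of that baseline workflow using the google-cloud-bigquery Python client is shown below; the project, dataset and query are hypothetical and not drawn from the talk.

```python
# A minimal sketch of a conventional BigQuery query, the workflow that
# the data agents described above are said to augment. The project,
# dataset and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

query = """
    SELECT region, COUNT(*) AS ticket_count
    FROM `my-gcc-project.support.tickets`
    GROUP BY region
    ORDER BY ticket_count DESC
"""

# Run the query and iterate over the result rows.
for row in client.query(query).result():
    print(row.region, row.ticket_count)
```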

Ultimately, Google Cloud’s message to the GCC community is one of partnership and empowerment. Whether it’s through advanced chips, expansive LLMs, multimodal capabilities, or embedded data agents, the goal is to give Indian GCCs the tools they need to lead global innovation.

“We are building an AI-optimised platform to help you leverage open, multi-cloud ecosystems and create interoperable, collaborative models. The future of AI isn’t just about scale; it’s about strategy,” Ramamurthy concluded.
