

Virtual outfit try-ons have become the modern equivalent of window shopping. Many online retailers find that they help reduce return rates and improve the customer experience.
But they are not a new phenomenon.
One of the earliest examples of virtual try-on technology was Webcam Social Shopper, launched in June 2009 by Zugara, a US-based augmented reality (AR) company. It harnessed webcams and enabled shoppers to ‘try’ clothes by digitally placing them on their live video feeds.
Cut to December 2025, and the virtual try-on landscape looks vastly different, thanks to AI.
Google’s new “try it on” feature, which recently rolled out in India, allows shoppers to upload a full-body photo and see apparel from billions of listings realistically mapped onto their bodies. Drawing on apparel products in Google’s Shopping Graph, its AI model lets shoppers try on clothes not only from a full-body picture but also from just a selfie. It is integrated across Search, Google Shopping, and Google Images.
“Now, if you don’t have a full body photo of yourself, you can use a selfie and Nano Banana, our Gemini 2.5 Flash Image model, will generate a full body digital version of you for virtual try on,” Lilian Rincon, VP of product and consumer shopping at Google, said in a blog post.
The feature was launched in the US in July 2025 and is an upgrade of an earlier virtual try-on feature that focused on showing apparel on a diverse range of models. Google also launched the Doppl app in December, built specifically for virtual try-ons.
The question is: how different is Google’s offering from existing try-on tools, and do we really need another one in the space?
Standout Tech
Earlier tools from fashion and beauty brands typically worked only within a single retailer’s app and relied on basic AR filters or preset avatars. Google’s system is vastly different.
It analyses the shopper’s full-body or selfie photo to detect body shape, pose, and landmarks such as shoulders, waist, and legs, and processes them on-device or in controlled environments for privacy. At the same time, it studies retailer product images to interpret garment cut, proportions, and how fabrics drape, fold, and stretch.
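Google has not published the internals of this pipeline, but the landmark-detection step it describes is a well-studied problem. As a rough illustration (not Google’s production stack), here is a minimal sketch that extracts shoulder and hip landmarks from a photo using the open-source MediaPipe Pose library; the file name is a placeholder:

```python
# Minimal sketch of body-landmark detection, the kind of preprocessing a
# try-on pipeline needs before a garment can be mapped onto a photo.
# Illustrative only: uses open-source MediaPipe, not Google's try-on stack.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

image = cv2.imread("shopper.jpg")             # placeholder full-body photo
rgb = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)  # MediaPipe expects RGB input

with mp_pose.Pose(static_image_mode=True) as pose:
    results = pose.process(rgb)

if results.pose_landmarks:
    landmarks = results.pose_landmarks.landmark
    # Normalised (x, y) coordinates of a few fit-relevant landmarks
    for name in ("LEFT_SHOULDER", "RIGHT_SHOULDER", "LEFT_HIP", "RIGHT_HIP"):
        point = landmarks[mp_pose.PoseLandmark[name]]
        print(f"{name}: ({point.x:.3f}, {point.y:.3f})")
```

A production system would go further, estimating body shape and segmenting the person from the background, but landmark points like these are the anchors onto which a garment image is warped.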
Jaspreet Bindra, Co-founder of experiential learning platform AI&Beyond, said the feature marks an important shift from catalogue-style visualisation to AI-driven realism.
“Unlike most existing e-commerce try-on tools that rely on static overlays, limited mannequins or heavily stylised avatars, Google’s approach uses generative AI to simulate how garments behave on diverse human forms, factoring in drape, stretch and proportion. The biggest USP here is scale and intelligence—Google is not just building a feature, but a learning system that improves with data across brands, categories and geographies,” he told AIM.
For Indian shoppers, many of whom remain hesitant to buy apparel online, the feature helps build trust. They can experiment with looks, share outfits with friends, and recreate a fitting room-like experience. Ankush Acharya, a 35-year-old marketing professional, found the tool quite accurate. “It came really close to capturing the overall fit and style of shirts. It even changed my pants and shoes to better match the product.”
What Works, What Doesn’t
As Google integrates try-on directly into search listings, users can try items from multiple brands, compare fit instantly, and discover products faster. Its Nano Banana generative AI model uses diffusion-based image editing to show how fabrics fold, clothes stretch, textures react to lighting, and garments behave across body types. Unlike AR tools that need 3D assets, Google requires only 2D catalogue images, reducing friction for brands and marketplaces.
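Nano Banana itself is proprietary, but the diffusion-editing idea can be illustrated with open-source tools: mask the clothing region of a shopper’s photo and have an inpainting model redraw it. The sketch below uses Hugging Face’s diffusers library, with a text prompt standing in for the garment; the checkpoint is a public Stable Diffusion model and the file names are placeholders:

```python
# Illustrative diffusion-based edit in the spirit of virtual try-on:
# repaint a masked clothing region of a photo from a garment description.
# This is NOT Google's Nano Banana model, which is not publicly available.
import torch
from diffusers import StableDiffusionInpaintPipeline
from PIL import Image

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-inpainting",
    torch_dtype=torch.float16,
).to("cuda")

person = Image.open("shopper.jpg").convert("RGB").resize((512, 512))
# White pixels in the mask mark the torso region to be repainted
mask = Image.open("torso_mask.png").convert("L").resize((512, 512))

result = pipe(
    prompt="a fitted navy linen shirt with natural folds, studio lighting",
    image=person,
    mask_image=mask,
).images[0]
result.save("tryon_preview.png")
```

The key difference in a real try-on system is the conditioning signal: instead of a text prompt, Google’s model conditions on the retailer’s actual 2D catalogue image, which is how a preview can preserve a garment’s exact print and cut without requiring 3D assets.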
However, the feature has limitations. AI-generated previews often struggle with complex patterns, multi-layered outfits, or nuanced fabric behaviour, leading to approximations rather than perfect simulations. Acharya also noticed a glitch when trying on a shirt. “It was a little more slim-cut than it was supposed to be,” he observed.
It also does not provide size recommendations, stock checks, or guarantees of real-world accuracy, which can discourage high-value purchases.
“AI still struggles with real-world complexity. Fabric physics, lighting variations, body posture and individual fit preferences are hard to perfectly model, which means virtual try-on will remain probabilistic rather than precise. These tools reduce uncertainty, not eliminate it. Long-term viability will depend on how seamlessly they integrate into discovery and decision journeys, rather than being novelty add-ons,” Bindra explained.
Variety is another constraint. Google’s tool currently supports tops, bottoms, dresses, jackets and shoes, but leaves out lingerie, swimwear, accessories (beyond those occasionally shown on models), costumes, and traditional or religious wear.
The feature, first rolled out through Search Labs in the US, is limited to users aged 18 and above. Rendering times of 10–15 seconds per outfit also slow down the experience. Moreover, as the tool runs on Google’s platform rather than on retailer websites, brands have little control over data, analytics, or shopper behaviour insights.
Privacy concerns loom large. Camera access raises fears of misuse, and earlier versions reportedly produced inappropriate alterations such as body-enhancement effects or added accessories. While Google did not comment for this story, a spokesperson earlier told CNET, “Your uploaded photo is never used beyond trying things on virtually, nor is your photo used for training purposes. It is not shared with other Google products, services or third parties, and you can delete or replace it at any time.”
But data quality remains critical. “While Google has made progress on representing diverse body types, true inclusivity requires continuous, region-specific data inputs, especially in markets like India, where body morphology and clothing styles vary significantly,” Bindra said.
Also, the proposed AI licensing and copyright guidelines from the Department for Promotion of Industry and Internal Trade (DPIIT) threaten to create a regulatory rigmarole for Google’s use of catalogues from Indian brands.
“On regulation, frameworks such as the proposed DPIIT guidelines should ideally enable trust and transparency without stifling innovation. If designed thoughtfully, they can actually strengthen adoption by setting clear guardrails rather than caging the technology for Indian users,” he added.
Just Another Tool?
Meanwhile, Indian brands have been experimenting with their own virtual try-on tools. Nykaa launched India’s first major virtual try-on for makeup in December 2021, using L’Oréal’s ModiFace. Myntra introduced its “Looks Virtual Try-On” in late 2024, and Lenskart innovated with its 3D eyewear try-on after partnering with US startup Ditto. There’s also Myntra Style Studio and Flipkart’s Fit Finder.
Internationally, Sephora’s Virtual Artist and Warby Parker’s eyewear try-on were early innovators. Walmart introduced AI-powered apparel try-on in 2022, later upgrading it to support customer photos. Amazon’s virtual try-on, meanwhile, is far more practical for categories like shoes and eyewear because of its real-time AR approach.
These tools are expected to make an impact on India’s booming e-commerce apparel market, projected to hit $98.5 billion by 2032, according to a CoherentMi study. For merchants, the feature may help reduce returns: apparel has some of the highest return rates (35–40%, per a 2024 Return Prime report), largely driven by poor fit expectations. The tools can also help D2C brands without brick-and-mortar stores drive higher conversions and lower expenses by cutting photoshoot costs. But that can only happen once Indian brands come aboard.


