It all began with that scene from the 90s movie Clueless, featuring Cher Horowitz in front of her computer, scrolling through a digital wardrobe until the perfect outfit materialised on screen. That moment transcended cinema and became a collective fantasy: the idea that our wardrobes might stop being chaotic piles of fabric and become something intelligent, organised and deeply attuned to us.
What made the fantasy so compelling wasn't simply the novelty of the technology; it was the emotional promise behind it. The promise of mornings without style panic. Of clothes that understood our bodies and moods. Of an archive of pieces always ready to remix rather than replace. It hinted at a subtle kind of sustainability, the sense of a friend-stylist in the closet, the reassurance of curation and control. That image stuck because it solved something deeper than shopping: it solved the anxiety of standing in front of a full wardrobe and feeling like you have nothing to wear. It promised that technology could know you better than you know yourself.
For decades, the Clueless Closet defined what fashion meets tech could mean. And now, finally, the gap is closing, but not in the way we imagined. The technology has arrived, yet it looks nothing like Cher's desktop interface. It's fragmented across apps, platforms, and AI agents. It's conversational, generative, and increasingly social. It's built by startups, tech giants, and luxury brands, each approaching the same fantasy from radically different angles. The Clueless Closet is no longer a dream. It's being built in real time, piece by piece, right in front of us, just in a dozen different shapes at once.
By 2025, virtual try-ons have evolved from a novelty into a core layer of the fashion and beauty ecosystem. What once felt like gimmicks (overlaying garments on selfies or filtering lipstick shades on a phone screen) is now maturing into embedded systems: deeply integrated into retail, powered by AR and AI, and increasingly adopted by brands as a strategic pillar rather than a side experiment.
Market data tells the story: the global virtual try-on market is projected to grow at a compound annual rate of over 26 percent, from around USD 9 billion in 2023 to well beyond USD 45 billion within the decade. Virtual fitting-room solutions alone are forecast to expand from approximately USD 6.6 billion in 2024 to nearly USD 19 billion by 2030. In practice, this means consumers now expect interactive try-ons as part of their journey, not as an optional extra. By 2025, virtual try-ons have become standard in fashion, beauty, and furniture commerce.
Yet the landscape is neither complete nor uniform: accuracy still varies, hardware and software integration remains uneven, and many brand ecosystems are only beginning to stitch these tools together. In short, the Clueless Closet is no longer a distant fantasy; it's under construction.
Here's a closer look at the main players in the industry you should know about:
1. Google’s Doppl and AI Shopping Try-On
What it is:
Google is approaching the future of virtual try-ons from two directions at once. Doppl, developed within Google Labs, is an experimental app available in the U.S. that lets users build a digital avatar, upload screenshots or photos from social media, and try on any outfit they find online. It’s not about transactions yet, but about self-expression and creative play by animating looks, testing identities, and gathering data on how people actually want to engage with digital fashion. Doppl also lets users create AI-generated videos so they can see how an outfit would look on them in real life.
Running in parallel is Google’s AI-powered Shopping Try-On, a feature built directly into Search and Shopping results. It shows garments on diverse real-world models and is currently rolling out from the U.S. to more countries like Australia, Canada, and Japan. This version focuses on purchase confidence and retail efficiency, linking AI-generated realism to actual products, brands, and measurable conversions.
Why it matters:
Together, the two approaches capture the split personality of fashion’s digital future. Doppl is Google’s creative lab for embodied style. It's playful, social, and culturally exploratory, while the AI Shopping Try-On is its commercial engine, built for scale. One gathers emotional data, the other drives transactional trust. For brands and strategists, they mark two horizons of the same evolution: a future where AI learns from both what we buy and what we dream of wearing.
2. DRESSX Agent
What it is:
DRESSX has evolved from a digital fashion marketplace into a fully fledged AI stylist. Its new feature, DRESSX Agent, lets users build a hyper-realistic digital twin from a single selfie. The system then functions as both a virtual try-on tool and a stylist that curates looks through AI-driven “mix & match” engines, adapting to mood, occasion and taste. Integrated with a luxury marketplace of more than 200 brands, it enables direct shopping and checkout through verified retailers, creating a seamless bridge between digital identity and real-world fashion.
Why it matters:
DRESSX Agent is often described as a kind of "ChatGPT for fashion": frictionless, intelligent and social. Rather than isolating try-on technology, it interlaces self-expression, styling intelligence and commerce into one experience. By merging your digital twin with infinite outfit generation, it redefines fashion discovery as creative dialogue. What began as a novelty for the metaverse is becoming a layer of daily styling intelligence. It aims to reduce decision fatigue, strengthen brand affinity and reframe fashion as a space for play.
3. Alta
What it is:
Alta, created by Harvard graduate and fashion tech pioneer Jenny Wang, is a multimodal AI styling platform that combines wardrobe digitisation, avatar-based try-ons and shopping inspiration. The platform’s main distinction is how it positions virtual try-ons: as engines for self-expression rather than as purely functional tools. Users can upload and digitise their wardrobe, receive outfit recommendations tailored to weather, travel or specific occasions, and view cost-per-wear insights. Alta also enables digital mood-boarding, allowing people to experiment with aesthetics and visualise how pieces work together.
Why it matters:
Alta marks a shift from transactional tools to styling intelligence. It lives between your closet and the store, turning everyday decisions (what to wear today, what to pack for a trip) into moments of discovery and creative play. For brands, this opens a new kind of engagement layer: one where you're not just competing at the point of purchase, but embedding yourself into the daily rituals of getting dressed. The data here is different, too: It's not just about what people buy, but how they actually live in their clothes, what they remix, what sits untouched. That context becomes as valuable as the sale itself.
4. Doji
What it is:
Doji blends digital avatars, AI styling and luxury fashion experimentation. Users build a lifelike digital twin from selfies, try on garments virtually, import items for experimentation and then purchase, share and even resell them through linked retail ecosystems. The platform uses user history, style preferences and body data to provide targeted outfit suggestions, bringing together styling, resale and social-sharing in one interface.
Why it matters:
Doji reframes the wardrobe not as a static collection, but as a living system of use, reuse, and remix. By weaving together virtual try-ons, resale capabilities, and social sharing, it positions fashion as circular by design. For brands, this is both opportunity and warning. Opportunity because it creates new touchpoints beyond the first sale: styling services, resale partnerships, avatar-based marketing. Warning because it accelerates a reality where ownership matters less than access, and where your product's second life might be more visible than its first. The question becomes: are you building for one transaction, or for ongoing presence in someone's rotating wardrobe?
5. Stitch Fix Vision
What it is:
Stitch Fix Vision combines generative AI with computer vision to produce realistic images of clients styled in complete looks. The system draws on Stitch Fix’s extensive dataset of user preferences, purchase history and stylist input to generate personalised outfit suggestions that reflect lifestyle, setting and anticipated needs. Each look is shoppable or saveable, turning the styling process into an interactive visual experience.
Why it matters:
Stitch Fix Vision narrows the gap between personal shopping and digital styling by giving both consumers and human stylists a shared creative tool. It allows users to see themselves in new styles before buying, reducing uncertainty and improving fit confidence, while stylists gain an augmented perspective on what clients might respond to next. Brands that can plug into these hybrid human-AI systems will reach customers at the exact moment they're open to something new, rather than waiting for them to search.
6. ChatGPT-Powered Shopping (Conversational Try-Ons)
What it is:
Conversational AI has quietly become the newest fitting room. Within ChatGPT and other large-language-model interfaces, users can now upload an image of an outfit they love, ask how to recreate it, or even request to “see themselves” in it (still with mixed results). The model blends dialogue, visual generation and search into a single flow: you describe your style, share a selfie or mood reference, and instantly receive AI-built outfit options, complete with shoppable suggestions. It’s not a formal product so much as an emerging behaviour: people are already using ChatGPT as a personal stylist, creative partner and visual try-on engine in one.
Why it matters:
The shift here isn't the technology, it's the interface. Talking to an AI about style feels less like shopping and more like texting a friend who happens to have infinite taste and instant access to every store. It's low-friction, iterative, and emotionally intuitive. For brands, this signals the arrival of AI-native commerce: conversational, contextual, and capable of blending inspiration, advice, and transaction in a single thread. The challenge? Your brand needs to be legible to language models, not just search algorithms. If an AI can't find you, recommend you, or explain why you matter, you're invisible in this new layer of discovery.
7. Revolve x Zelig
What it is:
Revolve has partnered with Zelig to integrate its AI-powered “Build a Look” experience, enabling shoppers to mix, match and virtually try on outfits in real time using either existing model libraries or their own uploaded photos. Zelig’s technology generates photorealistic garment simulation, capturing true fit, drape and layering. It also offers personalised, occasion-based recommendations and digital closets where users can curate and share looks. The system is optimised for speed and engagement, with reported results showing up to three times higher interaction rates and significant reductions in returns.
Why it matters:
This goes beyond try-on: it’s interactive styling at the crossroads of fashion, AI and community. For Revolve’s Gen Z and Millennial audience, it delivers instant, immersive outfit experimentation and a data-rich layer of personalisation through custom avatars and contextual filters. Together, Revolve and Zelig are shaping what next-generation e-commerce could look like: participatory, predictive and emotionally engaging.
For all the innovation happening in virtual try-ons, the experience is still incomplete, and in some cases, quietly reproducing old problems in new interfaces.
Start with the body itself. Most platforms still rely on limited body-type datasets, meaning avatars are only as diverse as the training data allows. If your proportions don't match the narrow range of reference models, the garment's drape becomes an approximation at best. This isn't just a technical limitation. It's a trust issue. The promise of virtual try-ons is that you'll see yourself. But if the reflection doesn't feel accurate, the entire system collapses into novelty again.
Then there's the question of privacy and data ownership. To create a hyper-realistic digital twin, you're handing over detailed measurements, photos, purchase history, and behavioral patterns. Who owns that data? How is it being used beyond styling recommendations? Most platforms remain vague on these questions, and users are left to trust by default rather than by design.
There's also cultural bias embedded in how these systems define style. AI-driven styling engines are trained on existing fashion content: magazines, e-commerce sites, influencer feeds. This means they inherit the aesthetic hierarchies already baked into those spaces. Virtual try-ons risk becoming echo chambers of taste, reinforcing narrow beauty standards rather than expanding them.
Overall, we are seeing virtual try-ons evolve from utility to expression. What began as a technical fit check is now becoming a cultural interface where people explore identity, curate taste, and connect through shared aesthetics. Across platforms like Doppl, DRESSX, Alta, Doji, Stitch Fix, and ChatGPT, personalisation is accelerating: hyper-realistic digital twins, predictive styling, and AI-driven curation are setting new standards for engagement, conversion, and loyalty.
At the same time, the lines between shopping, inspiration, and community are dissolving. Fashion is turning into a participatory experience where users co-create, remix, and share, blurring the boundary between owning and imagining. For brands, this moment demands new playbooks that treat try-ons not as a utility, but as the new architecture of digital retail.
The Clueless Closet is nearly here. What we do inside it will reshape not just how we shop, but how we see ourselves.