Updated on July 24 at 9:30 a.m. ET: Mashable's Tech Editor Timothy Beck Werth initially tried the beta version of the Google Shopping "Try it on" feature in May, back when it first became available for testing. As of this writing, Google is rolling out the feature to all users in the United States on desktop and mobile devices. You can try this virtual Clueless closet for yourself within Google Shopping now: just click on an apparel product and look for the "Try it on" button.
At Google I/O 2025, the tech company announced a ton of new AI features, and one of the most fascinating is a virtual clothing try-on tool.
The Google Shopping "Try it on" feature lets users upload a photo of themselves and then virtually try on clothes, basically the IRL version of the Clueless closet millennials have been dreaming about since 1995. Or, as Mashable Shopping Reporter Haley Henschel put it, "Google's latest shopping feature makes Cher Horowitz's computerized closet a reality."
Almost as soon as the feature launched, users started trying to "jailbreak" the tool, which is becoming a fun little tradition for tech writers whenever a new AI model or tool is released. On Friday, The Atlantic reported that "Google's new AI shopping tool appears eager to give J.D. Vance breasts." Hilarious, right? What's less hilarious: the same tool can also generate breasts on photos of underage users, again per The Atlantic.
I decided to give the "Try it on" feature a test spin, and I'll get into the good, the bad, and the mortifying below. As a shopping tool, I have to say I'm impressed.
How to use Google's "Try it on" AI shopping tool
The virtual try-on feature is one of the free AI tools Google released this week, and users can sign up to try it now. Officially, the product is part of Google Labs, where users can test experimental AI tools. Signing up is simple:
- Sign in to your Google account
- Head to Search Labs and click to turn the experiment on
- Take a full-body picture of yourself and upload it
- Navigate to Google Shopping and click a product you want to "try on"
- Look for the "Try it on" button over the product image

The "Try it on" button appears over the product image.
Credit: Screenshot courtesy of Google
As a fashion tool, Google's "Try it on" feature really works
Purely as a tool for trying on clothes, the new virtual try-on experience is pretty damn impressive. The tool uses a custom image generation model trained for fashion, per Google.
I'm always skeptical of new AI tools until I've tried them myself. I also care about my personal style and consider myself up to date on men's fashion trends, so I wasn't sure what to expect here. Still, the tool works as advertised. In a flashy I/O presentation, Google showed models seamlessly trying on one outfit after the next, and while the actual tool is a little slower (it takes about 15 seconds to generate an image), the real product experience is very similar to the demo.
To show you what I mean, let's compare some selfies I recently took on a trip to Banana Republic here in New York City with the AI images Google generated for the same clothes. For reference, here's the original photo I uploaded (and remember, I'm a Tech Editor, not a fashion model):

The photo I used to virtually try on clothes.
Credit: Timothy Beck Werth / Mashable
In this first photo, I'm wearing a blue cashmere polo, and the AI image looks roughly like the real one taken in the Banana Republic dressing room:

Trying on a blue polo…
Credit: Timothy Beck Werth / Mashable

And here's how Google imagined the same shirt. AI-generated image.
Credit: Timothy Beck Werth / Mashable
I found the AI shopping tool came pretty close to capturing the overall fit and style of the shirts. It even changed my pants and shoes to better match the product. If anything, the virtual try-on tool errs on the side of making me slimmer than I am IRL.

I ended up buying this one.
Credit: Timothy Beck Werth / Mashable

AI-generated image.
Credit: Timothy Beck Werth / Mashable

Yeah, I bought this one, too.
Credit: Timothy Beck Werth / Mashable

AI-generated image.
Credit: Timothy Beck Werth / Mashable
In this photo, Google added a necklace I'd never wear in real life, and the AI-generated shirt is a bit more slim-cut than it's supposed to be, but in general the overall style is accurate.

I decided this isn't my style.
Credit: Timothy Beck Werth / Mashable

Neither is the imaginary necklace, watch, and matching white sneakers.
Credit: Timothy Beck Werth / Mashable
While the images are generating, you see a message that says, "AI images may include errors. Fit and look won't be exact."
But for an experimental tool, it's surprisingly on point. People have been hoping for a tool like this for decades, and thanks to the age of artificial intelligence, we finally have one.
Of course, not all of the tool's errors are so flattering…
Google also removed my shirt and imagined my chest hair
Here's where things get interesting. In The Atlantic piece I mentioned earlier, the authors found that if you asked the tool to generate an image of a revealing dress or top, it would sometimes generate or enlarge breasts in the original photo. That's particularly likely to happen with women's clothing, for reasons that should be obvious.
When I used the tool with a pink midi dress, the results were mortifyingly accurate. I bet that's pretty much exactly what I'd look like wearing that particular low-cut midi dress.
I'll spare you the actual image, but to imagine me in the dress, Google had to digitally remove most of my shirt and picture me with chest hair. Again, I'm shocked by how accurate the results were. Now, when I "tried on" a pink women's sweater, Google did give me some extra padding in the chest area, but I've also been open about the fact that that's not entirely Google's fault in my case. Fortunately, the feature was not available for lingerie.
What can Google do about these problems? I'm not sure. Men have every right to wear cute pink midi dresses, and Google can hardly prohibit users from choosing cross-gender clothing. I wouldn't be surprised if Google eventually removes the tool from any product that shows too much skin. While The Atlantic criticizes Google for altering photos of its writers taken when they were underage, they were the ones who uploaded the photos, in violation of Google's own safety policies. And I suspect the offending results would be much the same with almost any AI image generator.
In a statement to Mashable, a Google spokesperson said, "We have strong protections, including blocking sensitive apparel categories and preventing the upload of images of clearly identifiable minors. As with all image generation, it won't always get it right, and we'll continue to improve the experience in Labs."
Could people abuse the virtual try-on tool to cyberbully their peers or create deepfakes of celebrities? Theoretically, yes. But that's a problem inherent to AI in general, not this specific tool.
In its safety guidelines for the product, Google bans two categories of images, in addition to its general AI content guidelines:
- "Adult-oriented content, child sexual abuse imagery, non-consensual sexual content, and sexually explicit content."
- "Inappropriate content such as dangerous, derogatory, or shocking."
Again, you can try out the tool at Google Search Labs.