Geenee AR, creator of a web-based augmented reality (WebAR) software platform, announced that it has combined its body-tracking technology with Stable Diffusion, a generative artificial intelligence model. The goal? A "virtual fashion" system that lets you see how a dress fits before ever wearing it, and to "try it on" from any enabled device.
What does it mean?
Today, it means more efficient online clothes shopping (a true virtual dressing room). Tomorrow, it means pervasive virtual fashion: we will buy clothes that are visible on us through AR headsets which, if many companies have their way, will eventually replace today's smartphones.
AI and AR together for virtual fashion: how this 'marriage' was born
In recent months, text-to-image AI (artificial intelligence) applications have been released to the public, including DALL·E 2, Midjourney and Stable Diffusion. These powerful generative authoring tools can instantly transform any written sentence into a series of stunning visual images. This, needless to say, dramatically simplifies the asset creation process. At the same time, Geenee AR has improved the virtual try-on experience by providing full-body tracking with its augmented reality (AR) engine.
By combining these two technologies, it is now possible to create instantly customizable and wearable virtual clothing in a matter of seconds.
An infinite virtual closet
Geenee AR's new technology shows practically infinite possibilities for customization. You start from a neutral dress, and by typing any sentence you obtain ever-new images, which become the garment's motifs.
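To make the idea concrete, here is a minimal sketch of how a generated pattern could be "sewn" onto the neutral dress in a camera frame. This is my own illustration, not Geenee's actual pipeline: the function name, the use of NumPy, and the mask-based compositing are all assumptions; in the sketch, the motif stands in for a Stable Diffusion output and the mask for the body tracker's garment segmentation.

```python
import numpy as np

def apply_motif(frame: np.ndarray, garment_mask: np.ndarray,
                motif: np.ndarray) -> np.ndarray:
    """Composite a generated motif onto the garment pixels of a frame.

    frame:        H x W x 3 uint8 camera image (from the AR body tracker)
    garment_mask: H x W bool array marking the neutral dress's pixels
    motif:        h x w x 3 uint8 pattern (e.g. a text-to-image output)
    """
    h, w = frame.shape[:2]
    # Tile the motif so it covers the whole frame, then crop to size.
    reps = (h // motif.shape[0] + 1, w // motif.shape[1] + 1, 1)
    tiled = np.tile(motif, reps)[:h, :w]
    out = frame.copy()
    out[garment_mask] = tiled[garment_mask]  # repaint only the dress pixels
    return out

# Toy demo: 4x4 gray frame, dress mask in one corner, solid red motif.
frame = np.full((4, 4, 3), 128, dtype=np.uint8)
mask = np.zeros((4, 4), dtype=bool)
mask[:2, :2] = True
motif = np.full((2, 2, 3), (255, 0, 0), dtype=np.uint8)
result = apply_motif(frame, mask, motif)
print(result[0, 0], result[3, 3])  # masked pixel is red, unmasked unchanged
```

In a real system the mask would come from per-frame body segmentation and the motif would be UV-mapped to follow the fabric, but the core step is the same: a prompt produces an image, and the image is confined to the tracked garment region.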
While we wait for everyone to stroll through "augmented" reality in equally augmented clothes, virtual fashion will find its way into marketing campaigns (imagine a prize for the ugliest Christmas sweater, created and "sewn" virtually by customers themselves), collectible NFT garments, and in-game avatar customization.
If you want to take a tour, find the demo here.