A new technology is set to revolutionize advertising for clothing companies while eliminating the need for real-life models. The new algorithm generates models so realistic that the images can be mistaken for photographs of real models posing for a shoot.
DataGrid, a tech startup based at Kyoto University in Japan, developed the new algorithm. It is built on the Generative Adversarial Network (GAN), a type of AI commonly used to produce new imitations of something that already exists, such as video game levels or images. A GAN essentially pits two AI systems against each other to create a realistic imitation of something.
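To make the "dueling" idea concrete, here is a minimal toy sketch of a GAN in plain numpy. This is purely illustrative and has nothing to do with DataGrid's actual model: a tiny generator learns to imitate a simple 1-D "real data" distribution while a logistic-regression discriminator tries to tell real samples from fake ones. All names and parameters are invented for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Real" data: 1-D samples from N(4, 1), standing in for real photographs.
def real_batch(n):
    return rng.normal(4.0, 1.0, size=(n, 1))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Generator: affine map from noise z to a fake sample. Params g = (w, b).
def generate(g, z):
    w, b = g
    return w * z + b

# Discriminator: logistic regression scoring real (1) vs. fake (0).
def discriminate(d, x):
    a, c = d
    return sigmoid(a * x + c)

g = np.array([1.0, 0.0])   # generator parameters
d = np.array([0.1, 0.0])   # discriminator parameters
lr = 0.05

for step in range(2000):
    n = 64
    z = rng.normal(size=(n, 1))
    fake = generate(g, z)
    real = real_batch(n)

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0
    # (gradients of binary cross-entropy w.r.t. a and c).
    a, c = d
    p_real = discriminate(d, real)
    p_fake = discriminate(d, fake)
    grad_a = np.mean((p_real - 1) * real) + np.mean(p_fake * fake)
    grad_c = np.mean(p_real - 1) + np.mean(p_fake)
    d = d - lr * np.array([grad_a, grad_c])

    # Generator step: push D(fake) toward 1, i.e. fool the discriminator.
    a, c = d
    z = rng.normal(size=(n, 1))
    fake = generate(g, z)
    p_fake = discriminate(d, fake)
    dl_dfake = (p_fake - 1) * a          # chain rule through the logit
    grad_w = np.mean(dl_dfake * z)
    grad_b = np.mean(dl_dfake)
    g = g - lr * np.array([grad_w, grad_b])

# After training, the generator's samples should have drifted from
# mean 0 toward the real data's mean of 4.
fake_mean = float(generate(g, rng.normal(size=(1000, 1))).mean())
```

The same adversarial loop, scaled up to deep convolutional networks and image data, is what lets a GAN produce photorealistic model images: the generator keeps improving precisely because the discriminator keeps getting better at catching its fakes.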
Previously, GAN-based technology focused only on generating facial images, and even those were not flawless; the dark backgrounds they were rendered against helped conceal the imperfections. The new algorithm eliminates these flaws and presents realistic models against a white background, complete with realistic lighting. The result is so convincing that you would not suspect the images were generated by an AI.
The new deep learning algorithm is set to be licensed to clothing companies and advertising agencies that want to use it. With the new technology, these companies will be able to cut advertising costs, since expenses such as lighting rigs or catering for the crew of a traditional photo shoot will no longer be needed. The workflow will also improve, as the algorithm can easily manipulate the AI models and swap the outfit they are wearing for another.
While this may sound good for advertising companies, it could also amount to fooling the public into thinking the models are real rather than a product of digital technology. That is how realistic the images produced by this new technology are.
Hopefully, this kind of algorithm will remain within the confines of the advertising world, since improper use could lead to propaganda that misleads the public into believing something fake is real.