
AI camera turns portraits into nude photos

Samuel Buchmann
9/4/2024
Translation: machine translated

The NUCA art project shows how easily fake nude images can be created with generative artificial intelligence. Within seconds, the camera generates AI portraits from real photos and then transforms them into unclothed deepfakes.

Two Berlin artists have developed a camera that virtually takes people's clothes off. Mathias Vef and Benedikt Gross call the prototype of their provocative project "NUCA". It is intended to encourage reflection on generative artificial intelligence (AI) and on the potential for misuse of deepfakes. The camera is not intended as a commercial product.

According to the website, NUCA analyses portraits of clothed people according to 45 parameters - for example gender, age, ethnicity, hair and body shape. This results in a prompt for the Stable Diffusion image generator, which then spits out a "source photo" (e.g. the left and right photos in the header image). In the final step, the AI image is individualised with the face and pose of the real person. The result: fake nude pictures within a few seconds. How closely they resemble the real person varies from case to case.
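To make the described pipeline more concrete, here is a minimal sketch in Python using the Hugging Face diffusers library. It is an illustration only: the model ID, the hard-coded attributes and the swap_face placeholder are assumptions, since the artists have not published their code, their models or the full list of 45 parameters.

```python
# Minimal sketch of a NUCA-style pipeline (illustrative only; the artists'
# real implementation, models and parameters are not public).
import torch
from diffusers import StableDiffusionPipeline

# A handful of the roughly 45 attributes the camera reportedly estimates.
# In the real project these come from analysing the portrait;
# here they are hard-coded as an assumption.
attributes = {
    "gender": "male",
    "age": "mid-thirties",
    "hair": "short brown hair",
    "body_shape": "athletic build",
}

# Build a text prompt from the estimated attributes.
prompt = (
    f"full-body studio portrait of a {attributes['age']} {attributes['gender']} "
    f"with {attributes['hair']} and an {attributes['body_shape']}"
)

# Generate the "source photo" with Stable Diffusion (assumes a CUDA GPU).
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")
generated = pipe(prompt).images[0]

# Final step (placeholder): personalise the generated image with the face
# and pose of the real person. How NUCA does this is not documented.
def swap_face(generated_image, original_portrait):
    raise NotImplementedError("face and pose transfer not shown in this sketch")
```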

The camera's hardware is simple: a small sensor in a 3D-printed housing and a lens with a focal length of 37 millimetres (35 mm equivalent). In addition to the final result, the rear display also shows the original image and the parameters it has determined.

The clothed images are sometimes very different from the naked ones. This gentleman here, for example, suddenly has twice as muscular thighs and a different face.
Source: NUCA

"Speculative consumer product"

The art project plays with the old science fiction dream of Superman's X-ray vision and shows both its fascination and its dangers. Fake nude images are already a reality on the internet; in February, for example, deepfakes of Taylor Swift circulated on X. In contrast to such non-consensual images, NUCA is meant, according to the artists, to strip the process of its anonymity: photographer and subject are physically in the same place and see the result immediately.

All images in the art project were created with the explicit consent of the people portrayed.

Vef and Gross clarify: "NUCA is a speculative consumer product and is not intended to be used uncontrolled in the wild. The images generated will not be used in any way unless explicit consent is given. To ensure the safety of minors, the project is intended for adults only."

To draw attention to their project, the pair launched a disguised TikTok campaign in which they posed as a tech start-up. In the short video, NUCA is described as a "new kind of camera" that "undresses everyone". Reactions ranged from fear of the potential for abuse to enthusiasm for a fun new gadget.

Header image: NUCA
