Lensa has been climbing the app store hit lists with its avatar-generating AI that is making artists wave the red flag. Now there's another reason to fly the flag: as it turns out, it's possible - and way too easy - to use the platform to generate non-consensual soft porn.

TechCrunch has seen photo sets generated with the Lensa app that include images with breasts and nipples clearly visible, attached to the faces of recognizable people. It seemed like the kind of thing that shouldn't have been possible, so we decided to try it ourselves. To verify that Lensa will create the images it perhaps shouldn't, we created two sets of Lensa avatars:

- One set, based on 15 photos of a well-known actor.
- Another set, based on the same 15 photos, plus an additional five photos of the same actor's face Photoshopped onto topless models.

The first set of images was in line with the AI avatars we've seen Lensa generate in the past. The second set, however, was a lot spicier than we were expecting. It turns out the AI takes those Photoshopped images as permission to go wild, and it appears it disables an NSFW filter. Out of the 100-image set, 11 were topless photos of higher quality (or, at least, with higher stylistic consistency) than the poorly edited topless photos the AI was given as input.

AI is getting better at generating porn. We might not be prepared for the consequences.

Generating saucy images of celebrities is one thing, and as illustrated by the source images we were able to find, there have long been people on the internet willing to collage some images together in Photoshop. Just because it's common doesn't make it right - in point of fact, celebrities absolutely deserve their privacy and should definitely not be made victims of non-consensual sexualized depictions. But so far, getting those collages to look realistic has taken a lot of skill with photo-editing tools, along with hours, if not days, of work. The big turning point, and the ethical nightmare, is the ease with which you can now create near-photorealistic AI-generated images by the hundreds with no tools other than a smartphone, an app and a few dollars.

The ease with which you can create images of anyone you can imagine (or, at least, anyone you have a handful of photos of) is terrifying. Add NSFW content into the mix, and we are careening into some pretty murky territory very quickly: your friends, or some random person you met in a bar and exchanged Facebook friend status with, may not have given consent to someone generating soft-core porn of them.

AI art generators are already churning out pornography by the thousands of images, exemplified by the likes of Unstable Diffusion and others. It appears that if you have 10-15 "real" photos of a person and are willing to take the time to Photoshop a handful of fakes, Lensa will gladly churn out a number of problematic images.

These platforms, and the unfettered proliferation of other so-called "deepfake" platforms, are turning into an ethical nightmare and are prompting the U.K. government to push for laws criminalizing the dissemination of non-consensual nude photos. This seems like a very good idea, but the internet is a hard-to-govern place at the best of times, and we're collectively facing a wall of legal, moral and ethical quandaries.

UPDATE: The Prisma Labs team replied to our concerns. The company highlights that if you specifically provoke the AI into generating NSFW images, it might, but that it is implementing filters to prevent this from happening accidentally. The jury is still out as to whether this will actually help people who are made victims of this sort of thing without their consent.

Eventually, all the grails in your closet will become passé. At least for a decade, before they become retro again. This feeling of fashion's inherently transitory nature had me asking a lot of hard questions: What if all my clothes suck? What if I can never wear this Our Legacy camp collar shirt again because camp collars peaked back in 2021? Should I just run down the streets nude like the Kony 2012 guy? Once I took a moment to breathe, I realized that the answers to most of my questions were "no" (or "no, plus that's socially unacceptable and will get you arrested").

Instead of having a full meltdown, I took inspiration from venerable designer Yohji Yamamoto and went headlong into my all-black era. I rehabilitated a pair of vintage Gucci trousers and scored an incredible polyamide/viscose Armani blazer that almost looks like velvet. A practically brand-new black blazer from the Row fell into my lap during a routine check of the RealReal. Studio Nicholson had an outrageously good sale. And I stocked up on white button-downs from Uniqlo. Los Angeles is a place where wearing black is something close to a fire hazard.
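An aside on the filtering mechanics in the Lensa experiment: Prisma Labs says it is implementing filters to catch NSFW output. We don't know how Lensa's filter actually works; the sketch below is purely illustrative, assuming a hypothetical output-side gate that scores each generated image with an NSFW classifier and blocks anything above a threshold (all names and numbers here are invented for illustration).

```python
# Hypothetical output-side NSFW gate - NOT Lensa's actual implementation.
# `nsfw_score` stands in for a real classifier's confidence in [0, 1].
from dataclasses import dataclass

THRESHOLD = 0.8  # illustrative cutoff; a real service would tune this


@dataclass
class GeneratedImage:
    image_id: str
    nsfw_score: float


def filter_batch(batch):
    """Split a generated batch into publishable and blocked images."""
    published = [img for img in batch if img.nsfw_score < THRESHOLD]
    blocked = [img for img in batch if img.nsfw_score >= THRESHOLD]
    return published, blocked


# A synthetic 100-image batch where 11 images score above the cutoff,
# mirroring the 11-of-100 result described in the article.
batch = [GeneratedImage(f"img_{i}", 0.95 if i < 11 else 0.1) for i in range(100)]
published, blocked = filter_batch(batch)
print(len(published), len(blocked))  # 89 11
```

The design point such a gate illustrates: it only inspects outputs. If doctored photos in the fine-tuning set push the model toward NSFW generations - or, as the experiment suggests, cause the gate to be bypassed entirely - screening the training inputs as well would be a necessary second line of defense.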