Apple is caught between a rock and a hard place when it comes to apps that enable nonconsensual deepfake videos.

[Image: a face-swap app ad found to feature an altered image]
Apple can't stop the influx of "dual-use" apps that look innocent at first glance but let users create deepfake pornography, often for a steep price. Apple prides itself on its App Store curation, which includes keeping pornographic apps off the platform. That control has limits, however, because some apps offer features that users can easily abuse without Apple's knowledge.

According to a 404 Media report, Apple is struggling with a "dual-use" problem in apps offering features like face swapping. The feature seems innocent enough, but users are swapping faces onto pornography, sometimes using the faces of minors.

A reporter stumbled across a paid advertisement on Reddit promoting a face-swapping app. Face-swapping apps are common and usually free, so an app buying paid advertising would need a business model that justifies the expense. The advertised app let users swap any face onto video from their "favorite websites," with an image suggesting Pornhub.

Apple doesn't allow porn-related apps on the App Store. However, apps built around user-generated content can surface such images and videos, creating a loophole. Apple pulled the app once it was informed of its dual use, but because it was unaware of the issue, the reporter had to share a direct link to the app.

It isn't the first time an innocent-looking app has made it through Apple's app review and offered a service that violates Apple's guidelines. While it isn't as blatant a violation as turning a children's game into a casino, the ability to create nonconsensual intimate imagery (NCII) clearly wasn't on Apple's radar.

Face-swap apps are popular on the App Store, and artificial intelligence features can produce incredibly convincing deepfakes, so it is important that companies like Apple get ahead of the problem. Apple can't stop every abuse, but it can implement a policy that app review can enforce: clear guidelines and rules about the generation of pornographic images. Apple has already blocked deepfake AI sites from using Sign in with Apple. No app should, for example, be able to source videos from Pornhub.

Apple could also set specific rules for dual-use apps, such as banning apps found to be creating such content. Apple Intelligence was built carefully to ensure it doesn't create nude images, but Apple's oversight shouldn't end there. Apple claims to be the best arbiter of the App Store, so it should take responsibility for things like NCII generation being promoted in ads.

Face-swapping apps aren't the only problem. Apps promoting infidelity, intimate chat, adult chat, and other euphemisms still make it through app review. Reports have suggested for years that app review is broken, and regulators are tired of platitudes. Apple must take control of the App Store, or it risks losing that control altogether.