Why not also go after the software companies for allowing such images to be generated in the first place, i.e. allowing AI-generated nude images to be produced from uploaded photos of real people?
How? How could you make an algorithm that correctly identifies what nude bodies look like? Tumblr couldn’t differentiate between nudes and sand dunes back when it enforced its new policies.
This sounds great, but it’s one of those things that is infinitely easier to say than do. You’re essentially asking for one of two things: manual human review of every single image uploaded, or a perfect image recognition system. The first is fraught with its own issues, and the second does not exist.