I have to agree; I would not be able to spot a single one of them as fake. They look convincingly authentic, IMO.
Stalin famously ordered people he had killed erased from photos.
Imagine what current and future autocratic regimes will be able to achieve when they want to rewrite their histories.
This checks out, here’s an article about it: https://www.history.com/news/josef-stalin-great-purge-photo-retouching
So why are you downvoted? Maybe because your view is too optimistic? And the problem isn’t only with autocratic regimes; it’s much more general.
How do we validate anything, when everything can be easily faked?
Probably just because some people really like Stalin, and have become convinced his accounts are the truthful ones and everyone else lies about him.
That’s a scary thought!! But all kinds of crazy exist, and I mean people would have to be literally crazy to want to live under a regime like the one Stalin built.
lemmygrad dot ml
With AI video also getting increasingly impressive and believable, I worry that we will soon live in a world where you could have actual video evidence of a murder, and that evidence would be dismissed or cast into doubt because of how easy, or supposedly how easy, it would be to fake.
Absolutely, only video from trusted sources can be used. But isn’t that already the case?
Better than having people get convicted based on fake evidence, though.
I think they are both equally scary. I’m imagining cases where photo and video evidence have played major roles in proving police abuses of power, for example. We will certainly have an onslaught of people faking evidence of all sorts of things to push a political narrative, but equally, in any politicized narrative, any politically inconvenient photos or videos of real things that really happened might be swept under the rug as “someone probably just faked that for political gain.” Sure, you could have an investigation to look into the authenticity of the evidence, or look at other forensic evidence, but probably only if you can afford to have such an investigation done, or enough public attention gets drawn to it. I fear we are reaching a scary time where, in a sense, reality will be whatever people want it to be, and we will increasingly be unable to trust anything we see as real with absolute certainty. We have been headed down this road for a very long time, but this will just make it much worse.
“Photoshopping” something bad has existed for a long time at this point. AI-generated images don’t really change anything other than the entire photo being fake instead of just a small section.
I’d disagree. It now takes zero know-how to convincingly create a false image. And it takes zero work. So where one photo used to take one person a decent amount of time to convincingly pull off, now one person can create 100 images or more in that time, each one a potential time bomb that will go off when it starts getting passed around as evidence of something. And there are uncountable numbers of bad actors on the internet trying to cause a ruckus. This just increased their chances of succeeding at least 100-fold, and opened access to many, many others who might do it accidentally, as a joke, or who always wanted to make waves but didn’t have the Photoshop skills necessary.
It changes a lot. Good Photoshopping skills would not create the images as shown in the article.
Yeah, some of these would be like 100-layer creations if someone was doing it themselves in Photoshop – it would take a professional or near-professional level of skill.
The ease and speed with which AI photos can be created, of a quality most photoshoppers could only dream of, does very much change everything.
Digital image editing has been really good for this kind of stuff for quite a while. Now it’s even easier with content aware fill.
Unless you’re the PR manager for the British Royal family. Then you somehow lack the basic skills to make convincing edits.
Wikipedia page on the dude: Nikolai Yezhov
Like 1984.
Honestly, it looks like the picture on the left is fake, like the guy was inserted into it. Just look at his outline, compared with the rest of the background.
(I’m no Stalin fan, just commenting on the picture itself.)
I can imagine such regimes nowadays developing some sort of cryptographic photo attestation, so any photo not signed by them would be shown as untrusted, regardless of whether it’s fake or not. And all the code from the processor to the camera app would need to be approved by their servers in order to get a signature.
Oh wait! Our great friends at Adobe, Intel, Google and Microsoft are already working on just that: https://c2pa.org/
It won’t help with uncles on Facebook spreading lies.
It would not help with anything.
https://www.facebook.com/Channel4Lifestyle/videos/the-it-crowd-break-up/1232943160173701/