HN Theater @HNTheaterMonth

The best talks and videos of Hacker News.

Hacker News Comments on
Pinscreen Deepfake Live Prototype

Hao Li · Youtube · 142 HN points · 0 HN comments
HN Theater has aggregated all Hacker News stories and comments that mention Hao Li's video "Pinscreen Deepfake Live Prototype".
Youtube Summary
This is a state-of-the-art demo of a real-time deepfake face swapping technology developed by Pinscreen. This demo will be exhibited at the World Economic Forum in Davos to raise awareness of the danger of deepfakes and advanced video manipulation technologies, when misused for the purpose of disinformation. This is the demo running back at our Pinscreen offices in LA.

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this video.
Jan 27, 2020 · 142 points, 60 comments · submitted by freediver
jokoon
The human brain is very skilled at recognizing faces; that's a result of evolution.

While deepfakes are high quality, I think most people can notice that they're not entirely authentic. A deepfake cannot really account for the shape of the skull, facial expressions, facial muscles, how the head and jaw move, etc. There are many cues the human brain uses to identify a face, and I'm not really sure that deepfakes measure and render all of them reliably.

So I'm not really convinced by deepfakes in general. It's mostly a glorified silicon mask, nothing more: it will trick people if they're far enough away. They might be able to fool face recognition software, but not humans, even those who are aware of deepfakes.

mbel
While it's true that there are some artifacts, I think they can be mitigated with tricks like avoiding sudden head movements by the actor or artificially decreasing the quality of the video. Low-quality photos of Bigfoot entertained people for years.

I generally agree though that with time we will learn to see red flags like this and treat video as probably fake.

danlugo92
Actors with similar skull shapes might be used for more convincing deep fakes.
giarc
I think people are mostly able to tell because they are told "this is a deepfake video of Arnold Schwarzenegger" (and the technology is early). Now imagine someone deepfakes a video of a gathering of people taken from across the room. It purportedly shows a politician talking to some constituents and espousing some whacky ideas. It gets posted, and before the politician can address it, it's already gone viral and there are pitchforks at their door. The video quality issues are hidden by the fact that it is a grainy cell phone video shot in a dimly lit room with digital zoom enabled.
FailMore
Can someone explain the difference between this and a typical Instagram / Snapchat filter? Sorry, I'm a bit of a deepfake n00b.
speedgoose
This looks slightly better. It's also a bit more in the uncanny valley in my humble opinion.
mkagenius
Another technical difference is that a deepfake is end-to-end and nobody knows the exact math applied to each pixel, while filters are more or less hand-crafted functions.

This particular article is not that impressive, looks like animation to me.
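
To make that distinction concrete, here is a minimal Python sketch (illustrative only; the names are hypothetical and this is not Pinscreen's, Instagram's, or Snapchat's actual code): a filter is an explicit, human-written pixel transform, while a deepfake-style swap delegates the pixel mapping to trained network weights.

```python
# Illustrative sketch only (hypothetical names, not any vendor's real code):
# a "filter" is hand-written math on pixels; a deepfake is a learned function.
import numpy as np

def handcrafted_filter(frame: np.ndarray) -> np.ndarray:
    """Snapchat-style effect: every pixel operation is spelled out by a human."""
    out = frame.astype(np.float32)
    out[..., 0] *= 1.1                 # e.g. warm up the red channel
    return np.clip(out, 0, 255).astype(np.uint8)

def learned_face_swap(frame: np.ndarray, model) -> np.ndarray:
    """Deepfake-style swap: the per-pixel mapping lives in millions of
    trained weights (`model` is a placeholder for a trained network)."""
    return model(frame)
```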

vernie
I guess the sole purpose of this video is to demonstrate that the processing is happening in realtime. Every other choice (video-of-a-screen, limited range of expression, muted audio) seems to be made to hide what look like pretty bad results.

Hao Li has done some impressive research in his career but he's been running his startup like a bit of a charlatan. He lends his name to numerous scare pieces (e.g. https://www.nytimes.com/2018/03/04/technology/fake-videos-de...) and a former employee has even claimed that he fakes his demos (https://www.latimes.com/business/technology/la-fi-pinscreen-...)

rusticpenn
The George Bush example was not impressive. I haven't seen many deepfakes, but is hair generally not part of the equation?
actimia
I don't think hair is usually included in the faked part. Generally, for a convincing result you would use a base video of someone who bears at least some resemblance to the target (including hair).

It is also possible that hair was disabled to get real-time performance for this (quite impressive) demo.

kavalg
Hmm, I thought deepfakes consume lots of computational power both at training and at inference time. Is there some tech/science breakthrough here, or have they just scaled it up to more hardware?
jorgemf
Training yes, but for inference a commercial GPU can do that at 14fps with a small latency (very noticeable in the video)
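
For a sense of scale, 14 fps is roughly 71 ms per frame. A rough sketch of how one might measure that with PyTorch (the model and input are placeholders, not Pinscreen's system):

```python
# Rough sketch of measuring per-frame inference latency on a GPU.
# `model` and `frame` are placeholders; 14 fps corresponds to ~71 ms/frame.
import time
import torch

@torch.no_grad()
def measure_fps(model, frame, n_frames=100, device="cuda"):
    model = model.to(device).eval()
    x = frame.to(device)
    for _ in range(10):                # warm-up before timing
        model(x)
    torch.cuda.synchronize()
    start = time.perf_counter()
    for _ in range(n_frames):
        model(x)
    torch.cuda.synchronize()
    return n_frames / (time.perf_counter() - start)

# A result around 14 would match the frame rate mentioned above.
```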
mavhc
I think it's the training of the model that requires most of the time; once that's done, it just needs to decide what to swap with what.

So I'd expect it to only work on the 1 person in the demo, with the 4 famous people depicted.

huling0
Though I have to admit this looks like the best in class, there is a public real-time deepfake service already available: https://www.aphrodite.ai/
z3t4
Dammit, he skipped the female models, which I think are the funniest. Thinking about monetization - what about animal models? Who wouldn't want to look like a panther...
darepublic
Most of the debate here is around evil use cases for this but I just keep thinking how it will aid in the creation of memes
etiam
I don't necessarily see the contradiction...
owl57
Why did he choose celebrities of different races? Did more similar faces fall even deeper into the uncanny valley?
speedgoose
Ethnicity.
durpleDrank
OH NO! PEOPLE WILL HAVE TO START USING SKEPTICISM AND CRITICAL THINKING WHEN WATCHING VIDEOS FROM SHADY PARTS OF THE INTERNET!
darwinreally
How can we achieve this with what's available in open source world? Or is it only available as a proprietary system?
Erlich_Bachman
Some recent links, models, and code, updated early 2020 Q1 (see the sketch after the links for how these tools work):

https://github.com/iperov/DeepFaceLab ("More than 95% of deepfake videos are created with DeepFaceLab. DeepFaceLab is used by such popular youtube channels as Ctrl Shift Face, Sham00k, Collider videos, VFXChris Ume")

https://github.com/deepfakes/faceswap

https://faceswap.dev/

https://forum.faceswap.dev/

https://dfblue.com/

https://pub.dfblue.com/pub/2019-10-25-deepfacelab-tutorial
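
For context, the tools above are built around the same basic idea: one shared encoder with a separate decoder per identity, trained to reconstruct each face, then cross-wired at conversion time. A toy PyTorch sketch of that architecture (not the actual DeepFaceLab or faceswap code; layer sizes are arbitrary):

```python
# Minimal sketch of the shared-encoder / per-identity-decoder idea behind
# DeepFaceLab/faceswap-style tools. Not their actual code; sizes are toy.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 5, stride=2, padding=2), nn.LeakyReLU(0.1),
            nn.Conv2d(64, 128, 5, stride=2, padding=2), nn.LeakyReLU(0.1),
            nn.Flatten(),
            nn.Linear(128 * 16 * 16, 512),
        )
    def forward(self, x):               # x: (N, 3, 64, 64) face crop
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(512, 128 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.1),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )
    def forward(self, z):
        h = self.fc(z).view(-1, 128, 16, 16)
        return self.net(h)

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()   # one decoder per identity

# Training: reconstruct A faces through decoder_a and B faces through
# decoder_b, sharing the encoder, so the latent space aligns the identities.
# Swapping: encode a face of A, then decode it with B's decoder.
def swap_a_to_b(face_a):
    return decoder_b(encoder(face_a))
```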

kavapebumazh
It scares me a lot!
cpr
Funny how all this deep fake scare is happening just as important people like Epstein & cohorts start being brought down...
lqet
Before anyone panics, please remind yourself that whatever medium humans have used as a form of communication over the last few thousand years has also been used for fakes. You only need a half-decent voice actor to pretend you are someone else on the phone. Fake photographs are as old as photography. Even probably the most immediate form of interpersonal communication, seeing another person, can be faked, as demonstrated by the many Doppelgängers used by politicians in the past. The power of the House of Habsburg basically had its roots in several imperial documents and bulls faked by Rudolf IV in 1358 (which was only discovered in the 19th century).

This isn't a new phenomenon that now means that we cannot trust anything anymore - it's only that the fakes are finally catching up with live video.

prox
One rule in old times was that you always had two witnesses with you at any important event to vouch for it. The reasoning was that it is a lot more difficult to do any trickery when more people are involved.

Now that we have scaled a lot of services into a digital network, our trust services need to adapt as well.

philbarr
Reminds me of this story from WW2 [0]. I'm sure I saw a documentary about this but that's the only reference I can find.

"In late December 1942 Grimson and Allan Morris walked out of the camp with a group of German servicemen but were recaptured 2 days later, when they showed more courtesy than was expected from a German Officer. The prisoner forgery department had produced the documents necessary for the escape and the German uniforms to go over their civilian clothing. Both Grimson and Morris had again noticed that they had near doubles serving as officers in the German guard company who they impersonated and both had lain in wait for 3 nights fully disguised in their German uniforms waiting for the right moment when the two clones attended the show. Outside the wire they changed into civilian clothes disguised as foreign workers, walking to Sagan station and catching a train to the outskirts of Leipzig before being caught."

[0] https://en.wikipedia.org/wiki/George_Grimson

numlock86
> You only need a half-decent voice actor to pretend you are someone else on the phone.

You can even digitize this part already. All you need is 5 seconds of recording and a pre-trained model in the desired language/accent.

https://www.youtube.com/watch?v=0sR1rU3gLzQ
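
The linked demo follows the usual three-stage cloning pipeline: a speaker encoder turns the short reference clip into an embedding, a synthesizer generates a spectrogram for arbitrary text conditioned on that embedding, and a vocoder renders audio. A hedged Python sketch of that flow (the three model callables are hypothetical placeholders, not the repository's actual API):

```python
# Conceptual sketch of few-second voice cloning (encoder/synthesizer/vocoder).
# The three callables stand in for pretrained models; they are not a real API.
import numpy as np

def clone_voice(reference_wav: np.ndarray, text: str,
                speaker_encoder, synthesizer, vocoder) -> np.ndarray:
    embedding = speaker_encoder(reference_wav)   # ~5 s of audio -> voice embedding
    mel = synthesizer(text, embedding)           # text + embedding -> mel spectrogram
    return vocoder(mel)                          # mel spectrogram -> waveform
```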

scarejunba
Very cool. Thanks for sharing. I wonder why the warbling is easier to detect in the male voices. They sound like a computer. But very cool advancement.
saurik
Right... but the point is that you _don't_ need fancy neural networks, you _only_ really need someone with an actually not that uncommon skill.
numlock86
Yes. But I think there is another way to look at this: You _don't_ need fancy people with a not so uncommon skill, you _only_ need a bit of computing power.

Point is: I totally agree! The barrier to entry for faking something is really low, no matter the technical ability or skills you bring to the table.

saurik
But the point of the comment being responded to--which I also think is a very important point that it feels like you missed--is that this isn't some new issue caused by computers.

Zeitgeist: panic, "the concept of truth is being destroyed by advancements in computers!!!"

OP: sigh, "no... you have always been able to trick people, such as with audio, using a trained voice actor"

You: "but, even with audio, you can also trick people with fancy new computers!"

Me: ?!?

feanaro
The OP seems to have understood that, but his argument is that this new technology makes the costs significantly lower. The lowering of cost has a strong impact in itself.

In other words, in the vast majority of cases almost no one would have bothered seeking out and then paying a talented individual with a somewhat uncommon skill. Once this kind of software is available for free and packaged so that almost zero effort is needed to achieve the same effect, a significantly larger number of people will do it.

So something important has changed. I agree that the alarmist stance is still unwarranted, though.

_Microft
Using voice actors and doppelgängers does not scale. It is similar to the argument that automated surveillance is not bad because you could just as well observe someone yourself. Sure, that is possible, but in reality it is limited to the niche of police and private detectives because it does not scale well.
giarc
In 1358 you could fake your identity and communicate with a handful of people. In 2020 you can fake your identity and communicate with 7.5 billion people at once. I think there is a difference.
keiferski
Similar things happened in Russia and the Ottoman Empire - people claiming to be so-and-so, heir to the throne. As with anything else, if you are/aren't validated by society + the "gatekeepers of power", it doesn't really matter how good/bad your fake evidence is.

https://en.wikipedia.org/wiki/False_Dmitry

mikorym
I'm glad to see this point being made fairly often.

I see the "base reality" debate in a similar situation. Base reality could be the same thing as a religion, where the "god" is the system operator [1]. That way, you are "only" as scared of multiple levels of reality as you are of religion. But perhaps more importantly, that way you have 2000+ years of debate and reflection on the topic.

[1] Not my idea.

ramraj07
In the past, you needed to put in effort to fake things. Of course, people still exist that believe the moonlanding videos were faked, but their hypothesis hinges on the government spending extraordinary amounts of money to produce the fake.

Deepfakes are deeply troubling because anyone with a computer (a requirement that's quickly becoming optional as well) can fake any video almost instantly.

Consider just one sinister motive that's already making women in many countries cower in fear: if you reject a dude he can just make a deepfake porn video with you and release it as revenge. What's the equivalent of this in the distant past? All a guy could have done is "allege" that the woman is so-and-so, which, to be sure, probably did ruin some women's lives; but then it's almost as if our society has devolved back to that same Stone Age at best.

And we live in a time when "facts" already don't matter much anymore and people are just looking to find evidence that supports their claim alone rather than the truth. To not take this technology seriously, to brush it off because "sensible" people will still seek the truth, is the kind of smug arrogance about questionable tech that has led us to the current overall shit-sandwich state of affairs to begin with. Let's hope we learn some lessons at least.

mbel
With deepfakes easily available, I believe pornographic video material will ultimately carry the same weight as a rumor; it's already somewhat true for photos today. We definitely need a common way to propagate trusted information, but treating videos as absolutely trustworthy is not possible anymore.
washadjeffmad
Right, and I'd say the sooner this is demonstrated to the public at large (perhaps not with pornography), the better.

Next is legislation that outlines the rights of individuals and provides a process for reporting and removal of content, a la DMCA. We have existing laws for non-consensual pornography, but it might be interesting to allow the option to treat such videos like copyrighted content. If a video is profitable, pornographic or not, why not allow people to collect on that?

The next catch-up phase is to release client-side tools to allow people to seamlessly render template content on their home equipment, circumventing any public distribution / "make available" clauses.

I also think long term this'll be a boon for sex worker acceptance and reintroduce some skepticism into the public evaluation of media.

My grandfather was a retired military officer, and late in his life they shot a battle scene for a movie down the street from where he lived. He saw the set and effects in action and was deeply affected by it, saying it was indistinguishable from reality to him, and if that was the level of control they had over video these days, how could he ever trust anything he saw on the news again?

Interesting times.

belorn
> "What's the equivalent of this in the distant past?"

There are so many versions, depending on the time and the place. If it's an elder in ancient times, then what they say is the truth. If it's in a caste society, then the truth is determined by your blood. Under chivalry, truth is based on reputation. In the 19th century, truth was written in a book; in the 20th century, it was written in the paper. World War 2 was a period where truth was what was said by leaders, both political and scientific. From after World War 2 until the 21st century, truth was anything said on TV (in many places, a single channel).

With the exception of the caste system, I would say that there is a need for a healthy balance between trust and distrust. It also seems that this balance generally does emerge after a while, but not before a lot of damage happens. Deepfakes in that sense are not ushering in a new dark age, but it will likely take a bit of time before people find a balance in how to perceive video evidence.

amelius
> if you reject a dude he can just make a deepfake porn video with you and release it as revenge

But the woman in this example could take the same video and put the faces of a thousand other people on the person in the video.

sharot4
>Of course, people still exist that believe the moonlanding videos were faked, but their hypothesis hinges on the government spending extraordinary amounts of money to produce the fake.

Are you truly suggesting here that the government would never make a fake video about a moon landing because it would cost too much to produce? So instead they sent humans to the moon and brought them back safely to Earth because that's all their budget would allow?

pvinis
> because anyone with a computer

How is that different from anything else nowadays? That's what computers do for us.

Making calculations, decrypting text, generating art, developing software.

All of these used to be harder and more manual and time-consuming. Now, anyone with a computer or a phone and an app can do these things way easier, and I'm sure there are better examples I failed to think of.

It only makes sense that even the "bad" uses of a computer will benefit from computer progress.

overthemoon
I challenge anyone worried about how facts don't matter to people anymore due to the tech bubble/fake news/propaganda campaigns to prove they ever really did.
keiferski
What the original commenter is saying, and what I will agree will happen, is that the behavior of automatically trusting video as evidence will go away. I think this is actually a good thing if it encourages people to dig deeper into the sources of information.

> If you reject a dude he can just make a deepfake porn video with you and release it as revenge.

This will be troubling at first, but again in the long run it will likely just lead to people not taking these sort of videos seriously and interpreting them as a joke / weird stuff on the Internet. This already essentially happens with Photoshop and celebrities.

> And we live in a time when "facts" already don't matter much anymore and people are just looking to find evidence that supports their claim alone rather than the truth

This is how human beings have been since the beginning of time. There is no "Golden Age of Rationality" that we are exiting.

novaRom
> it will likely just lead to people not taking these sort of videos seriously

Will courts take it seriously? It is quite different from funny GIFs if you can completely fake the appearance of a politician in an appropriate place doing bad things. We did have various scandals in recent years ('grab 'em by the pussy', the Austrian leader of a right-wing party, etc.) and there is certainly more to come in the upcoming years. How much of that content will we be able to say is not fake with 100% certainty?

koolba
> Will courts interpret it seriously?

I doubt it. All it takes is a defense attorney showing the jury their own faces merged into a deep fake porn.

ramraj07
I disagree, and propose that we are indeed exiting a golden age of objectivity that evolved alongside Western civilization over at least the past century, if not longer.

We developed as a world because the most important nations did not peddle obvious distortions of facts in matters of fundamental importance (that all people are equal, that no one deserves to die, that the world needs preservation, and recently, that man has indeed harmed this planet irreversibly). Of course, every nation in history has been hypocritical and has committed grave acts that go against these beliefs, but at the least they would not try to justify them as right; they would often just hide them from the public.

Now we have the leaders of most of the important countries misrepresenting fundamental facts for their political gain, and close to a majority of their populations ready to believe them for myriad reasons. Facts have always been misrepresented by politicians, but never this brazenly, and never at this scale.

"Fake news" is not a phrase that involves any technology in and of itself, and if you were right, then this phrase should have been in lexicon for ages. But it hasnt, has it? People a decade or two back would contest facts, but never had I heard anyone say "facts don't matter" till 2015 or so. Things have changed, and not to be iterative, ignorance of this change again shows the inabilities of even the "enlightened" or whatever it is the people around here would like to call themselves to cope with change. Only time will tell where this will lead us.

honestoHeminway
Even back then, pseudo-science that seemed to fit human feelings was pretty rampant. Physiognomy was an accepted science http://www.theijes.com/papers/v3-i12/Version-3/G031203039047... , and racial quack-science and pseudo-physics were rampant. The truth is, we were always prone to believe in nonsense that felt good or was close to our perception.

We just had a large surplus to bribe everyone into shutting the hell up and letting the pros run the show. Now it has run out, and the whole illusion of a nicer, better world with nicer, newer, better humans goes into the bin.

keiferski
This is a very historically-ignorant view. Every culture and state since the beginning of time has manipulated information to suit its needs.

We don't even need to go back that far: the beginning of the Iraq war, not even twenty years ago, was rife with "fake news." The fact that the buzzword didn't exist doesn't mean that the phenomenon didn't exist.

everdrive
Certainly the sheer volume of fake information available today is orders of magnitude different.
Moghammed
The modern world with its social networks makes it so much easier to do on a large scale, though.

https://medium.com/@francois.chollet/what-worries-me-about-a...

BiteCode_dev
> What the original commenter is saying, and what I will agree will happen, is that the behavior of automatically trusting video as evidence will go away.

This is doubtful. People still believe pictures despite how easy they are to forge. People believe obvious oral lies. Hell, people believe tweets.

The video medium is incredibly powerful, and the damage it can do is huge.

We are going to suffer for this, and most people won't even understand why.

novaRom
I am observing an increasing number of people who now seriously believe in the "Moon Hoax". Definitely more than back in the '70s and '80s. Why is that? The YouTube generation? Deliberate propaganda?
Krasnol
Don't forget the Flat Earth movement.

I don't think anything about that is "deliberate propaganda".

It's just nutjobs gathering and gatherings of nutjobs creating more nutjobs. It's much easier these days with "social" media.

WilTimSon
Well, it's also the fact that people have learned how to monetize this stuff effectively. I mean, even the first fakes were monetized (stuff like shocking photos of fairies would be sold, and the people who took them would travel the country speaking about their 'amazing discoveries') but, nowadays, you can just write a book about the Earth being flat, because all you need to make money off of it is a few contrarian people or conspiracy theorists.

Even Alex Jones, a figure so obviously deranged that his claims sound like satire, has been selling supplements and merch. All because these days anybody has a platform, and monetizing lies is easier than ever. People will believe anything, especially if it's 'published in a scientific magazine', never mind that the publication has no credibility, reliable sources, or academic integrity. This is how anti-vaxxers got their start, after all.

sharot4
>I don't think anything about that is "deliberate propaganda".

Based on what? "Flat earth" is deliberate psy-op propaganda to discredit anyone talking about "conspiracies". You're concerned about the health effects of fluoride in the water supply? You're not sure the "facts" about 9/11 quite add up? You must also think the Earth is flat. Ergo, you are a complete idiot who cannot be trusted. QED.

That's how "flat earth" works in my opinion. People are scared to use the word "conspiracy" nowadays, as if "conspiracies" do not exist. It's a result of government-directed mainstream media propaganda to discredit dissenting opinion and/or general inquiry.

overthemoon
What is the actual social impact of the flat earth thing, though? Is flat earth creating new nutjobs? Does it really have the power to draw from the ranks of the rational, or is it drawing in people who would have fallen for something absurd eventually? I'd be curious to find out what social conditions predispose people to believing it.
everdrive
This might not be a popular answer, but I think the hope is that a rube would "fall for" something positive that happens to be correct and/or beneficial. eg, suppose somebody believes in modern physics, not because they understand it all, but because they know this is the "right" idea.

This is obviously less ideal than someone actually understanding an idea they profess to hold, but it's far better to have support around beneficial ideas. It's not hard to imagine a lot of important issues fitting into this: support for liberalism and democracy, for taking global warming seriously, for coherent ideas about vaccines, etc. These all require broad public support, and the truth of the matter is that many people generally don't spend much time evaluating why they believe what they do.

I understand this is not a perfect point of view: eg, who gets to decide what the "best" ideas are.

DarthGhandi
They have big conferences about it. As long as no one is hurt, that's all good with me, but I truly doubt that people aren't being hurt by this delusion.

It's an illness that demands rejection of overwhelming proof, and it leads down dark paths. Much like how arsonists become murderers, these things need to be cut off at the stem. There's nothing wrong with that.

Davertron
OK, but are there actually more people who believe in this now, or are you just more aware of it?
ckastner
>The video medium is incredibly powerful, and the damage it can do is huge.

I fully agree. Even if one rationally rejects a video as fake, it still has the potential to influence the subconscious.

HN Theater is an independent project and is not operated by Y Combinator or any of the video hosting platforms linked to on this site.