Here’s how algorithms can protect us against deepfakes

2019-7-13 15:12

Deepfake videos are hard for untrained eyes to detect because they can be quite realistic. Whether used as personal weapons of revenge, to manipulate financial markets, or to destabilize international relations, videos depicting people doing and saying things they never did or said are a fundamental threat to the longstanding idea that “seeing is believing.” Not anymore.

Most deepfakes are made by showing a computer algorithm many images of a person, and then having it use what it saw to generate new face images. At the same time, their voice is synthesized, so it both looks and sounds like the…
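The pipeline described above — train a model on many images of a person, then have it generate new face images — can be sketched as a toy linear autoencoder. This is only an illustration under simplifying assumptions: real deepfake tools use deep convolutional networks trained on thousands of real photos, and the random vectors here merely stand in for face images.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for "many images of a person": 200 random 8x8 grayscale
# "faces", flattened to 64-dim vectors (real tools use real photos).
faces = rng.random((200, 64))

# A single-hidden-layer linear autoencoder: compress each face to a
# 16-dim code, then decode it back to a 64-dim image.
W_enc = rng.normal(0.0, 0.1, (64, 16))
W_dec = rng.normal(0.0, 0.1, (16, 64))

def loss():
    recon = faces @ W_enc @ W_dec
    return float(np.mean((recon - faces) ** 2))

loss_before = loss()
lr = 0.01
for _ in range(500):
    codes = faces @ W_enc                      # encode
    recon = codes @ W_dec                      # decode
    err = recon - faces                        # reconstruction error
    # Plain gradient descent on the mean squared error.
    W_dec -= lr * codes.T @ err / len(faces)
    W_enc -= lr * faces.T @ (err @ W_dec.T) / len(faces)
loss_after = loss()

# "Generate" a new face by decoding a latent code the model never saw.
new_face = rng.random(16) @ W_dec
print(loss_before, loss_after, new_face.shape)
```

After training, the decoder maps any latent code to something face-like, which is the core trick a deepfake generator exploits at much larger scale.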

This story continues at The Next Web.





Deepfakes are being weaponized to silence women — but this woman is fighting back

Fake sex videos aren’t a new phenomenon, but advances in AI are worrying, as ‘deepfakes’ become increasingly hard to distinguish from real videos. Deepfake tech has become easily accessible: videos can be made via FakeApp on affordable consumer-grade equipment, which is partly why earlier this year the web was flooded with pornographic films of high-profile female celebrities.

2018-10-5 14:34