iPads, MakerEd and More in Education
News, reviews, resources for AI, iTech, MakerEd, Coding and more ....
Curated by John Evans

Most Deepfakes Are Porn, and They're Multiplying Fast | WIRED

In November 2017, a Reddit account called deepfakes posted pornographic clips made with software that pasted the faces of Hollywood actresses over those of the real performers. Nearly two years later, deepfake is a generic noun for video manipulated or fabricated with artificial intelligence software. The technique has drawn laughs on YouTube, along with concern from lawmakers fearful of political disinformation. Yet a new report that tracked the deepfakes circulating online finds they mostly remain true to their salacious roots.

Startup Deeptrace took a kind of deepfake census during June and July to inform its work on detection tools it hopes to sell to news organizations and online platforms. It found almost 15,000 videos openly presented as deepfakes—nearly twice as many as seven months earlier. Some 96 percent of the deepfakes circulating in the wild were pornographic, Deeptrace says.

'Perfectly real' deepfake videos are 6 months away: report

A deepfake pioneer said in an interview with CNBC on Friday that "perfectly real" digitally manipulated videos are just six to 12 months away from being accessible to everyday people.

"It's still very easy you can tell from the naked eye most of the deepfakes," Hao Li, an associate professor of computer science at the University of Southern California, said on CNBC's Power Lunch. "But there also are examples that are really, really convincing."

He continued: "Soon, it's going to get to the point where there is no way that we can actually detect [deepfakes] anymore, so we have to look at other types of solutions."

This viral Schwarzenegger #deepfake isn't just entertaining. It's a warning. - NBC News


"The video starts like dozens of others on YouTube — with former “Saturday Night Live” star Bill Hader offering up a celebrity impression, this time of Arnold Schwarzenegger.

The impression is spot-on, but that’s not why the video has almost 6 million views in the last month. About ten seconds into the video, Hader’s face slowly, almost imperceptibly starts to morph into Schwarzenegger’s face. The full transformation takes about six seconds, but the changes are so subtle that it seems like magic. Suddenly, it looks like Schwarzenegger, albeit a skinnier version, is doing an impression of himself."


Google Releases Deepfake Dataset to Help Create Detection Methods - Interesting Engineering

The big tech company has released the videos and images in the name of fighting deepfake technology.

Down the Rabbit Hole with Deep Fakes | Knowledge Quest

Do you know what a deep fake is? A workshop I attended defined a deep fake as “the alteration of images, videos, and audio files with the intent of maliciously deceiving an audience into thinking they are real.” Workshop participants watched in equal parts horror and fascination at the Jordan Peele sample of his audio spliced onto a video of President Obama. Other examples included faces transposed over videos of others, an animation of the Mona Lisa (a still image!), and a counterfeit audio clip created from a snippet of an original.

In the workshop we discussed worst-case scenarios of people exacerbating public discord with fake videos and images, or politicians dismissing real videos and audio by labeling them “phony.” Who will people trust? What institutions will have credibility? Will reality apathy and cynicism set in? Will student bullying and mental health issues increase with viral rumors evidenced with supposed video and audio?

Many educators were so horrified by the possibilities of this emerging technology they didn’t even want to mention it to their students for fear of injecting these malicious ideas into their devious minds. But what does this say about us if we assume students and people, in general, are going to use these digital tools for debased purposes? Why do we have so much fear and paranoia?

Deepfakes are getting better—but they’re still easy to spot - Ars Technica

"Last week, Mona Lisa smiled. A big, wide smile, followed by what appeared to be a laugh and the silent mouthing of words that could only be an answer to the mystery that had beguiled her viewers for centuries.

A great many people were unnerved.

Mona’s “living portrait,” along with likenesses of Marilyn Monroe, Salvador Dali, and others, demonstrated the latest technology in deepfakes—seemingly realistic video or audio generated using machine learning. Developed by researchers at Samsung’s AI lab in Moscow, the portraits display a new method to create credible videos from a single image. With just a few photographs of real faces, the results improve dramatically, producing what the authors describe as “photorealistic talking heads.” The researchers (creepily) call the result “puppeteering,” a reference to how invisible strings seem to manipulate the targeted face. And yes, it could, in theory, be used to animate your Facebook profile photo. But don’t freak out about having strings maliciously pulling your visage anytime soon.


“Nothing suggests to me that you’ll just turnkey use this for generating deepfakes at home. Not in the short-term, medium-term, or even the long-term,” says Tim Hwang, director of the Harvard-MIT Ethics and Governance of AI Initiative. The reasons have to do with the high costs and technical know-how of creating quality fakes—barriers that aren’t going away anytime soon."
