A meme can be defined as a unit of culture that carries symbols, practices, or traditional ideas. These ideas and concepts spread within a society from one person to another. These imitable phenomena can be conveyed through speech, gestures, writing, or video. Memes have the capacity to self-replicate and mutate, much like the genes of the human body.
A fake digital video speech by Barack Obama, produced at a university:
The University of Washington’s Paul G. Allen School of Computer Science and Engineering has been working on the production of fake, or forged, videos. Its researchers succeeded in producing a fake digital video of Barack Obama, and viewers were unable to tell that the speech given by the fake digital Obama was not real. Since society is already divided along political lines, this advance in fake video is an alarming development.
While the incredible applications for this technology are only starting to be explored, recent press coverage of fake videos shows us that malicious use of this type of technology is around the corner.
Florida pediatrician uncovered suicide tips hidden in videos on YouTube and YouTube Kids
Free Hess, a pediatrician and mother, first learned of the chilling recordings over the summer, when another mother spotted one on YouTube Kids. She said that minutes into a clip from a children’s video game, a man appeared on the screen, giving instructions on how to end one’s life.
Popular Nintendo Games
“I was shocked,” Hess said, noting that since then, the scene has been spliced into several more videos from the popular Nintendo game “Splatoon” on YouTube and YouTube Kids, a video app for children. Hess, from Ocala, Florida, has been blogging about the altered videos along with health experts, who say such visuals can be damaging to children. One clip on YouTube shows a man pop into the frame. “Remember, kids,” he begins, holding what appears to be an imaginary blade to the inside of his arm.
“Sideways for attention. Longways for results.”
“I think it’s extremely dangerous for our kids,” Hess said about the clips Sunday in a phone interview with The Washington Post. “I think our kids are facing a whole new world with social media and internet access. It’s changing the way they’re growing, and it’s changing the way they’re developing. I think videos like this put them at risk.”
The deepfake algorithm was subsequently released on GitHub, giving anyone with sufficient know-how and a decent enough computer the means to make fairly convincing fakeries. In parallel with the development of deepfake technology, AI is also being developed to counter this threat: machines trained to detect malicious alterations in video, for the inevitable future in which we find ourselves unable to detect the forgeries ourselves.
Work done to detect deepfakes:
In New York, a team under the supervision of Siwei Lyu at the University at Albany found an imperfection in the forgeries. Deepfake videos are created by an algorithm trained on still images fed to it. Although this technology is very accurate, the artificial intelligence fails to reproduce all the physiological signals of a natural human. The signal the University at Albany team focused on is blinking. Humans naturally blink, unintentionally, every few seconds; but in photographs, people rarely have their eyes closed, so an algorithm trained on such images produces faces that rarely blink.
The detection algorithm designed by Lyu’s team
The team under Lyu’s supervision designed an algorithm to detect videos in which blinking is absent. The algorithm consists of two neural networks. The first detects the face in the video and checks, frame by frame, whether the eyes are closed. The second serves as a memory that retains the decisions from previous frames, to determine whether blinking occurred over a particular duration of time. First, they trained the algorithm on a dataset of eye images labeled as open or closed. To test the algorithm, they generated fake videos themselves and applied post-processing to make the forgeries finer.
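The temporal half of this idea can be sketched in a few lines. The following is an illustrative simplification, not Lyu’s published implementation: it assumes an upstream eye-state model (the first network described above) has already produced a per-frame probability that the eyes are open, and it replaces the second, memory network with a simple threshold-and-count over those probabilities.

```python
def count_blinks(eye_open_probs, closed_threshold=0.3):
    """Count blinks: each contiguous run of 'eyes closed' frames is one blink.

    eye_open_probs: per-frame probability (0..1) that the eyes are open,
    assumed to come from some upstream eye-state classifier.
    """
    blinks = 0
    in_blink = False
    for p in eye_open_probs:
        closed = p < closed_threshold
        if closed and not in_blink:
            blinks += 1          # a new closed-eye run starts: one blink
            in_blink = True
        elif not closed:
            in_blink = False     # eyes reopened; the blink has ended
    return blinks

def looks_forged(eye_open_probs, fps=30, min_blinks_per_minute=2):
    """Flag a clip as suspicious when its blink rate is far below human norms.

    Humans blink every few seconds, so a talking-head video with a
    near-zero blink rate is a candidate forgery. The rate threshold
    here is an arbitrary illustrative choice.
    """
    minutes = len(eye_open_probs) / (fps * 60)
    if minutes == 0:
        return False
    return count_blinks(eye_open_probs) / minutes < min_blinks_per_minute
```

For example, a 60-second clip (1,800 frames at 30 fps) in which the eyes never close would be flagged, while a clip with a blink every four seconds would pass. The real system replaces this hand-written counter with a recurrent network precisely because thresholds like these are brittle against noisy per-frame predictions.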