
Thing 5. Synthetic Media

Purpose – The Real Behind the Fake

Synthetic media is an umbrella term covering media that has been artificially created, manipulated, or modified, including video, photographs, and audio. This includes memes, deepfakes, cheapfakes, and other generated media.

Introduction

Seeing isn’t always believing. Since the advent of photography and motion pictures, people have been manipulating images: removing or adding individuals, changing backgrounds to create a new ‘reality’, or otherwise altering what the original media captured. With advances in technology this is easier than ever; anybody with a computer and an internet connection can now produce a deepfake video. Technology can be used to make people believe something is real when it is not.

Long before ‘photoshopping’ became a verb describing image manipulation, people were altering images to deceive viewers for various reasons. Arthur Conan Doyle was fooled by two girls who created photos claiming to show fairies in their garden. A famous picture of Abraham Lincoln is a composite of his head and another politician’s body. Joseph Stalin had his enemies airbrushed out of photographs.

Adobe Photoshop made it possible for anyone with access to the program and some training to manipulate photos; anyone could be a photo editor and improve the technical aspects of an image. More recently, fashion magazines have manipulated photos to reflect an ‘improved’ reality, making models’ bodies appear thinner, lighter, and younger.

With the development of more advanced technology, image manipulation has entered a new era of deception, one where detection of the fakes is harder than ever.

Deepfakes are so named because they use deep learning, a branch of machine learning. They rely on massive amounts of data and sophisticated tools to create the fakes. Deepfakes go beyond simple manipulation of existing images: artificial intelligence is used to generate entirely new images, video, or audio that can be very difficult to detect as fake.

Cheapfakes, also known as shallowfakes, rely on readily available tools or apps to create a meme or image. Most of us already have access to the simple editing tools needed: iMovie, Blender, or Adobe Premiere for video, and PicMonkey, Canva, or Pixlr for photos, let even inexperienced users create a cheapfake. Specific tools and apps have popped up just to create face swaps or other image modifications. Even GIFs and memes can be considered a form of cheapfake and weaponized as disinformation.

Here is a post on Bored Panda showing Photoshopped images; both the original images and the cheapfakes are shown. https://www.boredpanda.com/photoshop-battles/?utm_source=google&utm_medium=organic&utm_campaign=organic

And here is a look at the “best” memes of 2021 so far, many of which helped spread disinformation. https://www.esquire.com/uk/culture/a35115913/best-memes-2021/

Deepfakes are more insidious. Take a look at the examples in this post from Creative Bloq, ‘12 Deepfake Examples That Terrified and Amused the Internet’: https://www.creativebloq.com/features/deepfake-examples. Some of the examples, like Deepfake Roundtable: Cruise, Downey Jr., Lucas & More – The Streaming Wars | Above the Line (15:13), are ‘scary good.’ https://youtu.be/l_6Tumd8EQI

Regardless of the technology used to create the new images, the intent is to twist reality. Both types use pre-existing images, videos, or audio content as the basis for the changes that result in something new. Here are three common types of deepfakes/cheapfakes:

  • Face replacement (or face swap) is a technique that transposes an individual’s face onto another body (a short code sketch of this idea appears after the list).
  • Face generation is the production of realistic faces that do not actually exist.
  • Speech synthesis is the creation of human speech using AI. 
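
To make the first of these concrete, below is a minimal, illustrative sketch of the face-replacement idea in Python, using the free OpenCV library. It is only an assumption-laden toy: real deepfake tools train deep neural networks on large datasets, while this version simply detects a face in one photo and pastes it over a face in another, and the file names used are placeholders.

    # A crude illustration of "face replacement" using OpenCV's built-in
    # Haar-cascade face detector. Real deepfakes use trained neural networks;
    # this just copies pixels, so the result will look obviously fake.
    # "source.jpg" and "target.jpg" are placeholder file names.
    import cv2

    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    source = cv2.imread("source.jpg")   # photo supplying the face
    target = cv2.imread("target.jpg")   # photo receiving the face

    def first_face(image):
        """Return the (x, y, w, h) box of the first detected face, or None."""
        gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        return faces[0] if len(faces) else None

    src_box, dst_box = first_face(source), first_face(target)
    if src_box is not None and dst_box is not None:
        sx, sy, sw, sh = src_box
        dx, dy, dw, dh = dst_box
        # Resize the source face to fit the target face box, then paste it in.
        face = cv2.resize(source[sy:sy + sh, sx:sx + sw], (dw, dh))
        target[dy:dy + dh, dx:dx + dw] = face
        cv2.imwrite("crude_face_swap.jpg", target)

Even this toy version shows why cheapfakes are so easy to produce: a few lines of freely available code, or a point-and-click app built on the same idea, can rearrange what a photo appears to show.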

Use of deepfakes isn’t always malicious. Filmmakers can use deepfake-style technology to add actors to scenes after filming: Carrie Fisher’s likeness was digitally recreated for recent Star Wars films, and Forrest Gump was shown interacting with historical figures. Even ALA READ posters are a kind of cheapfake, letting you place yourself or someone else in a background you choose. https://www.flickr.com/photos/alastaff/albums/72157622522029611

However, deepfakes and cheapfakes can be used to defame, malign, or otherwise damage an individual or organization by creating an untrue scenario that is then blasted through the media landscape. Deepfakes have been used in financial fraud, hoaxes, and fake news. The most damaging deepfakes to date have been those created as non-consensual pornography, placing celebrities’ or other people’s faces on different bodies.

Audio fakes are not yet as widespread or sophisticated as the visual versions, although AI technology continues to advance.

Should you be worried about deepfakes and cheapfakes? Both have the potential to be dangerous for society as major sources of misinformation and deception. Since many online users believe what they see without verifying the source or its accuracy, deepfakes pose a threat to the truth. As information professionals, it is important that we understand these hazards and know how to dig deeper to find the truth.

The technology is not yet good enough to convincingly fool everyone, but it is improving. In the coming years, governments and tech companies will likely have to work together to tackle this problem. In the meantime, take time to learn how to detect deepfakes and share that knowledge widely with patrons, family, and friends.

Videos

These videos offer an introduction to deepfakes and their potential for damage. 

Readings

Dig deeper into synthetic media with these articles.

Activities

Look at the examples in this post from Creative Bloq, ‘12 Deepfake Examples That Terrified and Amused the Internet’: https://www.creativebloq.com/features/deepfake-examples. Which ones do you think are the most convincing? Most worrying? Add your comments.

Try your hand at creating a cheapfake using one of these tools or one you have tried in the past. You can post a link to it in the comments section or you can email it to minn23@gmail.com. There might be a prize for the best one!

Suggested Tools

There are dozens of apps for iPhone and Android, as well as websites, that allow creation of cheapfakes for various uses. 

Sign or Poster Creators

GIF Creators

Face Swap/Face Swap Lite 

  • Snapchat on Android or iOS.

Meme Makers

Explore

  • Best face swap apps: https://filmora.wondershare.com/video-editor/best-face-swap-apps.html
  • 8 Best Meme Maker App to Create Memes with Your Own Picture: https://filmora.wondershare.com/meme/best-meme-maker-app.html

How could you use these tools in your library either with staff or patrons or as part of an educational or PR program? 

Conversation Starters

How are synthetic media similar to or different than traditional plagiarism? Are there elements of how we address the latter – in terms of recognition and verification – that could inform how we address the former?

How would you suggest that we address synthetic media while still respecting the 1st Amendment?

Is the debate on these issues robust enough? How do we tailor awareness-raising, training and/or education to each audience?

Evaluation

Additional Readings

If you missed the Media Landscape presentation on Deepfakes with John Mack Freeman, you can view it here.

2 thoughts on “Thing 5”

  1. The article from Gizmodo (Deepfake Lips Are Coming to Dubbed Films) in the recent ML:23 newsletter predicts that deepfakes may be coming to movies to replace/augment subtitles. I would welcome this–I do not like subtitles. While I understand that subtitles–and closed captioning–are important for accessibility, I don’t usually only watch a movie/show–I am doing something else at the same time, e.g., cooking or sewing–and don’t want to have to focus on subtitles. Lucky for me I don’t need closed captioning. I subscribe to Acorn TV, which is mostly UK, NZ, and other English-speaking countries’ shows, but there are a few in Swedish or Dutch I would like to watch if the ‘deepfake lips’ were added. What about you–do you like subtitles? Or do you see other positive uses for this or similar technology?

    1. I would welcome this, too, Ann. I hate when the visual and audio don’t sync up! But it does raise some questions. Would an actor need to give consent to have their face/performance altered? Is this different than altering just their voice? How much alteration is too far?

      I just came across this article. Check out the pretty amazing use of deepfake dubbing: https://arstechnica.com/gaming/2021/05/robert-de-niro-speaks-fluent-german-in-taxi-driver-thanks-to-ai/?utm_source=join1440&utm_medium=email&utm_placement=newsletter
