Thursday, June 24, 2021

'The Cleaners' looks at the people who clean up the Internet's toxic content: NPR



Content moderators are responsible for deciding what we do and don't see on social media.

Courtesy of Gebrueder beetz filmproduktion



Thousands of content moderators work around the clock to keep toxic content off Facebook, YouTube, Google, and other online platforms. That can include phishing, sexually explicit photos or videos, or violent threats.

This effort, carried out by both humans and algorithms, has been the subject of heated debate in recent years. In April, Mark Zuckerberg testified before Congress about how Facebook would work to reduce propaganda, hate speech, and the spread of other harmful content on the platform.

"By the end of the year, we will have more than 20,000 people working on security and content review," Zuckerberg said.

The Cleaners, a documentary by filmmakers Hans Block and Moritz Riesewieck, sets out to show exactly how that work is done. The film follows five content moderators and reveals what their jobs actually involve.

"I've seen hundreds of beheadings, and sometimes they are lucky because the blade is so sharp," one content moderator says in a clip from the film.

Block and Riesewieck spoke with NPR's All Things Considered about the harsh realities content moderators face.

Interview Highlights

On a typical day for Facebook content moderators

They see everything we don't want to see online on social media. It can be terrorism. It can be a beheading video like the one mentioned before. It can be pornography or sexual abuse.

On the other hand, such images can be useful for raising awareness of political debates or war crimes. So they have to review thousands of pictures every day, and they have to work quickly to meet their daily quota. … Sometimes thousands of pictures in a single day. For each one, they need to decide whether to delete it or let it stay up.

On Facebook's decision to remove the Pulitzer Prize-winning "Napalm Girl" photo

The content moderator decided to remove it because it depicts a young naked child. He was applying the rule against child nudity, which is strictly forbidden.

So you always need to distinguish between many different cases. … A lot of gray areas remain. Content moderators sometimes told us they have to rely on their intuition to make a decision.

On the difficulty of separating news images or art from harmful content

This is an overwhelming job. Distinguishing between all these kinds of cases is very complicated. … These young Filipino workers are trained for only three to five days, which is not enough to prepare them for this work.

On the impact of daily exposure to toxic content on content moderators

Many of these young workers are deeply harmed by their work.

The symptoms vary widely. Some people are afraid to go into public places because they review terrorist attacks every day. Others watch sexual abuse videos daily and become afraid of intimacy with their boyfriends or girlfriends. That is the kind of effect this work has. …

Manila [the capital of the Philippines] is where analog toxic waste from the Western world was shipped for years on container ships. Today, digital garbage is brought there instead. Now, in air-conditioned office towers, thousands of young content moderators click through an endless stream of toxic images and mental junk.

Emily Kopp and Art Silverman edited and produced this story for broadcast. Cameron Jenkins produced this story for digital.
