
Behind the screens, these unpaid moderators are keeping online communities safe

Despite hate directed at them personally, Facebook and Reddit moderators work daily to filter harmful content and keep online communities safe. Some say they know that without their efforts, these spaces would collapse.

Volunteers face challenges daily filtering harmful content from Facebook, Reddit

Lisa Rufiange spends at least two or three hours a day moderating two Facebook groups, one for mothers and the other for prospective renters. (Submitted by Lisa Rufiange)

Lisa Rufiange wakes up with her partner at 5 a.m. and checks her Facebook.

Specifically, she reviews the two groups she moderates: Edmonton Moms Community Group and Edmonton Apartment, Houses, Rooms For Rent.

She sorts through requests to join the groups, along with requests from users to make anonymous posts, then checks overnight posts to ensure none broke any rules and need to be removed.

When she's finished, she goes back to sleep until her alarm goes off at 7:30 a.m.

"I know if I don't check it, it will just be more overwhelming later in the morning," Rufiange said in an interview.

"It's best to bite it off in small increments."

LISTEN | Why do moderators take on a stressful, time-consuming job that does not pay?
If you're part of a Facebook group, or if you post on Reddit, you're likely familiar with moderators. They help run social media sites and remove content that doesn't belong. Often, they are volunteers and their jobs aren't always easy. Our producer, Kashmala Fida Mohatarem, has been talking to local moderators.

Depending on the volume of content, Rufiange spends at least two or three hours a day refereeing her Facebook groups.

She's one of many moderators on Facebook, Reddit and other social media sites. Most communities have multiple moderators who filter out rude, inappropriate or harmful content to make those online spaces safe and positive for members.

For some, moderating is a paid job. Facebook parent company Meta, blog sites and news organizations hire content management companies that employ hundreds of moderators whose sole job is to remove violent and graphic content.

Those moderators are exposed to the worst parts of the internet, including videos showing murder and sexual assault.

Although volunteer moderators in Alberta don't deal with that level of gruesome content, the unpaid labour does come with its own challenges.

Moderators often face hate, vitriol, death threats and doxxing, when someone's personal information, such as where they live or work, is shared publicly online.

For these reasons, some moderators, especially on Reddit, prefer to remain anonymous.

They also monitor their online activity closely to avoid revealing where they live or what they do for work.

Moderators do it because they care

Despite the negativity, many volunteer moderators persist because they care about the communities and know that without their efforts, these spaces would collapse.

Troy Pavlek, one of nine moderators for r/Edmonton, a subreddit dedicated to city-related discussions, has been doing the job for the past five years. Each day he spends two or three hours or more sifting through thousands of comments.

He said most people have a very romanticized idea of moderating, which can be a thankless job.

"There's this pervasive theory that moderators are people who are power-tripping and are getting off on really destroying these communities," Pavleksaid.

A Reddit moderator at work in Edmonton. (Nathan Gross/CBC)

"There's a lot of vitriolic comments. There's a lot of racism, especially nowthere's a lot of Islamophobia. And your job is to sift through that and remove it so that people who are just spending an afternoon and loading up the site, don't get bombarded with that."

Rufiange said that although hate is directed toward her personally, just doing the job can be overwhelming. She reaches out to newcomers joining the rental group on Facebook to make them aware of how the process works in Canada and help them avoid getting scammed.

In the mom group, she has had to give direction to parents who should have been calling authorities instead of posting on Facebook. Mothers sometimes ask for advice about serious medical conditions, a child's drug use or even sexual abuse, she said.

"We have contacted the appropriate authorities after getting as much information as we could via the post itself," Rufiange said.

Bots and real people can make trouble

Often, moderators deal with bots: automated programs that post content and engage with users on social media by following them, liking their posts or responding to posts with comments.

But other problems come from real people who are disruptive and spread negativity.

According to Bree McEwan, an associate professor at the Institute of Communication, Culture, Information and Technology at the University of Toronto, those people can be categorized in two groups.

"One is people who are really, really ego-involved in a particular issue, which means that issue is central to their identity," she said, adding that when they post awful content, they feel it's their right to express their opinion.

The other group consists of people who post maliciously because they enjoy being sadistic, she said.

Facebook and Reddit have developed tools to help moderators remove content without viewing it directly. There are filters to automatically remove posts with certain words, but McEwan said people often find ways around these measures.

"It's really difficult for an algorithm to understand the context," she said.

Moderators have experienced this firsthand. One of them is a 35-year-old Edmontonian who said moderators have written code aimed at preventing spam and hate speech from getting posted.

"We have to constantly update it to get whatever new and clever dog whistle term is used," said the moderator, whospoke to CBC on the condition that his name not be used, ashe hasbeen doxxedin the past.

Though some moderators say they've developed thick skin, many said they rely on fellow moderators for support. They use group chats to make sure the job isn't becoming too overwhelming.

If the moderators leave, the community collapses. - Bree McEwan

McEwan suggested that members of social media groups and forums can help by being patient and understanding of what moderators are dealing with.

"If the rest of the group can be supportive, patient and really appreciative towards the moderators, I think that can go a long way to helping them realize why they're doing what they're doing," she said.

"Because if the moderators leave, the community collapses."

Despite the tough, thankless and mentally draining nature of the job, many moderators continue because they are passionate about their online communities.

Others, like Rufiange, do it because they want to help people. Rufiange said it's important to her that people have access to resources without fear of being scammed and bullied.

"If we have prevented even one person from hardship and given them a safe space," she said, "it's worth it."